
BloodSpell Development Updates

TOGLfaceS

We've just realized we haven't really talked about our lip-synching tool, TOGLfaceS, even though we've been using it for a while. Whoops. Time to correct that little oversight.

Some people would be dissuaded from making Machinima in Neverwinter Nights (NWN) because it doesn't have any kind of built-in lip-synching. Not us. We got in touch with the very clever Anthony Bailey, and asked him to give us some lip-synching, please.

He took advantage of the fact that NWN uses OpenGL, a standard application programming interface, to display its output on your screen. It was possible to intercept the OpenGL commands from the game and change them before sending them on their way - replacing one texture on a model with another one, for example. Do that with a bunch of textures, and you've got a kind of hi-tech flickbook you can apply to models.
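To make that interception idea concrete, here's a minimal sketch of the generic wrapper-DLL technique on Windows. It is not Anthony's actual code: the swap table, the hard-coded driver path, and the lazy loading are all simplifications for illustration. A stand-in opengl32.dll exports glBindTexture, substitutes any texture we're puppeting, and forwards the call to the real driver (a complete wrapper would have to forward every OpenGL entry point the game uses, not just this one).

// Sketch of the generic OpenGL wrapper idea, not TOGLfaceS itself.
#include <windows.h>
#include <unordered_map>

typedef unsigned int GLenum;   // declared locally so we don't pull in GL/gl.h,
typedef unsigned int GLuint;   // whose own glBindTexture declaration would clash

// Texture ids we want to puppet: original id -> replacement id.
// A real tool would fill this from key presses or a config file.
static std::unordered_map<GLuint, GLuint> g_swaps;

typedef void (WINAPI *PFN_glBindTexture)(GLenum target, GLuint texture);
static PFN_glBindTexture real_glBindTexture = nullptr;

static void load_real_gl()
{
    // Load the genuine driver library (path kept simple for the sketch).
    HMODULE real = LoadLibraryA("C:\\Windows\\System32\\opengl32.dll");
    if (real)
        real_glBindTexture = reinterpret_cast<PFN_glBindTexture>(
            GetProcAddress(real, "glBindTexture"));
}

// The game calls this export believing it is talking to the driver.
extern "C" __declspec(dllexport) void WINAPI
glBindTexture(GLenum target, GLuint texture)
{
    if (!real_glBindTexture)
        load_real_gl();
    if (!real_glBindTexture)
        return;   // couldn't find the real driver; do nothing

    // If this is a face texture we're puppeting, bind the replacement instead.
    auto it = g_swaps.find(texture);
    real_glBindTexture(target, it != g_swaps.end() ? it->second : texture);
}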

Anthony quickly got back to us with TOGLfaceS - "Take Over GL face Skins". TOGLfaceS uses a text file to bind specific keys to in-game characters and expressions - using those keys will then let you puppet character expressions in game, swapping textures to make characters look happy, or sad, or change their mouth shapes and give them some lip-synching. We had our lip-synching tool. All we needed were lips to synch.
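We can't show the real TOGLfaceS binding file here, so purely as an illustration, here's how a tool like this might read a simple text file that maps keys to a character and a replacement texture. The file layout, the names, and the load_bindings function below are all invented for the example.

// Hypothetical binding file, one binding per line:
//   # character   key   replacement-texture
//   jared         F1    jared_mouth_open
//   jared         F2    jared_mouth_closed
#include <fstream>
#include <map>
#include <sstream>
#include <string>

struct Binding {
    std::string character;   // whose face to change
    std::string texture;     // which texture to swap in
};

// Parse the file into a key-name -> binding table.
std::map<std::string, Binding> load_bindings(const std::string& path)
{
    std::map<std::string, Binding> bindings;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        if (line.empty() || line[0] == '#')
            continue;   // skip blanks and comments
        std::istringstream fields(line);
        std::string character, key, texture;
        if (fields >> character >> key >> texture)
            bindings[key] = Binding{character, texture};
    }
    return bindings;
}

Pressing a bound key during a shot would then just update the texture-swap table from the wrapper sketch above.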

We had to create the textures we wanted to swap to give our characters facial expressions and mouth movement. Between Paint Shop Pro, EditPad, a PLT converter, and NWN Explorer, it was easy to create new textures for our characters, and models to view the textures on - but it wasn't so easy to get them looking acceptable. We're still refining some of our characters' faces now, and we're halfway through filming.



Our main characters needed 6 faces for each emotion:
- a resting face with eyes open and mouth closed
- a talking face with the mouth slightly open
- a talking face with the mouth open
- a blinking face with eyes and mouth closed
- a face looking left
- a face looking right



Any bit-part character who speaks also needed a number of faces - usually just the top 4 on our list. With each character having a number of expressions (normal, angry, frightened, happy, sad), that's a lot of faces: 6 faces across 5 expressions is 30 textures for a single main character. In fact, our character heads module (more on this in a minute) contains over 200 different faces for characters.
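As a back-of-the-envelope check on those numbers, the little program below just enumerates the combinations. The texture naming scheme and the 'hero' placeholder are invented; only the face and expression lists come from this post.

// Count the face textures one main character needs (illustration only).
#include <cstdio>
#include <string>
#include <vector>

int main()
{
    const std::vector<std::string> expressions = {
        "normal", "angry", "frightened", "happy", "sad"};
    const std::vector<std::string> faces = {
        "rest", "talk_small", "talk_open", "blink", "look_left", "look_right"};

    int count = 0;
    for (const auto& e : expressions)
        for (const auto& f : faces) {
            // e.g. "hero_angry_talk_open" is one texture someone has to paint.
            std::printf("hero_%s_%s\n", e.c_str(), f.c_str());
            ++count;
        }
    // 5 expressions x 6 faces = 30 textures, before any bit-part characters.
    std::printf("total: %d\n", count);
}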



So. We'd made a couple of hundred heads for our characters. What now? Well, we needed to get those heads into the game. We put the models and textures we'd created into a HAK (one of NWN's asset files) and made a game module, imaginatively called 'Character Heads'. This module is made up of half a dozen maps, filled with the disembodied heads of our characters - each with a different facial expression.

That sounds a little strange, but there's a perfectly reasonable explanation. TOGLfaceS can only apply textures that have already been loaded into the game, so if we want a character to talk, we need to have loaded all of his talking-head textures first. Creating one module filled with heads and loading it before changing modules and starting shooting is more efficient than putting all of your character heads into all of your modules. Efficiency also explains the decapitations - we only ever swap the face textures, so the bodies aren't needed, and over 200 extra bodies would just mean longer load times.

We'll be releasing TOGLfaceS when we have the time in our production schedule - if you're desperate to make NWN machinima, you may as well start making your faces now - you might be done by the time we release it.
  • CrazyTalk

    (Anonymous)
    Playing around with the CrazyTalk idea - here is an animation of Jared reciting Lord Byron: http://www.archive.org/download/jaredbyron/JaredCrush.wmv

    Was easy and quick to do. :)