Artists working on facial animation tend to concentrate on head morph targets for emotions and lip-sync. LIFESTUDIO:HEAD technology lets AI routines take over some of that work. The system balances two kinds of intelligence: human intelligence creates a set of parameterized emotions, called Macro-Muscles, and a simple artificial intelligence blends them in real time. Thanks to the LIFESTUDIO:HEAD software, mixing different Macro-Muscles gives visually recognizable effects; instead of authoring the whole animation, an artist creates building blocks of emotion, and the complete animation can be generated later in real time. This is the key to non-linear animation. Of course, to get the best results an artist may create the whole animation instead of relying on the AI; this process is easy because it builds on the comprehensive library.
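The idea of mixing parameterized emotions can be sketched as a weighted blend of per-vertex morph deltas. This is a hypothetical illustration only; the class and function names (`MacroMuscle`, `blend_expression`) are not the actual LIFESTUDIO:HEAD API.

```python
# Hypothetical sketch: blending parameterized "Macro-Muscles" in real time.
# A Macro-Muscle is modeled here as per-vertex displacements of a neutral mesh;
# the runtime mixes several of them with weights to form one expression.

class MacroMuscle:
    """A parameterized emotion: per-vertex displacements of a neutral mesh."""
    def __init__(self, name, deltas):
        self.name = name
        self.deltas = deltas  # one (dx, dy, dz) tuple per vertex

def blend_expression(neutral, muscles_with_weights):
    """Additively mix weighted Macro-Muscles onto the neutral mesh."""
    result = [list(v) for v in neutral]
    for muscle, weight in muscles_with_weights:
        for i, (dx, dy, dz) in enumerate(muscle.deltas):
            result[i][0] += weight * dx
            result[i][1] += weight * dy
            result[i][2] += weight * dz
    return result

neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
smile = MacroMuscle("smile", [(0.0, 0.1, 0.0), (0.0, 0.1, 0.0)])
frown = MacroMuscle("frown", [(0.0, -0.2, 0.0), (0.0, 0.0, 0.0)])

# 70% smile mixed with 30% frown on the same frame:
mesh = blend_expression(neutral, [(smile, 0.7), (frown, 0.3)])
```

Because the bricks are parameterized, the runtime only needs weights per frame, which is what makes real-time, non-linear generation cheap.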

From one prototype


To create a head, an artist starts with a prototype and sculpts the model, reshaping its mesh with about a hundred sliders. A hierarchy of sliders cascades from general features such as gender, race, or age down to subtle features such as lip thickness or nose bridge length, which makes the creation process smooth and fast. If the slider set does not suit you, add your own slider or modify an existing one. To produce an appropriate texture, select from a giant texture library and use the built-in texture constructor: simply take a base texture and add make-up, wrinkles, scars, moles, moustaches, and so on. For more detail, export the resulting texture, fine-tune it in any 2D editor, and load it back into the LIFESTUDIO:HEAD program. As a result, you get a head model with up to four LODs produced automatically. To save even more time, use the head generation facility and select the head you like from an unlimited range of different heads.
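A cascading slider hierarchy can be sketched as general sliders that derive defaults for the subtle ones until the artist overrides them directly. The slider names and the age-to-feature mapping below are illustrative assumptions, not the shipped slider set.

```python
# Hypothetical sketch of a cascading slider hierarchy: a general slider
# ("age") drives default values of subtle sliders ("lip_thickness",
# "nose_bridge_length") until the artist sets them by hand.

class SliderRig:
    def __init__(self):
        self.general = {"age": 0.5}   # 0.0 = child, 1.0 = elderly
        self.overrides = {}           # subtle sliders the artist adjusted directly

    def subtle(self, name):
        """Resolve a subtle slider: artist override wins, else derive from general."""
        if name in self.overrides:
            return self.overrides[name]
        age = self.general["age"]
        derived = {
            "lip_thickness": 1.0 - 0.5 * age,      # lips thin with age (assumed rule)
            "nose_bridge_length": 0.8 + 0.2 * age,
        }
        return derived[name]

rig = SliderRig()
print(rig.subtle("lip_thickness"))    # derived from age → 0.75
rig.overrides["lip_thickness"] = 0.9  # artist fine-tunes directly
print(rig.subtle("lip_thickness"))    # → 0.9
```

Broad edits stay one slider away, while any subtle feature remains individually adjustable, which is what makes the cascade feel both fast and precise.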

Any character can play any animation


To animate a head, you load facial expressions onto the timeline from the animation library, which contains dozens of predefined emotions and animation fragments. Add sound files and generate tracks with lip-sync data. If you add a smile, your character speaks while smiling; you do not need to change the lip-sync tracks. If you add eye tracking, your character follows the target while the head turns. You can even animate the pupil size, and of course you can update the animation library by adding your own items or editing the existing ones.
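The smile-over-speech behavior can be sketched as independent timeline tracks composed per frame, so adding an emotion layer never requires editing lip-sync data. The channel names below are illustrative assumptions.

```python
# Hypothetical sketch of track layering on the timeline: the lip-sync track
# drives mouth shapes while an emotion track (a smile) is blended on top.

def compose_frame(lip_sync_pose, emotion_pose):
    """Combine independent tracks; each contributes its own channels additively."""
    pose = dict(lip_sync_pose)
    for channel, value in emotion_pose.items():
        pose[channel] = pose.get(channel, 0.0) + value
    return pose

lip_sync = {"jaw_open": 0.6, "lip_pucker": 0.2}   # generated from the audio file
smile = {"mouth_corner_up": 0.8, "lip_pucker": -0.1}

frame = compose_frame(lip_sync, smile)
# The character keeps speaking ("jaw_open" preserved) while smiling
# ("mouth_corner_up" layered in, "lip_pucker" adjusted, not replaced).
```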
At any time after you finish animating, you can replace the lip-sync tracks while keeping all other animation untouched, for example when localizing to another language. With a proper game data structure you can take advantage of a console conversion routine that generates lip-sync tracks for a new language in batch mode, drastically reducing localization time.
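Batch localization of this kind can be sketched as regenerating only the lip-sync track of every dialogue line from the localized audio, carrying the other tracks over untouched. The data layout and the `generate_lip_sync` stub below are assumptions for illustration, not the actual console tool.

```python
# Hypothetical sketch of batch lip-sync localization: rebuild lip-sync tracks
# for every dialogue line while emotion, eye-tracking, and head tracks survive.

def generate_lip_sync(audio_file):
    """Stand-in for an audio-to-viseme analyzer; returns a viseme track."""
    return {"source": audio_file, "visemes": ["AA", "M", "EE"]}

def localize(animations, localized_audio):
    """Regenerate only the lip-sync track of each line, in batch."""
    out = {}
    for line_id, tracks in animations.items():
        new_tracks = dict(tracks)  # emotions, eye tracking, head turns carried over
        new_tracks["lip_sync"] = generate_lip_sync(localized_audio[line_id])
        out[line_id] = new_tracks
    return out

english = {
    "intro_01": {"lip_sync": generate_lip_sync("intro_01_en.wav"),
                 "emotion": "smile", "eye_target": "player"},
}
french = localize(english, {"intro_01": "intro_01_fr.wav"})
```

Because lip-sync lives on its own track, a new language costs one batch pass over the audio rather than a re-animation of every scene.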

Integrating into 3ds Max and Maya

The LIFESTUDIO:HEAD Muscles Setup plug-ins for Autodesk® 3ds Max and Maya bring the technology to custom heads that are not based on predefined LIFESTUDIO:HEAD prototypes, while the Import-Export plug-ins allow rendering animations in full-featured 3ds Max or Maya scenes.