For use in 2D or 3D character animation (lip-synchronisation) applications; especially suited to live, direct voice-to-animation conversion (e.g. live television shows).
A ready-to-use lip-synchronisation module is available as an SDK.
It can be used off-line or with live voice input.
A minimum delay of about 300 milliseconds between voice input and viseme-vector output can be compensated for by delaying the voice output by the same amount.
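The compensation described above can be sketched as a simple audio delay line: playback is held back by the same ~300 ms the analysis needs, so sound and animation stay in sync. This is an illustrative sketch only, not the LipAlike API; the frame length, buffer size, and function names are assumptions.

```python
from collections import deque

FRAME_MS = 10                     # assumed audio frame length in milliseconds
DELAY_FRAMES = 300 // FRAME_MS    # frames to hold back to match ~300 ms latency
FRAME_BYTES = 320                 # assumed bytes per frame (16 kHz, 16-bit mono)


def make_delay_line():
    # Pre-fill with silence so output lags input by exactly DELAY_FRAMES frames.
    return deque([b"\x00" * FRAME_BYTES] * DELAY_FRAMES)


def push_frame(delay_line, frame):
    """Queue a newly captured audio frame; return the frame to play right now."""
    delay_line.append(frame)
    return delay_line.popleft()
```

Each captured frame goes in at one end while an older (initially silent) frame comes out the other, yielding a constant 300 ms offset between capture and playback.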
Your application has to take care of the animation itself: at every moment, the LipAlike module tells it how the lips and other facial expressions should be mixed, for instance by morphing typical viseme 3D meshes. Voice input and/or output can be handled by the module.
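Morphing viseme meshes from such a per-frame mixing vector could look like the following linear blend. This is a hypothetical sketch, not part of the SDK: the mesh representation, viseme names, and function name are assumptions.

```python
def blend_visemes(viseme_meshes, weights):
    """Linearly morph between viseme meshes.

    viseme_meshes: dict mapping viseme name -> list of (x, y, z) vertex tuples,
                   one mesh per viseme, all with the same vertex count/order.
    weights: dict mapping viseme name -> float, i.e. the per-frame mixing vector.
    Returns the blended vertex list for this frame.
    """
    names = list(viseme_meshes)
    n_verts = len(viseme_meshes[names[0]])
    blended = []
    for i in range(n_verts):
        # Weighted sum of the i-th vertex across all viseme meshes.
        x = sum(weights.get(n, 0.0) * viseme_meshes[n][i][0] for n in names)
        y = sum(weights.get(n, 0.0) * viseme_meshes[n][i][1] for n in names)
        z = sum(weights.get(n, 0.0) * viseme_meshes[n][i][2] for n in names)
        blended.append((x, y, z))
    return blended
```

In practice the weighted sum would run on the GPU (e.g. as blend shapes), but the arithmetic per vertex is the same.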
The LipAlike engine is tied to the Windows environment, but the animation itself can run on any other system via a network connection.
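Streaming viseme vectors to a remote animation host could use a wire format as simple as length-prefixed JSON. This is not the LipAlike protocol, only a minimal sketch; the message fields and framing are assumptions.

```python
import json
import struct


def encode_frame(timestamp_ms, weights):
    """Pack one viseme frame as a length-prefixed UTF-8 JSON message.

    timestamp_ms: frame time in milliseconds.
    weights: dict mapping viseme name -> float (the mixing vector).
    """
    payload = json.dumps({"t": timestamp_ms, "w": weights}).encode("utf-8")
    # 4-byte big-endian length prefix so the receiver can frame the stream.
    return struct.pack(">I", len(payload)) + payload


def decode_frame(data):
    """Unpack one message produced by encode_frame."""
    (length,) = struct.unpack(">I", data[:4])
    msg = json.loads(data[4:4 + length].decode("utf-8"))
    return msg["t"], msg["w"]
```

The Windows-side engine would call `encode_frame` for each output vector and write the bytes to a TCP socket; the animation host, on any platform, reads the length prefix, then the payload, and feeds the decoded weights into its renderer.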