Adobe's new Character Animator for After Effects lets you animate characters with your face. Apparently, the workflow is to create a character in Illustrator, then move over to After Effects and drive the animation with your own facial performance. The software analyzes your facial movement in real time and converts it into movement of the character. Adobe is planning to release it with the next update to the Creative Cloud version of After Effects, so that is when you can get your hands on it.
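To picture the general idea, here is a rough, purely illustrative Python sketch of what "face tracking drives a character rig" means: per-frame tracking values get mapped onto a layered puppet's parameters. None of these names or numbers come from Adobe; they are made up just to show the concept.

# Illustrative sketch only: not Adobe's implementation, just the general idea
# of mapping per-frame face-tracking values onto a 2D character rig.
from dataclasses import dataclass

@dataclass
class FaceFrame:
    # Hypothetical per-frame face-tracking output.
    mouth_open: float       # 0 = closed, 1 = fully open
    brow_raise: float       # 0 = neutral, 1 = fully raised
    head_tilt_deg: float    # head roll in degrees

@dataclass
class PuppetPose:
    # Hypothetical rig parameters for a layered character.
    jaw_rotation_deg: float
    brow_offset_px: float
    head_rotation_deg: float

def face_to_pose(frame: FaceFrame) -> PuppetPose:
    # Each tracked value is scaled into a rig parameter every frame.
    return PuppetPose(
        jaw_rotation_deg=frame.mouth_open * 25.0,   # open mouth -> rotate jaw layer
        brow_offset_px=frame.brow_raise * 12.0,     # raised brow -> lift brow layer
        head_rotation_deg=frame.head_tilt_deg * 0.8 # slightly dampened head roll
    )

pose = face_to_pose(FaceFrame(mouth_open=0.6, brow_raise=0.3, head_tilt_deg=-4.0))
print(pose)

That is all a toy version, of course; the real software presumably does far more sophisticated tracking and smoothing, but the mapping idea is the same.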
I don't really know how I feel about this software. At its current stage it doesn't seem to have much practical use beyond entertaining yourself. I am, however, interested in what the future holds for it, because of all the possibilities it could open up for animation. I'm specifically interested in the way it lip-syncs: if it were more accurate, I would definitely use it to cut down the time it takes me to lip-sync.