Whoa, this is awesome!
A demo video of facial animation created in Blender using FaceShift, a tool that captures facial expressions with a depth sensor, together with MakeHuman's new facial rig.
http://vimeo.com/97096194
MakeHuman's new facial rig system drives expression changes with bones rather than blend shapes, which is what lets the same animation be transferred to entirely different faces.
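Why does a bone-based rig transfer where blend shapes do not? A minimal sketch in plain Python (hypothetical bone names and data, not the actual MakeHuman or FaceShift API): blend shapes are per-vertex offsets tied to one specific mesh, while a bone pose is just a set of named transforms that any rig exposing the same bone names can receive.

```python
# A blend shape is a list of per-vertex offsets -- it only fits the one
# mesh it was sculpted for (same vertex count and vertex order).
smile_blendshape_for_head_a = [(0.0, 0.1, 0.0)] * 512  # hypothetical 512-vertex head

# A bone pose is just named transforms; the bone names below are
# illustrative placeholders, not MakeHuman's real naming scheme.
smile_pose = {
    "jaw":          {"rotation_deg": 8.0},
    "lip_corner.L": {"translate": (0.01, 0.0, 0.02)},
    "lip_corner.R": {"translate": (-0.01, 0.0, 0.02)},
}

def apply_pose(rig_bones, pose):
    """Copy pose transforms onto any rig that has the named bones."""
    return {bone: xf for bone, xf in pose.items() if bone in rig_bones}

# Two different characters whose rigs share the same facial bone names:
head_a = {"jaw", "lip_corner.L", "lip_corner.R", "brow.L"}
head_b = {"jaw", "lip_corner.L", "lip_corner.R"}  # different mesh, same names

print(apply_pose(head_a, smile_pose) == apply_pose(head_b, smile_pose))  # True
```

The same captured pose lands identically on both heads, whereas the blend shape would have to be re-sculpted for every new mesh.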
MakeHuman + Faceshift + Blender!
We are porting SLSI’s FaceShift script for Blender to the next version of the MakeHuman facial rig.
It was originally written by Sign Language Synthesis and Interaction group at DFKI/MMCI (Saarbrücken, Germany) to work with MakeHuman Alpha 7 and now Jonas Hauquier is modifying it in order to work with the new MakeHuman rigging (still under development).
What you see is a prototype of a script that will be part of the official MHtools Blender scripts.
It illustrates that it is very easy to make the new MakeHuman face rig compatible with FaceShift, and that it will not be very complex to integrate the MH mesh with FaceShift in tools like Maya.
It also shows the power of the new face rig in MH, and how portable it is across entirely different human models.