Filmmaker Albert Bozesan put Wan 2.2 Animate through its paces, testing its claim to transfer body and facial movements from real footage onto AI-generated images. His test: filming a performance in his office and rendering it in a 1940s film-poster illustration style.

What Works:

  • Body control - Outperforms paid competitors like Runway Act Two for complex movements and prop handling (Bozesan's test included towel manipulation and detailed gestures that translated cleanly)

  • Close-ups and medium shots - Handles detailed actions well at these framings

  • Single-camera workflow - No specialized equipment required, just standard video reference

What Doesn't:

  • Facial consistency - Characters' faces warp dramatically compared to reference images; Clark's hair color shifts between shots

  • Wide shots - Break down completely due to resolution limitations

  • Requires tool mixing - Bozesan had to patch in Kling 2.5 i2v for insert shots and InfiniteTalk for lip sync when Wan melted facial features

Bottom Line: Wan 2.2 Animate delivers on body motion transfer in ways existing tools don't—particularly for directors wanting precise gesture control in stylized animation. But it's a multi-tool workflow, not a one-stop solution. You'll need to composite around its facial and wide-shot limitations, which makes it more "powerful ingredient" than complete pipeline.
