Runway has launched Act-Two, its next-generation AI-powered motion capture model that creates character animations from only a driving performance video and a reference character image. The technology tracks full-body movement, facial expressions, and hand gestures without requiring specialized hardware or motion capture suits, marking a significant shift toward accessible animation workflows for creators at every budget level.

The company announced that Act-Two is now available to all users.

Act-Two delivers major improvements in fidelity and consistency over previous AI motion capture models while supporting diverse character types and artistic styles. The model automatically incorporates environmental motion into animations, adding naturalistic details like subtle body weight shifts that make characters feel more alive.

Beyond the Suit: Traditional mocap barriers disappear with single-video input requirements

Act-Two fundamentally changes how motion capture works by requiring only two inputs: a performance video of an actor and a reference character image. This eliminates the need for expensive motion capture studios, specialized suits with markers, or complex hardware setups that have historically limited access to high-quality animation.

The model tracks comprehensive movement data including:

  • Full body motion across all major joints and limbs

  • Head orientation and positioning in 3D space

  • Facial expressions with fine detail capture

  • Hand and finger gestures for nuanced character interaction

Unlike traditional mocap systems that require controlled studio environments, Act-Two works with standard video footage shot in typical lighting conditions. This accessibility opens professional-grade animation to independent creators, smaller studios, and projects with limited budgets.
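For creators scripting their own pipelines, the two-input workflow can be sketched as a simple upload-and-request call. The endpoint URL, parameter names, and response handling below are illustrative assumptions, not Runway's documented developer API; consult the official docs for the real interface.

```python
# Minimal sketch of a two-input Act-Two request. The endpoint URL, parameter
# names, and response shape here are illustrative assumptions, not Runway's
# documented API.
import requests

API_KEY = "YOUR_RUNWAY_API_KEY"                         # placeholder credential
ENDPOINT = "https://api.example.com/act-two/animate"    # hypothetical endpoint

payload = {
    # Driving performance: ordinary video of an actor, no suits or markers.
    "performance_video_url": "https://example.com/clips/actor_take_03.mp4",
    # Reference character: a single image in any artistic style.
    "character_image_url": "https://example.com/art/stylized_hero.png",
}

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())  # e.g. a job ID to poll for the finished animation
```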

Frame Rate Reality: Technical specifications align with industry production standards

Act-Two outputs animations at 24 fps, matching standard film and television production frame rates. The model supports multiple aspect ratios including 16:9, 9:16, 1:1, 4:3, 3:4, and 21:9, providing flexibility for various distribution platforms and creative formats.

Generated clips can run from a minimum of 3 seconds up to 30 seconds per session. The platform runs entirely in the browser, so no proprietary software installation or local processing power is required.
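A quick way to sanity-check a planned render against these constraints is a small validation helper. The function below is a hypothetical sketch that only encodes the specifications listed above (24 fps output, 3-to-30-second clips, and the supported aspect ratios); it is not part of Runway's tooling.

```python
# Hypothetical helper that checks a planned Act-Two render against the specs
# described above: 24 fps output, 3-30 second clips, supported aspect ratios.
SUPPORTED_RATIOS = {"16:9", "9:16", "1:1", "4:3", "3:4", "21:9"}
OUTPUT_FPS = 24
MIN_SECONDS, MAX_SECONDS = 3, 30

def validate_render_settings(duration_s: float, aspect_ratio: str) -> int:
    """Return the total output frame count if the settings are valid."""
    if aspect_ratio not in SUPPORTED_RATIOS:
        raise ValueError(f"Unsupported aspect ratio: {aspect_ratio}")
    if not MIN_SECONDS <= duration_s <= MAX_SECONDS:
        raise ValueError(f"Clip length must be {MIN_SECONDS}-{MAX_SECONDS} seconds")
    return int(duration_s * OUTPUT_FPS)

print(validate_render_settings(10, "16:9"))  # 240 frames at 24 fps
```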

Runway's pricing structure charges 5 credits per second of generated animation, with a minimum 3-second charge per generation. An "Explore Mode" offers unlimited generations for experimentation and iteration.
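Under that pricing, estimating the credit cost of a generation is simple arithmetic. The helper below is an illustrative sketch based only on the figures stated here: 5 credits per second with a 3-second minimum charge.

```python
# Back-of-the-envelope credit cost under the stated pricing: 5 credits per
# generated second with a 3-second minimum charge. Helper name is illustrative.
CREDITS_PER_SECOND = 5
MIN_BILLED_SECONDS = 3

def estimate_credits(duration_s: float) -> int:
    """Credits charged for a single generation of the given length."""
    billed = max(duration_s, MIN_BILLED_SECONDS)
    return int(round(billed * CREDITS_PER_SECOND))

print(estimate_credits(2))    # 15 credits -- short clips still bill 3 seconds
print(estimate_credits(10))   # 50 credits
print(estimate_credits(30))   # 150 credits for a maximum-length clip
```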

Character Versatility: AI adapts performance to any artistic style or character type

Act-Two's character compatibility extends beyond human figures to animate any character type across artistic styles. The model translates live performance to photorealistic characters, stylized animations, cartoonish figures, and non-human entities while maintaining performance fidelity.

Runway's announcement on X showcases the model's ability to work with diverse character designs and environments without compromising motion quality. This flexibility allows creators to experiment with different artistic directions using the same performance capture.

When a character image is provided, Act-Two gives creators gesture control: specific hand and body movements can be defined directly through the driving performance video. The model also automatically generates contextual environmental motion, adding realistic details like background interaction and natural body language.

Workflow Integration: Rapid prototyping capabilities transform pre-production processes

Act-Two's speed enables new approaches to animation workflows, particularly in pre-production and previsualization. Directors can quickly translate live-action test shoots into digital character animations, allowing for rapid iteration on character performance and scene blocking.

The technology proves especially valuable for:

  • Storyboard animation where static boards can be brought to life with character movement

  • Game development for quick NPC behavior testing and player character animation

  • Commercial production enabling fast turnaround on spokesperson avatars and product demonstrations

  • Virtual production integration for real-time character animation in digital environments

Best results require well-lit performance videos with clear visibility of movements and minimal background obstructions. Character reference images should clearly show body positioning and hands when gesture control is needed.

The Final Cut: Democratized animation tools reshape industry economics and creative access

Act-Two represents a fundamental shift in animation accessibility, removing technical and financial barriers that have traditionally separated professional studios from independent creators. By eliminating expensive hardware requirements and complex technical workflows, the technology enables a broader range of creators to produce high-quality character animation.

The model's impact extends beyond individual creators to reshape industry economics. Smaller studios can now compete with larger facilities in animation quality, while established studios can allocate resources toward creative development rather than technical infrastructure.

Act-Two will likely accelerate experimentation in character animation across media formats, from social media content to feature film production. The technology's ability to work with any character type and artistic style suggests we'll see increased diversity in animated content as creation barriers continue to fall.
