Eyeline Studios, a Netflix company, has introduced Go-with-the-Flow, a new technique for controlling motion patterns in video diffusion models.
This innovation allows users to manipulate camera and object movements within a scene and even transfer motion patterns between videos, offering enhanced creative control in video production.
wow..
Netflix just dropped an AI video generator, it lets you animate your objects with precise KEYFRAMEs, it's crazy..
this is a first in the AI world
here's how it works:
— el.cine (@EHuanglu)
10:37 AM • Jan 22, 2025
Go-with-the-Flow works with both image-to-video and text-to-video models, and it can derive 3D scenes from motion information alone.
The method enables various motion control types, including cut-and-drag animations and first-frame editing.
The technique fine-tunes a base video diffusion model using warped noise instead of pure i.i.d. Gaussian noise, at the same computational cost as standard training.
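The core idea can be sketched in plain NumPy: instead of sampling fresh i.i.d. noise for every frame, the previous frame's noise is carried along an optical-flow field so that the noise itself encodes the motion. The paper uses a dedicated noise-warping algorithm that preserves the Gaussian distribution; the nearest-neighbor warp and the `warp_noise` helper below are only an illustrative stand-in, not the project's API.

```python
import numpy as np

def warp_noise(noise, flow):
    """Warp a noise field along an optical-flow field (nearest-neighbor).

    noise: (H, W) array of Gaussian noise for the previous frame.
    flow:  (H, W, 2) array of per-pixel (dx, dy) displacements.
    Returns noise for the next frame, moved along the flow.
    """
    H, W = noise.shape
    ys, xs = np.mgrid[0:H, 0:W]
    # For each target pixel, look up where it came from (backward warp).
    src_y = np.clip(np.round(ys - flow[..., 1]).astype(int), 0, H - 1)
    src_x = np.clip(np.round(xs - flow[..., 0]).astype(int), 0, W - 1)
    return noise[src_y, src_x]

rng = np.random.default_rng(0)
base = rng.standard_normal((64, 64))        # frame-0 noise
flow = np.zeros((64, 64, 2))
flow[..., 0] = 3.0                          # uniform 3-pixel shift right
warped = warp_noise(base, flow)             # frame-1 noise, carrying the motion
```

Note that nearest-neighbor resampling alone does not keep the warped field exactly i.i.d. Gaussian, which is why the paper needs its distribution-preserving warping algorithm.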
Users can adjust motion control strength through "noise degradation" at inference time.
It can transfer motion patterns between videos, including 3D-rendered turntable camera motions and DAVIS dataset motions.
The technique allows for advanced camera control applications, such as creating coherent 3D scenes from a single image using monocular depth estimation.
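For the single-image case, a monocular depth map gives each pixel a displacement under a virtual camera move, and that displacement field can then drive the noise warping. A minimal pinhole-parallax sketch, assuming pure lateral translation with no rotation (`parallax_flow` is a hypothetical helper, not from the paper):

```python
import numpy as np

def parallax_flow(depth, cam_shift_x, focal=50.0):
    """Horizontal optical flow induced by a sideways camera translation.

    For a pinhole camera, a lateral shift t yields image motion
    f * t / Z at depth Z: near pixels move more than far ones.
    """
    flow = np.zeros(depth.shape + (2,))
    flow[..., 0] = focal * cam_shift_x / depth
    return flow

depth = np.array([[1.0, 2.0], [4.0, 8.0]])   # toy monocular depth map
flow = parallax_flow(depth, cam_shift_x=0.1)  # small camera move right
```

Feeding a flow field like this into the noise-warping step is one way a single still image could yield a coherent camera move through the scene.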
Go-with-the-Flow improves temporal consistency in image-to-image translation tasks like relighting and super-resolution, without requiring additional training.
Netflix's Go-with-the-Flow represents a significant advancement in video generation and editing techniques. As the technology continues to develop and integrate with existing tools, we can expect to see more sophisticated and user-friendly options for video manipulation and creation in the near future, potentially transforming the landscape of digital content production.