Runway releases an impressive new video-generating AI model

https://techcrunch.com/feed/
Summary

AI startup Runway on Monday released what it claims is one of the highest-fidelity AI-powered video generators yet. Called Gen-4, the model is rolling out to the company’s individual and enterprise customers. Runway claims that it can generate consistent characters, locations, and objects across scenes, maintain “coherent world environments,” and regenerate elements from different perspectives and positions within scenes.

“Gen-4 can utilize visual references, combined with instructions, to create new images and videos utilizing consistent styles, subjects, locations, and more,” Runway wrote in a blog post, “[a]ll without the need for fine-tuning or additional training.”

Runway, which is backed by investors including Salesforce, Google, and Nvidia, offers a suite of AI video tools, including video-generating models like Gen-4. It faces stiff competition in the video generation space, including from OpenAI and Google. But the company has fought to differentiate itself, inking a deal with a major Hollywood studio and earmarking millions of dollars to fund films using AI-generated video.

Runway says that Gen-4 allows users to generate consistent characters across lighting conditions using a reference image of those characters. To craft a scene, users can provide images of subjects and describe the composition of the shot they want to generate.

“Gen-4 excels in its ability to generate highly dynamic videos with realistic motion as well as subject, object, and style consistency with superior prompt adherence and best-in-class world understanding,” the company claims in its blog post. “Runway Gen-4 [also] represents a significant milestone in the ability of visual generative models to simulate real-world physics.”

Gen-4, like all video-generating models, was trained on a vast number of example videos, “learning” the patterns in them in order to generate synthetic footage.
Runway refuses to say where the training data came from, partly out of fear of sacrificing competiti...

First seen: 2025-03-31 15:42

Last seen: 2025-04-01 11:46