March 21, 2025

Runway, an AI startup that helped develop the AI image generator Stable Diffusion, launched its first mobile app yesterday to give users access to Gen-1, its video-to-video generative AI model. The app is currently only available on iOS devices.

With the new app, users can record a video from their phones and generate an AI video in minutes. They can also transform any existing video in their library by using text prompts, images or style presets.

Users can choose from Runway’s list of presets, like “Cloudscape,” or transform their video to look like it’s a claymation, charcoal sketch, watercolor painting, paper origami and more. You can also upload an image or type an idea into the text box.

The app will then generate four previews for you to choose from. Once you select the one you like the most, it takes a few minutes to produce the final product. We tested the app ourselves and found that generation took about 60 seconds or longer, and sometimes as long as two minutes.

Naturally, as with any AI generator, the results aren’t perfect and often look distorted or weird. In general, the concept of AI video generators may seem silly and maybe even gimmicky. But it’s easy to see just how cool it could become as the tech evolves over time.

Meta and Google have both launched text-to-video generators, called Make-A-Video and Imagen Video, respectively.

Regardless, we found that Runway’s mobile app was easy to use and overall fun to play around with.

Below is one example we came up with, using a clip of Michael Scott from “The Office.” The text prompt we entered was “realistic puppet.”

(Warning: the result is terrifying.)

We also tried “3D animation,” which turned out all right.

Granted, there are a few other caveats aside from glitches and warped faces.

On the free version, there’s a limit of 525 credits, and users can only upload videos that are five seconds long. Each second of video uses 14 credits.
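To get a rough sense of what those limits mean in practice, here’s a quick back-of-the-envelope calculation based only on the figures above (a sketch; actual billing and rounding may differ):

# Rough math for Runway's free tier, using the figures quoted above.
FREE_CREDITS = 525        # total credits on the free version
CREDITS_PER_SECOND = 14   # cost per second of generated video
MAX_CLIP_SECONDS = 5      # clip-length limit on the free tier

credits_per_clip = CREDITS_PER_SECOND * MAX_CLIP_SECONDS  # 70 credits per full-length clip
full_clips = FREE_CREDITS // credits_per_clip             # 7 full five-second videos
total_seconds = FREE_CREDITS / CREDITS_PER_SECOND         # 37.5 seconds of output overall

print(f"~{full_clips} full-length clips, or about {total_seconds:.1f} seconds of video in total")

In other words, the free allotment works out to roughly seven maximum-length clips before you run out of credits.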

Eventually, Runway plans to add support for longer videos, co-founder and CEO Cristóbal Valenzuela told TechCrunch. The app will continue to improve and launch new features, he added.

“We’re focused on improving efficiency, quality and control. In the coming weeks and months, you’ll see all kinds of updates, from longer outputs to higher-quality videos,” Valenzuela said.

Also, note that the app doesn’t generate nudity or copyright-protected work, so you can’t create videos that mimic the style of popular IP.

Runway’s new mobile app has two premium plans: Standard ($143.99/year) and Pro ($344.99/year). The Standard plan gives you 625 credits/month and other premium features like 1080p video, unlimited projects and more. The Pro plan offers 2,250 credits/month and all of Runway’s 30+ AI tools.
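Assuming the paid plans use the same 14-credits-per-second rate described for the free tier (the article doesn’t state a separate rate, so treat this as an assumption), the monthly allotments translate roughly as follows:

# Approximate monthly output per paid plan, assuming the same
# 14-credits-per-second rate quoted for the free tier (an assumption).
CREDITS_PER_SECOND = 14
plans = {"Standard": 625, "Pro": 2250}  # credits per month

for name, monthly_credits in plans.items():
    seconds = monthly_credits / CREDITS_PER_SECOND
    print(f"{name}: {monthly_credits} credits -> ~{seconds:.0f} seconds of generated video per month")
# Standard: ~45 seconds/month, Pro: ~161 seconds/month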

A month after Gen-1 launched in February, Runway rolled out its Gen-2 model. Arguably a step up from text-to-image models Stable Diffusion and DALL-E, Gen-2 is a text-to-video generator, so users will be able to generate videos from scratch.

Runway has slowly begun rolling out access to its closed beta for Gen-2, Valenzuela told us.

The app currently supports the Gen-1 model; however, Gen-2 will soon become available, along with Runway’s other AI tools, such as its image-to-image generator.

Runway has been developing AI-powered video-editing software since it launched in 2018. The company offers a variety of tools within its web-based video editor, such as frame interpolation, background removal, blur effects, a feature that cleans up or removes audio, and motion tracking, among many others.

The tools have helped content creators and even film/TV studios cut down the time spent editing and creating videos.

For instance, the visual effects team behind “Everything Everywhere All at Once” used Runway’s tech to help create the scene in the movie where Evelyn (Michelle Yeoh) and Joy (Stephanie Hsu) are in a universe where they’ve been turned into moving rocks.

Plus, the graphics team behind CBS’ “The Late Show with Stephen Colbert” used Runway to trim hours of editing down to just five minutes, according to art director Andro Buneta.

Runway also operates Runway Studios, its entertainment and production division.