
Using Runway Act: One with Blender 3D

  • rich-66
  • Mar 18
  • 2 min read

Runway has been moving through iterations and updates at speed, and although more and more AI video generation tools are appearing, it still remains the most popular. With text prompts, lip syncing, audio creation and now Act: One, it is expanding its utility for video production and creativity. It's proving to be a one-stop shop for assets, but like all AI tools it shouldn't necessarily be used in isolation, rather as one part of a wider process.


What is Act: One?

Act: One generates character animations by mapping the performance in a source video onto an image reference. Of course, there are requirements for this, limiting the user to close-up shots of faces and clear video recordings of the speaker. It can't handle wide shots for now, but what it can achieve is something very special. This isn't just a simple filter: it analyses the character image and maps the actor's facial movements onto it, resulting in something comparable to motion capture. This provides a solution for creators with less technical ability who want to animate dialogue. With a few clicks you're ready to go…just make sure you have the credits! Runway works on a credit system, where you can top up 1,000 credits for £12. Each video generation costs around 200 credits, so a top-up covers roughly five generations, and if you're a hobbyist it might be best to keep the videos a little shorter.


Bringing in Blender 3D

Runway has its own character references you can choose from its gallery, but if you're creating an animation with your own modelled and rigged character, you can easily set this up. Simply render out a frame of the character and import it. You could even use Runway to generate background animation if you want to add a bit more life to your scene.
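If you're comfortable with a little scripting, that render step can be automated with Blender's Python API (bpy). The following is only a minimal sketch, assuming a scene that already contains your rigged character and a camera framing the face; the frame number, resolution and output path are placeholders to adjust for your own project.

import bpy

scene = bpy.context.scene

# Jump to a frame where the character's face is clearly framed by the camera
scene.frame_set(1)

# Write a single still frame as a PNG to use as the Act: One image reference
scene.render.image_settings.file_format = 'PNG'
scene.render.resolution_x = 1080
scene.render.resolution_y = 1080
scene.render.filepath = "//act_one_reference.png"  # path relative to the .blend file

bpy.ops.render.render(write_still=True)

Run it from Blender's Scripting workspace and the resulting PNG is what you'd upload to Runway as the character reference.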


This method provides a great solution for animators who don't want the hassle of rigging and lip syncing in Blender 3D. Simply render out your shot, record your video and dialogue, then generate. You'd just need to figure out a way to transition from wider shots to close-up shots in your editing.
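Those wider shots would still be rendered normally from Blender. As a rough sketch, assuming a 120-frame shot and an output folder of your choosing, a scripted animation render could look something like this:

import bpy

scene = bpy.context.scene

# Frame range for the wider shot (placeholder values for your own timing)
scene.frame_start = 1
scene.frame_end = 120

# Render the shot as an MP4, ready to cut against the Act: One close-ups
scene.render.image_settings.file_format = 'FFMPEG'
scene.render.ffmpeg.format = 'MPEG4'
scene.render.filepath = "//renders/wide_shot.mp4"

bpy.ops.render.render(animation=True)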


Is it perfect? Not yet. Is it usable, though? Absolutely. And let's not forget the amount of time you can save using this method, which ultimately translates to lower costs for both you and the client!

 
 
 
