How to use Runway's Motion Brush

COMMENTS • 55

  • @aivideoschool
    @aivideoschool  6 months ago +2

    Here's a tutorial on the newer Runway Multi Motion Brush: ua-cam.com/video/P41bfQrsgDY/v-deo.html

  • @k29utube36
    @k29utube36 8 months ago +7

    Proximity is used to move in the Z-axis direction, toward the viewer or away from the viewer.

  • @KoolKenny765
    @KoolKenny765 5 months ago

    Oh, awesome man, thanks for showing me how to brush. I can't wait to make a short film

    • @aivideoschool
      @aivideoschool  5 months ago

      Awesome, I'm glad it was helpful! Good luck with your short film

  • @LL-vd3qi
    @LL-vd3qi 7 months ago +1

    Thank you for taking us along on your testing. Picked up some tips. 👍

  • @thelthrythquezada8397
    @thelthrythquezada8397 10 months ago +3

    Bah-rooooooooo!!!! I knew it, I kneeeeeew that I was going to LOVE the future (speaking as a 10-year-old back in the early 90s). Well, there are things about the modern age, but this is cool.

  • @birdsongs482
    @birdsongs482 10 months ago +4

    I began using it a few days ago, and I often just want little movements like the fire animation in the fireplace, or candle flames, and the result is usually hit or miss. And sometimes, it adds ships in the water on its own. It works great on clouds though.
    Anyway, thanks for the video!

    • @aivideoschool
      @aivideoschool  10 months ago +2

      I had the same issue with a fireplace. I wish there was a brush that was like a negative prompt ("don't add motion to this area").

    • @georgew2014
      @georgew2014 10 months ago +4

      I add things to my prompts that act like negative prompts. For example, "Clouds moving left to right, while the moon is static." The bit about the moon is the pseudo-negative prompt that reduces potential morphing. I paint the direction of the cloud motion over some of the clouds AND adjust the motion sliders. The length of the brush strokes, combined with sliders, can create nice slo-mo effects.

    • @birdsongs482
      @birdsongs482 10 months ago

      @@georgew2014 Thank you, I appreciate it

    • @Silent_Owl91
      @Silent_Owl91 9 months ago

      I want to use this to animate movement in a fireplace. Does it work well for that? Or is there an alternative for animating a fireplace, please? @@aivideoschool

    • @TimothyDark
      @TimothyDark 8 months ago

      @@georgew2014 This tip is awesome. Thank you!

  • @TheBlessingReport
    @TheBlessingReport 9 months ago +1

    Can you make a list, in order from best to worst, of Moonvalley, Runway ML Gen-2, Pika Labs 1.0, and Google text-to-video? I don't need a full video; a comment is fine because I'm trying to use image prompts.

    • @aivideoschool
      @aivideoschool  9 months ago +1

      For image to video, Runway and Pika 1.0 are the best ones I've used in terms of staying close to the original image and being able to inpaint/outpaint the generated video. Which one is better seems more like a preference in how you like to work. If you prefer classic video editing tools, then Runway. If you prefer something that feels closer to editing a social post, then Pika 1.0. Either way, you'll have to constantly play around with motion settings and prompts to get the result you want, but that's part of the fun imo.

  • @RoofAndAMeal4UsAll
    @RoofAndAMeal4UsAll 10 months ago +5

    If you paint only one side of a head and then use the horizontal control to move the head a tiny bit, the back of the head might not move. It's tricky

    • @aivideoschool
      @aivideoschool  10 months ago

      I wish I had tried this in the demo, it seems like the right approach

    • @ronilevarez901
      @ronilevarez901 4 months ago +1

      Everything about image animation is "tricky".
      So far I haven't been able to make any decent-looking video no matter what I try or what website I use, much less get the exact movement I need.
      I feel like giving up on AI video generation.

    • @RoofAndAMeal4UsAll
      @RoofAndAMeal4UsAll 4 months ago

      @@ronilevarez901 I hear that. One has to tailor the storytelling to whatever outputs the generator provides, and it is super hard to write like that. The other problem is the artifacts in everything: there's AI artifacting in what were once reputable shows, and it cheapens them to forego editing out artifacts where possible. This is why I do what I do; it's something that AI is passable at.

    • @ronilevarez901
      @ronilevarez901 4 months ago +2

      @@RoofAndAMeal4UsAll Well, I can write from anything put in front of me and steer the story anywhere necessary; that wouldn't be a problem. The thing is, I don't want to do that anymore. I want to tell the story I already have in my mind/on paper, so I need to make the exact animation I need.
      Sometimes I wonder if these tools weren't created to show us how much professional animators are needed! XD

    • @RoofAndAMeal4UsAll
      @RoofAndAMeal4UsAll 4 months ago +1

      @@ronilevarez901 I understand. With the short Sasquatch film I made, I wanted to do much more than the tech would provide. Runway's lip sync doesn't work on animated characters, as I'm sure you know, which makes it about useless. ua-cam.com/video/rQ9r3dAqiOc/v-deo.html. ua-cam.com/video/_hjXcRiQcHA/v-deo.html. I think we have to persevere until the tools mature enough, if we want to upset the status quo and move up from fine artist to storyteller/producer. Keep pushing forward!

  • @michelhildebrand9559
    @michelhildebrand9559 9 months ago +1

    Great video. Which video editing software do you use to get better animated flames in the fireplace behind the dragon? Can you make a video on animating flames in a fireplace?

    • @aivideoschool
      @aivideoschool  9 months ago

      I usually use CapCut but I used Premiere for the dragon fireplace video because it was easier to mask. I generated 5 or so versions then did a reverse mask so the animation from the fireplace played over the still image.

  • @TheBlessingReport
    @TheBlessingReport 8 months ago +1

    Can we get a video on Topaz video upscaler/AI enhancer alternatives, covering pricing and free options?

    • @aivideoschool
      @aivideoschool  8 months ago +1

      I'm not totally sold on video upscalers but CapCut has one that does 2x that's free. I want to do a video on all the AI magic tools in CapCut because I think they've surpassed Adobe Premiere in that regard.

  • @SebAnt
    @SebAnt 8 months ago +1

    Excellent inspiration!

  • @mrbabyhugh
    @mrbabyhugh 6 months ago

    8:45 The AI will still read the entire image and respond, which is great. Note: with the girl on the train, it made her eyes wink at the sun rays. And here, it also animated everything it identified as "water". I think the brush is mainly for controlling direction and fine detail, like the dragon and the fireplace, but it still tried to adjust the room lighting to the flames.

  • @sophiesflyingunicorn8197
    @sophiesflyingunicorn8197 8 months ago +1

    Do you recommend using the paid version of Runway, or is the free version fine?
    I'm a newbie but I find this so interesting.

    • @aivideoschool
      @aivideoschool  8 months ago

      I'd recommend trying all the free ones you can, while they're still free. Discord has a lot of servers with free AI generation ability. Runway and Pika are both making great tools on their paid plans.

  • @donaldbough3445
    @donaldbough3445 10 months ago +2

    Cool to see the examples, but the Runway tutorial quickly shows proximity is z-axis in and out. Not what this video says. Still helpful to see someone play around with it!

    • @aivideoschool
      @aivideoschool  10 months ago +1

      I was wondering how I missed that and then saw Runway updated their tutorial video (and interface) since I recorded the demo. The X/Y/Z axis labels help out a lot, thanks for pointing that out.

  • @ChariesTales
    @ChariesTales 5 months ago

    Is the free version safe for commercial use?

    • @aivideoschool
      @aivideoschool  5 months ago

      According to their website it is: help.runwayml.com/hc/en-us/articles/18927776141715-Usage-rights
      "As a user on any of our plans (Free, Standard, Pro, or Unlimited) you retain complete commercial rights to all content generated or edited using Runway. You're 100% free to use any content you create using Runway both commercially or non-commercially, and all copyright for your creations and generations is held by you."

    • @ChariesTales
      @ChariesTales 5 months ago

      @@aivideoschool Thank you

  • @Trigun892
    @Trigun892 5 months ago

    How did you get the dragon not to move at all in the last one? Even if I don't paint over something, it will still assume it's moving.

    • @aivideoschool
      @aivideoschool  5 months ago +1

      It might have been luck, but I painted the fire in the fireplace and used the prompt: "the fire crackles in the fireplace." Maybe the combination of prompt and brush focuses the motion to just those elements??

  • @gayshawndayleequeef1946
    @gayshawndayleequeef1946 10 months ago +2

    Does it eat up more credits applying it to generated videos?

    • @aivideoschool
      @aivideoschool  10 months ago

      Whoa, I didn't realize you could use text-to-video image previews. Looks like it uses the same credits as a normal Gen-2 four second video though. ua-cam.com/users/shortsHF9Rjnl_zFw?feature=share

  • @MoonHutMusic
    @MoonHutMusic 1 month ago

    Has anyone figured out how to loop a video so the motion brush ends up in the same spot for extended videos?

  • @sergiolopes5620
    @sergiolopes5620 7 months ago +1

    Isn't proximity how close or far something appears? That would make more sense, no?

  • @EndoSkull
    @EndoSkull 10 months ago +2

    To me it looks like you were painting too much. For example, when the lady turns to the window, shouldn't you have painted only her head/hair? And with the sword, should you have painted only around the blade?

    • @aivideoschool
      @aivideoschool  10 months ago +1

      Yeah, I definitely overdid the brush on the subway woman. The sword seems to have worked great until I added the camera motion, but if I were to do it again, I would add a text prompt describing the sword effect more.

  • @gayshawndayleequeef1946
    @gayshawndayleequeef1946 10 months ago +1

    What system and browser are you using? My version doesn't look anything like that on windows chromium.

    • @aivideoschool
      @aivideoschool  10 months ago

      I'm on a MacBook and use Chrome for Runway. (Runway seems to work better on Chrome than Safari for Macs.)

  • @Niko-sf1sw
    @Niko-sf1sw 9 months ago

    I can't see the brush... I click on Motion Brush but my cursor isn't a purple dot... someone help please

    • @addytv18
      @addytv18 3 months ago

      Throw your PC out the window. That might work.

  • @carkawalakhatulistiwa
    @carkawalakhatulistiwa 10 months ago +1

    9 months since Gen-1 AI, and we can make a good video trailer. I hope next year the video generators can go up to 30 seconds.

    • @aivideoschool
      @aivideoschool  10 months ago

      I can't believe it hasn't even been one year

  • @antithesisOhfBeing
    @antithesisOhfBeing 9 months ago

    If you paint his arm, it will move as well.

  • @brytonkalyi277
    @brytonkalyi277 10 months ago +1

    °•°• I believe we are meant to be like Jesus in our hearts and not in our flesh. But be careful of AI, for it is just our flesh and that is it. It knows only things of the flesh (our fleshly desires) and cannot comprehend things of the spirit such as peace of heart (which comes from obeying God's Word). Whereas we are a spirit and we have a soul but live in the body (in the flesh). When you go to bed it is your flesh that sleeps, but your spirit never sleeps (otherwise you have died physically); that is why you have dreams. More so, true love that endures and lasts is a thing of the heart (when I say 'heart', I mean 'spirit'). But fake love, pretentious love, love with expectations, love for classic reasons, love for material reasons and love for selfish reasons is a thing of our flesh. In the beginning God said let us make man in our own image, according to our likeness. Take note, God is Spirit and God is Love. As Love He is the source of it. We also know that God is Omnipotent, for He creates out of nothing and He has no beginning and has no end. That means our love is but a shadow of God's Love. True love looks around to see who is in need of your help, your smile, your possessions, your money, your strength, your quality time. Love forgives and forgets. Love wants for others what it wants for itself. Take note, true love works in conjunction with other spiritual forces such as patience and faith (in the finished work of our Lord and Savior, Jesus Christ, rather than in what man has done such as science, technology and organizations which won't last forever). To avoid sin and error, which leads to the death of our body and also our spirit in hell fire, we should let the Word of God be the standard of our lives, not AI. If not, God will let us face AI on our own and it will cast the truth down to the ground, it will be the cause of so much destruction like never seen before, it will deceive many and take many captive in order to enslave them into worshipping it and abiding in lawlessness. We can only destroy ourselves, but with God all things are possible. God knows us better because He is our Creator and He knows our beginning and our end. Our proof texts are taken from the book of John 5:31-44, 2 Thessalonians 2:1-12, Daniel 7-9, Revelation 13-15, Matthew 24-25 and Luke 21. Let us watch and pray... God bless you as you share this message with others.