Cinematic AI Camera Movement Test - LUMA 1.6

  • Published 29 Nov 2024

COMMENTS • 12

  • @tstone9151 • 1 month ago • +2

    Using a combination of Luma and Kling seems to yield the best results. Kling has the best end-frame tool, but Luma is better at camera movement. So it's best to get the end frame from Luma and use Kling to interpolate between the start and end frame, since the quality and animation are better. Hopefully Kling's next update will provide an easier workflow.
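
    A minimal sketch of the Luma-then-Kling hand-off described above, assuming frames are exported manually or through your own clients; generate_end_frame_with_luma and interpolate_with_kling are hypothetical placeholders, not real Luma or Kling APIs.

```python
from pathlib import Path


def generate_end_frame_with_luma(start_frame: Path, camera_prompt: str) -> Path:
    """Step 1 (hypothetical): use Luma's stronger camera movement to animate the
    start frame, then export the clip's final frame to use as the end frame."""
    raise NotImplementedError("replace with your own Luma workflow or client")


def interpolate_with_kling(start_frame: Path, end_frame: Path, motion_prompt: str) -> Path:
    """Step 2 (hypothetical): feed both frames to Kling's start/end-frame tool,
    which the commenter finds gives better quality and animation in between."""
    raise NotImplementedError("replace with your own Kling workflow or client")


def luma_then_kling(start_frame: Path, camera_prompt: str, motion_prompt: str) -> Path:
    # End frame comes from Luma (better camera movement)...
    end_frame = generate_end_frame_with_luma(start_frame, camera_prompt)
    # ...and Kling interpolates between start and end (better quality and animation).
    return interpolate_with_kling(start_frame, end_frame, motion_prompt)
```

    The point is only the ordering: Luma owns the end frame, Kling owns the in-between motion.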

  • @alteredalley • 1 month ago • +1

    Luma 👏👏👏👏👏👏❤️❤️❤️

  • @TheBann90 • 1 month ago

    "Keep distance static" or 'static camera distance' works well with Kling. And it sometimes understands clockwise, but not counterclockwise 😅
    Zooming in or out at the right place on the other hand is trickier

  • @HistoryViper • 1 month ago

    Luma is great for transforming shots; Kling isn't very good at that, but is much better at everything else. Great job again! For video generators, I came up with COALS (see the sketch after this thread):
    Camera: camera movement (FPV, over the shoulder, etc.)
    Object: the subject of the video.
    Action: what the subject/object is doing.
    Location: time (1500 AD/CE), the place (city, bar, bedroom, etc.) and background (including lighting).
    Style: 50mm lens, neon, silhouette, etc. Lighting can also be added here (gel lighting), as well as any other styles (sepia, Pixar, etc.).
    It doesn't work all the time, and definitely doesn't work with Runway, but for Kling, Minimax and Luma it works great.

    • @armondtanz • 1 month ago

      I get the worst camera movement in Kling.
      I've just moved over to Kling from Luma.
      Luma every now and then would give you a shot Kubrick would be proud of, real fancy and energetic. I get very little of that in Kling :(

    • @HistoryViper • 1 month ago

      Hmm. I use the RunwayML help section as a guide for the camera motions. Sometimes I have a hard time describing the shot I want. A lot of the time I will also use either Leonardo or MJ for the beginning frames... With Kling you have to prompt the style; I've gotten several shots of anime characters morphing into a realistic style when I don't specify one.
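
The COALS checklist above is just an ordering convention for prompt text; here is a minimal Python sketch of assembling such a prompt (field names and example values are illustrative, not from the video).

```python
from dataclasses import dataclass


@dataclass
class CoalsPrompt:
    """COALS fields as described in the comment above.
    'object' is a Python builtin name, so the field is called 'obj'."""
    camera: str    # Camera: camera movement (FPV, over the shoulder, ...)
    obj: str       # Object: the subject of the video
    action: str    # Action: what the subject is doing
    location: str  # Location: time, place and background, including lighting
    style: str     # Style: lens, look, extra lighting, rendering style

    def to_prompt(self) -> str:
        # Join the fields in COALS order into a single text-to-video prompt.
        return ", ".join([self.camera, self.obj, self.action, self.location, self.style])


# Example usage (illustrative values only):
print(CoalsPrompt(
    camera="slow FPV push-in",
    obj="a stylish men's dress shoe",
    action="pressing the gas pedal of a sports car",
    location="night, neon-lit city street, rainy background",
    style="50mm lens, shallow depth of field, gel lighting",
).to_prompt())
```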

  • @GuyGtmn • 1 month ago

    Hey, there are actually more movements in the dropdown menu. I guess you have to use the middle mouse button to scroll down to see them.

    • @armondtanz • 1 month ago

      Ha. Took me a while to notice that. 😅

  • @TheinterfaceTvSeries • 1 month ago

    I have a test for you and any other AI filmmakers to try: “an extreme closeup of a stylish men’s dress shoe pressing the gas pedal of a sports car”, or some iteration of that. I could not get this simple insert shot to work on any platform. I eventually had to find stock footage and run it through Gen-3’s video-to-video to get it to work. You can see this on my channel in the video titled “The Gauntlet Episode 1.” That shot was my greatest AI video challenge to date. A failed one.

    • @armondtanz • 1 month ago • +1

      I'd generate it using Ideogram, Flux or Mystic.
      They're really good at prompt understanding.