Haydn Rushworth: Making a Watchable AI Feature Film (How I Generate It - Episode 4)
- Published Aug 1, 2024
- Haydn Rushworth is on a quest to turn a screenplay he's been working on for 10 years into a watchable movie using generative AI. Documenting his filmmaking journey on his UA-cam channel, Haydn highlights different tools and storytelling techniques while being very realistic about the current state of AI and its drawbacks. Haydn's dedication to his craft and his realistic approach to the evolving AI technology make this a conversation you don't want to miss!
Don't forget to subscribe for more conversations with AI creators like this one!
🔗 Subscribe to Haydn Rushworth @HaydnRushworth-Filmmaker
🍿 Watch Haydn's journey begin!
• Making my Movie with A...
✅ CHAPTERS
00:00 Intro
01:18 The current reality of making an AI feature film
08:14 AI is "Artificial Head, Not Heart"
10:10 Creating emotion in AI characters
11:34 Haydn talks about LTX Studio impressions
15:20 Tip for creating emotion with Midjourney /describe
19:09 Film & Video Game industry Venn diagram
24:19 Character consistency vs Continuity
26:17 Machinima and Plotagon (Mike goes on a tangent)
31:23 Hypothetical question: Could Haydn make his dream movie with today's AI technology?
38:31 Haydn Rushworth talks about his screenplay "Telepathy Sucks"
Fun Fact: While this interview was being edited, Luma Dream Machine and Runway Gen-3 Alpha came out.
*************
Here is the (unpaid!) promotional information for Haydn's eBook version of the screenplay discussed in this episode.
If you feel like charming your way into the hearts of the women in your life (wife, GF, mother, sisters, lovely cat lady next door who watches your house for you while you're away...)
Consider buying the eBook of my screenplay, "Telepathy SUCKS!"
I spent more than a decade laughing and crying (literally) whilst I wrote the screenplay, and it's the reason this entire channel exists.
Amazon.com
www.amazon.com/dp/B0D7QN5N6W
Amazon.co.uk
www.amazon.co.uk/dp/B0D7QN5N6W
or, you could really go to town and buy the paperback version:
Amazon.com
www.amazon.com/dp/B0D7SMR8JQ?...
Amazon.co.uk
www.amazon.co.uk/dp/B0D7SMR8J...
*************
#AI #Filmmaking #AIStorytelling #HaydnRushworth #HowIGenerateIt #aivideocreation #AIFilm #AIArt #Screenwriting #AIJourney #Storytelling #CreativeProcess #aiinterview #aipodcast #podcast #futureoffilm
FANTASTIC interview. And, btw, when it comes to continuity, I just imagine that I have a constantly hungover script supervisor that's the producer's daughter and therefore unfireable 😅. Looking forward to seeing your films Haydn! ✨️
😆😂😆 That's the best way to think of AI tools that I've heard yet. Thanks very much, @TashaCaufield
I can't wait to see what Haydn does with Runway Gen-3 Alpha. In the demo video where the wig falls on the bald guy, he goes through three stages of emotion (sad, surprised, happy). This capability literally didn't exist one week ago.
Genuinely excited to work with it... that's if it doesn't turn out to be another endlessly-teasing Sora 😆
3:15 and onward is so true. The magic of these technologies is that ideas the Hollywood execs and investors see as "not worthy" can finally get made. To me it's like what manga is to anime: it can help prove ideas and share them with niche audiences. And if an exec wants to pick one up, then it's a win-win. But yes, the idea is to go niche.
Yes, totally. Plus, there are so many people in those niches who are starving for content because the big players don't want to "niche down" too much.
I've just finished editing a video on that very subject. Some great comments from George Lucas about the lack of imagination in studio executives and how he only got Star Wars made because he hired a concept artist to illustrate the script as he was writing it. Sounds like a job for AI to me :-)
@@HaydnRushworth-Filmmaker can't wait to see it! Thanks for the follow as well 😉
Really great interview!!
Thanks very much 🙂
Awesome conversation. I've thought about using the /describe function for reverse engineering. I've been resorting to custom GPTs to help me build prompts based on images I want to recreate. Midjourney is definitely its own language.
That reverse engineering trick is great for picking up words when you're not sure what to call a certain style or texture.
It really is its own language. If you have any luck getting consistent emotions in photo-realistic images (other than smiling or happy), I'd love to know your secret. 🙂
I cheered for everything Haydn said. I'm working on my 10-year-old script as well and really related to his comments. I especially liked what you said about continuity vis-à-vis character consistency. Can this family's kitchen stay the same from scene to scene (with minor changes)? I keep having to tweak the script to match the technology, which is just wrong! As my grandma says, "From your mouth to God's ear." I wish LTX would hire you guys as consultants.
I've taken an approach that's kind of like the old days of 2D animation. Basically: create a stationary background, create the characters separately, then compose the scenes in Canva and animate them with your favorite video generation tool, like Runway ML. It's very hit or miss, like you said. I'm just happy hearing about all these stories being unlocked thanks to the technology.
I'm so glad the interview was helpful and I loved reading your comment. Sometimes it's nice to hear others are experimenting with trying to solve the same kinds of problems with this constantly evolving tech.
Heyyyyyyy, thanks very much @suzannecarter445. Mike would definitely be my first choice as a consultant to any of the generative AI companies 🙂