To clarify, this generates new video (up to 2s) and ambience audio like roomtone (up to 10 seconds). It is not a frame hold, which is a freeze frame of the last frame --- not video. Gen Extend aims to reproduce the frame rate and cadence of the video so you can keep the shot on longer for editing rhythm or to fill a gap.
Can't wait to try this!
Sounds like I need Davinci Resolve
Played with this quite a bit earlier today and tried challenging it. Got some impressive results of people in interview settings and some dancing. Other shots were definitely rough. Hands are not its friend though, and the color change when it switches to the generated content is definitely noticeable. I can see some uses in its current state, but it has a long way to go.
I hope some day Adobe retains the same colours and contrast after export. Even after applying a QT gamma compensation LUT while exporting, it's close to what the timeline shows, but not the same. I want this fixed so badly.
That's easy to fix: just go to Lumetri settings, change the viewer gamma from Web to QuickTime, and enable GPU acceleration.
@matheus4089 Great tip! Thanks for sharing.
SO cool - thank you for always providing incredible and easy to understand info - we look to you first for almost every PP related question we have, thank you for all you do!
Happy to help!
Great video, perfect explanations. Here in Brazil I follow all your videos. Thanks.
Can they just please give us a decent STABILIZER that uses the before and after frames to fill in the borders, so we don't have to crop a stabilized video?
Premiere really needs a new and improved graph editor for keyframes, like After Effects has. I really hate the tiny graph window you get.
I was waiting for this!
:) me too. glad it's here for testing!
Have you considered making the Gal Tool Kit available in Final Cut?
I'm leaving Adobe Ecosystem
Omg… have been waiting for this…
Whoa! This is wild!
Hey there! I noticed while you were editing your own clip that, at the end of the video, the lighting changes a bit, and it seems like the clip might be looping the existing footage.
I'm interested to see how generative extend will work with a less static shot, like shaky cam or a zoom for example 🤔 hmm... Will test in the morning once I can get some zzz's 😴
Good video. Will try. I don't need AI to do major, awe-inspiring things. The extend helps, or would have helped in my old position. But now I just want a better remix tool: being able to type prompts that reduce the labor of developing something, e.g. creating MOGRTs or templates I can reuse. That would be great AI, IMHO.
Please make videos on Vox-style transitions and storyboards
This seems intriguing, great for still shots and room audio, but I'm not sure about the artifacts.
It's important to test so there will be fewer artifacts : )
Isn't this basically using the same artificial intelligence that almost every professional video camera compression method offers, where the camera creates new frames based on the previous ones? Such as Sony's H.264 compression with variations like XAVC-I (Intraframe) and XAVC-L (Long-GOP). Only instead of on-device processing, it's processed in the cloud?
What about the remix tool button, has it been cancelled? It was better for audio
Kinda pointless at this stage. I would like to see an end clip where there is movement. The ones shown in the example are pretty much static, where you would get better results slowing down that part and using optical flow.
Thank you
Thank You 🪄
:)
Welcome to the future!
Gal you are amazing
I just want my Premiere to stop crashing on Sequoia. The last update seems to fix it 🙏
What's the diff between extending a clip and adding a hold frame?
A frame hold is a still image of the last frame; extend generates more video from the existing footage to make the clip longer at the same fps. Try it out with moving shots and aerials 👍
In beta? I expected this to be part of the initial 2025 release.
All the new tools go through beta first for necessary testing.
For the audio part, isn't it like the Remix tool?
Remix is for instrumental music, splicing together repetitive beats to make it shorter or longer. Generative Extend is literally generating more audio of the same sound for additional time.
@PremiereGal now I understand, thanks
It will probably work properly in ten years, once they forget to update it, like every new feature...
👍👍
what build of beta is it? is it v25.1.0.12 ?
I will never upload anything to Adobe cloud! Never!
My video is getting pixelated after rendering. Any solution?
Your generated extensions, or your entire video export? Make sure you are exporting with Match Sequence settings at a high bit rate.
Can't wait for my next 'renewal' when Adobe finds out that my CC expired and I'm no longer 'latched' to them 🥰 Heard that lots of people had problems unsubscribing. I am one of the lucky ones.
I just need Object removal Please! 🙏🙏🙏
🪄
🪄
5:30 why did you do that? Not cool.
Gotta have a bit of pre-halloween fun!
@@PremiereGal 😁
first
🪄