Outstanding as always Bernd. Love the new setup too.
Thanks! We moved house at the beginning of the year. One of the reasons I haven't been able to release a lot yet. Still have to tweak the lighting setup it seems ;)
Intellitrack is a blessing
Yes. Think so too. Now they should just update the tracker interface and align it with the planar tracking tool...
@@VFXstudy respect your roots. Know the legacy of the interface.
Excellent breakdown, I missed a lot of the new features and it's great that you have tested them for us!
Thanks. There's indeed quite a bit of useful stuff.
Thanks. I would like to see an additional video about the OFX Render Plugin.
What are you missing? It was quite straightforward with Fusion Studio. Would be nice to hear from people who give it a try in other OFX hosts.
So glad they implemented color management into Fusion!
Yes, still hoping for ACES but it's going in the right direction.
Have been waiting for you to upload a new video! Glad to see you dude, also wanted to thank you because you helped me a lot in learning Resolve.
Happy to see that!
Thank you. Had been waiting for your well-researched info and comments.
Glad it was helpful!
I was 🤏 this close to leaving a comment under one of your last videos inquiring whether anything about 19 was in development - wouldn't have been rude and pushy AT ALL ("hey Bernd - make video!" 😂)
However, your videos are just the gold standard when it comes to explaining core functionalities in Fusion
Thanks! Generally BMD is keeping their development very secret until the point where the CEO unveils the new release. So wouldn't be able to provide any insights ahead of time either 🙃
But also I like some time to test stuff before I talk about it 🤣
Thanks for the review! There are really quite a few improvements in there that I'm super happy about. Improved trackers, faster Magic Mask, improved denoising and, most importantly: the more 3D rendering is possible, the less I have to use Blender 😅
Haha, I think I would still rather render with Blender Cycles or Eevee than with Storm.
@@VFXstudy It's more about adding 3D models to existing footage as an enhancement. If I film a Viking cosplayer, for example, and do most of the VFX edit in DaVinci anyway, it would be interesting to bring in supposed opponents, creatures etc. as rigged 3D models in Fusion and maybe even animate them there. Which doesn't mean I've completely given up on Blender ^^
He's back!
Yes, I was moving house and busy with other stuff. Hopefully more to come soon. But I can't do weekly with my tutorials :)
Hey Bernd! Thanks for the info! I'm most excited about the new/faster trackers. Maybe you can help me out: all I need is a text field that wraps the text automatically - a completely standard text box. I just can't understand how something like that could not be implemented in this powerful piece of software. So over to you: does it exist and I'm just not seeing it? Are there plugins? I found a Lua script but I'm failing to install it (on macOS).
A text field has no bounds by default, which makes this hard to solve well. Scripting isn't simple either if it's supposed to work for every font etc. In word processors something like this is trivial, but they also don't let you reposition individual letters or animate a tracking parameter.... It may be that BMD hasn't implemented it because it might collide with other features. Just my speculation on this....
@@VFXstudy Nooooooo. What a shame. Thanks for the quick reply! :)
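(For reference - a minimal sketch of what such a wrapping script could look like from Fusion's console, in Python rather than Lua. The Text+ node name "Template" and the 40-character width are assumptions, and wrapping by character count ignores the actual glyph widths, which is exactly the limitation described above.)

```python
# Sketch only: naive word-wrapping for a Text+ node from Fusion's scripting console.
import textwrap

comp = fusion.GetCurrentComp()      # "fusion" is predefined inside the Fusion console
tool = comp.FindTool("Template")    # assumed Text+ node name - change to match your comp
text = tool.GetInput("StyledText")  # read the current text of the Text+ node

# Naive wrap at 40 characters per line; proportional fonts will not line up perfectly.
wrapped = "\n".join(textwrap.wrap(text, width=40))
tool.SetInput("StyledText", wrapped)
```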
Nice small and QOL updates, nothing groundbreaking like a new EXR workflow or deep compositing but the new multipoly is a great feature update.
I presume EXR workflow & deep compositing already exist in Fusion
@@jkartz92 EXR workflow already exists, yes, but it is a bit cumbersome and old school compared to something like Nuke, and deep compositing is not possible in Fusion.
@@rano12321 But this is mentioned on their site: "Deep Pixel Compositing". Are they different?
When BMD refers to Deep Pixel Compositing, they mean tools that use technical passes like the World Position pass, depth map, normal map etc. for 2D effects that use depth. You can use it for relighting and fog effects, for example, or to create volume masks. When Nuke talks about deep data, they mean layered images where semitransparent pixels can have different depth values and be layered on top of each other within one image. That allows for even better depth isolation and avoids anti-aliasing issues. The language is confusing, and nowadays the Nuke terminology has become better known. Unfortunately that part isn't supported in Fusion so far. A lot of Nuke users coming to Fusion ask for it, and also for custom channel support to load arbitrary channels from EXRs into one image stream - BMD must be aware of the request, but I'm not sure how high it is on the priority list.
I hope someday BMD will put deep compositing in Fusion @@VFXstudy
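(To make the terminology concrete, here is a toy illustration - plain Python, not any real Fusion or Nuke API - of what "deep data" means: each pixel carries a list of depth/colour/alpha samples that get merged front to back, instead of a single flat depth value per pixel.)

```python
# Toy illustration of the terminology difference described above (not any real API).
# "Deep pixel" tools work with one depth value per pixel; "deep data" in the Nuke
# sense stores a list of (depth, colour, alpha) samples per pixel.

def flatten_deep_pixel(samples):
    """samples: list of (depth, (r, g, b), alpha) for ONE pixel, merged front to back."""
    out_rgb, out_a = [0.0, 0.0, 0.0], 0.0
    for depth, rgb, a in sorted(samples, key=lambda s: s[0]):   # nearest sample first
        weight = (1.0 - out_a) * a                              # what this sample still contributes
        out_rgb = [c + weight * v for c, v in zip(out_rgb, rgb)]
        out_a += weight
    return out_rgb, out_a

# Two semitransparent samples at different depths inside the SAME pixel -
# something a single flat depth value per pixel cannot represent.
print(flatten_deep_pixel([(2.0, (0.0, 0.0, 1.0), 0.5), (1.0, (1.0, 0.0, 0.0), 0.5)]))
# -> ([0.5, 0.0, 0.25], 0.75)
```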
Bruh, just yesterday I came back to your Resolve 18 vids to refresh my memory 🥲 You're a beast!
Thanks a lot!
Bernd rocks! Thanks as always.
🥰
Hello Bernd, Thank you for this review. Could you do a tutorial on USD please? Thank you.
Probably coming soon. Also did a session on 3D in general (and partially covering USD) in a recent live webinar for BMD. Might be repeated in the future, stay tuned.
Thank you for the excellent overview!
Glad you enjoyed it!
I was waiting for your take on the new stuff. I learned so much of what I know on this channel.
I'm curious about the IntelliTrack tracking, if it's any better, etc. Going to watch and find out now I guess.
In my tests so far it was superior.
Awesome rundown mate !
Thanks!
Thanks for this video. I just upgraded to Resolve 19. For version 18.6 I set up color management for Fusion using LUTs and color transform nodes based on previous tutorials. The new auto color management still does not work for me without the color transforms. I hope to see a tutorial for the Fusion color management.
I might do another tutorial sometime soon. But what was not working? One thing I noticed recently is that if you change the output space to something like Gamma 2.4 or so (not the scene default), then you should go into the viewer LUT and enable "Forward OOTF". I already informed BMD that this isn't done automatically and hope that they can fix it. Maybe there are other small hiccups that others find in their setup, but generally speaking it seems to work...
Thanks for your immediate reply. My files are from Blender, which I set to a linear color space. These files are very dark inside Fusion. I applied a LUT to the monitors to match the ones in the edit page, and one Color Space Transform after the media output to correct the external monitor. When I set the Resolve 19 color management to Auto, or to manual settings, the images are still dark once I remove the LUTs and color space transform.
Also might be worth mentioning that if you add an Extrude3D to any shape tool it promotes it into the 3D environment, and you can also animate using a path in 3D too, though it is a little tricky having to manipulate a path using the spline editor in X,Y and Z.
Yes, we got the Extrude 3D in 18.6 I think.
Many Thanks for review!
My pleasure!
Thank you! The VDBs inside USD are a great improvement!
Agreed. Also a good thing to have that adjustable/renderable directly in compositing. Not sure though how often Storm will be enough here - for some smoke stack perhaps? Do you have more insights there regarding the render engine?
@@VFXstudy Not really more insights, but I had tested some workflows last year and the final integration is much better than I thought before! You can render the temperature information and everything like in Blender, and it renders very fast. For some 3D explosion effects it's very handy 👌
Nice one Bernd. As a vertical node guy, I’ve been so chuffed with the UI layout, not being able to stack widescreen shots was driving me up the wall when using my laptop!
Not seen anyone chat about the Object Removal tool yet. Been fiddling around with it but not sure how it works, can’t find any documentation. Maybe something you could cover at some point!
Oh did they bring it to Fusion now? I remember trying it in Color with mixed results.
The USD tools are a highlight. In the "uRenderer", under "Lighting", you can add "Enable Shadows". What does that refer to - that the object casts a shadow? When I enable it, no effect appears. Or do you need a more powerful graphics card for that? Apart from that, a very well done summary of what's new in the new version!
Basically yes - shadows should then be added to the scene. The only question is how far they are supported by the respective light source and which scenarios Storm supports here.... It shouldn't be down to the GPU.
@@VFXstudy I've experimented a bit more now. It only casts shadows onto the nearest object, but not onto the ground. Probably still user error on my part?
The Multi-Poly and IntelliTrack look cool. Also nice to see the USD improvements. That will be handy when working with pipelines that involve other 3D software *cough*Blender*cough*.
Speaking of USD, I got a lot out of your recent Working With 3D Models In Fusion webinar. It was very helpful watching you iterate through the problems that arose in dealing with USD. Worthwhile stuff.
Thanks. I should probably do some more 3D stuff on YouTube again 🤔
@@VFXstudy Definitely!
But don't give me false hope by mentioning "ray traced lighting" or any sort of lighting improvements to fusion 😭
Though perhaps it's not as much of a fantasy...hopefully lol
Love your videos, thank you for making them! Just a lighting rec. I think you should drop your key down 6 inches and move it a bit to camera right. Or put a bounce card to fill your left eye. That being said, love your videos and forget what I said :)
Thanks a lot. Yes, I realized during editing that the light was bad. New office and new setup needs some tweaking.
That's interesting, have you already tested it for any commercial or film project? All this is great, but we need it to be tested in actual projects, be it live action, commercials, CG, etc....
Yes, and now we have the stable release. But every production is different. So always do your own tests when in doubt. Nobody will be able to test everything and every scenario.
It's great that they finally addressed color management in fusion. You mentioned using it in DaVinci YRGB color managed mode. Does it still work the same if you do manual color management using CST's instead? Thanks.
If you do it manually, you can now use CSTs inside the Fusion Viewer LUTs. So that workflow got easier as well.
Ur intro is insane
That'll be the DaVinci Resolve 3D interface from Resolve version 20 :) :) :)
@@VFXstudy awesome 👍👍👍👍👍👍👍👍👍
Thanks, great review. Color management is not working for me. I'm on Mac, perhaps it's a Mac issue.
Hard to say. I doubt that it's a general Mac thing since BMD does a lot of dev and testing for Mac. But what aspect is not working for you?
@@VFXstudy I tested color management on Canon R6 Rec.2020 Canon Log 3 clips, and the viewer is a lot darker in Fusion compared to the Edit page. No color grading in the color page, no LUT in the Fusion page viewer. I tested several color management settings: 1) SDR automatic management, 2) custom settings with input Rec.2020 Canon Log 3, timeline DWG/DaVinci Intermediate, timeline SDR 100, output Rec.709, 3) other custom settings - and the result is always the same. I also set the input color space and gamma to Rec.2020 Canon Log 3 on my clips. If you have an idea....
You're back 😮😮😮
Never left :-) Just busy moving house and other stuff.... I'll try hard to release tutorials a bit more frequently.
Scale to Format and Multipoly are probably my favorites. Everything that saves nodes and setup time is always welcome. Better tracking of course as well. USD is not interesting without a serious production renderer. For matte objects alembic is still easier and quicker to handle. And if you have a 3d software where you create these files you can usually render there more efficiently since 3D setups slow comps down massively. I would love to see some Omniverse tech implemented there...
Omniverse is NVIDIA-only I believe? Might be hard for BMD since they tend to focus on being platform agnostic. Yes, a production-quality render engine would be good. Fingers crossed. The speed at which USD scenes can be loaded and the options for filtering with the scene browser etc. are quite nice. So I think it's going in the right direction.
@@VFXstudy It doesn't have to be omniverse, but hardware rendering is more advanced practically everywhere outside fusion.
Thanks Bernd! In terms of the colour management, I can't find a way to have my fusion titles in the right colours on the edit page. When I apply a specific hexcode to the text element or solid background and then the same hexcode to the text+ element, the colours look quite different. What am I doing wrong?
If color management is enabled, then everything from Fusion goes through color management. So if you use the Hex code in Fusion, it still gets remapped on the way out towards the edit. One way for this specific example might be to do a color space transform from your output space back to the linear space you are working in in Fusion and choose the colors in that output space. Will probably do a tutorial on that, but might take a bit of time.
Manually setting the source colour space to ITU-R BT.709 (scene) and enabling Remove Curve and Pre-Divide/Post-Multiply seems to work for me (on a Mac), but I'm not sure why or if it's the right thing to do...
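(A small sketch of the compensation idea from the reply above: convert the hex value to linear before typing it into Fusion. This assumes the Fusion working space is linear and the output transform behaves like a plain sRGB curve - Resolve's colour management can add tone/gamut mapping on top, so treat it as an approximation rather than an exact round trip.)

```python
# Sketch: convert an sRGB-encoded hex colour to linear values to enter in Fusion.
# Assumption: linear working space, plain sRGB output curve (no extra tone mapping).

def srgb_to_linear(v):
    # inverse of the piecewise sRGB encoding curve
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def hex_to_linear(hex_code):
    hex_code = hex_code.lstrip("#")
    rgb_8bit = [int(hex_code[i:i + 2], 16) / 255.0 for i in (0, 2, 4)]
    return [round(srgb_to_linear(c), 4) for c in rgb_8bit]

print(hex_to_linear("#FF8800"))   # e.g. a brand colour -> linear RGB to type into Text+
```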
Thanks for the review Bernd. Apparently you do some juggling based on the knickknacks on your shelves. Are you currently at 4 balls? If we ever cross paths, I'll be happy to teach you 5.
Haha. Not really. Just once in a blue moon. With some warm up practice I can do 4 for a few seconds or so... But my daughter loves grabbing them when she comes into my office, she can juggle with one 🤪
Incredible channel!
Thanks a lot! I'm not releasing much but I try to make it count :)
Roughly how many hours of video material does your Fusion 2022 seminar bundle on tutorial-experts contain? It is quite a bit more expensive than the other Fusion courses. Would you recommend the course to me as a beginner, or should I rather get a different course to start with? I already have quite a bit of experience with video editing, just not with Fusion yet.
Should be around 16 hours. A few updates are coming soon to cover additions in v18 and v19 - even though these are not yet part of BMD's official certification program.
And yes, I would definitely recommend it for Fusion beginners. The seminars are all more specific to individual topics and some of them are already more advanced. The Fusion course is a good overall overview of Fusion and covers the most common scenarios and the essential topic areas. It is not a Resolve beginner course though, so a bit of editing practice beforehand is helpful.
Hi, good video as always.
I'm just doing a few edits to a clip on the colour page, but in Fusion the colour does not change; it does in the edit page.
It's strange, as half an hour ago it was showing all the changes made across all tabs - colour, Fusion and edit..... I have gone through all the buttons to see if I have turned something off, but I haven't. Please can you help?
many thanks
Really hard to say without being at your screen. Check Fusion Viewer LUTs, Color Management Settings...
@@VFXstudy Thank you for the reply, still the same. Is this something that only the Studio version has, as I'm using the free version?
thanks again
IntelliTrack looks so helpful for me, to begin my tracking journey...
Now all I want is native SVG support and a better way to edit on the beat.
Meanwhile, maybe I'm dumb, but I don't understand what I could make with Referenced Fusion Compositions...
SVG can be imported via Fusion-Import. Not sure what you mean by native. What I would like to see is getting them imported straight into the shape tool system rather than the plain polygons and also somehow grouped / organized in a better way.
Patrick Sterling put out a video on some more things you can do with the Reference comp. But also more overlay-type options. I'll see if I use something like that in the future. Did similar stuff via Fusion effects within multicam before. But let's see....
hey! are you going to update your courses for this version?
Yes! But it'll take some time - good thing: those updates will be additions. Pretty much everything you learned in previous versions remains useful, we just got some additions.
Big agree that reference comps are cool, but not really all that helpful for what they're supposed to help with... I'd MUCH rather have a 'shared group of nodes' that I can adjust in one comp, and know that it's being adjusted in every other comp as well. THAT would be a game changer....
Yes indeed. That's what we need.
15:30 Do I see this right, that the Ultra Noise Reduction works better if you want to avoid showing details in the grain? It's hard to see noise on YT because YT filters every video :/ ...So I am thinking of retouches: when you put the noise back on, you don't want the grain to show the details you just put work into removing.
Real raytracing would be dope! I'm planning to learn Unreal Engine 5 because it gives nice results fast, and then composite in Fusion.
Haven't looked into Unreal yet. But should be a good workflow
@@VFXstudy Yeah, I've been a Blender guy for years but plan to keep it mostly as a modeling tool only.
thanks!
You're welcome 😊
What was that file that was called "exr" that you loaded in the USD section of the tutorial? Because if it's a way to have an EXR sequence somehow combined into one file, that's really cool and I've never heard of it.
Do you mean the textures that I loaded? Those were just images (single frame). EXR sequences consist of multiple files - but that's also the point: you can save and load frame by frame independent of what's happening elsewhere in the sequence.
@@VFXstudy I know that they are a single frame, but I thought that they're shown with regular thumbnails and not a thumbnail that says "exr". (I realized that's what it is a few days ago when I tried using an EXR image sequence.)
And do you have an explanation for why a masked video clip is semitransparent in the masked areas when the alpha value for that part is 0? (The masked part has color values.)
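(One common cause - not necessarily what is happening in this particular comp - is that the merge treats the foreground as premultiplied while its RGB was never actually multiplied by the alpha. The standard over operator then adds that colour even where alpha is 0, which is why pre-multiplying before the merge usually fixes it. A toy sketch:)

```python
# Sketch of why colour can "leak" where alpha is 0: the premultiplied over operator
# adds the foreground RGB as-is. If the RGB was never multiplied by the alpha,
# zero-alpha areas still contribute their colour. Illustration only, not a diagnosis.

def over(fg_rgb, fg_a, bg_rgb, bg_a):
    """Premultiplied over: out = fg + bg * (1 - fg_alpha)."""
    rgb = [f + b * (1.0 - fg_a) for f, b in zip(fg_rgb, bg_rgb)]
    a = fg_a + bg_a * (1.0 - fg_a)
    return rgb, a

masked_fg = ([0.8, 0.2, 0.2], 0.0)   # alpha 0, but RGB still has values (not premultiplied)
background = ([0.0, 0.0, 0.0], 1.0)  # plain black background

print(over(*masked_fg, *background))
# -> ([0.8, 0.2, 0.2], 1.0): the "masked out" colour still shows over black.
```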
I work on an RTX 4090, Ryzen 7950X and 64 GB RAM (6000 MHz), and I would really welcome more performance in Fusion-based things (e.g. Text+). Is it improved?
It's possible that there are improvements under the "General Performance and Stability Improvements" heading. But I haven't heard anything specific besides the Magic Mask in the current release. But you have performance problems with Text+ and this setup? That seems strange unless you have some really complex title effects or work on some insane resolution....?
@@VFXstudy Nothing special. Max 4K timeline. If I use more Text+ nodes (or layers on the edit page) my performance goes down dramatically.
Too bad color management for ACES is still not fully implemented.
Agreed, hopefully that's the next step.
What is the benefit of working in ACES vs Davinci Wide gamut?
Things worth mentioning are: rendering is much faster even on high settings (I think 3-4 times faster than in version 18.6.6).
There is no difference in the quality of the video whether you render it separately or directly to YT with the button. Audio now supports 320 kbps bitrate. I did some tests and it turned out that the standard YT settings are better (image and especially sound) than setting them manually.
Thanks, yeah I haven't looked into all of that yet. Render performance always depends on a ton of stuff (codec, hardware, timeline....). I think that's why the advertisement usually says something like "now up to 3x faster".
How do I use uImagePlane for video footage? I'm trying but it's not working - can you give me a suggestion?
I don't think that's supported so far. As far as I can tell it's only for still images.
If they add a few more Multi tools to avoid noodles it will turn into a full layer based After Effects... 😛
Haha - no chance!
Regarding rendering, they might look towards incorporating the renderer used in the Bevy game engine, or the Godot game engine, both MIT license (permissive).
Regarding other features, how about some more AI stuff? I suggested segmentation to someone (YOLO, Segment Anything - open-source NN models) who I think has already passed it along. Segmentation would be extremely useful, and there are TONs of other AI features we could have.
Not sure what type of segmentation. Magic Mask is one form of image segmentation I suppose. A dedicated one trained specifically for Chroma Keying might be nice. For the Depth map it would be great if it could track over the image or even give a solution in combination with 3D camera tracking that aligns with the tracking features. It's easy to come up with ideas, but might be much harder to implement them all.
@@VFXstudy - magic mask is really good. It's not "segment anything" tho which is pretty darn good especially when combined with object detectors like YOLO or Grounding Dino. You could just "tell it" what to segment out. Definitely need this AI driven masking in there.
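(For anyone curious what that looks like outside of Resolve, a rough sketch using Meta's open-source segment-anything package - none of this is built into Fusion, and the checkpoint path, frame file and click coordinates are placeholders.)

```python
# Rough sketch of Segment Anything (SAM) outside of Resolve - nothing here is a
# Fusion/Resolve API. Assumes `pip install segment-anything opencv-python` and a
# downloaded checkpoint; the paths and the click point are placeholders.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth")  # placeholder checkpoint path
predictor = SamPredictor(sam)

frame = cv2.cvtColor(cv2.imread("frame_0001.png"), cv2.COLOR_BGR2RGB)  # placeholder frame
predictor.set_image(frame)

# One positive click roughly on the object to isolate (placeholder coordinates).
masks, scores, _ = predictor.predict(
    point_coords=np.array([[960, 540]]),
    point_labels=np.array([1]),
    multimask_output=True,
)

best = masks[np.argmax(scores)]                               # pick the highest-scoring mask
cv2.imwrite("mask_0001.png", (best * 255).astype(np.uint8))   # could be loaded back into Fusion
```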
I can't seem to edit text in the viewer or move letters around.
From the edit page of v19, enable Fusion overlay, then double click into the text or right-click to enable it via the context menu.
@@VFXstudy Thanks, will give it a try!
Anyone know where the 1/30 quality dropdown for a better preview is in Fusion 19? I used this feature in DaVinci 18's Fusion, but in 19 I can't find it.
You mean for Proxy and Auto Proxy? It's still the same as before - you need to enable the Timeline Proxy (half or quarter) AND the Proxy in Fusion via right-click on the playback bar. Fusion's own proxy setting with 1/30 etc. is ignored in Resolve as far as I know and only works in Fusion Studio.
@@VFXstudy ✅❤️ thanks you sir for your help 🤗
Dear sir, uImagePlane does not work like the 3D ImagePlane. Do you have any solution for this?
Yes, as far as I can tell it's only supporting stills so far. I think for film projection etc. you still need the traditional system.
If you upgrade your project, you unfortunately can't step back
But going from 18.6 to 19 does not require a project upgrade as far as I can see.
Just tested a few projects I had in 18.6 last week. There was no prompt to upgrade the project or the database it was in. I would still strongly recommend a backup point before - not sure what 18.6 will do if 19.0 tools and features pop up in a project...
@@VFXstudy I did a backup, upgraded to 19 and it crashed on me immediately + had some serious UI bugs. I downgraded to 18.6.6 right away and couldn't open the particular project without restoring archive. So beware!
Thanks for sharing that observation. So +1 for backup!
I am wondering if it's a general thing or if the crashes did something to your project that made it incompatible 🤔
Either way - bottom line: better safe than sorry...
@@VFXstudy It has historically always been the case that newer project versions don't work with older Resolve versions. Maybe it's a UI bug that it doesn't tell you about upgrading?
Thanks! Fusion Studio 19 is pretty buggy
Where are you seeing buggy behaviour?
Really? Anything specific that you found broken?
❤🎉😊 thanks
You're welcome 😊
I'm disappointed with the Referenced Fusion composition.
I thought: finally, they've made it possible to adjust a Fusion clip's track layers on the edit page and see the consequences of adjustments in real time. I will finally be able to retime clips on the edit page, swap out clips on the edit page, shorten and slip clips on the edit page, everything updating fluidly like it should...
Instead the Referenced Fusion Composition seems like a glorified adjustment layer?? wth??
They need to invent and code a "Fusion Flex Clip" that allows the user to adjust the 'grouped tracks' of a "Fusion Clip" on the ORIGINAL timeline.
All timing, slipping, lengthening, shortening etc. is reflected in the Fusion page. Why does that not already exist? It is so awkward to adjust things via "open in timeline" on a Fusion clip. You're currently asked to do guesswork with the changes you are making. It's impractical, a time sink, and deeply stressful.
Not to mention you need to know what you want to comp beforehand rather than having the freedom to explore now and make adjustments later.
This is something that must be solved. It is way WAY too anti-user-friendly for 2024.
Interesting point of view. Guess that's not how I typically use Fusion. But a lot of what you want is still available. Even with the regular Fusion clip, you can open it in a timeline and change stuff. With the Referenced comp you can do even more of that, if you select multiple clips into the referenced comp you can edit the lower layers and I think even swap if I remember correctly.
Anyhow, personally I'm missing other features here, like a "linked composition" with option to unlink certain parameters or similar. But anyhow, we'll see how BMD takes this further.
But the Paint tool in Fusion is still shit
Well, I would think that Silhouette is probably ahead of the game if you have to do a lot of roto tasks etc. But for some simple fixes, I find it does the job. What are you missing or comparing it to?