in Germany I am a freelance lecturer at an academy for post-production and teach beginners. I will definitely include your two videos on Fusion and Nuke, as well as your workflow DR->Nuke/Blender->DR, in my classes. They show very well that it’s not about which app is the best, but which solution is optimal for the chosen path! Thank you for your efforts in creating these clips.
Hahahaha 🤣 Definitely recommend it 👌 I've been editing in Resolve for about 4 years after ditching premiere. It's so much better. And fusion is a great AE alternative
@@AlfieVaughan Nice! Yeah, I want to finally take the jump, too! Only my client work keeps me on adobe software. I think I will learn it this summer. Your video is a great guide for me to get into it! Thanks! 😁🙌
6:15 thanks for the shoutout dude! Hopefully I was able to help you learn Fusion, lol. Glad to see you gave Fusion another try and have had a much better experience than the last time lol. Fusion has a Stmap node called texture, but yeah it's not as good as the Nuke one or StMapper you got from Reactor as well. So hopefully in future BMD would add a proper STMap node in Fusion as well. Yeah, it's true. Fusion doesn't have deep compositing built in. It has some tools under the deep pixels' category, but it's not the same as deep comp as those deep pixel tools were made before deep compositing became a thing as far as ik lmao. Yeah, I definitely agree about the pricing as well, NukeX is too expensive for most people who are not VFX pros and at the top of the food chain where studios are making lots of money, the cost of the tool is pretty negligible. But IMO people who are freelancers or running smaller scale indie studios in other parts of the world where they don't make as much as studios in western countries if Fusion can do the job then why not. Great video regardless.
Thanks Alfie. STmaps and I think the roto workflow are definite pain points in Fusion. I didn't know what I was missing until I learned Nuke. Thanks for pointing out the difference with depth channels and deep composting. I was a little confused on those.
My first freelance compositing gig was on anamorphic 2.0 footage. At the time, I was an avid Fusion user, but never really had to worry about distortion maps (should have, but didn’t). Needless to say, motion tracking and compositing the 3D elements became really difficult, and that’s what started my journey into Nuke Indie two years ago. And to your point, I had no idea what I was missing in terms of the roto workflow! I knew to break my roto up into simpler shapes, and track where ever possible, but man. Nuke’s roto node just smokes fusion in comparison. Don’t get me wrong, I’m always taking a peak to see what features being pushed by BlackMagic. Since Nuke is a business expense, and a costly one at that, (Nuke indie costs pretty much the same as a creative cloud license,) I’m always eyeing Fusion to see if it can replace Nuke.
Glad you liked it! Fusion has something called "deep pixel compositing" which is actually just using a position pass from what I've seen. It's not actually deep which is slightly confusing...
man, those 9 minutes summarized my 3+ years trying to figure out the blender-fusion vfx workflow... I hope everyone taking on a remotely similar journey as me, finds this video right away!
Ah good! I'm glad. I've also got a video about my whole workflow that you might like. It's not blender to fusion but there's probably some gems in there too in terms of render setup etc from Blender
One thing I'm liking about this dude is that his viewpoint is very unbiased. Despite using nuke for more than 6 years, he still appreciates fusion and points out issues within nuke like the default layout (btw all software default layouts are bullshit😂). It's pretty rare to see such unbiased opinion. In pcbuilding communities, amd fan bois just hate intel and vice versa. I wish there's some guy who tells the pros and cons of a cpu manufacturer by being unbiased.
@AlfieVaughan If you ever decide to do an another part to this series I would recommend trying out the Hos_SplitEXR_Ultra script. (It already comes within reactor.) It’s basically where you take a loader node within fusion and select your EXR and then you can run the script on the node, it basically takes the channels of the EXR, eg. depth, position, diffuse ect and create separate nodes for each pass. Then you can combine the nodes together using the channel booleans node. Basically you’re doing multilayer EXR compositing.
Thanks for the tip! That's good to know. I suspected there would be something like that to make working with multilayer files easier. I didn't really need it for this video as there were only 2 channels but definitely good to know for stuff in the future 👍
Came here to say this - this being said I winced a little when you said “it’s just a different way of working” I work on tiny freelance projects (motion graphics and some basic CG comp) and coming from nuke the lack of proper arbitrary channels starts to drive you insane after a while. Like rebuilding a beauty in fusion is like 10 times longer in fusion haha, you were being very diplomatic
I always forget about Reactor. I have Studio, but I use it primarily for color corrects, not comping. Might need to take a closer look after this. Great vid, my friend!
Thanks a lot! It's worth checking out. I don't think I'll continue to use fusion for comping but definitely for motion graphics during editing. And even for that I think some of the stuff on reactor will be useful
Another excellent video dude! Nice job. This kind of content is great. For years I've been debating giving Fusion a shot. You've convinced me to give it a go. Cheers man!
Fusion have a built in node for using stmaps, it’s called Texture node. Also, stmaps are not the same from software to software and the stmapper node has settings for handling that.
Yeah I tried that but it doesn't work with alphas and overscan like I said in the video. Without that, it's basically useless for CG compositing as you need the alpha for overlaying it on the footage
Yes I'd love to hear about the features fusion has over nuke. I tried to do as much prep for this video as I could to get the facts right but it's easy to miss things
Well done, Alfie. Much better this time. Still some Fusion stuff is missing, but overall, this is a much fairer comparison and I do appreciate the time you put into this.
@@AlfieVaughan Nothing major, and perhaps just a matter of personal opinion really. 1. I was hoping you'd use Fusion Studio since you were using the paid version. With the paid version of Resolve, you can also download Fusion Studio which is the stand-alone version of Fusion and is much more direct with regards to compositing and offers some better (in my opinion) workflows. 2. You can load multi-layer EXRs into Fusion Studio and use a Fuse (add-on) from Reactor to split the render passes out into their own loader (read) nodes. Though I have found that Fusion's handling of EXRs to be less efficient than Nuke and with heavy comps with many render layers and/or EXR sequences, this can slow down performance. In that case, the way you went about it is a good option. 3. Millolab has created, among others, a great macro for matching blacks that's based by Tony Lyons's tool for Nuke and will do a killer job and matching your blacks. Likewise, you can get a Grade fuse (node) that's sort of close to Nuke's grade node, though I admit, not as intuitive, but it works. Also, get nuke2fusion to set Fusion up to act like Nuke with regards to hot keys and layouts. A must have if you've come from Nuke, for sure. This is basically a personal preference thing and I know you most likely weren't aware of these 3rd party tools. I bring them up because they really do add significant quality of life improvements to Fusion and there are many more that just make working in Fusion more productive. With that said, I completely agree with you about the lens distortion workflow, and I've wished for a long time that Fusion would add similar functionality to Nuke for this. I'll keep my fingers crossed. Also, the deep ability of Nuke is a given, though this doesn't affect me as I don't have the need. But I do get how it can be a deal breaker for those artists who need it in their workflow. For me, although I acknowledge Nuke's advantages, Fusion is just more practical for the work I do and the budgets I work with. Once again, great work on the video. Cheers!
some advantages Fusion has over Nuke: motion graphics, 3D particles, and can work seamlessly with audio in the Resolve version or import audio in the standalone version.
In the Resolve FX category, there is a localized replace tool that can sample colors from the surrounding area and fill in the selected region. I don't know what it's called in English, but in the Chinese version, it's referred to as the "Localized Replace Tool."
In Fusion, wouldn't the Lens Distort node do the undistort trick? Note that I am not referring to the Lens Distortion node which is a different node and it is a simple lens effect. The Lens Distort is also able to load external Distortion Data.
I think Deep compositing is probably the only main thing if you are not working for a major studio that has already Nuke into its workflow. For me, I use fusion because it is better integrated for my braw footage workflow with resolve color grading.
Yeah it's only something that will matter to a very small amount of people. That's why I said this video is more from my perspective rather than just generally saying Nuke is better. Totally get the braw thing too. I'm about to switch from a Sony camera to the new Black Magic Pyxis 6k when it comes out in June and excited to try the raw workflow
Awesome video as always, thanks for sharing with us! I have one video idea just to toss out; it’d be great to have a professional compositor do a video of all the common terms such as Mult, Pre-mult, ST Map etc. Literally just a talking head video, I would watch it at least 10 times to commit it all to memory.
Yeah 😅 changing nukes appearence for the nodes to remove the gradient shading tricks me into feeling like it's not still a design from about 2007 🤣 But although it's not the prettiest I actually really like the interface for it's functionality. Its very well thought out
Very curious as to what others think but would Fusion be a good beginner tool for artists just starting with a nodal compositing workflow? Like how After Effects was an introduction for many of us in to compositing in general.
It's definitely good in the sense that it's accessible and free. But I don't think it's any easier to learn than Nuke. Getting used to nodes is the main jump which obviously you'd have to do either way. I think the most important thing for beginners would be which one has better free tutorials online as that's what you need most early on
Super informative sir! Though I was wondering if it's possible for you to share some plugins you used in this video. I'll keep a note of them and install em as I progress with the learning. Thanks again for sharing such amazing content on Yt❤
I've been using Nuke NC, but I'll be switching to Fusion soon. Mainly because of Nuke's restrictions (eg: I can't do any sort of lens distortion workflow because undistorted 1080p footage is bigger than 1080p, so I can't write out the undistorted footage for tracking. And I can't export a tracked camera from Nuke to Blender). Hopefully I'll like Fusion as much as Nuke.
Yeah I had similar issues when I started using NC. Slightly annoying. You can drop the overscan like I did in this video to export the UD plate. It just won't have the bits off the frame but I prefer to work that way anyway so I don't have to change my camera size in Blender. There's no good way to get cameras out of Nuke NC sadly so I used to just track in Blender instead. But much prefer Nukes tracker which is why I started using Nuke Indie amongst other reasons! I think Fusion is fine and if Nuke vanished overnight I'd get used to it within a week or 2. But ultimately, Nuke is the king! 👑
I wonder if fusion will replace AE as the starter tool of choice. I started in AE and when it came time to upgrade to Nuke the switch to nodes took a bit to get used to. If I had started on fusion I feel the learning curve to Nuke would have been a breeze.
I think it depends what you're doing. AE is definitely more geared towards motion graphics and animationy stuff. But if it's full blown VFX then yeah fusion would be a much better choice to start with. And you're right, the jump to nuke would be pretty straight forward!
I think there’s a solid chance! I work at my school’s video production team, and I’m starting to see that trend. More and more incoming students are starting off with Davinci, and therefore, they’re starting off in the Fusion page for effects as well. I don’t remember where I saw it, but I remember reading/ hearing that adobe used to make their software incredibly easy to pirate, which led to a whole generation learning on Adobe and then sticking with it. However, ever since it went creative cloud, it’s not as accessible to the everyday kid who wants to edit their videos for fun, and I think Davinci has pretty quickly eaten up that market share. It will be interesting to see how Fusion develops because of that.
I worked with Adobe tools extensively when I started my career, about 7 years in PPro and AE. Now I've been working with DaVinci for editing for about 7 years and motion graphics / VFX for about 3. For editing, grading and audio, DaVinci is hands down the more capable and streamlined software compared to the Adobe package. No need to export things in between, just switch tabs and use all these extremely powerful tools simultaneously. Fusion has a steeeeeep learning curve for someone coming from AE. The software UI, keyframe handling, node structure and tool portfolio are so different that it took me about 1-1.5 years to get comfortable enough that I didn't have to go to AE for my graphics too much. Now I work almost exclusively in Fusion, with just some small instances in AE. If the video is a long complex motion graphics thing, I would still use AE. For VFX, Fusion is better than AE hands down. When a project gets complicated, managing the effects in a node structure is much more efficient and transparent, nevermind talking about all the tools specifically designed for VFX in Fusion. For motion graphics, AE is better in some ways, whereas Fusion has some tools that AE doesn't offer. Benefits of AE over Fusion: - Vector workflows and integration with illustrator are lightyears ahead of Fusion (this might change in a year or two with shape nodes). - Timeline performance is better in my opinion. I haven't tested this thoroughly, but Fusion feels to clog up faster from simple to medium complexity projects. - Alignment and distribution tools + snapping are non-existent in Fusion (there are some third party plugins, however). - Moving keyframes around is more intuitive in AE, the Fusion UI can be quite cumbersome - AE's UI is was more intuitive to learn. You can choose objects in the viewport and move them as you want, whereas Fusion is more numbers and property based. - The timeline workflow is more commonly adopted and easy to understand. In AE you can just cut a layer and it will not be visible. In Fusion you COULD do this for nodes, but it's restricted by UI and underlying logic. Object visibility is controlled with merging nodes or node clusters together and animating blends or covering the screen with another object. Benefits of Fusion over AE: - Complex projects easier to navigate due to nodes (this increases your skill ceiling quite fast) - Colour grading workflows at your fingertips (e.g. filmic grading & effects on top of the motion graphics) - Some tools are great for motion graphics (modifiers like follower, anim curves) - 3D workflows - Most animation curve tools and UI choices are better than stock AE. - Precomps are node clusters, which is easier to manage. - Creating tools and templates is pretty easy, sharing node structures to colleagues is a breeze I probably missed a lot. But this is my experience thus far.
I've been using Fusion now for my recent freelance gigs because they were using Fusion. Had fun with it although I still prefer blender's compositor just because I'm used to it, no other reason.
@@AlfieVaughan haha. It's just because it's the one I'm used to but I'm sure that will change once I get more experience with other software. Have fun with the Curve Pigeon System I sent
Hey man, any chance you have a breakdown of the steps you took to get Fusion set up the way you did? Written or otherwise. It's hard to track down exactly what's needed to get it set up quite the same!
Hey! Changing the layout to mid flow gets a similar layout with the node graph on the right which is how I have nuke setup. You can do this under the fusion menu along the top. Then if you right click in the node graph you can set the nodes to snap together when they're aligned. I can't remember exactly what the setting is called and don't have fusion in front of me to check. But it should be pretty obvious what one it is. There's also a setting in there for making the nodes connect vertically instead of horizontally which is more like nuke too. Hope that helps!
@@AlfieVaughan Hmm I got most of it happening, but mid flow could not be found anywhere for the life of me! Thanks very much for your reply, I come back to this constantly, it's awesome to see real practiced vfx techniques versus fully homebrewers where there's a lot of guess work. Trying to find a workflow for stmaps outside of nuke. They're so useful and seemingly not really created or utilised anywhere else really which is baffling!
That is very cool, I would love to have a longer tutorial fusion for Nuke people, now that I'm working for myself I switched to davinci but fusion is still somehow uncomfortable for me
Thanks! A lot of people have asked for it but I don't feel like I know fusion well enough to teach it. What you see in this video is more or less everything I've learned to do in it 🤣
Alfie, what are your thoughts on playback and processing speeds, considering all the optimizations in DaVinci and the extensive support for GPU acceleration in almost every process? I've been using Nuke as my primary tool for quite some time, occasionally checking in on developments in Fusion. Is there a significant difference in the speed of file playback and the performance of heavy tools (such as CameraTracker)? Naturally, it would be best to test these aspects on complex scenes with defocusing, multipasses, and similar elements. But mb you could share your opinion, cuz I already see that Nuke is the slowest software atm in terms of playback files and working tools
For this video they seemed about the same but like you said these shots were very simple so not the best test. I haven't used fusion is any complex capacity with 3D defocusing etc so can't really comment on how they compare
Offtopic question: How important is video resolution to a VFX artist? I'm sure you're gonna say, that quality (well filmed, well lit) exceeds quantity (resolution). But I'm sure, things like masking and tracking is easier on higher resolution. Is there a sweet spot, where the video gets so heavy to work on, that the benefit dissapear and you'd rather have a lower resolution to work on? I'm of course asking, because I wanna know, if VFX artists are looking forward to working on Blackmagic 17K footage or they'd rather work on 2K Arri footage.
There's definitely a sweet spot. The jump from 2k to 4k is horrible. Even on the machines at work that cost $20k+ each. It's exponentially slower to playback, track, render etc. I think you'd be surprised to hear that almost everything still delivers in HD (or 2K), even films for cinema screens. Most projects I work on are shot in either 4.6k or 6k which is the native res for the cameras (usually an Arri Alexa mini). But when we ingest the footage our colour department scans everything down to 2.8k which is what we work on and the final deliveries are usually HD. High Res cameras have been around for years already, like the RED helium etc that shoot 8k. We work with that footage all the time just never at native res. So the 17k black magic files won't be worked on at 17k either. It's not physically possible even with the best computers
I've used Blenders compositor. It's ok but it lacks a lot of basic tools that would make anything more than basic overlaying of renders very difficult/ impossible. And I'm a massive fan of blender so I'm not saying that out of spite 🤣
Are you thinking of deep pixel compositing? They're not the same thing... Fusion doesn't support the type of deep I'm demonstrating in Nuke in this video
Deep pixel isn't actually deep compositing unfortunately. It's just badly named. In fusion, deep pixel compositing just means using the position pass. It's not the same thing as proper deep data
Ah, man, we’d love to see you create some tutorials on Fusion! There’s really not much quality content about it on UA-cam, especially from someone with your level of expertise.
Thanks a lot! A few people have asked. The truth is I don't really know it well enough to teach others. Id have to spend some time getting good at it first!
I’m super glad you found out how to get the colorspace working in Fusion! I’ll have to rewatch that at 0.5 speed, because that’s been a pain point for me for a while! When you say load the Aces config, do you mean load the actual file? Cause that was another pain point for me when using it, was anytime I created a new OCIO node, I had to point it to the ACES config again, so I just ended up always duplicating the OCIO node. Did loading the ACES config on a project settings level fix that for you? I would also love to see your take on the roto in Fusion. When I switched from Fusion to Nuke I had no idea what I was missing. I don’t remember if it’s out of beta yet, but Fusion has updated their roto node to have layers! I’m pretty stoked for it, but the feature still seems incomplete in comparison. Then finally, could you do a (pardon the pun) deeper dive on deep compositing? I have a pretty similar workflow to yours, between Nuke and Blender. From my knowledge, Blender is unable to render in deep, but it does have a world position pass, which I’m pretty sure Nuke and Fusion can use. Pretty sure Fusion calls it a Volume mask, or something like that. I haven’t messed with it to much, but is there any way to convert that world position pass to deep data in Nuke? Again not to familiar with deep, since my 3D rendering experience is limited to Cycles, but isn’t the magic with deep also that it samples the pixels behind an object as well? So if you have a blur, it can more realistically let the colors from the background peak through? Sorry for the long comment with lots of questions, but as a budding freelance compositor and 3D generalist, your channel has had some of the most useful, and practical information for someone like me. I really enjoy your content, and I’ve gotten a lot of value out of it over the past few years, so thank you!
I didn't use any OCIO nodes to be honest but this definitely worked for the viewer. Last time I just worked in linear as I couldn't figure out how to get the ACEScg plates to display correctly. I loaded the actual aces config.io file into the viewer like a LUT as you see in the video and then set my input and output colour spaces and then it behaves like nuke! Not sure I can stomach doing a roto video... 😅 Not exactly my favourite topic. But I've been told a few times about the roto update in fusion. Good they're keeping up. No the world position pass is a totally different thing to deep data. You can't convert it into deep and Blender can't render deep sadly. Maya and Houdini are the only 3D programs that can export it as far as I know. Fusion has something called deep pixel compositing which is misleading as it sounds like deep but it's actually just using the position pass from what I've read... Just bad terminology by the looks of it
@@AlfieVaughan thanks for the clarification on how Aces worked! As for deep, that’s what I thought, which is a shame. It’s definitely weird that Fusion uses the term deep, cause it is quite misleading, but using the world position pass is probably better than nothing
What about the speed difference between Nuke and Fusion ? The effects,especially the GPU accelerated ones and the speed of reading the files sequence into cache.
@@AlfieVaughan at some point I read opinions about how Fusion or Davinci was much faster since it took better use of Nvidia GPU, even of multiple cards at the same time, but seeing how Nuke is much more expensive, maybe that's not true.
@RealTimeFilms I don't know enough to go e a valid perspective on that I don't think 🤣 But nuke has a lot of GPU accelerated nodes too. Especially for heavy stuff
Thanks! I did install it initially but I found it was changing some hot keys I didn't want changed. So in the end I just rebound a few basic ones that I wanted. Like pressing tab to search for nodes etc
New here, Enjoyed your thoughts & professional opinion. Wanted to grasp more on where Fusion could potentially stand in the VFX world for comp. I am used to compositing smoke, fire, special effects, and light passes from Multi-layer EXR's into After Effects but its felt like a nightmare the last year for work. In love with Resolve's Grading System and it being also able to composite 3D Renders? A double whammy for me!. Great Video, I'm looking to move into Fusion away from AE and I think this just helped my decision.
There is a lot of history behind why some functions are missing in fusion, such as deep exr, so you need to bear with that. It has been a long time coming because when BMD took over fusion, a lot of the VFX end suffered because BMD prioritizes the needs of their davinci userbase, unfortunately, and so the legacy VFX fusion users have had their tool wishlist items held back or outright ignored because if resolve users don't want it or need it (being colourists, mostly) then things like deep exr in fusion simply fell off the radar despite the userbase for fusion demanding it for well over a decade. Case in point- before BMD took over fusion, they had a more intuitive and fully customizable GUI but then BMD took it over and 'wanted fusion to look just like davinci' and so a lot of the user friendly intuitive productivity aspects were outright removed. That said, deep exr WAS available in fusion v9 just prior to BMD owning fusion, and with the SDK changover, that was never updated... but it is coming to reactor soon, along with other major toolsets ;) Apart from that, it was good to see that you took time to get more familiar with fusion and gave it another go. Coming from an all nuke background, naturally you would find much of it bizarre. You seemed to get the gist of it after taking time to pick it up though, however there are still things you are not really handling in the most efficient manner :) ...the tracker can be used like it is in Nuke for example, and the colour tools are more capable than you think. Once you get more familiar with it, you'll appreciate it more. Good you're seeing it for yourself though. next time, try a big heavy 3D scene with loads of geometry. Compare that to how nuke handles it ;)
Thanks for taking the time to explain all that! That's really interesting to hear. Sounds like a missed opportunity for Black Magic but like you say, it's probably just not a priority for them!
@@AlfieVaughan One has to realize that BMD does not 'really' think of the foundry as a 'competitor'. The reason for this is because BMD makes cameras and other high-end hardware for film, TV, and video production. Their flagship software is davinci, which is a gold standard for grading feature films (ask yourself when was the last time there was PR for nukestudio colour grading a major feature.) Who BMD IS competing against is Premiere, AE, Flame, Baselight, and Scratch. Their interest with fusion's VFX end is less concerning for them, unfortunately. This is why we have sadly lagged a bit behind nuke for VFX toolsets such as deep exr, which, again, existed for fusion 9 from a community member, and we will be seeing it again in fusiuon 19 once BMD releases the SDK. Things were looking good until BMD bought eyeon software- just prior to the takeover, fusion was picking up again. I used fusion on some major television series and feature films back then, it wasn't uncommon. Which brings up the other point- BMD is pretty slow to release the SDK to 3rd party developers, so when major updates arrive, the serious VFX tools made by the community (just look in reactor) have to stand by for updates if they don't work in the newest release. That said, we have a powerful band of developers who are passionate about bringing excellent tools into fusion, and i know of many jaw dropping ones that will be coming soon. You will definitely want to keep your eyes open :) Oh one last note: did you install the nuke2fusion toolset from reactor? might have made the learning curve a little bit easier on you as well.
@@JAK-gh4ez It's great to hear from a seasoned professional and an "insider" that Fusion is being looked at with some real intent from the community side. I'd call myself an extremely heavy user of the entire DaVinci package as a whole (with Fusion for motion graphics). The power of creating in software where every aspect of filmmaking is integrated in such a smooth way is liberating and exhilarating. I would've never believed this was possible when I started years and years ago.
Wow, I've heard about similar stories but didn't know about it in details. When I was checking out old interviews and articles from 2014, and it seemed like people were very excited about Fusion and expected a great future for Fusion, especially in high-end VFX after BMD's acquisition. But it has been 10 year since the acquisition and BMD has barely added anything solid to Fusion, and most of the updates has been to make Fusion work better with the edit page and so DVR could take advantage of Fusion's powerful feature set, motion graphics update etc. But there hasn't been anything new to address most of the shortcomings of Fusion like a lack of improved EXR workflow, spline warp, better depth blur, proper stmap workflow, something like nuke's curve tool and the list goes on and on. Man, my mind still can't wrap around that Fusion used to have support for deep EXR, but it doesn't anymore. Although BMD is a much bigger company than Eyeon software, their main focus is in hardware and broadcast market, so even tho I don't think they are actively trying to kill Fusion by buying it from Eyeon 10 years ago and slowly not updating it much, so people would stop using it, but I think it's just BMD is a much smaller company than most other software companies like Adobe, Autodesk etc. and has an even smaller software dev team. So it just comes down to priority IG and not having enough resources to develop Fusion, as they're giving away Fusion standalone for free with Resolve studio now. As much as I'd like BMD to develop Fusion more instead of unnecessary cut page updates, for Fusion to be able to be a viable competitor to Nuke for high end VFX, it just seems like not going to happen unless they change their mind.
hey man, I just wondered why did you pick the black point in the "reversed" grade node, instead picking lift or offset in the normal grade,. btw the "toe" node gives much flimic look for blacks. thanks for the video!
Thanks! It's actually not an insta360 it's a Ricoh Theta Z1. And in terms of workflow with this camera, there is none 🤣 It merged and stitches the HDRI in camera which as far as I know no other 360 camera does at the moment. Very expensive... But save a lot of time. So the files come out the camera as 32bit EXRs and can go straight into Nuke/Blender for use
But are you merging with photishop? Because when I do that, I never get such a clean image like you. My ist still overexposed. So I must lower the exposure in the 3d packet. But then Imthe picture is to black.
I know your focus was on Nuke versus Resolve for VFX. But I'm wondering if they would be a great combo - Use Resolve for editing/grading and VFX, and for high-end VFX use Nuke and export it back to Resolve. Would this be a good workflow? I'm switching from Premiere/AE to Resolve/Nuke (mainly focusing on Resolve for now) so I'd love to hear your thoughts on them together.
That's exactly what I do for my videos. I edit and grade in resolve and think it's fantastic at both of those things. But for the VFX I export everything to EXR sequences and work in Nuke and Blender. I've got a whole video about it. It's slightly due an update but you get the idea... ua-cam.com/video/dlSOkXT7Lxk/v-deo.htmlsi=2wSKbHsUNzLh2EhM
It just takes more time and the end result is the same. To properly set up a lens grid you have to check all the points are exactly on the edges of the squares and move them manually if needed. Which takes a few minutes each time and then the end result is outputting the distortion. So using an STmap just skips that step and gets you the distortion instantly
Hi thanks for the run through, does anyone know why when i do a similar work flow and try to open the exported fbx file in cinema 4D, it always is the incorrect resolution? In cinema 4d. Thanks. It does not seem to retain the resolution of the timeline in the fbx export?
3D cameras don't have anything to do with the project resolution. They have their own sensory size etc that's independent from the resolution of the shots. I don't use C4D but I would imagine you just need to set the project resolution to be the correct size when you first make the scene. I do the same in Blender
I use Flame for running the edits at work. I haven't done any comping with it but we use it for conform/ online etc. I think it's great. I'd much rather comp in Nuke which I still do but as a timeline tool flame is brilliant
@@AlfieVaughan I'm curious about swapping out Flame Online for Fusion Online. We Grade in Resolve and to have both in 1 package would save a lot of time and headache exporting back and forward between the two. Do you think Fusion is a capable competitor to Flame for 2D Online work like that?
What's your verdict on performance differences between nuke and fusion? I haven't used fusion in a while but I find nuke to be a bit lacking in that regard as it is still very much single threaded and not using the GPU as much as some other modern software.
@@AlfieVaughan thanks for the reply. Btw; I recently dug a bit deeper into 3D tracking and watched an interview with the guy who created syntheyes. He mentions that it is basically better not to provide the solver with camera back data because the mm values you find online are often not entirely precise and neither are the focal lengths of lenses. The software is far better at calculating these values precisely just by itself. In my experience the hpix error usually just goes up when I input these so I was glad to find out that it's actually better to just don't input them. Cheers
@MrJemabaris yes Ive found this as well. I usually don't do this anymore in Nuke when I'm tracking. I just did it in this video so that the trackers were both working off the same information to see which produced a better result
Couldn't agree with you more! Also, I started earning more from each job because I comp in nuke.. Really levels things out. I'd go with fusion if I wasn't making money from vfx, small full service production company, or was currently using ae
Good to know! It's like how Flame used to cost an absolute fortune to have the machine and the license etc. so being a flame artist meant people could charge insane day rates
Haha! I get asked that a lot but I don't know it well enough to confidently say that the way I do things is correct. It's a different story with nuke because I was properly trained to use it. It's a good idea but I'm conscious of spreading the wrong info
@@AlfieVaughan You def know about VFX. I Used to work with After Effects, but lately I've been doing everything on Fusion (Got tired of blue screens lmao). If I'm correct the underlying principles and techniques are pretty much the same, Just in a different package. Also... my man, I'm no industry expert but I don't think there's a "rigght way" for doing VFX. If it does the job, it does the job. There's always some other "more optimal" way. I'm just excited there's more VFX Fusion tuts now days
The principles are definitely the same and that's the part that transfers to all software. But having said that, trying to teach them in a software Im not a familiar with will likely lead to doing things in an objectively incorrect way. For example, in this video I wasn't totally sure why the STmap lens distortion in fusion was different to Nuke and if I'd made a point of saying "this is how to use STmaps in Fusion" then I might be showing people a way that doesn't actually work. I totally get your point but there are also some things that can definitely be incorrect
That's deep pixel compositing. It's not the same thing. That's just using the position pass to create masks. Deep in Nuke is a totally different thing and way more powerful
@@AlfieVaughan thanks for the quick reply! It’s awesome to see creators interact with their audience, I know it’s not easy some times. Last question, do you have any tips for smooth playback in davinci? My computer is a little more up to date than yours but my playback is awful in fusion and just makes workflow frustrating when dealing with large composites. Thanks for your time !
To address your comment about deep compositing and the depth data, I'm pretty sure it is addressed by Millolab Tuts, Blender to Fusion vid, he mentions something about the coordinate system in Blender is different than the one in Fusion so you would have to switch some channels around to get the proper position data, just another "lovely" quirk about Fusion. So I believe it is possible, this is just not my traditional wheel house of knowledge. Once again, great video. (Hopefully my comment goes through even though I'm talking about another video)
Thanks! Unfortunately deep pixel compositing isn't the same as deep in Nuke and proper deep data. What youre referring to is actually just a common position pass AOV and not actual deep data. They're very different systems that for some reason have been confusingly named the same thing 🤣
How about bounding box management? For example, in Nuke you can set it to A side in your copy/merge(mask) node and have the alpha channel determine the bounding box of the premultiplied image before merging it back to the plate. In fusion I have no clue how I should do that. Also, cropping it doesnt seem to be the way to go. Any Fusion expert here who could help me out ?:)
it is even better in fusion most of the time it automatically sets to smallest size when there is alpha channel. or use auto domain or set domain node. and many nodes like masks have cliping control in image tab. and if you want to veiw box size with resolution enable it in desired veiwer by right clicking in any veiwer and region < click show DOD.
Worked with fusion on couple animation projects (both 2D and 3D). Liked it a lot, very capable. However I find Nuke much cleaner workflow, more intuitive, faster, and I love making tools too much. Fusion really missing shuffle node, when you have 10-20 layers of lights and all the passes imaginable it gets really messy. On the flip side, I prefer fusion's paint tools way more and Fusion's 3D tools are way ahead of the one in Nuke. Also Fusion's defocus workflow is a mess. On the other note: Still crying that Natron stopped active development, would be cool to see the video on it with your opinion.
@@IBpostproduction-e4q if it is what you mean it's not exactly the same. Nuke or even Natron Shuffle is closest to Fusion channel boolean. As far as I'm aware right now even with plugins it's not possible due to how fusion treats channels. The issue is not accessing the channels from the beginning, but rather being able to work with them down the line. Right now you have to have ton of loaders everywhere which gets messy pretty fast, with proper shuffle you can literally render one file that'll contain all the layers (you totally should never do that due to performance though)
Crumple pop by Boris FX is your friend to get rid of that annoying echo in the room where you do the conclusion. Other than that, thanks for the video.
Thanks for the recommendation! I don't usually record in there as the room is much bigger 🤣 I think resolve has a similar vocal cleanup tool. I'll investigate 😎
@@AlfieVaughan Wow. That’s the fastest response I’ve ever had on UA-cam! Perhaps there’s a Klaxon that goes off in your room for every comment that immediately makes you jump out of bed? I’m a hobbyist so can’t afford Nuke. Twenty-five years ago when I had no money or capable PC I did visit a company in Soho in London to enquire about Flame, which was the compositor of choice back then, The receptionist burst out laughing. It was £250,000 for a licence. And I didn’t even own a Mac! But those music videos by Chris Cunningham were good. So I’m pleased the prices have dropped somewhat. But maybe not enough. For us hobbyists Black Magic are opening doors I guess. What would be your opinion on Indie Nuke.vs Fusion? I’m trying to bend my head around Silhouette by Boris FX. Maybe that’s enough if only doing a 2D composition.
@farmersuiticles hahaha 🤣 I get a notification on my phone each time so I usually see a comment and reply straight away! Yes Flame was incredibly expensive back in the day 🤣 Not just the software but also having the hardware to properly run it would cost more than most peoples houses... But that's sometimes the cost for the best of the best. These days Autodesk have made the licenses more affordable and technology means you can run flame on a mid range laptop now. Personally I'd pick Nuke Indie over fusion. Indie is essentially fully featured except the render limit of 4k resolution and some integration limits with pipelines etc. I think in a head to head indie is the better choice. But hard to argue with fusions price tag
I think you're referring to fusion's "deep pixel" compositing. It's not the same thing (confusingly named!). Deep pixel in fusion is similar to using a position pass in Nuke. Totally different
You will have to look for a studio that uses fusion. Theyre not very common so your choices will be a lot more limited and it depends if they're hiring or not. For the best chance of being hired you're better off learning the most commonly used software
Big plus about Fusion for me is that I'm the only person in my studio that is using it. I've set up fusion render farm (which is free) on all PCs and now I have more then 20 PCs calculating my compositions. It's insanely fast and all that for 300$!
How is it a plus that you’re the only person using it. You can’t pass off the project, or collaborate with ease. If you go on vacation or are sick no one can open your project. Don’t say job security either because the moment those situations arise a smart lead [producer or owner] would restructure. I also don’t know why you have 20 computers rendering composites from a single artist when one decent computer can handle it in the evenings. Don’t tell me you’re using other people’s stations while they’re working. 🤦♂️
@I3ra that's quite a quite a narrow minded perspective in my opinion... While technically you're right about it being difficult to hand off, if you're doing a shot that you know will definitely not be picked up by someone else then it's not a big deal. Other software can have tools and things that are uniquely useful and good for problem solving. I've been using blender quite extensively for the last 7 years while working as a compositor at VFX studios. No one else uses it and can't pick it up if I do something in blender. But for standalone work where I know it's just me, it's solved some enormous problems that would have taken 5 times as long to do solely in comp. As for the render farm, valid point but maybe their individual machines aren't that good? I've also done jobs at 15k that took 7 hours to render overnight. If you're working late and need to send something quicker than 7 hours then it could be a 21 minute render with other machines. More power is never a bad thing
Because the sun is usually over exposed still even in the lowest exposure. You need it to be a tiny pin prick in the HDRI for the lighting to look real. If it's too big and blown out you get really soft shadows. So it's better to paint it out and then use a sun lamp as that lets you control it more effectively
I was hoping to get a one-time pay license for Nuke for my Synagogue. Once I get my Rabbi ordination, I'll be developing a video production studio where we will be creating documentaries from our Yeshiva. Subscriptions are the very reason why I will no longer use ADOBE software. Thank you for this video. I was wondering, is there a website that does step by step tutorials for Nuke? Thanks.
I think it's subscription only but if you're not making money from what you produce then you can use nuke non commercial which is free. Alternatively, fusion is davinci resolve is very good and free or you can buy the studio version which is about £240 as a one off payment
I used the lens grid we have at work to profile all my lenses. One of the perks of being at a studio! Without one there's not really any way to do it properly. You need to print off the chart and stick it on a big piece of wood backing so it's straight and then film all your lenses looking at it to extract the distortion data
They were both about the same. But it doesn't get much more simple than these shots so I'm not surprised. I haven't done any massive comps on fusion but it would be interesting to see!
Great point about how the further you get in your career, your projects will (hopefully) get higher end and bigger budgets..in which case the cost of your software starts to matter less and less to the individual...since those costs get absorbed by the projects. I get that this would 100% matter to freelancers, lone wolves, and start ups...but for medium to large film or commercial projects...i haven't really cared about how much something costs in a hot minute (and focusing on 100% the best software package to get the job done right...and quickly).
I don't get paid to promote Nuke. I just reached out to them after paying for Nuke myself for a couple of years and asked if they would be interested in providing a license. Which they did in exchange for me making some of their official tutorials. I don't think it changes my perspective at all. And like I said, if I wasn't getting it for free I would totally pay for Indie
Thanks! Could be but I doubt it as several 3D softwares are able to render it so it's not exclusive in that sense. I think it's more likely no one else has bothered putting it in a compositor as it's not an important feature to 99% of the user base
Thanks! I didn't really put any thought into the grade 🤣 just upped the contrast and saturation and added a vignette. It was just meant to be a cheesy look rather than anything serious
@@AlfieVaughan Yea I know! I am actually surprised you made yourself try fusion. I use it as my comp app, but only because I don't comp much. For me Nuke just clicks, channels, masks, everything seams well thought-out. Fusion often feels like 3 softwares glued together. Even node connecting is pain. If I's comp a lot, I'd pay for nuke.
I've tried it. Blenders compositor is terrible 🤣 I absolutely love Blender for it's 3D stuff but it's compositor is so behind anything else. It's basically useless for anything serious
Sshhh, Blender fan boys don't like their software to be criticized! They think it's the best thing, since sliced bread. Best editor, best compositor, best 3D modelling software etc. I've seen several Blender fans claiming that Blender beats everything on the market, when it comes to compositing.
Eyeon FUSION was the King of compositing up (and if you ask me it still is, Nuke is just hype) until The Foundry poured a lot of money onto marketing Nuke and offered incentives to studios to use Nuke. Eyeon was not very well organized and their marketing was non existent, back then since all major fx studios were using Fusion and its lic cost the same price a Nuke lic costs now. Thank God for Blackmagic for buying Fusion making it cheaper to own and developing DaVinci resolve to seamlessly work with it. Also Fusion has a prettier and more user friendly interface. Nuke is NOT better than Fusion.
Quite a bold statement to say one is definitely better than the other. They're both good. Fusion is incredible value but for the work I do on high end commercials, TV and film, nuke is much more suited to the workflows we use than fusion.
@@AlfieVaughan Well, my statement basically came from a friend and coworker of over 20 years that is now a big vfx supervisor and grew up with Fusion. I started on Fusion but I found a high paying job in another field so I stopped doing fx work, I only do it as a hobby on my own projects. Another thing I didn't mention is that there is no Union in US to represent and protect the fx artists who are underpaid and overworked and don't get paid much unless they have a supervising position.
Always enjoy your videos Alfie but not sure how you can be 100% objective if Nuke sponsor you UA-cam channel? I think both products are very close, Nuke though is a bit more polished and better supported but I also think a bit slower under the hood. The artist will always be the defining difference at this level IMHO, I still watch Nuke tuts to improve my work in Fusion.
Thanks! It's a fair question. The best answer I can give is that they don't give me any money or ask me to say anything good about Nuke. They just give me a license each year. And I approached them to ask for one. I also paid for Nuke Indie for 2 years out of my own money when I was doing some freelancing outside of my day job at The Mill. So I'd say I've put my money where my mouth is by actually using it as a customer before I was affiliated with them in any way. But you're right, now it's a bit of a different story!
So is Nuke unless you're working commercially! I do agree but like I said, this is from the perspective of a professional, not the average user. And also picking software based on how the UI looks is not a good idea 🤣
Working at some of the best VFX studios in the world for 7 years? 🤷♂️ But in all seriousness, by definition a professional is someone that gets paid to do a job. I get paid to spend 50+ hours a week working as a compositor. How's that?
You have to film one for your specific lens on a lens grid. We have one in our studio so I just used the one at work. But they're pretty massive so to make your own one is a bit complicated. Worth looking into if you want to get serious about lens distortion workflows
in Germany I am a freelance lecturer at an academy for post-production and teach beginners. I will definitely include your two videos on Fusion and Nuke, as well as your workflow DR->Nuke/Blender->DR, in my classes. They show very well that it’s not about which app is the best, but which solution is optimal for the chosen path! Thank you for your efforts in creating these clips.
Thank you! I hope your students like the videos :)
That's it, I AM DITCHING ADOBE!
Hahahaha 🤣 Definitely recommend it 👌 I've been editing in Resolve for about 4 years after ditching premiere. It's so much better. And fusion is a great AE alternative
@@AlfieVaughan Nice! Yeah, I want to finally take the jump, too! Only my client work keeps me on adobe software. I think I will learn it this summer. Your video is a great guide for me to get into it! Thanks! 😁🙌
I am switching to Adobe sadly because of school but I still prefer oyhers
Fusion is leaps and bounds ahead of AE when it comes to vfx compositing.
I would say AE is still better for motion graphics stuff, but other than that, I don’t find many reasons to stick to adobe
Yay! I'm glad to see you take another crack at it!
Thanks! It only took 10 months 🤣
I’m a Nuke artist trying to learn Fusion. Very helpful, thank you Alfie
No problem! Good luck :)
6:15 thanks for the shoutout dude! Hopefully I was able to help you learn Fusion, lol.
Glad to see you gave Fusion another try and have had a much better experience than the last time lol.
Fusion has a Stmap node called texture, but yeah it's not as good as the Nuke one or StMapper you got from Reactor as well. So hopefully in future BMD would add a proper STMap node in Fusion as well.
Yeah, it's true. Fusion doesn't have deep compositing built in. It has some tools under the deep pixels' category, but it's not the same as deep comp as those deep pixel tools were made before deep compositing became a thing as far as ik lmao.
Yeah, I definitely agree about the pricing as well, NukeX is too expensive for most people who are not VFX pros and at the top of the food chain where studios are making lots of money, the cost of the tool is pretty negligible. But IMO people who are freelancers or running smaller scale indie studios in other parts of the world where they don't make as much as studios in western countries if Fusion can do the job then why not. Great video regardless.
Hey how did you changed the layout of davinci resolve?
@@connorjade5460 Workspace > Layout presets > Fusion presets > Midflow.
@@connorjade5460 6:05
It was a very objective summary.
There is good, better and best. The more a device knows, the harder it is to learn.
Thanks Alfie.
STmaps and I think the roto workflow are definite pain points in Fusion. I didn't know what I was missing until I learned Nuke.
Thanks for pointing out the difference with depth channels and deep composting. I was a little confused on those.
My first freelance compositing gig was on anamorphic 2.0 footage. At the time, I was an avid Fusion user, but never really had to worry about distortion maps (should have, but didn’t). Needless to say, motion tracking and compositing the 3D elements became really difficult, and that’s what started my journey into Nuke Indie two years ago. And to your point, I had no idea what I was missing in terms of the roto workflow! I knew to break my roto up into simpler shapes, and track where ever possible, but man. Nuke’s roto node just smokes fusion in comparison.
Don’t get me wrong, I’m always taking a peak to see what features being pushed by BlackMagic. Since Nuke is a business expense, and a costly one at that, (Nuke indie costs pretty much the same as a creative cloud license,) I’m always eyeing Fusion to see if it can replace Nuke.
Glad you liked it! Fusion has something called "deep pixel compositing" which is actually just using a position pass from what I've seen. It's not actually deep which is slightly confusing...
man, those 9 minutes summarized my 3+ years trying to figure out the blender-fusion vfx workflow... I hope everyone taking on a remotely similar journey as me, finds this video right away!
Ah good! I'm glad. I've also got a video about my whole workflow that you might like. It's not blender to fusion but there's probably some gems in there too in terms of render setup etc from Blender
oh boy, took me some time to figure out how to deal with Agx (too used to linear workflow)
One thing I'm liking about this dude is that his viewpoint is very unbiased. Despite using nuke for more than 6 years, he still appreciates fusion and points out issues within nuke like the default layout (btw all software default layouts are bullshit😂). It's pretty rare to see such unbiased opinion. In pcbuilding communities, amd fan bois just hate intel and vice versa. I wish there's some guy who tells the pros and cons of a cpu manufacturer by being unbiased.
Haha! Thanks very much :)
@@AlfieVaughan mention not🗿
@AlfieVaughan If you ever decide to do an another part to this series I would recommend trying out the Hos_SplitEXR_Ultra script. (It already comes within reactor.) It’s basically where you take a loader node within fusion and select your EXR and then you can run the script on the node, it basically takes the channels of the EXR, eg. depth, position, diffuse ect and create separate nodes for each pass. Then you can combine the nodes together using the channel booleans node. Basically you’re doing multilayer EXR compositing.
Thanks for the tip! That's good to know. I suspected there would be something like that to make working with multilayer files easier. I didn't really need it for this video as there were only 2 channels but definitely good to know for stuff in the future 👍
Came here to say this - this being said I winced a little when you said “it’s just a different way of working” I work on tiny freelance projects (motion graphics and some basic CG comp) and coming from nuke the lack of proper arbitrary channels starts to drive you insane after a while. Like rebuilding a beauty in fusion is like 10 times longer in fusion haha, you were being very diplomatic
FYI, it's super eqsy to set up scene for 3D compositing is fusion. And you can do it wothin the free version of resolve.
I always forget about Reactor. I have Studio, but I use it primarily for color corrects, not comping. Might need to take a closer look after this. Great vid, my friend!
Thanks a lot! It's worth checking out. I don't think I'll continue to use fusion for comping but definitely for motion graphics during editing. And even for that I think some of the stuff on reactor will be useful
Another excellent video dude! Nice job. This kind of content is great. For years I've been debating giving Fusion a shot. You've convinced me to give it a go. Cheers man!
Thank you! It's surprisingly good :)
Love these in-depth composite workflow video. Davinci is just so good considering value but totally understand how pro Nuke is.
Thank you! Glad you liked it :)
Fusion have a built in node for using stmaps, it’s called Texture node. Also, stmaps are not the same from software to software and the stmapper node has settings for handling that.
Yeah I tried that but it doesn't work with alphas and overscan like I said in the video. Without that, it's basically useless for CG compositing as you need the alpha for overlaying it on the footage
@@AlfieVaughanwell, I personally use stmapper way more flexible.
Also, there are quite a few tools that Fusion have that Nuke does not. You should talk about those aswell. If you need I could point those out.
Yes I'd love to hear about the features fusion has over nuke. I tried to do as much prep for this video as I could to get the facts right but it's easy to miss things
I also just recognised your username from when I was looking into some of the nodes. I came across some of your videos and tools. Cool stuff!
Well done, Alfie. Much better this time. Still some Fusion stuff is missing, but overall, this is a much fairer comparison and I do appreciate the time you put into this.
Thank you! Out of interest, what did I miss?
@@AlfieVaughan Nothing major, and perhaps just a matter of personal opinion really.
1. I was hoping you'd use Fusion Studio since you were using the paid version. With the paid version of Resolve, you can also download Fusion Studio which is the stand-alone version of Fusion and is much more direct with regards to compositing and offers some better (in my opinion) workflows.
2. You can load multi-layer EXRs into Fusion Studio and use a Fuse (add-on) from Reactor to split the render passes out into their own loader (read) nodes. Though I have found that Fusion's handling of EXRs to be less efficient than Nuke and with heavy comps with many render layers and/or EXR sequences, this can slow down performance. In that case, the way you went about it is a good option.
3. Millolab has created, among others, a great macro for matching blacks that's based by Tony Lyons's tool for Nuke and will do a killer job and matching your blacks. Likewise, you can get a Grade fuse (node) that's sort of close to Nuke's grade node, though I admit, not as intuitive, but it works. Also, get nuke2fusion to set Fusion up to act like Nuke with regards to hot keys and layouts. A must have if you've come from Nuke, for sure. This is basically a personal preference thing and I know you most likely weren't aware of these 3rd party tools. I bring them up because they really do add significant quality of life improvements to Fusion and there are many more that just make working in Fusion more productive.
With that said, I completely agree with you about the lens distortion workflow, and I've wished for a long time that Fusion would add similar functionality to Nuke for this. I'll keep my fingers crossed. Also, the deep ability of Nuke is a given, though this doesn't affect me as I don't have the need. But I do get how it can be a deal breaker for those artists who need it in their workflow. For me, although I acknowledge Nuke's advantages, Fusion is just more practical for the work I do and the budgets I work with. Once again, great work on the video. Cheers!
some advantages Fusion has over Nuke: motion graphics, 3D particles, and can work seamlessly with audio in the Resolve version or import audio in the standalone version.
I use fusion for the motion graphics in my videos. Not that it's very much but it works well!
Thanks for taking to time to share your knowledge. Very helpful.
No problem!
In the Resolve FX category, there is a localized replace tool that can sample colors from the surrounding area and fill in the selected region. I don't know what it's called in English, but in the Chinese version, it's referred to as the "Localized Replace Tool."
Good to know! I'll have a look. Thank you :)
I think the node is called "Dead Pixel Fixer"
@@xanzuls No not this one. I changed the language to English and It is called “Patch Replacer”
@@Tensor_7 oh i see, i'm using the free version, it's a studio only node.
In Fusion, wouldn't the Lens Distort node do the undistort trick? Note that I am not referring to the Lens Distortion node which is a different node and it is a simple lens effect.
The Lens Distort is also able to load external Distortion Data.
Does it also work with the all the other channels in a multi layer exr? Everything I tried only applied the distortion to rgb
I think Deep compositing is probably the only main thing if you are not working for a major studio that has already Nuke into its workflow.
For me, I use fusion because it is better integrated for my braw footage workflow with resolve color grading.
Yeah it's only something that will matter to a very small amount of people. That's why I said this video is more from my perspective rather than just generally saying Nuke is better. Totally get the braw thing too. I'm about to switch from a Sony camera to the new Black Magic Pyxis 6k when it comes out in June and excited to try the raw workflow
Awesome video as always, thanks for sharing with us! I have one video idea just to toss out; it’d be great to have a professional compositor do a video of all the common terms such as Mult, Pre-mult, ST Map etc. Literally just a talking head video, I would watch it at least 10 times to commit it all to memory.
Thanks! Haha ok, I'll have a think about it
Fusion is so old, but opening it up reminded me of using it as a companion compositor to Maya
You should see Flame's interface 😅
Fusion is old but the UI has been updated after v9 which isn't really the case for maya or even nuke.
Yeah 😅 changing nukes appearence for the nodes to remove the gradient shading tricks me into feeling like it's not still a design from about 2007 🤣 But although it's not the prettiest I actually really like the interface for it's functionality. Its very well thought out
@@rano12321 Hey what's wrong with Maya bruteforcing all of its workload through one CPU thread?....🤣
@@buda3d2007 lol, this is why bifrost is a plugin and not part of native maya lmao.
Very curious as to what others think but would Fusion be a good beginner tool for artists just starting with a nodal compositing workflow? Like how After Effects was an introduction for many of us in to compositing in general.
It's definitely good in the sense that it's accessible and free. But I don't think it's any easier to learn than Nuke. Getting used to nodes is the main jump which obviously you'd have to do either way. I think the most important thing for beginners would be which one has better free tutorials online as that's what you need most early on
amazing! so glad i focused on resolve and not adobe stuff 2 years ago :D
Thank you! Yeah it's much better in my opinion
You are really Professional!
And your videos and tutorials make us Professional.
Please again breakdown your any VFX project
Thank you
Thanks! Will do
Would love to see more fusion based tutorial series!!
Thanks! A lot of people have asked about this. To be honest I don't know fusion well enough to feel comfortable teaching it. But maybe in the future
@@AlfieVaughan 😇
Super informative sir!
Though I was wondering if it's possible for you to share some plugins you used in this video.
I'll keep a note of them and install em as I progress with the learning. Thanks again for sharing such amazing content on Yt❤
Thanks! I didn't use many plugins in this video as I wanted to keep it fairly simple. Exponential glow is the main one
I've been using Nuke NC, but I'll be switching to Fusion soon. Mainly because of Nuke's restrictions (eg: I can't do any sort of lens distortion workflow because undistorted 1080p footage is bigger than 1080p, so I can't write out the undistorted footage for tracking. And I can't export a tracked camera from Nuke to Blender). Hopefully I'll like Fusion as much as Nuke.
Yeah I had similar issues when I started using NC. Slightly annoying. You can drop the overscan like I did in this video to export the UD plate. It just won't have the bits off the frame but I prefer to work that way anyway so I don't have to change my camera size in Blender.
There's no good way to get cameras out of Nuke NC sadly so I used to just track in Blender instead. But much prefer Nukes tracker which is why I started using Nuke Indie amongst other reasons!
I think Fusion is fine and if Nuke vanished overnight I'd get used to it within a week or 2. But ultimately, Nuke is the king! 👑
Fantastic Video! Thorough & succinct! ✨
Thank you Prashan! 🙏
I wonder if fusion will replace AE as the starter tool of choice. I started in AE and when it came time to upgrade to Nuke the switch to nodes took a bit to get used to.
If I had started on fusion I feel the learning curve to Nuke would have been a breeze.
I think it depends what you're doing. AE is definitely more geared towards motion graphics and animationy stuff. But if it's full blown VFX then yeah fusion would be a much better choice to start with. And you're right, the jump to nuke would be pretty straight forward!
I think there’s a solid chance! I work at my school’s video production team, and I’m starting to see that trend. More and more incoming students are starting off with Davinci, and therefore, they’re starting off in the Fusion page for effects as well. I don’t remember where I saw it, but I remember reading/ hearing that adobe used to make their software incredibly easy to pirate, which led to a whole generation learning on Adobe and then sticking with it. However, ever since it went creative cloud, it’s not as accessible to the everyday kid who wants to edit their videos for fun, and I think Davinci has pretty quickly eaten up that market share. It will be interesting to see how Fusion develops because of that.
for vfx fusion is leaps and bounds ahead of ae.
I worked with Adobe tools extensively when I started my career, about 7 years in PPro and AE. Now I've been working with DaVinci for editing for about 7 years and motion graphics / VFX for about 3.
For editing, grading and audio, DaVinci is hands down the more capable and streamlined software compared to the Adobe package. No need to export things in between, just switch tabs and use all these extremely powerful tools simultaneously.
Fusion has a steeeeeep learning curve for someone coming from AE. The software UI, keyframe handling, node structure and tool portfolio are so different that it took me about 1-1.5 years to get comfortable enough that I didn't have to go to AE for my graphics too much. Now I work almost exclusively in Fusion, with just some small instances in AE. If the video is a long complex motion graphics thing, I would still use AE.
For VFX, Fusion is better than AE hands down. When a project gets complicated, managing the effects in a node structure is much more efficient and transparent, nevermind talking about all the tools specifically designed for VFX in Fusion.
For motion graphics, AE is better in some ways, whereas Fusion has some tools that AE doesn't offer.
Benefits of AE over Fusion:
- Vector workflows and integration with illustrator are lightyears ahead of Fusion (this might change in a year or two with shape nodes).
- Timeline performance is better in my opinion. I haven't tested this thoroughly, but Fusion feels to clog up faster from simple to medium complexity projects.
- Alignment and distribution tools + snapping are non-existent in Fusion (there are some third party plugins, however).
- Moving keyframes around is more intuitive in AE, the Fusion UI can be quite cumbersome
- AE's UI is was more intuitive to learn. You can choose objects in the viewport and move them as you want, whereas Fusion is more numbers and property based.
- The timeline workflow is more commonly adopted and easy to understand. In AE you can just cut a layer and it will not be visible. In Fusion you COULD do this for nodes, but it's restricted by UI and underlying logic. Object visibility is controlled with merging nodes or node clusters together and animating blends or covering the screen with another object.
Benefits of Fusion over AE:
- Complex projects easier to navigate due to nodes (this increases your skill ceiling quite fast)
- Colour grading workflows at your fingertips (e.g. filmic grading & effects on top of the motion graphics)
- Some tools are great for motion graphics (modifiers like follower, anim curves)
- 3D workflows
- Most animation curve tools and UI choices are better than stock AE.
- Precomps are node clusters, which is easier to manage.
- Creating tools and templates is pretty easy, sharing node structures to colleagues is a breeze
I probably missed a lot. But this is my experience thus far.
I've been using Fusion now for my recent freelance gigs because they were using Fusion. Had fun with it although I still prefer blender's compositor just because I'm used to it, no other reason.
Good to know! Didn't realise you preferred blenders compositor 🤣 I've never heard anyone else say that
@@AlfieVaughan haha. It's just because it's the one I'm used to but I'm sure that will change once I get more experience with other software. Have fun with the Curve Pigeon System I sent
Hey man, any chance you have a breakdown of the steps you took to get Fusion set up the way you did? Written or otherwise. It's hard to track down exactly what's needed to get it set up quite the same!
Hey! Changing the layout to mid flow gets a similar layout with the node graph on the right which is how I have nuke setup. You can do this under the fusion menu along the top. Then if you right click in the node graph you can set the nodes to snap together when they're aligned. I can't remember exactly what the setting is called and don't have fusion in front of me to check. But it should be pretty obvious what one it is. There's also a setting in there for making the nodes connect vertically instead of horizontally which is more like nuke too. Hope that helps!
@@AlfieVaughan Hmm I got most of it happening, but mid flow could not be found anywhere for the life of me! Thanks very much for your reply, I come back to this constantly, it's awesome to see real practiced vfx techniques versus fully homebrewers where there's a lot of guess work. Trying to find a workflow for stmaps outside of nuke. They're so useful and seemingly not really created or utilised anywhere else really which is baffling!
That is very cool, I would love to have a longer tutorial fusion for Nuke people, now that I'm working for myself I switched to davinci but fusion is still somehow uncomfortable for me
Thanks! A lot of people have asked for it but I don't feel like I know fusion well enough to teach it. What you see in this video is more or less everything I've learned to do in it 🤣
Alfie, what are your thoughts on playback and processing speeds, considering all the optimizations in DaVinci and the extensive support for GPU acceleration in almost every process? I've been using Nuke as my primary tool for quite some time, occasionally checking in on developments in Fusion. Is there a significant difference in the speed of file playback and the performance of heavy tools (such as CameraTracker)? Naturally, it would be best to test these aspects on complex scenes with defocusing, multipasses, and similar elements. But mb you could share your opinion, cuz I already see that Nuke is the slowest software atm in terms of playback files and working tools
For this video they seemed about the same but like you said these shots were very simple so not the best test. I haven't used fusion is any complex capacity with 3D defocusing etc so can't really comment on how they compare
Offtopic question:
How important is video resolution to a VFX artist? I'm sure you're gonna say, that quality (well filmed, well lit) exceeds quantity (resolution). But I'm sure, things like masking and tracking is easier on higher resolution. Is there a sweet spot, where the video gets so heavy to work on, that the benefit dissapear and you'd rather have a lower resolution to work on?
I'm of course asking, because I wanna know, if VFX artists are looking forward to working on Blackmagic 17K footage or they'd rather work on 2K Arri footage.
There's definitely a sweet spot. The jump from 2k to 4k is horrible. Even on the machines at work that cost $20k+ each. It's exponentially slower to playback, track, render etc. I think you'd be surprised to hear that almost everything still delivers in HD (or 2K), even films for cinema screens. Most projects I work on are shot in either 4.6k or 6k which is the native res for the cameras (usually an Arri Alexa mini). But when we ingest the footage our colour department scans everything down to 2.8k which is what we work on and the final deliveries are usually HD. High Res cameras have been around for years already, like the RED helium etc that shoot 8k. We work with that footage all the time just never at native res. So the 17k black magic files won't be worked on at 17k either. It's not physically possible even with the best computers
I would love to see you do compositing in blender and see if it is even possible.
I've used Blenders compositor. It's ok but it lacks a lot of basic tools that would make anything more than basic overlaying of renders very difficult/ impossible. And I'm a massive fan of blender so I'm not saying that out of spite 🤣
Fusion can do Deep Compositing just fine out of the box for many years now; first shown years ago with the "Anonymous" movie...
Are you thinking of deep pixel compositing? They're not the same thing... Fusion doesn't support the type of deep I'm demonstrating in Nuke in this video
Come on over to our side, man. You know you want to!
🤣🤣🤣
For Deep compositing, I think the tools you're looking for are under the Deep Pixel menu in fusion
Deep pixel isn't actually deep compositing unfortunately. It's just badly named. In fusion, deep pixel compositing just means using the position pass. It's not the same thing as proper deep data
Ah, man, we’d love to see you create some tutorials on Fusion! There’s really not much quality content about it on UA-cam, especially from someone with your level of expertise.
Thanks a lot! A few people have asked. The truth is I don't really know it well enough to teach others. Id have to spend some time getting good at it first!
I’m super glad you found out how to get the colorspace working in Fusion! I’ll have to rewatch that at 0.5 speed, because that’s been a pain point for me for a while! When you say load the Aces config, do you mean load the actual file? Cause that was another pain point for me when using it, was anytime I created a new OCIO node, I had to point it to the ACES config again, so I just ended up always duplicating the OCIO node. Did loading the ACES config on a project settings level fix that for you?
I would also love to see your take on the roto in Fusion. When I switched from Fusion to Nuke I had no idea what I was missing. I don’t remember if it’s out of beta yet, but Fusion has updated their roto node to have layers! I’m pretty stoked for it, but the feature still seems incomplete in comparison.
Then finally, could you do a (pardon the pun) deeper dive on deep compositing? I have a pretty similar workflow to yours, between Nuke and Blender. From my knowledge, Blender is unable to render in deep, but it does have a world position pass, which I’m pretty sure Nuke and Fusion can use. Pretty sure Fusion calls it a Volume mask, or something like that. I haven’t messed with it to much, but is there any way to convert that world position pass to deep data in Nuke? Again not to familiar with deep, since my 3D rendering experience is limited to Cycles, but isn’t the magic with deep also that it samples the pixels behind an object as well? So if you have a blur, it can more realistically let the colors from the background peak through?
Sorry for the long comment with lots of questions, but as a budding freelance compositor and 3D generalist, your channel has had some of the most useful, and practical information for someone like me. I really enjoy your content, and I’ve gotten a lot of value out of it over the past few years, so thank you!
I didn't use any OCIO nodes to be honest but this definitely worked for the viewer. Last time I just worked in linear as I couldn't figure out how to get the ACEScg plates to display correctly. I loaded the actual aces config.io file into the viewer like a LUT as you see in the video and then set my input and output colour spaces and then it behaves like nuke!
Not sure I can stomach doing a roto video... 😅 Not exactly my favourite topic. But I've been told a few times about the roto update in fusion. Good they're keeping up.
No the world position pass is a totally different thing to deep data. You can't convert it into deep and Blender can't render deep sadly. Maya and Houdini are the only 3D programs that can export it as far as I know. Fusion has something called deep pixel compositing which is misleading as it sounds like deep but it's actually just using the position pass from what I've read... Just bad terminology by the looks of it
@@AlfieVaughan thanks for the clarification on how Aces worked! As for deep, that’s what I thought, which is a shame. It’s definitely weird that Fusion uses the term deep, cause it is quite misleading, but using the world position pass is probably better than nothing
Well how to setup fusion like this, Is there a tutorial
I couldn't find any that covered it all. Some bits I researched others I got told by fusion users
What about the speed difference between Nuke and Fusion ? The effects,especially the GPU accelerated ones and the speed of reading the files sequence into cache.
I didn't notice a speed difference between the 2 in this video but the comps were quite simple. Probably not the best test
@@AlfieVaughan at some point I read opinions about how Fusion or Davinci was much faster since it took better use of Nvidia GPU, even of multiple cards at the same time, but seeing how Nuke is much more expensive, maybe that's not true.
@RealTimeFilms I don't know enough to go e a valid perspective on that I don't think 🤣 But nuke has a lot of GPU accelerated nodes too. Especially for heavy stuff
Great comparison. Which version of Fusion did you use? DaVinci Resolve, DaVinci Resolve Studio or Fusion Studio?
Thank you! I'm using Resolve Studio here
Great video! Are you using any Nuke2Fusion style shortkeys and mods?
Thanks! I did install it initially but I found it was changing some hot keys I didn't want changed. So in the end I just rebound a few basic ones that I wanted. Like pressing tab to search for nodes etc
New here, Enjoyed your thoughts & professional opinion. Wanted to grasp more on where Fusion could potentially stand in the VFX world for comp. I am used to compositing smoke, fire, special effects, and light passes from Multi-layer EXR's into After Effects but its felt like a nightmare the last year for work. In love with Resolve's Grading System and it being also able to composite 3D Renders? A double whammy for me!. Great Video, I'm looking to move into Fusion away from AE and I think this just helped my decision.
Thanks a lot! I think it's a good move from AE and I'm sure you'll be happy once you do
There is a lot of history behind why some functions are missing in fusion, such as deep exr, so you need to bear with that. It has been a long time coming because when BMD took over fusion, a lot of the VFX end suffered because BMD prioritizes the needs of their davinci userbase, unfortunately, and so the legacy VFX fusion users have had their tool wishlist items held back or outright ignored because if resolve users don't want it or need it (being colourists, mostly) then things like deep exr in fusion simply fell off the radar despite the userbase for fusion demanding it for well over a decade. Case in point- before BMD took over fusion, they had a more intuitive and fully customizable GUI but then BMD took it over and 'wanted fusion to look just like davinci' and so a lot of the user friendly intuitive productivity aspects were outright removed.
That said, deep exr WAS available in fusion v9 just prior to BMD owning fusion, and with the SDK changover, that was never updated... but it is coming to reactor soon, along with other major toolsets ;)
Apart from that, it was good to see that you took time to get more familiar with fusion and gave it another go. Coming from an all nuke background, naturally you would find much of it bizarre. You seemed to get the gist of it after taking time to pick it up though, however there are still things you are not really handling in the most efficient manner :)
...the tracker can be used like it is in Nuke for example, and the colour tools are more capable than you think. Once you get more familiar with it, you'll appreciate it more. Good you're seeing it for yourself though. next time, try a big heavy 3D scene with loads of geometry. Compare that to how nuke handles it ;)
Thanks for taking the time to explain all that! That's really interesting to hear. Sounds like a missed opportunity for Black Magic but like you say, it's probably just not a priority for them!
@@AlfieVaughan One has to realize that BMD does not 'really' think of the foundry as a 'competitor'. The reason for this is because BMD makes cameras and other high-end hardware for film, TV, and video production. Their flagship software is davinci, which is a gold standard for grading feature films (ask yourself when was the last time there was PR for nukestudio colour grading a major feature.)
Who BMD IS competing against is Premiere, AE, Flame, Baselight, and Scratch. Their interest with fusion's VFX end is less concerning for them, unfortunately.
This is why we have sadly lagged a bit behind nuke for VFX toolsets such as deep exr, which, again, existed for fusion 9 from a community member, and we will be seeing it again in fusiuon 19 once BMD releases the SDK. Things were looking good until BMD bought eyeon software- just prior to the takeover, fusion was picking up again. I used fusion on some major television series and feature films back then, it wasn't uncommon.
Which brings up the other point- BMD is pretty slow to release the SDK to 3rd party developers, so when major updates arrive, the serious VFX tools made by the community (just look in reactor) have to stand by for updates if they don't work in the newest release.
That said, we have a powerful band of developers who are passionate about bringing excellent tools into fusion, and i know of many jaw dropping ones that will be coming soon.
You will definitely want to keep your eyes open :)
Oh one last note: did you install the nuke2fusion toolset from reactor? might have made the learning curve a little bit easier on you as well.
@@JAK-gh4ez It's great to hear from a seasoned professional and an "insider" that Fusion is being looked at with some real intent from the community side.
I'd call myself an extremely heavy user of the entire DaVinci package as a whole (with Fusion for motion graphics). The power of creating in software where every aspect of filmmaking is integrated in such a smooth way is liberating and exhilarating. I would've never believed this was possible when I started years and years ago.
Wow, I've heard about similar stories but didn't know about it in details. When I was checking out old interviews and articles from 2014, and it seemed like people were very excited about Fusion and expected a great future for Fusion, especially in high-end VFX after BMD's acquisition. But it has been 10 year since the acquisition and BMD has barely added anything solid to Fusion, and most of the updates has been to make Fusion work better with the edit page and so DVR could take advantage of Fusion's powerful feature set, motion graphics update etc. But there hasn't been anything new to address most of the shortcomings of Fusion like a lack of improved EXR workflow, spline warp, better depth blur, proper stmap workflow, something like nuke's curve tool and the list goes on and on. Man, my mind still can't wrap around that Fusion used to have support for deep EXR, but it doesn't anymore.
Although BMD is a much bigger company than Eyeon software, their main focus is in hardware and broadcast market, so even tho I don't think they are actively trying to kill Fusion by buying it from Eyeon 10 years ago and slowly not updating it much, so people would stop using it, but I think it's just BMD is a much smaller company than most other software companies like Adobe, Autodesk etc. and has an even smaller software dev team. So it just comes down to priority IG and not having enough resources to develop Fusion, as they're giving away Fusion standalone for free with Resolve studio now. As much as I'd like BMD to develop Fusion more instead of unnecessary cut page updates, for Fusion to be able to be a viable competitor to Nuke for high end VFX, it just seems like not going to happen unless they change their mind.
hey man, I just wondered why did you pick the black point in the "reversed" grade node, instead picking lift or offset in the normal grade,. btw the "toe" node gives much flimic look for blacks. thanks for the video!
That's just how I learned to do it. There's lots of ways to do things in Nuke!
@@AlfieVaughan yep, I had just wondered :)
Hi.
Great comparison. I would like to see your hdri Workflow with your insta360.
Would that be possible?
Thanks! It's actually not an insta360 it's a Ricoh Theta Z1. And in terms of workflow with this camera, there is none 🤣 It merged and stitches the HDRI in camera which as far as I know no other 360 camera does at the moment. Very expensive... But save a lot of time. So the files come out the camera as 32bit EXRs and can go straight into Nuke/Blender for use
But are you merging with photishop? Because when I do that, I never get such a clean image like you. My ist still overexposed. So I must lower the exposure in the 3d packet. But then Imthe picture is to black.
Nope. All the merging is done in the camera. I don't have to process it at all. It literally comes out the camera ready to use. That's why I bought it
nuke plugins be wild but cant sell my kidney
🤣🤣🤣 yeah they're a bit pricey. Lots of good free ones though!
where do you get the stmap of your lens?
I shot it on the lens grids at work. I work at a vfx studio
I know your focus was on Nuke versus Resolve for VFX. But I'm wondering if they would be a great combo - Use Resolve for editing/grading and VFX, and for high-end VFX use Nuke and export it back to Resolve. Would this be a good workflow? I'm switching from Premiere/AE to Resolve/Nuke (mainly focusing on Resolve for now) so I'd love to hear your thoughts on them together.
That's exactly what I do for my videos. I edit and grade in resolve and think it's fantastic at both of those things. But for the VFX I export everything to EXR sequences and work in Nuke and Blender. I've got a whole video about it. It's slightly due an update but you get the idea...
ua-cam.com/video/dlSOkXT7Lxk/v-deo.htmlsi=2wSKbHsUNzLh2EhM
@@AlfieVaughan I reccomend reading up on DaVinci's VFX Connect feature in the manual. Should speed up your workflow. :)
why not use Fusion?
Was wondering what are rage benefits of using a STmap instead of just dropping in your lens grid and undistorting it
It just takes more time and the end result is the same. To properly set up a lens grid you have to check all the points are exactly on the edges of the squares and move them manually if needed. Which takes a few minutes each time and then the end result is outputting the distortion. So using an STmap just skips that step and gets you the distortion instantly
Hi thanks for the run through, does anyone know why when i do a similar work flow and try to open the exported fbx file in cinema 4D, it always is the incorrect resolution? In cinema 4d. Thanks. It does not seem to retain the resolution of the timeline in the fbx export?
3D cameras don't have anything to do with the project resolution. They have their own sensory size etc that's independent from the resolution of the shots. I don't use C4D but I would imagine you just need to set the project resolution to be the correct size when you first make the scene. I do the same in Blender
How familiar are you with Flame? And how good do you think Resolve/Fusion is as an alternative for Flame Online editing?
I use Flame for running the edits at work. I haven't done any comping with it but we use it for conform/ online etc. I think it's great. I'd much rather comp in Nuke which I still do but as a timeline tool flame is brilliant
@@AlfieVaughan I'm curious about swapping out Flame Online for Fusion Online. We Grade in Resolve and to have both in 1 package would save a lot of time and headache exporting back and forward between the two. Do you think Fusion is a capable competitor to Flame for 2D Online work like that?
@marty9369 it's difficult to say as I haven't done any online in resolve. But it's a very capable editing software... So I don't see why not
What's your verdict on performance differences between nuke and fusion? I haven't used fusion in a while but I find nuke to be a bit lacking in that regard as it is still very much single threaded and not using the GPU as much as some other modern software.
They seemed about the same here in terms of playback and render time. Although these were both very basic shots so not sure it was the best test!
@@AlfieVaughan thanks for the reply. Btw; I recently dug a bit deeper into 3D tracking and watched an interview with the guy who created syntheyes. He mentions that it is basically better not to provide the solver with camera back data because the mm values you find online are often not entirely precise and neither are the focal lengths of lenses. The software is far better at calculating these values precisely just by itself. In my experience the hpix error usually just goes up when I input these so I was glad to find out that it's actually better to just don't input them. Cheers
@MrJemabaris yes Ive found this as well. I usually don't do this anymore in Nuke when I'm tracking. I just did it in this video so that the trackers were both working off the same information to see which produced a better result
Amazing
Thanks!
Couldn't agree with you more! Also, I started earning more from each job because I comp in nuke.. Really levels things out. I'd go with fusion if I wasn't making money from vfx, small full service production company, or was currently using ae
Good to know! It's like how Flame used to cost an absolute fortune to have the machine and the license etc. so being a flame artist meant people could charge insane day rates
If you started making VFX tutorials for fusion. I WOULD EAT THOSE SO BAD. If there's one thing lacking is tuts for more complex workflos in fusion
Haha! I get asked that a lot but I don't know it well enough to confidently say that the way I do things is correct. It's a different story with nuke because I was properly trained to use it. It's a good idea but I'm conscious of spreading the wrong info
@@AlfieVaughan You def know about VFX. I Used to work with After Effects, but lately I've been doing everything on Fusion (Got tired of blue screens lmao). If I'm correct the underlying principles and techniques are pretty much the same, Just in a different package. Also... my man, I'm no industry expert but I don't think there's a "rigght way" for doing VFX. If it does the job, it does the job. There's always some other "more optimal" way. I'm just excited there's more VFX Fusion tuts now days
The principles are definitely the same and that's the part that transfers to all software. But having said that, trying to teach them in a software Im not a familiar with will likely lead to doing things in an objectively incorrect way. For example, in this video I wasn't totally sure why the STmap lens distortion in fusion was different to Nuke and if I'd made a point of saying "this is how to use STmaps in Fusion" then I might be showing people a way that doesn't actually work. I totally get your point but there are also some things that can definitely be incorrect
You can do deep comp in Fusion
That's deep pixel compositing. It's not the same thing. That's just using the position pass to create masks. Deep in Nuke is a totally different thing and way more powerful
"Oh Shit... Here we go again..." 0:00
🤣🤣🤣🤣🤣
I think it would be nice to see a slow step by step tutorial about this.
As in a step by step on the fusion part of the video? Comping the CG?
❌❌❌❌❌❌❌❌❌❌❌❌❌Brother how did you arranged the panels in fusion. I thought it's not possible. Is it? . Or it's because of your screen size?
You can't arrange them properly but there are several different layout presets to choose from under the fusion --> layout menu
@@AlfieVaughan aahaa. Got it. Thanks
What are your computer specs? Love the content btw
Thanks! I have a 3090, i7 6700k and 32GB of RAM
@@AlfieVaughan thanks for the quick reply! It’s awesome to see creators interact with their audience, I know it’s not easy some times.
Last question, do you have any tips for smooth playback in davinci? My computer is a little more up to date than yours but my playback is awful in fusion and just makes workflow frustrating when dealing with large composites.
Thanks for your time !
I can't see your Amazon Product VFX Breakdown, You delete it from UA-cam?
That was in my "what's it like to work at a VFX studio" video. It's still live!
@@AlfieVaughan Ok Thank You
To address your comment about deep compositing and the depth data, I'm pretty sure it is addressed by Millolab Tuts, Blender to Fusion vid, he mentions something about the coordinate system in Blender is different than the one in Fusion so you would have to switch some channels around to get the proper position data, just another "lovely" quirk about Fusion.
So I believe it is possible, this is just not my traditional wheel house of knowledge.
Once again, great video.
(Hopefully my comment goes through even though I'm talking about another video)
Thanks! Unfortunately deep pixel compositing isn't the same as deep in Nuke and proper deep data. What youre referring to is actually just a common position pass AOV and not actual deep data. They're very different systems that for some reason have been confusingly named the same thing 🤣
@@AlfieVaughan Alright gotcha, I appreciate the clarification
How about bounding box management? For example, in Nuke you can set it to A side in your copy/merge(mask) node and have the alpha channel determine the bounding box of the premultiplied image before merging it back to the plate. In fusion I have no clue how I should do that. Also, cropping it doesnt seem to be the way to go. Any Fusion expert here who could help me out ?:)
try the matte control node.
it is even better in fusion most of the time it automatically sets to smallest size when there is alpha channel. or use auto domain or set domain node. and many nodes like masks have cliping control in image tab. and if you want to veiw box size with resolution enable it in desired veiwer by right clicking in any veiwer and region < click show DOD.
@@aman34587 Thanks! :)
Worked with fusion on couple animation projects (both 2D and 3D). Liked it a lot, very capable. However I find Nuke much cleaner workflow, more intuitive, faster, and I love making tools too much. Fusion really missing shuffle node, when you have 10-20 layers of lights and all the passes imaginable it gets really messy. On the flip side, I prefer fusion's paint tools way more and Fusion's 3D tools are way ahead of the one in Nuke. Also Fusion's defocus workflow is a mess. On the other note: Still crying that Natron stopped active development, would be cool to see the video on it with your opinion.
Sounds very similar to my experience!
the Fusion Reactor plugin has a solution that helps separate channels similar to the shuffle mode
@@IBpostproduction-e4q you mean hos_Split_exr?
@@arseniysemin1361 that may be it. I'm still learning myself
@@IBpostproduction-e4q if it is what you mean it's not exactly the same. Nuke or even Natron Shuffle is closest to Fusion channel boolean. As far as I'm aware right now even with plugins it's not possible due to how fusion treats channels. The issue is not accessing the channels from the beginning, but rather being able to work with them down the line. Right now you have to have ton of loaders everywhere which gets messy pretty fast, with proper shuffle you can literally render one file that'll contain all the layers (you totally should never do that due to performance though)
Crumple pop by Boris FX is your friend to get rid of that annoying echo in the room where you do the conclusion.
Other than that, thanks for the video.
Thanks for the recommendation! I don't usually record in there as the room is much bigger 🤣 I think resolve has a similar vocal cleanup tool. I'll investigate 😎
@@AlfieVaughan Wow. That’s the fastest response I’ve ever had on UA-cam! Perhaps there’s a Klaxon that goes off in your room for every comment that immediately makes you jump out of bed?
I’m a hobbyist so can’t afford Nuke. Twenty-five years ago when I had no money or capable PC I did visit a company in Soho in London to enquire about Flame, which was the compositor of choice back then, The receptionist burst out laughing. It was £250,000 for a licence. And I didn’t even own a Mac!
But those music videos by Chris Cunningham were good.
So I’m pleased the prices have dropped somewhat. But maybe not enough. For us hobbyists Black Magic are opening doors I guess.
What would be your opinion on Indie Nuke.vs Fusion?
I’m trying to bend my head around Silhouette by Boris FX. Maybe that’s enough if only doing a 2D composition.
@farmersuiticles hahaha 🤣 I get a notification on my phone each time so I usually see a comment and reply straight away!
Yes Flame was incredibly expensive back in the day 🤣 Not just the software but also having the hardware to properly run it would cost more than most peoples houses... But that's sometimes the cost for the best of the best. These days Autodesk have made the licenses more affordable and technology means you can run flame on a mid range laptop now.
Personally I'd pick Nuke Indie over fusion. Indie is essentially fully featured except the render limit of 4k resolution and some integration limits with pipelines etc. I think in a head to head indie is the better choice. But hard to argue with fusions price tag
@@AlfieVaughan Yeah but I can probably afford £400 a year. Thanks for your response.
Fusion 19 does deep compositing
I think you're referring to fusion's "deep pixel" compositing. It's not the same thing (confusingly named!). Deep pixel in fusion is similar to using a position pass in Nuke. Totally different
yeah, it's a good money saver 10% of the price of Nuke
Definitely!
So if Fusion were to add Deep Compositing it would be comparable? Good to know.
It still is comparable. Deep compositing isn't a useful feature to 99% of people using the software
Need blender aov pass like position with nuke
What do you mean? Nuke can work with position passes from Blender
I am learning fusion for vfx. I don't know if i can get a job as a fusion vfx artist
You will have to look for a studio that uses fusion. Theyre not very common so your choices will be a lot more limited and it depends if they're hiring or not. For the best chance of being hired you're better off learning the most commonly used software
Big plus about Fusion for me is that I'm the only person in my studio that is using it. I've set up fusion render farm (which is free) on all PCs and now I have more then 20 PCs calculating my compositions. It's insanely fast and all that for 300$!
Haha that's definitely a bonus 😁
How is it a plus that you’re the only person using it. You can’t pass off the project, or collaborate with ease. If you go on vacation or are sick no one can open your project. Don’t say job security either because the moment those situations arise a smart lead [producer or owner] would restructure.
I also don’t know why you have 20 computers rendering composites from a single artist when one decent computer can handle it in the evenings. Don’t tell me you’re using other people’s stations while they’re working. 🤦♂️
@I3ra that's quite a quite a narrow minded perspective in my opinion... While technically you're right about it being difficult to hand off, if you're doing a shot that you know will definitely not be picked up by someone else then it's not a big deal. Other software can have tools and things that are uniquely useful and good for problem solving. I've been using blender quite extensively for the last 7 years while working as a compositor at VFX studios. No one else uses it and can't pick it up if I do something in blender. But for standalone work where I know it's just me, it's solved some enormous problems that would have taken 5 times as long to do solely in comp.
As for the render farm, valid point but maybe their individual machines aren't that good? I've also done jobs at 15k that took 7 hours to render overnight. If you're working late and need to send something quicker than 7 hours then it could be a 21 minute render with other machines. More power is never a bad thing
@@I3ra you really don't know anything you are talking about dude.
Maybe a naïve question, but why do you want to erase the sun if you have it in your HDRI ? (Especially if it's taken in the same spot)
Because the sun is usually over exposed still even in the lowest exposure. You need it to be a tiny pin prick in the HDRI for the lighting to look real. If it's too big and blown out you get really soft shadows. So it's better to paint it out and then use a sun lamp as that lets you control it more effectively
@@AlfieVaughan thank you so much 🙏🙌
Which renderer is faster?
They were very similar. Both about a minute. But the shots were very basic so it's not the best test for render times
I was hoping to get a one-time pay license for Nuke for my Synagogue.
Once I get my Rabbi ordination, I'll be developing a video production studio where we will be creating documentaries from our Yeshiva.
Subscriptions are the very reason why I will no longer use ADOBE software.
Thank you for this video. I was wondering, is there a website that does step by step tutorials for Nuke?
Thanks.
I think it's subscription only but if you're not making money from what you produce then you can use nuke non commercial which is free. Alternatively, fusion is davinci resolve is very good and free or you can buy the studio version which is about £240 as a one off payment
How is that stmap generated?
I used the lens grid we have at work to profile all my lenses. One of the perks of being at a studio! Without one there's not really any way to do it properly. You need to print off the chart and stick it on a big piece of wood backing so it's straight and then film all your lenses looking at it to extract the distortion data
But Fusion from version 18 should have deep compositing! Overall, great review
Are you thinking of deep pixel compositing? They're not the same thing. I thought that too when I was researching for this video
I was curious about the render time between nuke and Fusion, in this video Fusion is about a minute. What about nuke?
They were both about the same. But it doesn't get much more simple than these shots so I'm not surprised. I haven't done any massive comps on fusion but it would be interesting to see!
at least we all agree that adobe can burn in hell
😅
Great point about how the further you get in your career, your projects will (hopefully) get higher end and bigger budgets..in which case the cost of your software starts to matter less and less to the individual...since those costs get absorbed by the projects.
I get that this would 100% matter to freelancers, lone wolves, and start ups...but for medium to large film or commercial projects...i haven't really cared about how much something costs in a hot minute (and focusing on 100% the best software package to get the job done right...and quickly).
Yeah exactly. It's all a natural part of the process and usually the expense can be passed on to the client
gotta learn nuke but as a kid going to school its impossible to afford
There's Nuke non-commercial. It's literally free like I said in this video 🤷♂️
@@AlfieVaughan i think i just got the student license. Thanks for your video tho I used to watch it a lot finally getting into it
Watch 13:08 first. This should have been at the beginning of the video, not 13 minutes in.
I don't get paid to promote Nuke. I just reached out to them after paying for Nuke myself for a couple of years and asked if they would be interested in providing a license. Which they did in exchange for me making some of their official tutorials. I don't think it changes my perspective at all. And like I said, if I wasn't getting it for free I would totally pay for Indie
I've given up on fusion after losing my track data every time i open the program, cost me money and time.
This is not normal. When did you last use it, and what version?
We're you on the 19 beta?
Nothing like nuke.
They're both node-based compositors...
you were?
Well... I was a bit harsh on it the first time 😅
@@AlfieVaughan just finished watching. Nice.
constructive feedback
That's more like it! 👌
Much fairer video, re deep I have a theory that its actually behind a patent & weta gave it exclusively to nuke thats why bmd cant do anything 😅
Thanks! Could be but I doubt it as several 3D softwares are able to render it so it's not exclusive in that sense. I think it's more likely no one else has bothered putting it in a compositor as it's not an important feature to 99% of the user base
@@AlfieVaughan fair, here at least BMD is trying to be competitive on the USD front :)
Everything is good in this video, except for the grade (sorry) :)
Thanks! I didn't really put any thought into the grade 🤣 just upped the contrast and saturation and added a vignette. It was just meant to be a cheesy look rather than anything serious
@@AlfieVaughan Yea I know!
I am actually surprised you made yourself try fusion. I use it as my comp app, but only because I don't comp much.
For me Nuke just clicks, channels, masks, everything seams well thought-out. Fusion often feels like 3 softwares glued together. Even node connecting is pain.
If I's comp a lot, I'd pay for nuke.
Try composting in Blender next time
I've tried it. Blenders compositor is terrible 🤣 I absolutely love Blender for it's 3D stuff but it's compositor is so behind anything else. It's basically useless for anything serious
Sshhh, Blender fan boys don't like their software to be criticized! They think it's the best thing, since sliced bread. Best editor, best compositor, best 3D modelling software etc.
I've seen several Blender fans claiming that Blender beats everything on the market, when it comes to compositing.
I'm a blender fan boy 👀 just not of the editor 😏
@@AlfieVaughan Probably not the kind of fan boy I'm talking about, 😉
And yeah, I know you're using Blender.
Eyeon FUSION was the King of compositing up (and if you ask me it still is, Nuke is just hype) until The Foundry poured a lot of money onto marketing Nuke and offered incentives to studios to use Nuke. Eyeon was not very well organized and their marketing was non existent, back then since all major fx studios were using Fusion and its lic cost the same price a Nuke lic costs now. Thank God for Blackmagic for buying Fusion making it cheaper to own and developing DaVinci resolve to seamlessly work with it. Also Fusion has a prettier and more user friendly interface. Nuke is NOT better than Fusion.
Quite a bold statement to say one is definitely better than the other. They're both good. Fusion is incredible value but for the work I do on high end commercials, TV and film, nuke is much more suited to the workflows we use than fusion.
@@AlfieVaughan Well, my statement basically came from a friend and coworker of over 20 years that is now a big vfx supervisor and grew up with Fusion. I started on Fusion but I found a high paying job in another field so I stopped doing fx work, I only do it as a hobby on my own projects. Another thing I didn't mention is that there is no Union in US to represent and protect the fx artists who are underpaid and overworked and don't get paid much unless they have a supervising position.
Always enjoy your videos Alfie but not sure how you can be 100% objective if Nuke sponsor you UA-cam channel? I think both products are very close, Nuke though is a bit more polished and better supported but I also think a bit slower under the hood. The artist will always be the defining difference at this level IMHO, I still watch Nuke tuts to improve my work in Fusion.
Thanks! It's a fair question. The best answer I can give is that they don't give me any money or ask me to say anything good about Nuke. They just give me a license each year. And I approached them to ask for one. I also paid for Nuke Indie for 2 years out of my own money when I was doing some freelancing outside of my day job at The Mill. So I'd say I've put my money where my mouth is by actually using it as a customer before I was affiliated with them in any way. But you're right, now it's a bit of a different story!
You might as well be speaking Russian!
It's all very geeky 🤓
or English for those who speak Russian.
Fusion is free and wayy more appealing visually opposed to nuke, therefore more people are gonna use it.
So is Nuke unless you're working commercially! I do agree but like I said, this is from the perspective of a professional, not the average user. And also picking software based on how the UI looks is not a good idea 🤣
What makes someone a "Professional Nuke Artist " lul
Working at some of the best VFX studios in the world for 7 years? 🤷♂️ But in all seriousness, by definition a professional is someone that gets paid to do a job. I get paid to spend 50+ hours a week working as a compositor. How's that?
Fusion fucking sucks
😅
how so?
where can u buy lens grid STMAP ??
You have to film one for your specific lens on a lens grid. We have one in our studio so I just used the one at work. But they're pretty massive so to make your own one is a bit complicated. Worth looking into if you want to get serious about lens distortion workflows