Thank you for demonstrating (and including the links!) to these papers. Glad we already have access to one of these techniques and excited to see more commercially-available tools based on the other pieces of research you've highlighted!
@@smanzoli yes it's great but not nearly as accurate as real ray tracing. There are a couple vids showing how trees for example and some other objects appears mich brighter in lumen than they would ray traced. That's because lumen uses spacial light samples, not Pixel light sample like raytracing. So instead of shooting rays out of the camera, it samples 3d pixels (voxels) in a cube grid on the map. That's basically what they always used for rendering game lighting, with the difference that lumen works in real time and doesn't need to be baked.
Yeah, that realtime denoising is awesome! In terms of quality, it's still not quite as good as something like Intel's OpenImageDenoise (which runs on the CPU). The difference is their method is realtime, and OpenImageDenoise isn't, without a ridiculous CPU. Edit: nevermind, a developer working on OpenImageDenoise got it running in realtime on both AMD and Nvidia GPUs, meaning it isn't locked to just Nvidia. OpenImageDenoise is definitely the better solution, or I suppose will be once they finish up the build that functions on the GPU.
Thank you for presenting all of this astounding research. More than the technology itself, it is your life-affirming delivery that inspires optimism and enthusiasm for a future that otherwise contains so many bright unknowns as to be terrifying. You are the Bob Ross of deep learning.
Moore's Law has been dying this millenia so hardware improvements have become incremental and not exponential like in the 'good old days' when we expected a doubling of performance every generation, now the younger generation considers a 25% increase to be significant; so it's really only Algorithms/AI getting exponentially smarter that's giving these massive improvements, but like Moore's Law that's also bound to reach a limit, hopefully later than sooner. What's really needed is new computing technology that once again follows Moore's Law, but since capitalism is still making huge profits from old silicon tech I ain't holding my breath.
our best bet right now is improvements in algorithms particularly AI or machine learning. machine learning is more on guessing rather than actual computation. but the thing with machine learning is learns so it also scales up as time goes on just like the transistors in a chip. what's rough guess right now could be an accurate outcome in the future.
I did my master on this. Computer Graphics. And I must confess this is insane. Like no way this is possible. I need to read all four of the papers. Wow.
Still squeezing in 2022! I remember me and my sister trying to render Bryce pictures on a pentuim 90. It took HOURS to make a sphere over water, lol. Miraculous stuff here.
Back in about 1987 I was using the forerunner of cinema 4D on an Amega 500 and I remember setting up a scene and going on holiday for three weeks and when I come back it had only done half the scene!
One of the things that's changed is that we have more cores now. Even at the time, if you were fortunate enough to have a dual-processor rig, that would have been much closer to 45 hours as ray tracing is trivial to run in parallel processes. And with modern heardware, just being able to split the load to 8 or more processors alone would cut the time down to just over 10 hours. With the more efficient processors and increased RAM it would be done much faster.
You can get a better sense of what paper 4 is doing when you look at the edges of the image where the train is just coming into frame. You can see the detail jump levels as that spot on the train spends more time in the frame. A possible workaround to this particular detail is to render to a virtual screen slightly larger than the actual screen and cropping out the edges when displaying on screen.
If you're talking about 13:56, you might be mistaking the raw noisy section for one with less detail. The first few columns on the left side show the noisy input to the denoising algorithm, which takes up the rest of the screen.
this doesn't work because camera can rotate, it doesn't just go left or right. all these effect are not new, there are working example of this shader that u can apply on existing game with api hook because gamers always find way to make their game look better. all these implementation suffer from a problem when you rotate. the best "solution" i seen from is to hide this from user by implementing a camera edge motion blur when the camera rotates, u cannot predict how a player control camera will turn.
This is really an amazing time to be alive. In my particular case, I was a child, when the video game "Pong" was released. Looking at the developement of computers and AI from then to now is unbelieveble. Btw., I was born in 1967, which means 2 years before astronauts landed on the moon with a computer on board that had 80 kilobytes of memory.
Dude :D As much as I love your enthusiasm about it, your seemingly random emphasis and speaking with an invisible comma after every second word hurt my ears terribly ^^ Still thank you for the video ;)
This lineup of papers is not just perfect for high end graphics cards , too. Considering that theyre optimised where possible and still going for more, smaller graphics cards and possibly devices that arent normally MEANT to do such things could possibly opt in for a similar method for demos! This is a big change. Better light transports for the same amount of time and effort is wonderful to keep track of. Thank you alot!
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me: + open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts; + step-floor / -storey / -level / -tier pyramid (e.g., as a residential building); + 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models); + 3D 360° environment (surround), volume, space (room) audio / sound; + 360° environment (surround), volume, room screens (displays), monitors, TVs; + screens (displays), monitors, TVs without backlighting; + many things with magnets like Micro-OLED; + 360° screens, monitors, TVs, panels, glass; + curved screens, monitors, TVs, panels, glass; + 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming! + Short Throw Projectors; + dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.; + fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area! - I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing; + 360° open, transparent PC cases; + 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”; + from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds; + flying objects, planes that take off and land vertically, for example by hot air, example fighter jet; + skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.; + bendable, flexible, foldable screens / displays / monitors; + bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms; + LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc. + and much more! + I'm not the inventor of VR, but of "AR", AR glass, AR glasses; It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits! Before me they all were stucked at LCD, Plasma TV and less lights! I have made a deep impact in evolution! Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.! You have to understand, those can distort everything that can be heard and seen in real life and digitally! They block and delete my pictures, videos, posts, comments, comments-answers and answers! They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe! Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago! Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
@@damysticalone87 even if one of these things is true, an idea is not an invention. An idea is just an idea. An invention is the implementation of an idea.
@@Fingerblasterstudios You criminal or and supporter of criminals! Childmurderer or and supporter of chiuldmurderer! Laws! "Protection of Intellectual Property"! "Patent Rights"! Theft!
If / When you have money, health or "natural" disaster problems or if / when you are somehow affected by traffic accidents, traffic jams at some point, somewhere, then think of me, share my posts and my pages! Be sure to read my comments under my first videos completely via laptop or PC! Be sure to read my contributions, posts, comments under my first videos thoroughly, maybe no longer visible via phone app because they intentionally set it that way! Use laptop or PC!
The modern tech is fantastic. But it's also quite complex. One of the real amazing things about classical raytracing was that you could famously learn to get started "In One Weekend." Even as somebody who has been at least somewhat engaged in CG for many years (you've all seen stuff I worked on when I worked in VFX) the new stuff is less exciting because I have pretty much zero illusions that I am ever going to write an implementation of a full modern renderer by myself. Even just trying to use nVidia's Restir implementation library in my own engine rather than implementing it myself is more complex than writing a whole renderer used to be!
Totally agree, we've reached a point where amateurs or even low-budget professionals simply can't compete on their own, no matter how skilled they are.
@@alefratat4018 I think low budget studios and amateurs can always compete. You dont have to use RTX if you dont want to. And the main reason I love indie gams is because they are much simpler and focused on the fun part, with great idea for game mechanics or story. AAA games usually go full bonkers on graphics, but the story is usually lacking, and they are sterile most of the time from some new interesting game mechanics.
@@RamsesTheFourth Doesn't he mean writing it into a custom engine? I mean just putting ray tracing in a game is easy for anyone at a basic level, it's included in UE5. I can enable it easily there - but I'd be clueless on adding it to a custom engine.
I remember last year, as I was browsing a few (not yet reviewed/incomplete/old) papers on arxiv, and stumbled across a thesis that was talking about cone tracing... I downloaded it and was like "hey one day I'll read it". I never did. This is probably not the same paper but the second you pronounced "voxel cone tracing" I was like "of COURSE". Thanks for the vid, love your work. To think they actually thought about that ten years ago... EDIT: turns out I still have the PDF of the thesis, it's "Audio and Visual Rendering with Perceptual Foundations", which doesn't match any of NVIDIA's papers, yet is dated 2009, so...
I wish you would tell us what kind of graphics card these researchers are using so that we can understand whether they are using the newest GPUs or CPUs as well as the what version of the software. I feel like that would be more informative. Your information is vastly and greatly informative but having those little bits of information will really tell us if they are using current software, GPUs and CPUs. Thank you so much and keep the excellent information coming! :)
They can't have a GPU much more powerful than 2x - 4x of any current high end one. So if they say 80 fps you can expect for sure to get above 20 fps with a good GPU. Unless it's something that requires a lot of VRAM
@@descai10 I mean the video mentions "real time" many times, and 20 fps should be an ok number to consider "real time". If your concern is knowing what hardware was used just to run it yourself, then considering the same case you can expect a nice 5 fps. Which is great compared to a single frame taking minutes to render
@@descai10 on that scene, with that gpu, yes. Only that the video shows multiple technologies at multiple points in time, and there's no mention of the hardware used like the OP is asking
14:00 seems like motion vectors artifacts. It is a genius move by NVIDIA scientists to use these! 14:35 Democratizing but usually only as long as you have specific hardware ^^
Specific, ready-to-use, affordable (relatively...) hardware that I can also use to game :D I think that more importantly, the papers being released mean that other chipmakers can also implement the hardware acceleration.
The first law of access is that democratization is a process. Do not look at where we are, look at where we will be, 2 more graphics cards down the line.
This video is now more relevant than when it was published. I could not wait for ReSTIR to be even considered or implemented in games and now is a reality with CP2077. It is a big big deal.
Another way to make light transport faster is distance based light resolution downsampling. Far away caustics and reflections are not gonna be visible due to the amount of pixels in the image anyways so there is no need for them to be super refined, so as long as the stuff closer to the camera is good enough resolution everything further away can be progressively downsampled to reduce memory usage and possibly increase the accuracy where it matters more..
@@clonkex yeah, that might be quite a good trick. Maybe in combination with some ai based stuff you can downsample parts of the view that doesn't change considerably between frames too.
Oof. Sympathy and empathy to the people who saw that research from 11 years ago and mistakenly thought that widespread adoption and advancement of it would be just around the corner. At least it's in the right hands now. Thank you for bringing it to our attention, Károly!
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me: + open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts; + step-floor / -storey / -level / -tier pyramid (e.g., as a residential building); + 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models); + 3D 360° environment (surround), volume, space (room) audio / sound; + 360° environment (surround), volume, room screens (displays), monitors, TVs; + screens (displays), monitors, TVs without backlighting; + many things with magnets like Micro-OLED; + 360° screens, monitors, TVs, panels, glass; + curved screens, monitors, TVs, panels, glass; + 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming! + Short Throw Projectors; + dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.; + fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area! + 360° open cooling, e.g., with air-flow by air-holes, no cases, no walls, no cover, etc.; - I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing; + 360° open, transparent PC cases; + 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”; + from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds; + flying objects, planes that take off and land vertically, for example by hot air, example fighter jet; + skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.; + bendable, flexible, foldable screens / displays / monitors; + bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms; + LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc. + and much more! + I'm not the inventor of VR, but of "AR", AR glass, AR glasses; It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits! Before me they all were stucked at LCD, Plasma TV and less lights! I have made a deep impact in evolution! Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.! You have to understand, those can distort everything that can be heard and seen in real life and digitally! They block and delete my pictures, videos, posts, comments, comments-answers and answers! They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe! Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago! Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
If / When you have money, health or "natural" disaster problems or if / when you are somehow affected by traffic accidents, traffic jams at some point, somewhere, then think of me, share my posts and my pages! Be sure to read my comments under my first videos completely via laptop or PC! Be sure to read my contributions, posts, comments under my first videos thoroughly, maybe no longer visible via phone app because they intentionally set it that way! Use laptop or PC!
If / When you have money, health or "natural" disaster problems or if / when you are somehow affected by traffic accidents, traffic jams at some point, somewhere, then think of me, share my posts and my pages! Be sure to read my comments under my first videos completely via laptop or PC! Be sure to read my contributions, posts, comments under my first videos thoroughly, maybe no longer visible via phone app because they intentionally set it that way! Use laptop or PC!
With ray tracing , fluid dynamics , ai etc ,games are set to take a huge leap forward but the best thing is it looks like it's all in favour of the indie developer, this will make games riveting because it'll be like reading a book by a great author, one mind is the way.
It kind of reminds me of when George Lucas made the prequel trilogies. It relied so much of CGI and effects, that nobody involved in the production thought to worry about the acting, writing or plot. The result are some of the worst movies in the franchise. (Although I will say seeing Yoda in that light saber fight was almost enough to make watching it worthwhile, almost)
I like your videos but I am sometimes disappointed with how the commentary is heavily weighted towards your emotional responses to the technology rather than the specifics of the technical challenges of the problem, and more particularly how the solution works. If you could give us more education and less how you feel about it, I at least would appreciate that. Thank you for your work
he only introduce the paper, if you want an explanation, you should dig your own information on the web. there are a lot of things that won't see the light of the day if it's not introduced. if he explain how the technology work, then nobody gonna watch his channel
@@jensenraylight8011 I would totally watch his channel for that reason and there are a bunch of channels like that that are quite popular in other areas that go into deep Dives into technology. Unfortunately there's very little popularization of the Science and Technology of artificial intelligence. There's a large amount of stuff in physics and engineering and a decent amount of stuff in chemistry and biology. They take the latest news and give a not only analysis of what it can do and what it might be useful for, but usually spend the bulk of the time explaining how the new technology works, sometimes incorrectly but you know it's UA-cam. I like this channel because it's clear that this wonderful gentleman is smart and does a lot of important reading. I'm just saying that for me personally as much as I enjoy hearing him happy, I wait a lot to hear more explanation of the problem and the solution.
@@michalchik It takes quite a bit of natural intelligence to comprehend the artificial one, and judging by the majority of commenters here the channel would lose most of its audience if it offered deep insights into the subject instead of the regular "ooh, shiny!" eye candy.
@@getsideways7257 lol, perhaps you're right but I think it's worth a try and I think the author of the channel might actually appreciate the response. There are big markets for short and long high quality explanations. On the short side take a look at all the wonderful things that minute physics does for example.
I've been fascinated by raytracing since "Imagine" on the Amiga. It's unbelievable how far raytracing has developed today. Back then, it took me four weeks to do one picture and it was a dream that one day it would happen in real time, although it's not perfect yet. Thank you for your videos.
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me: + open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts; + step-floor / -storey / -level / -tier pyramid (e.g., as a residential building); + 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models); + 3D 360° environment (surround), volume, space (room) audio / sound; + 360° environment (surround), volume, room screens (displays), monitors, TVs; + screens (displays), monitors, TVs without backlighting; + many things with magnets like Micro-OLED; + 360° screens, monitors, TVs, panels, glass; + curved screens, monitors, TVs, panels, glass; + 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming! + Short Throw Projectors; + dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.; + fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area! - I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing; + 360° open, transparent PC cases; + 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”; + from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds; + flying objects, planes that take off and land vertically, for example by hot air, example fighter jet; + skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.; + bendable, flexible, foldable screens / displays / monitors; + bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms; + LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc. + and much more! + I'm not the inventor of VR, but of "AR", AR glass, AR glasses; It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits! Before me they all were stucked at LCD, Plasma TV and less lights! I have made a deep impact in evolution! Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.! You have to understand, those can distort everything that can be heard and seen in real life and digitally! They block and delete my pictures, videos, posts, comments, comments-answers and answers! They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe! Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago! Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
I'm an amateur 3d artist and rendering have always been pulling me back. I always have to make a lot of compromises to get a good enough look in "short" time. Or find fake solutions that will mimic what I want to achieve. Especially if I want to make an animation - I can't wait weeks to render a short video, as it means my PC will be out of use for this whole time. So this technology is salvation, if it will be implemented, it will allow me and many other artists let loose of imagination to create what we want, without fear of weeks of rendering ahead of us. So I agree. What a time to be alive.
A big difference in the real time ray tracing vs the long renderings are the amount of rays that you can calculate. The real time ray tracing is an approximation, but good enough for many applications.
@@nagualdesign I think some of these are about choosing which rays to emit in the first place to improve problem areas rather than just the usual random distribution. But yes, general denoising algorithms used in post-processing for ray traced images could be used on photos taken with the wrong iso settings or in conditions below the sensor's rated light levels. I suspect some of the non-AI ones used for ray tracing were adapted from ones originally developed for photo and video denoising.
@@nagualdesign ISO involves two different types of noise at opposite ends with a sweet spot that minimizes for a given light level if you fix the aperture and exposure time. I'm not certain but I think the most analogous type here would be selecting too low of an ISO for the light level rather than the sensor noise when the ISO is too high. Either way, the "right" ISO should minimize the noise.
I was wondering about whether they might want to limit the noise reduction in extremely low light areas of an image, since that may help add another subtle touch of realism. Of course, it all depends on where and how it's being applied I suppose.
4:40 I would really like to get a full view of this room as the design is bloody awesome, does anybody know where to download the program/watch a full video of it?
I grow more and more in love with how incredible solutions to incredible problems are often collecting dust in a drawer somewhere waiting for proper implementation.
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me: + open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts; + step-floor / -storey / -level / -tier pyramid (e.g., as a residential building); + 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models); + 3D 360° environment (surround), volume, space (room) audio / sound; + 360° environment (surround), volume, room screens (displays), monitors, TVs; + screens (displays), monitors, TVs without backlighting; + many things with magnets like Micro-OLED; + 360° screens, monitors, TVs, panels, glass; + curved screens, monitors, TVs, panels, glass; + 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming! + Short Throw Projectors; + dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.; + fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area! - I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing; + 360° open, transparent PC cases; + 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”; + from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds; + flying objects, planes that take off and land vertically, for example by hot air, example fighter jet; + skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.; + bendable, flexible, foldable screens / displays / monitors; + bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms; + LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc. + and much more! + I'm not the inventor of VR, but of "AR", AR glass, AR glasses; It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits! Before me they all were stucked at LCD, Plasma TV and less lights! I have made a deep impact in evolution! Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.! You have to understand, those can distort everything that can be heard and seen in real life and digitally! They block and delete my pictures, videos, posts, comments, comments-answers and answers! They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe! Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago! Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
Man the video was interesting, but the way he talked was hard to bear. Having a pause after every second word and then dragging every thrid word for half a second is an interesting rhythm perhaps, but hard to follow without going insane. I do get language barrier, but with a Script and Preparation, this should be possible to overcome. I'm not a native speaker either.
Yeah reading the comments it seems many non-native English speakers have trouble understanding/listening to his voice. I personally have watched many of his videos and didn't even realize anyone had this problem. For me its extremely clear and understandable, but English is my first language so that may be why. I can maybe see his tone/pauses being confusing or jarring for someone non-native....
@@GeorgeAlexanderTrebek I didn't say confusing, or that I didn't understand it, his Pronounciation was very clear after all; it was simply Torture for the Mind. The Language itself was not the Problem, it's his speech Patterns whose Weirdnesses I attribute to his probable Migration Background / English Unfamiliarity. I did say "hard to follow" but only if you look at the Words in a Vacuum, I said "hard to follow without going insane"^^ So yeah, I learned a lot from the Video, just at the Cost of some Mental Stability that Evening lol If someone was talking like that in my own Language I'd be equally annoyed.
@@dieblauebedrohung hmm interesting, that's fair. I never really said That YOU said confusing or didnt understand, more that many non native English speakers have these issues just by reading the comments. I also said that it might be hard to listen to or be jarring for non native speakers, which is what you mentioned after about him having irregular speak patterns. I watch his videos at 1.1 or 1.2 speed so less noticeable for me anways....
I think it's one of your most underrated videos! People can truly bring such a level of vividness and clarity to the subject when they're actual qualified experts on the topic. I love how this is basically an ode to a cool algorithm, focusing on as much detail as UA-cam's format would allow instead of glossing over it completely like in OpenAI's recent marketing brochures of papers. Wish you the best of luck!
Am I having a stroke, or have I now watched 4 videos from you that are basically the same? I love you videos and have watched for years, but surely they should be additive and we shouldn’t have to pretend that it it currently takes a week to render caustics each time. It feels like Nvidia sponsored content by this point.
It's interesting being able to watch this journey of ray tracing both from the coding and technical side of it, here, and watching it actually being implemented, such as in gaming communities. Admittedly, part of me is also interested in the optimization of code more than anything else. I'd love to see some projects focusing on doing the same things we can already do, but for less processing power and less storage space.
I feel like in terms of getting it into real time one trick will be to actually put the speed of light into your simulation - that is a real life ray only moves so far in the few ns of a processor tic, so only calculate that much of the path before starting the next cycle. yes you would need very fast processor cycles, and more ram to track it all, but that is something were pushing to build more and more anyway
I think you are correct. Current technique is by tracing the ray from the viewpoint. This is incorrect and is really just cheating, but they say the opposite is "impossible". I disagree, it is possible but just takes longer to compute. What really needs to happen is we need a space-time simulation that emanates the 'light' or photons throughout the cells of the scene, creating a wave format of light. The waves will naturally provide the specular and reflective, refractive properties that light in our real life has. But that's only the first step, we would then need to simulate a lens that can accurately capture these waves that are emanating and bouncing throughout the cells, a virtual model that implements the properties of a camera lens could be a good example. The limitations of this approach is immense, but it is the correct approach, and if we wish to truly have life like real time rendering, the hardware needs to address these limitations. Memory, computational power. Maybe someday when we have consumer level quantum mechanical machines, but for now we can start on the software side of things by creating slow prototypes.
You dont even need to do the whole scene. For animations you know where the camera will be in the next several frames, you can do a quick "What will it be able to see" outgoing ray trace with say 1000 rays (instead of the normal 8.3 million per frame) and back track using the speed limited version in the high detail levels 4k rendering needs. In a video Game you could cut it down to what positions could the camera reach within the back track time of the longest ray. The player can only move so fast, and turn so much per frame. Theres a finite amount of scene they will be able to look at. This is still a ways away from being game engine ready, but the unpredictability of a physics driven and user interactive scene combined with the very tight deadline of having to wait for physics engine and still finish within 1 / 120th of a second 120 times a second is a much more challenging.
There seems to be a loss of texture resolution that you didnt bring up. Other than that its a great video. Look at the beam in 11:46 to see what i mean
Only people that work with CGI knows how crazy and incredible to have RT in real time in games is. Gamers don't know 1/2 of this even means. I don't care much for RT in games, in my view most of the implementations are very poor, but still incredible.
We are right to think that voxel is the future. Good days when light calculations will be better with both physics and pixel graphics in a world filled with atoms instead of mesh. Pixels are powerful because images are also made up of pixels. I think a format such as png, jpeg, etc., where voxels and pixel matrices are compressed, can also be used for voxels.
@@Razumen I didn't say it was. I'm talking about how voxel will be a strong investment for the future. The reason for this is the need for a voxel-based structure in many places. fluid physics, light-per-pixel calculation, etc. that is, dressing dimensional entities by forcing a dimensionless entity such as mesh. instead of weird stuff like uv in mesh technology for example. color, physical information and much more information can be given pixel by pixel. it will be perfect. and yes you are right my comment is a bit off topic in the video.
One computer graphics advancement that I hope will be enabled by ray tracing is vector geometry. Since we aren't doing computations per polygon anymore but per light ray, we can potentially have "infinite" number of polygons, i.e. smooth vector 3d models.
is this even progress anymore? you're just pretty much clickbaiting at this point. if you took a thumbnail from a video released a year ago and compared it to this, there would be the unclear (real) version and the cgi edition. i know that you're looking for content nearly every day but understand it takes a while for stuff to happen. rome wasn't built in a day. i'm just posting this as i'm concerned where the content is headed and i think you should slow down. i mean, the last couple videos were literally just recycled content.
This video could be reduced 80% if we exclude all the awe and amazement expressions from Károly, BUT these papers improvement clearly deserve each one of them!
this actually a very great research. The problem of water light refraction is not the noise, but the fireflies (like noise but it is the artifacts of bounching light). Until now there is no solution than tricking the scene. But now this, amazing...
For the missing paths, the fireflies, can these spots be estimated by having the geometric position of that spot averaged with the values near it? Similar to anti-alising algorithms but with light rays?
Incredible video. The amount of times i have rendered little home made projects and watched videos about rendering i never knew why noise would show up in rendering. But In the first minute of this video I had learned why.
Why do you pause so much in between sentences? It gets quite annoying and distracting after a period of time because it feels like the video was recorded by a 9th grade student for his summer project. You have done excellent research for the material content required for this video, but the delivery needs a little more work.
13:40 why is the caustic pattern on the ground more developed on the right but the light reflecting off the back of the rabbit looks more noisey than the older version on the eft?
So much of this video is exclamations of "omg wow look at it! Amazing, wow! It takes Days! Or does it? It's impossible! Or is it? Days or milliseconds? Hold on to your papers!" It's almost like this is a spoof on a Two Minute Papers video ... This whole thing could have been 1/2 the length and have 3x the technical content ...
*Why not "solve backwards" instead?* Set the default "un-traced" space to either some impossible color (eg: 100% transparent), or (bright pink, or the opposite of the nearest "traced" pixels) something. Then just scan for one of these "impossible areas", and shoot rays from the camera towards that area & see where they end up. (I bet this has already been tried, so-) Why wouldn't this work?
That'd get you to the first bounce of light, but can't handle the interaction of all sources of lights interfering with every pixel on screen. Basically what pixel shaders have done for the past 20 years. The benefit of this is getting actual diffuse lighting.
i just wish this feature will make its way to blender, i am more then OK with the shortcommings if it means my renders will be any faster, and it could improve viewport clarity as currently you get some really funky artifacts in poorly lit scenes.
I love seeing the improvements being done, great work. As an artist myself, these changes will help in so many ways. Question: While Ray-Tracing is impressive, isn't Path-Tracing better, doesn't it give more realistic results? If so then is the reason it's not used to much in real-time due to the time it takes to calculate the light rays??
Ray tracing in the research sense refers to path tracing where the light rays originate from the light source and bounce around the scene until they hit the camera. Ray tracing in the graphics sense refers to light rays originating from the camera that bounce around the scene and end at the light source, which is computationally faster but less accurate than path tracing. In fact rasterization is technically also ray tracing, but with a single ray originating from the camera that ends when it hits an object and doesn't bother finding a light source.
We don't want Nvidia to include some 3d rendering of a bunch of marbles. We want the driver source code behind the GPU we have spent hounderds of dollars for. We want full support with linux without wasting hours looking for solution for easily evadable problemes.
I found out at 13:25 that the new method has some more troubles calculating specular rays but is better with reflection and refraction rays but still it has much less Noise than the previous method but if we also overlay a view of the previous frames it would have even less Noise and also some Motion blur so it looks more natural also NVIDIA could use a seed function to remove even more flickering instead of using a random number each time each frame
I really like the software advancements NVIDIA is doing, but sometimes I can't help but wonder about their hardware, Especially with the new RTX 40 line. I hope they can address the fact you need a 700W+ PSU just to run a graphics card..
No, you don't; it's only on a halo product, and the consumption increase is the new normal if you want to reach that level of processing capabilities. If it could be helped, it would, but limitations in Moore's law and costs aren't helping.
You are just too young. I remember when we laughed a the Pentium CPU, because it required cooling. It was cooled by a tiny fan, little less than 2 inches. Before that CPUs didn't have cooling at all, not even passive. And that also meant that power consumption was only a few watts. Power and cooling requirements have been increasing ever since. In exchange you get even higher performance. It's possible to turn down the voltage and frequency a bit to use less power, and need less cooling, but you get less performance, and it doesn't reduce the cost of the card significantly, because you are mostly paying for the silicon. Also 1000W+ PSUs aren't that expensive anymore. At least not compared to a 4090.
By the way, Martin of Wintergatan referenced your catchphrase in his latest Marble Machine X video where he's testing repeatability of a marble gate and making improvements.
I'd be curious to learn how nVidia's ray-tracing papers and their applications differ from or are related to those which led to AMD and Intel's ray tracing support. nVidia uses their own proprietary tensor cores to handle the vast majority of the ray-tracing calculations, so I'm curious if AMD and Intel have simply adapted their process and algorithms to more abstracted instruction sets for their conventional graphics processor unit modules or if they have had to create their own method for calculating the ray-traced light dynamics for their scenes. It's a really cool subject and I'd love to see a deep-dive video about it sometime with analysis on how these different platforms have managed to get ray-tracing to work for them and how the different iterations of the process have changed how the image is formed.
Love the enthusiasm in this video 😃, however it’d be nice if you went into more details about how do these new gamechangers work and manage to improve performances
The solution is impressive. The sample scenes are terrible. Unless you understand the value of what you're looking at most of them look like an old videogame.
They lost quality on the bunny ears and significant energy in specular reflections. Reflections are smudgy and jumpy over time from undersampling in temporal domain. Eventually I'd like to see image based BRDFs as well so the materials can get some more variety.
Thank you for demonstrating (and including the links!) to these papers. Glad we already have access to one of these techniques and excited to see more commercially-available tools based on the other pieces of research you've highlighted!
Thank you so much! So delighted to see that you Fellow Scholars are also enjoying light transport - it's the best! 😊
UE5 uses Lumen, and it’s not only awesome and blazing fast for Global Illymination, but also you can add Voxel 2.0 plugin
Great research
@@smanzoli yes it's great but not nearly as accurate as real ray tracing. There are a couple vids showing how trees for example and some other objects appears mich brighter in lumen than they would ray traced. That's because lumen uses spacial light samples, not Pixel light sample like raytracing. So instead of shooting rays out of the camera, it samples 3d pixels (voxels) in a cube grid on the map. That's basically what they always used for rendering game lighting, with the difference that lumen works in real time and doesn't need to be baked.
Yeah, that realtime denoising is awesome! In terms of quality, it's still not quite as good as something like Intel's OpenImageDenoise (which runs on the CPU). The difference is their method is realtime, and OpenImageDenoise isn't, without a ridiculous CPU.
Edit: nevermind, a developer working on OpenImageDenoise got it running in realtime on both AMD and Nvidia GPUs, meaning it isn't locked to just Nvidia. OpenImageDenoise is definitely the better solution, or I suppose will be once they finish up the build that functions on the GPU.
As the render times get shorter, the 2 minute papers get longer 😛
Lol
They're now two minutes to the power of four lol
The real fundamental law of papers
I enjoyed the longer format as I don’t watch every video.
Heisenberg would be proud
I feel there's no way I can hold my papers down with these ferocious winds of change
We're gonna need a bigger paperweight.
@@alansmithee419 only an RTX 4090 could possibly work
@@UON yeah wont even have any papers left to hold down when that's bought
same
I want a paper on trying take get a realistic render of a Paper being squeeze that’s 100% realistic
It is such an honor to get an invitation to GTC to talk about light transport. Thank you so much! 🙏 So happy! 😊
Thank you for presenting all of this astounding research. More than the technology itself, it is your life-affirming delivery that inspires optimism and enthusiasm for a future that otherwise contains so many bright unknowns as to be terrifying. You are the Bob Ross of deep learning.
You’ve more than earned it! :)
what a time to be alive!
You really deserve this honor, sir! 👏
Do a video on the new Jurassic-1 language model
I’m always amazed how even as hardware gets faster, algorithms still keep getting smarter, so you get a double boost in performance.
Moore's Law has been dying this millenia so hardware improvements have become incremental and not exponential like in the 'good old days' when we expected a doubling of performance every generation, now the younger generation considers a 25% increase to be significant; so it's really only Algorithms/AI getting exponentially smarter that's giving these massive improvements, but like Moore's Law that's also bound to reach a limit, hopefully later than sooner. What's really needed is new computing technology that once again follows Moore's Law, but since capitalism is still making huge profits from old silicon tech I ain't holding my breath.
@@MadScientist512 we’re getting 50-70% performance uplift per generation in GPUs, I don’t know where you got the 25% from
@@MadScientist512 we are getting 2x improvement if you combine both hardware and software i.e. dlss
@@n9s3nse10 dlss is software. From strictly hardware point, we are nowadays getting 25% to 33% at most per generation
our best bet right now is improvements in algorithms particularly AI or machine learning.
machine learning is more on guessing rather than actual computation. but the thing with machine learning is learns so it also scales up as time goes on just like the transistors in a chip. what's rough guess right now could be an accurate outcome in the future.
I did my master on this. Computer Graphics. And I must confess this is insane. Like no way this is possible. I need to read all four of the papers. Wow.
Still squeezing in 2022! I remember me and my sister trying to render Bryce pictures on a pentuim 90. It took HOURS to make a sphere over water, lol. Miraculous stuff here.
Back in about 1987 I was using the forerunner of cinema 4D on an Amega 500 and I remember setting up a scene and going on holiday for three weeks and when I come back it had only done half the scene!
It did not crashed in 3 weeks?
What kind of sorcery is that
@@LongTran-em6hc 🤣🤣🤣
@@LongTran-em6hc I ran my Linux PC crypto mining for like a month and a half without rebooting it and it never crashed
One of the things that's changed is that we have more cores now. Even at the time, if you were fortunate enough to have a dual-processor rig, that would have been much closer to 45 hours as ray tracing is trivial to run in parallel processes. And with modern heardware, just being able to split the load to 8 or more processors alone would cut the time down to just over 10 hours. With the more efficient processors and increased RAM it would be done much faster.
You can get a better sense of what paper 4 is doing when you look at the edges of the image where the train is just coming into frame. You can see the detail jump levels as that spot on the train spends more time in the frame.
A possible workaround to this particular detail is to render to a virtual screen slightly larger than the actual screen and cropping out the edges when displaying on screen.
ah good idea. depends on if the wider area will take up enough resources to drop framerate, but I doubt it would drop very much
If you're talking about 13:56, you might be mistaking the raw noisy section for one with less detail. The first few columns on the left side show the noisy input to the denoising algorithm, which takes up the rest of the screen.
this doesn't work because camera can rotate, it doesn't just go left or right. all these effect are not new, there are working example of this shader that u can apply on existing game with api hook because gamers always find way to make their game look better. all these implementation suffer from a problem when you rotate. the best "solution" i seen from is to hide this from user by implementing a camera edge motion blur when the camera rotates, u cannot predict how a player control camera will turn.
This is really an amazing time to be alive.
In my particular case, I was a child, when the video game "Pong" was released.
Looking at the developement of computers and AI from then to now is unbelieveble.
Btw., I was born in 1967, which means 2 years before astronauts landed on the moon with a computer on board that had 80 kilobytes of memory.
You're old, wow
You're 65? What a time to be alive.
@@QuackZack I am 54 and the time is amazing regarding computing power.
If you live long enough you may be one of the first generation to reverse aging
Dude :D
As much as I love your enthusiasm about it, your seemingly random emphasis and speaking with an invisible comma after every second word hurt my ears terribly ^^
Still thank you for the video ;)
This lineup of papers is not just perfect for high end graphics cards , too. Considering that theyre optimised where possible and still going for more, smaller graphics cards and possibly devices that arent normally MEANT to do such things could possibly opt in for a similar method for demos! This is a big change. Better light transports for the same amount of time and effort is wonderful to keep track of. Thank you alot!
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me:
+ open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts;
+ step-floor / -storey / -level / -tier pyramid (e.g., as a residential building);
+ 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models);
+ 3D 360° environment (surround), volume, space (room) audio / sound;
+ 360° environment (surround), volume, room screens (displays), monitors, TVs;
+ screens (displays), monitors, TVs without backlighting;
+ many things with magnets like Micro-OLED;
+ 360° screens, monitors, TVs, panels, glass;
+ curved screens, monitors, TVs, panels, glass;
+ 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming!
+ Short Throw Projectors;
+ dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.;
+ fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area!
- I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing;
+ 360° open, transparent PC cases;
+ 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”;
+ from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds;
+ flying objects, planes that take off and land vertically, for example by hot air, example fighter jet;
+ skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.;
+ bendable, flexible, foldable screens / displays / monitors;
+ bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms;
+ LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc.
+ and much more!
+ I'm not the inventor of VR, but of "AR", AR glass, AR glasses;
It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits!
Before me they all were stucked at LCD, Plasma TV and less lights!
I have made a deep impact in evolution!
Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.!
You have to understand, those can distort everything that can be heard and seen in real life and digitally!
They block and delete my pictures, videos, posts, comments, comments-answers and answers!
They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe!
Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago!
Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
@@damysticalone87 even if one of these things is true, an idea is not an invention. An idea is just an idea. An invention is the implementation of an idea.
@@Fingerblasterstudios
You criminal or and supporter of criminals! Childmurderer or and supporter of chiuldmurderer! Laws! "Protection of Intellectual Property"! "Patent Rights"! Theft!
@@Fingerblasterstudios
Laws! "Protection of Intellectual Property"! "Patent Rights"! Theft!
If / When you have money, health or "natural" disaster problems or if / when you are somehow affected by traffic accidents, traffic jams at some point, somewhere, then think of me, share my posts and my pages!
Be sure to read my comments under my first videos completely via laptop or PC!
Be sure to read my contributions, posts, comments under my first videos thoroughly, maybe no longer visible via phone app because they intentionally set it that way! Use laptop or PC!
Why do you talk like that? And NVIDIA isn't the sole superhero. The individual (s) who did the work also deserve praise.
The modern tech is fantastic. But it's also quite complex. One of the real amazing things about classical raytracing was that you could famously learn to get started "In One Weekend." Even as somebody who has been at least somewhat engaged in CG for many years (you've all seen stuff I worked on when I worked in VFX) the new stuff is less exciting because I have pretty much zero illusions that I am ever going to write an implementation of a full modern renderer by myself. Even just trying to use nVidia's Restir implementation library in my own engine rather than implementing it myself is more complex than writing a whole renderer used to be!
Totally agree, we've reached a point where amateurs or even low-budget professionals simply can't compete on their own, no matter how skilled they are.
The good old days for a lot of things are over, right!
@@alefratat4018 I think low budget studios and amateurs can always compete. You dont have to use RTX if you dont want to. And the main reason I love indie gams is because they are much simpler and focused on the fun part, with great idea for game mechanics or story. AAA games usually go full bonkers on graphics, but the story is usually lacking, and they are sterile most of the time from some new interesting game mechanics.
@@RamsesTheFourth Doesn't he mean writing it into a custom engine? I mean just putting ray tracing in a game is easy for anyone at a basic level, it's included in UE5. I can enable it easily there - but I'd be clueless on adding it to a custom engine.
@@kennylaysh2776 Thats what I understand he meant. Sure, If anyone uses commercial engine like UE/Unity then it should be easy to do.
I remember last year, as I was browsing a few (not yet reviewed/incomplete/old) papers on arxiv, and stumbled across a thesis that was talking about cone tracing... I downloaded it and was like "hey one day I'll read it". I never did.
This is probably not the same paper but the second you pronounced "voxel cone tracing" I was like "of COURSE".
Thanks for the vid, love your work. To think they actually thought about that ten years ago...
EDIT: turns out I still have the PDF of the thesis, it's "Audio and Visual Rendering with Perceptual Foundations", which doesn't match any of NVIDIA's papers, yet is dated 2009, so...
I wish you would tell us what kind of graphics card these researchers are using so that we can understand whether they are using the newest GPUs or CPUs as well as the what version of the software. I feel like that would be more informative. Your information is vastly and greatly informative but having those little bits of information will really tell us if they are using current software, GPUs and CPUs. Thank you so much and keep the excellent information coming! :)
They can't have a GPU much more powerful than 2x - 4x of any current high end one. So if they say 80 fps you can expect for sure to get above 20 fps with a good GPU.
Unless it's something that requires a lot of VRAM
@@Nico1a5 The scene where they mentioned 80 fps, they showed that they were using a 3090.
@@descai10 I mean the video mentions "real time" many times, and 20 fps should be an ok number to consider "real time".
If your concern is knowing what hardware was used just to run it yourself, then considering the same case you can expect a nice 5 fps. Which is great compared to a single frame taking minutes to render
@@Nico1a5 Not sure what you are talking about. They used a 3090. That means if you or I use a 3090, we will also get 80 fps in that scene.
@@descai10 on that scene, with that gpu, yes. Only that the video shows multiple technologies at multiple points in time, and there's no mention of the hardware used like the OP is asking
Your stuff is constantly top tier
14:00 seems like motion vectors artifacts. It is a genius move by NVIDIA scientists to use these!
14:35 Democratizing but usually only as long as you have specific hardware ^^
Specific, ready-to-use, affordable (relatively...) hardware that I can also use to game :D I think that more importantly, the papers being released mean that other chipmakers can also implement the hardware acceleration.
The first law of access is that democratization is a process. Do not look at where we are, look at where we will be, 2 more graphics cards down the line.
ReStir does use motion vector to benefit from light calculation done in previous frame
This video is now more relevant than when it was published. I could not wait for ReSTIR to be even considered or implemented in games and now is a reality with CP2077. It is a big big deal.
Another way to make light transport faster is distance based light resolution downsampling. Far away caustics and reflections are not gonna be visible due to the amount of pixels in the image anyways so there is no need for them to be super refined, so as long as the stuff closer to the camera is good enough resolution everything further away can be progressively downsampled to reduce memory usage and possibly increase the accuracy where it matters more..
How would you do that? Maybe render a depth map and fire more rays towards closer regions? Interesting concept
@@clonkex yeah, that might be quite a good trick. Maybe in combination with some ai based stuff you can downsample parts of the view that doesn't change considerably between frames too.
This sounds like what Unread Engine 5 does to concentrate its polygons closer to the viewer.
Oof. Sympathy and empathy to the people who saw that research from 11 years ago and mistakenly thought that widespread adoption and advancement of it would be just around the corner. At least it's in the right hands now. Thank you for bringing it to our attention, Károly!
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me:
+ open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts;
+ step-floor / -storey / -level / -tier pyramid (e.g., as a residential building);
+ 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models);
+ 3D 360° environment (surround), volume, space (room) audio / sound;
+ 360° environment (surround), volume, room screens (displays), monitors, TVs;
+ screens (displays), monitors, TVs without backlighting;
+ many things with magnets like Micro-OLED;
+ 360° screens, monitors, TVs, panels, glass;
+ curved screens, monitors, TVs, panels, glass;
+ 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming!
+ Short Throw Projectors;
+ dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.;
+ fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area! + 360° open cooling, e.g., with air-flow by air-holes, no cases, no walls, no cover, etc.;
- I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing;
+ 360° open, transparent PC cases;
+ 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”;
+ from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds;
+ flying objects, planes that take off and land vertically, for example by hot air, example fighter jet;
+ skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.;
+ bendable, flexible, foldable screens / displays / monitors;
+ bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms;
+ LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc.
+ and much more!
+ I'm not the inventor of VR, but of "AR", AR glass, AR glasses;
It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits!
Before me they all were stucked at LCD, Plasma TV and less lights!
I have made a deep impact in evolution!
Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.!
You have to understand, those can distort everything that can be heard and seen in real life and digitally!
They block and delete my pictures, videos, posts, comments, comments-answers and answers!
They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe!
Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago!
Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
@@damysticalone87 ???
If / When you have money, health or "natural" disaster problems or if / when you are somehow affected by traffic accidents, traffic jams at some point, somewhere, then think of me, share my posts and my pages!
Be sure to read my comments under my first videos completely via laptop or PC!
Be sure to read my contributions, posts, comments under my first videos thoroughly, maybe no longer visible via phone app because they intentionally set it that way! Use laptop or PC!
@@damysticalone87 ///beep hoop////beepboop
If / When you have money, health or "natural" disaster problems or if / when you are somehow affected by traffic accidents, traffic jams at some point, somewhere, then think of me, share my posts and my pages!
Be sure to read my comments under my first videos completely via laptop or PC!
Be sure to read my contributions, posts, comments under my first videos thoroughly, maybe no longer visible via phone app because they intentionally set it that way! Use laptop or PC!
With ray tracing , fluid dynamics , ai etc ,games are set to take a huge leap forward but the best thing is it looks like it's all in favour of the indie developer, this will make games riveting because it'll be like reading a book by a great author, one mind is the way.
It kind of reminds me of when George Lucas made the prequel trilogies. It relied so much of CGI and effects, that nobody involved in the production thought to worry about the acting, writing or plot. The result are some of the worst movies in the franchise. (Although I will say seeing Yoda in that light saber fight was almost enough to make watching it worthwhile, almost)
a leap forward in visuals; probably not much in gameplay tho, which is what keeps a game going.
I'm more impressed at how impressed the narrator sounds
I like your videos but I am sometimes disappointed with how the commentary is heavily weighted towards your emotional responses to the technology rather than the specifics of the technical challenges of the problem, and more particularly how the solution works. If you could give us more education and less how you feel about it, I at least would appreciate that.
Thank you for your work
Indeed, I'm getting hand cramps at this point already because of all the paper squeezing...
Jokes aside, too much reused material in this video.
he only introduce the paper, if you want an explanation, you should dig your own information on the web.
there are a lot of things that won't see the light of the day if it's not introduced.
if he explain how the technology work, then nobody gonna watch his channel
@@jensenraylight8011 I would totally watch his channel for that reason and there are a bunch of channels like that that are quite popular in other areas that go into deep Dives into technology. Unfortunately there's very little popularization of the Science and Technology of artificial intelligence. There's a large amount of stuff in physics and engineering and a decent amount of stuff in chemistry and biology. They take the latest news and give a not only analysis of what it can do and what it might be useful for, but usually spend the bulk of the time explaining how the new technology works, sometimes incorrectly but you know it's UA-cam. I like this channel because it's clear that this wonderful gentleman is smart and does a lot of important reading. I'm just saying that for me personally as much as I enjoy hearing him happy, I wait a lot to hear more explanation of the problem and the solution.
@@michalchik It takes quite a bit of natural intelligence to comprehend the artificial one, and judging by the majority of commenters here the channel would lose most of its audience if it offered deep insights into the subject instead of the regular "ooh, shiny!" eye candy.
@@getsideways7257 lol, perhaps you're right but I think it's worth a try and I think the author of the channel might actually appreciate the response. There are big markets for short and long high quality explanations. On the short side take a look at all the wonderful things that minute physics does for example.
I've been fascinated by raytracing since "Imagine" on the Amiga. It's unbelievable how far raytracing has developed today. Back then, it took me four weeks to do one picture and it was a dream that one day it would happen in real time, although it's not perfect yet. Thank you for your videos.
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me:
+ open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts;
+ step-floor / -storey / -level / -tier pyramid (e.g., as a residential building);
+ 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models);
+ 3D 360° environment (surround), volume, space (room) audio / sound;
+ 360° environment (surround), volume, room screens (displays), monitors, TVs;
+ screens (displays), monitors, TVs without backlighting;
+ many things with magnets like Micro-OLED;
+ 360° screens, monitors, TVs, panels, glass;
+ curved screens, monitors, TVs, panels, glass;
+ 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming!
+ Short Throw Projectors;
+ dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.;
+ fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area!
- I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing;
+ 360° open, transparent PC cases;
+ 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”;
+ from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds;
+ flying objects, planes that take off and land vertically, for example by hot air, example fighter jet;
+ skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.;
+ bendable, flexible, foldable screens / displays / monitors;
+ bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms;
+ LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc.
+ and much more!
+ I'm not the inventor of VR, but of "AR", AR glass, AR glasses;
It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits!
Before me they all were stucked at LCD, Plasma TV and less lights!
I have made a deep impact in evolution!
Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.!
You have to understand, those can distort everything that can be heard and seen in real life and digitally!
They block and delete my pictures, videos, posts, comments, comments-answers and answers!
They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe!
Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago!
Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
I'm an amateur 3d artist and rendering have always been pulling me back. I always have to make a lot of compromises to get a good enough look in "short" time. Or find fake solutions that will mimic what I want to achieve. Especially if I want to make an animation - I can't wait weeks to render a short video, as it means my PC will be out of use for this whole time. So this technology is salvation, if it will be implemented, it will allow me and many other artists let loose of imagination to create what we want, without fear of weeks of rendering ahead of us.
So I agree. What a time to be alive.
A big difference in the real time ray tracing vs the long renderings are the amount of rays that you can calculate. The real time ray tracing is an approximation, but good enough for many applications.
It's interesting how you think of the noise as a limitation of ray tracing when cameras and even your eyes have the same problem in low light.
"We simulate ISO 3200 film."
@@nagualdesign I think some of these are about choosing which rays to emit in the first place to improve problem areas rather than just the usual random distribution. But yes, general denoising algorithms used in post-processing for ray traced images could be used on photos taken with the wrong iso settings or in conditions below the sensor's rated light levels. I suspect some of the non-AI ones used for ray tracing were adapted from ones originally developed for photo and video denoising.
Actually cameras in modern smartphones use these sort of algorithms, to compensate for their small sensors.
@@nagualdesign ISO involves two different types of noise at opposite ends with a sweet spot that minimizes for a given light level if you fix the aperture and exposure time.
I'm not certain but I think the most analogous type here would be selecting too low of an ISO for the light level rather than the sensor noise when the ISO is too high. Either way, the "right" ISO should minimize the noise.
I was wondering about whether they might want to limit the noise reduction in extremely low light areas of an image, since that may help add another subtle touch of realism. Of course, it all depends on where and how it's being applied I suppose.
I've superglued my papers to my hands. Please help.
4:40 I would really like to get a full view of this room as the design is bloody awesome, does anybody know where to download the program/watch a full video of it?
Here is the video. There's a watermark that told me what to look up ;) ua-cam.com/video/h1hdAQQ3-Ck/v-deo.html&ab_channel=BR34K
I grow more and more in love with how incredible solutions to incredible problems are often collecting dust in a drawer somewhere waiting for proper implementation.
Is there any chance that these papers/techniques will be available/implemented in blender in future?
I wish there was no recap of all the previous videos he has done... it's called 2 min papers!
Why does it feel like the last five videos from this guy are basically the same video?
"Dear fellow scholars, this is Too Many Papers with doctor Károly Zsolnai-Fehér."
It's impossible to not get invested in something you talk about... you're just so passionate. These videos are really incredible.
It is a bit of a turn off imo. I feel like if he was more professional then this video could be 2 minutes long and still be 10 times more informative.
@@sk8erbyern Boring contrarianism!
Those read, listen to / bugged / eavesdrop / overheared / tapped, watched, spied, copied and steal many of my ideas (=inventions) initiated by me:
+ open, transparent 360° sunlight Buildings, construction methods, architectures, designs, concepts;
+ step-floor / -storey / -level / -tier pyramid (e.g., as a residential building);
+ 360° stepped floors / storey / level / tier buildings (constructions, architectures, construction methods, designs, concepts, 2D, 3D, models);
+ 3D 360° environment (surround), volume, space (room) audio / sound;
+ 360° environment (surround), volume, room screens (displays), monitors, TVs;
+ screens (displays), monitors, TVs without backlighting;
+ many things with magnets like Micro-OLED;
+ 360° screens, monitors, TVs, panels, glass;
+ curved screens, monitors, TVs, panels, glass;
+ 360° reflections and light digital: "ENB", "Ray-Tracing" / "RTX", “Lumen illumination” and whatever renaming!
+ Short Throw Projectors;
+ dark backgrounds / themes / skins for windows, browser windows, internet sites, programs / apps, etc.;
+ fanless, passive cooling; + thermal pads; + heat pipes; heat fins (no sure yet); + 360° heat fins; + Water cooling (unwanted, because it conducts electricity) and + liquid cooling (wanted) for the computer and personal computer / PC application area!
- I am not the inventor of refrigeration by nitrogen, instant- / blast-freeze / -freezing;
+ 360° open, transparent PC cases;
+ 360° Sphere, Ball as a Wheel (2D, 3D) “Omniwheel” / “Omni-Wheel”;
+ from a standing position / standing a 360° turn / rotation in any direction on all objects, including logically and of course vehicles, transporters, robots of all kinds;
+ flying objects, planes that take off and land vertically, for example by hot air, example fighter jet;
+ skyscraper up to the weightlessness of space ("space scraper") - "mega construction", “orbital”, etc.;
+ bendable, flexible, foldable screens / displays / monitors;
+ bracket / mounting arms for PC, screens (not sure yet), + 360° "ergonomic" bracket / mounting arms;
+ LED lights (at least its implementation and intended use outside of tv and pc-monitor); RGB LED “Nanoleaf”, etc.
+ and much more!
+ I'm not the inventor of VR, but of "AR", AR glass, AR glasses;
It was only later that I realized that they derived a lot from my ideas (=inventions), a lot came about that has to do with color and light, through me as an initiator, booster / catalyst, e.g., through my idea, invention of the Screens without a backlight and without a built-in / integrated backlight! It is no coincidence that only afterwards, after I initiated this, they built, built and are building those inventions thanks to my impetus! They sell my ideas (inventions) as theirs! They are not the inventors, but the first technical implementers of my ideas (=inventions)! And those are not the inventors, but the thieves of my ideas (=inventions) initiated by me, because those act as if I wasn't the first hand and the first domino, and they take unjustly, undeservedly a lot of money, stolen money (blood money), recognitions, awards, certificates, fame and history, they boast of my laurels / merits!
Before me they all were stucked at LCD, Plasma TV and less lights!
I have made a deep impact in evolution!
Those manipulate, sabotage, falsify, distort images, paintings, digitized and real, animations, videos, films, also composed of many images, even the publication dates of mine, others and their posts, images, videos, etc.!
You have to understand, those can distort everything that can be heard and seen in real life and digitally!
They block and delete my pictures, videos, posts, comments, comments-answers and answers!
They are poisoning and murdering the world with fake diseases, treatments, "vaccinations"/ "vaccines" and injections by syringe!
Now they also make it out as if they haven't been ripping off, cheating, enslaving, murdering other countries with the money currency, money exchange by even -99% for more than two centuries! And as if I'm not the first to disclose that and more! As if I didn't disclose and initiate > 1.00 Ruble (₽) = 1.00 asian Yen (¥) = 1.00 Euro (€) = 1.00 Dollar ($) = any (X) any country < years ago!
Each and every non-civilian you hear and see on TV is involved! You can hear and see their > blue blue blue
Stimpy does one hell of a voiceover in these. Kuddos!
Man the video was interesting, but the way he talked was hard to bear. Having a pause after every second word and then dragging every thrid word for half a second is an interesting rhythm perhaps, but hard to follow without going insane.
I do get language barrier, but with a Script and Preparation, this should be possible to overcome. I'm not a native speaker either.
Yeah reading the comments it seems many non-native English speakers have trouble understanding/listening to his voice. I personally have watched many of his videos and didn't even realize anyone had this problem. For me its extremely clear and understandable, but English is my first language so that may be why. I can maybe see his tone/pauses being confusing or jarring for someone non-native....
@@GeorgeAlexanderTrebek I didn't say confusing, or that I didn't understand it, his Pronounciation was very clear after all; it was simply Torture for the Mind. The Language itself was not the Problem, it's his speech Patterns whose Weirdnesses I attribute to his probable Migration Background / English Unfamiliarity.
I did say "hard to follow" but only if you look at the Words in a Vacuum, I said "hard to follow without going insane"^^
So yeah, I learned a lot from the Video, just at the Cost of some Mental Stability that Evening lol
If someone was talking like that in my own Language I'd be equally annoyed.
He talks like. William Shatner. from Star Trek. And is. Annoying to. Follow along. With all the. Unnecessary pausing. :-/
@@dieblauebedrohung hmm interesting, that's fair. I never really said That YOU said confusing or didnt understand, more that many non native English speakers have these issues just by reading the comments. I also said that it might be hard to listen to or be jarring for non native speakers, which is what you mentioned after about him having irregular speak patterns. I watch his videos at 1.1 or 1.2 speed so less noticeable for me anways....
@@GeorgeAlexanderTrebek Okay, fair enough. Wording lul.
But yeah, I sped it up as well and that helped a lot.
My favorite part is that Ren from Ren and Stimpy is summarizing these papers for us.
Didn't we see this video about a year ago? Why are we seeing this again? 🤔
I think it's one of your most underrated videos! People can truly bring such a level of vividness and clarity to the subject when they're actual qualified experts on the topic. I love how this is basically an ode to a cool algorithm, focusing on as much detail as UA-cam's format would allow instead of glossing over it completely like in OpenAI's recent marketing brochures of papers. Wish you the best of luck!
Why does he talk like that? wow so annoying, I like the topic but brah that's a cringy way to talk
Am I having a stroke, or have I now watched 4 videos from you that are basically the same?
I love you videos and have watched for years, but surely they should be additive and we shouldn’t have to pretend that it it currently takes a week to render caustics each time.
It feels like Nvidia sponsored content by this point.
It's interesting being able to watch this journey of ray tracing both from the coding and technical side of it, here, and watching it actually being implemented, such as in gaming communities. Admittedly, part of me is also interested in the optimization of code more than anything else. I'd love to see some projects focusing on doing the same things we can already do, but for less processing power and less storage space.
When I go to graduate school for physics, I wanna start a channel like yours but for pure mathematics and physics papers :D
I feel like in terms of getting it into real time one trick will be to actually put the speed of light into your simulation - that is a real life ray only moves so far in the few ns of a processor tic, so only calculate that much of the path before starting the next cycle. yes you would need very fast processor cycles, and more ram to track it all, but that is something were pushing to build more and more anyway
I think you are correct. Current technique is by tracing the ray from the viewpoint. This is incorrect and is really just cheating, but they say the opposite is "impossible". I disagree, it is possible but just takes longer to compute. What really needs to happen is we need a space-time simulation that emanates the 'light' or photons throughout the cells of the scene, creating a wave format of light. The waves will naturally provide the specular and reflective, refractive properties that light in our real life has. But that's only the first step, we would then need to simulate a lens that can accurately capture these waves that are emanating and bouncing throughout the cells, a virtual model that implements the properties of a camera lens could be a good example. The limitations of this approach is immense, but it is the correct approach, and if we wish to truly have life like real time rendering, the hardware needs to address these limitations. Memory, computational power. Maybe someday when we have consumer level quantum mechanical machines, but for now we can start on the software side of things by creating slow prototypes.
You dont even need to do the whole scene. For animations you know where the camera will be in the next several frames, you can do a quick "What will it be able to see" outgoing ray trace with say 1000 rays (instead of the normal 8.3 million per frame) and back track using the speed limited version in the high detail levels 4k rendering needs. In a video Game you could cut it down to what positions could the camera reach within the back track time of the longest ray. The player can only move so fast, and turn so much per frame. Theres a finite amount of scene they will be able to look at.
This is still a ways away from being game engine ready, but the unpredictability of a physics driven and user interactive scene combined with the very tight deadline of having to wait for physics engine and still finish within 1 / 120th of a second 120 times a second is a much more challenging.
Appreciate the flashing lights warning, thank you
There seems to be a loss of texture resolution that you didnt bring up. Other than that its a great video.
Look at the beam in 11:46 to see what i mean
I saw that and thought "Couldn't I have just used a gaussian blur? Result looks similar..."
Only people that work with CGI knows how crazy and incredible to have RT in real time in games is. Gamers don't know 1/2 of this even means. I don't care much for RT in games, in my view most of the implementations are very poor, but still incredible.
I'm so glad that you're becoming so popular in the field(s). Is there any way to invest in your success?
You are very kind, thank you! Just being with us and watching the series is an honor for us already. No need to do more - thank you! 🙏
wait I swear I have seen this video before! is this a reupload? :D
great video nonetheless!
We are right to think that voxel is the future. Good days when light calculations will be better with both physics and pixel graphics in a world filled with atoms instead of mesh. Pixels are powerful because images are also made up of pixels. I think a format such as png, jpeg, etc., where voxels and pixel matrices are compressed, can also be used for voxels.
It's not rendering in voxels, they're just used to divide up the scene.
@@Razumen I didn't say it was. I'm talking about how voxel will be a strong investment for the future. The reason for this is the need for a voxel-based structure in many places. fluid physics, light-per-pixel calculation, etc. that is, dressing dimensional entities by forcing a dimensionless entity such as mesh. instead of weird stuff like uv in mesh technology for example. color, physical information and much more information can be given pixel by pixel. it will be perfect. and yes you are right my comment is a bit off topic in the video.
One computer graphics advancement that I hope will be enabled by ray tracing is vector geometry. Since we aren't doing computations per polygon anymore but per light ray, we can potentially have "infinite" number of polygons, i.e. smooth vector 3d models.
is this even progress anymore? you're just pretty much clickbaiting at this point. if you took a thumbnail from a video released a year ago and compared it to this, there would be the unclear (real) version and the cgi edition. i know that you're looking for content nearly every day but understand it takes a while for stuff to happen. rome wasn't built in a day. i'm just posting this as i'm concerned where the content is headed and i think you should slow down. i mean, the last couple videos were literally just recycled content.
This guy sounds like, at any second, he will reveal an offer, that will change my life forever (at only $29,95).
A 16 minute Two Minute paper. You know it's going to be good.
Hope people developing Cycles and EEVEE see these papers.
This video could be reduced 80% if we exclude all the awe and amazement expressions from Károly, BUT these papers improvement clearly deserve each one of them!
Would it be. Possible. Tooo. Narrate this. In a waaay. That is. Human under. Standable? Oh, my! Yes, yes and yes.
this actually a very great research. The problem of water light refraction is not the noise, but the fireflies (like noise but it is the artifacts of bounching light). Until now there is no solution than tricking the scene. But now this, amazing...
For the missing paths, the fireflies, can these spots be estimated by having the geometric position of that spot averaged with the values near it? Similar to anti-alising algorithms but with light rays?
That's basically the beginning of a denoising algorithm.
Incredible video. The amount of times i have rendered little home made projects and watched videos about rendering i never knew why noise would show up in rendering. But In the first minute of this video I had learned why.
Why do you pause so much in between sentences? It gets quite annoying and distracting after a period of time because it feels like the video was recorded by a 9th grade student for his summer project. You have done excellent research for the material content required for this video, but the delivery needs a little more work.
13:40 why is the caustic pattern on the ground more developed on the right but the light reflecting off the back of the rabbit looks more noisey than the older version on the eft?
Why are you re-posting this video? You showed this one months ago now?
So much of this video is exclamations of "omg wow look at it! Amazing, wow! It takes Days! Or does it? It's impossible! Or is it? Days or milliseconds? Hold on to your papers!"
It's almost like this is a spoof on a Two Minute Papers video ...
This whole thing could have been 1/2 the length and have 3x the technical content ...
*Why not "solve backwards" instead?*
Set the default "un-traced" space to either some impossible color (eg: 100% transparent), or (bright pink, or the opposite of the nearest "traced" pixels) something.
Then just scan for one of these "impossible areas", and shoot rays from the camera towards that area & see where they end up.
(I bet this has already been tried, so-) Why wouldn't this work?
That'd get you to the first bounce of light, but can't handle the interaction of all sources of lights interfering with every pixel on screen. Basically what pixel shaders have done for the past 20 years. The benefit of this is getting actual diffuse lighting.
I miss the real Two Minute Papers. Does anyone know a channel where they still give a shit?
i just wish this feature will make its way to blender, i am more then OK with the shortcommings if it means my renders will be any faster,
and it could improve viewport clarity as currently you get some really funky artifacts in poorly lit scenes.
You made me wanna play some RTX game on my Nvdia graphic card more than study now XD
You should get paid by them for advertising 😅
This is mindblowing!!! Honestly, render costs/times is the reason why I quit working in 3D art.
duh
@@egretfx duh
@@facts9144 ok
It’s been pretty easy to render complex scenes for a while now… unless you quit over 10 years ago…
I love seeing the improvements being done, great work.
As an artist myself, these changes will help in so many ways.
Question: While Ray-Tracing is impressive, isn't Path-Tracing better, doesn't it give more realistic results?
If so then is the reason it's not used to much in real-time due to the time it takes to calculate the light rays??
Path tracing is often referred to as ray tracing. These papers mentioned are all variants of path tracing.
Ray tracing in the research sense refers to path tracing where the light rays originate from the light source and bounce around the scene until they hit the camera. Ray tracing in the graphics sense refers to light rays originating from the camera that bounce around the scene and end at the light source, which is computationally faster but less accurate than path tracing. In fact rasterization is technically also ray tracing, but with a single ray originating from the camera that ends when it hits an object and doesn't bother finding a light source.
I like these 2 minute papers but I can't stand the hyperbolic tone for EVERY paper. Chill out man..
We don't want Nvidia to include some 3d rendering of a bunch of marbles. We want the driver source code behind the GPU we have spent hounderds of dollars for. We want full support with linux without wasting hours looking for solution for easily evadable problemes.
I found out at 13:25 that the new method has some more troubles
calculating specular rays but is better with reflection and refraction rays
but still it has much less Noise than the previous method
but if we also overlay a view of the previous frames it
would have even less Noise and also some Motion blur
so it looks more natural
also NVIDIA could use a seed function to remove even more flickering
instead of using a random number each time each frame
I understood nothing but this dude being so excited about stuff was compelling to watch.
I really like the software advancements NVIDIA is doing, but sometimes I can't help but wonder about their hardware, Especially with the new RTX 40 line.
I hope they can address the fact you need a 700W+ PSU just to run a graphics card..
No, you don't; it's only on a halo product, and the consumption increase is the new normal if you want to reach that level of processing capabilities. If it could be helped, it would, but limitations in Moore's law and costs aren't helping.
The rtx4090 doesn't actually run at much higher power levels than the 3090.
@@guillermojperea6355 It's Nvidia's fault
You are just too young. I remember when we laughed a the Pentium CPU, because it required cooling. It was cooled by a tiny fan, little less than 2 inches. Before that CPUs didn't have cooling at all, not even passive. And that also meant that power consumption was only a few watts.
Power and cooling requirements have been increasing ever since. In exchange you get even higher performance. It's possible to turn down the voltage and frequency a bit to use less power, and need less cooling, but you get less performance, and it doesn't reduce the cost of the card significantly, because you are mostly paying for the silicon.
Also 1000W+ PSUs aren't that expensive anymore. At least not compared to a 4090.
I know nothing of what it takes to render light but I love how excited you are.
By the way, Martin of Wintergatan referenced your catchphrase in his latest Marble Machine X video where he's testing repeatability of a marble gate and making improvements.
oh man this is epic news for CG artists, cannot wait to speed up my workflow!!! what a time to be alive
I always enjoy the enthusiasm in your presentations.
The result of reinventing the wheel.
Don't listen to anyone.
aAand, sOooo, thEeen
Thanks!
Thank you so much for your generous support! 🙏
To hear so much enthusiasm for the topic in the voice of the narator cheered me up 😀
I'd be curious to learn how nVidia's ray-tracing papers and their applications differ from or are related to those which led to AMD and Intel's ray tracing support. nVidia uses their own proprietary tensor cores to handle the vast majority of the ray-tracing calculations, so I'm curious if AMD and Intel have simply adapted their process and algorithms to more abstracted instruction sets for their conventional graphics processor unit modules or if they have had to create their own method for calculating the ray-traced light dynamics for their scenes. It's a really cool subject and I'd love to see a deep-dive video about it sometime with analysis on how these different platforms have managed to get ray-tracing to work for them and how the different iterations of the process have changed how the image is formed.
That was a lot of triple yesses! 😃 Squeezing that paper! 📜
Love the enthusiasm in this video 😃, however it’d be nice if you went into more details about how do these new gamechangers work and manage to improve performances
Can u use a different accent in your text-to-speech program?
No need to be dicky. Dude's from Hungary.
Thanks Two Minute Papers, your videos are really amazing
I’m so sorry but the way this guy talks grates on my nerves like nothing else jesus christ
i disagree, his voice isnt bad at all
X Minutes Papers :) Not that I mind. Love your videos and grats on the GTC talk!
The solution is impressive. The sample scenes are terrible. Unless you understand the value of what you're looking at most of them look like an old videogame.
What a time to be a ... new sub to Two Minute Papers! LOL. Sooooo much to catch up on (it's been my ENTIRE WEEKEND)! 😍 Thanks for all you do!
They lost quality on the bunny ears and significant energy in specular reflections. Reflections are smudgy and jumpy over time from undersampling in temporal domain. Eventually I'd like to see image based BRDFs as well so the materials can get some more variety.
I would rather see advances looking like camera noise than this blurry mess honestly.
@@C.I... Check out the grain produced by a proper spectral path tracer with a camera model
Everytime he says "AND" I take a shot.
Can’t the standard error be calculated at each point? If the standard error is large, then the Ray is replaced by a diffuse light source. ?
ReSTIR does much more by checking neighbors in space and time as well as a "super bias" fallback like you suggest
Awesome video! Just started learning about Path Tracing and it's really cool stuff.
Caustics is exponentially bouncing lights. There is no way we can bruteforce this, we need to make shortcuts if we want it in real time.
Saw this on the stream of DotCsv with Asier, glad to see you there ;)
The borderline speaking/whispering is beyond annoying lmao