NVIDIA’s New Tech: Next Level Ray Tracing!

  • Published 15 Oct 2024
  • ❤️ Check out Microsoft Azure AI and try it out for free:
    azure.microsof...
    📝 The "Amortizing Samples in Physics-Based Inverse Rendering using ReSTIR" is available here:
    shuangz.com/pr...
    Erratum: at 5:12, I should have said "has 100x lower relative error". Apologies! Removed that part of the video so you won't hear it anymore.
    Andrew Price's Blender tutorials:
    • Blender Tutorial for C...
    📝 My paper on simulations that look almost like reality is available for free here:
    rdcu.be/cWPfD
    Or here is the original Nature Physics link with clickable citations:
    www.nature.com...
    🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
    Alex Balfanz, Alex Haro, B Shang, Benji Rabhan, Gaston Ingaramo, Gordon Child, John Le, Kyle Davis, Loyal Alchemist, Lukas Biewald, Martin, Michael Albrecht, Michael Tedder, Owen Skarpness, Richard Sundvall, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Tybie Fitzhugh, Ueli Gallizzi.
    If you wish to appear here or pick up other perks, click here: / twominutepapers
    Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
    Károly Zsolnai-Fehér's research works: cg.tuwien.ac.a...
    Twitter: / twominutepapers
    #nvidia

COMMENTS • 190

  • @TwoMinutePapers
    @TwoMinutePapers  4 months ago +32

    Erratum: at 5:12, I should have said "has 100x lower relative error". Apologies and thanks for the catch @thomasgoodwin2648! 🙏 Update: Removed that part of the video so you won't hear it anymore.

    • @Tom-cq2ui
      @Tom-cq2ui 4 months ago +2

      I was just wondering about that. Thanks for the correction! Amazing topic too!

    • @etmax1
      @etmax1 4 months ago

      100 x lower error is not what the text said, it was 20 x lower error. The 100 x was the speed. 🙂

    • @321mumm
      @321mumm 4 months ago +1

      @@etmax1 The only time-related words I can see are "At equal frame time", which means the same time in my book. So there are no statements about being slower or faster, as far as I could read.

    • @etmax1
      @etmax1 4 months ago

      @@321mumm Two Minute Papers acknowledged what I was saying and attributed it to thomasgoodwin2648 (presumably because they posted it first), so I suggest you look more closely at what I wrote, or look at thomasgoodwin2648's post.

  • @Nulley0
    @Nulley0 4 months ago +478

    Take a shot every time Nvidia releases "Next Level Ray Tracing"

    • @doodidood
      @doodidood 4 months ago +54

      Alcohol poisoning imminent

    • @Shy--Tsunami
      @Shy--Tsunami 4 months ago +5

      Consistent buzz

    • @dertythegrower
      @dertythegrower 4 months ago

      It came out in 2019-ish... I have built gaming PCs since 1999... ray tracing came out for all gamers with the 2050 and 2060 series... I got a 2060 Super as soon as I saw it, before the crash for parts.

    • @AzumiRM
      @AzumiRM 4 months ago +25

      Take a shot every time Apple says they "innovated" something that already exists.

    • @viktorianas
      @viktorianas 4 months ago +14

      So how did you end up at the Alcoholics Anonymous club? Well... here's my story...

  • @thomasgoodwin2648
    @thomasgoodwin2648 4 months ago +91

    @5:07 Actually, it reads 'Up to 100x lower RELATIVE ERROR than baseline methods.', not 100x faster. Still awesome though.
    🖖🙂👍

    • @colinbrown7947
      @colinbrown7947 4 months ago +15

      Right before that, it also says "in the same timeframe". So Two Minute Papers assumed a linear relationship between error and time, and guessed that if you wanted the same quality it would be 100x faster. Although I think that's unlikely.

    • @TwoMinutePapers
      @TwoMinutePapers  4 months ago +17

      I should have been a little more accurate there, good catch, thank you! Upvoted for visibility and added a note about it in the video description. Update: Removed that part of the video so you won't hear it anymore.

  • @test-uy4vc
    @test-uy4vc 4 months ago +185

    What a ray time to be traced alive!

    • @mr.critic
      @mr.critic 4 months ago +1

      ⚡😂

    • @locinolacolino1302
      @locinolacolino1302 4 months ago +1

      Life has always been ray traced throughout history, yet it is only now that we must come to terms with the fact that the ray is inescapable.

  • @Kuroo9
    @Kuroo9 4 months ago +65

    Two more papers down the line would be next week, right?

  • @Shy--Tsunami
    @Shy--Tsunami 4 months ago +50

    My mouth opened at the dragon modelling

    • @Nulley0
      @Nulley0 4 months ago +2

      That was "two papers up the line" (previous paper)

  • @thezoidmaster
    @thezoidmaster 4 months ago +22

    What a time to be alive!

  • @Mad3011
    @Mad3011 4 months ago +25

    What a time to be a ray!

    • @sash1ell
      @sash1ell 4 months ago

      Love getting traced.

  • @MTheoOA
    @MTheoOA 4 months ago +133

    This is a dream come true for me. I'm creating a world with more than 150 characters and more than 1000 buildings drawn by hand, and this... this is what I wanted. I can model; I'm an architecture student using Rhino, Grasshopper, etc., but it's just absolutely crazy what NVIDIA is doing. I hope this comes soon.

    • @dertythegrower
      @dertythegrower 4 months ago +2

      It was awesome when it came out for us on the 2060 series

    • @ValidatingUsername
      @ValidatingUsername 4 months ago

      Try to reduce the number of objects rendered to save on energy requirements if you can.

    • @blackshard641
      @blackshard641 4 months ago +1

      Rhino? Now that's a software name I've not heard in a long time.

    • @WallyMahar
      @WallyMahar 4 months ago +2

      And please, stop with the jump cuts the YouTube algorithm supposedly needs, to videos that have nothing to do with what you're talking about. That was kind of disappointing.

    • @westingtyler1
      @westingtyler1 4 months ago +1

      Ooh, what is your game? Mine is called #NotSSgame, and I have a similar number of characters and buildings. I have some update vids about it. I hope to see more about your project soon!

  • @anywallsocket
    @anywallsocket 4 months ago +6

    You know what would make all these scenes even more realistic? Adding a bit of dirt everywhere 😂

  • @rallicat69
    @rallicat69 4 months ago +12

    IM REVERSE HOLDING ONTO MY PAPERS REALLY HARD SIR✋

  • @itskittyme
    @itskittyme 4 months ago +13

    Where is the previous video about how to control ChatGPT?
    The video was removed before I had a chance to watch it 😞

    • @samuel.f.koehler
      @samuel.f.koehler 4 months ago +1

      me too!

    • @TwoMinutePapers
      @TwoMinutePapers  4 months ago +6

      I apologize as the quality of the video wasn't really what you would expect from us, and thus we removed it.

    • @itskittyme
      @itskittyme 4 months ago +1

      @@TwoMinutePapers aaww okay 😞thank you for letting us know

  • @DesignDebtClub
    @DesignDebtClub 4 months ago +1

    I feel like thinking about this as a way to take existing 3D renders back to 3D meshes is impressive but an odd and narrow use case.
    Seems to me that this is heading toward the ability to reconstruct scenes based on photographs - even things off camera, based on shadows and reflections.
    It's heading toward a tool for Blade Runner-type detective work.

  • @Yamagatabr
    @Yamagatabr 4 months ago +1

    Oh man, I love doing computer graphics. That's what I've dreamed of since I was a kid. It's really a bummer to see the artistic process being erased like this.

  • @alejobrcn6515
    @alejobrcn6515 4 months ago

    This voice, the video editing, and the script are synthetic; that's magic! 😮

  • @haydenveenstra1941
    @haydenveenstra1941 4 months ago

    What a time to be alive! I can't wait to see if this will be used for forensic science where shadows of objects are reverse engineered to expand a video or image in greater detail and help solve cases!

  • @AlessandroRodriguez
    @AlessandroRodriguez 4 months ago +1

    6:13 What a revelation! I would have never thought you liked papers.

  • @scruffy3121
    @scruffy3121 4 months ago +1

    At 5:09 the marked text says 100x less error rate, not 100x faster. Am I missing something?

  • @ShadowRam242
    @ShadowRam242 4 months ago +1

    Inverse Rendering? Screw video games. Do you know what that would do for SLAM and robotic navigation?

  • @Verrisin
    @Verrisin 4 months ago

    Is prism tracing possible? Follow not a single "line" but a genuine triangle (3 rays) mapped to the screen, find which areas it intersects, each contributing that percentage of the color. It would split into many sub-prisms instead of running a ray all the way from the start each time, knowing here whether the contribution is large or tiny.
    - Then "reflections" are of the whole "triangle surface intersection", creating a new, wider prism that cares less about detail - possibly multiple.
    - One would batch not the whole "path" of a ray but each "straight prism segment", then SORT all remaining by contribution (area of intersection with ~screen), and repeat until time runs out, then somehow cheaply "guess" the rest.

    • @Verrisin
      @Verrisin 4 months ago

      It might need a different way to represent objects in the scene, but if it's possible, I really like this conceptually.

  • @Jacen777
    @Jacen777 4 months ago

    This is a really good thing. I always wanted to be a fiction writer, but I suffer from dyslexia. With the help of AI, I'm now well on my way to completing my first novel. It's important to note that AI is a tool that allows me to bring my thoughts and ideas into the world. But it doesn't simply spit out the work. I still spend many hours planning, developing, guiding, tweaking and editing the entire process. These tools give me the ability to create in ways I could never dream of before. So, I think it's possible that AI will be used to empower new artists who previously faced some physical or mental disability that prevented them from creating in that space before. I believe this technology will create numerous new artists.

  • @wadeheying7117
    @wadeheying7117 4 months ago

    Thanks for including the legendary Andrew Price.

  • @Drokkstar_
    @Drokkstar_ 4 months ago

    Seeing real-world simulations become more accurate as well as getting faster makes me wonder if P really does equal NP.

  • @channel11121
    @channel11121 2 months ago

    How does it compare to Mitsuba?

  • @Z_Inspector
    @Z_Inspector 4 months ago

    The Gigachad, Way2 Dank and Copege drinks in the first few scenes are hilarious

  • @goldenheartOh
    @goldenheartOh 4 months ago

    I'm picturing our grandkids using this thing to casually create games as easily as we doodle, & them being in awe that we were ever smart enough to write the code for games ourselves from scratch.

  • @geekswithfeet9137
    @geekswithfeet9137 4 months ago

    This sounds like an absolute winner for computed tomography.

  • @ClintochX
    @ClintochX 4 months ago +2

    What's the ray tracing in this?

    • @_John_P
      @_John_P 4 months ago +3

      At 04:05, an example is given where the shadow is the input and the method reconstructs the object from it. The shadow even moves, making the object move in accordance, hence it's some sort of "reverse ray tracing" effect.

  • @MoMoGammerOfficial
    @MoMoGammerOfficial 4 months ago +1

    Whoa! This is a game changer!

  • @Kenjineering
    @Kenjineering 4 months ago +1

    "Enhance 15 to 23. Give me a hard copy right there"

  • @eduardodubois4994
    @eduardodubois4994 4 months ago +5

    Wow just incredible.

  • @abowden556
    @abowden556 4 months ago

    I like this idea. I have been collecting images from beautiful or interesting places with the goal of someday using technology like this on them.

  • @SaumonDuLundi
    @SaumonDuLundi 4 months ago

    What a time to be alive!

  • @yumri4
    @yumri4 4 months ago

    It is much closer to it than the research paper from 2 years ago that went through how to generate a 3D object from a 2D image. She kind of got it, kind of didn't. The part it wasn't able to get correct was where there was a dip in the top of the 3D object that had nothing going through the object. From this, I think that example will still be impossible to do.

  • @sethart22
    @sethart22 4 months ago +4

    Is your voice generated by AI? Because it sure sounds like it.

  • @bug5654
    @bug5654 4 months ago

    0:11 Blender donut man sighted, community engagement in progress....
    (note: making fun of chat not accusing 2minpapers)

  • @ReportJungle
    @ReportJungle 4 months ago

    What a wonderful time to be alive.

  • @MikevomMars
    @MikevomMars 4 months ago

    This is similar to what the Quest 3 VR headset does when scanning your room and creating a 3D mesh from it.

  • @Entropy67
    @Entropy67 4 months ago

    That would be amazing. It's a dream to make something with my hands and have that modelled into a game... with just a picture, I can do that. Wow.

  • @The_Questionaut
    @The_Questionaut 4 months ago

    This is unrelated, but I was thinking about making it easy and intuitive to pose and animate characters for text-to-video.
    You could use something like ControlNet with keyframes, so you can control the posing and movements of a character in the text-to-video generation.
    I don't think anyone has done this; I would love to see it done.
    Imagine posing your character just by dragging a stickman around and then hitting the generate image button.
    Easy posing, if someone pulls this off.

  • @handycap7625
    @handycap7625 4 months ago

    What happened to your 'can AI be controlled' video?

  • @derekgamer1978
    @derekgamer1978 4 months ago

    I'm so curious now. What if you put in a non-linear geometry as the photo? Like an illusion.

  • @TomaszZamorski-b6v
    @TomaszZamorski-b6v 4 months ago

    2:26 what paper is that?

  • @dexgaming6394
    @dexgaming6394 4 months ago +1

    I'd really like to see a machine learning model completely replace the rendering process. Imagine if you gave the model the textures, materials, and geometry information, and it could generate an image that appears like it was rendered with a slow path tracer. That could completely make ALL path tracers obsolete, and also make gaming far better if it could run in real time.

    • @raymond_luxury_yacht
      @raymond_luxury_yacht 4 months ago

      The future is diffusion? Once consistency is achieved, you just diffuse the frames.

    • @dexgaming6394
      @dexgaming6394 4 months ago

      @@raymond_luxury_yacht No, not really. I think that diffusion would be too slow to run in real time. I thought of something like Neural Control Variates, which was also covered on this channel.
      ua-cam.com/video/yl1jkmF7Xug/v-deo.html

    • @dexgaming6394
      @dexgaming6394 4 months ago

      @@raymond_luxury_yacht No, because I think that process would be too slow to run in real time. There's something called Neural Control Variates, which has been covered on this same channel.
      ua-cam.com/video/yl1jkmF7Xug/v-deo.html
      The only problem is that this AI model is not available to the public.

  • @CarterSavin
    @CarterSavin 4 months ago +1

    0:53 Jam a man of fortune and J must seek my fortune - xQc

  • @parthasarathyvenkatadri
    @parthasarathyvenkatadri 4 months ago

    Wouldn't it be better to have a video of the place and have it render a 3D scene?

  • @sammidgirl94
    @sammidgirl94 4 months ago

    How do magnets work?

  • @Troph2
    @Troph2 4 months ago

    Picture to 3d modeling is huge for 3d printing.

  • @Mranshumansinghr
    @Mranshumansinghr 4 months ago

    Soon I will not be texturing my models. Nvidia will do it for me. What a time to be Live!

  • @LydianMelody
    @LydianMelody 4 months ago

    Donut 5.0’s gonna be a real short video

  • @Qimchiy
    @Qimchiy 4 months ago

    Kinda like a morphing geometric version of Gaussian Splatting?

  • @deniskhafizov6827
    @deniskhafizov6827 4 months ago

    4:18 Human beings specialized in X-ray crystallography (which is basically reconstructing molecules from their shadows) would doubt that. For example, Dorothy Hodgkin established the structure of vitamin B12 solely by hand, without using computers (and got a Nobel Prize for that).

  • @korinogaro
    @korinogaro 4 months ago

    Cool, but it is hard (especially in the context of the game making OP mentions in the beginning) to come up with ideas for creative uses of this. I get the situation where you've had a hardware disaster and lost a lot of data (3D models and materials included), but some backup with screenshots of the models survived, so you can automatically re-make the models. But how would you use it in a constructive and not reconstructive way?

    • @somdudewillson
      @somdudewillson 4 months ago +2

      It would accelerate the process of going from concept art -> usable game asset by potentially providing a good starting point.

    • @korinogaro
      @korinogaro 4 months ago

      @@somdudewillson Yeah, I thought about it a couple of seconds after posting. Designers make character/object designs, give them to 3D artists, who use something like that to make 3D models fast and then just touch them up here and there.

    • @GCAGATGAGTTAGCAAGA
      @GCAGATGAGTTAGCAAGA 4 months ago +1

      @@korinogaro Yes, but IN THE FUTURE there will be no concept artists, because we won't need them anymore! 🤓 More than that, we will not need any humans anymore! One AI for generating the prompt, the next AI generating art from the prompt, the next one making models from that art, and so on! Wow, what a time to be alive! 😎

  • @zrakonthekrakon494
    @zrakonthekrakon494 4 months ago

    It's time we start asking the real questions: what will ray tracing look like 3 papers down the line… exactly the same? Probably.

  • @georgepaschalis365
    @georgepaschalis365 4 months ago

    What's the 'Revolt' looking game? Would love to play that!

  • @coloryvr
    @coloryvr 4 months ago +1

    Yes! Amazing!

  • @KevinLarsson42
    @KevinLarsson42 4 months ago

    I can imagine a ton of use-cases of this

  • @astr010
    @astr010 4 months ago +20

    Imagine doing text to image to 3d scene

    • @KP-bi6px
      @KP-bi6px 4 months ago +4

      coming soon: create a movie/video game from text 😂

    • @dertythegrower
      @dertythegrower 4 months ago

      It already existed in 2023, for commercial use. It was shown on video a year or so ago... iykyk

    • @dertythegrower
      @dertythegrower 4 months ago

      @@KP-bi6px That exists too, like I said above... 2023. It does exist, in its first stages, and was shown on video if you dig through this kind of nonsense pushed to the top.

    • @dertythegrower
      @dertythegrower 4 months ago

      All of the above is on video and exists... this guy shows the mainstream.

    • @KP-bi6px
      @KP-bi6px 4 months ago

      @@dertythegrower ah

  • @WallyMahar
    @WallyMahar 4 months ago

    I really thought you were going to do reverse ray tracing: trace all the rays from a photograph by looking at the materials, the glass, the reflections, and reverse the ray tracing. It looked like you were going to show that, but then... Oh well, next paper.

  • @unadventurer_
    @unadventurer_ 4 months ago +1

    I'll eat my shoe if this guy can finish a sentence without awkwardly pausing every 3 words.

  • @mindful_clip
    @mindful_clip 4 months ago +1

    I'm going outside with my robot.

  • @AAvfx
    @AAvfx 4 months ago

    😮😮😮😮😮Best show ever 😁

  • @noisetide
    @noisetide 4 months ago

    I'm getting raybumps...

  • @b.7944
    @b.7944 4 months ago

    It is theoretically impossible to model the invisible side of an object from a single image.

    • @OGPatriot03
      @OGPatriot03 4 months ago

      What makes you say that? A human can do it by understanding the context behind the image.
      (If you've seen what a desk lamp looks like, then you can figure out what the back side of it likely looks like, with a high degree of accuracy) AI works in a similar fashion.

    • @b.7944
      @b.7944 4 months ago

      @@OGPatriot03 You will never be sure about an invisible side. You can only guess. How do you know this time that the desk lamp looks different? There is not even an argument about this. It is just impossible. If a guess is sufficient for you, then that is fine. Or you can use more images.

  • @SeanMcCann70
    @SeanMcCann70 4 months ago +2

    Thank you so much for the donut reference

  • @MrSongib
    @MrSongib 4 months ago

    In my ideal future, we only need a 360 video of an object that we want to "scan" to make a 3D model with every material property, and after that, we need only one image to get the same results. That would be cool and an EZ Clap.

    • @somdudewillson
      @somdudewillson 4 months ago +3

      If you have a 360 video of an object, you can already reconstruct geometry and materials. Photogrammetry can do that, and it's been around for a while.

    • @MrSongib
      @MrSongib 4 months ago

      @@somdudewillson Not that thing. xd
      Something like a simple turnaround video without a bunch of cameras. I should choose my words more clearly next time. Noted.

  • @TaylorCks03
    @TaylorCks03 4 months ago +1

    This is amazing 😮

  • @pxrposewithnopurpose5801
    @pxrposewithnopurpose5801 4 months ago

    THATS FKING CRAZY

  • @pxrposewithnopurpose5801
    @pxrposewithnopurpose5801 4 months ago

    THATS CRAZY

  • @carlosrivadulla8903
    @carlosrivadulla8903 4 months ago +9

    Ok, I'm ending my blender subscription now

    • @goldenheartOh
      @goldenheartOh 4 months ago

      Blender has subscriptions??? I haven't kept up with it. Last I knew, it was free.
      Edit: part of me wonders if that was a joke to make fun of other apps.

    • @ChaineYTXF
      @ChaineYTXF 4 months ago

      @@goldenheartOh I'm surprised as well. Perhaps for fast rendering on remote servers?

  • @pxrposewithnopurpose5801
    @pxrposewithnopurpose5801 4 months ago

    Lumen needs that

  • @Jasonl-h3q
    @Jasonl-h3q 4 months ago

    Do stimpy next pls!

  • @jakobhetland4083
    @jakobhetland4083 4 months ago

    Nvidia's got some next-level stuff every other day, it feels like.

  • @coisinho47
    @coisinho47 4 months ago

    So this is what they are going to make exclusive to RTX 5000 series cards...

  • @jamiespartz1316
    @jamiespartz1316 4 months ago

    All this great AI and you can't get a smooth talking bot for this video?

  • @caseycbenn
    @caseycbenn 4 months ago +1

    16 minutes to recreate a bush from a shadow that you could create in Houdini in 30 seconds. Great... four bushes per hour, I am sure someone needs that somewhere. It's not the miracle cure it's being hyped as. Also, we didn't see the back side of the dragon. Is it amazing that you could reconstruct a scene from a photo? Sure, that sounds amazing. Ok fine, provide many photo references of all sides and bam: model, texture, etc. Still, it just changes the creative craft of sculpting and modeling into photo taking and finding source material. Then what? You are going to find the same images most people do on Google and end up with the same models everyone uses? I suppose the saving grace is that concept artists will become more in demand, since they can truly create original ideas from many angles from their mind, which could be fed to a machine to reconstruct in 3D. I am betting that aspect will actually be beneficial to the job market. Otherwise you'll give up the craft of modeling in favor of picture-taking or image-searching time. BUT... yes, it is amazing that reconstruction is possible this way, but it isn't an end-all cure or a replacement for design or creative direction.

  • @Sintrania
    @Sintrania 4 months ago

    More realistic, more performance-hungry. More sales of 4090-class cards. I hope they make use of these techniques to help lower the performance requirements for RT workloads.

    • @zachb1706
      @zachb1706 4 months ago

      The 5080 will probably beat the performance of the 4090, and the 6070 will probably beat the 5080. So in 10 years a 6070 will be more than capable of full ray tracing

  • @razorsyntax
    @razorsyntax 4 months ago

    So if you take an animated hypercube, which is the 3D projection of a 4D hypercube, would this be able to reconstruct the 4D cube in some way? Getting us closer to visualizing higher dimensions is a worthwhile effort.

    • @noob19087
      @noob19087 4 months ago

      Nope. We know exactly what 4D hypercubes look like mathematically. It's just that it can't be projected in reality because it requires information that doesn't exist in our universe, i.e. a fourth spatial dimension. No AI will ever fix that.

  • @mattbrandon9157
    @mattbrandon9157 4 months ago +4

    Hearing this narrator was worse than nails scraping a chalkboard. 20 seconds and I'm gone.

  • @ChaineYTXF
    @ChaineYTXF 4 months ago

    The point in time when humans reach 100% obsolescence draws nigh 😢

  • @drewmandan
    @drewmandan 4 months ago

    I'm calling bullshit at 4:37. The information about the height of the octagonal prism is not contained in that shadow. It's not possible to match the height like that. There's some fuckery going on here.

  • @davekite5690
    @davekite5690 4 months ago

    Just imagine, with enough compute... this being run on Google Maps and all the photos people have shared... and all the CCTV cameras and all the autonomous vehicle cameras... a live 3D model of the world... ... ... coming 'soon'.

  • @mercerwing1458
    @mercerwing1458 4 months ago

    2:25 Holy balls how can I access this?

  • @FeroxX_Gosu
    @FeroxX_Gosu 3 months ago

    RIP 3D modelers

  • @Graeme_Lastname
    @Graeme_Lastname 4 months ago

    Nice to see you m8. 🙂

  • @voltagetoe
    @voltagetoe 4 months ago

    Glad I left the industry a couple of years ago - there's just nothing challenging/rewarding left anymore.

  • @mysticalmagician
    @mysticalmagician 4 months ago

    My friends, the Man who speaks on this site, saying good things of hope and love, is not a bad man. Listen to which is good, and forget the bad. We only call that which we do not understand, Evil. But the myriad ways and methods of the Creator, are full of mystery. So go forth, Love, Empathise, and Do good. Do only Good, not Evil. Do not hate yourselves, or the others around you, even if they seem weird, or strange. These are the Words of the Creator, and if you abide by them, you shall be saved. I am only blessed enough, and lucky enough, to be the Creator's messenger.

  • @zhonkvision
    @zhonkvision 4 months ago

    My wish comes true

  • @Scimblo
    @Scimblo 3 months ago

    Let me get ray tracing I can run please 🙏!!! That would be next level.

  • @P-G-77
    @P-G-77 4 months ago

    Awesome...

  • @Chewy427
    @Chewy427 4 months ago +7

    Why do you sound like you're trying to hype up every single sentence?

    • @VonsBuffet
      @VonsBuffet 4 months ago +1

      What a time to be alive.
      Or
      What a time to be AI.

  • @geniferteal4178
    @geniferteal4178 4 months ago

    Oh no, I've seen your face! 😮😅 Nothing scary, just weird to see a face you've heard so many times but have no idea what they look like. It never matches what you expect. Nice to meet you! lol 😊

    • @geniferteal4178
      @geniferteal4178 4 months ago

      @@priyeshpv Thanks! Watching again, I can see different words being said, though some basic movement is kind of in line with his speaking.

  • @andreas3904
    @andreas3904 4 months ago +6

    Is his voice AI-generated or what? Why does it sound so weird, and why does he stop all the time for half a second??

    • @egretfx
      @egretfx 4 months ago

      Watch his video from 5 years ago and tell me if it's AI..... idiot.

  • @web2yt488
    @web2yt488 4 months ago +2

    I hope YouTube allows for voice filters.... I'm tired of your voice.

  • @Drumaier
    @Drumaier 4 months ago

    As a 3D generalist, I'm not loving any of this.

  • @czargs
    @czargs 4 months ago

    8 GB VRAM

  • @BigeppyFR
    @BigeppyFR 4 months ago +2

    Let's goooo

  • @t1460bb
    @t1460bb 4 months ago

    My goodness.

  • @vistaero
    @vistaero 4 months ago

    And here we go again, yet another video about Nvidia Ray Tracing. No thank you.

  • @TheChromePoet
    @TheChromePoet 4 months ago +2

    Guys, video game studios and their greedy practices are about to become obsolete. Indie devs will take over.

    • @Sekir80
      @Sekir80 4 months ago +1

      Oh, how I hope this comes true!

    • @blackshard641
      @blackshard641 4 months ago +3

      *laughs in parallels with film history*