Nvidia's New Plan Will Lock You In.

  • Published Jan 10, 2025

COMMENTS • 1.8K

  • @cyberwaste · 2 days ago +2374

    I'm honestly sick of frame generation. It makes everything look like I've opened my eyes under water. Native looks so much better even when you turn down other settings to accommodate. Jensen and his jacket can piss right off.

    • @voided5788 · 2 days ago +137

      strong agree

    • @LICVIGO · 2 days ago +84

      Exactly!!

    • @BellularNews · 2 days ago +349

      Yep, DLSS tends to look bad to me

    • @Jackomack · 2 days ago +88

      @@cyberwaste took me a while to figure out why Helldivers on my PC looked so ass till I disabled DLSS/FSR

    • @BarDocK_LSSJ · 2 days ago +72

      @@Jackomack Helldivers does not have DLSS/FSR; it just has terrible TAA

  • @dennisstanton6674 · 1 day ago +145

    I believe latency is going to be more important than frame rate. What good is high FPS when you have to deal with lag?

    • @HaaskaChan · 1 day ago +2

      so true

    • @twizlestick8120 · 1 day ago +2

      most people wont care

    • @dankbonkripper2845 · 1 day ago +6

      @@twizlestick8120 the unfortunate truth. The casual will ruin it for the rest of us.

    • @twizlestick8120 · 1 day ago +2

      @@dankbonkripper2845 facts, they always do

    • @RiderOfKarma · 1 day ago +1

      @@twizlestick8120 Casuals are not spending hundreds or thousands of dollars on graphics cards guys.

  • @dualie-dude-42 · 2 days ago +1304

    0:25 at the... lower end of the market? My guy, $550 was high end like 6 years ago. Nvidia already won in changing the culture and that sentence is proof.

    • @BellularNews · 2 days ago +261

      YEP. I remember the 980 & 260 days well :/

    • @mrclean2224 · 2 days ago +127

      The 2080 cost $700 MSRP at launch; $550 for the low end feels like highway robbery

    • @Macheako · 1 day ago +48

      Bro, don’t forget the catastrophe that was the 3000 series. They all reported semi-ok prices…..MSRP prices…..
      NOT real world prices 😂 lmao

    • @MisterWoes · 1 day ago +57

      @@BellularNews Dude, the 3080 had an MSRP of $700, then COVID and scalpers happened. It sold for way more, and Nvidia loved that, so they kept the price higher for the following gen. A $1k MSRP for an 80-class card is disgusting.

    • @adi6293 · 1 day ago +9

      @@BellularNews The GTX 260 was actually quite expensive when it first came out; only after AMD's HD 4870 came out at almost half the price did Nvidia drop theirs 😅

  • @jonnekallu1627 · 1 day ago +141

    Fake frames, worse image quality and more lag.
    What a sales pitch...

    • @tripnils7535 · 1 day ago +6

      Have you actually seen the DLSS 4 tests? Let's be honest, it's basically native quality. You have to pause the game and zoom in to actually notice the difference. And the additional input lag is very minimal; 99% of people would never notice unless it was specifically pointed out with a graph analyzer lol.
      That we live in a time where people get so outraged about such minuscule differences, while having double and triple the framerate, just shows how far we've progressed. Of course you will always find people who complain about everything, even if it's massive progress in the big picture.

    • 1 day ago +22

      @@tripnils7535 Sorry, but it's not gonna cut it. Generated frames have no real information in them. It's all fake, it's deceiving. We don't want it. We want our frames to show us what the game is calculating, not what the GPU is calculating

    • @heh2k · 1 day ago +13

      @@tripnils7535 You're completely ignoring that input lag does not improve at all.

    • @twizlestick8120 · 1 day ago +3

      most people wont even care tho

    • @tripnils7535 · 1 day ago +2

      @@heh2k I'm not ignoring it, the difference in input lag is just so minimal, most people simply won't care. People should stop exaggerating everything into the extreme.

  • @gabiruuuuuuuuu · 2 days ago +549

    Is this really their angle from now on? Selling AI-generated frames as equivalent in value to real performance gains? This seems very off and concerning to me.

    • @tappy8741 · 2 days ago +64

      Yes. Because gamers are no longer the core demographic. We're the secondary demographic that gets the parts that failed to be binned for data centre.
      Edit: the point I meant to make is that AI is where the money is, so AI is where the hardware development is. Just like ray tracing, which they shoehorned into a gamer context when really it's been about professional use cases; they just want the hardware they make to be applicable to consumers too, so we're another market even if it doesn't make a whole lot of sense.

    • @umarthdc · 2 days ago +39

      They are almost literally selling smoke. The truth is they peaked three series ago, and now they are inserting .jpgs to make us believe we have a better GPU

    • @merlinwarage · 1 day ago +5

      Do they make games run fast? If you didn't know how it worked, you wouldn't notice. It's just that drama is now a basic requirement for all shit.

    • @arenzricodexd4409 · 1 day ago +6

      What else can be done? Node shrinks are getting harder and crazy expensive. Even when they add more raw power, the CPU cannot keep up with something like the 4090 at 4K. That's why, despite the massive increase in CUDA cores in the 4090 vs the 4080, the performance increase doesn't reflect it.

    • @xanira6367 · 1 day ago +7

      That's the best way of making money; developing and manufacturing hardware is more expensive than software.

  • @ZeroTheHunter · 1 day ago +22

    7:49 This is my biggest complaint about video games nowadays: why do games become more demanding while they keep looking the same or worse? I can play Unreal Tournament 3 at 4K max settings at 60fps with only an RTX 2050 4GB, while games nowadays with the same or worse visual quality can't run at more than 30fps at 1080p upscaled from 720p on low/medium settings. If we have access to technology 10 times better than we used to, why is the software so insanely bad? What makes our games this bad compared to old games? Heck, I can play games from 2014 at 1080p at max settings and still have better image quality than modern games running at 4K at max settings. Battlefield 1 looks better than most games nowadays and still runs better as well.

    • @mattrichardson2451 · 1 day ago

      @ZeroTheHunter Lazy devs not optimising is part of it, I think. Might sound harsh, but when modders can do it better, that's a problem.

    • @DatAsianGuy · 16 hours ago

      yup. the graphical difference in games between now and like back in 2016 is so minimal that you wonder.
      At this point I assume that big game studios and gpu manufacturers/Nvidia are in cahoots and do this on purpose to ensure that there is always a reason to buy the latest "bestest"

    • @lostinthenight2005 · 5 hours ago

      @@DatAsianGuy From what I remember, they have always been in cahoots (because if your game requires more from your machine, it's easier to sell "stronger" hardware). There's also the issue that big game studios have become, you could say, "complacent", not optimizing for average hardware, because it's easier to release an unoptimized game than to take the time to optimize and remove bugs

  • @RauschenPauli · 2 days ago +306

    yea im gonna wait for actual data to come out. nvidia has been lying for years at this point. taking their word is kinda insane.

    • @Caderynwolf · 1 day ago

      Well... They haven't "lied" for years so much as not told the whole truth... This time, however, they blatantly, outright lied... Not even a small lie but an astronomical one, like they don't even care if you know they are lying.

    • @nickstersss · 1 day ago +13

      I remember when I was getting my new rig built, my PC shop asked me to wait a while for real third-party benchmarks from other channels to verify the actual performance, and not to trust the ones from Nvidia itself

    • @auroraflash · 1 day ago

      @@RauschenPauli I know people who did xD 🤣

    • @DragonHalo99 · 1 day ago +1

      Yea until I see numbers i'm not buying anything. Besides I have a 3090 kingpin in my desktop. Not many games I play are even capable of making my GPU work fully. I'll probably wait for the 60 series

    • @Dark_Tesla · 1 day ago +1

      I said the same thing. I’m gonna wait for the nerds with no skin in the game to benchmark it.

  • @karakiri283 · 1 day ago +20

    I don't want fake frames. I just want beautiful and optimized games :'(

    • @bishopcruz · 1 day ago +1

      You have them? Most games run well on current graphics cards, and new games have always pushed the hardware in ways that older cards could not handle as well. Witcher 2 killed lower end cards at the time, Half-Life 1 was a beast when it was released, too. Let's not even talk games like Half-Life 2, Crysis, or any other envelope pushing title at release. Average cards run modern games better than in most times in PC gaming history.

  • @ruslanmishiyev9815 · 2 days ago +943

    AI generated frames don’t count, NVIDIA. As usual, a company makes the product worse, implements a crappy solution to undo some of the cost cutting, and calls it a cutting edge “feature”.

    • @ValkisCalmor · 2 days ago +82

      They're not even trying to make good gaming hardware, all of their money goes into AI and then they look for some way to apply that to gaming even if nobody wants it.

    • @heramaaroricon4738 · 2 days ago +60

      From a technical point it is a "feature". From a customer quality perspective, it is a scam.

    • @_godsl4yer_ · 1 day ago +27

      You literally don't know how any of this works lol. You don't understand even the basics of semiconductors, yet speak so confidently online.

    • @mrclean2224 · 1 day ago +18

      @@ValkisCalmor because they know they can do it and their team green guys will all buy it anyway. brand loyalty is super strong, to the point people are gonna buy scalped 50 series cards the same as they did the 30 and 40

    • @fleetadmiralj · 1 day ago +4

      I mean, they count if consumers say they count, in the end.

  • @alorynelftuber · 1 day ago +79

    I don't want to see AMD try to compete in the "fake frame generator" segment. I'd like to see them try to take over native rendering and natural FPS.

    • @aninass · 1 day ago +8

      They won't, or possibly can't. Raw rasterisation will make their GPUs too expensive, and they really wouldn't want to throw away all the FSR developments they have made all these years.

    • @williehrmann · 1 day ago +6

      That is not possible, or else Nvidia would have done it anyway. AMD is always only catching up to Nvidia, never setting the standard, so they probably won't even catch up in fake frames, as they haven't in ray tracing and FSR yet. They aren't even a competitor to Nvidia anymore at this point.

    • @fewsi · 1 day ago +1

      Not possible while Nvidia is dominating. Game devs will put frame generation in system requirements (like DLSS), and then if AMD doesn't have their "response", they're gonna be cooked

    • @REZZY_404 · 1 day ago +10

      @@williehrmann I bet that's what Intel said about AMD until Ryzen dropped and crushed the living hell outta them lol I think they could do it or catch up heavily but they need to shape up their GPU department ASAP with how they did with the CPU side of things.

    • @redTanto · 1 day ago +1

      The only graphics dedicated cores they have now are ray-tracing, the main one WAS cuda which is general purpose small math, but now the primary is tensor cores (for AI), then cuda, then rt. It's because all of their development is geared towards data centers and the AI boom, consumer markets to them are like selling scraps to get some extra cash.

  • @J-Vill · 2 days ago +476

    The disappointment when we are basically getting a software "update" instead of a hardware improvement

    • @markthestark1 · 2 days ago +20

      It's not even that; you can already have up to 20x frame generation with the new version of Lossless Scaling for the low price of $6.99. Nothing really exciting about the 50 series, sadly.

    • @kathrynck · 1 day ago +11

      Well, hardware-wise, 50-series is mostly some extra ram speed, better RT cores, and a TON more AI capability.
      The AI capability is driving the more aggressive "fake frames".
      For the most part (for those not into AI or AI-driven fake frames), 50-series looks like: a 4070 super+, a 4070 Ti super++, a 4080 super+, and a 4090 Ti.
      At least the pricing remained relatively stationary. That's not "good" but it's better than I expected.

    • @charlesurias · 1 day ago +2

      @@J-Vill Fortunately there is both. Win/Win

    • @artursobkow2680 · 1 day ago +7

      @@markthestark1 You do understand that Lossless has 10x more graphical glitches? I know, I own it. It's great for scaling older games, and maybe a 60fps lock for, like, FF10, with glitches, but yeah, Nvidia's frame gen is better but still glitchy. I hate FG both ways.

    • @The6th03 · 1 day ago +4

      @@J-Vill This is a bit of a moronic statement. The whole point of the future is not to rely on physical materials or bulky goods but rather code and software that make everything efficient.
      We want computers that can run without much hardware.

  • @cablefeed3738 · 1 day ago +79

    I don't care about AI fake frames. Or any other fake upscaling technology, raster or nothing

    • @Melsharpe95 · 11 hours ago +2

      Considering AMD is going the exact same route as nVidia you're screwed. Oh and so's Intel too.
      Byebye raster. Byebye 2016 thinking.

    • @MDxGano · 5 hours ago +1

      The horse and buggy guys said the same thing...

    • @cablefeed3738 · 4 hours ago

      @MDxGano Completely different conversation. If you think I'm against AI, you're wrong. I'm against frame generation because it doesn't increase how responsive a game is; thus, fake frames

  • @Thund3rstone · 2 days ago +622

    Let's wait till Tech Jesus from Gamers Nexus has taken a look at it.

    • @jonas_bento · 2 days ago +27

      Yet another waste of sand maybe?

    • @AshtonCoolman · 2 days ago +42

      And don't forget Steve from Hardware Unboxed

    • @Satook · 1 day ago +29

      @@AshtonCoolman Thanks, Steve!

    • @kathrynck · 1 day ago +5

      That wasn't different at all. Was it Steve?

    • @atnfn · 1 day ago +29

      Don't need to wait for anyone to know that the 5070 is nowhere close to the performance of a 4090. It probably won't be much faster than a 4070 Super. "The performance of a 4090" is just frame gen nonsense.

  • @venomtailOG · 1 day ago +194

    Frame generation is like a nice side sauce. A great bonus. However when the sauce becomes your meal, you'll hate it pretty quickly.

    • @EloquentiaSerpentis · 1 day ago

      Innovation under capitalism is the best, yup; At least two full generations of "AI" nonsense because its clearly won the war to become the new buzzword of choice across most industries and the investment/ownership class ain't tired of it yet.

    • @waiwai5233 · 1 day ago +5

      Fake taste!

    • @TheGameGuruTv · 1 day ago +3

      what a great line!

    • @lordzed83 · 22 hours ago

      Frame gen needs 90+ fps to be good, from personal experience

    • @Brutalhonesty11 · 12 hours ago

      You don't know what you're talking about.

  • @christopherwestphal7149 · 2 days ago +188

    The problem is Nvidia has lied about performance and what to expect gen over gen before. For instance, 2x to 3x claims over the 30 series; testing showed a 15% performance increase. Then Nvidia said everyone was testing it wrong.
    Now we are getting 2x gen-over-gen claims again... but based on what? DLSS has proven to be a black box that is sometimes great, often acceptable, other times game-breaking. DLSS has been improving over the generations... but unless it's fundamentally different, it's still going to vary game to game whether it's useful and by how much.
    What do people care about? A fast, responsive game with stable frame rates that hits the resolution people want. This has yet to happen with DLSS 3, where frame pacing is problematic. Did they fix this in DLSS 4? Nothing presented indicates they have.
    I hope devs like yourself keep pointing this out like you did in this video. Input-to-photon latency with consistency and good quality... you would think Nvidia would be demoing this instead of the same charts, videos with FPS counters, etc.
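
    An aside on the "testing showed 15%" point above: independent reviewers typically summarize gen-over-gen gains as a geometric mean of per-game fps ratios, rather than a single "up to 2x" headline. A minimal sketch in Python, with made-up benchmark numbers purely for illustration:

    ```python
    # Aggregate a gen-over-gen uplift the way third-party reviewers usually do:
    # a geometric mean over per-game ratios. All fps values here are hypothetical.
    from statistics import geometric_mean

    old_fps = {"Game A": 62, "Game B": 88, "Game C": 120}   # previous-gen card
    new_fps = {"Game A": 71, "Game B": 99, "Game C": 142}   # new card, same settings

    ratios = [new_fps[g] / old_fps[g] for g in old_fps]
    print(f"average uplift: {geometric_mean(ratios):.2f}x")  # ~1.15x here, not "2x"
    ```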

    • @DoesNotInhale · 2 days ago +5

      When AMD actually makes a decent GPU for once then the market can rectify. Until then keep coping and seething since Nvidia has no competition even with their dogsh*t backend features that are overshadowed by their hardware's base performance.

    • @christopherwestphal7149 · 2 days ago +15

      @@DoesNotInhale I don't know what you're talking about. AMD has been a fine alternative for years. Plus the frame pacing is a lot better. I have actually had some people ask me for help with why their gameplay got worse after upgrading to a 40-series card. I had to drop them to DLSS Quality or turn it off to restore their ability to be competitive. Then several got pissed their frame rates went down.
      Sure, if you're using CUDA workflows you're in a shit show.
      However, the main thing is that if it's working well, you shouldn't notice or care what graphics card you're using. Nvidia wants to force you to know, thinking that if people know it's Nvidia it will benefit them in the long run. There are plenty of times this has backfired on them.
      The strategy is good for acquiring new people, or people coming off the amazing 10 or 9 series cards. It sucks for retention afterwards.

    • @mrclean2224 · 2 days ago +7

      @@DoesNotInhale AMD cards have been much better on a price-to-rasterized-performance level than Nvidia for a while now, my guy. Nvidia beats them only at the top end; the 7800 XT and 7900 GRE/XT/XTX are all cheaper than a 4080 and have similar or better non-RT performance

    • @arenzricodexd4409 · 1 day ago +1

      A 15% increase? 2080 to 3080 was not only a 15% increase.

    • @sh_chef92 · 1 day ago +3

      3080 to 4080 is 50% raster performance uplift.....

  • @_Azurael_ · 1 day ago +10

    To be honest, graphics are going in a bad direction.
    Post-processing, frame generation, faking more and more things.
    We used to see graphical artifacts as errors, or a red flag for our graphics card's life.
    Now? They're sold as a feature with all graphics cards and games.

    • @eilegz · 1 day ago +3

      Plus, game optimization is now non-existent; just use upscalers, because that's what the narrative and the GPU companies are pushing forward, sadly

  • @Raggyvil · 2 days ago +247

    I hate these claims, as they base their performance levels on frame generation, which is plain illusion. It's like adding noise to an electric motor: false frames that are not a real representation of events happening in the game. Only a truly rendered frame is the consequence of events and ticks in the game; the rest is just generated fluff, whether it's good or bad. Latency doesn't improve anywhere. Personally, DLSS / upscaling and such I can understand, but beyond that, only real rendering performance is what I care about.

    • @VitorHugoOliveiraSousa · 2 days ago +19

      Yep, DLSS/upscaling makes sense. You spend more power to render prettier pixels at a lower resolution and framerate and then upscale. It has a tangible effect, and in theory, as long as AI gets better, it will continue to scale, like upscaling 1080p to 8K someday. Even movies use upscaling; they do not render frames at 4K, they render at 2K. But frame gen is dumb; scaling it is meaningless. What is the point of 500 frames per second if only 60 impact latency? Even if the frames were perfect, a smooth image means jackshit if there is no simultaneous decrease in latency. And that is assuming games will respect this limit; what we see already is games running at 30 using frame gen to reach 60FPS. So you get 25, maybe 20 frames of latency with a 60-frame, artifact-ridden image.

    • @fastestdraw · 2 days ago +1

      I think there is an argument for it if it's fully integrated with systems like GGPO in online fighting games, where both clients are already 'speculatively executing' and suffering 3 frames of illusory rendering at 120fps with approx. 20ms ping.
      Having your GPU render 3 frames for each likely mixup, then picking the right one to show you with no rendering delay when it gets the signal, should give you transition frames for those mixups to reduce animation popping with minimal developer cost.
      If you're only storing the downscaled speculative frames, that reduces render overhead. Since it's rendering tweens anyway, issues with upscaling etc. are also minimised, and while it wouldn't directly make the fps counter in the corner of the screen go up, it would improve online play just a bit by minimising the time between 'other player's action -> pretty visual for it on screen'.
      The issue is whether Nvidia have made DLSS simple and solid enough to implement that kind of speculative execution in a way games can take advantage of. Because if it has a frustrating developer experience and just that toggle as a stock offering, it's not going to offer much at all.
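
    For readers unfamiliar with the GGPO-style "speculative execution" referenced above: the client simulates ahead with a predicted remote input, then rolls back and re-simulates when the confirmed input disagrees. A toy sketch under those assumptions; the class, the trivial simulate(), and the prediction rule are all illustrative, not any real library's API:

    ```python
    # Toy GGPO-style rollback client: simulate ahead on a guessed remote input,
    # roll back and re-simulate when the confirmed input disagrees.
    import copy

    class RollbackSession:
        def __init__(self, state):
            self.state = state          # current (speculative) game state
            self.frame = 0
            self.snapshots = {}         # frame -> saved state, for rollback
            self.local = {}             # frame -> our own input history
            self.guessed = {}           # frame -> remote input we predicted

        def step(self, local_input):
            # Simplest prediction: the remote player repeats their last input.
            guess = self.guessed.get(self.frame - 1, 0)
            self.snapshots[self.frame] = copy.deepcopy(self.state)
            self.local[self.frame] = local_input
            self.guessed[self.frame] = guess
            simulate(self.state, local_input, guess)    # speculative frame
            self.frame += 1

        def confirm(self, frame, remote_input):
            # A confirmed input arrived late. Correct guess: nothing to do.
            if self.guessed[frame] == remote_input:
                return
            # Wrong guess: restore the snapshot and re-simulate to the present.
            self.state = self.snapshots[frame]
            self.guessed[frame] = remote_input
            for f in range(frame, self.frame):
                simulate(self.state, self.local[f], self.guessed[f])

    def simulate(state, local_input, remote_input):
        # Stand-in for a real game update; positions just move by the inputs.
        state["p1"] += local_input
        state["p2"] += remote_input
    ```

    The point of the comment is that a GPU-side speculative renderer would have to hook into exactly this confirm/rollback step, which is a developer-integration problem, not just a driver toggle.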

    • @ValkisCalmor · 1 day ago +13

      Not only that, they (both NVIDIA and game publishers) keep showing off how it can go from 30fps or lower up to a playable frame rate, when in reality it is absolute trash in that scenario. Lower base frame rates result in even more latency and visual artifacting. This is a technology that only works well when you don't really need it.

    • @chainingsolid · 1 day ago +1

      @@fastestdraw At least when the game is trying to speculate, it knows the most possible when doing so including how the game truly works. Poor gpu vendor tricks just have a pile of color..... If they did specific work for one game that's just the same driver optimization they've been doing for years, and all 3 gpu vendors are doing it!

    • @SoulOfFenrir · 1 day ago

      I don't want to say it, because we may be nearing a raw-horsepower limit for GPUs that don't cost multiple thousands of dollars, but it's starting to feel like it, and this is the option they found to keep raising frame rates every generation, fake frames or not.

  • @GladeRiven · 1 day ago +48

    NOPE. Hate frame rate generation, hate NVidia business practices.

    • @Nurse_Xochitl · 1 day ago

      I love you. :)
      I hate NVIDIA also. They are greedy and evil.

    • @nickt6595 · 23 hours ago

      Agreed 💯
      Hate NV fake frame generation & bullying business practices.

  • @matreco2008 · 2 days ago +329

    Sad times, when a £500+ asking price for a GPU seems reasonable

    • @martinwinther6013 · 2 days ago +20

      I'd pay £500 with a smile if it was top shelf. But that'll be freaking £2k+

    • @DavidH373 · 2 days ago +20

      It's still not. I will not be conditioned to spend more than $300 and like it. Not to mention the Shrinkflation. I see what's happening. They're faking performance increases with software. Is it compelling? Yes. But I'm buying a GPU, a piece of Hardware and the software comes with it. I am not buying Software that comes with Hardware, thank you very much.

    • @chadyways8750 · 2 days ago +6

      i'll be honest, that price is actually reasonable considering global inflation over the past few years, not for that specific card, but in general

    • @ForOne814 · 2 days ago +4

      Well, yeah, that kind of happens after you print so much money, lol.

    • @ChrisHöppner · 2 days ago +7

      Remember when a decent GPU would run you 300 bucks? Pepperidge Farm remembers.

  • @PaulRoneClarke · 1 day ago +22

    Don't most people who want 100+ FPS want it for competitive advantage not image quality?
    Personally I'm happy with 60-80 fps and always will be from a visual point of view.
    My daughter who plays competitively is interested in much higher frame rates for competitive advantage in Overwatch, Valorant, CoD etc.
    But there is no competitive advantage if 75% of the frames are AI generated and do not reflect what your opponent is actually doing.
    AI frames offer zero value for me personally. I suspect others who play different games might love it, but for my use case it adds zero value to a card.

    • @Operational117 · 1 day ago +3

      That's pretty much it.
      The issue as you mentioned is that AI-generated frames are ABSOLUTELY unwanted in highly competitive games. Fake frames will create fake enemy/teammate movement, and depending on what the "real" FPS is, you'll also have HORRENDOUS input lag!
      And while non-competitive games don't need these frame rates, so too should they not need to have frames generated through AI to reach the gold standard of 60 FPS. Doing so essentially means you admit to not caring about optimizing your game at all.

    • @PaulRoneClarke · 1 day ago

      @@Operational117 Spot on!! :)

    • @BlackTakGolD · 1 day ago

      Just don't turn frame generation specifically on; fixed. Reflex 2 and Super Resolution will be advantageous for gamers.

    • @lordzed83 · 22 hours ago

      @@Operational117 Maybe read about Reflex 2; it fixes that problem

    • @balaurizburatori8607 · 12 hours ago

      That is mostly correct. But the actual reason high frame rates give you a "competitive advantage" is because they reduce the input latency.... which is exactly what frame generation harms. It's complete marketing BS that is useless in practice.

  • @wittmanml1705 · 1 day ago +6

    I live in Greece. These launch prices are NEVER representative of retail prices. They say $590, and I know for a fact I'll find it for at least double that

    • @VinniS-m2p · 1 day ago

      Yep, same for AMD; the prices don't match EU prices and are complete bullshit

    • @MrKardovalk · 9 hours ago

      Yeah, it's just an EU thing; Trump ain't our friend

  • @heramaaroricon4738 · 2 days ago +104

    The worst part about that entire presentation was "AI Electricity". Like who the fuck made that term up????

    • @barry.w.christie · 1 day ago +5

      I think he was trying to draw a parallel between how electricity changed the world and that AI will do the same ... AI is fantastic in some respects, the downside is that everyone will become dumber and dumber as time goes on (look at what social media has done), they won't need to learn anything because AI will do it all for them!

    • @DatAsianGuy · 17 hours ago +2

      @@barry.w.christie a perfect society for the most powerful people alive. sheeps without the ability to think for themselves.

  • @OliviaBry · 1 day ago +188

    Frame generation is like a tasty side dish. An excellent addition. However, as the sauce becomes your meal, you'll quickly grow to dislike it.

    • @MonkeyBitness · 1 day ago +3

      AI wouldn't have come up with that inspired bit of wit....until now. All of the models have read these comments and have now integrated your slice of inspiration into the void. In fact, it is already being applied in tests that are impressing the devs and increasing the investment potential.

    • @50H3i1 · 1 day ago +1

      You can buy lossless scaling for 7-8$ so you can have all the generated frames you want

    • @evilme73 · 1 day ago

      FG fake frames at 50% of your frames with added input latency is... eh, fine I guess. FG fake frames as the majority of your frames, with even more added input latency = no thanks. Absolute garbage for competitive games

    • @Dash123456789Brawl · 1 day ago +5

      @@MonkeyBitness Unfortunately more true than you may have realized.... That wit was posted by a bot, I assume stolen from somebody actually clever.

    • @MonkeyBitness · 1 day ago +1

      ​​@@Dash123456789Brawl ​ If that's true, I laugh at the punchline and cry at the implications , lol. Ugh.

  • @tiamem · 2 days ago +136

    If Nvidia had solved the problems with artifacts and ghosting, I think they would have shown that in the demo.

    • @arenzricodexd4409 · 1 day ago +7

      Fine details like that will be for reviewers to talk about; the CES presentation was mainly about new features. But DF has videos on how the updated DLSS will address the ghosting/artifacting issues.

    • @UncleMisu · 1 day ago +6

      @@tiamem there is already a video out with testing all the new technologies in Cyberpunk. It fixes a lot of stuff

    • @qwormuli77 · 1 day ago +5

      @@UncleMisu Even the bullshots with slow panning shots and predictable animation loops (the best possible scenario, not a realistic one) still have very striking issues. Show me a single grassy field in windy weather and then you could _maybe_ have a shot at convincing me, if the issues weren't clearly there even with YT compression and capture further obfuscating the details.

    • @tripnils7535 · 1 day ago +1

      @@qwormuli77 Some people will always find reasons to shit on things. Even tho objectively it's amazing technology. We are now at a point where games can be run at cap monitor refresh rates no matter the graphics. A couple of years ago 1080p and 60 fps was still considered average lmao.

    • @qwormuli77 · 1 day ago +5

      @@tripnils7535 Brah, there are very valid reasons for certain dislikes, and that canned "you're just a haturz!" answer really doesn't work. But apparently those issues cannot be discussed, because one might hurt the feelings of a company that has almost quintupled its profit margins on consumer products within a few years.
      "Cap monitor refresh rates" that don't gain the main benefits of said frame rates, as it doesn't really operate on those purported frame times. This is largely a marketing tool for selling big numbers, and I for one really don't like corporations abusing people's lack of knowledge.
      Or the wider harm these patterns create across the games industry. Or the way Nvidia is (again) trying to wrest competition in a direction that doesn't benefit the customer. Or a cavalcade of other issues that would require way lengthier text bricks than I think would really benefit a comment section. But that is the thing with bullshit: it's way easier to call somebody something and throw a few magic numbers in the air than it is to actually look into the matter at hand.

  • @eugkra33 · 1 day ago +4

    2:22 unfortunately Far Cry 6 is not even pure raster. It has RT enabled. The raster gains might be less than 15%.

  • @SvenHeidemann-uo2yl · 2 days ago +46

    I don't give a fuck about upscaling and frame generation.
    I don't like the trend Nvidia is setting.

    • @arenzricodexd4409 · 1 day ago +6

      Then stick to 1080p 60 medium. I still remember Nvidia pushing things like 4K back in 2015 so people would keep buying high-end hardware.

    • @Totenglocke42 · 1 day ago +1

      So you want bad performance and to run at low resolution? That's a weird new masochist fetish, but you do you.

    • @ravensblade · 1 day ago

      @@arenzricodexd4409 According to Steam, 56% of gamers in the last month used 1080p. Looking at other data like RAM, CPUs and graphics cards, I doubt most players' PCs generate more than 60fps.
      I personally also use it, as there is really not much gain with higher resolution; 2K is noticeable but not really worth it, and I don't want a huge monitor on my desk. Unless you play very dynamic games, anything above 60fps is barely noticeable, so that's often what I set. My PC can actually run most games at high details at 2K and 144fps, but the advantage of setting most things to 60fps is a very silent PC and lower energy usage.

    • @BlackTakGolD · 1 day ago

      Reflex 2 and the new Super Resolution are fine technologies. Even if you despise frame gen, just turn it off; the blurring is going to be massively reduced with the new model too. I just don't get it.

    • @AFistfulOf4K · 1 day ago

      @@BlackTakGolD an influencer told him it was bad so it's bad. he's probably never even used it, or has used it and didn't know.

  • @cryam6428 · 1 day ago +8

    Still running a 1080 on a PC just old enough that it cannot upgrade to Windows 11. I'm quite happy with what I have.

    • @ALEXANDERdk007 · 1 day ago +2

      I've been quite happy with my GTX 970 up to last gen, as it has definitely started to show its age at this point

  • @senorf999 · 2 days ago +40

    Better DLSS and multi frame generation will lead to more games being released as an unoptimised mess for people who cannot afford the technology.

    • @maelstromeous · 1 day ago +2

      This is my fear as well, effectively mandating everyone use nvidia cards as intel and AMD can't keep up on the AI generation front. A sorry sight to see, optimised games are dead.

    • @kesamek8537 · 1 day ago

      I can afford the technology but I don't want it, I like power efficiency and lean technical solutions. Nvidia is not a gaming company they are skimming the AI wave. Play older high quality games at 1080p 60fps and reclaim your sanity. Or you are being rinsed for nothing.

    • @AFistfulOf4K · 1 day ago +1

      There's 1.429 billion reasons that games will never be released in an optimized state ever again. You can work around it or you can just bitch on the internet but nobody with the power to change things cares.

    • @alexk9295 · 10 hours ago

      Like the new Monster Hunter

  • @Kapono5150 · 1 day ago +49

    27 FPS on a 2000 dollar GPU, Get Hyped !!!

    • @JROCC0625 · 1 day ago +6

      That's path tracing, btw, so yes, no GPU can survive that on raw power, or it's bad optimization by Cyberpunk

    • @sonidarc7641 · 1 day ago +3

      @@JROCC0625 I mean, okay, bad optimization can heavily tank performance. We see this constantly now with the criticism of UE5 etc. But comparing the 4090 vs the 5090 in the same game, whilst advertising the 5090 with over 60% more ray tracing power and 30% more CUDA cores, I would still expect more than a 7 FPS increase in the same game, looking at the raw performance of these cards.

    • @JROCC0625 · 1 day ago

      @sonidarc7641 Just because there is more doesn't always mean way better performance. If you have a 4070 Ti Super, for example, upgrading only makes sense if you're getting a 5080, because it does have better performance than the 4080 and a higher clock speed vs the 5070 Ti. Personally, I just built a PC; unfortunately it's during the new gen, so trying to make the right choice I've been researching for months (before the build).

    • @DagobertX2 · 1 day ago +1

      @@JROCC0625 And it's an illusion of selling a feature that works properly out of the box without 4x layers of a facade. They are basically Wizard of Oz.

  • @Drovek451 · 1 day ago +18

    FSR4 seems (at least in the limited demos shown) like a very good option now, so maybe the divide vs DLSS won't be as wide.
    The whole thing with *even more fake frames* is still insane: the fact that your FPS counter at the top says 200 doesn't make your game magically respond at that speed. It'll be like playing constantly in molasses.

    • @xynonners · 1 day ago +1

      if FSR is not AI and is still algorithmic, they'll never catch up

    • @Drovek451 · 1 day ago +1

      @@xynonners FSR4 is ML-based and looks like a real step-up over 3.1, though that's obviously in controlled-demos so far.

    • @albal156 · 1 day ago

      @@Drovek451 FSR4 is ML-based and is also being shown at the Performance setting, so it's a worst-case scenario, and it looks pretty good, to be honest.

    • @AFistfulOf4K · 1 day ago

      look up reflex 2

  • @FintanBarnett · 1 day ago +120

    If people believe that a 5070 equals a 4090, I have some oceanfront property in Arizona for sale...

  • @dp27thelight9 · 2 days ago +76

    The RTX 4090 is under $900 on eBay with 24 GB. That's what you'll want to keep an eye on as these new cards launch.

    • @siriansight · 2 days ago +2

      @@dp27thelight9 agree

    • @waxcutter9813 · 1 day ago +4

      I'd be sketched out by a used card for that low tho, basically half what the new ones go for

    • @GeekyGami · 1 day ago

      ​@@waxcutter9813 The card I currently use was taken out of a system that had a critical component failure in the power supply that fried the board and the processor.
      Yet, it's fine.

    • @dp27thelight9 · 1 day ago

      @@waxcutter9813 They're new, unopened RTX 4090s on eBay. But I also bought mining cards years ago covered in dust; I took them apart, blew them down and scrubbed them in an isopropyl alcohol bath.
      They ran fine the whole time I had them. Then I resold them on eBay for more than I paid when mining blew up.

    • @B_Machine · 1 day ago +10

      Anyone doing this needs to watch for scams. Make sure the seller has a history, and preferably isn't from another country.

  • @pedropierre9594 · 1 day ago +5

    Idk, I'd rather play 60 fps native than 120 with DLSS. Idk if it's just me, but it looks better that way; many people don't care

  • @fyrestorme · 2 days ago +28

    The more frames you "generate" or fake (basically), the more error you're inevitably going to run into, especially in scenes where a lot is changing quickly and dynamically.
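
    A toy illustration of that failure mode: blending two real frames of a fast-moving object, without motion information, produces two faint ghosts instead of the object at its true midpoint. Real frame generation does use motion vectors, but it degrades toward exactly this as motion gets faster and less predictable. The 1-D "frames" below are made-up values:

    ```python
    # Why interpolated frames break on fast motion: a naive blend of two frames
    # leaves two half-bright copies ("ghosting") rather than the true midpoint.
    import numpy as np

    width = 12
    frame_a = np.zeros(width); frame_a[2] = 1.0     # object at x=2
    frame_b = np.zeros(width); frame_b[8] = 1.0     # object jumped to x=8

    naive_mid = 0.5 * (frame_a + frame_b)           # blend with no motion vectors
    true_mid = np.zeros(width); true_mid[5] = 1.0   # where the object really was

    print(naive_mid)  # 0.5 at x=2 AND x=8: a double image
    print(true_mid)   # 1.0 at x=5: what a perfect interpolator would show
    ```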

    • @PherPhur · 1 day ago

      What they need to do is just run like a million bots for a couple days on games and get some training data to run the supersampling on. Just charge these AAA devs(only ones making games that really need DLSS usually) a fee and integrate this training data into the driver updates.

    • @webpombo7765 · 1 day ago +3

      @@PherPhur Not going to happen; that'd be time-consuming and expensive. Nvidia does not care that much. If they did care about the user experience, DLSS 4 would've been a driver update, not a whole new card

    • @mrc2176 · 1 day ago

      @@webpombo7765 DLSS 4 is a driver update, multi frame gen is a whole new card

  • @dozerd42 · 1 day ago +17

    I think of Nvidia as neither "strategic" nor "masterful" for claiming the 5070 Ti is equivalent to a 4090. It's a marketing tactic, and an obvious one. Nvidia has a problem: they have failed to completely convince everyone that ray tracing is essential for gaming

    • @arenzricodexd4409 · 1 day ago

      Typical for gamers. That's why game developers most often don't go with the gamer mindset, despite developing games for them.

    • @GeekyGami · 1 day ago +1

      Just like they did with fluid physics with PhysX.

    • @australienski6687 · 9 hours ago

      They are not claiming the 5070 ti is equivalent to the 4090, they are claiming the 5070 NON ti is equivalent.

    • @colt.4598 · 9 hours ago

      @@dozerd42 Yep, ray tracing can't stand on its own; it requires DLSS to prop up FPS performance. Just like the fake frames can't stand on their own; they require Reflex 2 to lower latency. This is becoming a pattern.

  • @chaser9584 · 1 day ago +12

    The DLSS override function is just what you'd do with a dll swap now, a version update. You made it sound like it does what the Starfield modder did, replace FSR (or XeSS) with DLSS. Judging by what Nvidia says on their own site it needs DLSS to be implemented already.
    And to be fair, FSR4 also appears to look a LOT better than its previous version. I'm actually more hopeful for this gen than the last.
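
    For context, the "DLL swap" mentioned above usually means replacing the game's bundled DLSS library (nvngx_dlss.dll) with a newer build. A hedged sketch of that manual step; both paths are hypothetical examples, and this only applies to games that already ship DLSS:

    ```python
    # Manual DLSS version swap: back up the game's bundled DLL, then copy a
    # newer one over it. Paths are made-up examples, not real install locations.
    import shutil
    from pathlib import Path

    game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install folder
    new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS build

    target = game_dir / "nvngx_dlss.dll"
    if target.exists():
        shutil.copy2(target, target.with_name("nvngx_dlss.dll.bak"))  # backup first
        shutil.copy2(new_dll, target)
        print("Swapped; restore the .bak if the game misbehaves.")
    else:
        print("This game does not ship DLSS, so there is nothing to swap.")
    ```

    Nvidia's driver-level override, as the comment notes, appears to automate exactly this version update rather than injecting DLSS into games that never had it.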

    • @PherPhur · 1 day ago +4

      Yea DLSS is no moat whatsoever. FSR is open source and keeps on the heels of DLSS. Better value per $ and soon FSR should be developed by a third party in a way that it runs on titles even DLSS won't. Oh, and it's not limited by card generations.

  • @truthseeker6532 · 12 hours ago

    WOW!!!!
    What a vid!!!!
    What a breakdown!!!!
    10/10!!!!!
    First time here listening to you!!!!
    IMPRESSED!!!!!
    I'm subscribed as of NOW!!!!!!!!!!!

  • @DxXNA · 1 day ago +20

    It could be 5000 fps and it wouldn't matter, because the whole point of higher fps is to improve the latency between your game input and what happens on screen. Generating frames from a base rate of something like 10-30fps and inserting new frames in between means your input lag sits at 33-100ms, where with true 240fps it would be only ~4.2ms.
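
    The arithmetic above is easy to verify: frame generation multiplies the displayed frame rate, but a new frame of real input can only appear once per base (rendered) frame. A quick sketch, with example base rates and a hypothetical 4x generation multiplier:

    ```python
    # Displayed fps vs how often a frame of *real* input reaches the screen.
    def frame_time_ms(fps: float) -> float:
        return 1000.0 / fps

    for base_fps, multiplier in [(30, 4), (60, 4), (240, 1)]:
        shown = base_fps * multiplier
        print(f"{shown:>4} fps shown -> real input every "
              f"{frame_time_ms(base_fps):5.1f} ms (base {base_fps} fps)")

    #  120 fps shown -> real input every  33.3 ms (base 30 fps)
    #  240 fps shown -> real input every  16.7 ms (base 60 fps)
    #  240 fps shown -> real input every   4.2 ms (base 240 fps)
    ```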

    • @kesamek8537 · 1 day ago

      You are not factoring in the wider audience (Hardware Unboxed people, LTT people etc.) who have been fooled into 4k for gaming.

  • @lupusabanglespenna · 1 day ago +5

    As a 3D modeler and animator, I know this kind of stuff can help with some areas of performance. But on the whole I find myself just wanting the raw "horsepower" (and VRAM) to make the things I do more efficient, and so I can try new and frankly demanding effects. If it manages to play some fun games along the way, then all the better. But I personally don't understand the obsession with crazy high frame rates. I've heard Bellular and so many other people say they won't even play games if they're not reaching at least 60fps. I find it so strange that people would just give up on interesting and/or enjoyable games because they don't meet that standard. Granted, I'm not a hardcore PvP person either, but I do understand how a higher frame rate can help with quick reaction times. But I don't even notice any difference past 80 FPS. Even if we were to assume people can subconsciously utilize 160fps, anything higher than that would just be a flex needlessly consuming unnecessary resources.

    • @Vladislav888 · 1 day ago +2

      > I personally don't understand the obsession with crazy high frame rates, I've heard Bellular and so many other people say they won't even play games if they're not reaching at least 60fps
      Because it's not about framerates. Sort of.
      If the frame pacing is implemented well, even 30 looks good, especially with properly implemented motion blur. 60 looks even better.
      On the other hand, if the frame pacing works like a broken clock, it always looks off, but 60 looks better than 30, and 120 looks better than 60, because it smoothes this effect somewhat.
      So it's 30[bFP] < 60[bFP] < 30[FP] < 60[FP]
      30 with bad Frame Pacing looks awful.
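
      A small sketch of the frame-pacing point: two runs with the same average fps can feel completely different once you look at frame-time consistency. The numbers below are made up purely to illustrate:

      ```python
      # Same average frame rate, very different frame pacing.
      import statistics

      smooth = [16.7] * 60          # steady ~60 fps
      jittery = [8.0, 25.4] * 30    # same average, alternating spikes

      for name, times in [("smooth", smooth), ("jittery", jittery)]:
          avg_fps = 1000 / statistics.mean(times)
          print(f"{name}: avg {avg_fps:.0f} fps, "
                f"stdev {statistics.stdev(times):.1f} ms, "
                f"worst frame {max(times):.1f} ms")
      ```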

    • @ALEXANDERdk007 · 1 day ago +1

      @@Vladislav888 It all comes down to how smooth it looks. I personally had no issues playing Halo Infinite on my GTX 970 at around 20 to 23 FPS, as it looked and played smoothly enough and didn't feel like a broken clock

    • @Vladislav888 · 1 day ago

      @@ALEXANDERdk007 >It all comes down to how smooth it looks
      No, it doesn't.
      It has three distinct components: the perceived smoothness, movement-to-photon reaction time, and actual reactivity.
      DLSS 3, and presumably 4, addresses mostly the former while, I assume, making movement-to-photon worse and having no effect on the latter.
      Thing is, any competently implemented game has a separate game logic loop under the hood. It often targets a constant update rate because that makes player-to-player multiplayer sync easier. DLSS or similar has no effect on this one. This is the actual reactivity.
      It also features a separate draw loop, which is usually uncapped and renders as fast as possible.
      So my best guess about your Halo experience would be that the game actually ran at something like a constant 60 ticks per second under the hood while it was drawn at 20-30 FPS, so your nervous system eventually adjusted to the perceived visual, but not gameplay, delay.
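
      The split described above is the classic fixed-timestep loop: logic ticks at a constant rate while rendering runs as fast as it can. A minimal sketch, assuming a hypothetical game object with tick() and render() methods; upscaling or frame generation would only ever touch render():

      ```python
      # Fixed-timestep game loop: simulation at a constant tick rate,
      # rendering uncapped, with an interpolation factor for smooth drawing.
      import time

      TICK_RATE = 60              # logic updates per second
      TICK_DT = 1.0 / TICK_RATE

      def run(game, max_seconds=5.0):
          start = prev = time.perf_counter()
          accumulator = 0.0
          while time.perf_counter() - start < max_seconds:
              now = time.perf_counter()
              accumulator += now - prev
              prev = now
              while accumulator >= TICK_DT:       # fixed-rate simulation
                  game.tick(TICK_DT)
                  accumulator -= TICK_DT
              game.render(accumulator / TICK_DT)  # as fast as the GPU allows
      ```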

    • @ALEXANDERdk007 · 1 day ago

      @@Vladislav888 Well, no matter what, I had no issues getting to the top in competitive matches in Halo Infinite, so even multiplayer played fine for me despite the low FPS

    • @Vladislav888 · 1 day ago

      @@ALEXANDERdk007 I've already explained: there is a huge difference between a purely visual delay and an actual gameplay delay.
      In the context of a competitive multiplayer FPS, it means that if someone at 200 FPS and someone at 20 FPS made exactly the same input, timing-wise, in exactly the same situation, the result would be the same.
      The 200 fps player has more time to react, though, so it's still a huge advantage.
      And if that's not the case, the input of the 20 fps player might not register at all.

  • @LICVIGO · 2 days ago +72

    LOL AI fake frames mate...

    • @dreambiscuits · 1 day ago +1

      Do you even know what you're saying or are you just regurgitating what other people say?

    • @webpombo7765 · 1 day ago +19

      @@dreambiscuits Do you realize that's literally what it is? AI-generated in-between frames that don't actually exist. And it's AI, which means it's fallible; it can literally hallucinate details that don't exist

    • @Nurse_Xochitl · 1 day ago +13

      @@webpombo7765 details that don't exist is a bit too nice to say... should say something more like bullshit artifacting and glitches

    • @LICVIGO · 1 day ago +12

      @@dreambiscuits I like my frames with no input lag and no blurriness...

  • @Monk-Darlington · 1 day ago +2

    I don't think DLSS has looked bad since we got away from the first implementation, and the latest looks great.
    At least at 4K on Quality. I can only assume those who think it looks bad are using Performance mode or 1080p etc., or maybe messed with sharpening too much in either direction, because at 4K maxed out it looks amazing, near identical to native while you are playing; you sort of need to stop and hunt for the issues instead of actually playing the game.
    That, or they have never actually used it, only that awful Lossless program, and think it's the same thing.

  • @jonas_bento · 2 days ago +95

    The statement that the RTX 5070 has RTX 4090 performance is misleading, to put it lightly.
    Comparing an actual RTX 4090 running games on its raw performance (which works universally) to an RTX 5070 making use of every feature NVIDIA has at its disposal (upscaling, frame gen, etc.), features that only work in some specific games and may also introduce issues, is very dishonest.

    • @BellularNews · 2 days ago +26

      Yep, that's the point of the video

    • @beanman853 · 2 days ago

      the full quote did caveat with dlss 4

    • @xynonners · 1 day ago +1

      it's CEO math

  • @asliceofloaf1984 · 2 days ago +8

    5070 is gonna have like 5-10% higher native performance than the 4070 Super, 15% more power consumption, and cost the same, or $20 cheaper.
    This gen is shaping up to be sadder than the previous one :/

  • @VME-Brad · 1 day ago +23

    Meanwhile my 1080ti is still hanging on like the overengineered beast it is.

    • @dave7244 · 1 day ago +1

      Best GPU of all time.

    • @2Syndras1Cup · 1 day ago +2

      My absolute GOAT Ti sadly died... A mistake that Nvidia will never make again.

    • @maelstromeous · 1 day ago

      I'd probably still be using it if it didn't die :(

    • @patatacaliente8270 · 1 day ago

      That was Nvidia's all-time best, I guess. Great value for money.

    • @Totenglocke42 · 1 day ago +1

      At 1080p, low settings, sure. Some of us want more than that.

  • @ZeroCrystal · 2 days ago +29

    This glorified pissing contest between Nvidia and AMD, and the fact that I can buy a CAR for some of the RRPs I'm seeing, led me to buy an Intel B580 and take Door Number 3. I'm not desperate to have "bleeding edge," and I figure that as long as Intel keeps improving drivers, I'm fine.

    • @Thund3rstone · 1 day ago +9

      The point is that even as an "enthusiast" with a big budget, going for mile-high framerates at 4K, you want real power for your money, not to be a beta tester for AI software.

    • @alephniguroth7105 · 1 day ago

      @@Thund3rstone I'm tired, boss. I don't care about the high-end stuff but still keep catching flak from the AI race.

    • @albal156 · 1 day ago

      How did you get one of those? They aren't in stock anywhere as far as I can see and the ones that are in stock are for inflated prices similar to AMD Nvidias cards.

    • @ZeroCrystal · 1 day ago

      @@albal156 Magic.

  • @brucethen · 9 hours ago +1

    The B580 appears to have a weakness: in certain games it can perform poorly compared to the competition when coupled with a weaker CPU, a Ryzen 5600 for example.

  • @DanielWW2 · 2 days ago +23

    There is no way the RTX 5070 is even close to the RTX 4090...
    It's so obviously based mostly on frame generation. The RTX 5070 doesn't even come close to having the hardware to approach the RTX 4090.

    • @BURN447 · 2 days ago

      5070 will probably be somewhere around 30-40% of 4090 performance in rasterization

    • @beanman853 · 2 days ago +1

      @@BURN447 with 60% the cores, memory bandwidth etc.

    • @asliceofloaf1984 · 2 days ago +5

      5070 performance looking to be in the area of 4070 Super for almost the same price. Disappointing

  • @xxburritobearxx1757 · 1 day ago +1

    I was already considering getting the 5090 even though I have a 4080, but the fact you can just inject frame gen into anything without using Lossless Scaling is crazy. You can play Bloodborne on PC at 120-240 fps in 4K with a good Nvidia PC

  • @Dunestorm333 · 2 days ago +7

    I don’t think this means anything, most people cannot afford the 50 series. People online are far too concentrated on the high end when in reality, the mid-low end is where the market is.

    • @arenzricodexd4409 · 1 day ago +3

      First world problem? Most people are just happy as long as they get to play their games.

  • @Wayne-i7h · 4 hours ago +1

    And here I am still chugging along with my 2080 Super and 34" 2560x1080 monitor. Just enjoying my games. Sales pitch unnecessary.

  • @BearEnjoyer · 2 days ago +10

    hey editor there is a typo in the title I imagine you wanted to say "NOT *want* to fight fair"

  • @frighteningspoon · 1 day ago +63

    AMD has a very real opportunity to go all in on Raw Performance rather than friggin AI everything

    • @OldManHash · 1 day ago +5

      AMD cards are great, I also hope they go for raw perf. and fair pricing going forward.

    • @joee7452 · 1 day ago +6

      You do realize they already said they were out of the high end, right? Supposedly they are looking to compete at the Nvidia xx70 Ti level and below ONLY going forward. If they stick to that, then Nvidia owns the high end without even trying anymore, because the only choices will be the xx80s and xx90s. I hope against hope that they will change their mind, but it might turn out to be a moot point for the high end. AMD couldn't match the raw performance of the 4090 this round. Even if the 5090 is only 30% better than the 4090, it's still going to be 60% better than the nearest AMD option at this point, putting the line at 5090 > 4090 > 5080 > 7900XTX > 4080S > 9070XT/5070 Ti. If Nvidia adds a 5080S at some point, it will drop the highest AMD card one more down the list. We need them to compete with Nvidia, so I really hope they change their minds.

    • @shadow7037932 · 1 day ago +2

      AMD lost that war because they didn't invest and get anything like DLSS or CUDA until quite late. They were always lagging and they lost the dev support.

    • @TomHumphrey-nd8rk · 1 day ago +5

      AMD does this every couple of generations or so; RDNA gen 1 had no high-end card either.
      They'll be back to compete at the high end eventually.

    • @OldManHash · 1 day ago

      @@TomHumphrey-nd8rk yeah, I think they're not trying to compete with Nvidia high end this cycle but will come in with RDNA5 as a competitor.

  • @Swordphobic · 2 days ago +6

    You sound awfully happy for something most gamers are calling a bait and switch.

    • @39zack · 1 day ago +1

      He is most likely a paid shill with Nvidia stock

  • @aq7h3r · 1 day ago +5

    NVIDIA is going the Intel route

    • @randomuserame · 1 day ago

      Ironically Intel is going the AMD route with budget cards that are [honestly] a 50/50 hit or miss while trying to prove they can actually develop the tech. And AMD is taking over the mid to high-end [native enjoyer] segment.
      NVIDIA is chasing business-to-business sales, industrial applications, and "more-money-than-sense" prototype technology paypigs... I mean whales. I mean "super high end enthusiasts."

  • @ArcturusMinsk · 2 days ago +25

    the funny thing is that at this point Gaming is their smallest market

    • @webpombo7765 · 1 day ago +4

      I love crypto mining racks!!!!!!!!!! I love AI image generation centers!!!!!!!!! I love the blockchain!!!!!!!!

    • @gravity00x · 1 day ago +5

      and it shows, they do not care an inch about innovation in technology, they wanna make the quick buck, to re-invest it into AI hardware for AI bros.

    • @PhilosophicallyAmerican · 1 day ago

      @@gravity00x To be fair, they are innovating. It's just not in a direction we want.

    • @randomuserame · 1 day ago +1

      Which is why you should stop buying their products. You're an afterthought to them now... so you shouldn't keep _them_ in _your_ mind. When they want your business, they will come to you... on _your_ terms. Unironically ARC is looking like it's going to be a huge contender if they iron out the kinks. We're still in the "early adopter" phase of ARC GPUs. AMD has publicly stated they are chasing volume sales (more inventory, lower prices, more cards in boxes) rather than chasing the "upmarket" or B2B vendoring.

    • @ArcturusMinsk
      @ArcturusMinsk День тому

      @@PhilosophicallyAmerican It's not like they need to; all their competitors have gutted their R&D divisions, so it's not like they'll be catching up any time soon.

  • @mrlnl13
    @mrlnl13 День тому +3

    If 75% of the frames are slop hallucinated by AI and are not real, are you even playing the game? Lol

  • @penrar
    @penrar День тому +10

    Nvidia saw with the 1080 that they hit the limit and needed to pivot to marketing scams like RTX instead.

    • @galaxypanda1288
      @galaxypanda1288 День тому +2

      Seriously, it's crazy that the 1080 Ti, an 8-year-old graphics card, still holds up performance-wise today. I used a 1070 from early 2017 until last year, when I finally upgraded to a 4080, and it held up at 1080p in most modern games.

    • @DagobertX2
      @DagobertX2 День тому +2

      No, Nvidia has always tried to create a gimmick that baits customers into buying only their cards for an exclusive feature: PhysX and HairWorks, for example.

  • @MrVivi0001
    @MrVivi0001 4 години тому +1

    What is the point of doubled frame rates if everything looks like mud?

  • @meru_lpz
    @meru_lpz 2 дні тому +19

    I wish there was a version without all the AI TOPS and DLSS shenanigans. I don't want to pay money for things I'm not gonna use.

    • @arenzricodexd4409
      @arenzricodexd4409 День тому

      New to the GPU world? GPUs have always carried hardware that's useless to gamers. The good thing is that this AI hardware is used for upscaling, so there's still some benefit to games. In the past, cards like the 290X or 780 Ti shipped with FP64 hardware that was 100% useless for gaming.

    • @PherPhur
      @PherPhur День тому

      @@arenzricodexd4409 DLSS and FSR are actually pretty useful on the quality setting. You don't get too much of a boost out of it, but it's worth considering, and the hit to the visuals is pretty much nonexistent.

  • @Boss_Fight_Index_muki
    @Boss_Fight_Index_muki День тому +4

    The 5090 exists as a price anchor; what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't).
    The 5070 could be sold at $400 and the 5080 at $650.

    • @ALEXANDERdk007
      @ALEXANDERdk007 День тому

      The xx90 is mostly meant for business users who need that much power for whatever they're doing.

  • @gingernut3411
    @gingernut3411 2 дні тому +5

    Yeah..... no, it just won't. The more they push AI performance, the less effort game devs will put into making their games run half decent. It's already bad enough. Nvidia's next GPU will probably finish the job by rendering the game 55 frames ahead of where you are.

  • @lxadreadeater
    @lxadreadeater День тому

    Great video. Just a tip though: I often find the audio balancing in your videos quite off. I'll hear your voice very clearly, then you'll talk slightly quieter and it'll be unintelligible. Might need some compression tweaks. Again, I love the content as always. :)

  • @evilsnowball
    @evilsnowball День тому +3

    I'm happy with my RX7800XT right now, it might not be as powerful as some cards, but it does what I want it to do.

    • @ALEXANDERdk007
      @ALEXANDERdk007 День тому +1

      I had the same with my GTX 970 up until last gen, though at this point it's definitely time to get an upgrade for it.

    • @evilsnowball
      @evilsnowball День тому +1

      @ALEXANDERdk007 my last Nvidia was a GTX 1660 Super, was a great card.

  • @greisboy425
    @greisboy425 День тому +2

    You don't need 100+ frames to enjoy a good-quality scene in your favourite games. 60 fps is more than enough (we used to play our favourite games at 30 fps back in 2010). 60 fps rendered natively, without frame gen and upscaling, is more than enough. 60 fps rendered natively is a lot better than 20 fps padded out with 200 fake frames; the latency sketch below this thread shows why.

    • @pedropierre9594
      @pedropierre9594 День тому +1

      Yeah, I agree. Hell, turn on VRR and it looks smooth as well.
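
    To put rough numbers on the point above, here is a quick back-of-the-envelope sketch in Python. The 20 fps native base and 4x multi frame gen ratio (1 real frame plus 3 generated) are illustrative assumptions, not measured figures.

    ```python
    # Back-of-the-envelope frame-time arithmetic for frame generation.
    # Assumed numbers: 20 fps native base, 4x multi frame gen (1 real + 3 fake).
    base_fps = 20
    gen_ratio = 4
    base_frame_time_ms = 1000 / base_fps   # 50 ms between real frames
    displayed_fps = base_fps * gen_ratio   # 80 "fps" on the counter

    # Interpolation cannot show an in-between frame until the NEXT real frame
    # exists, so input is still sampled every 50 ms, with at least one real
    # frame of buffering added on top.
    min_added_latency_ms = base_frame_time_ms

    print(f"counter shows {displayed_fps} fps")
    print(f"input still paced at {base_frame_time_ms:.0f} ms per real frame")
    print(f">= {min_added_latency_ms:.0f} ms extra latency from buffering")
    ```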

  • @chadyways8750
    @chadyways8750 2 дні тому +6

    You know what's kind of hilarious about this multi frame gen?
    Lossless Scaling has had it for a while now, and even if Nvidia's advantage is hardware-accelerated denoisers and whatever else to reduce smearing, I can tell from experience that having 240 fps is NOT worth the degradation in visual fidelity; that's with frame generation at native scaling, and Nvidia's hardware won't be able to compensate for it in a much more meaningful way.
    Sure, 90% of the time you won't actually notice the artifacts unless you're specifically looking for them, but the moment you do, you can't unsee them. UE5 especially, with its abysmal reliance on TAA: hair, waterfalls, etc. look horrendous, and that's before you take into account that reflections alone look kind of ass with frame generation regardless.

    • @mojojojo6292
      @mojojojo6292 День тому

      Lossless Scaling is dogshit in comparison though, even compared to last-gen DLSS frame gen.

  • @mikefisher7620
    @mikefisher7620 9 годин тому +2

    But it's not going to be $550 because of scalpers! LOL

  • @axelstuart2707
    @axelstuart2707 2 дні тому +18

    I wish people would stop talking about "performance".
    Let's call it what it is: artificial frames.

    • @AFistfulOf4K
      @AFistfulOf4K День тому

      And I wish people would stop complaining about things they've never tried but this is the internet.

  • @0ne0fmany
    @0ne0fmany 3 години тому +1

    I really hope AMD will try to win the market with actual frames and processing power.

  • @Xoodlz77
    @Xoodlz77 2 дні тому +7

    I personally like real frames, not fake ones. This generation seems like a step in the wrong direction.

  • @qwertymonsta
    @qwertymonsta 3 години тому +1

    I want optimized games that don't rely on AI fuckery to achieve playable framerates...

  • @kidman2505
    @kidman2505 2 дні тому +6

    DLSS is the magic wand, bro. Instead of modest gains, it looks WAYYY cooler to show a chart that says they've doubled FPS. Keep in mind, Nvidia is in fact a server/AI company now, not a consumer graphics card company. You'll get cut-down enterprise parts at high prices and you'll like it. Fortunately, outlets like yourselves are finally catching on, and the public with you. I feel this has been fairly well known to enthusiasts for some time; it's the common player who needs to realize what's going on. But the common user is on a Dell/HP/etc. and doesn't update their drivers. Frame gen is a crutch.

    • @arenzricodexd4409
      @arenzricodexd4409 День тому

      Even if they don't update their drivers themselves, Windows will do it for them, even if they don't get the latest version.

    • @celticgibson
      @celticgibson День тому

      @@kidman2505 That's why they spent only 10 minutes of their 2-hour presentation on gaming. They long since chose data-centre AI over gaming, and it shows.

  • @MiaowVal
    @MiaowVal День тому +1

    I think you misunderstood something: DLSS Override only works on games that already have DLSS implemented. It just means you can run DLSS 4.0 in games that only support, say, DLSS 1.0, with all the features of DLSS 4.0, even if the result won't be as good as a native DLSS 4.0 implementation. Nvidia never actually says which DLSS versions it works on, just that it works if you enable DLSS in-game and then enable DLSS Override in the driver. I feel like this needs to be clarified, or else it's going to cause a huge misunderstanding about what DLSS Override does. Also, AMD seems to have something similar for FSR 4.0 that lets you run FSR 3.1 games with FSR 4.0 on the RX 9000 series, if the press slide about FSR 4.0 is to be understood correctly.

  • @Liiineeepiiieeeceee
    @Liiineeepiiieeeceee День тому +3

    Frame generation is garbage - it's not real performance/FPS. It's basically that motion smoothing garbage that smart TVs have been doing for 15 years. Don't be fooled.
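
    The TV analogy can be made concrete. Below is a toy NumPy sketch of the crudest form of interpolation, a plain cross-fade between two real frames; real frame generation uses motion vectors and a trained model rather than a cross-fade, but it shares the same constraint (the next real frame must already exist) and the same failure mode (moving objects ghost instead of moving).

    ```python
    import numpy as np

    def naive_interpolated_frame(prev_frame, next_frame, t=0.5):
        """Cross-fade two real frames: the crude idea behind TV motion smoothing."""
        blended = (1.0 - t) * prev_frame.astype(np.float32) \
                  + t * next_frame.astype(np.float32)
        return blended.clip(0, 255).astype(np.uint8)

    # Toy 2x2 grayscale "frames" in which a bright pixel moves one step right:
    a = np.array([[255, 0], [0, 0]], dtype=np.uint8)
    b = np.array([[0, 255], [0, 0]], dtype=np.uint8)
    print(naive_interpolated_frame(a, b))
    # [[127 127]
    #  [  0   0]]  -> both positions half-bright: ghosting, not motion
    ```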

  • @RobSomeone
    @RobSomeone День тому +1

    I saw FSR 4 footage in Ratchet and Clank and I'm sold. Nvidia didn't show DLSS with any major movement or detail, which I find pretty damning; it suggests DLSS MFG won't be as useful as you'd want at maintaining quality.

    • @pedropierre9594
      @pedropierre9594 День тому +1

      DLSS is already better than FSR quality-wise. What I didn't like is that the new FSR is generation-specific; if that's the case, I'd rather go for an Nvidia card this time around. AMD had my money because of their generational support; without it, I can't commit anymore.

  • @shnoopx1383
    @shnoopx1383 2 дні тому +4

    Going by raw performance, the "5080" is an xx70-class card with the price of a 5080. It's half the performance of the 5090, and all previous cards with that kind of spec were xx70s. They tried the same BS with the "4080 12GB" last time and unlaunched it. Now nobody talks about the shift in the pricing.
    The "real" 5080 will be called 5080 Ti or Super, will have ~70% of the 5090's performance, and will cost us $1400-1700. Just to release a 5080 Super Ti+ later for $1800.
    They moved the product stack one step down in performance and one step up in price.
    The 5070 has 1/3 of the raw performance of the 4090, and the 5080 has 2/3.
    In the past, a new xx80-class card had xx90-class performance from the previous generation. That changed this time, but there's no uproar, because the "4090 performance" marketing BS was a distraction, and it was a success.
    Beside that: "FPS" got a new definition. It's not frames per second anymore, it's "fakes per second".
    Wbr, shnoopx

    • @Nurse_Xochitl
      @Nurse_Xochitl День тому

      Anyone who buys new NVIDIA cards post-GTX is an idiot, IMHO.

  • @umarthdc
    @umarthdc 2 дні тому +12

    The truth is they peaked three series ago, and now they're inserting .jpgs to make us believe we have a better GPU.

  • @skybuilder23
    @skybuilder23 День тому +1

    DLSS Override doesn't shove DLSS into unsupported games; it swaps the DLL file to the newest models in games that already support DLSS. There's a tool for this called DLSS Swapper that I've used forever.
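
    For anyone curious, the core of such a swap is tiny. Here is a minimal Python sketch of the idea; the paths are hypothetical, and the actual DLSS Swapper tool additionally tracks known games and manages downloaded DLL versions for you.

    ```python
    import shutil
    from pathlib import Path

    def swap_dlss_dll(game_dir, new_dll):
        """Back up a game's bundled nvngx_dlss.dll and drop a newer one in place."""
        target = Path(game_dir) / "nvngx_dlss.dll"
        if not target.exists():
            raise FileNotFoundError("No DLSS DLL found; the game must already ship DLSS.")
        backup = target.with_suffix(".bak")
        if not backup.exists():        # preserve the original on the first swap
            shutil.copy2(target, backup)
        shutil.copy2(new_dll, target)  # the game now loads the newer model

    # Hypothetical usage (paths are placeholders):
    # swap_dlss_dll(r"C:\Games\SomeGame", r"C:\Downloads\nvngx_dlss.dll")
    ```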

  • @keit99
    @keit99 2 дні тому +6

    Almost feels like GPUs can't get more powerful and nvidia is scrambling to make them seem better.

    • @DagobertX2
      @DagobertX2 День тому +2

      Yes. GPU miniaturization is at its limit now because of the quantum tunnelling effect. They can only make bigger GPUs from now on unless they solve that.

    • @celticgibson
      @celticgibson День тому

      Look at the 4090 and 5090. They're on the latest process nodes yet are large enough to fry an egg on. GPUs have hit the physics wall unless you build a server-rack cooling solution, and coincidentally that's where nVidia makes the most money. Thermodynamics doesn't lie.

  • @NOPerative
    @NOPerative День тому

    Why does Bellular look like Dexter (Dexter's Laboratory cartoon)?! AWESOME!
    BTW, good vid.

  • @kronos444
    @kronos444 День тому +7

    $550 for a 5070? Great! Now the 4060 will be substantially cheaper so I can finally buy a 3050!

  • @epicrotfl7017
    @epicrotfl7017 День тому +1

    NVIDIA lies about their products so often that I genuinely wouldn’t be surprised if all DLSS4 did was generate a single frame and then just make the FPS counter display a higher number.

  • @Bigdog1787
    @Bigdog1787 2 дні тому +7

    Most people can only afford $150-250 for a graphics card.

    • @dualie-dude-42
      @dualie-dude-42 2 дні тому +3

      Yeah, Bellular News has lost the plot calling $550 cheap.

    • @dancingdroid
      @dancingdroid 2 дні тому

      Or you could save for longer and afford something more expensive? It's not as if your savings max out at 250 >_

    • @gman7497
      @gman7497 2 дні тому

      I have a relatively mediocre-paying job and I was able to buy a $3k 4080 laptop with a combo of savings and tax-return money. I get that there may be other things that take priority, but if you're really committed to getting top-end hardware, it can be done on a modest income. It's not easy, but it's possible.

    • @merlinwarage
      @merlinwarage День тому

      It is not a compulsory tool, just a luxury hobby. People can go out and play minigolf and stuff for $50. I mean, there are many other hobbies (most of them) that are much more expensive.

    • @mojojojo6292
      @mojojojo6292 День тому

      Buy a console then; PC has never been a cheap platform. Or just grab a second-hand 12GB 3060.

  • @nicovandermerwe2747
    @nicovandermerwe2747 День тому +2

    Unfortunately this AI thing just won't die

  • @Hawkfizzles
    @Hawkfizzles 2 дні тому +29

    AI frames.....

  • @SHUTDOORproduction
    @SHUTDOORproduction День тому +1

    8:09 what game is this?

  • @NoirMorter
    @NoirMorter День тому +3

    Intel led the market for over a decade, but they're losing their dominance because the C-suite got complacent and were business people, not engineers. Nvidia will do the same thing.

    • @waiwai5233
      @waiwai5233 День тому +1

      Remember when Intel said 6 cores are more than enough?

    • @ALEXANDERdk007
      @ALEXANDERdk007 День тому

      And AMD will do the exact same thing once they're the ones in the lead.

  • @codemonkey6173
    @codemonkey6173 День тому +1

    8:00 Not only did we run them at native, we would run them ABOVE native for antialiasing. Remember super sampling?

  • @twistedtxb
    @twistedtxb День тому +3

    We've really hit a bottleneck, haven't we? Fake frames, extra latency, all this for USD 1.5K.
    Moore's law is definitely dead.

  • @weldabar
    @weldabar День тому +1

    It's hard to quantify good-looking video, but it's easy to quantify FPS. So while the FPS number is technically high, that doesn't mean quality is high.
    AMD and Intel could compete if they had a way to quantify video quality other than fake FPS.
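
    For what it's worth, objective video-quality metrics do exist (PSNR, SSIM, Netflix's VMAF), and reviewers could in principle score generated frames against native renders with them. Here is a minimal PSNR sketch in Python, using made-up arrays as stand-ins for captured frames.

    ```python
    import numpy as np

    def psnr(reference, test):
        """Peak signal-to-noise ratio in dB between two 8-bit frames (higher = closer)."""
        mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")        # identical frames
        return 20 * np.log10(255.0) - 10 * np.log10(mse)

    # Made-up stand-ins for a native capture and a frame-gen capture:
    rng = np.random.default_rng(0)
    native = rng.integers(0, 256, size=(720, 1280, 3), dtype=np.uint8)
    generated = native.copy()
    generated[::8] = 0                 # crudely fake some smearing damage
    print(f"{psnr(native, generated):.1f} dB")
    ```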

  • @SMuasGames
    @SMuasGames 2 дні тому +12

    have to love all those fake frames. 1 real to 3 fake.

    • @iwankazlow2268
      @iwankazlow2268 День тому +1

      20 to 200+ fps is more like 10 fake frames for every real one, with all the artifacts and terrible responsiveness, programmed in at the engine level just so the fakery works at all. The great bright future!

  • @kaijuultimax9407
    @kaijuultimax9407 День тому +1

    I hate framegen so much. The visual artifacting reminds me of the days of 1080i displays from the mid-to-late 2000s with how blurry and smeary it makes everything. Makes me sad that so much of the industry is openly adopting framegen just because it lets them borrow from the future at the cost of optimization (Silent Hill 2 Remake has literally no object culling at all, the game could run at a buttery smooth 4K 60fps WITHOUT framegen if Bloober used the fog to hide 3D asset culling like the original game did).

  • @loosetoothearl2869
    @loosetoothearl2869 День тому +3

    It's not raw performance. It's smoke and mirrors.

  • @jiji12boji2
    @jiji12boji2 День тому +1

    I have a 4090, and frame generation etc. isn't worth it: lots of blur and artifacts that go away if you don't use it, but without it you get 20 fps. The next card for me will be AMD or Intel.

  • @yeahhbuddy3932
    @yeahhbuddy3932 2 дні тому +8

    Yeh yeh sure mate.....

  • @PherPhur
    @PherPhur День тому +2

    What are you talking about? DLSS is no moat. FSR is open source and actively being developed in a way that will be applicable to any game, and its performance is pretty darn close to DLSS. So what would someone rather have: supersampling in more games, not limited by card generation, with more VRAM per dollar and more performance per dollar, or Nvidia?
    DLSS 4 will be ahead for a little while, but it never stays that way. I don't see this as much of anything. The only thing Nvidia really has is a cult following, and that will be enough to sustain them, but over time AMD should naturally eat into it as more people find out AMD is usually the better value.

    • @shotcallerjosh8457
      @shotcallerjosh8457 День тому

      @@PherPhur Wrong. Nvidia also has way more money than AMD for R&D; they'll always be ahead if they keep pushing the boundaries. The only way AMD catches up is to innovate or make a breakthrough.

    • @BlackTakGolD
      @BlackTakGolD День тому

      I saw an article saying FSR 4 won't be open-sourced; not sure.

  • @BAMSE64
    @BAMSE64 День тому +5

    $550 is what most people make in a week, not affordable lol

    • @celticgibson
      @celticgibson День тому +1

      nVidia don’t care. Gamers are not their business model now. Data centres are.

  • @roki977
    @roki977 День тому +1

    Well, in the latest leaks the mid-range 9070 XT beats the RTX 4080 Super and 7900 XTX in 3DMark. No wonder Jensen came out with the shield...

  • @patbluetree4636
    @patbluetree4636 День тому +4

    I'm not paying a premium for fake frames.

  • @COYSMike
    @COYSMike День тому

    Really good vid explaining the whole frame gen thing. I'm hoping my lowly 2080 Ti can get a small lift when DLSS 4 arrives.

  • @IOAwaits
    @IOAwaits 2 дні тому +11

    I'd rather have an AMD. Better value, slightly better company. It's also wiser in the big picture to support AMD GPUs and CPUs.
    If the 5070 is $549, the AMD equivalent is gonna be $300-450, with similar performance and probably more memory. 🤷🏻‍♀️

    • @celticgibson
      @celticgibson День тому +1

      AMD was the only tech disruptor in the last two decades. Their Ryzen chips are unmatched.

    • @IOAwaits
      @IOAwaits День тому +1

      @ well said