There are NO Relative Performance GPUs and this is why / R9 290X vs GTX 780Ti in 2022

  • Published 13 Jun 2024
  • Iceberg Tech: / @icebergtech
    HL Discord: / discord
    Timecodes:
    00:00 - Brief Introduction
    00:46 - GPUs overview
    09:20 - 2012
    13:14 - 2013
    17:31 - 2014
    21:37 - 2015
    24:10 - 2016
    27:51 - 2017
    32:03 - 2018
    35:44 - 2019
    40:35 - 2020
    43:51 - 2021
    46:40 - 2022
    49:22 - Conclusion
    Background Tracks:
    Unicorn Heads - 808 Doorbell Chime
    plenka - Catharsis
    plenka - Plasma Water
    plenka - Spirit
    plenka - Lost the time
    plenka - Muddy House
    Case & Point - Error Code
    plenka - Bbrokenn
    Portwave - Oh, M
    Arti-Fix - The Mercury Man
    Arti-Fix - Cybernatic Sect
    Arti-Fix - The Untold Story
    Arti-Fix - Dangerous
    Arti-Fix - Liquidator
    Arti-Fix - Your Time
  • Science & Technology

COMMENTS • 746

  • @Stormygewehr (1 year ago, +162)

    Damn, the 780ti was my dream card back in the day, I should've dreamt of something else instead.

    • @ogaimon3380 (1 year ago, +5)

      lol

    • @MechAdv (1 year ago, +6)

      My first desktop build was with a used GTX970 just before Pascal dropped. Needless to say, it wasn’t the best money I ever spent. Lol

    • @ivankovachev8835 (1 year ago, +12)

      Yeah, and AMD had the superior architecture from 2007 to 2015, yet people bought Nvidia. AMD was also the better value from 2016 to the present, and I'd say they won the RX 6000 vs RTX 3000 generation, because Nvidia lied about prices, yet people refuse to buy AMD GPUs. Hell, they buy Intel GPUs half as much, since AMD has about 10% market share and Intel has about 5% right now...
      And before anyone calls me a fanboy, I own both AMD and Nvidia and have had both for the majority of my life, rarely only one brand at a time. Both have driver problems and other minor problems, so it's not just drivers.

    • @juicecan6450 (1 year ago, +5)

      The Radeon HD 7970 GHz Edition was my dream card. Sapphire even had a Toxic 6GB model of it, literally twice the VRAM the 780 Ti had.

    • @ivankovachev8835 (1 year ago, +5)

      @@juicecan6450 The 6GB variant was made for crossfire, the HD 7970 didn't have enough bandwidth to utilize more than 3GB of VRAM effectively.

  • @mck8292 (1 year ago, +363)

    The way Nvidia GPUs aged so badly is just ridiculous: you pay extra and you get... nothing.
    Back in 2017, I really hesitated between the GTX 1050 Ti and the RX 570 because they were called "similar in price and performance," but I'm really glad I chose the RX 570. Not only was it a bit cheaper, it was clearly more powerful and had a much larger framebuffer (I bought the 8GB model, btw), and now the difference is even larger in newer games, making the GTX 1050 Ti basically irrelevant in today's market, considering you can get 570s for around 100€ used and still play decently at 1080p.
    EDIT: As said in the video, a comparison like "this GPU was clearly more powerful than that GPU" is pointless.
    Comparing two different GPUs, from two different brands, with two different architectures, is not what I meant.
    My point is that Nvidia needs to address the architectural obsolescence and lack of VRAM on their products... even though I know they won't, because they're in such a strong position compared to AMD.
    And NO, btw, I'm not an AMD fanboy; we can criticize big companies without necessarily endorsing what their competitors do, keep that in mind.

    • @blebekblebek (1 year ago, +29)

      It's only Kepler. Nvidia did so poorly they had to rush out Maxwell, which was by far the most impactful GPU upgrade in years. The one corner Nvidia always cuts is skimping on VRAM, which was corrected in Pascal, by far the longest-lived GPU generation, and it's still performing very well regardless; and yes, this includes the infamous GT1030/1040.
      As much as you want to praise AMD for doing what they're supposed to do, the RX series was very scarce at launch, especially the RX 480/470/580/570. They only became abundant after Ethereum crashed in mid-2017 (or 2018, I forget), once miners stopped scraping them all away. High-performance GPUs were always targeted by miners regardless; even before ETH switched to PoS, the RX 570 was still sought after, even after being resold out of mining rigs, because demand was so high.
      I bought an RX 580 used for $90 in 2018 and sold it for $200 in 2021. It's ridiculous what we've seen so far, and now we can't even get a "high performance" GPU within the $400 price range; a few months ago the GTX 1660 was selling for $600-900.
      What I dislike most about AMD is that they always lump several GPU generations together and cut driver support for the whole range at once. For example, the HD 7000 series was released in 2011 (GCN 1.0) and the last of the R9 series in 2015 (GCN 3), but they cut them all off at the same time. Sure, we got a long run before the end, but before that the HD 3000 and HD 6000 series were also cut at the same time; the worst part is that the HD 6000 series (which I bought back then) was only released about a year before they stopped driver support for it. I was stuck with a "new" card and no more driver updates.

    • @baoquoc3710 (1 year ago, +1)

      The problem here is that Kepler was the most complex architecture developers had to work on, which affected development time, so people at the time tended to target GCN, which is indeed more dev-friendly.

    • @Nintenboy01 (1 year ago, +14)

      No, the 1050 Ti is only about as powerful as a GTX 960, whereas the RX 570 was around GTX 970 level.

    • @mck8292 (1 year ago, +1)

      I understand that Kepler was probably a bit of a pain to develop for, and I'm honestly impressed that they kept up driver support for such a long period, but Maxwell and Pascal have drawbacks as well: they tend to perform worse in DX12 these days, let alone Vulkan, where Polaris is destroying Nvidia.
      I don't want to just praise AMD though; they do some shitty things as well, people just overlook that aspect sometimes.
      Example: that stupid RX 6500 XT with its PCIe 4.0 limitation and 4GB framebuffer (even though AMD stated earlier that 4GB was "not enough" to tease Nvidia about "lack of VRAM").
      They're just companies, motivated by profit.
      But AMD gives the impression of being more "aware" of what the consumer "needs" and keeps improving aspects of their products over generations (TDP, driver support, RT performance...).
      Nvidia, though, doesn't give a sh*t what consumers think about their products; even though (some of) their GPUs suck in terms of price-to-performance, they think it will sell anyway because people are dumb... which they apparently are not, considering the RTX 4080 sales.
      But yeah, the situation with the microchip crisis is insane, and even AMD is raising prices, just keeping them lower than Nvidia's to maintain "good value"... so yeah.

    • @HardwareLab (1 year ago, +16)

      @@Nintenboy01 Watch the video, bro; there is no such thing as relative performance.

  • @hairychesticles1 (1 year ago, +83)

    Gonna be interesting to see how the 6800 XT ages vs the 3080 10GB cards.

    • @TheAnoniemo (1 year ago, +47)

      8GB on the 3070 is basically a crime at this point, while the RX6800 gets a cool 16GB.

    • @Winnetou17 (1 year ago, +26)

      @@TheAnoniemo Less of a crime than 10GB on 3080, I'd say

    • @waifuhunter9709 (1 year ago, +23)

      And I bought a 3080 over the 6900 XT when the 6900 XT was 20% cheaper.
      And now I've started sweating profusely.

    • @AshtonCoolman (1 year ago, +4)

      The slower ray tracing performance will plague AMD GPUs as time goes on. They'll probably be faster in raw rasterization though.

    • @MechAdv (1 year ago, +19)

      I traded my full-hash-rate 3080 10GB for an LHR 3080 12GB the week they came out. Lol, the guy was like, "are you sure!?!?"
      I didn't give a fuck about the mining perf, I just wanted a brand new card with a brand new warranty and 2GB more VRAM. It was brand new in the box with the plastic wrap still on, while my card had been used daily for like a year by that point. That was a good day for capitalism. Lol

  • @doompenguin7453 (1 year ago, +313)

    It's extremely impressive how the 290X demolishes the 780 Ti while being $150 cheaper. I kinda regret not buying the 390X now, because that would have probably lasted me a lot longer than my 970.

    • @alksonalex186 (1 year ago, +21)

      If you're not bothered by the lack of driver support and downloading custom drivers for a little extra juice, then maybe it could be a better buy.

    • @doompenguin7453 (1 year ago, +25)

      @@alksonalex186 Not bothered by that at all.

    • @vogonp4287 (1 year ago, +46)

      GCN cards aged so well. My old RX 580 8GB consistently outperforms the 1060 6GB in recent games.

    • @azz09444 (1 year ago, +13)

      @@vogonp4287 Yeah, I'm also using an RX 580 8GB; it runs all games on very high at 60fps, perfect for 1080p gaming.

    • @Sam-cq9bj (1 year ago, +15)

      @@vogonp4287 No, the GTX 1060 and RX 580 are still similar in terms of performance.

  • @AlexBoneChannel (1 year ago, +86)

    Love to see a channel that has only 600 subscribers push out such long videos of quality content.

    • @HardwareLab (1 year ago, +22)

      We had less than 400 when it was published 🗿

    • @KiraSlith (1 year ago, +4)

      Really shows there's still people out here making great YouTube content out of passion first.

    • @DragonOfTheMortalKombat (1 year ago)

      Now over 1000.

  • @raresmacovei8382 (1 year ago, +34)

    Correction to the video:
    While Kepler and Maxwell V1 (750) support some more features than just DX11/DX12 Feature Level 11_0, they are NOT in fact Feature Level 11_1 capable.
    So a game demanding 11_1 such as God of War doesn't actually work.

    • @HardwareLab (1 year ago, +12)

      Interesting. Maybe I'll research that more and try to do an additional video. No promises, btw. Thanks for the comment :)

    • @NUCLEARARMAMENT (1 year ago)

      You can play God of War and even Halo Infinite and Spider-Man: Miles Morales just fine on Kepler GPUs; you just need Intel SDE (in case you lack AVX on your CPU, like older Nehalem/Westmere and Phenom processors) and VKD3D to translate DX12 to Vulkan calls. As long as your GPU supports Vulkan 1.2, and pretty much all Kepler cards do, you will be just fine. The only limit is VRAM.

  • @Lexicon-ff6or (1 year ago, +140)

    I bought the R9 390 back in 2015 over the 970, and looking back I am very glad I made that decision. I was using it up until this year, when I had to upgrade because of the lack of driver support. If AMD hadn't abandoned driver support for the 300 series cards so early, I could easily have gotten another 2 years of use out of it, but since it was over 6 years old I didn't mind upgrading to a new GPU (from AMD, of course). I learned from past experience not to buy NVIDIA cards due to how poorly they age; I'm glad to see people are starting to see past the hype.

    • @unlimitedslash (1 year ago, +9)

      If you run Linux or decide to use the hacked drivers, that 390 is still a damn good card tbh.

    • @towertooth24157 (1 year ago, +4)

      Absolutely loved my R9 390. Had to retire it when they stopped supporting it and games like Cyberpunk came out. I picked up a 3060ti as a stopgap before finally getting the beastly 4090. All on the same PSU lolz

    • @doueven (1 year ago, +4)

      Had an R9 390X that died in 2020 amid the pandemic. Picked it over the 970 for the bigger VRAM.
      Built a whole new system at the beginning of 2021 (fortunately) with an RX 6800 XT.
      It's been a beast so far, but I must admit the second-hand prices are quite bad vs NVIDIA. I was thinking of getting a new card next year but would probably get bugger all for the 6800 XT.

    • @josephdias3968 (1 year ago, +3)

      Yeah, the NimeZ drivers kick ass on the R9 290.

    • @ExalyThor (1 year ago, +1)

      How is the driver support? Oh right...

  • @sythos_8653 (1 year ago, +43)

    The 290X was a great card. Still can't believe it holds up that well despite being older than the PS4 and Xbox One. I bought a used 290 in 2015 for $200 that lasted me years.

    • @classicallpvault8251 (1 year ago, +4)

      It's not older than the Xbox One and PS4; the reference model was released almost simultaneously, but the most common type by far, the Sapphire Tri-X version, came out in December 2013, a few months later. Also note that the 290X is a generation ahead of the PS4 and Xbox One in terms of microarchitecture: both consoles use GCN 1.0 (the same architecture as the 7970), while the Hawaii XT is based on GCN 2.0, which itself first appeared in the form of the 7790 in February 2013.

    • @Protector1rk (1 year ago, +3)

      290X ≈ RX 580. Of course it's great.

  • @rangersmith4652 (1 year ago, +16)

    GPUs all perform very differently under different conditions and in different applications/APIs, so direct comparisons can only report results in specific cases. But we need reviewers to give us those specific comparisons so we can "generalize" to some degree as a basis for choosing one GPU over another. I still have the R9-290 I bought when it was released. It is no longer running in any of my active systems, but when I last installed it, it was still working. Great card then, and still adequate for a lot of games.

  • @tourmaline07 (1 year ago, +34)

    I agree: those TPU performance charts based on geomeans from reviews do have their place, but they can't tell the whole story at all, and one should really consider their use case before buying.
    This is why I went for a second-hand RTX 2080 Ti earlier this year and overclocked it, rather than a 3070 or 6700 XT: I want to target 4K60, FSR was non-existent (and Infinity Cache does not work well at 4K, offsetting the 256-bit bus), and 8GB of VRAM is not enough. (This card will very quickly follow the same fate as the 780 Ti, IMO.)
    The R9 290X, considering its age, has held up extremely well.

    • @LaurentiusTriarius (1 year ago, +5)

      It always depends on the prices you can get these cards at; here, last year, the 2080 Tis in good condition I was finding were about the same price as many current-gen budget kings...

    • @DragonOfTheMortalKombat (1 year ago, +2)

      The 3070 will not last long; I have a card with low VRAM, so I know. The 6700 XT does not have enough memory bandwidth, but it can be a good option at a lower price.

    • @raresmacovei8382 (1 year ago, +1)

      Getting a 2080 Ti instead of a 3070 or even a 6700 XT is always a smart choice.

    • @tourmaline07 (1 year ago, +1)

      @@raresmacovei8382 DLSS was the deciding factor for me; playing at 4K on a 27" 60Hz panel, I knew I needed a good upscaling solution.
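
    [Editor's note] The "relative performance" charts discussed in this thread are typically geometric means of per-game FPS ratios, which is exactly why one summary number can hide a card winning some titles and collapsing in others. A minimal sketch of that calculation; all FPS figures below are hypothetical, not taken from the video:

```python
from math import prod

def relative_performance(fps_a: dict, fps_b: dict) -> float:
    """Geometric mean of per-game FPS ratios (card A relative to card B).

    A geomean is used instead of an arithmetic mean so that one outlier
    title cannot dominate the summary number.
    """
    games = fps_a.keys() & fps_b.keys()
    ratios = [fps_a[g] / fps_b[g] for g in games]
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical per-game averages, chosen only to illustrate the pitfall:
r9_290x = {"game1": 60, "game2": 45, "game3": 30}
gtx_780ti = {"game1": 50, "game2": 50, "game3": 15}  # e.g. VRAM-limited in game3

rel = relative_performance(r9_290x, gtx_780ti)
print(f"290X is {rel:.2f}x the 780 Ti on average")  # prints 1.29x
```

    Note that the single 1.29x figure hides that the 780 Ti still wins game2, and that adding or dropping one title from the test suite shifts the number, which is the video's point about such charts.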

  • @HankBaxter (1 year ago, +4)

    Awesome video, my man. It's really nice to see how these cards stack up over time.

  • @dandeson9723 (1 year ago, +51)

    Remember when everyone said "you will never need more than 2GB of VRAM, it's not important"? So many people were wrong. Also, somehow Nvidia GPUs become less powerful as time goes by; my Vega 56 beats the 1070 most of the time. Quality video, too. Hopefully you get more subs.

    • @arenzricodexd4409 (1 year ago, +2)

      The Vega 56 has always been faster than the 1070. That's why Nvidia released the 1070 Ti to counter it.

    • @dandeson9723 (1 year ago, +1)

      @@arenzricodexd4409 Yes and no; at release the Vega 56 struggled to beat the 1070.

    • @arenzricodexd4409 (1 year ago)

      @@dandeson9723 If the Vega 56 only matched the 1070's performance, Nvidia would not have released the 1070 Ti in the first place.

    • @dandeson9723 (1 year ago, +1)

      @@arenzricodexd4409 You lost me there.

    • @arenzricodexd4409 (1 year ago, +1)

      @@dandeson9723 The Vega 56 and Vega 64 came out roughly a year after Nvidia released the 1080 and 1070. From the very beginning, AMD was tuning the Vega 56 to beat the 1070 consistently, knowing how the 1070 performed in games for a year before Vega's release. So "Vega 56 struggled to compete with the 1070" was not the case at all, when AMD had a year to look at 1070 and 1080 performance before bringing Vega to market.

  • @joekoch8485 (1 year ago, +53)

    Interesting to see these older cards put through their paces today and see how they do or don't hold up. Keep up the good work!

    • @maxwellmelon69 (1 year ago)

      I'm still running a PowerColor 290X at 50-60fps on mid settings in New World; it's still relevant enough for me.

  • @Netuser (1 year ago, +1)

    Insane video, I can see so much work put into it. Everything is clear and easy to understand. Keep it going bro! Oh, and music is really good too! Thank you for that video!

  • @pohjademaniratana2567 (1 year ago, +1)

    Your videos on this channel are good, offering another perspective. I like that. Keep it up, guys.

  • @butifarras (1 year ago, +4)

    Hey, really good video. I really like the concept of taking a few games from each year; it gives a really interesting view of how driver and API support evolves.
    The editing and the transitions are really well done; they look really professional. That said, I agree with other comments that the video is too long. Maybe that's just inevitable if you want to show some actual gameplay and go over so many games, oh well.
    Although you don't like relative performance, a relative-performance-vs-year chart would have been really interesting!

  • @Wasmachineman (1 year ago, +23)

    Upgraded from a water-cooled RX 480 to a Vega 64 and the performance increase was insane in GTA V.
    Not as insane as going from an RX 480 to a 6900 XT in my main rig, though!

    • @hrayz (1 year ago)

      I went from the RX 580, to trying CrossFire RX 580s (good in some games; DX12 and Vulkan have multi-GPU modes too), to the Vega 64, then the RX 6900 XT.

    • @michaelzomsuv3631 (1 year ago, +1)

      The performance was not insane in GTA V; we're talking about a ~40% difference to the Vega 64.
      The 6900 XT would be insane though, that's like a 300% difference.

    • @raresmacovei8382 (1 year ago)

      A water-cooled RX 480? But why?

    • @raresmacovei8382 (1 year ago)

      @@michaelzomsuv3631 Nah. RX 480 to V64 can be up to a 70% performance difference, depending on game, API, and GPU clocks.

    • @Wasmachineman (1 year ago)

      @@raresmacovei8382 Because why not? The RX 480 Nitro+ stock cooler is absolute garbage too.

  • @brokencage9723 (1 year ago)

    Wow, great video. Thanks for all the work you put into it.

  • @agnelo258 (1 year ago, +2)

    Thank you for your dedication in making this video.

  • @shwainokaras95 (6 months ago)

    One of the most interesting tech videos I've watched this year. Ty, GL!

  • @Mortharius (10 months ago)

    Really interesting video, love these experiments with cards. Really good job, man.

  • @Ivan-pr7ku (1 year ago, +14)

    Very good retrospective work; more of this kind of benchmarking is needed. I was surprised to see so many performance issues with Vulkan API titles on Kepler. Is it just the VRAM size limitation, or memory asset mismanagement in the driver? Who knows. One big difference between the Kepler and GCN architectures is the inability of the former to run compute and graphics kernels at the same time. This is probably the reason for the extra performance drop in Doom Eternal, with its specific implementation of forward rendering, which certainly causes GPU pipeline under-utilization on Kepler, on top of the chronic Vulkan runtime problems.

    • @Degalfox (1 year ago)

      I think you pretty much answered your own question. Other than that, there's the memory bus of the two cards: 384-bit for the 780 Ti and 512-bit for the 290X. With whatever GDDR5 the cards have, the R9 will have more bandwidth.

  • @knuttella (1 year ago)

    great video. congrats on all the work done. hope you hit 10k subs soon!

  • @catbertz (1 year ago, +3)

    This was interesting and useful! My child's friend is still using my old 780 (non-Ti). I think it's time to push them to upgrade to something at least a few years newer, even though I'm impressed with how long both the cards in this video held out.

  • @Sybertek (1 year ago)

    Amazing study! Subbed.

  • @anikimaster1463 (1 year ago, +2)

    Oh, Krab's new channel! I recognized it right away by the voice and the style of delivery :)

  • @thegrimmtv3532 (1 year ago)

    This is the content I was looking for. In-depth and detailed benchmark comparisons.

  • @FatheredPuma81 (1 year ago, +6)

    Relative performance charts, like that one GPU website's global chart, are amazing for finding GPUs to look into. I saw the 2080 Ti was close to a 3070, looked it up, and it really was, but with more VRAM, so I grabbed one for like $300 less than a 3070.

  • @Peter-kb1ye (1 year ago, +1)

    Clicked on the video to see the difference between those two GPUs but stayed for the music.
    Great video and even better music choice. I hope you continue to grow!

  • @s.biertumpel3761 (1 year ago)

    Good video! Keep up the good work!

  • @adriananonim8052 (1 year ago)

    Great video, congrats to you ;)

  • @sodozormemesdeveloper9438 (1 year ago)

    Great video. Thank you!

  • @lucasgabriel6369 (1 year ago)

    Nice video and music selection.

  • @ChinchillaBONK (1 year ago, +2)

    This channel seems to be something different from the much larger channels out there.
    I like your technical knowledge and how you presented the data.
    Just want to know: if I want to get the cheaper AMD cards for Blender, is the reason AMD is behind Nvidia in Blender a lack of support or a hardware limitation, and is it possible to find open-source plugins to help with AMD cards?
    Also, is VRAM or GPU speed more important for rendering?

  • @YourFriendlyKebab (1 year ago)

    This is an excellent quality video, congrats!

  • @stephanhart9941 (1 year ago)

    I Love the BALLZ @ 6:30 !!! Instant SUBSCRIBE. Much success. Monetization is coming soon! Keep it up!!!

  • @ailiop1 (1 year ago)

    Thank you for all the tests, and for trying The Riftbreaker and Deus Ex.

  • @MatthewKiehl (1 year ago)

    Thanks so much for this. I'm curious about power consumption similarities and differences. When "doing well," does one card use more power, or is one pulling more watts in every case?

  • @r4z4m4t4z (1 year ago)

    Nice production, well done.

  • @picb (1 year ago, +18)

    I'm still using a water-cooled 290 and love it because of how affordable it was (and is)!

    • @bdhale34 (1 year ago, +1)

      The first-gen i7-875K I gave to my niece for Minecraft/Roblox/Fortnite has a reference R9 290 4GB in it, and that thing still slaps hard at 1080p in less demanding and esports titles.

  • @chillhour6155 (1 year ago)

    Neat channel, subscribed.

  • @jeffdeal1162 (1 year ago, +5)

    I had two 290Xs in CrossFire when they were relevant, and ironically enough, during the height of the pandemic I bought a 290 blower card for my living-room PC for $80, where it still plays The Witcher 3 at 60fps with a mix of high/medium at 1080p. This was a golden era for GPUs until the 10-series cards came out, and we all know Nvidia will never make another 1080 Ti (a card I've also owned 3 versions of).

  • @kanpuu-san (1 year ago, +1)

    Thx for this awesome video. Can you do the same comparison between the GTX 1060 and RX 480? It would be interesting.

  • @KH_1 (1 year ago)

    Good infographic visuals!

  • @maybelive765 (1 year ago)

    Did you run modded drivers for either of the two cards? That's a huge benefit for my R9 Fury X.

  • @hj-hv6rt (1 year ago, +2)

    Wow, awesome video. How well the 290X held up in some of these modern games is impressive.

  • @arc00ta (1 year ago, +5)

    I had a whole pile of 290s and 290Xs and did some quad-CrossFire benching, which was fun. They were right garbage for gaming when they were new, though: black-screen crashes and a bunch of other issues were very common. TBH the best card out of all of them was the OG Titan; that thing was a monster under water.

  • @shiroishii7312 (1 year ago, +4)

    APIs were also optimized over time for the PS4/XB1. Both consoles running on AMD GCN-based APUs for more than seven years also meant devs had to ensure good compatibility with AMD GCN GPUs over a longer period of time. The PS3 and the 360 launched with Nvidia GPUs, so games ran pretty well on Jensen's side for a while too. It just depends on what most people play on.

    • @kishaloyb.7937 (1 year ago, +6)

      The PS3 had an Nvidia GPU. The Xbox 360 had an ATI/AMD GPU. So that generation was pretty much tied. From the PS4/Xbox One onwards, every company moved to AMD because of how horribly Nvidia treats its own customers.

    • @arenzricodexd4409 (1 year ago, +1)

      @@kishaloyb.7937 Anything bad always gets associated with Nvidia, so even when a certain client no longer uses Nvidia, people assume it must be because of how badly Nvidia treats its customers. AFAIK Sony doesn't really have an issue with Nvidia; they simply went with AMD for the PS4 because AMD could give them both the CPU and GPU without needing a separate company for each. As for MS, if you look back at the issues with the original Xbox, you should know it wasn't entirely Nvidia's fault. If Nvidia were really that bad, Nintendo would not have done business with them for the Switch after using AMD GPUs in their main consoles for generations.

    • @garbagemousemat678smith2 (1 year ago)

      It's amazing how well CP2077 runs on the PS4 Pro now, considering it has a Jaguar CPU. MechWarrior 5 gets totally CPU-bottlenecked by Jaguar.

    • @kishaloyb.7937 (1 year ago)

      @@arenzricodexd4409 Apple, Sony, and MS all left Nvidia. Linus Torvalds himself has publicly "shown the bird" to Nvidia. I think that's evidence enough that Nvidia is a very bad business partner, as stated by Linus himself. Then come all the hardware requirements.
      Considering how badly Nintendo rips off its own customers, they feel right at home dealing with Nvidia xD
      Though considering how good AMD APUs are now, even in the mobile/handheld form factor, Nintendo might just switch to AMD for their Switch 2 hardware.

  • @hermanbest8106 (1 year ago)

    Nice job, bro.

  • @JOHNTECH112 (5 months ago)

    Great comparison; I'd love to see a video on the RX 580 and GTX 1060.

  • @swuspicious (1 year ago)

    Amazing video on the topic! Not a lot of creators cover older hardware and how valuable it can still be to people on a very low budget. One minor complaint I have is how you present the benchmarks with no voiceover for, honestly, an annoyingly long time; the video could be a few minutes shorter without the full benchmarking footage while getting the same ideas and data across. Having 1%/0.1% lows to better show off stutters could also improve the data presented.
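
    [Editor's note] The 1%/0.1% lows requested above are computed from a frame-time log rather than from average FPS. One common definition (averaging the worst fraction of frame times; another convention takes the 99th-percentile frame time directly) can be sketched as follows, with a made-up frame-time log:

```python
def percentile_low_fps(frame_times_ms: list[float], pct: float) -> float:
    """Average the worst `pct` fraction of frame times and convert to FPS.

    E.g. pct=0.01 gives the "1% low": the FPS implied by the slowest 1%
    of frames, which exposes stutter that the average FPS hides.
    """
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, round(len(worst) * pct))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# Made-up log: mostly 16.7 ms frames (~60 fps) with occasional 50 ms stutters.
log = [16.7] * 990 + [50.0] * 10

print(f"avg fps: {1000 * len(log) / sum(log):.1f}")   # ~58.7 fps, looks smooth
print(f"1% low:  {percentile_low_fps(log, 0.01):.1f} fps")  # 20.0 fps: the stutter shows
```

    The gap between the two numbers (about 59 fps average vs a 20 fps 1% low here) is exactly the stutter the comment says the benchmark charts should surface.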

  • @VioletGiraffe (1 year ago, +1)

    Watching the video in 240p (because of a slow and expensive Internet connection), I couldn't tell whether the racing-car footage at the beginning was a game or actual real-life video.

  • @Nachokinz (1 year ago, +1)

    Another reminder that graphics cards with more VRAM age more gracefully. I remember when others insulted my decision to buy a 2GB GTX 285 in 2009, claiming it was a waste of money, yet that framebuffer gave me the headroom to keep texture settings reasonable until I upgraded in 2015 and again in 2021. Unfortunately some will never listen, and then wonder why the performance of cards with less VRAM decreases significantly after only a couple of years.
    I've been paying attention to the prices of workstation cards being sold off by mining farms, like the Nvidia A4000; based on prior experience, it wouldn't surprise me if those aged better than the 3070 or even the 3080 because of the 16GB of VRAM, despite performance similar to a 3060 Ti in 2022. There are some impressive deals out there for those with patience.
    Thank you for taking the time to run all those benchmarks and make this video, condensing 10 years of experience into just under an hour.

  • @bestgamerz8093 (1 year ago, +1)

    This is good content, informative and entertaining. You deserve a sub. May Allah grow your channel. Ameen!

  • @bryanbrewer4272 (1 year ago)

    Bravo! Spot on!

  • @Maartwo (1 year ago)

    Great video, nice driving. What's your PB in Misano? Lol

    • @HardwareLab (1 year ago)

      PB in Misano?

    • @HardwareLab (1 year ago)

      Oh, I'm sorry, is it about Assetto Corsa? This is Denis typing; the footage was done by Maxim.

  • @TyrKohout (1 year ago, +5)

    This video is on a level similar to Digital Foundry, especially in terms of research quality and script writing. Very impressive piece, boys! This channel is going to be really big if you keep up amazing releases like these.

  • @glenbehan1028 (1 year ago)

    I love the idea of this video; it might need to be expanded on.

  • @MCgranat999 (1 year ago)

    I feel like I need your entire music playlist

  • @infinnite4938 (1 year ago)

    Great thumbnail!

  • @thecrionic (1 year ago, +2)

    While I enjoyed the video (good work, btw), everyone remotely in the field knows and understands the points you make. Yes, these are generalisations, and yes, generally they are not okay, but for better or worse we lack better means of measuring the relative performance of competing products, so such a generalisation gives a pretty good top-down view of what a certain GPU can do.
    Different architectures give different results, as do different amounts of VRAM, but that is normal and very well understood. Yet reviewers aren't wrong: they present new products as they are perceived at launch. No one can single-handedly redo their reviews of all past generations of products since 2012 every year to check if something has changed. And of course it has.
    I'm still rocking an R9 290, but imagine if that 780 Ti had 4 gigs of VRAM and full DX12 support; would it lose to the 290X today? I don't think so.
    Yes, nGreedia skimped on the specs (as they always do), but how long did it take to show? The answer is 5 years. 5 years in tech terms is a lot, and most people who would have bought these at launch have most probably moved on, because they can afford it; obviously, there is always something better coming out.
    You also failed to mention the frame-timing comparison, which in these past 10 years has been a bane, where 60 fps can look like 40 on AMD cards because of the way our eyes work.
    Drivers are always hit or miss. I've had problems with both companies: I've had minor video issues with AMD, and didn't like the colors reproduced by an Nvidia card even with corrections, but what I found really frustrating was that I had AUDIO issues, FRIGGIN AUDIO ISSUES, caused by the Nvidia drivers at the time. It's a video card, why did it bugger my audio drivers... Anyway.
    Lastly, when the R9 290X launched, it offered very competitive pricing: TITAN-level performance for half the money... Well, count me and the whole town in.
    But... and this is a big *butt*, in 2022 AMD offers slightly worse performance for $20 less (talking mid-high range) and basically no ray tracing. Why should I buy the inferior product? Today's GPUs will be irrelevant in 10 years' time. If I want ray tracing today, AMD sucks, whereas a 3060 with DLSS/FSR will deliver, is cheaper than the 6700 XT, and has more VRAM...
    If AMD went (like they did with the R9 290X vs the GTX TITAN), "Hey, here's a 6700 XT; it doesn't have the ray-tracing capabilities of the 3070, but it has 10 gigs of VRAM and is only $349," I'd be all over it. But they didn't; they got nGreedy. So thanks, but no thanks.
    p.s. I was always a fan of AMD GPUs since 2008; they always delivered value for money until 2020, when they decided that value for money has no place in their corporate world. We'll see how that works out in the crypto crash of today.
    Now I'm a fan of second-hand GPUs.

  • @ArthurDNB
    @ArthurDNB 1 year ago

    I watched everything start to finish. Sir, you've got a subscriber here. Do a 390 vs 980 next. I had both cards, and I think the 390 is still a great card to this day.

  • @bruhzooka
    @bruhzooka 1 year ago

    Hardware Lab killing it with the premium content. Do you have a Patreon?

  • @hrayz
    @hrayz 1 year ago +1

    I really wanted to see a comparison chart (or charts) at the end.
    Show averages by year, chart % by each API, etc.

    • @agsel
    @agsel 1 year ago

      But he is against summarizing and indexing, that's the whole point of the video

  • @GRAN_EME
    @GRAN_EME 1 year ago

    FINALLY someone talking about it

  • @philcooper9225
    @philcooper9225 1 year ago

    Music at 9:19 ?
    Also: Holy HELL that Doom 2016 comparison is so brutal!

  • @oscaraliisa
    @oscaraliisa 1 year ago +4

    For the same reason I picked the 390 instead of the 970/980. Also, I recently swapped a 1080 Ti for a 6700 XT. A 3070 with its 8 gigs of VRAM feels more like a downgrade.

  • @realnamesnotgiven6193
    @realnamesnotgiven6193 1 year ago

    Very good video

  • @od13166
    @od13166 1 year ago

    Really good video. I saw the NVIDIA driver video from Linus; he actually explained the 780 Ti situation well.
    TechPowerUp should refresh that performance chart by testing new games until driver support is discontinued.

  • @wakesake
    @wakesake 1 year ago +2

    You, sir, have one of the rare hardware channels worth subbing to.
    Very NICE vid, thanks for the huge effort

  • @dylon4906
    @dylon4906 1 year ago +1

    It's absolutely insane how well the 290X still holds up in modern titles. It's still perfectly usable as a low-end option for playing 1080p at medium to low settings. I have no idea why AMD killed driver support for it. I feel very bad for the people that invested more in the 780 Ti for the maybe 10% advantage it had in the popular titles around when these cards were released. I also wonder if the 290X aging so well is due to the same architecture being used in the 8th-gen consoles.
    That being said, I wouldn't be surprised at all if a similar situation ends up happening with the 3080 vs the 6800 XT. The 6800 XT launched with just slightly slower performance than the 3080, but had a much bigger framebuffer and was (intended to be) 50 bucks cheaper. It's also the same architecture used in the 9th-gen consoles.

  • @McLeonVP
    @McLeonVP 1 year ago

    13:09 the game doesn't have HBAO?
    Or SSAO?

  • @yrc4184
    @yrc4184 1 year ago

    I recently gave my MSI R9 290X Gaming to a nephew; I built a rig with an i5 4670K, and he's happy now with his "new" PC 😊 This card still kicks ass in newer games like Horizon Zero Dawn (50 fps avg on medium settings).

  • @r92king
    @r92king 1 year ago +8

    Great video. The R9 290X was a beast when it came out, but it was hot and loud, which is why NVIDIA could ask for the extra price. The same relative performance doesn't mean both GPUs will behave the same in every game and condition; think of it as a GPU performance tier. So the R9 290X, R9 390, RX 470, RX 570, GTX 970, and GTX 1060 3GB are likely to perform almost the same in most games. Back then they were neck and neck in most games, as we can see in your tests. Nvidia usually makes GPUs for the present, while AMD likes to put future-proof features in their GPUs. But people don't care about the future; they just buy the GPU which is currently the best, because they think that if the GPU becomes slow they will buy a new one, even if the other one could have lasted in their system one or more years longer before becoming slow.

    • @HardwareLab
      @HardwareLab  1 year ago +8

      But it was never relative. The difference can reach 50% or even higher in some games, but be less than 5% in others. That is why it’s not correct to speak about any equivalents; it just depends on the game A LOT.
      P.S. You don’t care != people don’t care. If you don’t care, buy whatever suits you, but whoever does care will buy something different. This is the point.

    • @ozzyp97
      @ozzyp97 1 year ago +6

      It really wasn't all that hot and loud under a proper cooler; it just got that reputation because AMD's reference design was legendarily awful. The more obvious reason is that the 780 Ti was clearly the faster card at the time; it took years of driver updates and more demanding games to really show what the 290X was capable of.

    • @varungupta8241
      @varungupta8241 1 year ago +4

      @@ozzyp97 The newer games didn't show how capable the 290X was; they showed how bad the 780 Ti was.

  • @cracklingice
    @cracklingice 1 year ago

    Kinda neat to see how these retro gaming cards compare today.

  • @305backup
    @305backup 1 year ago

    I've had an R9 280X pretty much since it came out, and it's been able to run pretty much everything at decent settings, up until DirectX 12 just straight up dropped support for it.

  • @abumy4
    @abumy4 1 year ago

    Dang! Throwing so much shade at TechPowerUp there that one might need a flashlight.
    Either way, great video! I would have liked the TL;DR advice given at the beginning of the video (the "do your own research" part - and the video shows exactly why). Otherwise, I liked how you kept some suspense about the results without shaming the 780 Ti without reason!

  • @TrashedRedPanda
    @TrashedRedPanda 1 year ago

    What's funnier is that if you look at the memory speeds, the 290X is on average almost 2x slower in memory speed, but it still either matches or comes in just slightly lower in the fps shown.

  • @likeclockwork64butbetter58
    @likeclockwork64butbetter58 1 year ago

    Nice review REALLY NICE REVIEW
    I had to comment when you showed how Metro Last Light actually favored the R9 290X in dynamic scenes

  • @McLeonVP
    @McLeonVP 1 year ago

    Can you test the GTX 1080 vs Vega 64?

  • @iansysoev9462
    @iansysoev9462 1 year ago

    Man, you're SEVERELY underrated; I hope one day you'll become as big as LTT)

  • @ALPHABYTE64
    @ALPHABYTE64 10 months ago

    This video is great

  • @demontongue9893
    @demontongue9893 1 year ago

    I remember building my first proper gaming PC after playing DayZ with onboard graphics for a year; it was a 970 with an i7 4770K, and I was blown away.
    Ever since, I can't play a game if it's not over 75 to 85 fps
    😆

  • @RANDOM-ix6pn
    @RANDOM-ix6pn 1 year ago

    Good comparison, with a lot of work behind it. I think the newest drivers could make a difference; Nvidia has better driver development, doesn't it?
    I suggest trying again with Xtreme G drivers for AMD and Nvidia.
    Good job, and new sub!

  • @parwiesshahsavari5604
    @parwiesshahsavari5604 1 year ago

    Just clicked play and didn't realize it's 50 mins long until I got to the 35th minute. Every song in there is a banger. Got a playlist, or a name for the genre?

    • @HardwareLab
      @HardwareLab  1 year ago +2

      All the track names are in the description. Not sure if there is a genre name for that or something

  • @selohcin
    @selohcin 1 year ago +4

    A few things:
    1) Everyone should understand that TechPowerUp's GPU chart is based on how each card runs games at 1080p Ultra settings for all cards up to the RTX 2080 Ti, where the chart switches to 4K. It doesn't mean "this card is ALWAYS 4% faster than that card"; it means that it is 4% faster on average across a wide range of games.
    2) Furthermore, it is not fair to test games in Vulkan on the 290X and then *not* test them in DX11 on the 780 Ti. The best API for each card should be chosen and then tested against the other.
    3) This video is good, but it is TOO LONG. You could have shortened it to 30 minutes by talking about the games' performance *while* the footage played in the background instead of waiting until it was over. Your audience has both eyes and ears, and you should make use of both.

    • @HardwareLab
      @HardwareLab  1 year ago +2

      1) And my point is that this data is just useless, because it represents nothing
      2) Didn’t you see that we are testing games to represent a specific API? I mentioned that in the introduction. Alright then, tell me in which game I should switch from Vulkan to DX11? Wolfenstein? Strange Brigade? RDR2? Definitely these games have DX11, 100% true
      3) You are not my audience, since your understanding of the video and the ideas we were trying to explain is poor. We show roughly one-minute sections to demonstrate how the game behaves in different scenes of the benchmark; that is why a few words over background test footage with numbers doesn’t show the whole picture. Moreover, the work on the music, visuals and all the timings is done for an artistic purpose, so this structure is a concept of the video

    • @cyjanek7818
      @cyjanek7818 1 year ago

      @@HardwareLab I would agree with only one thing that person said - the video is too long.
      Maybe I am not the target audience, but I liked everything about the video. The thing is, I was skipping as much of the video where no one was talking as I could (that is just my trait, really problematic, but well) and thought the video was like 20 minutes long; when I read his comment that it could be 30 minutes I was like "what? It was shorter than that".
      I don't mean it in any bad way, but I think there is no need for a longer presentation of a game if nothing special happened during that test. Obviously do as you wish, but I thought it might be useful to share that I agree with that one point, while I don't agree with anything else in his comment.

    • @HardwareLab
      @HardwareLab  1 year ago

      @@cyjanek7818 Okay, I get your point, but again, this is an artistic choice, and many other people in the comment section got the vibe we were trying to create

  • @redwalkie3552
    @redwalkie3552 1 year ago

    Good test, nicely done

  • @Mech438
    @Mech438 1 year ago

    OK, wonderful video. But why is the texture setting so high in every game? The VRAM on the 290X is almost 4 GB, which tells me the textures are too high and a limitation for the Nvidia 780 Ti.

    • @HardwareLab
      @HardwareLab  1 year ago

      The textures are on medium or low in most of the VRAM-demanding games. In GTA V they are set to high, because standard looks as bad as the PS3 version - too awful to test with a serious face. The same can be said about medium instead of low in RDR2

  • @spr4yz69
    @spr4yz69 1 year ago

    So basically, the GTX 600, 700, etc. series support DirectX 12, but you can't use it?

  • @carlospulido6224
    @carlospulido6224 1 year ago

    Top-quality content. But off that topic: TechPowerUp usually does GPU reviews for most of the top offerings of the year, so you could argue that they have some backup research done... Used correctly, the word "equivalent" (if what you look for is game/application performance) was never misplaced.
    On the other hand, mixing up the concepts of "equivalent" and "equal" for X GPU vs Y GPU - then yes, it's a clear misunderstanding. Like mixing up Thunderbolt vs FireWire or USB4...
    PS: new sub.

  • @aguascritalias12
    @aguascritalias12 1 year ago

    Great video, I still have an R9 290X 😅

  • @heinzletzte.6385
    @heinzletzte.6385 11 months ago +1

    I wonder how the older games would perform using DXVK

  • @glamdring0007
    @glamdring0007 1 year ago +2

    I bought the R9 390 8GB many years ago when it launched... My son was still using that GPU to play games at 1080p until just a couple of weeks ago, and it was still providing 60+ fps in most games at decent quality settings. I replaced it only because, after so many years, AMD has discontinued driver updates.

    • @ismaelsoto9507
      @ismaelsoto9507 1 year ago +1

      If you don't mind using modded drivers, the NimeZ drivers are quite decent; at least they let these GPUs play modern titles at acceptable frame rates.

  • @Skyshadow1
    @Skyshadow1 1 year ago

    I have a PC which I built throughout 2012-2013: an Intel 5570K and an R9 280 (or 280X) 4GB. It started with 8GB of RAM; now I run it with 16.
    Most games still run at medium to low settings, and I can even go high in less demanding titles.
    I have run into some optimization problems in the past, and even now there are games that go 100% on the CPU while the GPU is more like 50-70%.
    Unfortunately, due to budget limitations back then, I could never get 100% of the experience and performance I expected from my build; that 8GB of RAM haunted me for years, but I am not making the same mistake this time.
    But I can legit say that it runs pre-2019 titles very efficiently.

  • @afuyan
    @afuyan 1 year ago

    Interesting comparison, subscribed. GCN is literally just what it says: Graphics Core Next.

  • @pptemplar5840
    @pptemplar5840 3 months ago +1

    Relative performance can be measured based on use case. Saying card X is 105% of card Y is silly, mostly because that 105% will fluctuate based on the load.
    Basically, the way most people measure things is just bad. You can still measure relative performance as long as you don't oversimplify it, but people will always tend to simplify it.

  • @jreynol100
    @jreynol100 1 year ago +3

    I had 2x 290X in CrossFire until early last year. They served me well for a long time. I would have kept them longer if multi-GPU hadn't lost so much support over the years since 2013.

    • @likeclockwork64butbetter58
      @likeclockwork64butbetter58 1 year ago

      It's a crying shame, too. Multi-GPU had some crazy successes; Resident Evil 5 had scaling with upwards of three GPUs. Triple CrossFire always made more sense, since you could pair a dual and a single card together and use the single when CrossFire wasn't supported, the dual when regular CrossFire was better supported, and all three when the game was supported really well.

  • @alun1038
    @alun1038 1 year ago +3

    Techpowerup’s relative performance isn’t quite accurate these days. I think buyers should just watch YouTube benchmarks and comparisons, or go to Tom’s Hardware's GPU hierarchy page, which isn’t 100% accurate but is still better than Techpowerup.

    • @HardwareLab
      @HardwareLab  1 year ago +2

      The problem is not the exact database; the problem is the whole concept that different GPUs can be placed against each other on different “levels of performance” and equivalents found there. That’s wrong, it doesn’t work like this

  • @agsel
    @agsel 1 year ago

    Performance indices do have their place - unless you already know what games you are going to play and look up the performance in each title, just like you're showing here. What did your performance index difference end up being, averaged over your tests?

    • @HardwareLab
      @HardwareLab  1 year ago

      No, they don’t have any place, because they don’t represent anything; it’s just a useless number that states nothing. The difference between “equivalent” GPUs can reach over 50% in both directions, so real results never have anything in common with these numbers
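
[Editor's note] The averaging dispute in this thread is easy to see with numbers. A minimal Python sketch, using invented FPS figures (not real benchmark data from the video), shows how two cards whose averaged "relative performance" index looks close can still be far apart in any individual game:

```python
# Sketch with invented FPS numbers (not real benchmark data): a single
# averaged "relative performance" index can hide large per-game swings.

fps = {
    #  game                (card_a, card_b)
    "Game 1 (DX11)":   (90, 60),   # card A 50% faster
    "Game 2 (DX12)":   (55, 75),   # card B ~36% faster
    "Game 3 (Vulkan)": (70, 68),   # near parity
}

# Per-game ratio of card A to card B, then the simple mean of those
# ratios -- the kind of number a summary chart would publish.
ratios = [a / b for a, b in fps.values()]
index = sum(ratios) / len(ratios)

print(f"averaged index: card A = {index:.0%} of card B")  # ~109%
for game, (a, b) in fps.items():
    print(f"  {game}: {a / b:.0%}")
```

The published "109%" reads like near-equivalence, while the per-game ratios here range from 73% to 150% - which is exactly the spread the reply above is pointing at.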

  • @McLeonVP
    @McLeonVP 1 year ago

    17:54 the music in the BGM sounds like 2010 electronic pop