THANK YOU NVIDIA!! - RTX 4060 Ti Review

  • Published Dec 23, 2024

COMMENTS •

  • @LinusTechTips 1 year ago +2336

    Contrary to the table at 0:13, the RTX 4060Ti is PCIe x8, not x16. We also said the X370 chipset does not support ReBAR but that was added in a later update. We apologize for the confusion!

    • @Byfils 1 year ago +36

      Excellent job, this review was much better than previous ones.

    • @karolwesoek5350 1 year ago +9

      Is it enough for PCIe 3.0?

    • @ArchusKanzaki 1 year ago +31

      @@karolwesoek5350 I think most boards still rocking PCIe 3.0 will be bottlenecked by the CPU before this GPU is the limit....
      The argument for an x8 bus is more relevant to xx50-class GPUs, which either just run on PCIe slot power or sip power from a single 6-pin/8-pin so they don't overwhelm the poor used-office-PC PSU. The memory bus already kneecaps this GPU anyway.

    • @nemtudom5074 1 year ago +13

      Geez, that's skeezy

    • @tummie420 1 year ago +38

      Couldn't it be that Nvidia chose to 'only' go with a 128-bit bus because, whenever they paired a 256-bit bus with the new larger L2 cache, the 4060 Ti's performance came too close to the 4070's? Would be interesting to find that out.

  • @toaster98 1 year ago +1432

    Congrats Nvidia!
    After the successful un-launch of the 4080 12GB, you more than made up for it by re-releasing the 3060 Ti!

    • @JXSnWp 1 year ago +91

      Just remember, it's pronounced "tie"
      - Thanks Steve!

    • @CakePrincessCelestia 1 year ago +2

      @@JXSnWp Underrated comment.

    • @onicEQ 1 year ago +2

      MSRP adjusted for inflation too

    • @tds397 1 year ago

      @@onicEQ Inflation is BS at this point, all they want is to keep the same profit levels as they had during the mining craze. 🤡

    • @AndyViant 1 year ago +12

      Unlaunched their own 4060 Ti to replace it with the 3060 Super lol

  • @albertocv632 1 year ago +8001

    The fact that the RTX 3060ti is so close in all charts and can even beat the 4060 ti at higher res (sometimes) is just hilarious

    • @Velerios 1 year ago +288

      And the 3060 Ti might actually also be more powerful than the 4060 Ti for people on PCIe 3.0. NOBODY NEEDS PCIE 4.0 IN THIS LOW END, EXCEPT WHEN GPU MAKERS ARE SO CHEAP THAT THEY MAKE CARDS THAT USE ONLY HALF OF THE PCIE LANES. This is a PCIe 4.0 x8 card as far as I know.

    • @averagemike2171 1 year ago +227

      Makes me even happier knowing I went with a 6700xt instead of waiting on new gpus.

    • @alexfakluy3410 1 year ago +89

      Yes dude, because of the memory. The 4060 Ti needs 12GB with a wider bus, like 192-bit, to properly beat the 3060 Ti at 1440p.

    • @LynX2161 1 year ago +63

      brb, on my way to buy a pre-owned 3060 Ti and tell everyone I bought a new 4060 Ti. Nvidia has lost its mind and my money

    • @lewkehh3558 1 year ago +63

      The 3060 Ti was slept on so bad. I gave mine to my brother for his PC when I upgraded, and he's able to run games smoothly at 1440p 240Hz with no issues. Forza runs like a dream, and he plays GTA V mostly with Redux and it runs great. Hands down the best budget card ever imo; the 3060 Ti was unreal

  • @Gr1mm4 1 year ago +648

    Here at Nvidia we made a mistake with a few cards, and realised they were a bit too good and had far too much longevity for our tastes. So we went to work and made sure that won't happen again: we are proud to announce, the 4060Ti, a card that will be almost useless for new games in 6 months!

    • @wanderer7779 1 year ago +52

      Ah yes, the good old 1080 Ti. I bet that mistake won't happen again 😅

    • @morthimer 1 year ago +17

      That waste-of-sand card is useless right now; you don't even need to wait 6 months.

    • @Sydd787 1 year ago +36

      Boys, it's time we rise up.
      Let's..... go outside. I know it's hard, but you have to be a soldier and do it.
      Let's... play physical sports like soccer, volleyball etc.
      Let's get dirty. We don't need Ngreedia.

    • @silviu7883 1 year ago +8

      @@wanderer7779 wait for FSR 3.0 to get the 1080 ti final form.

    • @AceBurn90 1 year ago +6

      That means my 1080 Ti 11GB will have to survive (at least) the next generation. I can wait until The Elder Scrolls 6 is out in 2026 to 2029

  • @nothingtoseehereipromise 1 year ago +72

    I literally just got the 4060 Ti without heavily researching it first (oops). I paired it with an i7-7700T, 32GB of RAM and a 1TB NVMe SSD (Gen3). All these components were sitting around collecting dust, so I wanted to make a low-wattage gaming rig. I'm also gaming on a 1080p 180Hz monitor and so far it performs extremely well, no complaints whatsoever. It obviously crushes 1080p gaming, so I'm perfectly happy with it.

    • @insabon 10 months ago +4

      I got it for $240 so I wasn't malding about its mediocrity

    • @ishaanmurarka9082 9 months ago +1

      I have a Ryzen 3200g, 16 gb ram, 500 GB SSD and a used 1660 super (which I got for 110$ a year back). I am still crushing 1080p 60fps gaming. That's why the whole 40 series looks like a comical scam to me. 😂😂 So for the first time, when I do upgrade I think I will look at AMD or Intel this time. Nvidia is as hopeless as nokia at this point.

    • @thedeathstar420 8 months ago

      Haha me too. Thinking of returning it and spending the extra $100 to get the 4070

    • @Ragssssss 8 months ago +2

      ​@@thedeathstar420 I'd save those 100$ for future upgrade. And, when the right time comes I'll sell 4060 ti + use these 100$ to get new GPU.

    • @gilbert8667 8 months ago

      I'm having no issues with it, I don't need 4k, and in most cases, I'm at 1440 easily at 60fps. I'll just have to upgrade a little earlier than I want. No biggie.

  • @philippesoares1745 1 year ago +1415

    God, I hope Intel continues its efforts in the GPU market.....
    A new player really is needed...

    • @gorkskoal9315 1 year ago +19

      I wonder if they'll go Google and call it quits after just one full card. What I don't get is how they can do so badly. The company has made integrated graphics since forever, and CPUs since the beginning.

    • @pelonix 1 year ago +77

      @@gorkskoal9315 They are not quitting. Speaking as an Arc A770 16GB LE owner, they are making strides with driver improvements. Battlemage will be legendary

    • @logandeathrage6945 1 year ago +36

      @@gorkskoal9315 Intel has at least 2 more generations from what they said a couple months ago. I just hope Battlemage has good price-to-performance like ATI/AMD used to, because many people would jump to it.

    • @thetheoryguy5544 1 year ago +34

      @@gorkskoal9315 Do so badly? Linus just showed where the A770 is beating the 4070 Ti in some tasks and isn't far behind in most. Drivers are constantly improving each month, and the raw power in the card can surpass the 3070 in time, all for less $$$. I'd say they are doing amazing for their first time being serious in the GPU market. They are definitely committed for two more gens, Battlemage and Celestial: Battlemage is aiming for 4070 to 4080 performance from what I hear, and Celestial will probably be on par with whatever the 50-series midrange offers (5060 Ti/5070). And prices will more than likely not increase much, maybe $400 to $500 for the top-end model. I think once more people feel safe with the drivers, which are already pretty stable, they will sell well AND Intel will keep new Arc cards rolling out.

    • @philtkaswahl2124 1 year ago +15

      @@gorkskoal9315 What meme echo chambers have you been staying in? The fact that the company only did iGPUs for so long is not only taken into consideration by the tech commentary space, it was a major concern: people worried Intel would have trouble tackling a discrete GPU with only iGPU experience, since iGPU =/= discrete GPU, especially a discrete GPU with all the features Intel was promising, and they'd be competing with a duopoly with literal decades of discrete GPU driver experience.
      And they did have trouble, so it very much was _not_ a surprise, and people were shitting on them for the totally expected outcome. Intel actually did better than expected, with the hardware being decent (people were expecting they'd botch that too) and most of the problems being the expected driver inexperience and bullshit interface software. The shitting on them has become less loud with the driver improvements over the last several months.

  • @OhItsThat 1 year ago +2058

    Can’t wait for the 5060Ti with 2GB of GDDR5. That’s when Nvidias line of retro gaming cards will really hit its stride.

    • @pfizerpricehike9747 1 year ago +206

      Don't forget the 64-bit bus width, the PCIe 4.0 x4 connection and the decreased shader core count.
      But it will feature DLSS 4.0:
      Get 4 inaccurate fake frames - unusable in anything but a slow-paced offline point-and-click adventure game - for every upscaled frame produced.
      Oh, did we forget to mention:
      To keep manufacturing costs down, upscaling will be on by default.
      Now to the price:
      It's just $600 for the 1GB and $750 for the 2GB variant

    • @pranavmohite8544 1 year ago +84

      "retro gaming cards" 🤣🤣

    • @chrisbinwang2119 1 year ago +53

      can't wait to sell my 1060 6GB for $800

    • @NoxNyctores427 1 year ago +48

      $700 cards baby, can't wait to play Unreal Tournament 2004 on that beauty

    • @SD-tj5dh 1 year ago +5

      I can't wait to spend $500 on a brand new CGA card

  • @MeTheOneth 1 year ago +1244

    Looking around the tech channels, this looks like the most positive review of the 4060 TI today, and the conclusion was "Buy Arc A770 instead". Amazing.

    • @vezz9913 1 year ago +117

      yeah this review is almost shilly in comparison to how bad the price perf is

    • @EuphoricDan 1 year ago +118

      Jayz2Cents said it was a good card but took the review down. He has an apology video up now lul

    • @LaskaiTamas23 1 year ago +7

      TPU gave it a highly recommended badge 😂😂

    • @SpaceRanger187 1 year ago +35

      @@LaskaiTamas23 people donate to hot tub streamers so people will still buy this.. gamers are .....idk what to even call them these days

    • @fellwind 1 year ago +112

      LTT needs to make the sarcasm more obvious.

  • @johngelnaw1243 1 year ago +34

    You know, if they'd shipped it as the 4050 Ti for $250, they'd have a winner on their hands.

    • @AaronShenghao 1 year ago +11

      Heck, if they shipped it as a 4060 at $300, it would still be a winner...

    • @chilldoc9638 1 year ago +2

      250? Keep dreaming

    • @hirakuotaku5386 1 year ago +1

      @@chilldoc9638 where

    • @chilldoc9638 1 year ago +1

      @@hirakuotaku5386 in your head, because you're not getting it anywhere else for that price

    • @Dell-ol6hb 1 year ago

      @@chilldoc9638 No shit, that's why they said "if". Clearly Nvidia is likely never selling GPUs at that price point again

  • @sourcelocation 1 year ago +5013

    At this point Nvidia really thinks we are stupid lol

  • @Neoxon619 1 year ago +2193

    This is why competition is important, because without it we have companies like Nvidia gouging us for GPU prices.

    • @yosutzuhruoj 1 year ago +106

      Wouldn't make a difference. People keep buying. They can slap a 5x increase of price at this point and it wouldn't hurt them at all, people have disposable income, or they save up for the latest and greatest GPU.
      The money isn't even in the consumer market anymore.

    • @RichardServello 1 year ago +7

      They would be even worse!

    • @paulcox2447 1 year ago +25

      We have competition. It's called AMD

    • @whattheduck69 1 year ago +25

      Your wallet is king.
      Keep hold of your money because someone will want to sell you something at some point. The decision is ultimately ours.

    • @Shakzor1 1 year ago +18

      Well, the competition is busy sniffing glue and pricing their less appealing products similarly high, rather than trying to gain market share with sensible prices

  • @supreme_tech_reviews 1 year ago +56

    My RX 6700xt aged very well. At the time of purchasing, I was disappointed that the 3070 was out of stock. But in hindsight, it was a great purchase

    • @calmkat9032 1 year ago +4

      As a 3070 user, it's also aged very well, if you can even call 2 years "aging". So not only is Nvidia being beaten by AMD, it's being beaten by its own recent past lol.
      Edit: not saying this to be an Nvidia fanboy; I only got it because I had a great opportunity with a prebuilt, and I actually use DLSS 2.0

    • @pauloazuela8488 1 year ago +2

      @@calmkat9032 You know they messed up when not only does their competitor hold its own against them, but their older product lineup beats the newer one 😂

    • @paulbantick8266 1 year ago +1

      ​@@calmkat9032 "So not only is Nvidia being beaten by amd, its being beaten by itself from just recently lol."
      That's due to NVidia's new 'Self-Flagellating' architecture.

    • @GeneralS1mba 1 year ago

      @@paulbantick8266 Nvidia's architecture was so good, but their scalped price margins were too high to ignore, so they shifted their dies up a tier to upsell us consumers :(

  • @radomiami 1 year ago +471

    Hold on, gotta upgrade from the 3060 Ti to the 4060 Ti real quick.
    (opens MSI Afterburner)

    • @averagemike2171 1 year ago +22

      💀

    • @PannyLanny 1 year ago +3

      imagine undervolting 4060ti tho

    • @roastinpeace2320 1 year ago +2

      @@PannyLanny My undervolted 3060 draws about 85W-115W tops (mostly around 70W) while delivering all of its performance. So the 4060 Ti can probably be undervolted to about the same level, maybe even further.

    • @bloodyzbub6213 1 year ago

      @radomiami.....Best comment here. I actually lol'd.

  • @Skellist 1 year ago +1362

    This card is just straight up e-waste. Good job, Nvidia!

    • @dai4358 1 year ago +11

      true

    • @mentalretardation69420 1 year ago +21

      more like paperweights that think of circles and then commit suicide right after

    • @ambhaiji 1 year ago +45

      Not e-waste; the 4060 Ti 16GB would be a good buy at $299.

    • @BloonMan137 1 year ago +110

      @@ambhaiji But it's not $299, so it's still garbage.

    • @hollownexus9316 1 year ago +83

      @@ambhaiji 4090ti would be a good buy at $699. what is your point?

  • @kaystephan2610 1 year ago +342

    This gen of cards is the best advertisement for RX6000 cards AMD could've gotten.

    • @groenevinger3893 1 year ago +5

      why?

    • @Pazo139 1 year ago +14

      I would have gotten a 4070 if it weren't for the 12GB, so I went with a 7900 XT

    • @Justachamp772 1 year ago +1

      fr

    • @Justachamp772 1 year ago +18

      @@groenevinger3893 Bro, Nvidia fucked up so bad that it made AMD's last-gen 6000-series GPUs look like an even better option, and with the huge price cuts you can get a 4070-class GPU for less than a 3060 Ti's price.

    • @groenevinger3893 1 year ago

      @@Justachamp772 So actually you're saying you'd be better off buying the 4070? Then why even bring up the 6000 generation? AMD still lags in RT.

  • @jasehipolito8209 1 year ago +64

    As someone who bought a 3060 OC during the pandemic, this makes me happy that I'm not missing out on much. It even has 12GB of VRAM 🥴

    • @nisnast 1 year ago +2

      Same tbh

    • @fynkozari9271 1 year ago

      Scamvidia. How long do I wait for the $700 MSI 4070 Slim to drop to $500? 2 years?

  • @husselbruss5124 1 year ago +767

    Honestly looking back, the humble RTX 3060 with 12GB of VRAM was an absolute miracle that did not get appreciated nearly enough at the time.

    • @steve_bisset 1 year ago +51

      This ^ I bought a 3060 12GB to keep my old gaming PC going, then built a new gaming PC with a 3070 Ti a year later. That 8GB 3070 Ti isn't looking too good now, but it's not like there's much alternative in the 40 series anyway 😂

    • @kodykj2112 1 year ago +17

      I have a 3060TI and it's been really good so far but I'd sure love some more memory

    • @FeTiProductions 1 year ago +31

      Yes it was. A lot of people wanting to justify the 3070 and 3080 say the 3060 should've been an 8GB card, not realizing it's the whole Ampere stack that needed way more VRAM; the 3060 was the only gem of the crop

    • @Evbuscus1 1 year ago +10

      I got a 3060 for my first "workstation" PC and it feels pretty good to be able to run UE5, Photoshop, and ten different browser tabs without it breaking a sweat. You really feel all that space in VRAM.

    • @ghjong001 1 year ago +15

      I think you mean the 6700XT. Really, pretty much the entire RDNA2 lineup at current prices looks absolutely amazing compared to the current generation from both companies right now.

  • @SamiulIslam-dr3ew 1 year ago +354

    feels good to see Intel making its way up into the "worthy of being compared with" list
    really excited to see Battlemage

    • @SamiulIslam-dr3ew 1 year ago +1

      @@Blackfatrat I thought of getting one for my new build as well, but it still isn't available in my country. Hope in 5 years we'll do our mini scrapyard war with an A770 and a 13700K

    • @chronometer9931 1 year ago +2

      We might actually get to see a shake-up in the graphics industry next generation thanks to Intel. The A770 made me really excited for Battlemage as well; Intel is crushing it!

    • @MrMaddog2004subscribe 1 year ago

      I would buy if they had more powerful GPUs, but right now I can't justify going from a 3060 to an A770

    • @chronometer9931 1 year ago

      @@MrMaddog2004subscribe Then you must be excited for Intel battlemage?

    • @MrMaddog2004subscribe 1 year ago

      @@chronometer9931 what's that?

  • @HoloScope 1 year ago +309

    Nvidia did a great job with making me consider going with ARC. Which I never thought I’d consider. Thanks Nvidia!

    • @Gajaczek93 1 year ago +14

      16gb arc enjoyer here, if you value peace of mind don't do it. If you like at least a tiny amount of tinkering you can do it. If you want sff build with arc like me- fuck no

    • @HoloScope 1 year ago +45

      @@remeuptmain This isn’t twitter ya degen

    • @Skyflairl2p 1 year ago +18

      @@Gajaczek93 "enjoyer" > proceeds to display discontent. I 100% believe you are indeed an arc owner.

    • @zairman 1 year ago +2

      Been using an ARC 770 for a while now and it's been great!

    • @madprophetus 1 year ago

      Always Blue

  • @テスタロッサ 10 months ago +4

    So what is the problem with having a 128-bit bus with 32MB of L2 cache rather than a 256-bit bus with 4MB of L2 cache?
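A rough back-of-the-envelope answer to the question above (an illustrative sketch; the bus widths and GDDR6 per-pin speeds are the published specs for these two cards, the interpretation is a simplification): the narrow bus cuts raw bandwidth sharply, and the big L2 only compensates while the working set fits in cache.

```python
# Rough theoretical memory-bandwidth comparison (illustrative sketch).
# Bus widths and GDDR6 speeds are the published specs for these cards.

def mem_bandwidth_gbps(bus_width_bits: int, gddr6_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin data rate / 8."""
    return bus_width_bits * gddr6_gbps_per_pin / 8

rtx_4060_ti = mem_bandwidth_gbps(128, 18.0)  # 128-bit, 18 Gbps GDDR6
rtx_3060_ti = mem_bandwidth_gbps(256, 14.0)  # 256-bit, 14 Gbps GDDR6

print(f"4060 Ti: {rtx_4060_ti:.0f} GB/s, 3060 Ti: {rtx_3060_ti:.0f} GB/s")
# The 32MB L2 raises the cache hit rate so fewer accesses pay the VRAM
# cost; at higher resolutions the working set outgrows the cache and the
# raw-bandwidth deficit shows up, matching the benchmark pattern in the
# comments above.
```

So 288 GB/s vs 448 GB/s on paper, which is why the 3060 Ti can pull ahead at 1440p and above.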

  • @bobbygetsbanned6049 1 year ago +274

    If I was NVIDIA I would be pretty damn embarrassed to see the 6700 XT go toe to toe with the 4060 Ti. Let's hope AMD doesn't screw up the 7600 and 7700 pricing, so the whole 4060 series can be totally irrelevant.

    • @Tomazack 1 year ago +13

      It would be an easy win for AMD, launching their mid tier cards at a competitive price. 6700XT and 6800XT are already really good deals, make the 7000 series a little bit better without being greedy and Nvidia will have to rethink their strategy.

    • @oxfordsparky 1 year ago

      it goes toe to toe in raster only, add in literally any other factor and the 6700xt loses big time.

    • @uhis1686 1 year ago +7

      @@oxfordsparky your point?

    • @IdeasAreBulletproof 1 year ago +4

      @@oxfordsparky In ray tracing the 6700 XT has the 12 gigs, so as more and more ray tracing workloads use more than 8GB, the 4060 Ti 8GB will become completely unusable while the 6700 XT still gets some framerate. The 16GB card may be more usable, but it really depends on whether the ray tracing load is heavy on the bus. In content creation, a 4060 Ti (16GB variant) probably compares to or beats a 3080, noting that a 4070 beats a 3090 Ti in content creation and probably beats an upcoming 7950 XTX and the 7900 XTX in Blender, so that point still stands.

    • @IdeasAreBulletproof 1 year ago +1

      AMD needs to price the 7600 below the OG 4060 with performance in between it and the 4060 Ti, i.e. performance around a 6700 XT. I think a good price for a 4060/4060 Ti killer would be between $200-$280.

  • @TheBrando212 1 year ago +563

    Any chance of an ARC revisit video soon? Would really love to hear your team's thoughts on how much the drivers have improved.

    • @MsNyara 1 year ago +8

      I know some people with the card; it is still a heavily bugged mess, but it has improved a lot. I absolutely do not recommend the card for non-tech-savvy people or people without patience for crashes and bugs, since you will experience a lot of those, and it will take years for them to solve the majority of them, if Intel doesn't drop the venture first. But when you hit a game that is not bugged (usually DX12), they are a great bang for the buck.
      For the less adventurous, the RX 6500 XT, RX 6600 and RTX 3060 12GB sit in the same price range with comparable performance and way better energy efficiency, though the Intel offerings might improve past their performance level with time, or not; time will tell. The A350 is also nice if you need cheap AV1 encoding for less than $150, but the card is kinda doo-doo for gaming.

    • @arnox4554 1 year ago

      I think they may wanna wait until Intel releases either another huge driver update or another card altogether.

    • @gorkskoal9315 1 year ago +2

      Intel's Arc GPU? It's emulating DirectX 9 and is, frankly, a bug-ridden pile of shit compared to the other options. At an effective 400 dollars before tax and shipping you can get a 3060 Ti, 3060, 1060 Ti and 1060, and, well, you get the idea: a few cards that do just about as well without half the suck.

    • @Rickyc12s 1 year ago +4

      As someone daily driving the A750, it's perfectly fine. I've had no issues playing anything (Ark, Warcraft 3, Rocket League, Tomb Raider, Cyberpunk and GTA 5 mainly) - and I never had any issues from day one, about a year ago, when I bought it.

    • @Gramini 1 year ago

      @@gorkskoal9315 Don't they now use dxvk in their drivers to translate D3D9 to Vulkan?

  • @diegoleiva7242 1 year ago +106

    Happy to see the Arc A770 punching here and there, even if it's a mixed bag of results all the time. I expected it to catch up at 1440p and over. Oh well. EDIT: In GN's review it matched the 4060Ti on Cyberpunk at 4K. Not bad!

    • @iconoclast485 1 year ago +20

      With Intel's engineering team I am kinda looking forward to what they do with their Battlemage architecture.

    • @KarrasBastomi 1 year ago +3

      For a card priced at $250 it's a steal

    • @Kholaslittlespot1 1 year ago

      If I switch from NV, I'm hoping it will be Arc... But I like top end cards, so we will see.

    • @roberthumphrey1392 1 year ago

      @@remeuptmain You're responding to everyone's comment with the same shitty ratio reply; get a grip, kiddo 😂😂

    • @thisnowthen 1 year ago

      @@KarrasBastomi If AMD didn't exist. Unfortunately, AMD exists and their cards are way faster at $250

  • @Somatom_Man 6 months ago +5

    It is sad that all these influencers have decided they can get more response and comments by jumping on the 4060 Ti negatively. I just installed mine and love it to death. It is massive, impressive and fast. It has the newest 12-pin high-power connector, and with inflation factored in I think it is a great value. These people have a system they use when making videos: YouTube tells them what viewers will pay more attention to, and they respond in kind. Hate videos often do better than informational videos.

  • @TeamixlYT 1 year ago +54

    "We realized that we were giving consumers cards that are too future proof... so we decided to make each new generation become obsolete in a years time!"

    • @bipbop3121 1 year ago +5

      It costs more than I want to spend and it won't even play today's games the way I want to play them. Never mind future-proofing; it's failing now

  • @joemarais7683 1 year ago +227

    Thanks for the great review of the RTX 4050ti. This gpu seems very reasonable at $250 and I’m sure the budget market will be happy to finally have a marginal upgrade over the 2060 Super.

    • @Zack_Mee_Nhatz 1 year ago +3

      This needs to be said out loud.

    • @wylde_karrde 1 year ago +12

      Jokes aside... even the 2060 super has 8gb of vram, lol. This card isn't an upgrade for anyone unless they're on a 1060 or older imo, and even then, still better options for the price.

    • @Padopoulosman 1 year ago +2

      2023 GPU market (Good Ending)

  • @longview3k69 1 year ago +539

    The Arc A770 has been getting huge performance upgrades since release, and it would be interesting to see a video on how far it's come. Probably not a "30-Day Challenge" video, but a value-proposition comparison against the other cards currently around its class.

    • @awesomestuff2496 1 year ago +1

      Is ARC a separate company or part of AMD??

    • @joseavila-marquez6085 1 year ago +47

      @@awesomestuff2496 It's intel's graphics card division

    • @scroopynooperz9051 1 year ago +2

      But isn't Intel scaling down their GPU division operations? How long can they continue this run of performance increases?
      Intel is a bit of a dark horse in this GPU space, but we all should absolutely be cheering them on.

    • @longview3k69 1 year ago +27

      @@scroopynooperz9051 As far as I know, those were rumors as word got out that Intel was closing or downscaling their GPU division.
      BUT Intel has since dispelled the rumors and released a roadmap saying that they will work on GPUs from Battlemage (B-series) up to their Druid (D-series) architecture. It's at least guaranteed that Battlemage will come out by the end of 2024.

    • @gurdit11 1 year ago +2

      This.

  • @jea6141 1 year ago +2

    Jedi Survivor crashed on my 4090 with 24GB of VRAM, with the notification "not enough VRAM available..", at max settings with RT on. What a joke!

  • @Scitch87 1 year ago +167

    I feel like it should've been mentioned that the 4060 Ti has an x8 PCIe 4.0 connection.
    Would've been great to see you test the 4060 Ti on a PCIe 3.0 system to see how much performance might be lost.
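For context on why the x8 link matters on older boards, here is a rough theoretical-bandwidth sketch (per-lane rates follow the PCIe 3.0/4.0 specs; this is an illustration, not a benchmark):

```python
# Theoretical PCIe link bandwidth for the 4060 Ti's x8 connection.
# Per-lane raw rates: PCIe 3.0 = 8 GT/s, PCIe 4.0 = 16 GT/s, both with
# 128b/130b encoding overhead.

def link_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Theoretical link bandwidth in GB/s for a given PCIe generation."""
    per_lane = {3: 8 * 128 / 130 / 8, 4: 16 * 128 / 130 / 8}  # GB/s per lane
    return per_lane[gen] * lanes

print(f"PCIe 4.0 x8: {link_bandwidth_gbs(4, 8):.1f} GB/s")
print(f"PCIe 3.0 x8: {link_bandwidth_gbs(3, 8):.1f} GB/s")
# In a PCIe 3.0 slot the x8 card is limited to roughly half the link
# bandwidth it gets on a 4.0 board, which is why reviewers wanted to see
# a Gen3 test run.
```

That works out to about 15.8 GB/s on a 4.0 board versus about 7.9 GB/s on a 3.0 board for an x8 link.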

    • @pfizerpricehike9747 1 year ago +12

      I guess that's part of the agreement; Nvidia would never allow that.
      Just like they wouldn't allow similarly priced higher-tier AMD cards to be shown in the same chart

    • @AzraelAlpha 1 year ago +2

      The card will not bottleneck the PCIe lanes or bandwidth on PCIe 3 so the difference is minimal, almost within margin of error.

    • @Spentalei 1 year ago +9

      x8 is whatever, x4 is where the fun (read: not fun) begins.

    • @CrocoDylianVT 1 year ago

      You don't lose any on 3.0, just a bit lower 1% lows

    • @alexhades69 1 year ago +2

      @@CrocoDylianVT You do, actually. These days texture streaming is a thing, making the bus the bottleneck. 128-bit is a joke no matter how you look at it.

  • @Not_A_Serious_Guy 1 year ago +824

    It's safe to say that AMD really did well in coming back with a bang in the market

    • @Soraviel 1 year ago +2

      Yep

    • @th1jzzz531 1 year ago +47

      Love my 6750 XT and probably soon a 6950 XT. Imo no one should be buying Nvidia rn because of the insane greed and awful price-to-performance compared to team red. Don't know why so many people are still buying cards like the 3060, which is awful value

    • @najeebshah. 1 year ago +20

      How lol? They got destroyed at the top end, and they have yet to launch a lower-end card than the 7900 XT and XTX

    • @sreyass1118 1 year ago +64

      7900 xtx is better than 4080 and it's cheaper

    • @F1ll1nTh3Blanks 1 year ago +8

      Well for the last gen cards at least. 6xxx are kinda getting the last laugh here.

  • @ZaikoEAG11 1 year ago +263

    This generation of cards (both companies) is a tough sell... the more (generations) we progress the more caveats we hear to justify the price...
    The 1080ti was a miracle. Best purchase ever.

    • @fajaradi1223 1 year ago +11

      They'll be the best cards to buy in 2025

    • @mrcaboosevg6089 1 year ago +13

      I think AMD are better, but they've both lost the plot with their pricing. I've got a 2080 and see no reason to upgrade

    • @gitrekt-gudson 1 year ago +1

      @@mrcaboosevg6089 You see no reason to upgrade from a 2080? Do you not play newer games? I am on a 2080ti and absolutely ready to be done with it. Ordered a 4090.

    • @xxstealerxx 1 year ago +3

      @@mrcaboosevg6089 2080 is just bad for modern gaming. 20 series is a flop

    • @11Stormtrooper 1 year ago +5

      still rocking my 1080ti as well!

  • @JacketCK 1 year ago +3

    whoever fucked up the table at 0:12 is so getting fired 💀

  • @Ja_Schadenfreude 1 year ago +227

    This card is the 4050 and should be MSRP at $249.99. $279.99 for the 16 GB version. AND discounted a few months in by $50.

    • @ogaimon3380
      @ogaimon3380 1 year ago +2

      not in my wildest dream 😂

    • @KingLarbear
      @KingLarbear 1 year ago +35

      I totally agree with you. They're slowly trying to move the line so that you'll think this is a 4060 but it's more like a 4040

    • @FerrisMacWheel
      @FerrisMacWheel 1 year ago +9

      Or should come as a bonus when buying a pack of Doritos

    • @knocks42
      @knocks42 1 year ago +9

      Impossible!!! If they do that, how could Jensen afford his new leather jacket???

    • @disguiseddv8ant486
      @disguiseddv8ant486 1 year ago +11

      Nvidia is pulling the 4080 12GB/16GB model scheme again with the 4060 Ti 8GB/16GB models.

  • @aceofclubskid
    @aceofclubskid 1 year ago +108

    Just a friendly reminder that the rx 6950XT is going for about $600 still. It has gaming performance on-par with a 3090, only suffering in ray-tracing and productivity tasks.
    If you're primarily a gamer, it's really worth taking a look at this card, it's an unbeatable value.

    • @pauloa.7609
      @pauloa.7609 1 year ago +6

      Very high energy consumption though.

    • @aravindpallippara1577
      @aravindpallippara1577 1 year ago +32

      @@pauloa.7609 better than 3090 however

    • @pauloa.7609
      @pauloa.7609 1 year ago +4

      @@aravindpallippara1577 true but that doesn't lower its outrageous power consumption, right?

    • @pauloa.7609
      @pauloa.7609 1 year ago +6

      @@remeuptmain true, I consider anything below a 4090 a waste of sand but I don't want to pay 2000+ euros for it even if I could buy literally 100 cards because that would be burning money away. These cards need to be half of the price to be worth it, all segments, from both nvidia and amd.

    • @xTurtleOW
      @xTurtleOW 1 year ago +7

      @@pauloa.7609 So the only card you think is good you are too poor for. Nice man

  • @DerpySwag
    @DerpySwag 1 year ago +255

    Dude, the graphics for the graphs are absolutely SPOT ON. LTT have clearly listened to their community's feedback on how clear the graphics could be, and have made massive improvements. Thank you so much

    • @thanos879
      @thanos879 1 year ago +9

      @@muyoso that sounds like a personal problem you have 😂. Are you unhappy in life

    • @nRuaif
      @nRuaif 1 year ago +10

      @@muyoso This is a video, not a presentation. You know you can pause right?

    • @bakedbeanfanclub
      @bakedbeanfanclub 1 year ago

      ​@@muyoso It's different approaches, and that's OK. GN includes a large variety of cards, but LTT reruns benchmarks for every card in every review. One isn't better than the other, but it shouldn't be surprising that GN's breadth and depth of data is bigger.

    • @_aullik
      @_aullik 1 year ago

      @@muyoso Those are different styles. GN shows a lot, but I have to pause every time he shows a graph or I will miss something important. LTT shows a lot less, but what they show I can see at a glance.

    • @yvan2563
      @yvan2563 1 year ago

      I do wish they would skip the weird squares in the background; they make it harder to look at the content.
      edit: it's the LTT logo... it makes it look like crap in 480p.

  • @dezz_YT
    @dezz_YT 1 year ago +3

    Not me and my 100 dollar laptop knowing damn well we can never afford that💀💀💀

  • @Alzorath
    @Alzorath 1 year ago +87

    I actually got a killer deal when I got my 6700XT (was over $100 cheaper than the 3060 non-Ti at the time) - and can 100% agree, as long as you don't need ray tracing, it holds its weight insanely well without breaking the bank.

    • @SinisterSlay1
      @SinisterSlay1 1 year ago +11

      I've turned on ray tracing on my 6700 XT and it works fine. Totally usable. Nvidia does not have a monopoly on ray tracing

    • @Alzorath
      @Alzorath 1 year ago +3

      @@SinisterSlay1 oh, it's definitely usable with it, it just takes more of a performance hit from it than nVidia last I tried it.

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago

      ​@@xenstence AMD shill

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago +1

      Bugs, glitches, crashing, half your library barely works.. Buy a 3060, you will be amazed how optimized Nvidia is

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago

      @@xenstence OK shill. Have you played any real games? Games not made specifically for AMD powered consoles

  • @ninjadev64
    @ninjadev64 1 year ago +397

    I'm so happy to see Arc finally getting some recognition. It's just been hated on all across YouTube.

    • @CleonofAthens
      @CleonofAthens 1 year ago +31

      OK, as an Intel Arc A770 LE owner, it is very much deserved. This card doesn't do well enough in modern games vs an RX 6600 to justify your entire library of older games being a buggy mess that doesn't work at all 20% of the time due to game-breaking bugs. I've had the card 1 month because I listened to everyone complain about VRAM, and who cares about VRAM when the experience is literally garbage. Intel has 3 different programs to control the card's software, and the card can't even boot correctly from sleep mode 8 months after launch. The performance is only sometimes good in optimized, brand-new, sponsored DirectX 12 games, and it is a joke for everything else. I replaced an RX 480 8gb and I feel like it did a better job than the Arc card. While I may only be getting 40-ish FPS, it's better than playing a glitchy mess with high FPS because of emulation. The AV1 encoder also doesn't work half the time in OBS for no apparent reason either. There are entire forum posts filled with bugs that have been known for months, and they will never get to them because the ROI just isn't worth it for them. Just buy the RX 6600 for $200 on a budget and upgrade whenever you have an issue. Because think of this: spend $200 now, keep the $80-150 you'd spend to get to the RX 6700/6750 XT series, throw it in a stock market portfolio, and use it to buy a brand new RX 8600 XT realistically, or something like what that might be. It'll be the latest product with the latest feature set as well.

    • @poochyenarulez
      @poochyenarulez 1 year ago +2

      @@CleonofAthens wow its still bad? I thought the drivers were good at this point.

    • @TheFriendlyInvader
      @TheFriendlyInvader 1 year ago +11

      @@poochyenarulez meh, I'd give it another year to smooth out. Point being that the Arc cards are going to age much better than this piece of shit card. There is no improvement to come for the 4060 Ti; this is it. It's done, it's garbage forever, but Arc still has room to stretch, and provided Intel stays true to supporting their GPU division it will get better.

    • @TheFriendlyInvader
      @TheFriendlyInvader 1 year ago +2

      ​@@RyTrapp0 it's not going to take nearly as long as you'd expect. This may be hard to believe, but Nvidia basically had to nuke all driver optimizations after Turing. This is also why post Turing a significant number of older titles encounter strange artifacting or similarly terrible DX8/9 performance when compared against older generations of cards. The only exception to this being the rare few titles which are still popular and on DX9, which amounts to essentially none now that CS has moved to source 2.

    • @EbonySaints
      @EbonySaints 1 year ago +6

      ​@@poochyenarulez ​As someone else who uses Arc... they're better... 😐 They're not "dumpster fire" tier like before January, but seeing as how they now just got to the point of having Arc Control launch at start without requiring UAC and now enabling fan control, they have quite a ways to go. Granted, Intel has put in a lot of leg work to fix them, but starting from so far behind means that DX11/10 performance is a bit janky. DX9/VK isn't great either, though that might just be because I play janky map games and I have the thing clocked as high as it will go while not BSOD'ing every five seconds. It's considerably better than at launch. Also, DX12 performance is insane.
      I personally don't stream, so I don't know how nice it plays with OBS, but in Handbrake the thing is a beast.

  • @farhanmahalludin
    @farhanmahalludin 1 year ago +185

    The 4060 Ti made me realise how insane the value and price-to-performance ratio of the 3060 Ti was back in 2020. I was lucky to get one at MSRP when it launched.

    • @sirmongoose
      @sirmongoose 1 year ago +11

      same straight from EVGA before they stopped making nvidia GPUs for good (LMAO)

    • @pfizerpricehike9747
      @pfizerpricehike9747 1 year ago

      But the card was crippled, like the 3070 and 3070 Ti, with a mere 8 gigs, knowing you'll be forced to upgrade much quicker

    • @sirmongoose
      @sirmongoose 1 year ago +1

      @@pfizerpricehike9747 The 3060 XC has +4GB VRAM with +36% bandwidth compared to the 4060 Ti, but lower L2

    • @farhanmahalludin
      @farhanmahalludin 1 year ago +1

      @@pfizerpricehike9747 Yeah 8 GB of VRAM was a concern even back then in 2020 but it wasn't as big of a deal unlike today.

    • @StreetPreacherr
      @StreetPreacherr 1 year ago +1

      Hell, I'm still enjoying my 1080 Ti purchased on its release day, which can still render most new games at 60fps at max settings as long as you're happy with 1080p resolution, which is the best my TV/monitor is capable of anyway.

  • @Joel-ew1zm
    @Joel-ew1zm 1 year ago +2

    Power consumption, heat and noise are big for me. I don't know if I speak for many PC gamers, but I tend to run discarded professional workstations as a platform for my gaming PCs. I can usually get them for free once they're a few generations old. These PCs, while powerful and able to support discrete GPUs, don't have tons of airflow, and their power supplies don't have the same wattage or PCIe cables as aftermarket gaming PCs do. For me, staying well below 200W for the GPU is a hard requirement. I am finally able to replace my 1070 Ti with something that is nearly twice as powerful while staying in (actually below) my existing card's 180W thermal envelope. The same cannot be said for the AMD cards or even the 4070 series.

  • @SilensMort
    @SilensMort 1 year ago +353

    Steve at GN said this card is "bordering on a waste of sand," and honestly, I think that's more accurate than you're willing to get on this review.

    • @Ja_Schadenfreude
      @Ja_Schadenfreude 1 year ago +83

      Yeah, definitely disappointed in LTT on this one. This card simply is not good value.

    • @kimbroslic3
      @kimbroslic3 1 year ago +40

      Gotta keep Nvidia happy so they can keep getting review samples lmao

    • @sweetsurrender815
      @sweetsurrender815 1 year ago +17

      @@Ja_Schadenfreude Not good value but it's a good card in itself. That's what they said. Get something else, but if price was better this would be good.

    • @ferretsmiles
      @ferretsmiles 1 year ago +45

      @ww7285 Do you get off on hating? This wasn't a positive review; the conclusion is literally that the pricing is designed to get you to spend more money than you should, but that luckily the competition at this price is pretty good, even including Intel. How can this be anything but negative when one of the conclusions is "maybe try Arc instead"?

    • @fvdeddrift
      @fvdeddrift 1 year ago +5

      It's on par with the 11th gen Core CPUs! An actual true waste of sand, since 10th gen was generally better in every way... But this GD Furry60 shit is coming real close. Lol! I know the x8 vs x16 thing is probably not a huge deal for most... But fuck me! Who thought that would have been a good decision? Simply from an online trolling standpoint, I intend to make fuckin fun outta that... And the decreased bus... And... And... WTF?

  • @jensenhuangnvidiaCEO
    @jensenhuangnvidiaCEO 1 year ago +50

    it might be only 10% faster, but the most important part. And this is the part that really matters. Its profit margins are 100% more than the previous card.
    Thanks for the leather.

    • @Kakuruma
      @Kakuruma 1 year ago +7

      Can you give me a 4090? I'll give you a pretty leather jacket and I'll lie about the 4060Ti being good.

    • @masiosareanivdelarev562
      @masiosareanivdelarev562 1 year ago +1

      Nice!

  • @unsignedlong
    @unsignedlong 1 year ago +192

    The moral of the story is that if you're able to find an RTX card on a deal, you can be 100% sure that there is an RX card somewhere that performs the same or even better at a lower price. Which is only NVIDIA's fault and nobody else's. Great work!

    • @aleksandarlazarov9182
      @aleksandarlazarov9182 1 year ago +1

      "Yesterday's performance for yesterday's price!" I feel like the prices of used 3060 Tis and lower won't go down much with the release of this one, there's just no incentive. :(

    • @AdamKuzniar
      @AdamKuzniar 1 year ago

      unless you do any pro work and need those CUDA and AI cores

    • @yukisnoww
      @yukisnoww 1 year ago

      Nvidia's cards always commanded a premium because features and fanbois

    • @unsignedlong
      @unsignedlong 1 year ago +1

      ​@@aleksandarlazarov9182 I think it goes "Yesterday's performance for tomorrow's price" but yeah... The whole value of 4000 series is so bad that prices of last-gen barely moved (at least where I live). Also, there are not that many people upgrading so it's also harder to find something used from 3000 series.

    • @unsignedlong
      @unsignedlong 1 year ago +1

      @@AdamKuzniar Yeah, the CUDA stuff from NVIDIA has great support from any software in existence and there is no denying it, and only then there is some kind of AMD acceleration (but also probably inferior anyway)
      But that's kinda the issue as NVIDIA has so much profit from the enterprise that needs CUDA acceleration as well as AI stuff that I don't see just why they're treating the general customer in such a bad way for giving us such bad value products just to make it feel more premium :(
      I feel like only self-employed people would want to seriously use CUDA or AI acceleration on something like a 70 series card or lower just because they already had that card anyway. Any serious business would go straight for 4080 or 4090 or some Quadro thing as time equals money. Those same businesses can also justify spending such a ridiculous amount of money for a 4090's xd

  • @Sergmanny46
    @Sergmanny46 1 year ago

    How many sponsors do you want in your video?
    Uploader: Yes.
    Jesus, can't get rid of ads even when browsing videos.

  • @garrettkajmowicz
    @garrettkajmowicz 1 year ago +62

    NVidia's engineering effort seems to be all-in on product segmentation rather than product quality. The cache+reduced memory bus allows decent game performance while hamstringing the professional/AI workloads and so allows for better upselling to the higher-tier product.

    • @thegambler9994
      @thegambler9994 1 year ago

      Hey, it's a loss for the consumer, but they're not cannibalizing their products either - something AMD has already stumbled on a couple times with their Zen chips.

    • @squidwardo7074
      @squidwardo7074 1 year ago

      Almost nobody is buying this to use with AI or "professional" workloads.

    • @garrettkajmowicz
      @garrettkajmowicz 1 year ago

      @@squidwardo7074 Notwithstanding the gimped memory capacity, I/O, and PCIe connectivity, the core has about the same performance as the RTX A4000 professional card, which is currently retailing for about $930. So, yeah, it could easily bite into professional sales if it wasn't so neutered.

    • @jamesalexander5559
      @jamesalexander5559 1 year ago

      ​@@squidwardo7074 My university loved to buy Nvidia's gaming cards for machine learning research back when I was there 6 years ago. Quadro and Fire Pro cards are way too expensive for some organizations.

    • @squidwardo7074
      @squidwardo7074 1 year ago

      @@jamesalexander5559 I doubt that's more than 5% of sales
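The narrowed-bus tradeoff this thread is arguing about comes down to simple arithmetic: peak memory bandwidth is bus width times per-pin data rate. A minimal sketch, assuming the commonly quoted per-pin speeds (18 Gbps GDDR6 on the 4060 Ti's 128-bit bus, 14 Gbps on the 3060 Ti's 256-bit bus):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8 bits per byte) * per-pin rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(128, 18))  # 4060 Ti class: 288.0 GB/s
print(bandwidth_gb_s(256, 14))  # 3060 Ti class: 448.0 GB/s
```

On those assumptions the older card has roughly 55% more raw bandwidth; the 4060 Ti's much larger L2 cache hides some of that gap in games, which is presumably why compute and high-resolution workloads feel the cut the most.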

  • @AventineArchives
    @AventineArchives 1 year ago +50

    6700XT is the one to go for at this price range.

  • @MichielQuintelier
    @MichielQuintelier 1 year ago +87

    Just wanted to say a BIG thank you for highlighting the graphics card in question in the graphs of this video! I don't often comment on YouTube videos, but this improvement is so worth it! Big praise to the people who create the graphs ❤

  • @Saxony
    @Saxony 1 year ago +12

    All the videos I've seen testing the 4060 Ti show it performing really well in games like Cyberpunk (with DLSS) and Red Dead Redemption 2. I currently have a 1070 and I'm intending on upgrading to a new build with the 4060 Ti 8GB, which I could just swap out for the 16GB version or a 4070 sometime in the future if I really needed to, if there really is a VRAM issue, which from what I've seen there actually isn't. I haven't even run into any VRAM issues with my 8GB 1070 at 1440p; it's just that it's not powerful enough anymore and I want access to DLSS and RTX.

    • @smeet3010
      @smeet3010 10 months ago

      Did you get the 4060ti? How is the performance? I am on a 1060 6gb and 4060ti would be a 3 gen leap for me.

    • @Saxony
      @Saxony 10 months ago +1

      @@smeet3010 no. I got a 4070 and it was well worth it

  • @cncgeneral
    @cncgeneral 1 year ago +132

    I appreciate the older cards in the graph. A 1060 is probably where a lot of people buying these will be upgrading from

    • @Grumpy_old_Boot
      @Grumpy_old_Boot 1 year ago +11

      I'm still running a GTX 1660 Super ... and I intend to continue doing so for a while.

    • @cncgeneral
      @cncgeneral 1 year ago +11

      @@Grumpy_old_Boot My GTX 780 was still running things fine until 2 weeks ago when it died, and my friend gave me his old 1660 as he felt bad 😅

    • @Grumpy_old_Boot
      @Grumpy_old_Boot 1 year ago +3

      @@cncgeneral
      Yup, as long as you don't have a high-fidelity display with G-Sync or something like that, the mid-range cards are plenty.

    • @Melamamoduro
      @Melamamoduro 1 year ago +4

      As things are going it doesn't seem that people will upgrade from the 1060 for a few years more.

    • @Adri9570
      @Adri9570 1 year ago

      @@cncgeneral How did it die and how old was it? I'm curious about the lifespan of PC components.

  • @alistairblaire6001
    @alistairblaire6001 1 year ago +31

    The VRAM concerns are valid because the current gen consoles have 16gb shared between textures and everything else. When they port these games to PC it translates to needing more than 8gb VRAM, especially if you turn texture details up. Native PC games might be a bit more optimized but simultaneously tech marches on and those games are going to need more VRAM anyway.

    • @BriBCG
      @BriBCG 1 year ago +3

      Yes, and it would be pretty pathetic if a graphics card that costs as much as a current gen console on its own was forced to use settings that make it look worse than said console because of VRAM limitations.

    • @alexanderkhan9097
      @alexanderkhan9097 1 year ago +3

      seriously, it almost feels like Nvidia lately is secretly spearheading Sony's PS5 marketing campaign. Why play on PC at all when the GPU alone costs as much as an entire PS5 Digital?

    • @mrcaboosevg6089
      @mrcaboosevg6089 1 year ago +4

      In this 4k world we're living in now the textures alone will eat up memory, without those 4k textures there's little point in even playing in 4k

    • @squidwardo7074
      @squidwardo7074 1 year ago

      @@mrcaboosevg6089 nobody is buying a 60 series GPU and playing at 4K...

    • @mrcaboosevg6089
      @mrcaboosevg6089 1 year ago

      @@squidwardo7074 They can handle 4k in a lot of games, 4k isn't that special any more
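To put a rough number on the VRAM worry in the thread above: even one uncompressed 4K texture is surprisingly large. A back-of-the-envelope sketch (real games use block compression that shrinks this several-fold, so treat it as an upper bound):

```python
# Approximate VRAM cost of one uncompressed RGBA8 texture, mip chain included.
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    base_bytes = width * height * bytes_per_pixel
    return base_bytes * 4 / 3 / 2**20  # a full mip chain adds about one third

print(texture_mib(4096, 4096))  # one 4K texture: ~85.3 MiB uncompressed
```

Even at a quarter of that after compression, hundreds of unique materials plus frame buffers and ray-tracing structures crowd an 8 GB budget quickly, which lines up with what current console ports are showing.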

  • @HeisenbergFam
    @HeisenbergFam 1 year ago +154

    The only way to force Nvidia's prices down is if competition gets really fierce

    • @williamcricket7931
      @williamcricket7931 1 year ago +50

      Just don't buy NVIDIA

    • @lonzoformvp5078
      @lonzoformvp5078 1 year ago +4

      AMD can't even do that

    • @Magnum_Express
      @Magnum_Express 1 year ago +14

      It's been pretty fierce if you ignore ray tracing performance. AMD cards hold their own now at much cheaper prices than Nvidia equivalents.

    • @MallocFree90
      @MallocFree90 1 year ago +1

      @lonzoformvp5078 people like you are the problem

    • @F1ll1nTh3Blanks
      @F1ll1nTh3Blanks 1 year ago

      The competition is more fierce than ever before, so much so that if all you care about is gaming, this card is already $200 overpriced. If this thing sells, then it's because it offers good-enough value or because the consumer isn't very smart.

  • @XpertEngineer12
    @XpertEngineer12 5 months ago +1

    The Dnf at 3:09 is actually crazy

  • @zkatt3238
    @zkatt3238 1 year ago +58

    If Nvidia had put a decent width memory bus in this thing, it would probably be a bit more competitive. As it stands, they are not going to sell many of these things.

    • @gorkskoal9315
      @gorkskoal9315 1 year ago +5

      I don't think Ngreedia gives a shit. AMD might be spanking them in the 1080p-1440p raster user area, but that's what, 10 maybe 12% of Nvidia's target? Tesla only needs the CUDA cores for their "AI" to work. I have no idea if they need bandwidth. Render farms might need bus and clock speeds, but those guys have more than enough cash from 2 hours of interest alone to buy up a fuckton of cream-of-the-crop cards.

    • @timmyl0
      @timmyl0 1 year ago +2

      This is effectively e-waste at its price point.
      The only thing I can see this being alright in is prebuilt PCs from the usual guys (HP, Acer and so on) or people who don't want to break the bank but need/want to upgrade.
      The issue I see is people deciding to jack up the prices on 3060ti and so forth like the 2xxx series were.

    • @noxious89123
      @noxious89123 1 year ago +1

      @@gorkskoal9315 Steam survey shows that something like 77% of users are playing on 1080p still.

  • @tbthunderer
    @tbthunderer 1 year ago +628

    I bought a 6750XT in January for about $430 including tax. I was worried when the budget RTX cards released I was going to be upset I didn't wait. I want to take this time to thank Nvidia for not making me feel bad about my purchase, in fact, I'm happier with my purchase now than I was a week ago! 😂

    • @YourLordMobius
      @YourLordMobius 1 year ago +13

      I paid $1,400 for my 6900xt during the shortages. I didn't feel too bad at the time as it was only $400 over MSRP, but I felt worse later on when everything came back in stock. Now with these absolute lackluster releases I don't feel so bad. Also since 6900xt's have gotten so cheap I'm not afraid to OC the piss out of my card now.

    • @tbthunderer
      @tbthunderer 1 year ago +11

      @@YourLordMobius I overclocked the snot out of my 6750XT basically as soon as I got it since I got it for pretty much a steal, especially since it was the Red Devil variant.

    • @YourLordMobius
      @YourLordMobius 1 year ago

      I just recently oc'ed my 6900xt with the morepowertool. Peak it's pulling 395w and won't pull anymore even though it's set to limit at 450w. It's on water so no worries there. Stable in most applications at 2800 mhz. So yeah, I got a little extra performance out of mine LOL.

    • @tbthunderer
      @tbthunderer 1 year ago +1

      @@YourLordMobius I'd say! My buddy has a 6950XT and he's a little bit computer illiterate. I tried to talk him into letting me OC it, but he's super worried I'm going to screw it up lol

    • @YourLordMobius
      @YourLordMobius 1 year ago +1

      @@tbthunderer ah yes, the old "you could blow it up if you OC" fear. Only way you're going to blow one up is by overvolting. Even with the massive amp draw on mine the voltage is still relatively tame, and it just won't pull any more power even though it's completely unrestrained.
      Edit: Radeon cards have always been notorious for allowing crazy high amp draws. Even without an insane OC and on-air my Vega 56 used to pull 550w with a mild OC.

  • @XionEternum
    @XionEternum 1 year ago +73

    What makes this even better is that, due to walking back the now-4070 Ti from its original 4080 moniker, it's possible this was intended to be the 4070.

    • @JayXdbX
      @JayXdbX 1 year ago +21

      I haven't really been keeping up with GPU performance relative to each other but wouldn't the 3070 cream this card if it was called a 4070?

    • @ilnumero1234
      @ilnumero1234 1 year ago +1

      No, there wouldn't be any 4070 Ti, and the 4070 would have a higher CUDA core count

    • @XionEternum
      @XionEternum 1 year ago +13

      @@ilnumero1234 HAHAHAHA! Fanboy fantasy speculation! The chips were specced over a year ago in R&D. There's no chance the moniker change also changed CUDA core counts, and Nvidia has had x70ti tiers for about a decade now. So there was going to be one no matter what. At best - at absolute best - they removed an x70 Super variant for the change. Can't prove it though so it's speculation vs speculation. Mine's funnier though, so :P

    • @ItsMeStrider
      @ItsMeStrider 1 year ago +3

      @@ilnumero1234 copium

    • @mybuttsmellslikebutterbut207
      @mybuttsmellslikebutterbut207 1 year ago

      I wonder what the 4050 will be like

  • @robinotten6635
    @robinotten6635 1 year ago +14

    To be honest I think that the new cards are great considering how much less power they use. I hope they will continue that trend with future releases. The only downside is the price at the moment, but it will come down around Black Friday and Christmas, maybe. I hope so at least. Still... I don't get people that would consider a 6650 XT or 3060 Ti over this one with the reason being that it is much cheaper. Well, yes, maybe if you buy a card for one or two years. If you game on the 4060 Ti you'll get a bit more performance while it costs you less than the cheaper models in the long run due to the energy savings.

    • @TessiaVR-cy2tf
      @TessiaVR-cy2tf 1 year ago +3

      That is so true. The price is garbage, but the higher 40xx cards at least are fine. I run a 4090 in CP2077 with Psycho path tracing and frame gen at 90fps while drawing 340-370 watts.
      Those cards are sooo efficient, it's amazing

    • @rasmasyean
      @rasmasyean 1 year ago +2

      If you live with your parents, you go for the best AMD card you can buy because you don't care about electric bills and air conditioning costs! And when bitcoin goes back up...the 4060 Ti and 5060 Ti and 6060 Ti will cost $2,000+. So think of your 4060 Ti as a speculative investment! rofl!
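The long-run energy-savings argument above is easy to sanity-check. A quick sketch, where the wattage gap and electricity price are illustrative assumptions rather than measured figures:

```python
# Yearly electricity cost of a GPU's gaming power draw alone.
def yearly_cost(watts: float, hours_per_day: float, price_per_kwh: float = 0.30) -> float:
    return watts / 1000 * hours_per_day * 365 * price_per_kwh

# Assumed ~90 W gap between a 4060 Ti-class card (~160 W) and a last-gen rival (~250 W)
gap = yearly_cost(250, 3) - yearly_cost(160, 3)
print(gap)  # roughly 30 per year at 3 h/day and 0.30 per kWh
```

At that rate it takes several years of play for efficiency alone to offset a $100+ price gap, so the argument mainly holds for heavy daily use or expensive electricity.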

  • @haroldasraz
    @haroldasraz 1 year ago +78

    Looking forward to seeing the RTX 5060-Super-Ti with 12 GB and a 32-bit bus aimed at 720p gamers at 60 fps ++++.

  • @Jay-mx5ky
    @Jay-mx5ky 1 year ago +60

    An additional issue here that wasn't covered was textures completely failing to load on this card in newer games. I think overall these videos are good for a numbers to numbers comparison, but often times there's more to the story that other reviews are hammering on the gpu for. Textures shouldn't be failing to load on new titles for a $400 product.

    • @yourick1953
      @yourick1953 1 year ago +1

      Exactly. For a $400 product which could be under $300 (one tiny die!), I would be expecting at least 12GB and a bit over 3070 Ti performance for a nice 1440p experience.

    • @ablet85
      @ablet85 1 year ago +1

      Yup, I watch LTT videos but I wouldn’t use them as a guide to buy a GPU ever. I’m glad for creators like HWUB and GN.

    • @ag4640
      @ag4640 1 year ago

      How is that related to the GPU

    • @ag4640
      @ag4640 1 year ago

      How is that the GPU's fault?

    • @ag4640
      @ag4640 1 year ago

      That's optimization dude

  • @Machtyn
    @Machtyn 1 year ago +149

    I was hoping you would include the 6800XT. It's a 16GiB card that will be in the same price range as the 4060 Ti, but with more memory.

    • @gorkskoal9315
      @gorkskoal9315 1 year ago +2

      Doesn't it also have faster raw speed too? That might help a little with performance.

    • @squidwardo7074
      @squidwardo7074 1 year ago +15

      Wouldn't call $100 more in the same price range

    • @kusucks991
      @kusucks991 1 year ago +11

      @@squidwardo7074 I think most would though. Computer hardware pricing generally already varies that much for the same SKU just between retailers

    • @squidwardo7074
      @squidwardo7074 1 year ago +1

      @@kusucks991 Most of that is a waste of money for the lower end cards

    • @Machtyn
      @Machtyn 1 year ago +4

      @@squidwardo7074 I guess. The 4060ti 16GiB version is going to be at least $550. If it even gets released.
      For me, the $100 for better performance and more RAM is worth it. I'm going to pull the buy it trigger to replace my GeForce 1080. What's amazing to me is that this 2 year old card bests nVidia's card for $100 difference. I understand they're not supposed to be comparable cards.
      I looked up some used 6700XT cards, and they're $100 or more cheaper than the 4060 and 4060Ti. And these two cards are supposed to be comparable.

  • @UberTastical22
    @UberTastical22 1 year ago +5

    Would love if you specified whether the 1060 is 3 or 6 gb. Great review though, fair and informative

  • @Sleepyless
    @Sleepyless 1 year ago +351

    Thank you NVIDIA for making the 60-class cards 10% better every year so the 7060 Ti can maybe get the same results as the 4090

    • @VENOM_S1X
      @VENOM_S1X 1 year ago +17

      4070Ti

    • @itsalwaysdarkestbeforethes1198
      @itsalwaysdarkestbeforethes1198 1 year ago +61

      The 7090 ti can give you 120 fps at 8k with pathtracing on, with a headroom to spare! It’s a marvel of engineering
      The 7060 can do 1080p 60 in some games, but not even 30 fps in others. Good deal at just $799!

    • @GamerErman2001
      @GamerErman2001 1 year ago +7

      @@itsalwaysdarkestbeforethes1198 How much power will the 7090 ti use? Maybe 1200 watts?

    • @nukesrus2663
      @nukesrus2663 1 year ago +9

      @@GamerErman2001 1.21 gigawatts

    • @repentofyoursinsandbelieve629
      @repentofyoursinsandbelieve629 11 months ago

      The 4060 Ti, I like it

  • @pyroslev
    @pyroslev 1 year ago +57

    My 6700 XT looks better and better each time. I just wish I had waited a bit longer as I paid about 100 bucks more than I should have for it. Even still, Team Red is looking to be a winner here.

    • @samantha_t99
      @samantha_t99 1 year ago +1

      It's only money, no point dwelling on it as you'll have already made that $100 back.

    • @rohitramamoorthy1019
      @rohitramamoorthy1019 1 year ago

      I can relate, man, I bought my 6700 XT for $400. But think about it this way: I bought the GPU in fall '22, and GPU prices are ALWAYS going to drop as time goes on, so no point dwelling on it. You had the opportunity to use the GPU much earlier than a decent amount of people

    • @evacody1249
      @evacody1249 1 year ago

      Dude, I got my 6700 XT in Oct of 2021 for 800 something. I look at it now and it has paid for itself. I have no need to upgrade to a new GPU.

  • @sagnikbagchi5076
    @sagnikbagchi5076 1 year ago +52

    Considering the 4070 costs at least around $745 and above here in India, instead of $600. I am so glad I got the 6700XT at the equivalent of 360 dollars last month. The 4060ti launch price will easily cross $500 here.

    • @chillpumkin
      @chillpumkin 1 year ago +2

      This! The silicon tax is way too real. For anyone thinking of buying a RTX 4000 card at msrp, no, they're not.

    • @Tainted79
      @Tainted79 1 year ago +2

      RX 6700XT is a great card, I've been using one for 6 months now with no problems. 3dsmax/CAD and plays new AAA unoptimized games pretty well.

    • @KaranYadav-gr5xj
      @KaranYadav-gr5xj 1 year ago +1

      Flipkart ❤
      For the first time ever we got PC components for the same price as USA

    • @chillpumkin
      @chillpumkin 1 year ago

      @@KaranYadav-gr5xj I mean, on offer yes. We still pay a 16% tax

  • @tantegreta
    @tantegreta 1 year ago +4

    I'm sorry, did I miss the part where you criticize nVidia for almost 0% generational upgrade and no PCIe 3.0 for a couple hundred dollars more??

  • @mythtech
    @mythtech 1 year ago +42

    The graphs are a far more significant upgrade since the last benchmarks video compared to this card VS its previous generation.
    Props to the Labs and Editing team for listening to community feedback!

  • @TheoLubbe
    @TheoLubbe 1 year ago +612

    Would've been neat to see the RTX3070 in the list also. It's also a 8GB card, has a pricetag around the same as the 4060Ti and has bigger numbers overall.

    • @alexandersedore7720
      @alexandersedore7720 1 year ago +45

      They still need it to look somewhat decent 👀 They also didn't really harp on the fact it loses to the 3060TI in some 4K games and is only like 5% faster at 1440p, but they need free GPUs, not paid ones. Still a better review than Jay2Cents.

    • @scroopynooperz9051
      @scroopynooperz9051 1 year ago +61

      The tragedy of the RTX 3070 is that it could still have been very relevant today. It has plenty of horsepower but is totally kneecapped by that 8GB.
      Even just a 12GB RTX 3070 would probably be badly embarrassing Nvidia's current offerings.

    • @sudbtd
      @sudbtd 1 year ago +14

      ​​@@silentwonder-chilled It is already bad for 1440p. 8gb isn't enough.

    • @TheZanzou
      @TheZanzou 1 year ago +21

      @@sudbtd That's a ridiculous statement. The fact it could be better if it had even 4gb more VRAM doesn't automatically make it bad. I have a 1080 that I still play 1440p on, and it still performs fairly well on most modern games even though it's weaker than a 3070. And by fairly well I mean high 60+ FPS on most games; very rarely do I have to turn it down to medium.
      I definitely don't recommend running out and grabbing one just because it still performs well as another 2-3 years that's probably not going to be the case anymore, but the point that just because it doesn't have 12 or 16gb of vram and loses a little performance because of that doesn't make it a bad card automatically still stands. The 3070 has similar performance to the 4060ti so if 1440p ultra 100+ fps on most games is bad.. yeah..

    • @sudbtd
      @sudbtd 1 year ago +4

      @@TheZanzou even though, on games like Jedi, the 3070 just cannot run? Ah yes. 8gb vram is "enough"

  • @cheeseisgreat24
    @cheeseisgreat24 1 year ago +107

    See, this is precisely why I'm glad Intel got in the game, because being able to punch up considering their price point is exactly the reality check you need to know Nvidia's pricing is completely out to lunch.

    • @TheEclecticDyslexic
      @TheEclecticDyslexic 1 year ago +5

      Nvidia literally doesn't care at this point. They are only in consumer cards for name recognition. They honestly might just stop making budget cards altogether given how the data center market is going for them. Budget cards from now on will just be last-gen stock.

    • @cheeseisgreat24
      @cheeseisgreat24 1 year ago +2

      @@TheEclecticDyslexic Which is fine by me, I just won’t ever buy them. They don’t have a value-add for me.

  • @Zach-rw6jf
    @Zach-rw6jf 1 year ago +1

    I just built my sister her first budget gaming rig. I went with a 4060Ti. Personally I use AMD (and I love to trash talk Nvidia), but admittedly I do fight with drivers occasionally and that isn't something I want her to struggle with. She's really only gaming at 1080p, and at $379 on sale it was hard to beat for the price-to-performance, DLSS support, and knowing it's just going to work. It's probably not a card for anyone who watches LTT, but I do think it has a place in budget builds for casual gamers, especially if you can find it on sale.

  • @graphicarc
    @graphicarc 1 year ago +31

    Thank you for including intel arc in the benchmarks!

  • @dylanbarrett4698
    @dylanbarrett4698 1 year ago +14

    I really like the little red bubble things to help distinguish where the information that’s being talked about is. Props LMG editors

  • @BjornsTIR
    @BjornsTIR 1 year ago +88

    I miss the times when the 60 series was about the performance of last gen's 80 series

    • @RayanMADAO
      @RayanMADAO 1 year ago +1

      When?

    • @RubiGMM
      @RubiGMM 1 year ago +28

      @@RayanMADAO gtx 1060 vs gtx 980 ti and rtx 3060/ti vs rtx 2080 (and even older examples if you keep digging down)

    • @BjornsTIR
      @BjornsTIR 1 year ago +28

      @@RayanMADAO 980 to 1060, 1080 to 2060 (though it had a vram downgrade), 2080 super to 3060 Ti; Theoretically you could also count 680 to 960, since 600 and 700 were the same architecture. Oh, also 580 to 660 ti

    • @Daeyae
      @Daeyae 1 year ago +1

      ​@BjornsTIR 2060 was a 1070ti not a 1080

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago +3

      Don't ignore Bidenflation ... All 40 series cards are at the same price, or cheaper than their predecessor if u recognize the Bidenflation

  • @starbuckqbb2287
    @starbuckqbb2287 1 year ago +5

    Feeling really good about my 3060Ti I got for $430 a year ago right now.

  • @matebors25
    @matebors25 1 year ago +43

    Would've liked to see the 3070 included in the comparisons

    • @lorsheckmolseh3345
      @lorsheckmolseh3345 1 year ago +10

      It's faster.

    • @komorka88
      @komorka88 1 year ago +10

      But nvidia would not like that. So they did not include it.

    • @xsvforce3335
      @xsvforce3335 1 year ago +5

      No point in including another 8GB card you shouldn’t buy. Besides, helping them sell 3070s would have been doing Nvidia a favour and they don’t deserve any.

    • @UnrealPander
      @UnrealPander 1 year ago +3

      It's actual peak comedy that there's one guy still riding the big old "LTT the Nvidia child company" bullshit and another explaining that LTT wants to do anything but help Nvidia, right next to each other.
      Made my day

    • @Lodgiefitness
      @Lodgiefitness 1 year ago +1

      It’s the same card.

  • @MetalMan1245
    @MetalMan1245 1 year ago +232

    As now a dedicated Arc user (my a380 is treating me well, thank you very much) I am so happy to see Arc putting up a fight in price tiers nobody seems to consider as valid.

    • @cosminmilitaru9920
      @cosminmilitaru9920 1 year ago +23

      Thank you for your service! We should support Intel GPU launch to get in the main ring with AMD and Nvidia.

    • @Innosos
      @Innosos 1 year ago +16

      You poor soul. Thank you for your sacrifice!

    • @sorrai
      @sorrai 1 year ago +13

      @@Innosos "poor soul" is a bit much. 😂😂

    • @Sergmanny46
      @Sergmanny46 1 year ago +9

      @@Innosos You're talking as if Intel can't make decent cards. They can, people just need to be patient and wait for better drivers because the hardware is solid, only software is their weak point.
      But in the long run, Intel is going to force both Amd and Nvidia (Nvidia mostly, Amd is still somewhat consumer friendly) to get off their high horse prices because aint nobody buying overpriced gpus anymore when you can get an A770 16gb for 340$ that offers similar, if not better performance than the Rtx 3060.

    • @heelyneelyos
      @heelyneelyos 1 year ago +6

      Airbus gpu

  • @TheCunningStunt
    @TheCunningStunt 1 year ago +18

    Thanks for including the RTX 2060 in the results. It's good to have cards that are only a couple of years old that many of us still use shown to give us a good idea if an upgrade is worth it. Slightly older cards are always left out so we end up with apples to apples comparisons, when many of us need apples to oranges.
    Never forget that about 98% of us that game aren't rocking the latest and greatest or have the cash for top tier hardware.

  • @JinglesOnJuniper
    @JinglesOnJuniper 1 year ago +1

    I'm convinced that Nvidia has gone into a tick tock cycle.
    10 series - good performance
    20 series - ray tracing / dlss
    30 series - good performance
    40 series - frame gen
    50 series - good performance?

  • @aid0nex
    @aid0nex 1 year ago +59

    The first GPU that I ever bought in my life was in 2012 - a GTX 660. I was 13 years old back then and built my first gaming PC ever. I cannot believe that this old card, which I bought for 200€ and which is still somewhere in my garage today, has a wider memory interface than Nvidia's latest mid-range cards in 2023 that cost 500€ and more. :D

    • @Pasi123
      @Pasi123 1 year ago +5

      The GTX 660 was a nice card, I bought one (GTX660-DC2O-2GD5) second hand in January 2014 for just 140€. I miss the times when graphics cards weren't so expensive

    • @CakePrincessCelestia
      @CakePrincessCelestia 1 year ago +1

      @@choppings54 Laughs in 3.5 Gigs :D

    • @Donnerwamp
      @Donnerwamp 1 year ago

      150€ for a GTX 550Ti back in, idk, 2010 or so? After that a GTX 670 in 2012 for 350€, which was a huge jump for a poor student back then. Now I'm looking at something that's supposed to be just one tier below the xx70 card (while looking like it belongs in the xx50Ti tier) and it costs more than my first PC as a whole...

    • @aid0nex
      @aid0nex 1 year ago

      @@choppings54 There is a GTX 1080 in my PC currently, however I barely play on my PC anymore. I game on my Steam Deck or on my Xbox Series X most of the time.

  • @Hypnodog_
    @Hypnodog_ 1 year ago +13

    The 60 series card is going for what I bought my 980 STRIX for back in the day, absolute insanity

  • @harwil32
    @harwil32 1 year ago +36

    Not to mention the used market for Radeon cards: if you are willing to take the chance, you can sometimes score a 6800 XT for cheaper!

    • @harwil32
      @harwil32 1 year ago

      @@rustler08 thanks, yikes, guess I checked a while back, they used to be cheaper

    • @21preend42
      @21preend42 1 year ago

      @@harwil32 nvidia cards are cheaper on the second-hand market, maybe because there are more of them.

    • @squidwardo7074
      @squidwardo7074 1 year ago

      except you can do the same thing with nvidia gpus...

  • @TIOLIOfficial
    @TIOLIOfficial 1 year ago +1

    0:18 - My god, NVIDIA is just pulling an Apple with this gen.

  • @superhero6785
    @superhero6785 1 year ago +43

    I said it when ARC was released - I'm glad we're finally at a point again where there's a viable 3rd option. Just, so long as you're not concerned about being made fun of when your buddies ask what card you've got :)

    • @scarletspidernz
      @scarletspidernz 1 year ago +3

      If Nvidia keeps going the way it is, Intel will be nothing to laugh about much sooner. They've only released their 1st gen, and in 2 yrs it went from 🤣 to "well maybe?" 🤔 If they release Battlemage and keep at it with driver improvements, we'll have a 3rd contender by next gen.

    • @Junebug89
      @Junebug89 1 year ago

      @@scarletspidernz Battlemage does seem like it's coming out, the real question is whether their 3rd will come. Part of the problem is that to my understanding, some of the hardware issues with Alchemist were found after Battlemage already entered production. So the real test will be when Cleric or Crusader or whatever they call it comes out.

    • @scarletspidernz
      @scarletspidernz 1 year ago

      @@Junebug89 hopefully they keep at it this is a golden opportunity for them atm

    • @gabead
      @gabead 1 year ago

      lol😂

  • @Twiney_
    @Twiney_ 1 year ago +14

    As someone who bought the rx 6700 xt I'm relieved that it doesn't make that much of a difference even if it's less efficient

  • @andrewwoody9320
    @andrewwoody9320 1 year ago +45

    Exactly why I upgraded to 6800xt from 1660ti. Team red actually gives what consumers need. I have no worries with my 16gb of vram at 1440p. Seems like nvidia is becoming another asus🤨

    • @mass_stay_tapped_in528
      @mass_stay_tapped_in528 1 year ago

      Quick question as that’s what I’m considering to upgrade to as well; currently running a 2060 OC, and it gives me 65-85 fps at 1440p but the 6800XT looks amazing for the price & performance (PLUS ALL THAT VRAM)… would you say your AAA gaming experience is pretty amazing in terms of fps & settings? Really want a bump with games like RDR2 & GOW 2018 … and the 6800XT looks more than capable for 1440p. Genuine question, greatly appreciate any responses back!

    • @umbasa01
      @umbasa01 1 year ago

      My 5700xt has been holding up well too. $300 back in Feb 2020

    • @cabbageboio
      @cabbageboio 1 year ago +2

      I'm waiting for the 2nd gen intel arc cards to come out before I finally upgrade my 1070

    • @friendlyhobo6483
      @friendlyhobo6483 1 year ago +2

      ​@@mass_stay_tapped_in528 consider buying a 6950xt instead of the 6800xt. The prices have dropped quite close enough that it makes it worth it. Like $60 price difference.

    • @vinylSummer
      @vinylSummer 1 year ago

      ​@@mass_stay_tapped_in528 6800xt is an awesome card for 1440p! I currently own 6700xt and the thing meets all my expectations at 1440p. 60 fps in the latest AAA single player titles with high settings, 90+ fps in beautiful online games and older AAA titles. 6800xt is like 50% more performance than 6700xt so it'll translate to 90 fps in the latest AAA single player games and 140+ in online ones.. I don't think that I NEED that upgrade, but I know damn well that I want it 😭

  • @lucedas3890
    @lucedas3890 1 year ago +2

    I also believe we should be thankful for the 4060ti… the entertainment value of watching it burn in reviews is priceless

    • @wyterabitt2149
      @wyterabitt2149 1 year ago

      Only price is the issue, the card is slightly flawed but still fantastic as a full package. If it gets cheap enough it would be a very good option even with the lower RAM.
      If it doesn't, then it should be avoided.

  • @dlheiland
    @dlheiland 1 year ago +80

    I was told by friends I was nuts for doing a 3090 for 4K gaming when it came out. I knew VRAM was going to be an issue. I mod games a lot, and that eats VRAM. It was only a matter of time. Now, I can still sit comfortably not worrying about it for a few more years.

    • @mango_raider4116
      @mango_raider4116 1 year ago

      Unfortunately fanboys are always serial copers and Nvidia is no different. I know Some guys who swear up and down that VRAM is irrelevant and nobody needs it and you're just an AMD shill for saying VRAM is vital to modern gaming.

    • @User-ot1cg
      @User-ot1cg 1 year ago +7

      “few more years” you’re set for a very long time.

    • @John-Is-My-Name
      @John-Is-My-Name 1 year ago +12

      Duh, you bought the extremely overqualified and overpriced top-of-the-line graphics card of last gen; it better be decent for a lot more years.

    • @PrashantMishra-kh1xt
      @PrashantMishra-kh1xt 1 year ago +5

      24 gigs of VRAM is going to suffice for at least 8 years at 4K

    • @VeniVidiAjax
      @VeniVidiAjax 1 year ago

      Love my 2nd hand 3090. 900€ and playing at 1440p with everything maxed out.
      ACC
      F1
      Flight Simulator.
      Of which i play ACC and Flight sim in VR sometimes.
      I don’t even get why people would go for a 4080 for 1500+ while paying 900 gets you a nice 3090 with 24gb vram.

  • @cristianorossi6971
    @cristianorossi6971 1 year ago +13

    I don't know who had the idea to put a green and a red bar in the charts, but that is extremely useful! Thank you! Finally I can enjoy this type of content without pausing it every single time a new chart appears. Yes, sometimes it is useful to see where all the other products are located in comparison, but sometimes I just want to hear the speaker talking.

  • @KrishinPillay
    @KrishinPillay 1 year ago +24

    I just bought a 3060TI for $320 brand new last week, and a day later they announced the 4060 family and I regretted that I didn't wait.
    Little did I know, there was nothing to regret

    • @someoneelse5005
      @someoneelse5005 1 year ago +3

      *laughs in 256bit bus*

    • @21preend42
      @21preend42 1 year ago +1

      the 4070 is not bad, it's on par with the 3080 and it's cheaper, people lost their minds when the 3080 was out.

    • @jamesalexander5559
      @jamesalexander5559 1 year ago

      Same deal for me with the Arc A770. I wanted to buy a 3060 but I needed the extra VRAM for my job and AMD cards unfortunately don't perform well for my work.

    • @pengu6335
      @pengu6335 1 year ago +1

      ​@@21preend42Lost their minds in what way? The 3060ti & the 3080 were the best 3000 series cards imo.

    • @jesusbarrera6916
      @jesusbarrera6916 1 year ago

      @@21preend42 the 1070 was better than the 980, the 2070 was better than the 1080, the 3070 was on the toes of the 2080ti..... the 4070 is also horrible value and you guys need to grow some basic standards

  • @bigbenisdaman
    @bigbenisdaman 1 year ago +2

    Guessing Nvidia said reviews couldn't compare it to the 3070? 4060ti has basically same performance as the 3070 for up to half the watts. Would have been a good take. Now that the prices are below msrp, not a bad buy.

  • @silverfilmsofficial
    @silverfilmsofficial 1 year ago +32

    So glad I picked up the 12GB 3060, really coming in handy lately

    • @tsiupbmt2822
      @tsiupbmt2822 1 year ago +4

      Same here, that 3060 Aorus Elite 12gb card comes in so handy now

    • @silverfilmsofficial
      @silverfilmsofficial 1 year ago +1

      @@rustler08 You may be right but it's pretty fast compared to my old 980

    • @pell9538
      @pell9538 1 year ago +1

      @@rustler08 Something that people don't mention is power efficiency. I considered a 6700/6700xt, but both draw more power than the 3060, and I don't need my electric bill going up even more when I have other things to worry about.

    • @serphvarna4154
      @serphvarna4154 1 year ago +1

      Should have gone for the 6700xt if you wanted the 12gb vram

    • @squidwardo7074
      @squidwardo7074 1 year ago

      @@rustler08 You're right and this is why the VRAM complaints are kinda dumb. In all honesty almost 0 games will use all 8gb, and if it does you can just turn down the graphics from high to medium and you probably won't even be able to tell the difference. If you're trying to play 4k on a 60 series card you're doing it wrong, and that's about the only time you will use all 8gb. The AMD cards absolutely destroy it in some games, but in others the 4060ti is 10-20% better, so I guess it depends which games you play. For me personally I stick with nvidia for now mainly because of shadowplay, and most of the games I play are better with nvidia.

  • @vffa
    @vffa 1 year ago +52

    Concerning the 8 GB of VRAM, it really makes me miss the HBCC (High-Bandwidth Cache Controller) from the Radeon Vega graphics cards. That was an insanely good feature, especially when coupled with the HBM2 video memory. Good times.

    • @DGCastell
      @DGCastell 1 year ago +5

      HBM will be fondly remembered 🤧

    • @nathangarvey797
      @nathangarvey797 1 year ago +1

      As an owner of a Radeon VII, I completely agree! I just wish Vega had been better for gaming rather than compute.
      I honestly think that a 6900xt with HBM would have given the 3090ti a run for its money, as it always seemed like it was the memory bandwidth keeping RDNA2 back. Not a GPU engineer, though, so could definitely be wrong.

    • @TheHavocInferno
      @TheHavocInferno 1 year ago +1

      Didn't HBCC do basically nothing tangible for performance?

    • @vasilije94
      @vasilije94 1 year ago

      @@TheHavocInferno You are right. HBM didn't do jack shit for those GPUs except make them expensive garbage. Those GPUs had nowhere near the power needed for HBM to make any difference. It was a marketing scheme, the same way Nvidia is adding bigger L2 cache while cutting bandwidth instead of actually adding more VRAM capacity. It's all just marketing schemes. The fact that the fanboys above think HBM was really good really shows you how dumb people are to logic.

  • @andrewjmarx
    @andrewjmarx 1 year ago +20

    I was so excited about the 16GB variant when I first heard about it. I've been wanting to replace my 1060 for years with something not overly power hungry, but I also have plans for a CUDA application next year that ties me to Nvidia. The stupid 128-bit bus is a double-whammy for me (gaming and computing) that's going to force me to buy something higher tier and run it in a reduced power mode
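For context on why the narrow bus stings: peak GDDR bandwidth is simply bus width times per-pin data rate. A quick sketch of that arithmetic (using the commonly quoted 18 Gbps GDDR6 on the 4060 Ti's 128-bit bus versus 14 Gbps on the 3060 Ti's 256-bit bus):

```python
def gddr_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth = bus width (bits) * per-pin data rate (Gbit/s) / 8 bits per byte.
    return bus_width_bits * data_rate_gbps / 8

print(gddr_bandwidth_gb_s(128, 18.0))  # 4060 Ti-class: 288.0 GB/s
print(gddr_bandwidth_gb_s(256, 14.0))  # 3060 Ti-class: 448.0 GB/s
```

So even with a faster memory clock, halving the bus width leaves the newer card well behind in raw bandwidth; the larger L2 cache only papers over that for workloads that fit in cache.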

    • @ever611
      @ever611 1 year ago

      Why do you need CUDA?

    • @ProjectPhysX
      @ProjectPhysX 1 year ago +2

      Same here. VRAM bandwidth is piss poor on the RTX 40 series, and a lot of compute applications scale in performance directly with raw VRAM bandwidth, not "effective" bandwidth, as the cache is useless here. So you get half the performance of the RTX 30 predecessor for the same or higher price. DOA.

    • @MindBlowerWTF
      @MindBlowerWTF 1 year ago +2

      buy last gen used then? 3080, the one with more VRAM?

    • @andrewjmarx
      @andrewjmarx 1 year ago

      @@MindBlowerWTF I've been waiting for AV1 since before most people have heard of it. Strongly considered getting a low-end Arc card specifically for it, but I don't want to deal with managing 2 gpus with different sets of drivers across both Linux and Windows

    • @andrewjmarx
      @andrewjmarx 1 year ago +2

      @@ever611 I write computing software that works with matrices that have effectively no upper limit on their size. As the matrices become larger, CPUs just can't keep up in terms of speed. It's the type of computing that makes sense for GPUs, and I want to target people with access to HPC facilities/supercomputers, which tend to predominantly support Nvidia (including the one I have access to). Since it's so heavily used, there's also just a ton of resources on CUDA programming. OpenCL or Vulkan compute could be viable, and I would love to eventually support non-proprietary alternatives to CUDA, but CUDA is going to be the lowest resistance route to reaching the most people I'm trying to target
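The scaling described above is the usual cost of dense matrix multiplication, roughly 2n³ floating-point operations, which is what outruns CPUs as matrices grow. A toy sketch (the sizes are hypothetical, not from the commenter):

```python
def dense_matmul_flops(n: int) -> int:
    # An n x n by n x n multiply produces n*n output entries, each a length-n
    # dot product (~n multiplies + n adds), so roughly 2 * n**3 operations.
    return 2 * n ** 3

for n in (1_000, 10_000, 100_000):
    print(f"n={n}: {dense_matmul_flops(n):.1e} FLOPs")
```

Growing n by 10x grows the arithmetic by 1000x, which is why throughput-oriented GPUs (and the CUDA ecosystem on HPC clusters) win here.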

  • @BevanWard
    @BevanWard 4 months ago +1

    NVIDIA could have released the 16gb 4060Ti for $430, making a mid-range card that is future-proof and not a total rip-off. But if they cared about their consumers, then they wouldn't be NVIDIA.

  • @WickedKillaYT
    @WickedKillaYT 1 year ago +39

    Can confirm the 6700xt for 1440p is a great card, best bang for your buck in the price range for someone who doesn't care much about ray tracing.

    • @brutallica2944
      @brutallica2944 1 year ago +2

      6750XT owner, can definitely recommend that one too.

    • @101sjls
      @101sjls 1 year ago

      @@brutallica2944 Same, managed to pick one up on Newegg for the same price as a 6700XT and couldn't be happier with how it's performing at 1440p

    • @GeneralS1mba
      @GeneralS1mba 1 year ago +1

      I really hope 7700xt is sub 400, that would be such a smack for the 4060 ti

    • @hampuswirsen7793
      @hampuswirsen7793 1 year ago

      How would it work with a 5120x1440 240 Hz display like the Samsung Odyssey G9?

    • @WickedKillaYT
      @WickedKillaYT 1 year ago

      @@hampuswirsen7793 it would work just fine, but don't expect to play most games anywhere near the 240fps mark. Esports games with lower settings and/or fsr you may get close but most newer games at 1440p high I get between 100-170

  • @r3claim3r
    @r3claim3r 1 year ago +10

    This RTX 4060 Ti seems like good news for AMD's older gen cards.

  • @Alovon
    @Alovon 1 year ago +14

    Game designer (in progress) here, on the note of the VRAM thing.
    Pretty much it's just industry growing pains: developers are pushing the traditional geometry pipeline to its limit in terms of complexity, and it's overwhelming VRAM buses (and CPUs) all around.
    Tack on UE4 not being built for that level of complexity, like in Jedi Survivor or Hogwarts Legacy, and you get a lot of problems.
    We know this is a growing-pain thing because Fortnite on UE5 with Nanite enabled, the mode where EVERY SINGLE BLADE OF GRASS IS A MODEL, actually runs pretty fine on VRAM, even though the current size and complexity of the map with that vegetation change results in geometric complexity at least at the level of Jedi Survivor.
    Virtualized geometry/mesh shading is the solution to the VRAM problem, as you only need to author your full-res model and (preferably) a proxy LOD, versus the full-res model, 4-6 LOD tiers, and then the persistent LOD of standard geometry shading.
    When models are AS complex as they are nowadays, those extra LOD tiers are not "simple" at all unless a developer brute-force optimizes them, which you only see on console exclusives; PC players like extra scaling, so you have to forego that to a degree, or other consoles may not take kindly to the LOD setup built for one specific console.
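The LOD-tier cost described above can be sketched as a toy back-of-envelope: a full-res mesh plus a chain of discrete LOD tiers versus a full-res mesh plus one low-poly proxy, as with virtualized geometry. All counts and byte sizes here are hypothetical, purely to illustrate the comparison:

```python
BYTES_PER_TRIANGLE = 36  # assumption: ~3 vertices * 12 bytes of position data

def mesh_bytes(triangles: int) -> int:
    # Rough storage for one mesh (positions only; no UVs, normals, or indices).
    return triangles * BYTES_PER_TRIANGLE

def discrete_lod_chain_bytes(full_res_tris: int, tiers: int) -> int:
    # Full-res mesh plus `tiers` discrete LODs, each half the triangles of the last.
    total, tris = 0, full_res_tris
    for _ in range(tiers + 1):
        total += mesh_bytes(tris)
        tris //= 2
    return total

def virtualized_bytes(full_res_tris: int, proxy_tris: int) -> int:
    # Virtualized geometry: author only the full-res mesh and one low-poly proxy.
    return mesh_bytes(full_res_tris) + mesh_bytes(proxy_tris)

full = 1_000_000  # hypothetical hero-asset triangle count
print(discrete_lod_chain_bytes(full, 5))  # full-res + 5 LOD tiers
print(virtualized_bytes(full, 2_000))     # full-res + tiny proxy
```

In practice virtualized geometry also streams only the visible clusters, so resident VRAM is lower still; this sketch only compares authored mesh storage.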

    • @Alovon
      @Alovon 1 year ago +2

      How insecure do you have to be to say something so unimportant to anything?

    • @Alovon
      @Alovon 1 year ago

      @thomasboob559 I'm just saying that optimization takes longer/is harder because of how they are pushing against complexity limits for the current paradigm.

    • @Alovon
      @Alovon 1 year ago

      @thomasboob559 It barely runs on the last-generation consoles because of CPU problems. Heck, it even had problems on current gen, last I recall.

  • @carterbeals9771
    @carterbeals9771 1 year ago +6

    Can I just say how glad I am that Intel finally has a viable GPU product? The more competition, the better imo, AMD can't take on Nvidia alone.

  • @QactisX
    @QactisX 1 year ago +13

    Still don't regret getting a 6800 in Feb. 16GB is nice

  • @JasonB808
    @JasonB808 1 year ago +142

    My decision on buying a 6800XT at $540 is getting better and better. Thanks NVIDIA.

    • @vorkzen4172
      @vorkzen4172 1 year ago +1

      You can get a 6900xt for that price, actually.

    • @termitreter6545
      @termitreter6545 1 year ago +2

      Yeah same. Buying a Nvidia card is just a waste of money if you care about price.

    • @scubasausage
      @scubasausage 1 year ago +2

      Yeah, but you don't get DLSS and RT. And I turn those on in literally every game on my 3070ti. It doesn't hurt performance enough to notice. If you have a Radeon you're basically playing on low settings for every game. The difference between RT on and off is way bigger than low to ultra.

    • @dennisjungbauer4467
      @dennisjungbauer4467 1 year ago

      @@scubasausage RT is sweet, or can look great, but it's not at all like Low to Ultra, guess you've never played at lower settings - it's a cherry on top for more realistic lighting and it's not even always worth it. Some games certainly look great and can benefit quite a bit from it though.
      Also, the 6800XT can handle some RT, even though surely not at RTX 3080 levels.

    • @scubasausage
      @scubasausage 1 year ago

      @@dennisjungbauer4467 You are incorrect, RTX on is a way bigger difference than from low to ultra. You must never have used ray tracing before.
      Also a 6800 can do ray tracing but not at playable frame rates. It gets regularly beaten by a 2070 when you turn RT on.

  • @MiskT
    @MiskT 1 year ago +38

    While I can cross-reference the 4K data, I would love to see a VR performance rundown. VR is still a smaller market, but a once-a-year rundown of current cards against current headsets and games would be great.

    • @bakedbeanfanclub
      @bakedbeanfanclub 1 year ago +15

      @@remeuptmain No need to be rude. VR is a very real niche, and the performance needs are different than 1080p, 1440p, and 4k. This is great feedback for the team to have.

    • @bakedbeanfanclub
      @bakedbeanfanclub 1 year ago +8

      @@remeuptmain Are you really going into comments sections and getting mad about people having opinions?

    • @kalimkhawaja8605
      @kalimkhawaja8605 1 year ago +8

      @@remeuptmain u post minecraft videos and get cooked in your own comments....

    • @SkyTreeStudio
      @SkyTreeStudio 1 year ago +3

      ​@@remeuptmainis that cause yours walked out?

    • @Please_Consume_Irresponsibly
      @Please_Consume_Irresponsibly 1 year ago +1

      @@remeuptmain you must be too poor to afford VR, I can see why you’re so angry all the time lmao

  • @lord-fishv7355
    @lord-fishv7355 1 year ago +8

    I just realised the 4060ti has fewer cores than the 3060ti, and the two things it does better are L2 cache and watts. But decreased power for the same performance is expected when you shrink the transistors on the die, and they also took away a lot of the other things that would draw more power anyway.