I bought my EVGA RTX 3080 Ti at launch, direct from EVGA. Still rocking it. Waiting for the 5000 series before upgrading. I do game at 4K. Most games I can run at 4K60 with RT using DLSS Balanced or Performance, and Performance does not look bad at 4K. Some games though just kill my card, like Black Myth. I have a 5800X for my CPU. I hope Nvidia releases a 5080 Ti. The 3080 Ti is basically a 3090 with less VRAM; it's gimped by its lack of VRAM.
Helldivers 2: your stats are showing the 6900 XT. Not sure if it's a glitch or if you accidentally grabbed the wrong footage, but just wanted to point that out.
Iceberg Tech: I've had the "pleasure" of deep-dive testing a Manli RTX 3080 Ti 12GB and a Palit RTX 3080 Ti 12GB recently. The first card's continuous power consumption in modern games was 400 watts with peaks up to 441 watts, which proved simply too much for my trusty Corsair RM850X over the 2 separate PCI-E 8-pin cables it needs. My RM850X can deliver a maximum continuous 354 watts that way (2 x 8-pin + 70 watts from the PCI-E x16 slot): it can deliver 12 amps per 8-pin cable (=142 watts), whereas the 3080 Ti wants 17 amps per 8-pin connector (=198 watts). The Palit version was somewhat more conservative with its power consumption, and I was able to run it for a few days to do all of the deep-dive tests. Compared to my AMD RX 6900 XT 16GB, the 3080 Ti is noticeably slower and eats around 100 watts more continuous power. All of my tests were done on my Ryzen 7 5800X3D, which is currently the second-fastest gaming CPU on the market. I was disappointed in the RTX 3080 Ti 12GB because in reality it's a direct competitor to the RX 6800 XT 16GB at best, while eating around 140 watts more power. The 3080 Ti performs too weak for the power it needs to stay awake.
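The cable-budget arithmetic in the comment above can be sanity-checked with a quick sketch. The amperage and wattage figures are the ones quoted in the comment; treating the rail as a flat 12V is my simplification, which is why the round numbers differ slightly from the comment's 142W/198W:

```python
# Continuous power budget over PCI-E cabling, using the comment's figures.
RAIL_VOLTAGE = 12.0  # nominal volts on the PCI-E power rail (simplification)

def cable_watts(amps: float) -> float:
    """Continuous wattage one 8-pin cable delivers at a given amperage."""
    return RAIL_VOLTAGE * amps

def total_watts(cables: int, amps_per_cable: float, slot_watts: float) -> float:
    """Total continuous power available to (or demanded by) the GPU."""
    return cables * cable_watts(amps_per_cable) + slot_watts

# PSU side: rated for 12 A per 8-pin cable, plus ~70 W from the x16 slot
supplied = total_watts(cables=2, amps_per_cable=12, slot_watts=70)
# Card side: the 3080 Ti wants ~17 A per 8-pin connector
demanded = total_watts(cables=2, amps_per_cable=17, slot_watts=70)

print(f"PSU can supply ~{supplied:.0f} W, card wants ~{demanded:.0f} W")
```

The gap between the two totals is why the card tripped the PSU's protection despite an 850W nameplate rating.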
3090 ftw3 here, love this card. took it apart and painted it white to match my build as well! with a nice undervolt it doesn't burn the cables out of the wall either.
In regards to frame gen, in the mean time there is also Lossless scaling that is a pretty handy program on steam for any user frankly. Even more compatible than fsr.
Minor detail, on the Helldiver 2 section the GPU name states RX 6900XT, maybe you just forgot to change the label or is from another footage, great video though.
i was looking for this comment. It is an AMD card, judging by the clock speed: the 3080 Ti doesn't reach those 2.3/2.4GHz, so it most definitely was an AMD GPU there
Awesome video my man! That's a lot of power usage for that. I'm at 300+ watts of power usage with my 7900 XT, but I'm running at 4K ultra for everything.
Hello there :). Just wanted to tell you that in the Helldivers 2 bit there was an error. In the top left corner, MSI Afterburner states RX 6900 XT, not 3080 Ti. Great video tho. :)
Got an EVGA 3080ti, and aside from the amazing native performance as highlighted here and DLSS, I also recently use Lossless Scaling's frame gen (x2 to x4) which is insane! No upgrade needed for a long long time.
I acquired a refurb Dell/Alienware OEM 3080 Ti for $380 on Amazon 6 months ago. It replaced my 2060 Super, and I've been quite impressed by it at 1440p. Cyberpunk is finally a game I can run at 100+. About to tear it apart and apply some new thermal pads
I have the same model 3080ti, shunt modded and on liquid metal. I've pushed 500 watts through it regularly and sometimes allowed 600 for benchmarks, and EVGA's card has taken it like a champ. Obviously not recommended for normal users (or almost anyone else, for that matter), but it's a testament to the robustness of EVGA's design. Keep it cool, and the card is a beast.
The 40 series was only Nvidia recalibrating their strategy, and so far they've had zero problems selling products, which means the 50 series is only going to double down on what we've already seen. On an earnings call it doesn't matter whether the buyer is an artificial intelligence enthusiast, a scalper, or a typical consumer: who purchases the product is irrelevant when the forward trajectory of the company is being decided in a PowerPoint presentation.
Your production quality never fails to impress me... the first time I found your channel I thought you had at least 2 million subs... hope someday you reach that!!
Isn't this right though: after a couple of years a 4K card turns into a 1440p card. Now the question is how long it can stay at 1440p. How often should gamers upgrade, in your opinion?
@@IcebergTech no, I meant in the comments. Like when I was trying to explain the 3080 Ti 20GB, it kept removing the comment when there was nothing explicit. Does it just not like it if it's too long?
@@Hamborger-wd5jg Ah I see. No, but sometimes the auto moderation gets a bit overzealous. I try to go through the "held for review" section every couple of hours after a video goes live just to make sure there's no false positives.
Still very happy with my 3080 Ti FE. I have two undervolt presets that I switch between: #1 is performance parity with stock but at 85% (300W) power usage, and #2 is a 5-8% performance hit but 75% (~250W) power usage. #2 is my go-to. Based on GPU prices in 2023-2024 and rumored future prices, I don't anticipate buying a new GPU any time soon. Thanks for the 3080 Ti review!
I have the EVGA RTX 3080 12GB XC3 Hybrid, and it doesn't let me down even after moving to a 32:9 (5120 x 1440) monitor. It still flows through everything, even more so now that I moved from an R9 3900X to an R7 7800X3D.
This beast with its power consumption (I had the XC3 12GB non-Ti model) made my room fans circulate and blow hot air on me when I accidentally left it on one night. Absolute monster but I couldn’t keep it so I had to side-grade to a 4070. Wish EVGA still made them.. 😢
@@reeveho1384 At the time, 100%. But now the 6900 XT can be considered essentially the same card from a better bin. It uses as much power but with 10% better performance; not enough to count as an upgrade, but enough to put it on par with 6800 XT efficiency (when you push them). The price delta can be minimal these days, so it's definitely worth a 10% higher price if you can get that, or up to €50. My favorite of that gen is the 6800. It offends nobody. It's even on par with RDNA 3 on efficiency.
A few weeks ago I debated between a 3080 12GB and a 4070 Super. Picked up a one-owner, very clean 3080 12GB for a 3070 FE + $140 cash. After a few games I immediately sold it because the power draw and heat generated were INSANE. You could probably toast some bread on my side glass panel. The space heater joke was very apparent. Very happy with the efficiency and performance of the 4070 Super for just $100 more
New to this kinda thing, wondering if I can get some advice. I have an i7 10th gen with 16GB RAM. Will I be able to run the 3080 Ti on my PC? Or are there further requirements? Any help appreciated
There’s a few practical things to bear in mind, such as whether you have a power supply with enough wattage and PCIE connectors to support it, and if your case has enough room. Aside from that, nope, you’re good to go 👍
Nice & Timely Video, I bought a EVGA 3070Ti FTW3 Ultra in January 2022 from EVGA for $830 US (Ouch but good price THEN.) Still is fine for my 5800X3D PC since I play mostly RTS Games at 1440P. I have a Radeon RX7900XT 20GB-OC ($700 US) & 5900X PC for production & 4K Fun. Both PC's use 64GB-3600CL16 RAM & Dual 2TB Gen.4 NVME SSDs.
1:31 Accusations? They WERE undoubtedly taking advantage of the situation and their customers. More than just egregious. How any defended this I don't understand. It's the same people who defend nvidia's current 8/12gb vram offerings.
I have the 3080 FE 10GB and I'm still playing everything I download at 1440p high settings. Sure, I may have to use DLSS Quality and "optimized settings" in some games, but I get 60+ fps 99% of the time. Amazing value, as I sold my 5700 XT to a miner in 2021 for $1100 USD and was able to upgrade to the 3080 essentially for free with the proceeds.
I can attest to the effectiveness of FSR FG mods on the RTX 3080 Ti. Previously I just couldn't run Alan Wake 2 or Hellblade 2 with path tracing, but with FSR 3 FG mods I was able to play at 1440p output resolution, upscaled by the TV to 4K (with DLSS Balanced, obviously), at 120 fps. It looked sharp (not 4K-like, but definitely like good 1440p on a 4K panel) and felt smooth and consistent with path tracing. It needed some settings optimization and all the latest patches, but it worked very well. Honestly, I didn't expect to ever play path tracing like that on this card, but lo and behold, it runs and it runs well!
my current pc that i built in 2022 has a 3080ti and a 5900x and it's never let me down. solid upgrade from my old pc with an i5-3350p and gtx 1050ti (previously upgraded from a 650ti), i see this one lasting me another 10 years.
I have a reference 3080 Ti paired with a Ryzen 9 5900X and 4x8GB 3600MHz RAM. In Starfield I was getting up to 90fps in the New Atlantis area that you tested. I am more than happy to upload a small test clip to my YT channel that you can use for data collection or as a reference point, if you do decide to look into why Starfield seemed to be capped.
Honestly, I might need to get that processor. Right now I have an R5 5500 OC'd to 4.54GHz with a 3080 Ti FE, and it's a good pair, but I feel like I'm missing out on the 1% and 0.1% lows. I'm on a B450 MSI Tomahawk Max with 16GB (2x8GB) OC'd to 4000MHz
i had an rx 5600xt 6gb and one day i noticed that i had a bunch of vram chips, so i soldered them on, making it 16gb with a 256-bit bus instead of 192-bit. (you can break your card if you don't know how to solder, so i recommend practicing first on dead gpus)
i also did 24gb at 192-bit. i didn't have enough chips to populate the other 2, otherwise you can have 32gb at 256-bit. i guess you can also do it for the 5700 and 5700xt, up to 32gb.
I'd love to see the difference between base 6GB model and 16GB one. probably breathes a bit more comfortably, but not much raw horsepower gained. if judging by what 7600 and 7600 XT are like.
@@inkredebilchina9699 well actually it improves by like 30%, because the bandwidth is increased, and a 7600 is not all that much faster either. You can turn things down while keeping texture and shadow quality at ultra; those 2 alone are the most important anyway. Lower the other settings and you have stutter-free gameplay.
@@fdgdfgdfgdfg3811 30%? that's actually a really awesome performance uplift and waaaaaay more than the XT vs non-XT models of the 7600, which are like 10 percent-ish for double the vram.
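The ~30% figure discussed in this thread tracks the bandwidth math: at a fixed per-pin data rate, memory bandwidth scales linearly with bus width. A rough sketch — the 14 Gbps GDDR6 speed here is an assumption for illustration, as 5600 XT boards run anywhere from 12 to 14 Gbps depending on the VBIOS:

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    # GB/s = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps
    return bus_bits / 8 * gbps_per_pin

stock = bandwidth_gb_s(192, 14)   # 192-bit bus, as shipped
modded = bandwidth_gb_s(256, 14)  # 256-bit bus after populating the extra channels
uplift = modded / stock - 1       # fractional bandwidth gain

print(f"{stock:.0f} GB/s -> {modded:.0f} GB/s (+{uplift:.0%})")
```

Widening 192-bit to 256-bit is a 33% bandwidth gain on paper, so a ~30% in-game uplift is plausible for bandwidth-bound workloads, though real games rarely scale quite that cleanly.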
WHY ARE WE TALKING ABOUT RETIRING 30 SERIES CARDS WHEN PEOPLE HAVEN'T EVEN STOPPED USING THEIR 10 SERIES!
When my wife needs my tower for work (with the 3080 Ti), I'm still playing on my old G752VS laptop with a GTX 1070 :D The 10 series is legendary
I got my 1080 ti in my spare PC and my 2080 ti was gifted to my nephew while I got a 4090 in my main PC.
WHY ARE WE TALKING ABOUT RETIRING 10 SERIES CARDS WHEN PEOPLE HAVEN'T EVEN STOPPED USING THEIR 7 SERIES!
@@pf100andahalf I know people who still play games on their 900 series, as it is still supported
1060 gang
3080ti lowkey kinda underrated, it should have had at least 20gb of vram though
Back then the Ti should've had 16 gigs and the 3080 12 gigs; the rest should've been 10/12 gigs, just with core cuts etc.
16gb wouldve been more than enough
then it would be a 3090
And that's why you buy a 3090 for $50 more on the local market. Just like when it was brand new, even now it serves only to make 3090s look good.
Actually....
It did, there were a few 3080 Ti 20GBs made. TechPowerUp has it on the website, but it wasn't released commercially at wide scale, I don't think
Proud owner of EVGA 3080Ti FTW Ultra here. It kicks ass. At least for my 1440p gaming on all ultras, with occasional DLSS for my 144Hz monitor.
Dang, nice, I got the Strix OC, but would have loved a EVGA card...
I'm usually well in excess of 144 fps ultra, everything at max except for max SSAA. 16x Anisotropy, TAA, Ultra settings and 4 x SSAA can be too much workload. At that point you're better off just running DLSS to reduce jagged edges.
Same here
@@AndyViant I use DLSS at 4K quality or balanced whenever I have the chance, unless DLSS is bugged in the game, also always replacing the DLL file with the latest version of DLSS is essential
Own one as well! Got myself one for 410 euros, which is a steal, considering some 3080 Tis are still going for 700 euros (secondhand).
I'm still playing on my 3080 Ti at 4K quality and high FPS on a 4K OLED monitor, and the card is an absolute beast.
3080 Ti can do 4K DLSS Quality in almost any game at 60+ FPS (minimum high settings) and will still be able to do Performance 4K for the next 3 years with no issues. 12GB of VRAM is enough, considering you are rendering at 1440p and 1080p.
Me as well
I planned to retire my GTX 1080 with the 3080 Ti, but inflation and stupid prices got me to a 4070 with only 200W power draw.
You can undervolt it and lower the power draw by 20% without losing any performance. I got my 4070ti down from 280w to 190w and only lost 5% performance, which I gained most of it back by ramping up my memory clock speed.
The RTX 4070 is basically a 3080 Ti for the most part, but with less power draw. I would say that's a win-win in your situation
@@unclexbox85 Emmm no, it's weaker than a 3080 and a 6800 XT
I finally replaced my 1080ti with a used evga 3080ti a couple of months ago. Paid half what I paid for my 1080ti back in the day, and hope to get just as many years out of it
@@hovdeep I just got one to replace my 2070 Super; at 1440p it wasn't getting the FPS I wanted over the years
Was about to get a 3080ti to replace my broken 2080, but what held me back was the power consumption. Especially as a power user, I usually game for more than 5 hours a day. Went with a 4070 super and still got a similar, if not better performance with really good power efficiency
A lot of people just skip over power consumption, sh*t is expensive these days, running more than 5h a day, yeah I'd be looking at those numbers as well. Yikes! Congrats on the 4070, still rocking my 3060 12g, but I upgraded that 2 years ago from an RX580 4gigs and upgraded that from a GTX 460 2 years prior which I've been using for nearly a decade. lol. Gaming wasn't a top priority as I focused on photography and that gtx 460 served me quite well since 2010.
Same, I got a 7900GRE instead though, it runs 265-270W peak board power, which was about 30W more than 4070S AIB models, but I wanted the 16GB of VRAM so I could keep it for a while.
Just undervolt
@@KyleRuggles Really that expensive? I'm paying 5 cents per kilowatt-hour, and I can't say that's the lowest price right now; usually it is even cheaper. How much do you pay for electricity?
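To put numbers on the electricity question, yearly cost is just watts × hours × price per kWh. A back-of-the-envelope sketch: the 5 h/day and $0.05/kWh figures come from this thread, while the 350 W load and the $0.30/kWh high-tariff case are hypothetical values for comparison:

```python
def yearly_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Estimated yearly electricity cost of a GPU at a steady gaming load."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cheap = yearly_cost(350, 5, 0.05)   # ~5 cents/kWh, as quoted above
pricey = yearly_cost(350, 5, 0.30)  # hypothetical high-tariff region

print(f"cheap grid: ${cheap:.0f}/yr, expensive grid: ${pricey:.0f}/yr")
```

At 5 cents/kWh a 350 W card is roughly $32/year even at 5 h/day, so power draw barely matters; at 30 cents it's closer to $190/year, which is where the "space heater" cards start costing real money.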
@@user-propositionjoe I got the PowerColor Hellhound GRE, great cooler, but it comes running 285-290W peak board power. I got it to 265-270W with an 80mV undervolt on the core, which gave me an 8% performance increase in synthetic benchmarks for 20W less!! AMD cards are crazy. 1440p performance is fantastic.
The RTX 3080 and the RX 6800 XT / 6900 XT are the best cards you could get right now. Amazing performance for relatively small money on the used market.
Used 6800 XT is such a goated purchase
@@stefannita3439 Absolutely. Picked one up last month for just 360 Euros. It doesn't get cheaper than that for so much performance.
10gb vram is crap, 6800xt is the one to get.
@@silverwerewolf975 It's alright depending on what you want to do. For 1080p 144+ Hz Monitors it's still pretty good. But yeah if they cost around the same I would always go for the 6800 XT
I also got a FTW3 3080ti for $500, including an ekwb block with active backplate right before the 7800xt released. Still my daily driver, with no issues at 4k 120hz. Granted, I hardly use it and when I do, it's to play Hades 2, Crab Champions, or Talos Principle 2.
Same here- but I plan to spray my backplate white 😊
For all the crazy stuff that can happen in Crab Champions, it's a remarkably optimized game, it never dips below triple digits on island 200+
@scruffles87 I find there is a rough patch on long runs before the projectile speed is high enough for it to stop rendering most of the particle effects. I often see dips, sometimes down to 50ish fps between islands 60 to 90 on a good run. Then there is the freeze frames when you autoloot 100s of chests with 100s of items per chest XD
I've also got a ftw3 ultra 3080 ti, its a really nice card
Which has better performance for you, the 3080 Ti or the 7800 XT?
we evolving iceberg we are evolving
Brought this card for my Lenovo P520 Workstation and works brilliant. Only have a 60Hz TV so playing in 4K is rarely an issue on very high settings. Great review
I’m not really a gamer or a pc enthusiast anymore but the same principle can be applied to many aspects of life and tech buying in general. I’m a graphic designer and I’m using my base M1 Mac still as it’s plenty fast for all my needs and a new m3 Machine is very expensive and won’t increase my productivity by meaningful amount. I’m also using an iPhone 13 mini as my main phone because I don’t need or want to spend $1k on a newer phone and the mini does everything I need it to do. If what you have works, be content and only buy when you have a need.
I picked up an RTX 3090 Ti FE back in August of 2023 when, in France, the RTX 4070 Ti was selling for about the same 800€. At the time I felt like the 24GB of VRAM buffer would last a lot longer than the 10-12GB of the 3080/Ti and 4070/Ti, and it seems as though I was right. My card is currently watercooled and undervolted: runs at ~320W under load (managed to shave off ~100-120W of power) and I have even overclocked it with the undervolt (2070Mhz @940mV when it was running at 2025Mhz @1.070V at stock).
As the RTX 3080 Ti is not very far from the 3090/Ti in terms of performance, I think it fair to say that, after watching the video, these cards still have a lot more service years left than NVIDIA would like to admit. I feel like the RTX 3000 series will be the second best architecture and will likely age the same as Pascal. We'll see!
superb content as always! never stop making videos dude
I'm still sitting on a GiggleByte 3090 still with a 5800x3d. Not really planning on upgrading anytime soon
If you want the new monitors 360 hz 1440p you need at least a 4080 super
@@hollandiaTV Can you name a single modern title that can run at 360fps at 1440p with decent graphics settings? This question includes even a 4090 with the highest-end processor.
A-tier GPU. Don't upgrade.
Nobody asked that@@hollandiaTV
@@hollandiaTV most engines cap at 200, so it depends solely on the game. Some games also have hard caps or bad optimization. As I said, you pay primarily for higher settings at the same FPS.
I upgraded from GTX 1080 to RTX 3080. It's been amazing card and plays all my games at 1080p as it's bit of overkill GPU for that resolution, but for VR games it's faster than the GTX 1080 was, obviously.
I jumped to a 4070 from a 1070 due to getting a 4k monitor. Haven’t tried VR yet but I’m sure it will chew through Vr because my 1070 could.
This makes me genuinely happy to see since I literally just upgraded from the 1080 to the 3080 back in March, just found it at a stupid good price 😊
@@DeathMonkeys Got a 4070 Ti and it runs VR excellently; most games can reach the 120Hz of the headset's displays with no issues. Been playing Half-Life: Alyx for the first time.
People kept telling me that it was overkill for 1080p but with how shit optimization is these days I have felt no need to get a better monitor
@@OleNesie once you go 1440p you can never go back as it's just so much sharper and more detailed so stick with your 1080p for as long as possible 😂
You should try undervolting the card it makes a drastic change in power draw
I bought my EVGA FTW3 3080Ti for $600 last year. This card is really overkill for the games I play (and I don’t play often.) The next triple A game I’m even interested in is Wolverine, which probably won’t hit the streets before 2026. By then, I will need something with at least 20gb of vram. So I figure that I will hold on to the 3080 Ti for a couple more years.
I'm sure it would be console exclusive for at least 2 years
I also have an EVGA FTW3 3080 Ti; it's a power-hungry monster. Had to switch my 850W PSU for a 1000W to avoid shutdowns :-D.
Wondering if you also have issues with high temperatures; my card reaches nearly 93 deg on the hotspot and 88 deg on the average chip without overclock.
Had to undervolt to stop this monster melting itself.
Bought an EVGA 3090 Ti FTW3 for $700 in late December 2023 as well.
I'm good for a very long time with this total overkill GPU.
Great value as long as you get a good one on the used market. Never buy one that's been opened up before if it still has a warranty.
@@fmertin87 Something is very wrong with your setup to get those temps.
Either not enough case airflow, GPU fans not set correctly, or the GPU was modified with an XOC VBIOS.
65C should always be the target goal for temps, and there is no need to undervolt unless there are problems elsewhere.
@@fmertin87 you should undervolt your GPU, or maybe it's time to change the thermal paste
Just a note, Helldivers 2 does NOT support DLSS or FSR, it actually just uses bog standard TAAU upscaling.
I love my RX 6700 XT. That was the best purchase I ever made. 1440p high settings in almost every game I’ve ever tried to play. It truly is one of the best budget cards ever made. That being said, if NVIDIA had good pricing for once, I would buy a 4080 super.
Iceberg, I left a comment on a video a while ago about how you needed a de-esser on your audio recording. And today watching this video it sounds so much better. Way less sharpness on the ears. Thank you!!!
I've got the EVGA RTX 3080 and it plays a treat at 4K 60fps in most titles. I don't need to upgrade to a 40 series card; I'm happy with what I have now.
Would a 3080 Ti be a good buy at $300?
@@cliffordduhh45 From where, for that price?
16:47 Senua wasn't sure what to click on next....
6900 XT footage for Helldivers 2, oops!
I hope you have a really well ventilated room for that 3080 Ti, up to 400W power draw is crazy!
Oops, my bad 😂 that was supposed to be for the comparison video…
Remember when Vega was slammed for using 40 more watts than the 1080. Now... silence. 😂
I was legitimately looking to get one of these for £500, in the end I found a 4070 super going for £520 on ebay. Got the 4000 series card as I hear power consumption has been quite improved. Shame about having to use the absolute dogwater 12pin connector tho.
Thank you for testing my GPU❤. I also have a 3080 Ti FTW3!
Starship Troopers is 100% VHS vibe, it just adds to the experience!
I'm currently running a 2080 Ti that I've been using for 5 years. There's a lot of games I have to run at lower settings now but it still runs everything at 1440p and when I want to play in 4k on my tv I can use DLSS.
Acquired a 3080ti FE earlier this year through some shenanigans. Been so awesome
already 3 years ago?? wow
I have a Suprim 3070 Ti and on full load it pulls around 300 watts. I undervolted it and now it pulls about 220-ish with higher clocks. So if you have a 3080 Ti, or you want to buy one, you don't have to worry, because you can undervolt the 3080 Ti to run at least 100 watts lower at the same speeds. Say you drop the clocks by around 100MHz as well, I think you'll be looking at 260-270ish, which is not bad.
Lol, same exact story with my Zotac Trinity 3070 Ti.
Yup I undervolted my 3080 to less than 200w with a core clock boost
Yeah, my 3080 Ti without an undervolt pulls 350-400W, vs my undervolt which pulls 250-280W ish at 1800 core clock, and the performance difference is like 140FPS without vs 134FPS with the undervolt.
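The undervolting figures quoted in these comments can be sanity-checked as performance-per-watt; here's a quick sketch using the numbers above as illustrative inputs (the midpoints of the reported ranges are my assumption, not measured values):

```python
def fps_per_watt(fps, watts):
    """Efficiency metric: frames rendered per watt of board power."""
    return fps / watts

# Midpoints of the ranges reported above (illustrative, not measured here)
stock_eff = fps_per_watt(140, 375)   # ~350-400 W, no undervolt
uv_eff = fps_per_watt(134, 265)      # ~250-280 W, undervolted

gain_pct = (uv_eff / stock_eff - 1) * 100
print(f"stock: {stock_eff:.3f} fps/W, undervolted: {uv_eff:.3f} fps/W (+{gain_pct:.0f}%)")
```

On these numbers the undervolt trades roughly 4% of framerate for about a 35% gain in efficiency, which is why so many commenters here consider it worthwhile.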
I was lucky enough to get one of these on launch day by camping out at BestBuy. Killer card, still handles anything I throw at it even at 4K with DLSS. I can't say I would ever use frame gen because of the impact on latency so I don't really feel compelled to upgrade at all, although AV1 encoding would be nice. I will be getting a 5080 or 5090, but if I weren't in the position to do so I would be totally content rocking my 3080 Ti for a few more years.
Is it already 4 years? 😮
Still rocking a base 3080 paired with an i9 11900KF and 64GB RAM on a 49" screen. I run a lot of games full screen at high resolution. It's more than enough for almost everything on high settings with DLSS on. So I think this GPU is still good to go for a few more years.
It is good to see you reviewing older high-end cards like the 3080 Ti, but also old monsters like the 1080 Ti. It's good to see a wide variety in the content. Also, this channel is one of the main reasons why I have the 1080 Ti, and I'm happy with it.
As a 4070 owner, that card was pretty underrated. I do wish it had 20GB of VRAM, but it's still not bad.
I bought my EVGA RTX 3080 Ti at launch, direct from EVGA. Still rocking it. Waiting for the 5000 series before upgrading. I do game at 4K. Most games I can run at 4K60 with RT using DLSS Balanced or Performance. Performance does not look bad at 4K. Some games, though, just kill my card, like Black Myth. I have a 5800X for my CPU. I hope NVIDIA releases a 5080 Ti. The 3080 Ti is basically a 3090 with less VRAM. The 3080 Ti is gimped by its lack of VRAM.
Thank you for posting this video 🎥👍🥤🍿. You have a new subscriber 🎉. Keep up the great work 🎉.
Helldivers 2: your stats are showing the 6900 XT. Not sure if it's a glitch, or if you accidentally grabbed the wrong footage, but just wanted to point that out.
Yeah, wrong footage. Spoiler for the comparison video, I guess 🤦♂️
I bought an Asus ROG Strix 3080 10GB for $350 a couple of months ago. It should hold up for 1440p for a while, won't it? I'm loving it right now.
I have one of these. The VRAM makes it work really well on everything. Mine has 16GB though.
Huh? How?
@@juanpabloabriola7365 OP probably has the goated rx 6800 xt (16gb) instead
Iceberg Tech: I've had the "pleasure" of deep-dive testing a Manli RTX 3080 Ti 12GB and a Palit RTX 3080 Ti 12GB recently. The first version's continuous power consumption in modern games was 400 watts, with peaks up to 441 watts, which proved simply too much for my trusty Corsair RM850X via the only two separate PCIe 8-pin cables it needs. My RM850X can deliver a maximum continuous 354 watts (2 × 8-pin + 70 watts from the PCIe x16 slot). It can deliver 12 amps per 8-pin cable (=142 watts), whereas the 3080 Ti needs 17 amps per 8-pin connector (=198 watts).
The Palit version was somewhat more conservative with its power consumption, and I was able to run it for a few days to do all of the deep-dive tests. Compared to my AMD RX 6900 XT 16GB, the 3080 Ti is noticeably slower and eats around 100 watts more continuous power.
All of my tests were done on my Ryzen 7 5800X3D, which is currently the second-fastest gaming CPU on the market.
I was disappointed in the RTX 3080 Ti 12GB because in reality it's a direct competitor to the RX 6800 XT 16GB at best, while eating around 140 watts more power. The 3080 Ti performs too weakly for the power it needs to stay awake.
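The amperage-to-wattage arithmetic in the comment above follows from P = I × V on the +12 V rail; here's a rough sketch of that calculation (the quoted 142 W / 198 W figures sit slightly below the nominal 12 V numbers, presumably due to derating; the 70 W slot figure is the commenter's):

```python
RAIL_VOLTS = 12.0  # PCIe auxiliary power is delivered on the +12 V rail

def cable_watts(amps, volts=RAIL_VOLTS):
    """Continuous power a cable carries at a given current rating: P = I * V."""
    return amps * volts

supply_per_8pin = cable_watts(12)   # 12 A rating -> 144 W (quoted above as ~142 W)
demand_per_8pin = cable_watts(17)   # 17 A draw   -> 204 W (quoted above as ~198 W)
slot_watts = 70                     # PCIe x16 slot contribution, as quoted

total_supply = 2 * supply_per_8pin + slot_watts   # what this PSU config can feed
total_demand = 2 * demand_per_8pin + slot_watts   # what the card wants
shortfall = total_demand - total_supply           # the gap the commenter ran into
```

The point of the sketch is just that two 8-pin cables rated at 12 A each plus the slot fall well short of a ~400 W continuous draw, which matches the commenter's experience.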
3090 ftw3 here, love this card. took it apart and painted it white to match my build as well! with a nice undervolt it doesn't burn the cables out of the wall either.
Regarding frame gen, in the meantime there is also Lossless Scaling, which is a pretty handy program on Steam for any user, frankly. It's even more compatible than FSR.
Why would a generation old flagship retire?
The 3080 Ti is NOT a flagship lol, wtf are you on about
Minor detail: in the Helldivers 2 section the GPU name states RX 6900 XT. Maybe you just forgot to change the label, or it's from other footage. Great video though.
I was looking for this comment. It's an AMD card judging by the clock speed; the 3080 Ti doesn't reach those 2.3/2.4GHz, so it most definitely was an AMD GPU there.
Awesome video, my man! That's a lot of power usage. I'm at 300+ watts of power usage with my 7900 XT, but I'm running at 4K ultra for everything.
So glad we have Hellblade 2 for testing now 😊🙌
The RTX 3080 Ti was an RTX 3090 in disguise with half the VRAM. It's a shame that people hated it only because of the pricing set by NVIDIA.
I have a 3090. This will be my 1080 Ti card; I am going to use it for 5 years or more, easily.
Hello there :).
Just wanted to tell you that there was an error in the Helldivers 2 bit. In the top left corner, MSI Afterburner states RX 6900 XT, not 3080 Ti. Great video tho. :)
Do we know if the 50 XT cards will make a return?
Yeah I had some intrusive thoughts about upgrading my 3080ti but right now I think there's not much sense due to the gains vs price difference yet.😊
Got an EVGA 3080ti, and aside from the amazing native performance as highlighted here and DLSS, I also recently use Lossless Scaling's frame gen (x2 to x4) which is insane! No upgrade needed for a long long time.
I acquired a refurb Dell/Alienware OEM 3080 Ti for $380 on Amazon 6 months ago. It replaced my 2060 Super and I've been quite impressed by it at 1440p. Cyberpunk is finally a game I can run at 100+. About to tear it apart and apply some new thermal pads.
I have the same model 3080ti, shunt modded and on liquid metal. I've pushed 500 watts through it regularly and sometimes allowed 600 for benchmarks, and EVGA's card has taken it like a champ. Obviously not recommended for normal users (or almost anyone else, for that matter), but it's a testament to the robustness of EVGA's design. Keep it cool, and the card is a beast.
7:18 I see what you did there😄
I keep calling that game different things, nobody ever seems to notice!
3060 12Gb in 2024?
RTX 4080 + 5800X3D here for a while, I hope... good vid mate
The 40 series was only Nvidia recalibrating their strategy, and so far they've had zero problems selling products, which means the 50 series is only going to double down on what we've already seen. During an earnings call, it doesn't matter whether the purchaser is an artificial intelligence enthusiast, a scalper, or a typical consumer; a sale is a sale when the company's forward trajectory is being decided in a PowerPoint presentation.
I own a 3080 Ti, but a 4K test would be nice, since that's what I'm currently pushing with mine.
Same for me. 4K 120Hz VRR, DLSS Q, and I adjust settings for around 90+ fps. I'm happy with all games so far.
@@tog2842 Only games that push me under 60 are Cyberpunk and Hellblade 2
How about a 6900 xt vs 3080 ti in 2024?
i just upgraded from a 1660 super to a regular 3080 and at 1080p it is just awesome
Your production quality never fails to impress me... the first time I found your channel I thought you had at least 2 million subs... hope someday you reach that!!
honestly the real MVP here is the 7500F - such a goated CPU for the $
just gotta say that this channel gives me a certain kind of joy that the others cant
I'm still on a 2080. I'm looking to upgrade, but idk, I might get that card if the price is right.
I think.. you are becoming the best in the biz… when I look for insight… it’s not the behemoths anymore, it’s you…well done!
Isn't this right, though? After a couple of years a 4K card turns into a 1440p card. Now the question is how long it can stay at 1440p.
How often should gamers upgrade in your opinion?
Any idea why the filters are so sensitive?
If you mean the censor bleep, that's not a filter. I voluntarily bleep rude words... most of the time, anyway.
@@IcebergTech no, I meant in the comments. Like when I was trying to explain the 3080 Ti 20GB, it kept removing the comment when there was nothing explicit. Does it just not like it if it's too long?
@@Hamborger-wd5jg Ah I see. No, but sometimes the auto moderation gets a bit overzealous. I try to go through the "held for review" section every couple of hours after a video goes live just to make sure there's no false positives.
@@IcebergTech ah okay
Still very happy with my 3080 Ti FE. I have two undervolt presets that I switch between: #1 is performance parity with stock but at 85% (300W) power usage, and #2 is a 5-8% performance hit but 75% (~250W) power usage. #2 is my go-to.
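For context, the percentages in this comment line up with the 3080 Ti FE's rated 350 W TGP; a quick sketch of that arithmetic (the 350 W figure is the published spec, the preset fractions are the commenter's):

```python
STOCK_TGP = 350  # rated board power of the RTX 3080 Ti Founders Edition, in watts

def preset_watts(power_fraction):
    """Board power when the power limit is set to a fraction of stock TGP."""
    return STOCK_TGP * power_fraction

preset_1 = preset_watts(0.85)  # 297.5 W, close to the "300 W" figure above
preset_2 = preset_watts(0.75)  # 262.5 W, near the "~250 W" figure above
```

So the two presets shave roughly 50 W and 90 W off stock, which is why the second one runs a few percent slower.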
Based on GPU prices in 2023-2024 and rumored future prices, i don't anticipate buying a new GPU any time soon.
Thanks for the 3080 ti review!
DLSS 3 vs FSR 3 vs Lossless Scaling frame gen?
I have the EVGA RTX 3080 12GB XC3 Hybrid, and it doesn't let me down even after moving to a 32:9 (5120 x 1440) monitor. It still flows through, and even more now that I moved from an R9 3900X to an R7 7800X3D.
Great video as always! Looking forward to the next one comparing to the 6900XT.
Still on my RTX 3080 Ti with a 49" 32:9 monitor, and have no trouble with her. Great performance all around!
Hell yeah my GPU!
Never mind, I have the laptop not the desktop card. :)
This beast with its power consumption (I had the XC3 12GB non-Ti model) made my room fans circulate and blow hot air on me when I accidentally left it on one night. Absolute monster but I couldn’t keep it so I had to side-grade to a 4070. Wish EVGA still made them.. 😢
Only Legends would know that this video was called "Reject Modernity"
Which would you rather use? This or the rx 6900xt?
I have a 3080 Ti. I'd trade it for a 6900 XT any day.
@@randomguydoes2901 tbh I think the 6800xt is way better value
@@reeveho1384 At the time, 100%. But now, the 6900 XT can be considered a same-performing, better bin. It uses as much power but with 10% better performance; not enough to be an upgrade, but enough to put it on par with 6800 efficiency (when you push them). The price delta can be minimal these days. It's definitely worth a 10% higher price if you can get that, or up to €50.
My favorite of that gen is the 6800. It offends nobody. It's even on par with RDNA 3 on efficiency.
The raw performance is similar, but the 6900 XT is more efficient and packs 16GB...
A few weeks ago I debated between a 3080 12GB and a 4070 Super. Picked up a one-owner, very clean 3080 12GB for my 3070 FE + $140 cash. After a few games I immediately sold it because the power draw and heat generated were INSANE. You could probably toast some bread on my side glass panel. The space heater joke was very apparent. Very happy with the efficiency and performance of the 4070 Super for just $100 more.
Another collab with randomgaminginhd would be lovely
New to this kind of thing, wondering if I can get some advice. I have an i7 10th gen with 16GB RAM. Will I be able to run the 3080 Ti on my PC? Or are there further requirements?
Any help appreciated
There’s a few practical things to bear in mind, such as whether you have a power supply with enough wattage and PCIE connectors to support it, and if your case has enough room. Aside from that, nope, you’re good to go 👍
Nice and timely video. I bought an EVGA 3070 Ti FTW3 Ultra in January 2022 from EVGA for $830 US (ouch, but a good price THEN). It's still fine for my 5800X3D PC, since I play mostly RTS games at 1440p. I have a Radeon RX 7900 XT 20GB OC ($700 US) and 5900X PC for production and 4K fun. Both PCs use 64GB-3600CL16 RAM and dual 2TB Gen 4 NVMe SSDs.
i think you were using 6900xt footage for helldivers 2
Still rocking this card after 2 years. I'll upgrade soon once 5000 series come out
your footage for HD2 says in top corner RX 6900 XT
1:31 Accusations? They WERE undoubtedly taking advantage of the situation and their customers. More than just egregious. How anyone defended this I don't understand.
It's the same people who defend nvidia's current 8/12gb vram offerings.
3080ti user here. I love this card, it's a beast
I have the 3080 FE 10GB and am still playing everything I download at 1440p high settings. Sure, I may have to use DLSS Quality and "optimized settings" in some games, but I get 60+ fps 99% of the time. Amazing value, as I sold my 5700 XT to a miner in 2021 for $1100 USD and was able to upgrade to the 3080 essentially for free with the proceeds.
3080 10GB still a very capable card for sure.
I've had this card since 2022. I'll keep it at least until RTX 50, and maybe even RTX 60.
This card or 7700 xt?
3080ti is miles better
We all need to stay warm in winter.
Also, speaking of which, I once had an EVGA GTX 780 Ti and that only had 3GB of VRAM, feelsbadman.
can you do vr benchmarks plz
I can attest to the effectiveness of FSR FG mods on the RTX 3080 Ti. Previously I just couldn't run Alan Wake 2 or Hellblade 2 with path tracing, but with FSR 3 FG mods I was able to play at 1440p output resolution, upscaled by the TV to 4K (with DLSS Balanced, obviously), at 120 fps. It looked sharp (not 4K-like, but definitely like good 1440p on a 4K panel) and felt smooth and consistent with path tracing. It needed some settings optimization and all the latest patches, but it worked very well. Honestly, I didn't expect to ever play with path tracing like that on this card, but lo and behold, it runs, and it runs well!
Texture streaming is what made your FT gameplay stutter
If EVGA wanted to give NVIDIA an even bigger “F*ck you”, they should have become one of AMD’s board partners lmaooo 😂😂😂
My current PC that I built in 2022 has a 3080 Ti and a 5900X, and it's never let me down. Solid upgrade from my old PC with an i5-3350P and GTX 1050 Ti (previously upgraded from a 650 Ti). I see this one lasting me another 10 years.
Would you please test the RTX 3080 Ti with a high-end CPU like the 7800X3D or equivalent? Please please pretty please
I have a reference 3080 Ti paired with a Ryzen 9 5900X and 4x8GB 3600MHz RAM. In Starfield I was getting up to 90fps in the New Atlantis area that you tested. I am more than happy to upload a small test clip to my YT channel that you can use for data collecting or as a reference point if you do decide to look into why Starfield seemed to be capped.
Honestly, I might need to get that processor. Right now I have an R5 5500 OC'd at 4.54GHz with a 3080 Ti FE, and it's a good pair, but I feel like I'm missing out on the 1% and 0.1% lows. I'm on a B450 MSI Tomahawk Max with 16GB (2x8GB) OC'd to 4000MHz.
I had an RX 5600 XT 6GB, and one day I noticed I had a bunch of VRAM chips, so I soldered them on, making it 16GB with a 256-bit bus instead of 192-bit. (You can break your card if you don't know how to solder, so I recommend practicing first on dead GPUs.)
I also did 24GB at 192-bit; I didn't have enough chips to populate the other two, otherwise you can have 32GB at 256-bit. I guess you can also do this for the 5700 and 5700 XT, up to 32GB.
I'd love to see the difference between base 6GB model and 16GB one. probably breathes a bit more comfortably, but not much raw horsepower gained. if judging by what 7600 and 7600 XT are like.
@@inkredebilchina9699 Well, actually it improves by like 30%, because the bandwidth is increased. A 7600 is not all that much faster either, but you can turn things down and keep texture and shadow quality at ultra (these two alone are the most important anyway), lower other settings, and have stutter-free gameplay.
@@fdgdfgdfgdfg3811 30%? That's actually a really awesome performance uplift, and waaaaaay more than the XT vs non-XT models of the 7600, which are like 10 percent-ish for double the VRAM.
@@inkredebilchina9699 Because the XT version is just a 16GB version of the non-XT, and the memory bandwidth is the same.
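The ~30% figure in this thread is consistent with the bus-width arithmetic: peak bandwidth scales with bus width at a fixed per-pin data rate. A small sketch, assuming the 5600 XT's launch-spec 12 Gbps GDDR6 on both configurations:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Assuming 12 Gbps GDDR6 (the RX 5600 XT's launch spec) in both cases
stock = bandwidth_gb_s(192, 12)    # 288 GB/s on the stock 192-bit bus
widened = bandwidth_gb_s(256, 12)  # 384 GB/s with the extra channels populated

uplift_pct = (widened / stock - 1) * 100  # ~33%, in line with the ~30% claimed above
```

This also explains the thread's point about the 7600 XT: doubling VRAM capacity on the same bus adds no bandwidth, which is why its uplift over the non-XT is much smaller.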
For Starfield, check if the game is limited by a single CPU core. The game seems to depend heavily on single-thread performance.