I love my RX 6700 XT. That was the best purchase I ever made. 1440p high settings in almost every game I’ve ever tried to play. It truly is one of the best budget cards ever made. That being said, if NVIDIA had good pricing for once, I would buy a 4080 super.
I have a Suprim 3070 Ti and on full load it pulls around 300 watts. I undervolted it and now it pulls about 220 ish with higher clocks. So if you have a 3080 Ti, or you want to buy one, you don't have to worry, because you can undervolt the 3080 Ti to run at least 100 watts lower at the same speeds. Say you drop the clocks by around 100MHz as well, I think you'll be looking at 260-270ish, which is not bad.
Yeah my 3080ti without an undervolt pulls 350-400w vs my undervolt which pulls 250-280w ish at 1800 core clock and the performance difference is like, 140FPS w/o or 134FPS with the undervolt.
Iceberg, I left a comment on a video a while ago about how you needed a de-esser on your audio recording. And today watching this video it sounds so much better. Way less sharpness on the ears. Thank you!!!
I was lucky enough to get one of these on launch day by camping out at BestBuy. Killer card, still handles anything I throw at it even at 4K with DLSS. I can't say I would ever use frame gen because of the impact on latency so I don't really feel compelled to upgrade at all, although AV1 encoding would be nice. I will be getting a 5080 or 5090, but if I weren't in the position to do so I would be totally content rocking my 3080 Ti for a few more years.
I'm currently running a 2080 Ti that I've been using for 5 years. There's a lot of games I have to run at lower settings now but it still runs everything at 1440p and when I want to play in 4k on my tv I can use DLSS.
Iceberg Tech: I've had the "pleasure" of deep-dive testing a Manli RTX 3080 Ti 12GB and a Palit RTX 3080 Ti 12GB recently. The first version's continuous power consumption in modern games was 400 watts, with peaks up to 441 watts, which proved simply too much for my trusty Corsair RM850X via the two separate PCI-E 8-pin cables it needs. My RM850X can deliver a maximum continuous 354 watts (2 x 8-pin + 70 watts from the PCI-E x16 slot). It can deliver 12 amps per 8-pin cable (=142 watts), whereas the 3080 Ti needs 17 amps per 8-pin connector (=198 watts). The Palit version was somewhat more conservative with power consumption, and I was able to run it for a few days to do all of the deep-dive tests. Compared to my AMD RX 6900 XT 16GB, the 3080 Ti is noticeably slower and eats around 100 watts more continuous power. All of my tests were done on my Ryzen 7 5800X3D, which is currently the 2nd fastest gaming CPU on the market. I was disappointed in the RTX 3080 Ti 12GB because in reality it's a direct competitor to the RX 6800 XT 16GB at best, while eating around 140 watts more power. The 3080 Ti performs too weakly for the power it needs to stay awake.
In regards to frame gen, in the meantime there is also Lossless Scaling, which is a pretty handy program on Steam for any user, frankly. Even more compatible than FSR.
I can attest to the effectiveness of FSR FG mods on the RTX 3080 Ti. Previously I just couldn't run Alan Wake 2 or Hellblade 2 with path tracing, but with FSR 3 FG mods I was able to play at 1440p output resolution upscaled by the TV to 4K (with DLSS Balanced obviously) at 120 fps. It looked sharp (not 4K-like, but definitely like good 1440p on a 4K panel) and felt smooth and consistent with path tracing. It needed some settings optimization and all the latest patches, but it worked very well. Honestly I didn't expect to ever play with path tracing like that on this card, but lo and behold, it runs and it runs well!
Still very happy with my 3080 Ti FE. I have two undervolt presets that I switch between: #1 is performance parity with stock but at 85% (300w) power usage, and #2 is a 5-8% performance hit but 75% (~250w) power usage. #2 is my go-to. Based on GPU prices in 2023-2024 and rumored future prices, I don't anticipate buying a new GPU any time soon. Thanks for the 3080 Ti review!
Minor detail: in the Helldivers 2 section the GPU name states RX 6900 XT. Maybe you just forgot to change the label, or it's from other footage. Great video though.
I was looking for this comment. It is an AMD card judging by the clock speed; the 3080 Ti doesn't reach those 2.3/2.4GHz, so it most definitely was an AMD GPU there.
Hello there :). Just wanted to tell you that there was an error in the Helldivers 2 bit. In the top left corner MSI Afterburner states RX 6900 XT, not 3080 Ti. Great video tho. :)
3090 ftw3 here, love this card. took it apart and painted it white to match my build as well! with a nice undervolt it doesn't burn the cables out of the wall either.
Helldivers 2: your stats are showing the 6900 XT. Not sure if it's a glitch, or if you accidentally grabbed the wrong footage, but just wanted to point that out.
Still rocking a base 3080 paired with an i9 11900KF and 64GB RAM on a 49" screen. I run a lot of games in full screen at high resolution. It's more than enough for almost everything on high settings with DLSS on. So I think this GPU is still good to go for a few more years.
I have the EVGA RTX 3080 12Gb XC3 Hybrid, and it doesn't let me down even after moving to a 32:9 (5120 x 1440p) monitor, it still flows thru and even more now that I moved from a R9 3900X to a R7 7800X3D.
Awesome video my man! That's a lot of power usage for that. I'm at 300+ watts of power usage with my 7900 XT, but I'm running at 4K ultra for everything.
Nice & timely video. I bought an EVGA 3070 Ti FTW3 Ultra in January 2022 from EVGA for $830 US (ouch, but a good price THEN). It's still fine for my 5800X3D PC, since I play mostly RTS games at 1440p. I have a Radeon RX 7900 XT 20GB OC ($700 US) & 5900X PC for production & 4K fun. Both PCs use 64GB 3600CL16 RAM and dual 2TB Gen4 NVMe SSDs.
1:31 Accusations? They WERE undoubtedly taking advantage of the situation and their customers. More than just egregious. How any defended this I don't understand. It's the same people who defend nvidia's current 8/12gb vram offerings.
I use frame gen with Lossless Scaling on my 3090 Ti and it works very well in every game; no need for DLSS 3 support. So I don't need a 4000 or 5000 series GPU. Maybe the 6000 series will be the time to consider upgrading.
I have the same model 3080ti, shunt modded and on liquid metal. I've pushed 500 watts through it regularly and sometimes allowed 600 for benchmarks, and EVGA's card has taken it like a champ. Obviously not recommended for normal users (or almost anyone else, for that matter), but it's a testament to the robustness of EVGA's design. Keep it cool, and the card is a beast.
People keep complaining about GPUs not being fast enough or the VRAM not being enough, but how about these game developers relying on DLSS and other upscaling for their games to run? They are releasing unoptimized trash just to get as much money as quickly as possible, so it's not exactly GPUs that are the issue all of the time.
It is good to see you reviewing older higher-end cards like the 3080 Ti, but also old monsters like the 1080 Ti. It is good to see a wide variety in the content. Also, this channel is one of the main reasons why I have the 1080 Ti, and I'm happy with it.
This beast with its power consumption (I had the XC3 12GB non-Ti model) made my room fans circulate and blow hot air on me when I accidentally left it on one night. Absolute monster but I couldn’t keep it so I had to side-grade to a 4070. Wish EVGA still made them.. 😢
Got my EVGA FTW3 second hand from a Micro Center; it didn't even come with a box, just a bag and the GPU. It performs fantastically well and I don't think I'll ever get rid of it now that EVGA closed its GPU department doors.
I have the 3080 FE 10GB and I'm still playing everything I download at 1440p high settings. Sure, I may have to use DLSS Quality and "optimized settings" in some games, but I get 60+ fps 99% of the time. Amazing value, as I sold my 5700 XT to a miner in 2021 for $1100 USD and was able to upgrade to the 3080 essentially for free with the proceeds.
I managed to wait out the 30 series price gouge and got the 4070ti early on. although still too expensive, I'll at least save in the energy costs compared to a 3090 lol
Isn't this right though? After a couple of years a 4K card turns into a 1440p card. Now the question is how long it can stay at 1440p. How often should gamers upgrade, in your opinion?
Your NVMe shouldn't be the reason for your 1% lows in Ratchet and Clank; the concern is the top speed of the PS5's SSD. Digital Foundry have used much lower speeds and seen no performance issues. Might be DirectStorage being broken again, a Nixxes special.
The 3080 10GB, released over 3.5 years ago, is still fine for 4K ultra and below, especially with DLSS. The next console generation is at least another 2-3 years away; until then these cards are fine, as the yardstick is not changing.
Your production quality never fails to impress me... the first time I found your channel I thought you had at least 2 million subs... hope someday you reach that!!
Maybe it's the YouTube compression, but I can't see any difference between Alan Wake II with and without RT. Other than the murdered frame rate, of course.
I have a reference 3080ti paired with a ryzen 9 5900x and 4x8gb 3600mhz ram. In Starfield I was getting up to 90fps in new Atlantis area that you tested. I am more than happy to upload a small test clip to my yt channel that you can use for a data collecting or a reference point if you do decide to look into why Starfield seemed to be capped.
Honestly, I might need to get that processor. Right now I have an R5 5500 OC'd at 4.54GHz with a 3080 Ti FE, and it's a good pair, but I feel like I'm missing out on the 1% and 0.1% lows in fps. I'm on an MSI B450 Tomahawk Max with 16GB (2x8GB) OC'd to 4000MHz.
I bought a Zotac 3080 Ti during the shortage and spent too much, but it's been working great for everything that I play. If I had waited a few months, though, I probably could have gotten a 4000 series or spent less than I did.
I'm still on my EVGA 2080 Ti FTW3 with an EK waterblock. Got it a month before the 30 series launch from a panicked seller at 500. Still holding out for 4070 Ti Super perf at 400 used.
I have a Strix 3080Ti and for what I do (vid editing) it's fine. I also have a 4070Ti Super for a future build. The reason I got it is it uses less electricity for better rendering performance. And being that it's nothing for that machine to run 12-14 hours a day sometimes the efficiency alone is a welcome thing.
Ahh, you do realize you should be undervolting the 3080 Ti, not leaving it running at stock like in this video? Around the same performance at 80-100 watts less, cooler and quieter.
@@Battleneter It's very quiet in its stock form, not loud like the MSI cards. The 3080 TGP is 350 watts vs. 284 for the Ti Super. Even if I do bring it down to that voltage level, given the Ti Super has 16GB of memory, a clock speed of 2640, and a 256-bit memory bus, it will probably edge out the 3080 Ti. But anyway, as stated, the 4070 is for a new build (actually starting next week), and I'm leaving the Ryzen 9 / 3080 Ti / 64GB DDR4 machine intact as a backup, just in case. 😎👍🏽
@@nicktayloriv310 Sure, an undervolted 3080 Ti is not going to match the efficiency of a 4070 Ti Super, but it certainly tames it. Yep, the 4070 Ti Super is about 10% faster in raster; it would not normally be worth upgrading to, but a new build is a different conversation.
Only downside is the 3080 Ti doesn't have 16GB of VRAM; if that existed, it would be even more kick-ass! I still have my 3080 Ti XC3 Ultra. It's not FTW, but I modded it with an EKWB water block, and it runs Forza Horizon 5 at a smooth 144fps and Resident Evil 4 Remake at 90~120fps. I won't upgrade at least until the 50 series comes out, or even the 60 series!
I had an RX 5600 XT 6GB, and one day I noticed that I had a bunch of VRAM chips, so I soldered them on, making it 16GB with a 256-bit bus instead of 192-bit. (You can break your card if you don't know how to solder, so I recommend practicing first on dead GPUs.)
I also did 24GB at 192-bit; I didn't have enough chips to populate the other two, or you could have 32GB at 256-bit. I guess you can also do this for the 5700 and 5700 XT, up to 32GB.
I'd love to see the difference between the base 6GB model and the 16GB one. It probably breathes a bit more comfortably, but not much raw horsepower gained, if judging by what the 7600 and 7600 XT are like.
@@inkredebilchina9699 Well, actually it improves by like 30% because the bandwidth is increased, and a 7600 is not all that much faster either. You can turn things down and keep texture and shadow quality at ultra; these two alone are the most important anyway. Lower the other settings and you have stutter-free gameplay.
@@fdgdfgdfgdfg3811 30%? that's actually a really awesome performance uplift and waaaaaay more than the xt vs non-xt models of 7600 which are like 10 percentish for double the vram.
Just got a 3090 on Monday. Bought it "used" from eBay, and it turned out to be brand new- peel ply wasn't even removed. I suspect it never got installed because it's a Hydro Copper, and sometimes people's ambitions exceed their motivation. I chose it over anything relatively affordable from the 40 series for the 24gb vram, and because 40 series is notoriously unreliable.
A few weeks ago I debated between a 3080 12GB and a 4070 Super. Picked up a one-owner, very clean 3080 12GB for a 3070 FE + $140 cash. After a few games I immediately sold it, because the power draw and heat generated were INSANE. You could probably toast some bread on my side glass panel. The space heater joke was very apparent. Very happy with the efficiency of the 4070 Super and its performance for just $100 more.
The 10 series was great, the 20 was a skip-it. The 30 series was great, the 40 is a skip-it. The 50 series will be great. The 4 year upgrade cycle is still the goat 🔥
the 20 series got better with age imo, especially the 2070 and 2070 super on the used market. a 2070 will perform identical to a 3060 outside vram limitations and can be found for like $160 used
This card was on my radar for a long time, but I bought a used 3090 from MSI for 600€. It has slightly better performance, 24GB of VRAM, and believe it or not, lower power consumption than the FTW3 3080 Ti. I have it paired with a 12900K, and I'm fine with the overall performance. Nice video buddy, see you again.
gotta love evga that's why I have a shelf dedicated to them in my collection haha, very happy with my 3090ti ftw3 and unsure which company ill even go with when upgrade time comes
I've been happy with my AMD 7800XT 16GB GPU in my latest 2024 build. My last build from 2019 was still using its original EVGA GTX 1660Ti GPU when I retired it recently. 😆
I went from a 2080 to a 4070 Ti. About the same power draw, big uplift in performance. And the cooler on the 4070 Ti is so overbuilt, I don't think I've ever seen the thing hit 70c lol.
Rocking a 3090 FTW3+ on a Ryzen 5 7600. It is incredible for the price I paid, around 450 bucks in my country (Argentina). I'm mentioning this because it is practically exactly the same as the 3080 Ti! By the way, I fucking love your videos!
I feel bad for people who bought, for example, an RTX 3090 Ti without getting DLSS frame gen from NVIDIA. Yes, the card has no optical flow accelerators, but in theory NVIDIA could use the tensor cores as a fallback, at the cost of some performance?
You try to act and sound smart, mate, but I'm sorry, it seems you knew absolutely nothing. The 30 series and 20 series do have an optical flow accelerator; it's just that they revamped it in Ada Lovelace. As for DLSS frame gen, it has absolutely zero correlation to tensor cores, unlike DLSS upscaling, which does use those tensor cores for FP16 inferencing.
3080ti lowkey kinda underrated, it should have had at least 20gb of vram though
Back then it should've been 16 gigs, with the 3080 at 12 gigs; the rest should've been 10/12 gigs, just with core cuts etc.
16gb wouldve been more than enough
then it would be a 3090
And that's why you buy a 3090 for $50 more on the local market. Just like when it was brand new, even now it serves only to make 3090s look good.
Actually....
It did; there were a few 3080 Ti 20GBs made. TechPowerUp has it on the website, but it wasn't released commercially at wide scale, I don't think.
Proud owner of EVGA 3080Ti FTW Ultra here. It kicks ass. At least for my 1440p gaming on all ultras, with occasional DLSS for my 144Hz monitor.
Dang, nice, I got the Strix OC, but would have loved a EVGA card...
I'm usually well in excess of 144 fps ultra, everything at max except for max SSAA. 16x Anisotropy, TAA, Ultra settings and 4 x SSAA can be too much workload. At that point you're better off just running DLSS to reduce jagged edges.
Same here
@@AndyViant I use DLSS at 4K quality or balanced whenever I have the chance, unless DLSS is bugged in the game, also always replacing the DLL file with the latest version of DLSS is essential
Own one as well! Got myself one for 410 euros, which is a steal, considering some 3080 Tis are still going for 700 euros (secondhand).
I planned to retire my gtx 1080 with the 3080ti but the inflation and stupid prices got me to 4070 with only 200W power draw.
You can undervolt it and lower the power draw by 20% without losing much performance. I got my 4070 Ti down from 280w to 190w and only lost 5% performance, most of which I gained back by ramping up my memory clock speed.
The RTX 4070 is basically a 3080 Ti for the most part, but with less power draw. I would say that's a win-win in your situation.
@@unclexbox85 Emmm no, it's weaker than a 3080 and a 6800 XT.
I finally replaced my 1080ti with a used evga 3080ti a couple of months ago. Paid half what I paid for my 1080ti back in the day, and hope to get just as many years out of it
@@hovdeep I just got one to replace my 2070 Super; at 1440p it wasn't getting the FPS I wanted over the years.
WHY ARE WE TALKING ABOUT RETIRING 30 SERIES CARDS WHEN PEOPLE HAVEN'T EVEN STOPPED USING THEIR 10 SERIES!
When my wife needs my tower (with the 3080 Ti) for work, I am still playing on my old G752VS laptop with a GTX 1070 :D The 10 series is legendary.
I got my 1080 ti in my spare PC and my 2080 ti was gifted to my nephew while I got a 4090 in my main PC.
Was about to get a 3080ti to replace my broken 2080, but what held me back was the power consumption. Especially as a power user, I usually game for more than 5 hours a day. Went with a 4070 super and still got a similar, if not better performance with really good power efficiency
A lot of people just skip over power consumption, sh*t is expensive these days, running more than 5h a day, yeah I'd be looking at those numbers as well. Yikes! Congrats on the 4070, still rocking my 3060 12g, but I upgraded that 2 years ago from an RX580 4gigs and upgraded that from a GTX 460 2 years prior which I've been using for nearly a decade. lol. Gaming wasn't a top priority as I focused on photography and that gtx 460 served me quite well since 2010.
Same, I got a 7900GRE instead though, it runs 265-270W peak board power, which was about 30W more than 4070S AIB models, but I wanted the 16GB of VRAM so I could keep it for a while.
Just undervolt
@@KyleRuggles Really, that expensive? I'm paying 5 cents per kilowatt-hour, and I can't say that's the lowest price right now; usually it's even cheaper. How much do you pay for electricity?
@@user-propositionjoe I got the powercolor Hellhound GRE, great cooler, but it comes running 285-290W peak board power, I got it to 265-270W with a 80mV undervolt on the core, which gave me 8% performance increase in synthetic benchmarks for 20W less!! AMD cards are crazy.
1440p performance is fantastic.
I'm still playing on my 3080 Ti at 4K quality and high FPS on a 4K OLED monitor, and the card is an absolute beast.
The 3080 Ti can do 4K DLSS Quality in almost any game at 60+ FPS (minimum high settings) and will still be able to do 4K Performance in the next 3 years with no issues. 12GB of VRAM is enough, considering you are rendering at 1440p and 1080p.
I also got a FTW3 3080ti for $500, including an ekwb block with active backplate right before the 7800xt released. Still my daily driver, with no issues at 4k 120hz. Granted, I hardly use it and when I do, it's to play Hades 2, Crab Champions, or Talos Principle 2.
Same here- but I plan to spray my backplate white 😊
For all the crazy stuff that can happen in Crab Champions, it's a remarkably optimized game, it never dips below triple digits on island 200+
@scruffles87 I find there is a rough patch on long runs before the projectile speed is high enough for it to stop rendering most of the particle effects. I often see dips, sometimes down to 50ish fps, between islands 60 to 90 on a good run. Then there are the freeze frames when you autoloot 100s of chests with 100s of items per chest XD
I've also got a ftw3 ultra 3080 ti, its a really nice card
Which has better performance for you, the 3080 Ti or the 7800 XT?
The RTX 3080 and the RX 6800 XT / 6900 XT are the best cards you could get right now. Amazing performance for relatively small money on the used market.
Used 6800 XT is such a goated purchase
@@stefannita3439 Absolutely. Picked one up last month for just 360 Euros. It doesn't get cheaper than that for so much performance.
10gb vram is crap, 6800xt is the one to get.
@@silverwerewolf975 It's alright depending on what you want to do. For 1080p 144+ Hz Monitors it's still pretty good. But yeah if they cost around the same I would always go for the 6800 XT
I’m not really a gamer or a pc enthusiast anymore but the same principle can be applied to many aspects of life and tech buying in general. I’m a graphic designer and I’m using my base M1 Mac still as it’s plenty fast for all my needs and a new m3 Machine is very expensive and won’t increase my productivity by meaningful amount. I’m also using an iPhone 13 mini as my main phone because I don’t need or want to spend $1k on a newer phone and the mini does everything I need it to do. If what you have works, be content and only buy when you have a need.
we evolving iceberg we are evolving
Bought this card for my Lenovo P520 workstation and it works brilliantly. I only have a 60Hz TV, so playing in 4K is rarely an issue on very high settings. Great review.
I'm still sitting on a GiggleByte 3090 still with a 5800x3d. Not really planning on upgrading anytime soon
If you want the new 360Hz 1440p monitors, you need at least a 4080 Super.
@@Topknowledge_187 Can you name a single modern title that can run at 360fps at 1440p with decent graphics settings? This question includes even a 4090 with the highest-end processor.
An A-tier GPU. Don't upgrade.
Nobody asked that@@Topknowledge_187
@@Topknowledge_187 Most engines cap at 200, so it depends solely on the game. Some games also have hard caps or bad optimization. So as I said, you pay primarily for higher settings at the same FPS.
I upgraded from a GTX 1080 to an RTX 3080. It's been an amazing card and plays all my games at 1080p; it's a bit of an overkill GPU for that resolution, but for VR games it's faster than the GTX 1080 was, obviously.
I jumped to a 4070 from a 1070 due to getting a 4k monitor. Haven’t tried VR yet but I’m sure it will chew through Vr because my 1070 could.
This makes me genuinely happy to see since I literally just upgraded from the 1080 to the 3080 back in March, just found it at a stupid good price 😊
@@DeathMonkeys Got a 4070 Ti and it runs VR excellently; most games can reach the 120Hz of the headset's displays with no issues. Been playing Half-Life: Alyx for the first time.
People kept telling me that it was overkill for 1080p but with how shit optimization is these days I have felt no need to get a better monitor
@@OleNesie once you go 1440p you can never go back as it's just so much sharper and more detailed so stick with your 1080p for as long as possible 😂
You should try undervolting the card it makes a drastic change in power draw
superb content as always! never stop making videos dude
I picked up an RTX 3090 Ti FE back in August of 2023 when, in France, the RTX 4070 Ti was selling for about the same 800€. At the time I felt like the 24GB of VRAM buffer would last a lot longer than the 10-12GB of the 3080/Ti and 4070/Ti, and it seems as though I was right. My card is currently watercooled and undervolted: runs at ~320W under load (managed to shave off ~100-120W of power) and I have even overclocked it with the undervolt (2070Mhz @940mV when it was running at 2025Mhz @1.070V at stock).
As the RTX 3080 Ti is not very far from the 3090/Ti in terms of performance, I think it fair to say that, after watching the video, these cards still have a lot more service years left than NVIDIA would like to admit. I feel like the RTX 3000 series will be the second best architecture and will likely age the same as Pascal. We'll see!
6900 XT footage for Helldivers 2, oops!
I hope you have a really well ventilated room for that 3080 Ti, up to 400W power draw is crazy!
Oops, my bad 😂 that was supposed to be for the comparison video…
Remember when Vega was slammed for using 40 more watts than the 1080? Now... silence. 😂
I've got the EVGA RTX 3080 and it plays a treat at 4K 60fps in most titles. I don't need to upgrade to a 40 series card; I'm happy with what I have now.
I bought my EVGA FTW3 3080 Ti for $600 last year. This card is really overkill for the games I play (and I don't play often). The next triple-A game I'm even interested in is Wolverine, which probably won't hit the streets before 2026. By then, I will need something with at least 20GB of VRAM. So I figure that I will hold on to the 3080 Ti for a couple more years.
I'm sure it would be console exclusive for at least 2 years
I also have an EVGA FTW3 3080Ti, its a power hungry monster, had to switch my 850w psu for a 1000w to avoid shutdowns :-D.
Wondering if you also have issues with high temperatures; my card reaches nearly 93°C on the hotspot and 88°C on the average chip temp without any overclock.
Had to undervolt to keep this monster from melting itself.
Bought a EVGA 3090 Ti FTW3 for $700 late December 2023 as well.
I’m good for a very long time with this total overkill GPU.
Great value long as you get a good one on the user market. Never buy one that’s been opened up before if it still has a warranty.
@@fmertin87 Something is very wrong with your setup to get those temps.
Either not enough case airflow, GPU fans not set correctly or the GPU was modified with a XOC VBios.
65°C should always be the target goal for temps, and there is no need to undervolt unless there are problems elsewhere.
@@fmertin87 you should undervolt your GPU, or maybe it's time to change the thermal paste
I was legitimately looking to get one of these for £500; in the end I found a 4070 Super going for £520 on eBay. Got the 4000 series card as I hear power consumption has been much improved. Shame about having to use the absolute dogwater 12-pin connector tho.
Just a note, Helldivers 2 does NOT support DLSS or FSR, it actually just uses bog standard TAAU upscaling.
I love my RX 6700 XT. That was the best purchase I ever made. 1440p high settings in almost every game I’ve ever tried to play. It truly is one of the best budget cards ever made. That being said, if NVIDIA had good pricing for once, I would buy a 4080 super.
I have a Suprim 3070 Ti and on full load it pulls around 300 watts. I undervolted it and now it pulls about 220ish with higher clocks. So if you have a 3080 Ti, or you want to buy one, you don't have to worry, because you can undervolt the 3080 Ti to run at least 100 watts lower at the same speeds. Say you drop the clocks by around 100MHz as well, I think you'll be looking at 260-270ish, which is not bad.
Lol, same exact story with my Zotac Trinity 3070 Ti.
Yup I undervolted my 3080 to less than 200w with a core clock boost
Yeah, my 3080 Ti without an undervolt pulls 350-400W, vs my undervolt which pulls 250-280W ish at 1800 core clock, and the performance difference is like 140FPS without vs 134FPS with the undervolt.
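A quick sketch of why the small fps loss can be worth it: performance per watt. This is just arithmetic on the figures quoted in the comment above (taking the worst-case 400W stock draw), not a measurement:

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames delivered per watt of board power."""
    return fps / watts

# Figures from the comment above: worst-case stock vs. undervolted
stock = perf_per_watt(140, 400)        # 0.35 fps/W
undervolted = perf_per_watt(134, 280)  # ~0.479 fps/W

gain = undervolted / stock - 1
print(f"stock: {stock:.3f} fps/W, undervolt: {undervolted:.3f} fps/W, gain: {gain:.0%}")
# → stock: 0.350 fps/W, undervolt: 0.479 fps/W, gain: 37%
```

So a ~4% fps loss buys roughly a third more efficiency at these numbers, which is why undervolting keeps coming up in this thread.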
16:47 Senua wasn't sure what to click on next....
I bought an Asus ROG Strix 3080 10GB for 350 USD a couple months ago. It should hold up for 1440p for a while, won't it? I'm loving it right now
Iceberg, I left a comment on a video a while ago about how you needed a de-esser on your audio recording. And today watching this video it sounds so much better. Way less sharpness on the ears. Thank you!!!
Starship Troopers is 100% VHS vibe, it just adds to the experience!
I was lucky enough to get one of these on launch day by camping out at BestBuy. Killer card, still handles anything I throw at it even at 4K with DLSS. I can't say I would ever use frame gen because of the impact on latency so I don't really feel compelled to upgrade at all, although AV1 encoding would be nice. I will be getting a 5080 or 5090, but if I weren't in the position to do so I would be totally content rocking my 3080 Ti for a few more years.
I have one of these. The VRAM makes it work really well on everything. Mine has 16GB though.
Huh? How?
@@juanpabloabriola7365 OP probably has the goated rx 6800 xt (16gb) instead
I'm currently running a 2080 Ti that I've been using for 5 years. There's a lot of games I have to run at lower settings now but it still runs everything at 1440p and when I want to play in 4k on my tv I can use DLSS.
Acquired a 3080ti FE earlier this year through some shenanigans. Been so awesome
Yeah I had some intrusive thoughts about upgrading my 3080ti but right now I think there's not much sense due to the gains vs price difference yet.😊
Iceberg Tech: I've had the "pleasure" of deep-dive testing a Manli RTX 3080 Ti 12GB and a Palit RTX 3080 Ti 12GB recently. The first version's continuous power consumption in modern games was 400 watts, with peaks up to 441 watts, which proved simply too much for my trusty Corsair RM850X via the only 2 separate PCIe 8-pin cables that it needs. My RM850X can deliver a maximum continuous 354 watts (2 x 8-pin + 70 watts from the PCIe x16 slot). It can deliver 12 amps per 8-pin cable (=142 watts), whereas the 3080 Ti needs 17 amps per 8-pin connector (=198 watts).
Whereas the Palit version was somewhat more conservative with the PWR consumption and I was able to run it for a few days to do all of the deep dive tests....and compared to my AMD RX 6900 XT 16GB the 3080 Ti is noticeably slower and eats around 100 Watts more of continuous PWR.
All of my tests were done on my Ryzen 7 5800X3D, which is currently the second-fastest gaming CPU on the market.
I was disappointed in the RTX 3080 Ti 12GB because in reality it's a direct competitor to the RX 6800 XT 16GB at best, while eating around 140 watts more power. The 3080 Ti performs too weakly for the power it needs to stay awake.
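The PSU budget arithmetic in that comment can be written out as a quick sanity check. All the wattage and amperage figures below are the commenter's own (the RM850X per-cable rating and the Manli card's per-connector demand), so treat this as a back-of-the-envelope sketch rather than spec data:

```python
# Supply side, per the comment: ~142 W (12 A) per 8-pin cable,
# plus ~70 W from the PCIe x16 slot (75 W is the slot's spec ceiling).
per_8pin_supply_w = 142
slot_w = 70
cables = 2

psu_budget = cables * per_8pin_supply_w + slot_w  # continuous budget

# Demand side, per the comment: the Manli 3080 Ti wants ~198 W (17 A) per 8-pin.
per_8pin_demand_w = 198
card_demand = cables * per_8pin_demand_w + slot_w  # worst-case demand

print(f"PSU setup: {psu_budget} W continuous, card demand: up to {card_demand} W")
# → PSU setup: 354 W continuous, card demand: up to 466 W
```

The 441 W peaks quoted above sit well past the 354 W continuous budget, which matches the shutdown behavior described.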
I game at 4k on a 3080 ti. VRAM is seldom an issue, and if you know how to undervolt, you can get the card to pull 330 watts maximum
have a 3090. this will be my 1080ti card. I am going to use this for 5 years or more easily.
In regards to frame gen, in the meantime there is also Lossless Scaling, which is a pretty handy program on Steam for any user, frankly. Even more compatible than FSR.
I own a 3080Ti but a 4K test would be nice. Since that's what I'm currently pushing with mine.
Same for me. 4k 120hz vrr, dlss q and adjust settings for around 90+fps and I'm happy with all games so far
@@tog2842 Only games that push me under 60 are Cyberpunk and Hellblade 2
Thank you for posting this video 🎥👍🥤🍿. You have a new subscriber 🎉. Keep up the great work 🎉.
I can attest to the effectiveness of FSR FG mods on the RTX 3080 Ti. Previously I just couldn't run Alan Wake 2 or Hellblade 2 with path tracing, but with FSR 3 FG mods I was able to play at 1440p output resolution upscaled by the TV to 4K (with DLSS Balanced, obviously) at 120 fps. It looked sharp (not 4K-like, but definitely like good 1440p on a 4K panel) and felt smooth and consistent with path tracing. It needed some settings optimization and all the latest patches, but it worked very well. Honestly I didn't expect to ever play with path tracing like that on this card, but lo and behold, it runs, and it runs well!
7:18 I see what you did there😄
I keep calling that game different things, nobody ever seems to notice!
Still very happy with my 3080 Ti FE. I have two undervolt presets that i switch between, #1 is performance parity with stock but at 85% (300w) power usage, and #2 is 5-8% performance hit but 75% (~250w) power usage. #2 is my go-to.
Based on GPU prices in 2023-2024 and rumored future prices, i don't anticipate buying a new GPU any time soon.
Thanks for the 3080 ti review!
Minor detail: in the Helldivers 2 section the GPU name states RX 6900 XT. Maybe you just forgot to change the label, or it's from other footage. Great video though.
I was looking for this comment. You can tell it's an AMD card by watching the clock speed; the 3080 Ti doesn't reach those 2.3/2.4GHz, so it most definitely was an AMD GPU there
Hello there :).
Just wanted to tell you that there was an error in the Helldivers 2 bit. In the top left corner MSI Afterburner states RX 6900 XT, not 3080 Ti. Great video tho. :)
already 3 years ago?? wow
3090 ftw3 here, love this card. took it apart and painted it white to match my build as well! with a nice undervolt it doesn't burn the cables out of the wall either.
Helldivers 2 your stats are showing the 6900xt. Not sure if its a glitch, or if you accidentally grabbed the wrong footage, but just wanted to point that out.
Yeah, wrong footage. Spoiler for the comparison video, I guess 🤦♂️
So glad we have Hellblade 2 for testing now 😊🙌
would love to see a similar vid but with the 2080ti
You’re in luck!
ua-cam.com/video/J3bclnCRpU0/v-deo.html
@@IcebergTech yooo what i missed the vid????
RTX 4080 + 5800X3D here, for a while I hope... good vid mate
Still rocking a base 3080 paired with an i9 11900KF and 64GB RAM on a 49" screen. I run a lot of games in full screen at high resolution. It's more than enough for almost everything on high settings with DLSS on. So I think this GPU is still good to go for a few more years
Yea, the Founders Edition only has 2x 8-pin PCIe power, maxing out around 350W. AIB models with 3x 8-pin can go much higher.
I have the EVGA RTX 3080 12GB XC3 Hybrid, and it doesn't let me down even after moving to a 32:9 (5120 x 1440) monitor. It still flows thru, even more now that I moved from an R9 3900X to an R7 7800X3D.
Awesome video my man! That's a lot of power usage for that card. I'm at 300+ watts of power usage with my 7900 XT, but I'm running at 4K ultra for everything.
Nice & Timely Video, I bought a EVGA 3070Ti FTW3 Ultra in January 2022 from EVGA for $830 US (Ouch but good price THEN.) Still is fine for my 5800X3D PC since I play mostly RTS Games at 1440P. I have a Radeon RX7900XT 20GB-OC ($700 US) & 5900X PC for production & 4K Fun. Both PC's use 64GB-3600CL16 RAM & Dual 2TB Gen.4 NVME SSDs.
I have the FTW3 in my system with water cooling. It definitely sucks down the watts, but it's been a solid card taking everything I throw at it.
1:31 Accusations? They WERE undoubtedly taking advantage of the situation and their customers. More than just egregious. How anyone defended this I don't understand.
It's the same people who defend nvidia's current 8/12gb vram offerings.
I have this card since 2022, I'll keep it at least until RTX 50, and maybe even RTX 60.
Why would a generation old flagship retire?
3080 Ti is NOT flagship lol wtf are you on about
Another collab with randomgaminginhd would be lovely
I use frame gen with Lossless Scaling on my 3090 Ti and it works very well in every game, no DLSS 3 support needed. So I don't need a 4000 nor a 5000 series GPU. Maybe the 6000 series will be the time to consider upgrading
I have the same model 3080ti, shunt modded and on liquid metal. I've pushed 500 watts through it regularly and sometimes allowed 600 for benchmarks, and EVGA's card has taken it like a champ. Obviously not recommended for normal users (or almost anyone else, for that matter), but it's a testament to the robustness of EVGA's design. Keep it cool, and the card is a beast.
People keep complaining about GPUs not being fast enough or the VRAM not being enough, but how about these game developers relying on DLSS and other upscaling just for their games to run? They are releasing unoptimized trash to get as much money as quickly as possible, so it's not exactly GPUs that are the issue all of the time.
It is good to see you reviewing older higher-end cards like the 3080 Ti, but also old monsters like the 1080 Ti. It is good to see a wide variety in the content. Also, this channel is one of the main reasons why I have the 1080 Ti, and I'm happy with it
honestly the real MVP here is the 7500F - such a goated CPU for the $
i just upgraded from a 1660 super to a regular 3080 and at 1080p it is just awesome
Thank you for testing my GPU❤. I also have a 3080 Ti FTW3!
Only Legends would know that this video was called "Reject Modernity"
This beast with its power consumption (I had the XC3 12GB non-Ti model) made my room fans circulate and blow hot air on me when I accidentally left it on one night. Absolute monster but I couldn’t keep it so I had to side-grade to a 4070. Wish EVGA still made them.. 😢
Got my EVGA FTW3 second-hand from a Micro Center; it didn't even come with a box, just a bag and the GPU. It performs fantastically well and I don't think I'll ever get rid of it now that EVGA closed its GPU department doors.
If EVGA wanted to give NVIDIA an even bigger “F*ck you”, they should have become one of AMD’s board partners lmaooo 😂😂😂
I have the 3080 FE 10GB and I'm still playing everything I download at 1440p high settings. Sure, I may have to use DLSS Quality and "optimized settings" in some games, but I get 60+ fps 99% of the time. Amazing value, as I sold my 5700 XT to a miner in 2021 for $1100 USD and was able to upgrade to the 3080 essentially for free with the proceeds.
3080 10GB still a very capable card for sure.
I managed to wait out the 30 series price gouge and got the 4070 Ti early on. Although still too expensive, I'll at least save on energy costs compared to a 3090 lol
Isn't this right though? After a couple of years a 4K card turns into a 1440p card. Now the question is how long it can stay at 1440p.
How often should gamers upgrade in your opinion?
Hell yeah my GPU!
Never mind, I have the laptop not the desktop card. :)
As a 4070 owner, that card was pretty underrated. I do wish it had 20GB of VRAM, but still not bad
Your NVMe shouldn't be the reason for your 1% lows in Ratchet and Clank; the requirement is about the top speed of the PS5's SSD. Digital Foundry have used much lower speeds and seen no performance issues. Might be DirectStorage being broken again, a Nixxes special.
Just to ask, are these games running on a default, fresh, unoptimized windows installation?
The VRAM boogie man will definitely get this card very soon, just not enough
The 3080 10GB, released over 3.5 years ago, is still fine for 4K ultra and below, especially with DLSS. The next console generation is at least another 2-3 years away; until then these cards are fine, as the yardstick is not changing.
Your production quality never fails to impress me... the first time I found your channel I thought you had at least 2 million subs... hope someday you reach that!!
Maybe it's the YouTube compression, but I can't see any difference between Alan Wake II with and without RT. Other than the murdered frame rate, of course.
I have a reference 3080ti paired with a ryzen 9 5900x and 4x8gb 3600mhz ram. In Starfield I was getting up to 90fps in new Atlantis area that you tested. I am more than happy to upload a small test clip to my yt channel that you can use for a data collecting or a reference point if you do decide to look into why Starfield seemed to be capped.
Honestly, I might need to get that processor. For now I have an R5 5500 OC'd to 4.54GHz with a 3080 Ti FE, and it's a good pair, but I feel like I'm missing out on the 1% and 0.1% lows in fps. I'm on a B450 MSI Tomahawk Max with 16GB (2x8GB) OC'd to 4000MHz
I bought a Zotac 3080ti during the shortage and spent too much, but it's been working great for everything that I play. But if I waited a few months I probably could have gotten a 4000 series or spent less than I did
I'm still on my EVGA 2080 Ti FTW3 with an EK waterblock; got it a month before the 30 series launch from a panicked seller at 500. Still holding out for 4070 Ti Super perf at 400 used
I have a Strix 3080Ti and for what I do (vid editing) it's fine. I also have a 4070Ti Super for a future build. The reason I got it is it uses less electricity for better rendering performance. And being that it's nothing for that machine to run 12-14 hours a day sometimes the efficiency alone is a welcome thing.
Ahh, you do realize you should be undervolting the 3080 Ti, not leaving it running at stock like in this video? Around the same performance at 80-100 watts less, cooler and quieter.
@@Battleneter It's very quiet in its stock form, not loud like the MSI cards. The 3080's TGP is 350 watts vs. 284 for the Ti Super. Even if I do bring it down to that voltage level, given the Ti Super has 16GB of memory, a 2640 clock speed, and a 256-bit memory bus, it will probably edge out the 3080 Ti.
But anyway like stated the 4070 is for a new build (that actually gonna start next week) and I'm leaving the Ryzen 9 / 3080Ti / 64gb DDR4 build machine intact for a backup machine just in case. 😎👍🏽
@@nicktayloriv310 Sure, an undervolted 3080 Ti is not going to match the efficiency of a 4070 Ti Super, but it certainly tames it. Yep, the 4070 Ti Super is about 10% faster in raster; it would not normally be worth upgrading to, but a new build is a different conversation.
Only downside is the 3080 Ti doesn't have 16GB of VRAM; if that existed it would be even more kick-ass!
I still have my 3080 Ti XC3 Ultra. It's not FTW but I modded it with an EKWB water block; it runs Forza Horizon 5 at a smooth 144fps and Resident Evil 4 Remake at 90~120fps. I won't upgrade at least until the 50 series comes out, or even the 60 series!
I had an RX 5600 XT 6GB, and one day I noticed that I had a bunch of VRAM chips, so I soldered them on, making it 16GB with a 256-bit bus instead of 192-bit. (You can break your card if you don't know how to solder, so I recommend practicing first on dead GPUs.)
I also did 24GB at 192-bit; I didn't have enough chips to populate the other 2 channels, or you could have 32GB at 256-bit. I guess you can also do this for the 5700 and 5700 XT, up to 32GB.
I'd love to see the difference between the base 6GB model and the 16GB one. It probably breathes a bit more comfortably, but not much raw horsepower gained, if judging by what the 7600 and 7600 XT are like.
@@inkredebilchina9699 Well, actually it improves like 30% because the bandwidth is increased. A 7600 is not all that much faster either, but you can turn things down while keeping texture and shadow quality at ultra; those 2 alone are the most important anyway. The others you can lower, and have stutter-free gameplay.
@@fdgdfgdfgdfg3811 30%? that's actually a really awesome performance uplift and waaaaaay more than the xt vs non-xt models of 7600 which are like 10 percentish for double the vram.
@@inkredebilchina9699 Because the XT version is just a 16GB version of the non-XT, and the memory bandwidth is the same
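The ~30% figure in the thread above lines up with the theoretical bandwidth ratio of the wider bus. A minimal sketch of that arithmetic (the 14 Gbps per-pin GDDR6 data rate is an assumption; the 256-vs-192 ratio is the same at any rate):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float = 14.0) -> float:
    """Theoretical bandwidth = bus width (bits) x per-pin data rate, in GB/s."""
    return bus_width_bits * data_rate_gbps / 8  # divide by 8: bits -> bytes

narrow = bandwidth_gb_s(192)  # stock 192-bit bus
wide = bandwidth_gb_s(256)    # fully populated 256-bit bus

print(f"{narrow:.0f} GB/s -> {wide:.0f} GB/s (+{wide / narrow - 1:.0%})")
# → 336 GB/s -> 448 GB/s (+33%)
```

This is why the mod gains more than just capacity: 256/192 = 4/3, so a bandwidth-bound game can see up to a third more throughput, roughly matching the ~30% claim.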
We all need to stay warm in winter.
Also speaking of I once had a EVGA GTX 780TI and that only had 3gb vram, feelsbadman.
Just got a 3090 on Monday. Bought it "used" from eBay, and it turned out to be brand new- peel ply wasn't even removed. I suspect it never got installed because it's a Hydro Copper, and sometimes people's ambitions exceed their motivation. I chose it over anything relatively affordable from the 40 series for the 24gb vram, and because 40 series is notoriously unreliable.
A few weeks ago I debated between a 3080 12GB and a 4070 Super. Picked up a one-owner, very clean 3080 12GB for a 3070 FE + $140 cash. After a few games I immediately sold it because the power draw and heat generated were INSANE. You could probably toast some bread on my side glass panel. The space heater joke was very apparent. Very happy with the efficiency and performance of the 4070 Super for just $100 more
For starfield, check if the game is limited by a single CPU core. The game seems to heavily depend on best thread performance.
Picked up a 3080 Ti for $400 on eBay the other day. Why pay 50% more for the 4070 Super just for access to FG?
The 10 series was great, the 20 was a skip-it. The 30 series was great, the 40 is a skip-it. The 50 series will be great. The 4 year upgrade cycle is still the goat 🔥
Going from an i7-6700K + 1080, to an i7-10700K + 3080, and now soon going to a Core Ultra 7 265K + 5080. Things are looking great
the 20 series got better with age imo, especially the 2070 and 2070 Super on the used market. A 2070 will perform identically to a 3060 outside of VRAM limitations and can be found for like $160 used
This card was on my radar for a long time, but I bought a used 3090 from MSI for 600€. It has slightly better performance, 24GB of VRAM and, believe it or not, lower power consumption than the FTW3 3080 Ti. I've got it paired with a 12900K, and I'm fine with the overall performance. Nice video buddy, see you again.
Gotta love EVGA, that's why I have a shelf dedicated to them in my collection haha. Very happy with my 3090 Ti FTW3 and unsure which company I'll even go with when upgrade time comes
I've been happy with my AMD 7800XT 16GB GPU in my latest 2024 build. My last build from 2019 was still using its original EVGA GTX 1660Ti GPU when I retired it recently. 😆
Great video as always! Looking forward to the next one comparing to the 6900XT.
I went from a 2080 to a 4070 Ti. About the same power draw, big uplift in performance. And the cooler on the 4070 Ti is so overbuilt, I don't think I've ever seen the thing hit 70°C lol.
Rocking a 3090 FTW3+ on a Ryzen 5 7600. It is incredible for the price I paid, around 450 bucks in my country (Argentina). I'm mentioning this because it is practically exactly the same as the 3080 Ti!
By the way, I fucking love your videos!
I feel bad for people who bought, for example, an RTX 3090 Ti without getting DLSS frame gen from NVIDIA. Yes, the card has no optical flow accelerators, but in theory NVIDIA could use the tensor cores as a fallback, at the cost of some performance?
You try to act and sound smart, mate, but I'm sorry, it seems you knew absolutely nothing. The 30 series and 20 series do have an optical flow accelerator; it's just that it was revamped in Ada Lovelace. And DLSS Frame Gen has absolutely zero correlation to tensor cores, unlike DLSS upscaling, which does use those tensor cores for FP16 inferencing
I still think that for newer triple-A games the RTX 3080 Ti is a 1440p high (not the highest) settings card