It's good to see some solid testing and analysis for older graphics cards which demonstrate that the oldies are still goodies... Rock on Elvis
"Good" is a strong word.. they're pretty low fps, but yeah
Great video! I didn't think the GTX 980 Ti was such a strong card! Through an HP workstation I got hold of a Quadro M6000, and it turns out it's the workstation version of the Titan X, which is the same gen but a little faster than the 980 Ti. I think it's a keeper, especially for older games / backlog games.
Not sure about the quadros but Maxwell does have winXP drivers for some cool retro builds ;)
@@TheGoodOldGamer Yea, the M6000 works in XP 🙂 But it makes more sense IMO for a Vista/7 machine.
Wow, PNY Quadro M6000 12GB cards are over 4 grand on CCL. They must still be in demand and useful to charge that price.
@@Loppy2u That's crazy!
@@philscomputerlab you should consider selling that bad boy in that case. I'll take 4 grand over any GPU every day, all day.
Vega 56 does support VRR over HDMI. I also have a Dual Vega 64 rig running VRR over HDMI right next to this computer I'm typing on here. I Googled to make sure it wasn't a quirk in my setup and it isn't. Vega 56 does do Freesync and VRR over HDMI.
So many 10 series gamers that refused to get a 20 series, couldn't find a 30 series and are now priced out of a 40 series.....oof.
Seeing the 980Ti against the 1070 makes me want to see the 3070 vs 2080ti in 2023
The 2080 Ti aged better; nothing can replace VRAM except more of it.
I can tell you now, the 2080 Ti was always the better card
Biggest thing is the vram. That extra vram is gonna help with higher settings and resolutions a lot
Frame Chasers did a video with the 3070 Ti vs the 2080 Ti and it was like a 4% win for the 3070 Ti, and that's putting VRAM aside. So the 2080 Ti should beat a 3070 by 3-8% ish
The 980 Ti does seem to age much better than Kepler (GTX 780 Ti).
I have a Vega 64 and feel absolutely no urge to upgrade anytime soon.
I went from a 1070 to a 7900xt. 1700x to 5800x3d. started in 2017.
Nice, how you liking the 7900XT?
@@adi6293 I run 3DMark a lot and it doesn't score the same all the time. Very weird.
AM4 turned out to be the gift that kept on giving. Honestly, I feel sorry for the people who went with Intel's 7th through 10th Gen processors.
Been rocking my Vega 64 for years now and am still surprised how capable it's been with a few graphics tweaks. That being said I think it's finally time to find it's replacement 😢 waiting on that rx 7800 to drop
You and me both, brother! Was looking to upgrade to a 6950 XT but I'm waiting as well.
I bought a Vega 56 in early 2019 and am grateful to AMD for continuing to support this card, bringing new features that they first planned for their 5000/6000 series cards and even now providing driver support for the Vega 56... but over these 4 years I was a little disappointed with AMD, especially with how they handled their 6000 series and all the driver issues. I was also hoping the 7900 XT would start at $800 for my upgrade, which AMD eventually lowered it to, but they did it a little too late and I jumped ship to an Nvidia 4070 Ti early this year. I am not planning a 4K upgrade and also wanted to experience ray tracing first hand. I hope FSR 3 becomes successful and works well on previous generations. I would also like AMD to release their mid-tier GPUs before Nvidia, as that is the space where AMD actually excels.
I think I'ma get a 6700 XT, cheaper and better. Only doing so since my Vega 64 died in the last few days, but it was amazing for a long time.
thanks for benchmarking that many games. i found this video interesting. keep up the good work
I have a 1070 Ti. I'd love to upgrade but I refuse at these prices.
It's so funny watching Chris on the tech podcast always going on about tuning, and then he leaves a ton of Vega 56 performance on the table.
Is the Vega 56 overclocked and undervolted? I am just asking, because you really need to tweak it to get the most out of it. Mine was running at GTX 1080 stock performance levels with a Vega 64 BIOS.
I have a Vega 64. Guess I can stretch it another year.
I've just replaced a Vega 56 with a better quality Vega 64 model for a great price (and put the other Vega in another PC). I also count on it lasting me at least another year (for hopefully more competition in the GPU market then).
I'm surprised to see the 980Ti doing so well as TPU has it over 10% slower than the 1070 and Vega 56 at 1080p.
Notice that the new RTX 4070 Ti has a 192-bit mem bus...
Yup. Still running my Vega64. Rebar enabled via reg hack. Undervolt for better 1% lows.
Plays most games at 1440p with a high/medium settings mix.
Getting me through the GPUpocalypse.
What are you doing to get ReBAR enabled? Tried in the past on my 64 but gave up. Thanks
Sorry, didn't see your post till now. It's a simple registry entry. I'll look it up and post it asap.
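For anyone searching in the meantime, here's a minimal sketch of the kind of registry tweak community guides describe for enabling ReBAR on Vega. The `KMD_EnableReBarForLegacyASIC` value name and the "Vega" driver-description match are assumptions taken from those guides, not something confirmed in this thread, so verify against your own driver install (and back up the registry) before writing anything.

```python
# Hedged sketch: flip an (assumed) AMD driver registry flag to enable Resizable BAR on Vega.
# Run from an elevated Python on Windows; the GPU driver entries live under the
# display-adapter class GUID. KMD_EnableReBarForLegacyASIC is the value name community
# guides mention -- verify it for your driver version before trusting this.
import winreg

CLASS_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"
FLAG_NAME = "KMD_EnableReBarForLegacyASIC"  # assumed name, check your own driver key

def enable_rebar_flag() -> None:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as class_key:
        index = 0
        while True:
            try:
                sub = winreg.EnumKey(class_key, index)  # "0000", "0001", ...
            except OSError:
                break  # no more adapter subkeys
            index += 1
            try:
                with winreg.OpenKey(class_key, sub, 0,
                                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as adapter:
                    desc, _ = winreg.QueryValueEx(adapter, "DriverDesc")
                    if "Vega" in desc:  # only touch the Vega adapter entry
                        winreg.SetValueEx(adapter, FLAG_NAME, 0, winreg.REG_DWORD, 1)
                        print(f"Set {FLAG_NAME}=1 on {sub} ({desc}); reboot to apply")
            except OSError:
                continue  # skip non-adapter subkeys like "Properties"

if __name__ == "__main__":
    enable_rebar_flag()
```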
I still have a 1070 and Vega 64 actively running in my backup PCs. Still getting satisfactory performance in 1080p and 1440p respectively in my opinion.
I know the Radeon 780M might perform like an R9 290, but I do wish AMD would reconsider their strategy for Ryzen 8000 Strix Point.
We need this level of performance for a mega APU in the mobile/DeskMini space.
Imagine games on a 25-45W APU performing like a Vega 56 even without RSR.
With upscaling it's easy for 1080p gaming.
A 45W RTX 4070 mobile might perform like a Vega 56, or faster. I mean it's really an RTX 4060, but Nvidia is probably at that efficiency. AMD is still bumbling around quite a bit.
@@siyzerix any dGPU will always have the upper hand in efficiency tho
@@HuntaKiller91 Giant APUs are fine. The question is the price. What will RX 780M APUs cost? The 780M will likely be around a 50W GTX 1650, or a 35W GTX 1650 Ti Max-Q that's OC'd, and slower than a 35W 3050 Ti. So realistically it should be $500 and not a dime more.
@@siyzerix there's a mini PC with a 6800H/6900HX from Beelink or Minisforum selling for $499 barebones, if I'm not mistaken.
Pretty sure this APU will sell at the same price. Laptops with those APUs alone are nowhere to be found, but the previous gen 5500U/5700U is around that $500 price as well.
Hopefully this APU with the Radeon 780M gets into that price range.
A mega APU with Ryzen 8000 will be interesting next year though, and I seriously hope AMD rethinks their strategy.
Got a Vega 56 in 2017 - I've played the Diablo 4 beta at 1440p with the high preset and had around 60-70 fps on average. Pretty decent imo. But I will upgrade to the 7900 XT soon 'cause it's not a great experience to play FPS games or Cyberpunk 2077 at 1080p. I have massive problems in Battlefield 2042 and Cyberpunk 2077 on the lowest settings; the lows are really not great. Well, for 6 years it was really worth it. I hope the 7900 XT will last a couple of years for 1440p gaming. The 4070 Ti was an option for me, but since I wanna keep my GPUs for a long time I'll go with the 7900 XT because of the memory.
Playing Elder Scrolls, WoW and GW2, my EVGA 1080Ti FTW3 handles 4k (Hisense 55U8G) with no issues. Yes, eye candy settings are reduced, but as far as game play goes, no issues. I thought my 1080 was dying so bought a 6900XT (scalping still rampant) from AMD. Same black screen issues and it turns out it was my PSU which I was able to RMA. Recently I bought a PowerColor 7900XTX for $999 and while both AMD GPUs allow for max eye candy, the improvement is not $1000 worth. My suggestion is to spend $400 on a 6750 and just enjoy most of the eye candy settings. $1600+ for a 4090 is absurd and overkill for just gaming.
if you are talking about ESO I have a 5800X3D and it flat fixes ALL the janky ESO bs
From what I have heard, AMD used the same HBM2 memory on the Vega 56 and 64 but underclocked it on the Vega 56 to have a slower and cheaper card to sell. So basically putting the same mV and MHz on the RAM of a Vega 56 should give it a boost, and it is not really overclocking, more unlocking the potential within the HBM2 specs.
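For a rough sense of what that headroom is worth, here's a back-of-the-envelope bandwidth calculation; the 800 MHz / 945 MHz stock HBM2 clocks below are the commonly listed Vega 56 / Vega 64 figures and are assumed here rather than taken from the video.

```python
# HBM2 bandwidth = bus width (bits) * 2 (double data rate) * memory clock / 8 (bits per byte).
BUS_WIDTH_BITS = 2048  # both Vega 56 and Vega 64 use two HBM2 stacks on a 2048-bit bus

def bandwidth_gb_s(mem_clock_mhz: float) -> float:
    """Theoretical bandwidth in GB/s for a given HBM2 memory clock."""
    return BUS_WIDTH_BITS * 2 * mem_clock_mhz * 1e6 / 8 / 1e9

for name, clock_mhz in [("Vega 56 stock (assumed 800 MHz)", 800),
                        ("Vega 64 stock (assumed 945 MHz)", 945)]:
    print(f"{name}: {bandwidth_gb_s(clock_mhz):.0f} GB/s")
# ~410 GB/s vs ~484 GB/s, so running Vega 64 memory clocks is roughly an 18% bandwidth bump.
```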
Don't know about Vega, but AMD is still updating RX 500 class GPUs, which are as old as the Vega cards. Besides, until very recently all AMD APUs had Vega GPUs as well.
@@gamingtemplar9893 Weird, I played Genshin on an RX 580 like a year ago and it worked fine; heck, it works fine on 4th gen Intel HD, which is like, what, 9 years old? The issue might be on your side.
R9 GPUs have problems with the likes of Halo Infinite and any newer game, but the old RX 580 I gave to my brother still runs recent games fine at 1080p30.
You can enable SAM with a registry change on the Vega 56 and it'll help it out in some titles.
thank you for this video
My buddy has the 980 Ti; I own the other 2 cards tested. Most excellent. I thought I had fried my Vega 56. Dragged it out of the drawer the other day and installed it behind a 3600. Still works, and boy howdy, it isn't that much slower than my 5700 XT! Was running Metro 2033 benchmarks. All I can say is I was impressed.
Freesync works over HDMI. I played on an RX 480 on a 4K monitor
Just upgraded from Vega 56 to 7900 XTX, which was necessary because I'm going to 4K. Other than that, I might've stayed with Vega another year or two.
I will be honest, I still run an Inno3D liquid-cooled 980 Ti. I was going to upgrade this year but, due to unreasonable GPU prices, decided to buy a Steam Deck instead.
Now I use gtx980ti to stream at 1280x800 to steam deck at ultra settings and the experience is just amazing. I can use the deck anywhere around the house getting 60 FPS all the time and I love it.
Rarely sit at the pc now and I love the liberty that the deck offers. I am still going to buy a new GPU soon, just not this year as this year is for the deck! Next year I will migrate from team green to team red, due to price Vs quality and compatibility.
My current GPU is the EVGA 980 Ti 😅 I heard GPUs are going down in price now that the COVID chaos and Bitcoin mining are over. My monitor is 1080p so I have no problem running any games; newer games can't go on ultra settings, but it's still a good GPU tbh.
@@pokiblue5870 Same here yeah. I play on full hd TV too 😅 Can easily see all the details without squinting my eyes
Memory Bus Master! What about the R9 390X with its 512-bit bus? And 8GB of GDDR5.
"Memory bus size matters." Wouldn't be able to tell that from how AMD and Nvidia are axing bus width across most of the board in their current generation vs previous one for the same branding tier while jacking up prices.
I still have a 980 Ti, and with upscaling like FSR this card got a second life.
100+ FPS in RDR2 on console settings.
And all games that have the FSR option are very playable.
Looks like the 980 Ti vs 1070 situation has reversed compared to when the 1070 released. At launch, the 1070 review from TechPowerUp shows the 980 Ti at 88% of the 1070's performance; in your benchmarks the 980 Ti is almost 10% faster (rough math on that swing below).
Also, it would be interesting to see how the 1660 Super/Ti would compare, as they were within a couple of percent of the 1070 when they released. Since Maxwell/Pascal haven't aged that well in some more recent games, that comparison would be worth seeing, and the 1660 Super/Ti are pretty cheap on the used market right now.
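Putting quick numbers on that swing, using only the two figures quoted in the comment above (88% of the 1070 at launch, roughly 10% ahead in this video):

```python
# Relative shift in 980 Ti vs 1070 standing between the launch review and this video.
launch_ratio = 0.88  # 980 Ti at ~88% of the 1070 in the launch-day TechPowerUp review
today_ratio = 1.10   # 980 Ti roughly 10% ahead of the 1070 in this video's benchmarks

swing = today_ratio / launch_ratio - 1
print(f"Relative shift in the 980 Ti's favour: {swing:.0%}")  # about 25%
```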
I'm feeling the need to upgrade from my 1070. I do not play many new AAA games, but it's starting to get hard to run FFXIV at 1440p. I'm looking to get something at the $300ish price point, but there is nothing there that is worth it...
Ikr? The 1070 is still kicking but the upgrade is needed; I did that and moved to a 6800 XT just yesterday.
Very nice 😊
My 1070 can run lots of games at 4K, high settings, 60 fps; I'm surprised how far I can push it.
A Plague Tale: Requiem is not a UE4 game; it uses Asobo Studio's in-house Zouna engine. Many think it is on UE4 because the game looks great and quite similar to games on that engine.
Ngreedia hasn't been optimizing Pascal for newer games for a while.
well pascal kinda sucks in dx 12 and vulkan.
Bought my Vega 56 about 4 years ago. I was torn between Nvidia and this Vega 56.. I kinda completely regret my choice. Had a GTX 970 before, but it was broken from the beginning.. the Vega ran better or equal, but the temps... gosh, the 970's temps were SO much better. This Asus Strix version of mine has 3 damn fans and can't manage to cool itself? Changed the case, fans, etc., but still. It's not overheating at all, but I hate having to put the fans around 85% just to keep the Vega under 80°C.. I don't feel comfy with these temps even though everyone on Google says that's completely safe. I wish I had just returned it back then, really.
It's not the Vega, it's Asus's fault; that is just a random cooler slapped onto that GPU. Look online, the worst temps are on those Strix versions; even blower versions are better, especially for the VRM section.
I've had about 8 different versions of the 980 Ti - it will always remain my favorite card - if you could get it clocked to 1500 MHz and 8 GHz on the VRAM it basically matched a stock 1070 Ti, which was only 10% behind a 1080
The GTX 1070 Ti is only about 5% behind a GTX 1080 (TechPowerUp reviews).
Hmmm, interesting. I've owned a 970, a 980 Founders, a 980 MSI, and now a monster Gigabyte 980 Ti Extreme overclock.
My card boosts up to 1430 MHz. Reaching 1500 is very doable imo.
I don't OC now, but I'll give it a go.
The card still serves me well at 1440p with some minor tweaks. Playing the System Shock remake at the moment. I don't feel the urge to upgrade. Reaching 60 fps is enough for me.
Why are you using a current gen CPU to test out GPUs from 8 years ago?
I'm betting if you slam those power & temp sliders the 1070 will pick up A LOT.
MW2 has FSR 2.1 now.
Great for esports testing; the clarity is better now.
I've got an i7 7700 and a GTX 1070. I've been leaning toward upgrading to a 13600K or 13500 CPU for work video rendering and gaming. My question: is that enough to get a significant boost in frames as well, or should I upgrade the GPU too?
The 1070 was $379 at launch (before the mining crisis); what can you buy with $379 now?
would have liked to see a more realistic combo
don’t think anyone combines a 13900K with any of these cards
Bruh, it's GPU testing, not whole-system testing, so it needs to avoid any CPU bottleneck scenario. Running them with an i7 7700K (the 2017 flagship) wouldn't reveal their true potential, or even answer the title of the video ("upgrade required?"), because the answer would be "upgrade your CPU first".
Well, I had a GTX 1080 for 4K/60 Hz and recently upgraded to a 1440p/165 Hz monitor, and honestly it wasn't enough for certain games, so I upgraded to an RTX 4070 Ti and I have no complaints. But I've always played at high settings, so the 1080 sorta lasted a bit longer at 4K/60 and was kinda meh at 1440p/120 Hz; anything past that was pushing it.
I really don't play every game at super ultra settings with RT even if the GPU can do it; I stick to custom high settings and good frames.
My 980 Ti OC'd to 1516 MHz core and 7750 MHz GDDR5 performs on par with my friend's 1070 Ti, and they are both MSI Gaming cards; only my card is OC'd and the other one is stock! I remember people saying the 1070 was faster than the Titan X Maxwell 😂 no way; an OC'd Titan X can almost keep up with a 1080, and that's a reference card for sure. But it needs a different cooler; that blower can't cool 3072 CUDA cores.
Well, I have a Vega 56 from Asus and a GTX 980 Ti from iChill, and the GTX 980 Ti is about 8-10% better.
You need to show the clock speeds of the cards; most GTX 980 Tis that I saw get this low were cards with a core clock of 1250-1290 MHz.
My card did 1450 MHz with no problems, and it does make a big difference.
I had GTX 1070 benchmarks to compare with the other cards, and it's a lot slower.
The GTX 980 Ti was as strong as a GTX 1070 Ti that my friend had.
I hope I helped here.
You did use a great CPU. Wonder if it'd get similar framerates with a 7700k or something similar.
A 6600K with a 980 Ti in Hogwarts gets about 50 fps. If you have one of these old GPUs, it's time to upgrade to a new GPU, not a new CPU. That 13900K he's using doesn't make the games play better. This video should have been done with CPUs of that era combined with an RX 6600 to see if an upgrade really is needed, since you upgrade GPUs, not CPUs. An RX 6600 is what, twice as fast as a 1060 for only $220, and will probably get every last FPS out of those old CPUs as well.
And for more perspective, a 7700K with a 2060 at 1440p plays Cyberpunk at 43 fps, which is what Chris is showing using an expensive, hot and inefficient 13900K that should never be paired with the GPUs being tested. So, no, your question about the 7700K or any other CPU would not matter. Again, you need a GPU upgrade, not a CPU upgrade. Disregard videos like this.
Can someone tell me what I should choose for WoW, a 980 Ti or a Vega 56?
Would you be worried about the reliability of the HBM now that the card is 6 years old? I watched a vid from Buildzoid talking about cards that might have been mined on, and he recommended avoiding cards with HBM.
I also watched that video, but not all Vegas were used for mining, and it's mining that taxes the HBM so much (and likely even the interposer). As games don't tax the memory that hard, I wouldn't worry at all about non-mining cards.
Most of those guys undervolt for efficiency tho.
@@yurimodin7333 oh yeah totally. Just curious about other people take on HBM and mining
Can't imagine playing on dinosaurs like those cards 😅😂🤣🤣
It could be worse. People were riding out the mining bubble with 1030s, after all..... as long as I have G-Sync/FreeSync I can make it work.
If people can afford the i9 13900K, they would get a 4090 or 4080
I remember when the 1070 came out it traded blows more with the 980 Ti; interesting to see how it ended up. I guess if you were desperate, there's always FSR 2 for all these cards.
What about the Vega 56? It got shit on by all the Ngreedia-loving tech channels on YT. I grabbed a Vega 64 for £239 on Black Friday, instead of buying the RX 5600 XT I was planning to, and it's been a legendary card. Destroys my Palit Jetstream 980 Ti, which I "saved" from a broken PC at work.
@@TheVanillatech Nothing about the Vega 56. I only mentioned the 1070 vs the 980 Ti because I had a 1070. But anyway, if you want my opinion on the Vega 56, it's better than either. It also overclocks to almost Vega 64 level. Problem was it came out even later than the 1070, and people could have already owned the 980 Ti two years prior.
@@anthonyrizzo9043 The 980 Ti was £600 at launch though! Certainly a great card, but before Vega even launched, Nvidia had abandoned it - only critical update support in future drivers for Maxwell as of the end of 2016. Bit of a letdown. Still, a solid card, a good 3-year card. Nothing in Pascal to supplant it except the 1080 Ti.
I was always impressed with Vega though; even at launch its Vulkan speed was legendary (and DX12), it had great undervolt/overclock ability, and it has stood the test of time. The Vega 56 is now beating the 980 Ti and 1070 easily, even untweaked, and the Vega 64 is a good 20% faster on top (raw).
@@TheVanillatech My main problem with AMD is they always fall over their own feet at the finish line. Every time something looks good and they're gonna stick it to Nvidia, there's always some dumb problem: either it's late to the party, or a software issue, or something's not ready yet, like FSR and FSR 2 and now FSR 3. Then they don't price accordingly; they follow Nvidia in pricing instead of knocking it out of the park.
@@anthonyrizzo9043 To be fair, Vega was late (true), part of its toolset was geared to industry, not just gaming (true), and it was expensive; for the first time in forever AMD offered similar or even worse fps/$ than the existing Nvidia lineup. BUT it didn't help that the entire tech press, having had their Nvidia cheques in the post for years, all slaughtered the series before it even got established.
Polaris beat the 1050/1060 series easily, got slated and didn't sell well. RDNA2 has slaughtered RTX in the mid range in a massacre - doesn't matter, sales show 3050s selling like hotcakes, while the 6600 XT is 60% faster and the same price.
Let that sink in .... 60% FASTER, SAME PRICE. Still the 3050 sells 8 units to every one 6600 XT.
If that's not knocking it out of the park, then what is? But they still don't sell units.
Polaris beat the 1050/1060 on launch and was cheaper, and today it beats the Pascal series by a huge margin (100% faster in the Dead Space remake - 570 vs 1050). Doesn't matter. Sales didn't reflect it.
I always buy whatever card is best for what I need on upgrade day. Done that since the late 90's. Over the years I've owned hundreds of GPUs, usually a 50/50 split between Nvidia and ATI/AMD. But these last 10 years, it's been more like a 70/30 split in AMD's favour. They've always offered a far better product in my price range (usually mid tier), with a handful of exceptions (GTX 570 AMP and GTX 970).
Textures rarely, if ever, make any difference to FPS, assuming you have enough VRAM.
I'm glad I picked a 12 GB card. The 10 GB 3080 and 8 GB 3070 Ti will age like milk (that VRAM is already starting to stink a little).
My Vega worked with FreeSync over HDMI.
I get 90 fps on a 1070 in Cyberpunk at max settings; not sure how your frames are so low. I guess maybe it's the FidelityFX.
He probably cranked the antialiasing up to 8x or some crazy shit like that. You don't need more than 2x at 1440p. Heck, you might not need it at all on a 27" 1440p screen, maybe on 32".
The only reason I have a 1070 over Vega, even though I am an AMD fan, is because at the time crypto made even a 580 a lot more expensive than a 1070.
May I recommend RTX3050 vs R9-390X?
That's about a 200W difference.....
The issue is that 60 FPS isn't enough; if you've played on a high refresh rate monitor before, 60 fps feels laggy and not smooth 😊
60 fps is the new 30 fps.
good salvage coverage. mb
The best GPU is the RX 5700 XT at $150-200. It's 10-15 FPS faster than a Vega 64 and it supports SAM and RSR, plus it is RDNA, so drivers are shared with the console division.
Is it true that 60 FPS on OLED feels smoother than 60 FPS on LCD? I saw you claim that in your OLED video, but I can't find anyone who mentions this online; nobody else says OLED feels smoother than LCD displays at the same FPS.
It really doesn't, but it may be subjective. What is true though is that OLED significantly reduces ghosting and the response times are much better.
@@Vans_WW that’s a massive shame, because once you are used to 120fps or more 60 fps just feels incredibly choppy
@@Austrium1483 Yeah, I just can't game at 60 Hz anymore unless there is no other option. It's too cringe for me now as well. Very easy to get a good 120 Hz on most things, especially if you target 1080p (which still looks incredibly good when displayed properly).
FSR is your friend on old cards.
I got a 3080 and need to upgrade.
My 6700 XT Red Devil is like a 2080 Ti, so I'm good and have zero urge to upgrade, other than "oohh look, the pretty graphics are a bit better".
Running 2K @ 60+ is getting long in the tooth with Pascal mid/high models. I'm actually taking a gamble and going with team Blue this time with the Arc A770 16GB. I do have one benchmark test done @ 2K giving me ultra settings above 70 fps. Thanks for the comparison!
The A770 is slower than a 1070 in many games … I'd know, I bought the A770 and returned it after finding this out.
@@earnistse4899 I've been following the Arc A750/A770 since release (very buggy, unstable, and GTX 780 Ti performance). As of the latest driver improvements, they seem to have fixed it and pushed this card closer to an RTX 3060 / RX 6600/6700 XT. I just purchased this card two days ago and updated to driver 4148. I guess I'll have to just gamble and hope for the best. Three-way competition is a good thing.
@@Obie327 No chance it's close to a 6700 XT. Arc has some games where it's close to a 3060 and many others where my 1070 beat it out. Even at $250 I'm not buying an A750. Not worth it yet.
@@earnistse4899 I totally understand, and I'm not selling you on it. I'm impressed by the tech and the DX12 Ultimate feature set. Having ray tracing and AV1 doesn't hurt either. I want to see where this first attempt goes as driver tweaks and bug fixes slowly get rolled out. So far I haven't felt regret buying into this new architecture. I waited 5 months before purchasing and saw the DX9 support has been much improved. I guess we'll see in time what happens. Peace
I generally buy Ngreedia 80/80 Ti class GPUs and was previously a GTX 980 Ti owner, and I was quite shocked at how well it's still going after aallll this time 🤯🤯💪💪💪 still a monster GPU. I agree the bus width combined with the 6GB buffer has kept it going soooo good.
My 980 Ti uses too much power, over 200W; gonna upgrade to a 2060 Super.
I'm feeling the need to upgrade from a 1660 Super after the D4 beta.
I feel like I need (and I will get) an upgrade from a 3080, man, such a memory-hungry game.
We would be very interested in the performance of the RTX 2080 Ti vs the RTX 3070 and 6700 XT.
compare all three to a 3050
Meanwhile still rocking my GTX 960 2G😅
vega 56 is the best card by far
980 TI is great, but the lack of VRR is a killer
it still has gsync
other than vram... my 980ti is doing fine... i'm no high fps 4k ultra settings wacko... so it will probably last me the whole decade...
The RTX 4080 and RX 7900 XTX are already out; why would anyone still use a 980 Ti or Vega GPUs?
Cos of silly prices
There are people who play nothing but Fortnite or CSGO. The market contains multitudes, and let's be real, nobody should be buying a 4080.
Same
If the GPU is from that era, the CPU will be something like a 4670K or 4790K, NOT a 13900K.
The idea was to be GPU bound, not CPU bound...
Yes. Upgrade required.
Just bought a 980ti super cheap £60
Nvidia would gimp their own cards every time the newer gen wouldn't sell well.
Nope, you don't need any upgrades whatsoever.