I loved that card, it was my first high end card. Got it in 2019 and finally sold it so I could get a Radeon with RT cores. I sold it in 22 for almost the full price, it’s amazing how their value shot back up
If I ever update my GPU I will have to donate my Sapphire Radeon R9 Fury triple x so John can be finally complete.
i think he would actually very much appreciate that
I have one of those! I upgraded to the Sapphire Toxic 6900XT Air cooled edition.
Man Fury was such a bad buy back in the day, the R9 390X was the best AMD buy back then, it also aged better.
@@rodrigofilho1996 I bought mine for $150
You can install AMD modded drivers to support the latest games. Amernime drivers, still kept up to date, and I used them with success on Radeon HD 5000s.
Hi, I want to try it on an old Radeon HD 6870. Can you tell me which one to download from that site?
How well do they work with helldivers 2?
And then your pc will 💣💥💨👋🏻
My second PC still uses a Vega 64 and it’s quite capable of 1440p60 on most games.
7:20 Here I see low-level optimisation for the PS5 version. I would say "badly performing engine" rather than "bad port". As a matter of fact the game is ALSO a bad port, but for all the loading and memory issues that we know about, not because of the raw performance. The performance boost that we can see on console is in line with what one can expect from low-level optimisation, which of course happens only for the exclusives...
5:25 You could actually play Alan Wake 2 with Vulkan on Linux with Vega cards, no mesh shader issue
basically fk DX12 and Windows
One of the rarest gpus you can buy
I throw the Vega64 FE in the ring or the Titan V
I still use my Radeon VII but not really for games. I use it as a music Visualizer because of its HBCC and hdmi out.
You could legit get a laptop with an rtx 4050 in it and have equivalent performance to a Radeon VII, but with added DLSS, Ray Tracing support, and Frame Gen? And it would perform better than a PS5? That is absolutely wild. I didn't realize the PS5 was lacking in performance compared to modern hardware like that
Part of it is the limited CPU power, I'm sure.
@vogonp4287 true, it being the old Zen 2 architecture is pretty limiting.
@@apersonontheinternet8354 A cut down iteration at that. Lacking cache, and some instructions.
Frametime noob here, how does the PS5 frametime graph look so ugly yet the gameplay is smooth, but if you have the smallest blip on a PC frametime graph the whole game skips?
Probably since they are measuring the image updates at an output with v-sync forced on, since they can not run measuring tools on the PS5 itself.
Whereas on a PC, they can run those tools on the PC itself and get actual frametimes, with v-sync disabled for testing.
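The reply above can be sketched in code: a small model (my own illustrative numbers, not DF's actual methodology) of how forced v-sync at the output quantizes frametimes to whole refresh intervals, so tiny internal variance shows up as big doubled-frame spikes on a capture-based graph:

```python
# Sketch: why a console frametime graph measured at the HDMI output
# (v-sync forced on) can look "ugly" while gameplay feels smooth.
# Assumption: a 60 Hz display, so frames scan out on a ~16.67 ms grid;
# any frame that misses a refresh shows up as a full 33.3 ms.

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh at 60 Hz

def vsync_quantize(frametimes_ms):
    """Round each internal frametime up to the next whole refresh."""
    out = []
    for ft in frametimes_ms:
        refreshes = max(1, -(-ft // REFRESH_MS))  # ceiling division
        out.append(refreshes * REFRESH_MS)
    return out

# Internally the game renders with small, barely perceptible variance...
internal = [15.9, 16.2, 17.1, 16.0, 16.9, 16.1]
# ...but captured at the output, each tiny overshoot becomes a doubled
# frame, i.e. a big spike on the graph.
print([round(t, 1) for t in vsync_quantize(internal)])
# [16.7, 16.7, 33.3, 16.7, 33.3, 16.7]
```

A PC tool reading presentation timestamps on the machine itself sees the smooth 16-17 ms series directly, which is why its graph only spikes when something genuinely hitches.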
is that john's son playing mario 64?
I've been noticing that in the background in a few videos lately. I honestly think the answer is yes.
That's a buddy of him playing Mario 64 for some review.
@@Mr_Battlefieldthat’s hilarious. This is just an excerpt from the last df weekly. It’s Try from my life in gaming.
Yes xD
That would be a giant kid!
What's the "point" of a decimal separator when the trailing number is always zero?
Fine Wine is fully mature, it tastes like vinegar now
😂🤣😭
Ahahahahahahahahahahahhahahahaha this piece of garbage trashdeon is obsolete ahahaha 😂😂😂😂
Still running in my son's PC. I love my Radeon VII. It still looks amazing too. I bought it the week it came out for $600 (not $699, but $600) at Microcenter and it came with three games. I still play WWZ (which came with it) and my son is currently playing Palworld at 1080p and getting over 80 fps with it. I am going to upgrade him to a 7900 GRE though.... I just need to figure out what CPU/MB to upgrade to for him.
R7 7800?
@@DirransRL Radeon 7900 GRE. It is a new 7900 below the XT.
@@IgoByaGo yes, I know of it. R7 as in the CPU. Would be a good pairing I imagine.
@@DirransRL I forgot I had said something about the CPU. There is a 7800X3D bundle that I was thinking about or just doing what I have, a 5800X3D, but b550 board. There is about a $200 difference in price and I can use that towards a PSU or something, I think I will do the AM5 setup.
Get an RTX 4070
I’m so glad you guys mentioned the MLB glitch. That was hilarious!
- of course I had to look it up. 😏
This is the ONLY GPU with 16GB of VRAM that supports interlaced resolutions in modern drivers, a good piece of data to keep in mind for CRT monitor enjoyers!
Good to see the Radeon VII getting some more attention. It was my first GPU in my first build, it was quite an underrated GPU for its time imo
It wasn’t underrated. It performed worse than a 1080 Ti and RTX 2080 in most games and was overpriced. It was never a good value nor a great performer.
I wouldn't necessarily say it was underrated. It was a solid card, but it was very overpriced mostly because of how expensive HBM2 memory was at the time. The Vega 56 and 64 both had the same issue. The VII and Vega 64 Limited Edition were both aesthetically very nice looking though.
@@chalpua8802 it performed above the 1080 Ti more often than not when the res was increased.
It's obsolete bro, no one pays attention to this card...
I still use my VII in my office computer. I also play games on it.
Most people won't even see all these details in motion, in the game itself. But still, the PS5, as always, gives an excellent result with its cool API... not like DX on PC and Xbox. It's scary to think what would have happened to Vulkan if Microsoft had bought it...
If the PS5 Pro really is based on RDNA 3 for its GPU and has 3,840 cores, then it will be 33 teraflops, even at the same clock as the base PS5 GPU, which is 2.2 GHz.
A PS5 with 33 teraflops.
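For what it's worth, the 33 TF figure in this comment only falls out if you count RDNA 3's dual-issue FP32 (4 ops per shader per clock); with RDNA 2 style counting (2 ops per shader per clock) the same chip would be half that. A quick sanity check of the arithmetic:

```python
# Hypothetical PS5 Pro figures from the comment above: 3,840 shaders
# at the base PS5's ~2.2 GHz. TFLOPS = shaders * FP32 ops/clock * clock.
shaders = 3840
clock_ghz = 2.2

rdna2_tf = shaders * 2 * clock_ghz / 1000  # 2 FP32 ops per shader per clock
rdna3_tf = shaders * 4 * clock_ghz / 1000  # dual-issue: 4 FP32 ops per clock

print(f"RDNA 2 style counting: {rdna2_tf:.1f} TF")  # 16.9 TF
print(f"RDNA 3 style counting: {rdna3_tf:.1f} TF")  # 33.8 TF
```

So the headline "33 teraflops" is a counting convention as much as a hardware claim.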
Turn off the frame counter and just play the game and I highly doubt you will suffer any loss in enjoyment.
1 TB/s of memory bandwidth, so delicious, you know... And 3.4 TF of double precision at $700; it's questionable how this GPU was so cheap 😂
A collector's item, but compared to RTX 20 and RDNA it's just not good.
Great, funny you guys brought this up here, the Radeon VII, because the PS5 Pro will also have 3,840 cores like the Radeon VII, according to Tom Anderson. He also claims it will be RDNA 3, which I don't believe, because as Mark Cerny said in the PS4 era, changing even the GPU clock too much would affect compatibility with the base model. So for the PS4 Pro they kept the same GCN 2.0 architecture as the base PS4: they exactly doubled the CUs and increased the GPU clock by only about 12%, from 800 to 911 MHz. By the same logic it's very much expected that the PS5 Pro will do the same: maybe a 4,608-core RDNA 2 GPU with a clock around 2.5 GHz, and we will have 23 teraflops of RDNA 2.
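The arithmetic behind this speculation is easy to check (treating the 4,608-core / 2.5 GHz figures as the commenter's hypothetical, not a confirmed spec):

```python
# PS4 Pro clock bump the comment cites: 800 MHz -> 911 MHz
bump = (911 - 800) / 800
print(f"PS4 Pro clock increase: {bump:.1%}")  # ~13.9%, a bit above the 12% quoted

# Hypothetical RDNA 2 PS5 Pro: 4,608 shaders * 2 FP32 ops/clock * 2.5 GHz
tf = 4608 * 2 * 2.5 / 1000
print(f"Hypothetical PS5 Pro: {tf:.2f} TF")  # 23.04 TF, matching the comment
```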
Test games that stress the VRAM, with its almost 1 TB/s of bandwidth
Radeon VII was HBM wasn't it?
Like what? TLOU and Ratchet and Clank? In which it'll produce 5 fps?
@@yarost12exactly, this guy is coping
@@a36538 Really? Because I thought he's just curious to find out more about how the card is performing with contemporary software. Probably because I'm not an idiot whose pathetic brain immediately switches into whatever lame ass console war or brand loyalty mode you can't seem to escape even for two seconds lol
VRAM frame buffer size is the more important limit to hit first... the effective bandwidth of GPUs with increased L2 or L3 cache goes up past the 1.5 TB/s mark (only at around an 80% hit rate on average, and it depends on the game and application)... servers are using >4 TB/s buses without needing unified high-bandwidth RAM (so to avoid latency, you want fast reads from the 1st word to the n-th word to hit max bus speed).
GTA V is an amazing VRAM scaler, from 1 GB (4K in 2 GB is possible) to 24 GB or more at full internal-rendering craziness... 4K 60 fps on a 1050 Ti OC'd to 1.95 GHz core / 1.95 GHz x4 RAM... at 2 GB of VRAM use...
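The cache/bandwidth point above can be illustrated with a naive weighted-average model (illustrative numbers only, not measurements of any real GPU): requests served from a large on-die cache see cache bandwidth, misses see DRAM bandwidth, and the blend rises with hit rate.

```python
def effective_bandwidth(hit_rate, cache_bw, dram_bw):
    """Naive model: traffic served at cache speed on a hit,
    DRAM speed on a miss, weighted by the hit rate."""
    return hit_rate * cache_bw + (1 - hit_rate) * dram_bw

# e.g. an Infinity-Cache-style GPU: ~2 TB/s on-die cache, 0.5 TB/s GDDR.
# At an 80% hit rate the effective figure lands well past 1.5 TB/s,
# which is the ballpark the comment above is gesturing at.
print(round(effective_bandwidth(0.80, 2000, 500)), "GB/s")  # 1700 GB/s
```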
In TLOU the grass on the 6700 looks better, check your settings!!!
Fine wine is a thing of the past. It's all proton now.
Remember when AMD fanboys thought HBM would single-handedly "win"? It's been 5 years and 2 console generations, and we're still waiting.
Why re-test the Radeon VII and not the RX 5700 XT, when the latter is closer in spec to the PS5 hardware than the enthusiast Vega GPU?
PS5 API? Vulkan?
PlayStation has their own proprietary graphics API. It’s not Vulkan, but it’s supposedly similarly low level.
Try Omega Drivers
TLOU WAS a bad port; they've fixed and improved it since launch. That's like me saying Hitman 3 is a bad PS5 port, which it isn't. It's an early cross-gen game (PC benefits from this) that also favors higher CPU performance, simple as that. TLOU is a first-party game and uses more of the unique tech in the PS5 which doesn't exist in the 6700. What you're seeing is the result of that, AFTER it was patched umpteen times. The same results apply to the majority of other first-party titles ported to the PC platform as well. DF know this is the case but still keep subtly misleading/misdirecting their viewership; why even bother, sigh.
It seems the 1080 Ti crushes the Radeon VII in Alan Wake 2 with the latest patch... You tested the 1080 Ti in the other video you made and were getting something like ~40 fps at PS5 resolution/settings... while the Radeon VII here drops below 30 fps, and with missing visuals.
My RX 590 gets 35 fps in Alan Wake 2 on Linux; pretty sure the Radeon VII will get higher than 40 fps on Linux.
Radeon vii /5700xt has great performance on Linux in Alan wake 2
This RVII footage is from before the mesh shader patch; the performance should be somewhat corrected now.
@@tj3495 No, it's not because of mesh shaders, it's a driver problem. You see, AMD isn't updating drivers for RDNA 1 and below, they're abandoned; the only option is the open-source driver on Linux.
@@fdgdfgdfgdfg3811 could you upload a video?
Don't you have any Xboxes you can do this stuff on? Surely owners of that system might be interested 👍
AMD produces drivers for Vega and there are new drivers as of July 2024. Secondly, neither the 6700 nor the PS5 can do productivity work or compute as well as the Radeon VII (or at all, in the case of the PS5). Thirdly, AMD uses the Vega design (CDNA) for its monstrously powerful compute/AI cards. If you're going to overlook the foibles of a PS5 dropping frames in Cyberpunk 2077, it only stands to reason that you would do the same when a game is obviously a terrible port, like Alan Wake 2 or The Last of Us. The RVII is a singular GPU that holds up within 10% of a PS5, 5 years after its release, and it can do a myriad of things a PS5 can't. Office, compute, productivity, AI and, yes, gaming are all possible. If anything, I think it's the PS5 that looks pretty bad in this video, not the RVII.
3:25 XDDDDD
IT'S VII NOT 7
Wut?
Calm down unc
Go touch grass
same shit
VII roman numbers 😂😂😂 = 7
CHINA!🤣
You know it's all fine when the GPU that finally beat the GTX 1080 Ti (some years later) has specs nowhere to be found in any new launch since then... 16 GB of HBM2 on a 4096-bit bus at 1000 GB/s... Look at the iGPU AMD Radeon RX Vega M GH specs, 1536 shaders; nobody was thinking of PS5-class specs back then, as Vega 10/Polaris 22 started at 3584 shaders on dGPU. Who was fired at AMD at this point? Because since then it's been only cuts to specs, trying to do the same or more with fewer moving parts and less silicon, until they made something almost as good (and better for 99% of users) as RTX 30, and then BOOM, underwhelming, overpromising, delayed launches of hardware not working at day 0... How fast would a GPU like this be with 256 MB of Infinity Cache and the current dual-issue RDNA 3 architecture? You can find CDNA 3, but in server form only, since June 2023. But gaming? What has AMD been waiting for since 2017...
From the point of view of a modder, extra memory wins out. It depends: the 2080 is better for casual gamers long term, the Radeon VII is better for core gamers. But both products had major issues and deserved poor reviews at the time.
Cope. Did you not watch the video?
The ONLY reason to pick the Radeon 7 over a 2080 today is if you're building a Hackintosh.
@@a36538Do you children know any other words than 'cope' or 'xbots'? Pay more attention in school.
@@a36538seems you're the one who hasn't watched the video.
The RTX 2080 destroys the Radeon VII in modern games like Alan Wake 2 and The Last of Us Part 1.
The Radeon RX 7900 XTX laughs at the PS5's low fps.
The PS5 laughs at the 7900 XTX's price; you practically pay for 2 PS5s to get a single GPU. What a joke!
It was a good console in its day, but PS5 is a bad buy in 2024. $500 for low FPS with a janky frame time? No thanks.
How is it a bad buy when just buying a decent GPU already costs over 500 bucks? And that's a single PC component, forget about the rest. Your comment is just delusional.
Another? Oh my god, is the Xbox Series X still manufactured? It's impressive how the Microsoft console has no relevance at all. Soon it'll be a bike vs the PS5 lol
Pretty certain you’ll find the RX 6600XT not the 6700 is closer to the performance of PS5.
The RX 6600 XT only has a 128-bit memory bus and 8 GB of VRAM. It would absolutely get destroyed by the PS5 in newer, more modern games running resolutions beyond 1080p.
@@Celestial-City The PS5 doesn't have the GPU power nor the CPU to run modern games at resolutions beyond 1080p at frame rates worth a damn, so the 6600 XT is actually a closer match in terms of GPU horsepower. Having a lot of VRAM without the power to justify it is pretty pointless. Hence the reason you have these consoles running 30 FPS at fake "4K", etc. But 4K TVs are so popular, and hence you have these very suboptimally configured consoles.
No, they tested this already, and the 6700 10 GB is on par with the PS5 in raster and RT. The 6600 XT is too slow for RT.
@@dagnisnierlins188 yup. The OP is delusional lol 😆
@@erickelly4107 my guy. There are hundreds of videos on YouTube by GPU reviewers demonstrating that 8 GB VRAM cards are obsolete for upcoming games that use high texture detail. The PS5 can actually compromise by targeting 40 fps (which is good enough for single player) with internal resolutions beyond 1440p, using FSR to hit 4K. Most PC graphics cards, even the high-end ones, will use upscaling techniques to reach 4K.
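The bus-width and bandwidth figures being argued over in this thread fall straight out of bus width times per-pin data rate. A quick sketch using the commonly published specs (RX 6600 XT: 128-bit GDDR6 at 16 Gbps; PS5: 256-bit GDDR6 at 14 Gbps):

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth: bus width in bits times per-pin data
    rate in Gbps, divided by 8 to convert gigabits to gigabytes."""
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(128, 16), "GB/s  (RX 6600 XT)")  # 256.0
print(bandwidth_gbs(256, 14), "GB/s  (PS5)")         # 448.0
```

So on paper the PS5 has roughly 1.75x the memory bandwidth of the 6600 XT, which is part of why the 6700 (with its larger Infinity Cache and VRAM pool) is usually considered the closer match.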
PS5 low hardware
Is there an Intel + NVIDIA prototype PS5-spec'd PC equivalent? I know the answer... it's been hiding in plain sight... Could it feel any more wrong to game like a PS5 on an Intel+NVIDIA combo? And turn ON RTX? 😂
Wow, an RVII that is still working lol
PS5 with worse image quality 👍
Radeon Sieben
yay, another video about the PS5 vs some random GPU!
And you clicked and commented, pushing up the engagement numbers. Thanks! ❤
@@KarlTheExpert You're welcome?
Do the same tests with the RX 5600 XT and let's call it the PS5 AMD prototype... 2304 shaders, just like the PS5 and the RX 6700... oh wait? AMD considered the 5600 XT to be "upgraded" with the almost-finished RDNA 2 architecture and called it a tier higher... it beat RTX 30 in some aspects, but there is more... price up, cost down, wrong name... product not launched with the rest of the line-up... wow, AMD doing AMD things...
meow
"You can see that the 6700, which is closest to the PS5, wipes the floor with it"
Because it is not close to the PS5.
Alex always does the deep dives that show where the console uses lower-than-PC-low textures in areas. He always shows that the graphics are inferior on console. But Richard cannot grasp that, or he is lying intentionally for clicks.
Sure thing, little buddy😂. Alex is like a golden comfort blanket to folks like you, isn't he?
@@DavidSmith-bv8mv "Alex is like a golden comfort blanket to folks like you, isn't he?"
He is the one actually trying to help console players avoid being fooled. You get what you pay for.
@@DavidSmith-bv8mv Remember what Richard was saying when the Xbox One launched? "It's like a high-end PC"... "It's about like an RTX 2080".
Then when the Series X dropped, he made fun of how dismal the Xbox One actually was.
Thrash that PS5 all you want, it's still gonna sell
That's the problem with people in the wider community: they like to "thrash" the PS5 a lot, and yet it is still selling in the millions.
Playstation is a pure win for me as a pc gamer. I'm always looking forward to new exclusives coming to my preferred platform.
@i3l4ckskillzz79 those 'exclusives' are going elsewhere to cover the 10 billion they lost
130 million monthly active PC gamers laugh at the number of PS5s
@@mikeuk666what are you talking about? Who lost 10 billion? What was the 10 billion lost from?
Well... my AMD Radeon VII and my ASUS Strix RX Vega 64 play Alan Wake 2 just fine without any texture losses, and the Vega 64 performs much faster than its rival GTX 1080 from back in the day... this seems to be a biased video with intentional hampering to diminish the AMD Radeon VII, because they envy how useful its 16 GB of HBM is.
How? I have a Vega 56 and AW2 runs and looks exactly like this clip
Do you use the custom drivers? That could be the issue here.
@@aboveaveragebayleaf9216 nope, just regular AMD up to date drivers.
@@JVCFever0 you probably don't have a Vega then :-).
PS5 performance is close to a video card that cost the same as the full console, impressive.
That video card can be picked up for a lot less than $500
The graphics card alone costs as much as the PS5 console itself, so what's the point in comparing them? Further proof why DF are Ultra-settings SNOBS
@@AZBCDE ur saying the guys that consistently make optimised settings guides for a bunch of games are ultra settings snobs? really?
@@mikeuk666 The price for the card dropped; I bought one for 450, 1+ year ago
@@alrightylolstill more expensive than a ps5
Another massive win for PS5
low hardware, PS5 = PC 450 dollars lol
Another? What was the other one? Did it 'win' against some GPU from 2015?
It wins against mid-range 2021 GPUs like the RTX 3060, which is pretty good for a 2020 console @@KarlTheExpert
@@julienping not really. Unless you buy used. With new parts, you are talking more like $700 give or take.
AMD FineWine aka fine Whine is dead 😵 It never existed, it’s just a coping mechanism for people who bought this rubbish and are still holding on
Still better than getting instant nvidia rubbish here and now 😂
Simmer down Jensen.
Fine wine is just an anecdote for AMD always releasing hardware with poorly optimized drivers; then the fixed drivers perform like they should have from the start. Vega still has untapped potential because it was never fully optimized, and AMD played artificial segmentation games by not enabling ReBAR and the new DirectX driver path, which you can get with modded drivers or Linux. Vega ultimately got the same level of fine wine that GCN 1 and 2 got; only RDNA 2 had good optimization at launch, and the 7900 is undergoing fine wine right now. It's a stupid meme, but it is a real thing, because AMD hardware is more future-proof and their launch drivers suck.
I wonder how the Radeon VII would hold up on Linux