@@AdeptN00B yeah, my 4080 has been great aside from the top HDMI port being dead; I just use the second one on my S95B and it's been fine. I also think I just saw that ALL RTX cards will get DLSS 4.0?
So this is where you need tensor cores now. Will the RTX 4000 series be able to run FSR4, since they have tensor cores? I can't wait for DLSS 4, to see what that's about.
@@David95853 Seems to be the case with FSR 4, as it won't work on the 7000 series, but perhaps some features will still work on older GPUs. Unusual move for AMD if you ask me.
Since FSR4 is only for the 9000 series, back to Nvidia I go, getting rid of my 7900 XTX for a 5080/5090. I supported AMD through the 7000 series, and that will be the last time.
Same bro, Nvidia lowkey cooking, back to Nvidia. I bought a 7900 GRE like a month ago and this is such a disappointment which is crazy because this GPU is a beast. The higher end 7900's can def run FSR 4, but it would be faster than RDNA 4 so they won't allow it as then it wouldn't sell
The 9070/9070 XT is a GRE with ray tracing. They could have at least done a 9070 XTX to be more along the lines of a 7900 XT with ray tracing. Can you believe I actually considered a 5070 Ti in my head? I have a GRE, what is wrong with me..
I think that's aiming too high. I think between 7800XT and 7900XT is where these cards land. AMD uses VRAM buffer as an upsell feature so the 7900XT should still have a reason to exist.
@@arch1107 Nah I assume it's because at least NVIDIA is supported and not a shock when a generation is just dropped off the map like AMD did with 7000 series (from a 7900 GRE owner)
This is just another reason why Nvidia dominates the market! Even with the newest GPUs, FSR/DLSS will be required on the newest games with RT on full. Budget gaming at 1080p is the only area where AMD, and now Intel, can compete; though the 5060 is likely to cost about the same and still perform better with DLSS. If running 1080p, though, I do not think DLSS/FSR will be needed. Still, I do not like the idea of spending money on a product that isn't as good if the price is close to the superior product.
Do you have any idea what you were trying to say? You sound like most Ngreedia users🐒. You know the new AMD cards will perform 2x-3x better at ray tracing, and FSR looks incredible now?!
@@arbernuka3679 What? 2x-3x better ray tracing? From the leaks it was supposed to be close to a 4070 Ti in terms of ray tracing; what you're saying is it could match the new RTX 5080, lol.
@ compared to AMD’s last gen sure, but compared nvidia’s new 5000 series, AMD will still be outclassed! It also seems that FSR4 still has some slight fractal issues, but definitely a big improvement over FSR3 from what I’ve seen on similar videos. The big issue is that Nvidia has 90% of the market, so game developers are optimizing games for Nvidia on PC; Indiana Jones and the great circle doesn’t even support FSR yet! (It will eventually). I always saw AMD as better option for budget gaming due to lower prices and if gaming 1080 no RT, FSR shouldn’t be needed; so I was going to buy one for my son when I build him a PC. But if Nvidia is going to lower the prices of the 5060, that will make it more competitive.
If frame generation makes three frames from the one rendered frame, wouldn't that mean there would be four identical frames? If so, wouldn't that just output jerky, albeit higher, frame rates?
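For what it's worth, interpolation-style frame generation doesn't emit copies of a single frame: it blends between two real frames, so every generated frame is different and motion stays smooth. A toy sketch of the idea (this is NOT AMD's or Nvidia's actual algorithm, which uses motion vectors and AI models; `interpolate_frames` is a hypothetical helper doing a plain linear blend over one-pixel "frames"):

```python
def interpolate_frames(frame_a, frame_b, n_generated):
    """Return n_generated in-between frames evenly spaced from frame_a to frame_b.

    Frames are modeled as flat lists of pixel values; each generated frame
    is a weighted blend of the two real frames, never a duplicate.
    """
    frames = []
    for i in range(1, n_generated + 1):
        t = i / (n_generated + 1)  # blend weight for this in-between frame
        frames.append([a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)])
    return frames

# Two rendered "frames" (a single pixel each, for simplicity):
real_a, real_b = [0.0], [100.0]
generated = interpolate_frames(real_a, real_b, 3)
# The three generated frames are all distinct: [25.0], [50.0], [75.0]
```

The cost of this approach is that the second real frame must already exist before the in-between frames can be shown, which is where the added input latency people complain about comes from.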
@@Conumdrum yes, it's amazing, but you're not buying a GPU, you're buying a software technology. Multi Frame Generation with AI already existed before the RTX 5000. And there is a lot of open source software that does that.
@@mouadessalim7116 When everyone starts gaming at 120fps+ at 4K on the cheap, AMD is in trouble. Great-looking fake frames won't offend the ones using them.
Yup, it seems like AMD is trying to compete with old GPUs, lol. That would be like advertising the 9060 as equivalent to a 1080 Ti. AMD is a joke. I regret my RX 7900 GRE purchase; Nvidia next time for sure.
If AMD doesn't come out with some info and pricing and release dates, I will just go with Nvidia. Was hoping for team red to pull off a win, but this is sketch AF!
It's more about upscaling, and when you play a competitive game you don't use FG, you just use low settings for best performance; you don't play CS with max settings and expect to run 400 fps, lol. FG should be for single-player games, but I don't know a single person who is using it. DLSS, yes, it's good, but FG is not, and it's never going to be.
What do you mean? FSR quality has always looked way better than DLSS; the real difference between the two has always been the performance, since DLSS is capable of achieving more fps on worse systems when compared to FSR.
Bro, I'm not an AMD fanboy; in fact I have a 3060 and will buy a 5080 as soon as it comes out. But ever since I started trying DLSS vs FSR, in every game I've tried I get the same result: DLSS gives me more performance but looks horrible, while FSR looks much better but with not much of a performance upgrade... (As a caveat, I have only tried these in Quality mode, since anything more just looks way too bad IMO.)
@@ilbro7874 It is a good point; if Nvidia lovers like to pick on AMD, saying this is BAD, then Nvidia is bad as well, since both are trying to upscale / fake frames.
It sucks that Radeon 7000 cards can't use it, but since it is AI-based on new architecture, it's understandable. I have a 7900 XTX, and AMD has said they're not going after the high end anymore. If they're showing off a 9070/XT, it would back up there not being a 9080/9090 card. So I will just run with the card I have for another couple of years, since it is plenty powerful. I am really interested in their new products and, depending on the price, I might get one to replace my wife's 6700 XT.
Correction: they aren't in the high end this generation. The chiplet design (RDNA 3) was a little rushed, and the 7900 XTX was an unpolished product. They are actually happy with what they did with the chiplet design afterwards, but they had to release something to keep the board partners happy. So they had their better RT tech and FSR 4 to debut, so they whipped up a monolithic die, and here is the 9070. Next gen will be different 😊
DLSS 4 is also supported on 20 series cards, and that's from greedy Nvidia. But AMD can't even support FSR4 on its latest older cards. What a joke; how the tables have turned.
They brought all the features, even to the 20 series, except frame gen, to Nvidia cards, which is more than fair; that includes ray reconstruction. So you're telling me AMD couldn't do the same?
You simply can't have all the features of DLSS 4 on a 4000 series or older; it has always been that way. This happens with every new feature that depends on dedicated hardware. You can still use FSR 3.1, which works pretty well; I play Hunt: Showdown with FSR and it looks good.
Who would have bought a 9070/XT if they had FSR 4 on their (7000 series) GPUs? If you have a 7900 GRE/XT/XTX, would you have considered an upgrade to a 9070 GPU? The answer is NO, so sit down and buy an Nvidia card and play with fake 4x framegen fps🤣.
@@arbernuka3679 lol, like AMD is not going to use the same fake frame gen?? I don't like this new AI stuff, but Nvidia still beats AMD in raw performance. If you want the best GPU and don't want to use FSR or DLSS, go for the 5090. It's simple tactics, but it's going to work in the end.
fsr4 sucks! dlss shit sucks! thank u not for hyping this shit. Games gotta be optimized. Screw all those promoting shit frame generation technology. All lies and fake shit 😤😠😡
AMD needs to sell these cards at such a low price that they can't possibly make any $$$. They are just too far behind at this point, as they keep bleeding what little market share they have by continuing not to support older cards, over and over.
If you don't play new-gen games or single-player games, sure, but more and more games are going to slap you in the face with ray tracing, and in some you can't even disable it. It's going to be a shit show.
No one in reality, but since it will be shoved down our throats, we will need some RT performance to stay afloat. But if the future is APUs, well, not sure what to think about that.
Sigh. I am so glad I don't feel the urge to play this useless game of king of the hill anymore. Heck I could grab a 5700xt and it would be a huge upgrade for me. But there is no game on the market yet I have needed to upgrade to run, that I have really wanted to play. That day will inevitably come, but I will be damned if I am going to spend $500+ for a card that costs $42 for them to manufacture.
Cost of an item is not just the manufacturing cost. There is overhead such as design costs, financing costs, taxes, salaries, facility rentals, etc. The actual manufacturing cost is just a small part of overall cost, it can be as little as 10%, and that is before the manufacturer makes any profit at all.
@@Tugela60 How do you think their profits have soared over the last few years? Because I can tell you from watching their financials that all of those things average to a 10% increase over the same time period as well, partner. Factories and equipment are sunk costs that usually recoup their investment within 3 years.
@@rkwjunior2298 Don't look at it like that. It could be that, but think of it as everybody disliking Nvidia's lies about AI, so people want something not from Nvidia. In the end, we need AMD to be around, but AMD is not finding many reasons to keep selling GPUs, so expect APUs for everyone, everywhere.
So basically anybody who purchased a 7000 series GPU is being left behind, and this is why people don't want to buy products: companies are fickle and don't stand behind their product. I'll keep my RX 7800, but I won't buy another AMD card; I'll buy Intel next time.
Your voice sounds better than the weird stuff you do with it on your normal vids
different mic and setup
Agreed, I can't watch his studio videos, he sounds so unnatural with all the filtering.
agreed, always feels super unnatural in his regular vids as if he's over-prepared and scripted. i like this much more
great content regardless though!
I love his content and slog thru it normally in spite of his Trisha Takanawa voice. These CES videos have been perfect due to the less distracting weird voice thing. I hope he agrees and tones it back because the content is A+
I watch his videos often and noticed the same thing!! My wife likes to sit in my office and read while I watch youtube, and she was certain he was an AI voice. Dawid does tech stuff is another weird voice. It doesn't matter to me, I'm here for the content and Gamer Meld delivers!
For me I'm done with everyone selling hype for each team they love. I'll wait and see FSR4, DLSS4, raw power and price of each card when they are really available and then decide for myself as usual what is best FOR ME. My preference is always price/performance ratio, but that's just me.
you know what else is massive?
ur mooooooom 😂
Stop dragging that meme
AMD's wide open opportunity they're about to blow
@@-in-the-meantime... Makes zero sense to lock FSR 4 to just the 9000 series. But wait, AMD wants to play at being Nvidia; it just doesn't work, ffs.
My back
It feels smoother? Well, if they make you play with a controller, then they are trying to hide the latency. A pad hides additional latency pretty well compared to when you use a mouse.
Or maybe because most people are used to playing with controllers
@@PlusTiger Most? No, only the new wave of PC gamers. No PC gamer will play an FPS with a pad; that is something console players have brought over to PC. But third-person hack-and-slash action games are pretty okay with a pad, as they are slower and easier to lock on in, since the game assists.
Super true; as a solutions architect for an MNC bigger than AMD, I can confirm: using a controller is a cheat code to hide latency. The real test is when you use keyboard and mouse; that's when you get the real experience. And IMO, Nvidia beats AMD in almost every aspect except CPUs, which they are doing now with DIGITS; no guarantees, but I think they will do a CPU that easily slaps AMD's and Intel's.
@@Pillokun Yep, I only use a gamepad on PC mostly when playing racing games and some adventure ones or whatever; for most others it's mouse and keyboard, unless I connect it to my TV...
@@PlusTiger complete lie lmao. PC gamers rarely use gamepads, only for racing or games like elden ring. Otherwise we primarily use mouse and keyboard. Controller is a huge disadvantage and just isn't a good input unless an FPS title (most these days) has crazy aim assist
So basically the 7000 series AI accelerators are a waste of space that can't be used for AI upscaling. I smell BS on that, but OK.
Because if they allowed FSR 4 on RDNA 3 GPUs, they would beat the 9000 series, so they won't let it happen. Shame on AMD.
Maybe it will be supported over time through drivers, once they've sold enough 90xx series cards. We'll see.
It is stated that it's developed for RDNA4, but nowhere is it said it won't work on older RDNA GPUs... at least this is the info from many people who were at CES and talked to AMD.
Truth is, we don't know; however, that would be an excuse to go get a 9070.
@@JROCC0625 If AMD screws this up, even though Nvidia isn't an answer, I think I'll go and get at least a 4080, TBH...
@@WarlockSRB Nothing coming from AMD will be close to a 4080; strange comparison.
@@Dexion845 that's not what they are saying. 9070xt is supposed to be close to 4080s
It literally only says the FSR4 upgrade feature works on RDNA4, which allows you to use FSR4 in FSR 3.1-supported games. FSR4 is supposed to work, just not as well, on the 7000 series, but still work from what I've heard. But we will see.
When FSR came out initially, they were talking about at least 4 frames, I think, being generated between rendered frames as totally possible. I am assuming that will stay true regardless of the AI direction for upscaling.
Please dont tell me this.... And I hope FSR will be released to cards like the 7900xtx!!!
I wish YouTubers would use proper, real thumbnails. FSR3 is never that blurry; plus, there is a sharpness bar.
Was about to comment the same thing. Such a BS, clickbait thumbnail.
If you couldn't tell that it was not serious, then this video isn't for you; go back to playing on your Switch, bud.
The comparison between the two monitors at CES has both computers running FSR 3 & 4 on the performance mode
It was more so people recognize the point being made. With that said, I did change it. But yeah. It’s just so you can quickly glance and see what I’m talking about.
It's still bloody Vaseline crap on both my 1080p and 1440p monitors... It only looks fine on my 4K TV, viewed from further away, and of course it upscales from a higher resolution.
Remember when the 6000 series came out and it didn't have any FSR, but they said they were working on it? Finally they are getting something close to DLSS; it only took 5 years, and it doesn't even work on 6000 series cards.
at least you have XeSS
RX9070----Please be under 600 dollars. Please be under 600 dollars. Please be under 600 dollars. Pleeeeeeaaaaassssseeeeeee!!!
waaaay too much money lol...
Any info on AMD's streaming encoder? Is it better?
The upscaling and frame generation software solutions are a stopgap until significant advances are made in graphics processing. Basically fake it till you make it.
I hope you are right, because the alternative is that smaller-node architectures are approaching a physical limit, meaning that GPUs will either need to get bigger/more power-hungry or become increasingly more expensive due to lower yields.
They confirmed that RDNA 3 doesn't have the required hardware to run FSR 4. At this point I'm going to Nvidia.
The 7000 series has AI cores that work fine for AI programs, but they can't work for FSR 4? Come on, AMD.
I agree with everyone posting about ‘fake frames’ and kicking against other ‘AI tricks’ and marketing BS by all manufacturers, but the truth is that ‘traditional’ GPU technologies are hitting the constraints of the laws of physics.
The fact that the 5090 has a TDP of up to 575 watts is really telling.
The demands of gamers for better more ‘realistic’ graphics with each successive generation means that these tricks have to be employed to meet expectations regarding higher framerates, higher fidelity, higher resolution, and acceptable power draw.
Personally, I think there’s so much fun to be had in gaming that doesn’t require such high graphical expectations… look at the success of the Nintendo Switch, Minecraft, and indie games like Among Us, Hollow Knight, and Balatro.
In my 40 years of playing games I’ve noticed that demand for photorealism and framerates mostly comes from those who solely play AAA FPS blockbusters and competitive shooters, with the desire for graphical improvements making up for the lack of creativity, variety and gameplay interest that those genres entail.
0:28 you know what else is massive? :)
Price
Your shits?
I wonder when we will see RDNA 4 budget cards like RX 9060.
RX 7000 series cards have AI cores, or AI threads. Why does AMD push away people who bought the RX 7000 series on day one? Not cool...
Why would I buy a 9070 XT if my 7900 XT could have FSR 4? Just because it has better ray tracing (never use it)? Think, man, just think. Now people have a good reason to buy a reasonably priced GPU with good upscaling, like the 4070 and 4070 Super did with DLSS.
People with a 7900 are not gonna be interested in AMD GPUs going forward anyway, since they are not in the high end anymore.
Those people will go for a 5080/90 or wait for a 6080/90.
@@arbernuka3679 People with a 7900 are not gonna be interested in AMD GPUs going forward anyway, since they are not in the high end anymore.
Those people will go for a 5080/90 or wait for a 6080/90.
@@arbernuka3679 Doubt that; they just said it is only for the 9000 series, and even if it does come to the 7000 series, it is either going to be super limited or delayed.
@@lucaskp16 I've had a 7900 XT since day one. If the upscaling technology has generations, the RX 7000 series should get at least 2 generations of upscaling support from AMD, like FSR 3 and FSR 4. On the Nvidia side, upscaling technology has better support right now than AMD's. I'm talking about AI-based upscaling, not the old manual pixel algorithm. Seriously, not like this...
AMD has not called it FSR 4 or even identified what's on the monitor; it could be 4K gaming.
Firstly, the gameplay is too smooth for it to be native 4K output; both systems are running the same hardware, so the rightmost result looking that much better while also running at the same or better performance can't be the result of native output. Also, people have run the same game, Ratchet and Clank with FSR 3.1, on their own machines and confirmed the artifacting on the leftmost monitor is congruent, while the rightmost still has artifacting but to a much lesser degree, which means it is still being put through an upscaling solution. They have not called it FSR4, yes, that is true, but it's a very safe bet to assume it is FSR4 or a successor to 3.1.
@@Pvydrow YouTube is 60 fps; that's all you're seeing, and 60 FPS is not very smooth any way you slice it.
@@Conumdrum Your comment still does not address or disprove any of my valid arguments. Besides, there is still a way to notice frame rates higher than the video supports: it will technically be the same framerate, but higher refresh rates will look sharper every frame, with less blurring. Anyway, that wasn't even a main point I made; you just nitpicked it from my comment.
TBH it's absolutely trash that they aren't supporting the 7000 series. Who the hell wants to buy a new GPU worse than their last one? Shit's ludicrous.
Making FSR 4 exclusive to RDNA 4 and not even offering a fallback layer for the 7000 series is bad PR; they could at least offer the image quality portion of the update. My next upgrade will be Nvidia when the time comes. Even Intel's XeSS has a DP4a fallback, which would've made the most sense. What happened to the 7900 XTX's AI cores that are supposed to be there?
Fake frames feel terrible in most scenarios, and sometimes feel meh even in story mode, where latency isn't a crazy factor; it still feels off. I just want raw performance, as someone who doesn't even really care about ray tracing.
You do know that all video is comprised mostly of extrapolated frames right? Including the video you are watching.
Have you tried fake frames?
@@FDFFDS-go1oo Yes, they also have a tool called Lossless Scaling, which at 2x frame gen is pretty comparable to a regular Nvidia implementation at the game-engine level. I know you guys love to spew nonsense without any idea what you are talking about. It's "ok" for story games, where there is no competition and you are just playing for the looks, but otherwise it's basically not a real feature people care about.
@@Tugela60 Lmao, do you even think before you type? You are comparing a video you watch (pre-rendered) vs a live game that takes inputs and renders as you go in real time, where your inputs and latency are affected. Extrapolation has been a thing for a long time, and there is a reason it wasn't ever used in games: it's terrible.
Only people are angry with a 40 card😂
At some point, AMD has to create something that simply doesn't work on previous generation parts. There's no way to keep advancing if backward compatibility is a hard and fast requirement.
But that's a selling point of AMD, so they should figure it out. I mean, I wouldn't expect it on RDNA 2, but as a company it looks bad if it's not on RDNA 3, which has AI cores.
@@Antiquecurtain It's an expectation we have to let go of if we expect innovation.
Devs won't implement FSR 4 in their games just for 9070 cards; good luck finding a game with FSR 4.
I just want to know the price already. I want one.
Yeah, competition is good. How does Nvidia feel now that Intel and AMD are on their heels?
Amd is a nothing burger
The 5080 using DLSS4 hits 250+ fps in Cyberpunk at 4K on Ultra settings with full ray tracing, and the 5090 is twice as fast. Nobody is on Nvidia's heels.
@Conumdrum and it's 28 fps with fake frames off. 4 generations later and Nvidia still can't run cyberpunk at 30 fps according to their own website. Ur basically giving them 1-2k/yr to beta test their ai.
@@brussy1 exactly bro they lied they even said that the rtx 5070 is equal to a rtx 4090 ha ha ha😂😂😂😂
@@brussy1 I laugh, all these years and AMD still can't Ray trace DLSS 4 is going to be better than FRS 4 without frame gen and multiple frame gen will have people gaming at 4K at 150+ fps in most titles on the cheap, at the end of the day, its all bad news for AMD
A very massive improvement in Ratchet and Clank. I know that Ratchet and Clank is one of the worse ones for AMD FSR 2 & 3. I'm assuming it is (FSR4) not backwards due to the AI capabilities of the 7000 and older RDNA series GPU's. Well, we'll know more when AMD is more willing to talk about the 9000 series - and there is a definite fury of rumours floating about 9000 series.
The 7000 series has a.i. cores they should have a version on the 7000 series and as a fan of amd people shouldn't let them off the hook on that one.
Except AMD hasn't said it's FSR 4
Improvements exclusive to new cards. Nvidia: "THEY ARE GREEDY." AMD: "OMG AMAZING."
The two situations are not equivalent. Nvidia limits the access to the software, this situation with AMD is hardware based.
The 4090 can't do DLSS 4... yeah, ok
@@edhorseman2237 The 7000 series has AI accelerators like the 9000 series, they just aren't as efficient. I think the issue is that if RDNA 3 had FSR 4 it would be faster than RDNA 4, so they won't allow it
@@SkylordRevision1 the only 7000 card that would be faster than the 9000 is the XTX...
@@edhorseman2237 Nah, AMD themselves have said that the 9070 non-XT would be between the 7800 XT and 7900 GRE, and the 9070 XT between the 7900 GRE and 7900 XT. The difference would be negligible
Hey, would the RX 9070 XT go better with my 5800X3D than the RTX 4080 I have paired right now?
Not necessarily, but we don’t know 9070 xt benchmarks yet
Nah keep 4080.
That 4080 is gonna be good for a while, bro. Are you having issues playing stuff? Keep it till you do.
@@AdeptN00B yeah, my 4080 has been great aside from the top HDMI port being dead; I just use the second one on my S95B and it's been fine. I also think I just saw that ALL RTX cards will get DLSS 4?
4080 all day. AMD won't even give us info. Very sketchy!
See, you need tensor cores now, so will the RTX 4000 series be able to run FSR 4, since they have tensor cores? I can't wait for DLSS 4 to see what that's about
Then the price had better be fair to gain market share from Nvidia
Does FSR use the CPU or GPU more? DLSS is hardware accelerated while FSR is software, so I always expect DLSS to be better.
FSR 4 uses machine learning, like DLSS. It uses hardware acceleration; therefore, it only works on next-generation GPUs.
@@David95853 Seems to be the case with FSR 4, as it won't work on the 7000 series, but perhaps some features will still work on older GPUs. An unusual move for AMD, if you ask me.
@@dcikaruga It's the people who kept complaining about FSR that are why we're here now, and now that they got a better solution, y'all are still complaining
@@chriswright8074 Do you feel I was complaining? FSR was pretty much free when you look at it, and it gave life to older GPUs.
AMD the Walmart of video cards lol
I don't give a crap about fake frames. I play with that garbage turned off.
I was really hoping they'd come out with a 24 GB version. I was ready to spend my money on AMD.
The more fake frames there are, the less game devs have to do to optimise. Eventually you're gonna get 95%+ fake frames.
The 7800 XT I got last week was a waste of 500 pounds, no FSR 4. Mine broke so I had to get a GPU
No. AI upscaling is not the future if people push back against this garbage. Native resolution FTW.
it is now lol
Since FSR 4 is only for the 9000 series, it's back to Nvidia I go, getting rid of my 7900 XTX for a 5080/5090. I supported AMD through the 7000 series, and that will be the last time.
Same bro, Nvidia lowkey cooking, back to Nvidia. I bought a 7900 GRE like a month ago and this is such a disappointment, which is crazy because this GPU is a beast. The higher-end 7900s can definitely run FSR 4, but it would be faster than RDNA 4, so they won't allow it, as then it wouldn't sell
"Nvidias dlss was much better but...."
Have to keep that Nvidia hate strong.
The 9070/9070 XT is a GRE with ray tracing. They could have at least done a 9070 XTX to be more along the lines of a 7900 XT with ray tracing. Can you believe I actually considered a 5070 Ti in my head? I have a GRE, what is wrong with me..
I have a GRE but I'm having crashes in Warzone, on the latest drivers as well
I think that's aiming too high. Between the 7800 XT and 7900 XT is where these cards land. AMD uses the VRAM buffer as an upsell feature, so the 7900 XT should still have a reason to exist.
As good as DLSS 4? Without something as good as Nvidia Reflex 2?
AMD has Anti-Lag 2
@ 2nd class
@@Ultrajamz still works better than Nvidia Reflex :D
I'm about to sell my 7900 XTX and get an Nvidia card out of spite for this, lmao
You're weird
because you want 75% of your frames to be fake?
@@chriswright8074 cope
@@arch1107 Nah, I assume it's because at least Nvidia cards stay supported and it's not a shock when a generation is just dropped off the map like AMD did with the 7000 series (from a 7900 GRE owner)
Lol excited like a little kid in a candy store.
This is just another reason why Nvidia dominates the market! Even with the newest GPUs, FSR/DLSS will be required for the newest games with RT on full.
When it comes to budget gaming at 1080p, that is the only area where AMD and now Intel can compete; though the 5060 is likely to cost about the same and still perform better with DLSS. If running 1080p, though, I do not think DLSS/FSR will be needed. Still, I do not like the idea of spending money on a product that isn't as good if the price is close to the superior product.
Do you have any idea what you were trying to say? You sound like most Ngreedia users 🐒. You know the new AMD cards will perform 2x-3x better at ray tracing, and FSR looks incredible now?!
@@arbernuka3679 what? 2x-3x better ray tracing? From the leaks it was supposed to be close to a 4070 Ti in ray tracing; you're saying it could match the new RTX 5080, lol.
@ Compared to AMD's last gen, sure, but compared to Nvidia's new 5000 series, AMD will still be outclassed!
It also seems that FSR 4 still has some slight artifacting issues, but it's definitely a big improvement over FSR 3 from what I've seen in similar videos.
The big issue is that Nvidia has 90% of the market, so game developers are optimizing games for Nvidia on PC; Indiana Jones and the Great Circle doesn't even support FSR yet! (It will eventually).
I always saw AMD as the better option for budget gaming due to lower prices, and if gaming at 1080p with no RT, FSR shouldn't be needed; so I was going to buy one for my son when I build him a PC. But if Nvidia is going to lower the price of the 5060, that will make it more competitive.
Idk about everyone else, but I'm tired of all this upscaling and AI stuff. What happened to optimization and real generational performance gains?
If frame generation makes three frames from one rendered frame, wouldn't that mean there'd be four identical frames? If so, wouldn't that just output jerky, albeit higher, frame rates?
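For what it's worth, frame generation doesn't repeat one rendered frame: it synthesizes distinct in-between frames from a pair of rendered frames, so the output isn't four identical images. A minimal sketch of the idea in Python, assuming frames are just lists of pixel values and ignoring the motion-vector and optical-flow machinery that real implementations (DLSS FG, FSR FG) actually use:

```python
def interpolate_frames(frame_a, frame_b, n_generated=3):
    """Blend n_generated intermediate frames between two rendered frames.

    frame_a and frame_b are lists of pixel values. Real frame generation
    uses motion vectors and optical flow; this naive linear blend only
    shows that the inserted frames are distinct, not copies of frame_a.
    """
    generated = []
    for i in range(1, n_generated + 1):
        t = i / (n_generated + 1)  # interpolation factor, 0 < t < 1
        generated.append([(1 - t) * a + t * b
                          for a, b in zip(frame_a, frame_b)])
    return generated

# One black pixel fading to one white pixel: the three inserted
# frames are all different brightness levels, not duplicates.
print(interpolate_frames([0.0], [1.0]))  # [[0.25], [0.5], [0.75]]
```

Because each generated frame moves partway toward the next rendered frame, motion stays smooth rather than jerky; the real cost is latency, since the next rendered frame must exist before the in-between frames can be shown.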
We don't need fake frames.
Don't kid yourself, DLSS 4 looks fantastic and is low latency
@@Conumdrum You're a sheep.
That's just a "solution" to an intentionally created problem.
I ain't supporting that garbage with my wallet.
@@Conumdrum Yes, it's amazing, but you're not buying a GPU, you're buying a software technology. Multi Frame Generation with AI already existed before the RTX 5000, and there's a lot of open source software that does that.
@@mouadessalim7116 When everyone starts gaming at 120+ fps at 4K on the cheap, AMD is in trouble. Great-looking fake frames won't offend the ones using them
All video is fake frames dude. You don't watch video?
Bye2 AMD
Gaming as we know it is dead; it's all down to AI now, with no real optimization from devs.
Everything looks good in ratchet & clank
Forget the 9070 XT. The RTX 5070 is even better than the 4090, for only $549
Yup, it seems like AMD is trying to compete with old GPUs, lol. That would be like advertising the 9060 as equivalent to a 1080 Ti. AMD is a joke. I regret my RX 7900 GRE purchase; Nvidia next time for sure
If AMD doesn't come out with some info, pricing, and release dates, I'll just go with Nvidia. I was hoping for team red to pull off a win, but this is sketch AF!
I don't think they'll wait that long to announce their line up and features.
$480 for the 9070 XT
Nvidia gave a release date for the new 5000 cards. AMD is going to release just before the Nvidia launch to undercut their sales, you watch!
@@qwertyqwerty-zi6dr rumor only
@@tonyd7164 you must be psychic to know this
So instead of improving pure performance, they're just gonna use AI to make fake frames and create more delay in competitive games.
It's what Nvidia does. Nobody seems to care to shit on them for it.
It's more about upscaling, and when you play a competitive game you don't use FG; you just use low settings for best performance. You don't play CS with max settings and expect to run 400 fps, lol. FG should be for single-player games, but I don't know a single person who uses it. DLSS, yes, it's good, but FG isn't, and it never will be
Nobody should be buying a video card based on its ability to mask performance constraints with fake frame generation.
AMD is starting a revolution with this AI AA algorithm!
The AMD 6000-7000 series will be good for the garbage bin soon
Which is what people have; most people aren't gonna stick with a company that trashed their 2-year-old GPU.
If the future is AI-based, as we suspect, RIP high-end VR.
ml upscaling is the future*
* the future that nobody wanted
Major flop for AMD; Nvidia just keeps dominating the GPU market
oh yeah
Team AMD FSR 4! New GPUs lets go!
What do you mean? FSR quality has always looked way better than DLSS; the real difference between the two has always been performance, since DLSS is capable of achieving more fps on worse systems compared to FSR
You live in a fantasy world. DLSS is miles better than FSR in quality.
@@jibijay Not when wearing AMD fanboy glasses!
Bro, I'm not an AMD fanboy; in fact I have a 3060 and will buy a 5080 as soon as it comes out. But every time I've tried DLSS vs FSR, in every game, I get the same result: DLSS gives me more performance but looks horrible, while FSR looks much better but with not much of a performance upgrade...
(As a caveat, I have only tried these in Quality mode, since anything more just looks way too bad IMO)
Buttt
If this isn't good, point at Nvidia too... both are "fake", so don't complain now.
That's not a good point.
If both have fake frames available, but one can deliver more real frames per dollar... that's more important to me
@@ilbro7874 It is a good point. If Nvidia lovers like to pick on AMD and say this is BAD, then Nvidia is bad as well, since both are trying to upscale / fake frames
It sucks that Radeon 7000 cards can't use it, but since it is AI based on new architecture, it's understandable. I have a 7900 XTX, and AMD has said they're not going after the high end anymore. If they're showing off a 9070/XT, it would back up there not being a 9080/9090 card. So I will just run with the card I have for another couple of years, since it is plenty powerful. I am really interested in their new products, and depending on the price, I might get one to replace my wife's 6700 XT.
Correction: they aren't in the high end this generation. The chiplet design (RDNA 3) was a little rushed and the 7900 XTX was an unpolished product. They are actually happy with what they did with the chiplet design afterwards, but they had to release something to keep the board partners happy. So, since they had their better RT tech and FSR 4 to debut, they whipped up a monolithic die, and here is the 9070. Next gen will be different 😊
It's not understandable; the 7000 series has AI cores. I think this is a big mistake on AMD's part.
If the 9070 XT is faster than the 4080 Super all the way around and is $600 or less, AMD will win the gaming space.
I've seen rumours that the 9070 XT will be $480 MSRP; the target is 4080 Super performance for under $500
DLSS 4 is also supported on 20-series cards
And it's from greedy Nvidia
But AMD can't even support FSR 4 on its most recent older cards, what a joke
How the tables have turned
No, it's going to be only on the RTX 5000 and RTX 4000 series, just like the RTX 3000 series couldn't use any DLSS 3, so why would they do this?
They brought all the features, even to the 20 series, except frame gen, which is more than fair, and that includes ray reconstruction. So you're telling me AMD couldn't do the same?
AMD is brain dead. FSR 4 should have also been available on the 7000 series of GPUs; they have AI cores, use them. Brain-dead tactics, AMD, go broke ffs
The 7000 cards lack the hardware; nothing can be done about that. Some proxy of FSR 4 will probably make it over, but it won't have the AI acceleration.
You simply can't have all the features of DLSS 4 on a 4000 series or older; it has always been that way. This happens with every new feature that depends on dedicated hardware. You can still use FSR 3.1, which works pretty well; I play Hunt: Showdown with FSR and it looks good
Who would have bought a 9070/XT if the 7000 GPUs had FSR 4? If you have a 7900 GRE/XT/XTX, would you have considered an upgrade to a 9070? The answer is NO, so sit down and buy an Nvidia card and play with fake 4x frame-gen fps 🤣
@@arbernuka3679 lol, like AMD isn't going to use the same fake frame gen?? I don't like this new AI stuff, but Nvidia still beats AMD in raw performance. If you want the best GPU and don't want to use FSR or DLSS, go for the 5090. It's simple tactics, but it's going to work in the end.
@@cythose4059 Are you sure? Wait till you see the 5090's rasterisation performance in modern games 😂😂
FSR 4 sucks! DLSS sucks! Thank you for not hyping this crap. Games have got to be optimized. Screw all those promoting this frame generation technology. All lies and fakery 😤😠😡
AMD needs to sell these cards at such a low price that they can't possibly make any $$$. They're just too far behind at this point, and they keep bleeding what little market share they have by repeatedly dropping support for older cards.
Yeah, the only way they could be saved is to sell the 9070 XT for $400, or $450 max
Don't....care...about....raytracing...
If you don't play new-gen games or single-player games, sure, but more and more games are going to slap you in the face with ray tracing, and in some you can't even disable it. It's going to be a shit show
No one, really, but since it will be shoved down our throats, we'll need some RT performance to stay afloat. And if the future is APUs, well, I'm not sure what to think.
Sigh. I am so glad I don't feel the urge to play this useless game of king of the hill anymore. Heck, I could grab a 5700 XT and it would be a huge upgrade for me. But there is no game on the market yet that I've needed an upgrade to run and really wanted to play. That day will inevitably come, but I'll be damned if I'm going to spend $500+ on a card that costs $42 for them to manufacture.
The cost of an item is not just the manufacturing cost. There is overhead such as design costs, financing costs, taxes, salaries, facility rentals, etc.
The actual manufacturing cost is just a small part of the overall cost; it can be as little as 10%, and that is before the manufacturer makes any profit at all.
@@Tugela60 How do you think their profits have soared over the last few years? Because I can tell you from watching their financials that all of those costs averaged a 10% increase over the same period as well, partner. Factories and equipment are sunk costs that usually recoup their investment within 3 years.
FSR 4 is frame gen, which is fake frames
Like DLSS 4: specifically, 75% fake frames
@arch1107 Yet everyone is praising FSR 4 like it's the best thing since sliced bread.
It's fine. I see the bias.
@@rkwjunior2298 Don't look at it that way. It could be that, but think of it as: everybody disliked Nvidia's AI lies, so people want something else that isn't from Nvidia
In the end, we need AMD to be around, but AMD isn't finding many reasons to keep selling GPUs, so expect APUs for everyone, everywhere
So basically anybody who purchased a 7000-series GPU is being left behind, and this is why people don't want to buy products: companies are fickle and don't stand behind their product. I'll keep my RX 7800 but I won't buy another AMD card; I'll buy Intel next time
Have you seen what Nvidia is doing ?
Because you imagine Intel will stay around and will care about you?
Intel wants to sell AI GPUs like Nvidia; if they succeed, forget desktop GPUs
Same, got a 7900 GRE, not gonna buy AMD again
@@hungaryhusar6567 Exact same situation with me and my 7900 GRE. Nvidia next time for sure.
Agreed, but I'll buy Nvidia next time.