Always glad to see this GPU on the channel. Yes, 4K will be too much to chew in the future, but for 1440p it is a very capable graphics card.
My pleasure!
Other than the VRAM issue, it's still a pretty good card for 1440p, and I'm going to ride it as long as possible before upgrading.
Same combo with 1440p and 1080p :) Great job
Nice! I'll be testing this card at 1440p resolution next.
Keep up the good work! I love your videos as always! 😁
Thank you! Will do! I appreciate your continuous support.🤝🙂
*why FRAME GEN usually sucks...*
If you have, say, 80 FPS without FG and manage "100 FPS" after FG is on, that's 25% more FPS, right? Well, since exactly HALF the frames are now interpolated, that "100 FPS" is actually based on a render rate of 50 FPS, with every other frame interpolated from the rendered frame data. So?
Well, your RESPONSIVENESS (lag, latency, sluggishness) is based on the real, rendered frames, so it will be slightly worse than what you'd get at a normal 50 FPS (since the latest rendered frame has to be held back while the generated frame is created and displayed, so you WAIT longer to see it).
The HIGHER the FPS number, the less of an issue this is.
And of course there's a visual cost which varies quite a bit.
Most of you guys probably know all this, but many don't. Do I think Frame Gen is the future? YES. Provided these three main things are true:
1) The FPS gain is relatively high (e.g. 100 -> "150 FPS"), and
2) The rendered FPS is relatively high (e.g. "100 FPS" (50 FPS x2) for slower-paced games, higher for faster-paced games), and
3) Visual artifacts and frame-time stutters are minimal
*Much of this is subjective and a bit of an oversimplification, but so far with my RTX 4070 + R7 5700X3D I haven't found a game where FG improved the experience. It also varies by monitor type, since things like persistence blur can be pretty bad at lower FPS values, so adding latency to improve VISUAL fluidity with reduced persistence blur might be a worthwhile tradeoff. Again, subjective.
Both the software and hardware will improve over time, with dedicated HARDWARE reducing the drop in rendered frames. So going from, say, 60 FPS -> 116 FPS or 100 FPS -> 180 FPS might be doable with minimal downside.
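A minimal sketch of the arithmetic in that comment, assuming 2x frame generation where every other displayed frame is interpolated (the FPS figures are just the commenter's hypothetical examples):

```python
# Rough 2x frame-generation math: half the displayed frames are interpolated,
# so the real render rate (the one your input latency follows) is FPS / 2.
def fg_breakdown(displayed_fps: float) -> None:
    rendered_fps = displayed_fps / 2       # only these frames sample your input
    frame_time_ms = 1000 / rendered_fps    # time between real, rendered frames
    print(f'"{displayed_fps:.0f} FPS" displayed -> {rendered_fps:.0f} FPS rendered, '
          f'~{frame_time_ms:.1f} ms between real frames')

fg_breakdown(100)   # 50 FPS rendered, ~20 ms between real frames
fg_breakdown(180)   # 90 FPS rendered, ~11 ms -- the penalty shrinks as FPS rises
```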
I can barely play modern games nowadays on my 4070 SUPER with reasonable FPS at 1440p, lol. It's like I never upgraded, with how unoptimized and demanding these games keep getting.
Silent Hill 2, 4K DLSS Quality. Graphics maxed out.
40+ FPS. Perfect for me.
4K 55" TV, 120 Hz.
Good to see the verification video in Japanese.
The 4070 SUPER still seems excellent.
By the way, I'm using a 7800 XT.
DLSS and FSR technology is a godsend for gamers.
And your videos too!
Brother, my specs: i5-12600K, 32 GB DDR5 RAM, RTX 3070 GPU. What is the best GPU for me?
Anything up to the 4070 SUPER level of performance should be okay I think. Keep in mind that I haven't actually tried the 12600K so it's just an educated guess.
You're fine with that setup; for a higher-tier GPU you'd need a better CPU.
As someone who plays more than 5 hours at a time, I do take my sweet time to enjoy games. 12 GB is not enough, and I'm sure 16 GB won't be in the long run either, based on my own experience. The safest is to have 20 GB of VRAM or more, with or without mods, for long play sessions.
After being with AMD for half a decade, I am upgrading from 1080p to 1440p with this GPU. I'm getting it next week. The 7900 GRE is 710 USD vs 730 USD for the 4070 SUPER. I think if I am spending this much money on a GPU, I don't want to deal with the lackluster visuals of FSR. FSR 3.1 looks okay, and even good when implemented well, but 99% of devs just don't care to implement it correctly with care. I'm planning to keep this GPU till the PS6 launches, and only upgrade maybe 1-2 years after the launch of the PS6, because I know at first games will still be optimized for the PS5, since not everyone moves to the latest consoles as soon as they arrive. Can't wait to see 1440p for the first time and see if DLSS is really worth the extra money. I've got sooo many RT games I have to play. First I am going to play The Witcher 3 for the first time. Heard it's good.
The 4070 SUPER isn’t particularly strong for ray tracing. Additionally, RT demands more video memory to run effectively, which the 4070 SUPER lacks. If you’re serious about enjoying RT, consider the 4070 Ti SUPER instead. That said, it might be worth waiting to see what new GPUs NVIDIA announces at CES on January 6th.
@@theivadim RT is fine on any recent RTX GPU, but path tracing is a whole different level.
@@theivadim I have already been without a GPU for 2 weeks after selling all my spare PC parts in early December. I am not waiting till February to buy the RTX 5070, and the RTX 4070 Ti SUPER is 280 USD more expensive. I don't have the money.
@@west5385 I went for the RTX 4070 SUPER for this reason. Almost $300 for a few more GB of VRAM and a 15% performance increase wasn't worth it in the end, but it really depends on the resolution and framerate you like.
With DLSS Quality or Balanced at 3840x2160, every game is playable with this card. Not native, I know, but with DLSS it works well. And some games actually look better with DLSS.
All I want it to do is 1440 and it does it oh so well.
You can max any game at 1080p, and that's all I want. Fuck 1440, I want everything at high frames and maxed tf out. Solves my "12 GB is not enough VRAM" problem.
A huge 71°C and a fan speed of 2000 RPM? Loud and hot, to be honest. If you're wondering how this card can do 4K: by lowering graphics settings... I'm not surprised. And RT not maxed out? Damn, Nvidia is really overhyped.
God, there are no good options in the GPU market. If you want ray tracing and DLSS, you need to go with Nvidia. I originally thought that as long as the GPU had the horsepower, it didn't matter that Nvidia decided to cut the bus width by 64 bits, but now I realize you have to settle for an uncomfortable amount of VRAM that is sure to be inadequate within 5 years, I'd say. I'm thinking about starting a business mainly dedicated to upgrading VRAM capacity. Otherwise, you're stuck with AMD. I'm not ashamed to admit that I am greedy and not willing to give up RTX HDR, DLSS, and RT performance. The only way forward I can see is a straight VRAM upgrade.
At 4K the CPU doesn't matter as much, since the bottleneck is your GPU. The 4070 SUPER is not for native 4K. Maybe with DLSS.
So when people test at a given resolution (1080p, 1440p, 4K), it actually means the screen resolution in the game settings, not the physical monitor size? Because I see your monitor is not what I'd think of as "4K size". Correct me if I'm wrong about this, because the in-game screen resolution setting vs monitor size thing confuses me.
I have no idea what you’re talking about. Resolution and size are two different things; they measure completely separate aspects.
@@theivadim EDIT: nvm, I'm an idiot; a quick Google search tells me it does not. My bad 😅
I mean, does monitor size impact performance? Like, does it matter if I play at 1080p settings on a 1080p monitor vs 4K settings on a 1080p monitor vs 1080p settings on a 4K monitor, etc.?
For the best visual experience, always set your monitor to its native resolution. If your GPU struggles to handle it, use resolution upscaling technologies like DLSS, FSR, or XeSS to boost performance. As demonstrated in this video, it’s possible to find settings that let you enjoy games at 4K, even with a GPU that isn’t technically marketed as a 4K gaming card.
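As a rough illustration of what those upscalers do under the hood, here is a sketch using commonly cited per-axis render-scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5); the exact ratios are an assumption and can vary by game and upscaler version:

```python
# Approximate internal render resolution for common upscaler quality modes.
# Scale factors are per axis and assumed; games/versions may differ.
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w: int, out_h: int) -> None:
    for mode, scale in MODES.items():
        w, h = round(out_w * scale), round(out_h * scale)
        print(f"{mode:>11}: renders {w}x{h}, upscaled to {out_w}x{out_h}")

internal_resolution(3840, 2160)
# "Quality" at 4K renders roughly a 1440p workload, which is why a card
# marketed for 1440p can produce a usable 4K image with DLSS/FSR/XeSS.
```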
Monitor size itself does not impact performance; what matters is the resolution you are rendering the game at. Here’s how it breaks down:
1. 1080p settings on a 1080p monitor: The game is rendered and displayed at 1080p. This is optimal because the monitor’s native resolution matches the game resolution, ensuring sharp visuals with no scaling overhead.
2. 4K settings on a 1080p monitor: In this case, the GPU renders the game at 4K, but the monitor scales it down to display at 1080p. This is called downscaling. It produces sharper visuals than native 1080p because the higher render resolution adds more detail, but it still requires a lot more GPU power, similar to playing at native 4K.
3. 1080p settings on a 4K monitor: Here, the game is rendered at 1080p but displayed on a 4K monitor. The monitor or GPU upscales the 1080p resolution to fill the 4K screen, which can lead to a blurry or less crisp image compared to native 4K. However, the performance remains the same as playing at 1080p on a 1080p monitor since the GPU still renders at 1080p.
Summary:
• Performance depends on the resolution the game is rendered at, not the monitor size.
• Downscaling (4K rendered → displayed at 1080p) looks better but is GPU-intensive.
• Upscaling (1080p rendered → displayed at 4K) preserves performance but reduces visual quality.
If your GPU can handle it, playing at a higher resolution like 4K and downscaling can provide better visuals even on a 1080p monitor. Otherwise, matching the game resolution to your monitor’s native resolution is the most efficient and visually clean option.
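A quick sketch of why render resolution, not screen size, drives the GPU load described above (raw pixel counts only; real-world scaling is not perfectly linear):

```python
# Relative pixel workload of common render resolutions.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1080p load)")
# 4K pushes 4x the pixels of 1080p whether the panel is 24" or 100".
```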
@@eleongo 1080p is 1080p; even on a 100-inch monitor it won't be harder to run, because it's the same number of pixels and the same computing power. But when we talk about what you see with your eyes, between a 24" 1080p monitor and a 27" 1080p monitor, the 24" one would be the recommendation because it has better pixel density than the 27" one.
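For the pixel-density point, PPI is just the diagonal resolution in pixels divided by the diagonal size in inches; a minimal sketch:

```python
import math

# Pixels per inch: diagonal pixel count divided by diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')   # ~92 PPI, noticeably sharper
print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} PPI')   # ~82 PPI, more visible pixels
```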
Remember when the 2080 Ti was a 4K GPU? Hahah, games are WORSE nowadays.
Nope. Too weak. Not enough VRAM. It will suck in Cyberpunk, Indiana Jones, Hogwarts Legacy, The Last of Us Part 1, etc., unless you use low settings and low texture packs.
I play Indiana at 4K easily... just set the texture pool to High or Medium...
Indiana runs at 1080p with full settings and full ray tracing. At 4K I drop textures down a bit; most settings are still at their highest, a few at medium. I barely notice a difference in quality; it's certainly not worth £200 more for the likes of the 4070 Ti with only an 11% performance increase. People get too hung up on VRAM. Only a couple of games I've played actually get close to hitting the 12 GB.
I play Cyberpunk at 4K DLSS Quality and get 70 FPS with high settings. You clearly know nothing.
You're right ^^ I'm not paying 700 bucks to lower my settings. That's why I bought a 4080 for the 16 GB, and I'm happy with that.
@@adegreen2731 How many hours are you playing? 1 or 2, or 4+, where the real problem starts?
Bought a prebuilt PC.
14400F, RTX 4070 Super.
32 GB DDR4, Win 11.
Periodically the SSD gets loaded to 100%.
The system lags.
Games run fine, but downloading and installing games turns into hell.
Help out a console guy, where should I dig?
I've never run into that problem. But the first thing I'd do is scan for viruses. Then make sure the drive isn't filled to capacity. After that, I'd reinstall Windows. And the very last thing you can do is replace the SSD.