Please give this video a thumbs up & leave a comment for the YouTube algorithm :), it really helps me; all these videos I make are extremely time-consuming.
If you still don't believe that my videos are real, feel free to check out my unboxing videos or the community section of my channel.
00:00 It's real and not fake - if you need any further proof check out my channel
00:11 God of War
00:44 Marvel's Spider-Man Remastered - TAA
01:10 Marvel's Spider-Man Remastered - DLSS Quality + Frame Generation vs DLSS Quality
01:36 Witcher 3 NextGen - Ray Tracing ON - TAAU
01:55 Witcher 3 NextGen - Ray Tracing ON - DLSS Quality + Frame Generation vs DLSS Quality
02:15 Witcher 3 NextGen - Ray Tracing OFF - TAAU
02:33 Call of Duty Modern Warfare 2 (2022)
03:29 Red Dead Redemption 2 (Vulkan)
05:35 Far Cry 6
06:29 Sons of the Forest
07:00 Resident Evil 4 (FXAA+TAA)
07:32 Horizon Zero Dawn
08:32 DOOM Eternal
08:59 Forza Horizon 5
10:29 Watch Dogs Legion
11:49 Cyberpunk 2077 - Native Resolution
12:51 Cyberpunk 2077 - DLSS Quality + Frame Generation vs DLSS Quality
13:56 Hogwarts Legacy - Native Resolution
14:21 Hogwarts Legacy - DLSS Quality + Frame Generation vs DLSS Quality
14:48 The Last of Us Part 1
Looking at memory usage, it's impressive how much memory compression has improved. The 4070 uses between 0.6 GB and close to 2 GB less memory at the same settings.
It's likely the 4070's massive cache
Why am I here? I don't have the money.
eduardo says he misses you
@@taroushi bruh 🤣
Now this is what I can actually believe. Showing actual hardware is the way to go, thanks!
A lot of these benchmarks are kind of silly. If we're trying to discern the performance difference between the two cards, putting one with frame generation next to one without doesn't convey the actual rasterization difference at all, and can lead people into bad buys.
When the comparisons show the 4070 within 10-20 fps of the 3090, you know there's something wrong here.
Yeah, it's called frame generation. Even a 4060 will destroy this card in games that support it.
@@GideonCyn frame generation makes everything look like shit and adds input lag.
@@GideonCyn that's not frame gen, it's just DLSS
A 4070 is around 10-20 fps slower than a 3090.
2080 ti not bad if you got it way back in 2018
Price slashed by 70% in the last 5 years.
The 2080 Ti is actually not bad nowadays.
I think you couldn't have bought an item with greater price depreciation if you tried.
Pick up a used one for 200 USD and it's well worth it.
Yep, the 2080 Ti is only ~10 fps lower.
Look how poor the optimisation in Hogwarts is. GPU load barely reaches 95%, and it's even worse with DLSS.
250 vs 650. I got the 2080 Ti and I'm holding on to it until the 5070 Ti arrives.
Facts, I got a used 2080 Super for 180 and it works fine.
With bumping up to a 4070 or 4070 Ti, do you think it would be better to have 32 GB of RAM, or is 16 still enough?
Depends on the games you are playing; for Minecraft, more RAM is always good.
20% more performance for 60-100 watts less!
That is why I'm upgrading from the 2080 Ti. Today's electric bills are crazy & I'm tired of sitting next to the noisy heater.
@@NullifidianYT you can easily undervolt the 2080 Ti to 200 W without any performance loss. It makes a big difference in heat.
@@nintendork07 you can do the same with the 4070
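The "60-100 watts less" point translates into electric-bill savings with some quick arithmetic. Here is a minimal sketch of that calculation; the wattage, daily hours, and price per kWh below are illustrative assumptions, not figures from the video:

```python
# Rough annual electricity-cost savings from a lower-power GPU.
# All inputs are illustrative assumptions, not measurements.

def annual_savings(watts_saved: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Estimated yearly savings: watts -> kWh over a year -> cost."""
    kwh_per_year = watts_saved / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Assume 80 W less draw, 4 h of gaming per day, 0.30 per kWh.
print(round(annual_savings(80, 4, 0.30), 2))  # -> 35.04
```

So the savings are real but modest; the undervolting mentioned above narrows the gap further since it cuts the 2080 Ti's draw too.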
Great comparison! 👏👏👏
What😂😂😂😂😂
Redo these comparisons using frame generation on the RTX 2080 Ti, since it's available via Lossless Scaling. I'm curious to see the results.
The only good thing about this card is power usage; judging by the 1% lows, it's only about 10% faster than the 2080 Ti!!! The RTX 3080 is stronger than the 4070!
They're about neck and neck.
But the 4070 has 12 GB of GDDR6X while the 3080 has 8.
@@GOTEEGAMING the 3080 has 10, not 8. And the 3080 in raw power performs better than the 4070 in 90% of games, especially at 4K.
Still on the 2080 Ti; just overclock it and you have a 4070, but power goes beyond 300 W.
@@aditrex an overclocked 2080ti does not equal a 4070.
Can you please do the 7950X3D vs 5800X3D at 4K with a 4090?
Got my 4070 today, and I'm quite disappointed in frame generation. Sadly you can't use it with V-Sync enabled 😑.
Who the hell uses V-Sync?
Bro forgot about G-Sync
I mean, who needs that anyway when you pretty much have a downgraded 3090/Ti? It's insane for 500-600 bucks.
Don't know about you guys, but I got mine for a great price and it runs better than my 2080 Ti... and I've always been an enemy of DLSS, but that performance difference without a visible drop in quality is simply fantastic.
@@v1nigra3 Why would you be an enemy of DLSS? Weirdo
You can tell your friends you have a 4070 😂 who knows?
Thank you! Great test!
English, you foolish Russian.
Why does your 2080 Ti have only 1920 MHz on the GPU?
prob undervolted
@@IgaTenzen And very slow RAM. Why does he use 3600 MHz CL18-19-19? Why not CL20 or CL25? Yes, it's sarcasm. Optimal DDR4 is 3600/3733 MHz CL15.
Yeah, not worth the upgrade. My 2080 Super is still okay-ish.
Of course it isn't, but it still destroys your old 2080S, which was at a 700 USD MSRP 🤡
Play Alan Wake 2 and you'll want any better GPU out there.
With the specs centered on the screen, it's hard to watch. Useless this way.
Is an i7 9700F enough for an RTX 4070?
Hell no... I used to have a 10700K, which was bottlenecking my 3080 at 1440p. I fixed it with a 13600K.
@@makavelideathrow3819 On a 9700K/9900K/10700K the GPU will be locked at 80%.
If you want, buy a 4070 for a 9700K, but the GPU will be locked at 80% and the CPU at 100%.
At 1440p, not that much.
@@preyjenn4551 Not sure what you're talking about. Again, my 10700K was bottlenecking my 3080 in certain games like Spider-Man, Far Cry 6, Forza, and a couple more games, EVEN at 1440p 165 Hz.
Otherwise, I would have had no reason to upgrade. And it wasn't that expensive of an upgrade once I moved to a 13600K system.
Make tests without RT. Why is CP2077 tested only with RT, but the Witcher both with RT and without? Wtf. And 1080p would be more interesting to see.
FIRST
Actually, no. Jansen was first with his own pinned comment, so now stfu kid.
Bottlenecked processor 😂 get an Intel i7 13700K and it shows the real result.
I don't see the CPU bottleneck. Watch the CPU usage in each game; the GPUs are the bottleneck.
The GPUs are at 99-100% usage. I don't see the bottleneck.
@@TheCultOfDanny with the 2080 Ti the processor won't bottleneck, but with the 4070 it will.
@@JK-zi7hb How can the CPU be a bottleneck when it's at less than 50% in every game, 99% of the time? You clearly have no clue what you are talking about.
It will bottleneck the 4070, not the 2080 Ti.
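The back-and-forth above reduces to a rule of thumb: if the GPU sits near 100% utilization, the GPU is the limit; if GPU usage drops well below that while the CPU is pegged, the CPU is. A hypothetical sketch of that heuristic (the thresholds and function name are my own, not from any monitoring tool):

```python
def likely_bottleneck(gpu_util: float, cpu_util: float,
                      gpu_threshold: float = 97.0, cpu_threshold: float = 90.0) -> str:
    """Crude heuristic: classify which component limits frame rate,
    given average utilization percentages from a monitoring overlay."""
    if gpu_util >= gpu_threshold:
        return "gpu"    # GPU fully loaded -> GPU-bound
    if cpu_util >= cpu_threshold:
        return "cpu"    # GPU starved while CPU pegged -> CPU-bound
    return "other"      # frame cap, engine limit, etc.

# The readings cited in this thread: GPUs at 99-100%, CPU under 50%.
print(likely_bottleneck(gpu_util=99.0, cpu_util=45.0))  # -> gpu
```

One caveat: aggregate CPU% can hide a single maxed-out core, so a CPU bottleneck can exist even below the threshold; this is only a first-pass check.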
Yeah I’m not gonna watch a video where you didn’t even bother to explain what is happening with your voice.
??? He never does commentaries in his videos, it's a benchmark only channel
???
🤨?? we watching the same video or was this for somebody else?
@@mikelowreyyy it got recommended to me. This is my opinion. Deal with it.
@@ErwinPPP nobody asked for your opinion, entitled kid.