1080p on a 24" monitor with DLSS Quality + FG looks phenomenal. This is a high-end graphics card for 1080p, high-refresh-rate gaming; no need to spend more.
Lol NO!
This is the very definition of torturing yourself for no reason. 😂
You don't need FG to get a high-refresh-rate experience at 1080p, and you could have a much more realistic experience at 4K high settings at 70+ fps without FG, since it's 4x sharper than 1080p, and you don't need a high refresh rate in single-player games.
@@clem9808 You don't need a high refresh rate in single-player games, but it's nice to have; most games on the 4070 at 1080p will run above 100 fps, and that's a valuable experience. And this guy is future-proofing to the max; he won't have to upgrade his GPU for a while at 1080p.
The menu in this game is well optimized!
More like the only optimized thing in the entire game 😂
@@mobarakjama5570 Cyberpunk's optimization is decent; it gives you the performance the GPU has to offer. Games like Hogwarts Legacy and RE4 are just VRAM-limited, and any game is better than the TLoU PC port.
@@alphawolf60 Where is gaming headed with this crap? 🤦🏽♂
@@alphawolf60 TLOU1 looks godly compared to Jedi Survivor.
@@mobarakjama5570 It's heading for GeForce Now, exactly where Nvidia wants the average gamer to be (or Microsoft Blizzard, MS's competing online service)!
Tried it with a 4070 Ti and a 7600X, not expecting much, but damn, it was very playable: 85 fps average in Kabuki, and it looks really good. For an experimental feature it seems pretty far along in its development.
what res?
@@breadone_ Probably 1080p. I had a 4070 Ti / 13700K running very similar FPS with DLSS on Quality, the Ultra preset, RT Overdrive with path tracing, and Ray Reconstruction on.
@@wretchedslippage3255 I use 1440p RT Overdrive and it looks amazing, with 40 to 50 fps, which is enough.
@@wretchedslippage3255 I get these fps at 1440p with my R7 5800X + RTX 4070 non-Ti, with RT Overdrive, Ray Reconstruction, and path tracing on at DLSS Quality. Weird.
Oh man, path tracing looks amazing. I'm gonna do everything I can to save up for the 5090 or AMD equivalent, whenever they come out, so I can experience this
Friendly piece of advice: if you have to 'save up for a 5090', then that card is not for you. Don't waste money you don't have on what is ultimately just meaningless entertainment.
@@ruxandy i sell weed and lsd to kids and make very good money out of it 😁
@@YASxYT nice to see another trapper as I also sell edibles to 12 year olds but if u plan to use RT then NEVER buy amd as nvidia is vastly superior there 💪😝
@@Drip7914 As a child trafficker that you guys just drugged, I agree with your comment. What's the point of bigger VRAM if you can't use RT, frame generation, and DLSS 69? Meaningless entertainment? Dude, seeing 1 more fps for $1000k more is my entertainment; can't live without these wet and reflective roads!
@@Bos_Meong more like 3-6x RT fps for 400 more between the xtx and 4090 in the uk 😜👌
What I personally appreciate the most in your videos, Kryzzp - you actually max out settings, even on cards you're not 'supposed' to do that on. I really don't get anything out of watching benchmarks of games running at 1080p with medium settings. The most useful info I could get out of watching gameplay benchmarks is how games perform maxed out on the card in question, since I personally *always* crank my games to max. If I can't run a game maxed out, I don't play it until after I upgrade.
RT Overdrive looks so good in Cyberpunk. I hope more games use RT this well.
Unfortunately we're still a few generations away from it being playable on mid-range hardware. Still, it's amazing to take a look at and test :)
Doesn't matter how good it looks if it runs like a slideshow.
@@zWORMzGaming I know, but maybe in 2 or 3 years this will be playable; at least I hope so.
@@MinnySteppa On the 40 series it runs fine with DLSS Quality + FG.
@@MinnySteppa 80 fps is a slideshow? What a clown.
I like the 1080p Overdrive DLSS Quality setting. Looks beautiful.
I love your videos, bro, especially these 30-minute-or-longer ones. The way you make them is top quality and doesn't annoy at all. Keep up the amazing work, man.
PS: Do you think they will add FG to The Last of Us? Because I was sure that game was going to have it at release.
Another great suuuuunday video my friend. This was very fun. I love the most intensive parts when you get really low fps and crashes lol
Thanks buddy!
I'm glad you enjoyed the video 😃
Cyberpunk when it came out: No one can run me!
Cyberpunk a few years later with Overdrive: No one can run me!
To me, the main problem with this game is that, other than the lighting itself... it really doesn't look that good, it's certainly not what I would expect from a next-gen AAA title (especially when considering the ultra-high hardware requirements).
@@ruxandy Yeah, REDengine is pretty crap, especially the physics.
@@ruxandy Yeah, the game also forces TAA, so it's a blurry mess, and the draw distance is terrible even when you configure it to be higher in the files. Constant pop-in of objects plus blur is not a good combo.
Nice, it's pretty intensive for sure!!! I tried it on my 2080 Ti at 2100 MHz: it does 1440p DLSS Performance (720p internal) or 1080p DLSS Quality (720p internal) path tracing in the 25-40 fps range.
Oh, and DLAA, even at 720p, does amazing work!!
I turned the DLSS sharpening up a lot and it looks much nicer.
Thank you so much for this video. I'll probably buy this card for my future 1440p monitor. Keep up the great work!
Please don't. This card will be dead in a few years due to limited VRAM and a narrow memory bus, and this is coming from a guy with a 3060. I really wanted this card to be good, but after seeing its capabilities, I'd say it's not worth it at all, even with its good power draw, which is what got me excited in the first place.
@@mobarakjama5570 I don't think anyone purchasing a 70 series for 1440p will hold onto it for more than a few years anyway, and 12 gigs is still alright for now.
@@shuyaku99 It's a mid-to-high-end GPU; they should be able to. 70-class GPUs should be powerful enough to last a long time, especially at QHD when talking pure rasterization, but this RTX gimmick is what's ruining gaming. I use an RTX 3060 with a 144 Hz QHD monitor and it's a smooth experience at high-ultra, but turn RTX on and it's a whole different conversation.
I think FG optimally needs at least 30 fps to work with for the input behavior to feel acceptable. I wouldn't try to use it without DLSS, although I get that you're also testing the worst-case scenario.
It would be interesting to see RT Overdrive path tracing enabled, but with otherwise turned-down settings and aggressive DLSS (Performance); just turn the sharpness up, which helps quite a bit.
Exactly, generating extra frames from a baseline below 30 fps is basically still running the game in an unplayable state, since you can feel that your input doesn't match the on-screen image.
Ideally you have at least a 40 fps minimum when using frame gen, which guarantees a pleasant experience. I speak from experience with my 4080 and 3800X.
But if you go for the perfect experience, I think you still want the 60 fps baseline and add generated frames on top of that, just for a smoother image.
The rule of thumb is that the input lag with frame gen will be at least as high as that of the base framerate it's adding frames to.
So getting 80 fps with frame gen basically means you have at least the input lag of a 40-50 fps game.
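A rough back-of-the-envelope sketch of that rule of thumb (illustrative numbers only, assuming one generated frame per rendered frame and ignoring FG overhead and Reflex):

```python
# Sketch: with frame generation, roughly half the displayed frames are
# interpolated, so responsiveness only tracks the rendered half.

def framegen_latency_estimate(displayed_fps: float) -> tuple[float, float]:
    """Return (rendered fps, approximate input frame time in ms)."""
    rendered_fps = displayed_fps / 2              # the other half are generated frames
    input_frame_time_ms = 1000.0 / rendered_fps
    return rendered_fps, input_frame_time_ms

print(framegen_latency_estimate(80.0))  # (40.0, 25.0) -> 80 fps on screen feels like ~40 fps
```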
16:33 That frametime graph is crazy wow (Just wanted to see how RT Overdrive + Frame Gen on this card would handle at 4k)
It bugged out ahaha, I fixed it a few seconds later :)
@@zWORMzGaming Yup, watched a few minutes later, I'm surprised it ran semi decently tbh
Man, your videos give me a purpose to live, thanks a lot!
"it looks breath taking"
Me, who's watching in 360p: yes
Isn't it funny that we're at a point where I'm watching this on my 40-inch 4K monitor, and the quality downgrade from YouTube's codec means most of the differences you talk about are indistinguishable? I know the actual gameplay must look so much better! :D
The judder you experienced was the card not knowing what to do after going over its VRAM limit; it happens with my 4070 Ti when I attempt to play 4K Overdrive with Quality DLSS and Frame Gen. Also, I found Frame Gen's input lag a lot more tolerable when using a controller instead of mouse and keyboard.
The same behavior happened to me; I thought my GPU was broken. It really sucks that you can't use its full potential because of VRAM, but anyway, it's meant mostly for 1440p. Does a better CPU and RAM help prevent this judder?
Thx for the vid, i def want a 4070 now ngl xD always enjoy the content
15:40 1440p 60fps in Overdrive mode is the sweet spot for me on this card
16:00 50fps....
Definitely not the fault of the card that it's dropping there at 22:50; it's just bad optimization. There's nothing on screen that the card shouldn't be able to handle.
great vid, very helpful for me right now!
A good video as always.
Though I found it quite confusing when you went from Overdrive to regular RT + DLSS, and then back to Overdrive + DLSS.
Damn, RT really changes the way Cyberpunk looks
There is a mod that reduces the amount of bounces calculated for path tracing. It would be interesting to see you do some testing with the mod on some lower end cards
Maybe when it’s officially released
Like those RTX Remix games where you can tweak RT settings with Alt+X.
No.
The sweet spot for 1080p is at 27:40: Ray Tracing Ultra + DLSS Quality.
Idk what's happening but this guy is even coming in my dreams now. 😢
I used to think the same as you about motion blur, but it's actually really good.
Keep up the good work!
The stuttering at 4K RT Overdrive with DLSS Balanced + FG was due to running out of VRAM. FG requires more VRAM, and when you hit the limit you'll get worse results than without FG.
After optimizing the path-tracing to 1 ray and 2 bounces along with NIS and a light overclock, I can get close to 60 fps on my 3070 at 1080p DLSS quality, effectively better performance than a 4070 at stock settings since there's no frame generation so it's 60 real frames per second. It also looks better because now it renders at a high enough frame rate for DLSS to look decent even at 1080p.
I played Cyberpunk at 1080p with DLSS Quality on a 24-inch monitor and it looked great.
This is absolutely insane
My 1050ti rocks no matter what...just glad with what I got from god within...❤❤😊😊😊😊😊
I think we just found a new meta for YouTube titles. (with ___ Included!)
Lol this is amazing. My new laptop with the i7-13700H and RTX 4080 is getting the same performance as this i5-13600K and 4070 desktop. I'm impressed since I got this laptop for only $1990 🎉
Beast laptop, enjoy!
@@zWORMzGaming thanks, El Crispy 😄 Would be cool if you got a 40-series laptop, but I'm assuming you don't need to upgrade your 3070 laptop as it's still new
I mean, the laptop 4080 should be superior enough to beat a desktop 4070; if it didn't, it would actually be a disappointing product.
@@niebuhr6197 true. I'm very happy with it, considering my previous 3080 laptop could only keep up with the desktop 3060Ti
The laptop 4080 has more CUDA cores and 175 watts; how the hell is it only like a 4070 non-Ti? 😂
A future suggestion for when you buy or receive a GPU for testing: please include hotspot as well as memory temperatures as a separate segment at the end of the video for that specific GPU model, so we can tell whether the model is any good. Some models, like the Gainward Ghost (same GPU as the Palit Dual and another one), have quite high hotspot temperatures at or very close to 90 °C, and that's not good for longevity, even though you can undervolt to reduce it. For longevity it means the card will gradually climb in temperature as it ages, more so than cards with low temps from the start; those will also climb, of course, but not nearly as high thanks to a lower day-one temperature.
Cyberpunk? Really, bro? You have to say CyberBUG as always :D Love your vids, keep going.
So sorry! I thought we wouldn't see many bugs in this one 🤣
Thank you mate!!
But it's perfectly playable
Just gonna buy a 6600; 1440p High with FSR Quality looks fine to me.
Frame Generation takes some of your VRAM, so when you run out of VRAM it stutters like hell; that's why you had such an issue at 4K.
DLAA is just the anti-aliasing from DLSS, without the upscaling part. But the performance hit in this one is unacceptable.
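Since several comments here quote internal resolutions, here is a quick sketch of the per-axis render-scale factors commonly cited for the DLSS presets (exact factors can vary per game; DLAA stays at native and only runs the AA pass):

```python
# Per-axis render-scale factors commonly cited for the DLSS presets.
SCALES = {
    "DLAA": 1.0,
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    """Internal render resolution before DLSS upscales back to the output size."""
    s = SCALES[preset]
    return round(width * s), round(height * s)

print(internal_res(1920, 1080, "Quality"))      # (1280, 720)
print(internal_res(2560, 1440, "Performance"))  # (1280, 720)
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
```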
Damn is this game gorgeous now
Yup! It does look amazing. Not playable but it's nice to take a look at anyway!
OK then, let's wait for the 5000 series.
These cards are supposed to be the end-all, be-all, and they're only getting 5 fps using the technologies they're advertised for. And no, we should not have to use DLSS. If all cards had access to frame generation, these cards would be irrelevant.
I'm enjoying Overdrive with FG and Balanced DLSS on a 4070 Ti, getting like 90-120 fps at 1440p.
Yeah, the 4070 Ti is much better than the 4070. The 4070 is honestly a scam.
@@MutantMasterRace The 4070 Ti is also like $200 more than the 4070. The 4070 Ti is terrible value for what you pay.
The funny thing about RT Overdrive with path tracing and Ray Reconstruction is that the lighting looks... good. It's great tech, but it doesn't look better than plain RT on Psycho in Cyberpunk to my eyes. I find OD/PT/RR washes things out and loses a lot of detail, especially in characters' faces. 100% not worth the wild performance hit. I feel like it'll be 5 years before a GPU can do this at native res at any sort of acceptable framerate.
With DLSS I never use anything lower than Balanced; the loss of quality is too visible. Better to keep it at Quality, or Balanced if you really need it.
My Face when i saw 4K ULTRA RT without any DLSS : o.0
Would be interesting to test how much Nvidia Reflex helps with Frame Generation. Talking about 1080p, I actually intend to buy this card for 1080p gaming because I think my 2060 will soon start getting overwhelmed by UE5 games, although it still runs recent games pretty well at high settings, and the 4070 is really a rebranded 4060 Ti. I'll probably wait for prices to drop, though, because here in France the card is 650€, and I'm not sure I want to pay that price for a 60-class card when I bought the 2060 for 325€ back in the day.
People still game at 1080p? I get it for competitive shooters, because resolution doesn't matter as much as frames, but if you're willing to buy a 40-series card I'd upgrade to 1440p. Just my opinion though; game how you want!
@@griff_the_boxer I actually mostly play DotA2, and when I'm bored of dota I play single player games. Also, I'm unfortunately not rich, so 1080p is fine for now :D
@@griff_the_boxer Isn't it the most common resolution according to Steam?
This GPU is horrible because of the price; it makes it look like a really bad deal. People say, "Hey, but the performance is close to a 3080 and it's way more efficient." Well, guess what? OF COURSE IT LOOKS SUPER EFFICIENT FOR AN XX70 GPU. This is an xx60-class GPU, and xx60 GPUs were always rated for less than 200 W. Just use your brain.
The thing about DLSS and people complaining that "it's not native anymore": usually they haven't used it, or they've only used an early implementation of it. And they're typically the same crowd that screams and shouts about how Frame Generation is terrible and nobody should use it! Either they haven't used it or they don't own a GPU that's capable of it.
At 1440p myself, I always use DLSS Quality unless it's badly implemented. Sure, it makes the image appear "softer", but that's typically because it also anti-aliases the image as it scales it back up. Hence why there are sharpening options for DLSS as well as FSR, though the amount depends on the game too. In Cyberpunk, DLSS Quality with 10 sharpness is more than enough; you could do less. On Balanced, 15-20 is usually quite good. But in a game like The Last of Us you want to turn sharpness almost off.
In a game like Warzone you want at least 75 or it'll look really blurry. The game world itself somehow looks more detailed, but blurry.
It's crazy how well DLSS reconstructs the game world (not in a literal sense), adding details you didn't see before and smoothing out everything that would otherwise be distracting, such as aliasing and shimmering.
Please make a video on the RX 6800 vs. the 3070 Ti, and which one is better to buy if you can get them at the same price.
This will blow up ur pc.
With otherwise mixed settings, RT Overdrive runs at around 60 FPS for me at 1080p, using DLSS, on my 3060 Ti. (No fake frames either.)
I was surprised to... not see the huge drop, or even the large differences I expected, between path tracing, RT, and pure rasterization on my system.
(~58 vs ~62 vs ~120 FPS)
Then the settings did not apply on your system lol. Path tracing halves the framerate on my 3060 Ti system compared to when I use Psycho ray tracing.
@@dhaumya23gango75 I think it depends on the settings, and I wish I could double my framerate on RT Psycho. I might be bottlenecked by my 5600X there.
Also it depends on the area, in heavier ones I go down to about 50, but in lighter ones reach the 70s.
(everything on Ultra + PT)
Without DLSS I'm obviously at around 12fps, but in more aesthetic/scenic moments I can live well with DLSS Performance - Ultra-Performance has very fuzzy flickering artifacts, but P is a good sweet spot.
I also feel like Ray Reconstruction in 3.5 might help a lot with these very low pixel counts.
I'm on a 1080p 144Hz monitor, so I like to turn RT off for combat, to reach +120fps (might also turn off DLSS, cause I might be bottlenecked by the CPU then, also depending on the area inside/outside)
Edit: Important to mention, that it's also slightly overclocked and runs at ~2070MHz core + 7210MHz mem.
Bro I got you. You are in love with breaking your expensive GPUs😂
You're running out of VRAM at 4K with RT Overdrive mode on: if you take a look, some textures keep reloading, and most areas are a lot noisier than they normally should be. I have a 4070 as well, and at 2K with RT Overdrive, DLSS Quality, and DLSS Frame Generation turned on, the 4070 can handle it at around 50-60 fps and it looks really amazing.
But what the hell happened with DLAA at the end there? Is it possible the game was using DLSS in the same bugged way it was earlier in the video, and turning on DLAA "circumvented" the bug because DLAA deactivates upscaling or something like that? It simply can't be that intensive... might also be just flat-out broken on a new driver x_x
If in my country the RX 7800 XT and RTX 4070 are at the same price (the RTX 4070 a little less than the RX 7800 XT), which should I get for a new PC build? I mainly want to do 1440p story-driven gaming, and I need it for my computer science engineering studies.
I don't even know if this card is worth my money.
I would put the graphics settings on Medium or something that is optimized, and then put DLSS Balanced or Performance plus Frame Generation on it and play on 1440p high refresh. Disable Motion Blur and turn up the sharpness slider quite a bit.
Combining Medium Settings with Ray Tracing and aggressive DLSS setting might be the way to get 100fps+ with RT.
This card is struggling to play a 2 year old game
@@nothingam9983 the graphical upgrades aren't 2 years old though
And so is any other card on the market right now with complete donk settings
@@nothingam9983 Cyberpunk is pretty much the de facto benchmark game because of how intensive it was, and still is, lol.
The game was practically built with DLSS in mind for its highest settings, mate.
It's not the 4000 series that makes the difference; it's just DLSS 3 frame generation with its upscaling! :(
If they fixed the latency issues, had a better denoiser, and actually set the path/ray tracing up correctly, I'd say it was next-gen. I want to say they're using Intel's denoiser, but who knows; it's their own engine.
I played this game at 19 fps on my GTX 1060. If I can do it on that card, I can do it in Ray Tracing Overdrive if I switch my card sometime in the future lmao.
Can't believe Bob tried to Roast Kryzzp by saying that he's a 12 yr old, now i know why Kryzzp says Bob is evil
Woow, path tracing is so good. I think it will run at about 60 fps (RT Overdrive / 1080p / DLSS Quality / FG) on an RTX 4060 Ti, or 50 fps on an RTX 4060, maybe with DLSS Balanced. Not bad.
I have a 2023 Legion 7i with a 4090.
Is it better than a 4070 desktop??
Thank you.
This is such a weird generation. Somehow the 3080/90s are more expensive than the cards that are supposedly replacing them this generation. I’m in a weird spot where my GPU died and I have to replace it but otherwise I’d skip the 40 series. And it’s not worth overpaying for 30 series cards that are still overpriced for some reason.
Nice tests! Can DLSS Frame Generation work without DLSS Super Resolution?
If yes, try 1080p RT Overdrive only with FG.
Yes, it will work without DLSS Super Resolution.
@@msnehamukherjee Then the performance of the game will be better without the resolution scaling.
This isn't a 4K GPU for frame generation.
That's why it stutters:
the VRAM is maxed out.
Damn, path tracing looks incredible :o But at the moment the 4090 is the only GPU that can handle it a little bit; maybe in like 2 generations we will be able to use this! Good performance from the 4070, so sad that the pricing is not good :/ The 12 gigs are OK, but rumors say the 4060 (Ti) will get only 8 gigs, what a rip-off! These cards will be dead at launch xD The 4050 will likely get 6 gigs with a 96-bit bus, a new "GT" or "RT" card? ;D The 40 series might be the saddest gen Nvidia ever released... Maybe AMD will put some pressure on Nvidia.
I'm happy now, tbh, because I got a refund for my 3060 (Gigabyte is the worst; thanks to the shop for helping me get some money back) and I now have an Arc A770 LE in my system! It looks beautiful as heck, the cooler is silent, and I'm impressed by how fast Intel is improving the drivers! After the 1.62 patch in CP2077 and with the recent driver, this card comes close to a 3070. The RT performance is not bad, and better than AMD's. I'm playing CP through at the moment at 1440p Ultra, RT on except lighting, and XeSS 1.1 on Balanced (on Intel GPUs it looks awesome and is like DLSS, in some instances better and in some a bit worse; it looks better than FSR in my opinion) at 45-55 FPS. I don't need 60+ in single-player games.
I think I made the right decision. In newer games it's comparable to a 3060 Ti - 3070; in older games it's like a 3060. No issues so far with DX9 games or with crashes, everything works as it should, and with 16 gigs it's future-proofed! Paid 350€ for it, btw :)
Love your videos Kryzzp, still the best benchmarker on YT
I would like a new GTX card, like a "GTX 1950" 8 GB with RTX 3050 performance but no Tensor or RT cores.
@@dd22koopaesverde58 just get a used 1070 at that point
@@bignose1752 In the future I'm more interested in the RX 6600.
1:48 this is the next-gen technology lmao
Alien technology 👀
Can anyone tell me why he kills Bob every time? What's his beef with him?
They're just giving an update that most people can't enjoy. Why not give a gameplay update instead, or some new features that would come in handy?
I am REALLY curious to see how FSR 3 will compare to DLSS 3 in the future.
It will b 💩 jus like fsr upscaling
@@anuzahyder7185 DLSS might be better, but that doesn't make FSR shit
@@Rodrigo-rr6ym It is, buddy. Idk if you're an Nvidia user or an AMD user, but the difference is night and day. DLSS Performance > FSR Quality, and DLSS Performance is pretty bad tbh.
Surprised you didn't test 1440p RT Overdrive with DLSS Balanced or Performance plus FG.
DLAA is kinda broken for me in this game. I'm on an RTX 4080 and DLAA cuts my FPS in half vs. native/TAA. Looks better, yeah, but useless if it's broken. Do any of you guys have the same issue?
What would the RTX 3080 Ti look like? 🤔
Funny how the 3080 is that much better even at stock, simply because it's got a 320-bit bus rather than a 192-bit one.
DOA.
It can't even do 1440p Ultra at 60 fps... let's not even talk about the performance cost of FG or Overdrive mode performance...
Can you make a video on the 6800 one day? It's super underrated in my opinion, but I want to see how well it does with ray tracing and at 4K.
If I find it for a killer deal I'll grab one :)
@@zWORMzGaming hell yeah mate, good luck
@@zWORMzGaming There is also the 6700 XT. It goes for ~$350 but I don't know how much it is for you.
@@zWORMzGaming The 6800 non-XT dropped in price by a lot, now around 550 dollars. It's one of the most underrated GPUs, and it would be really interesting to see its benchmarks.
Radeon GPUs cannot handle Cyberpunk RT. These GPUs are only good for light RT, which is hardly even noticeable.
I really love DLSS. FSR is good and all, but DLSS doesn't just give you frames; if you have enough pixels to trade, it's free fps and a better image.
Not sure if you do already, but for tests with frame generation, can you turn on NVIDIA reflex to ultra/on in the NVIDIA control panel and see if it noticeably improves latency? I feel like the biggest drawback for this new 40 series technology is the added latency..
Latency is gud enuf with fg on.
You really need to have a summary section so people don't have to look at 30 video segments.
Oh sweet u have the same specs I’m going to be getting
crazy pop in
I admit it looks really good, but sacrificing that much performance isn't worth it. Maybe it'll be worth it with the 6000 series in the future.
Do you know what the disadvantage of FG is? Go up to a fence or something that looks like a lattice.
At 1080p with an RTX 3070 I get 50-60 fps with DLSS on Performance. I don't know if it's worth it compared to normal ray tracing, honestly.
Why don't you try low/medium/high settings, maybe even turn off some unnecessary options, and turn on Overdrive? I think it's better to play on Medium with Overdrive than on Ultra without Overdrive, because good lighting is much more important than effects and textures.
Should I get the 4070 or the 4070 Ti? Not sure if the price increase is worth it.
Don't get the Ti; 12 GB of VRAM at that price range is a scam.
14:22 Even at native with RT Overdrive on, the GPU still drops down to 89% usage and then goes back up. This must just be this patch of the game? Because I'm having the same issue on version 1.62, even at 4K.
Yeah it's not a CPU bottleneck for sure!
Am I missing something? If FG generates one frame between two real frames, then shouldn't enabling FG always grant a 1/3 performance increase? How come its performance gain differs across settings scenarios?
I think in a perfect world it should double the fps, but calculating the fake frame takes time, so it's usually somewhat less than double.
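A minimal sketch of why the gain isn't a fixed ratio (my own illustrative model with a hypothetical 3 ms generation cost, not measured data): one frame is generated per rendered frame, but generating it costs some GPU time, so the relative speed-up shrinks as the base framerate rises, and it collapses entirely if VRAM runs out, as seen at 4K in the video.

```python
# Toy model: each displayed pair = one rendered frame + one generated frame,
# and the generated frame costs a fixed slice of GPU time (hypothetical 3 ms).

def fg_output_fps(base_fps: float, gen_cost_ms: float = 3.0) -> float:
    """Displayed fps with frame generation, given the fps without it."""
    render_ms = 1000.0 / base_fps
    pair_ms = render_ms + gen_cost_ms
    return 2 * 1000.0 / pair_ms

for base in (30, 60, 120):
    print(base, "->", round(fg_output_fps(base)))  # 30 -> 55, 60 -> 102, 120 -> 176
```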
2020 RTX 3070 : I Am the 1440p King
2023 RTX 4070 Be Like : What Is 1440p King I can do both 4k and 1440p
"A 1080p can provide with sweet 60fps PLUS"
Nope; using the "High" preset with RT Overdrive it's 45-60 fps all the time.
16:30 VRAM limitation?
30:05 DLAA renders at a higher resolution and then downscales to native, like MSAA; the opposite of DLSS, which renders at a lower resolution. So it's also a demanding setting.
Usually it's not this intensive, for example in Hogwarts Legacy it runs better than TAA High.
So let me get this straight - a 2023 $600 70-class GPU marketed at and limited to the 1440p resolution by its bandwidth fails to maintain a steady 1440p60 pure rasterization performance at Ultra settings in a 2020 game. Yeah, this "4070" is a 4060Ti in disguise and should have been priced at $479.
(For context, the 3060Ti that came out in 2020 struggled to maintain a steady 1440p60 in Red Dead 2, a 2018 game, at Ultra settings without upscaling)
More like an RTX 4060 at $350. The 4060 Ti should have been the current 4070 Ti.
@Niebuhr Are there any specific requirements for the die to be considered 104 though? If not, Nvidia can just name an x50 class die as a 104 tier die and use it as an excuse to put such slow chips into x70 class gpus at absurd prices
Having an RX 6600, is it worth upgrading to this card?