I've seen videos where it "works", but for me it would just present a black screen, and a few seconds later LSFG would disable. Maybe it just doesn't work in the games I tested.
@@Mostly_Positive_Reviews Lossless did that to me when desktop recording specifically was ticked in NVShare settings. Turned that off and Lossless worked again. If you use that feature, I mean...
600fps on a 60Hz monitor huh? 😂 I kid, this software is great in the right application, but it always makes me laugh to see how top-of-the-line cards with maxed scaling do before the input lag and inconsistent 50+ms frame times kick in. I'll deal with it to get my Legion Go to reach its unnecessarily high 144Hz refresh rate, but a 4090? Just play native and enjoy the game man. 😂 (not you, OP. This content is appreciated)
Unfortunately my capture card is only 60Hz :( But looking at getting a 120Hz one soon! I agree with you though. Obviously what I showed is not meant for real-world use, but I'm glad you understood that, as many didn't. The Legion Go is actually a good example, as the screen is high refresh rate but also small enough to make noticing artifacts harder. The app has its uses for sure!
Despite the fact that the Nvidia x4 frame gen solution will no doubt be better than Lossless Scaling in terms of image quality and latency, it's not going to be anything like £2000 better. Nvidia are basing the bulk of their pitch for the 5090 on MFG, but when there's another app that does much the same for almost nothing, it's a pretty insane situation right now.
And that's $2000 MSRP. Seen a few listings at $3000, and will probably be $3200 here for an entry-level Zotac 5090, with the Strix and Aorus cards going for $3500+ here.
@Mostly_Positive_Reviews I would be interested to see a comparison of 4090 using its 2x FG with Lossless Scaling FG 2x layered on top Vs 5090 with 4x FG.
Where is it locked to 50? Base framerate is locked to your monitor's refresh rate, and in this case it's a 60hz capture card. No idea where you get 50 from.
@Mostly_Positive_Reviews I understand that it's OK to experiment, I'm saying it in general. The need is for x10 to work better on low-profile cards, not a 4090. Until then it's not a success, it's just marketing and profit. I tested it on a 4070 as well.
It feels smooth in hentai anal porn. Think about it. You have a 24 fps hentai buttfuck porn, it's shit, the animation is all over the place. In one frame the cock's out, the other it's in. Nothing in between. So here comes LS, multiplies the assfucking by 20, and you get 480 frames of assfucking per fucking second. Nice right?
I think there are ways to improve that, so I will be playing around more. Didn't think people would be interested in a video like this, so will do a more serious one as well.
In high-end games (say maxed out, you hit 70fps), I would only use any frame gen to hit 120 at the most, no matter the monitor. AI is the way forward, since native is too expensive/too power-demanding and needs a cooler twice the size of what GPUs have now, but we are not there yet. There must also be hardware changes to go hand in hand with software, rather than big pushes on new cards that rely on software more than anything else (looking at you 5070/5080). A worry would be fast online multiplayer games: all these extra AI frames will lead to "I shot him, no way did I miss" "No, you shot the extra frames"
On 10x and 20x, yes. But at 2x up to 4x, with Reflex enabled and a framerate cap just below your monitor's refresh rate, it's much better. Some people will notice the additional input lag and hate it, others not so much.
Please don't think I am saying that Lossless Scaling 4X is the same as DLSS 4X MFG. There will be differences in image quality, framerates, and input latency. DLSS MFG has a max of 4X, and that is why I started testing with 4X in this video, that is all.
Also, this video was done purely for fun. It's not that serious.
No, but what's clear is that it can be done, and if Nvidia wanted to, they would include it.
@@alexxxrarara Oh yes 100% - it's a great tactic to ensure FOMO in everyone owning a 40-series card and 30-series card.
You can't use DLSS4 4x frame generation with a 4000 series, that's the difference.
I can give you props for locking your frames in-game; most people reviewing this app don't lock in-game fps.
@ At least 40-series users can use the better DLSS4 (10-15 frames extra in performance mode with FG x2) and the other benefits: sharper image, almost no smearing. But tests will show more.
Well, LS is improving really fast. It's a much better deal than spending hundreds or thousands of dollars on a new GPU just because Nvidia decided to lock the feature behind a fake limitation.
Guys, remember that if you want more fps than what your display is able to show with Lossless Scaling, you have to change the option that says "Max frame latency"; you can set it to 3 and it will allow the program to generate all those fps that you couldn't get before.
❤
Hi, dev here.
Like you, I was puzzled by your video and wondered why LS doesn’t capture more than 60 FPS when it should, until I found the answer at 0:40. It turns out you’re running the game on a 60Hz monitor, and LS only captures the frames being displayed.
Hi there! So it would always cap the base framerate at the monitor's refresh rate, even if v-sync is disabled?
@@Mostly_Positive_Reviews Yes, LS uses non-intrusive capture APIs like DXGI and WGC to be safe for online games; these only capture frames that make it to the DWM and aren't discarded beforehand. The 60 FPS cap will no longer apply once you run the game on the same monitor you're playing on (I'm assuming it's high refresh, not sure what your setup is currently). For proper frame pacing, LS should capture all the game frames, so I recommend doing so.
I also recommend capping your game FPS, you’ll notice a big improvement in framepacing. Generating frames beyond your monitor’s refresh rate often makes little sense, as most of those frames will be dropped and you won't be able to see how smooth it's supposed to be. If you upgrade to a 480Hz monitor or higher, high multipliers like 60x8, 40x12, or 30x16 would make more sense.
To see LS working as intended now, try locking your game at 60 FPS and using a 60x4 multiplier on a 240Hz display (if that matches your setup).
@@ths6388 Awesome, appreciate the feedback. Yeah, my main monitor is 165Hz; I will test it there, find another way to record, and do a follow-up video. This was really just me fooling around to see how many frames I can get haha! And I have to say, the 2X, 3X, and even 4X modes are very impressive for what it is. I actually use 2X here instead of DLSS frame gen in Final Fantasy 16, as it works better.
@@Mostly_Positive_Reviews
For 165Hz go 82x2 or 55x3. You can also cap lower and let Gsync do its work.
To capture LS you can use OBS. When using DXGI capture in LS, its window is invisible to Windows capture APIs OBS uses, so you should select Game Capture mode to hook into LS swapchain.
When using WGC capture in LS, you can just use Display Capture mode. However, WGC is very limited until Windows 11 24H2 and won't detect the base game framerate, so the game needs to be capped at refresh rate / multiplier for proper frame pacing.
@@ths6388 I have a 120Hz 4K G-Sync monitor; I cap my fps to 60 and I use X2 mode. What is the correct configuration to ensure G-Sync works correctly?
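(For reference, a minimal sketch of the cap arithmetic the dev describes above, in Python; the monitor/multiplier pairs are just the examples from this thread, not official presets:)

```python
# Cap the game at refresh_rate / multiplier so every displayed frame
# lands on a refresh cycle, per the dev's frame-pacing advice above.
def ingame_cap(refresh_hz: float, multiplier: int) -> float:
    return refresh_hz / multiplier

print(ingame_cap(240, 4))  # 60.0 -> the 60x4-on-240Hz example
print(ingame_cap(165, 3))  # 55.0 -> the 55x3 suggestion for 165Hz
print(ingame_cap(120, 2))  # 60.0 -> matches the 120Hz question above
```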
The 5090 is losing to $5 software.
It's $3.5 in my country.
Mhhhhh, would you gift it to me if I send you the money on PayPal? @@bobgame763
Full of visual artifacts but ok
Works great with very few artifacts.
It's nothing compared to DLSS FG, but if multi-frame gen is going to be locked to the 50 series then this is pretty good.
Sometimes it's fun just to tinker around.
Thank you very much 🙏
I really did have fun recording this one. I think the break in new games releasing is good in the sense that you can just do silly stuff that you enjoy.
Absolutely @@Mostly_Positive_Reviews
Nvidia be like, "4090: 6090 performance in a 4090 for just $200,000"
I don't think people will ever forget the 5070 = 4090 slide 🤣
@@Mostly_Positive_Reviews Wait till the benches go up after launch, the memes will be glorious
@@pcgameshardware867 Yet you will still get people saying that it's true 🤣
So this is the multi frame gen feature for only 5 bucks
Absolutely fantastic bit of software 😊
Lmao no, Nvidia MFG will be way better than this.
@@mathdeep Will it be usable in ALL games, old and new, with the click of a button, or only supported games? Keep in mind this program works on ALL games and almost ALL hardware; it's insane for just 5 bucks.
@@nicknick4917 yes it's neat that it exists but it's not comparable to native MFG.
@ For sure MFG will have better quality, no doubt about that.
What I saw the other day is to match the resolution scale slider in Lossless with the DLSS mode you are using. For example, if you're using Quality DLSS, set it to 65 in Lossless and it produces more stable FG.
Thanks! I did see a few people talking about that on Twitter, I might have to do a follow up video.
You can just lower it to 25, if you don't care about artifacting
Yeah like I can't see the difference between 100 and 60 😂
Penguin approves the nice duck icon in Lossless Scaling
I've tried this with my 4090 and was surprised with the results. As long as you aren't maxing your GPU, the latency is manageable.
It's not perfect, but it's definitely something you can use.
And we can use Reflex + Boost and lock the original framerate with RTSS, using the Reflex framelimiter option (instead of async) inside the program settings, to get even lower latencies.
How is the image quality (ghosting, artifacting, etc.) comparing 2x FG with 2x LS? And 4x LS?
@@frantz4g63 I only notice ghosting and artifacts past 2x
@gotmilk6982 makes me wonder how DLSS 4 is going to look beyond 2x FG. That's really my deciding factor to upgrade from my 4080S. Looking forward to a 5080Ti if it comes down the line, or I'll just cave and get a 5090.
Surprised how well LSFG works, X2 anyway. For not having access to engine data it's quite good. That said, it does have a bit of a GPU cost, obviously. In a CPU-bound scenario this wouldn't be a very big deal since you have GPU resources unused. I only noticed it playing KCD on my 4090. I had around 55% GPU usage with an 85FPS lock, then turned on X2 and GPU utilization jumped to the mid-70s%.
Will be interesting to see what that cost is on Blackwell with MFG as people will undoubtedly compare LSFG X3/4 with DLSS4 MFG.
Yeah, X2 is actually very usable in some games. 4X is also not as terrible as I thought it would be, considering it's not game aware. Will definitely do a few comparisons with 4x MFG when I get a 5070.
Yeah, I was thinking the same thing. In CPU-bound scenarios the video card has space to work, perhaps even space to generate frames without a performance cost on old CPUs, like my Xeon 2697v3 LOL
No wonder my gtx970 tanks with that turned on. I never knew how much was being used or how much headroom is required…
@lacm81 Can you use the Nvidia overlay (e.g. the Nvidia app, it can be installed separately)? It shows GPU and CPU utilization in the corner so you can check if there are huge problems. Actually, there are some tweaks in Lossless Scaling, options for less demanding upscaling & frame generation, and you can always run the game at an even lower resolution, considering this program requires the video/game to run in windowed mode.
@@marlanivanovich1828 I can. I just haven't yet with that. I used to use it all the time for stats, but now I use MSI Afterburner, and I only considered monitoring GPU heat with my new build to ensure I had set things up properly. I'll have to look into this. So let me get this straight, I should run the Nvidia overlay and Lossless together?
I found this app to be especially useful if I want to watch something at 60 locked frames but have a 120 Hz display. Especially anime looks really good and for some reason YT videos.
interesting
do you have to download the vid?
@@h1ghken No, LSFG works instantly with any window, any moving pixels on your screen.
Which means you can always watch YouTube, TikTok, Netflix, at 240 FPS any time you wish.
Just be careful about your power draw, since LSFG can push GPU usage to 99%.
@niezzayt3809 nice
Interesting. Will check that out too, thanks!
What fps do you aim for in anime? I watch things at 480Hz, but with a 30fps source it doesn't work too well with motion. Hard transitions also look bad.
The minimum base framerate for a good frame generation experience is 60fps; everything below that feels laggy because of the response time.
The perfect combo in my case (with a 120Hz OLED monitor / RTX 3080): 60fps cap with RTSS, RTSS frame limiter mode set to Nvidia Reflex with Reflex marker injection, v-sync disabled everywhere but in Lossless Scaling, and Lossless Scaling LSFG3 at 2x. Everything is so smooth, almost no artifacts. It's really impressive how well it works in ALL games!
Wow, that's insanely good. Other than the aim button I could not see any artifacts in 10x Ratchet and Clank at 4K from my couch TV.
Yeah, it really is impressive. There are some other artifacts when in fast motion, but very impressive nonetheless.
@@Mostly_Positive_Reviews Yeah, I could see that a lot in Cyberpunk; the buildings move out of shape like Lego. Funny enough, it seems to have the most issues with just blank, linear buildings.
This app is really neat. I dug my 1080 Ti out of retirement and dropped it in next to my 3080 Ti. With this app, you can offload the frame generation from your main card and run it on your secondary (and weaker) card. This way, you get less of a performance impact when you activate frame gen.
Crazy thing is that the new DLSS model dropped optical flow and went for a similar approach to Lossless, with an AI model instead.
It'll be interesting to see if they now bring it to previous RTX GPUs as well. They kinda hinted at that, but nothing confirmed.
I say that Lossless Scaling (LSFG 3.0 x3, x4...) is actually impressive! Yes, with some minor cons, but it solves the primary issue: it gives really smooth gameplay to those without the latest AMD or Nvidia video cards. Even one of my ancient laptop's 1050 Ti was reborn with it. And another cool feature: now I can watch any video content in most players, interpolated to my monitor's refresh rate, without needing to use RIFE AI, Nvidia super resolution, and other highly demanding stuff. So it's system-wide. 9/10
Yeah, I was impressed. I was expecting a completely garbled mess, but it's actually decent.
Lossless scaling for frame generation in online movies is the best.
Never tried it but I've heard many people say that now. Will be sure to give it a shot 👍
Needing a top-tier card in order to properly use FG defeats the purpose of FG.
People needing it the most are the ones not benefiting from it.
Lol.
You don't need a top-tier card, nor was this used "properly". I merely wanted to see what the absolute highest framerate was I could push.
I don't think the purpose of FG is reaching "playable" framerates on crappy cards; I think it's to be able to reach high fps on high-Hz monitors with good cards. That's why Nvidia said you need 60 fps minimum to run FG.
Both upscaling and frame generation are meant to take something good to something better. If your base frame rate (or base resolution) is bad, the output will be bad.
I think it's pretty good. Maybe not perfect right now, but it always improves. It's amazing that we have so many tools to experiment with. You should try using lossless scaling with FSR 3.1 frame generation too, just out of curiosity.
When it first came out it was fun to test with, but pretty bad. Now though it's actually pretty good.
I did try before recording this video but LSFG kept breaking. I know it's possible so will try to get it sorted 👍
@@Mostly_Positive_Reviews You should definitely use lower resolution scale values for 4K and 1440p. What I recommend to start with at 4K is 50 resolution scale. You will be shocked by how much your base fps increases with almost no loss in image quality (res scale only affects optical flow resolution, not in-game resolution). For 1440p you can use something like 60-65. When you try higher frame gen x values, you need to increase max frame latency too or it won't go higher (target an fps, then if it cannot reach it even with GPU load not maxed, increase the max frame latency value in LS).
@@Mostly_Positive_Reviews Wait for the 5000 series, use 4x MFG, then use 20x Lossless Scaling at 16K scale with 2 fps, boom, 160 fps! lol
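(A small illustration of the point above that the resolution scale only shrinks the optical-flow input, not the game's render resolution; the 50%-at-4K and 60-65-at-1440p numbers are the commenter's suggestions, not official defaults:)

```python
# Effective optical-flow input resolution for a given LS resolution scale.
def flow_resolution(width: int, height: int, scale_pct: int) -> tuple[int, int]:
    return (width * scale_pct // 100, height * scale_pct // 100)

print(flow_resolution(3840, 2160, 50))  # (1920, 1080) at the suggested 4K setting
print(flow_resolution(2560, 1440, 65))  # (1664, 936) at the 1440p suggestion
```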
DLSS 4 multi frame generation is clearly a software lock by Nvidia and could probably run on older cards. Lossless Scaling is proof of that.
Not saying that DLSS4 MFG can't run on older GPUs, but Lossless Scaling works in a very different manner to FSR and DLSS Frame Gen. It works in a different part of the pipeline and it doesn't have access to motion vectors, so while both generate or interpolate frames, it is done completely differently.
@@Mostly_Positive_Reviews Tomorrow some modder will enable that for the 4000 series cards, mark my words. Hell, maybe for the whole lineup of 2000, 3000, and 4000.
The only thing Blackwell purportedly has is flip-metering hardware to improve frame pacing. They probably could release AI-based FG to all cards, albeit with potential problems in frame pacing.
You should try it with dual GPUs: less input lag, plus the fact that you don't lose base fps.
You can even upscale; imagine using DLSS and FSR from Lossless Scaling at the same time.
I tried it with the iGPU and it was worse, but installed another 1660 Super to test with. Haven't gotten around to playing with it yet, but I heard it works well.
You can get another GPU (could be some bargain-bin special) to render the interpolated frames. It can also work on laptop iGPUs (Ryzen 7000+ desktop iGPUs are a little weak); works really well.
Interesting! Going to try use the 7800X3D iGPU in conjunction with it and see what happens. I also have a GTX 1660 Super that I can throw in the mix.
I tried using my iGPU before, but it was incapable of achieving a 240 FPS output (with a 240 Hz refresh rate). Also, interestingly enough, using my iGPU impacted my main GPU usage more than simply using my main GPU for Lossless Scaling. It was an interesting experiment, but did not work well. This was done with a Ryzen 9800X3D (with iGPU enabled) and an RTX 4090.
@@KillFrenzy96 Yep, the AM5 iGPUs (Ryzen 7000 series and above, including the 9000 series) are very, very weak. That's why I specified a laptop iGPU or some cheap dGPU. You can get away with using an APU though if you have one of those.
Great job man!
I really had fun putting this one together, glad you liked it!
@@Mostly_Positive_Reviews it’s great.
Hey bro, quick tips: down in the section in Lossless Scaling where it talks about sync and max latency, set max latency to 2 or higher in order to create more fps than your monitor. Also, I highly recommend going down to your GPU section and setting 'Preferred GPU' to your most powerful GPU. NEVER choose Auto; Auto only uses your integrated graphics.
Thanks! I have my iGPU disabled in BIOS so that's not an issue. Will test the other suggestions and play around a bit, thanks!
The 4090 is 100% faster than the 5090, how is that even possible? 😮
🤣 Made me laugh when the car hit you
Yeah, tbh it does look better now compared to when I tried it, & for ppl who want to use it, I think it's a good alternative tbh
Btw, that DLSS FG option in Ratchet happened to me in Spider-Man Remastered recently. I think you have to enable it before you load up the game in the launcher; it's buggy in two Sony ports
Yeah, happens in Spider-Man and Ratchet. Works fine in Ghost of Tsushima, very weird. I noticed you can have FSR FG enabled at start, and it allows you to change it, but only once, and then you can't anymore.
Yeah, I love Cyberpunk, but it's annoying you have to keep restarting for FG
However, it's still one of the first games I'll be testing when I get a new GPU 🤣
Yeah, the FG option got botched when they added FSR FG for some reason. No idea how or why 🤷♂️
Watching this with LSFG x8. Works so well with gameplay videos.
Always lower the Resolution Scale to 80-70ish at 1440p in LS.
I have a 1080p monitor; what values should I set the resolution scale to? Thanks, I'm writing from Italy 🇮🇹
@74ciuix Depending on the performance you gain, you want to set it at 100%, 90%, or 75% at 1080p resolution.
Somehow I've missed the existence of this software
Nvm, apparently I added it to my Steam wishlist on 28th June 2024
Hahaha. I didn't use it as you should at all in this video, I just wanted to see how many frames I could get. Best is to set a framerate cap at half of your monitor's refresh rate when using X2, and a third if using X3. Those two modes work very well.
So if you have a 120Hz monitor and are using X2, cap the in-game fps to 60. It will then go to 120 fps in X2 mode, etc.
I found that with a 1080 Ti, the decrease in base framerate was not worth the increase in fps when using Lossless Scaling 😢
Ah okay, makes sense. I am actually busy setting up a GTX 1660 Super system to test with, so this is good info.
@@Mostly_Positive_Reviews The one game I found that it worked really well in was Dark Souls 1 Remastered. Because the game is perfectly locked at 60, it feels so smooth to add Lossless Scaling to it. Maybe that engine just works really well with the tool, not sure.
@saiyaman9 Is there not a mod for that to disable the fps lock?
I made this video only for fun, and only after the fact thought "Maybe I shouldve tried locking the base framerate to 60 fps". I'm going to do a follow-up video where I test it a bit more seriously, and Elden Ring will be one of the games for sure!
Do you have an iGPU with your CPU? Then you can set up the iGPU for LSFG and the 1080 Ti to render only the game.
Works fantastically on video when using VLC player and YouTube. Movies look fantastic at 4x frame gen. 😊
I saw a few people mention that but I havent tried it yet. Definitely will!
Put another 2080 ti in there and let it do the scaling and let 4090 do the rendering. Show the real FUCKING power of the NEOSLI!
NEOSLI 🤣
I will definitely be testing something like this when I get new GPUs hopefully soon!
@@Mostly_Positive_Reviews I got a 4090 with 2080 ti myself but I can't put it in because the card is too fat. It has to go to 4th slot but 4th slot has a few wires on it which makes it impossible to slot in a fat gpu like 2080 ti. Not as fat as a 4090 but still. Had I known of this I'd have bought the thinnest 4080 ti super pro max or sth instead, this one doesn't come thin. And the best argument for buying a 5090 is the fact that it's thin now, so that it's easier to slot in another one.
You can lock the framerate to get better frametimes and eliminate stutter.
Yeah, going to do that. Just wanted to see how far I can push it for the fun of it. Had no idea people would actually be interested in this video, so will do a proper follow-up.
I love lossless scaling.
It's come a long way and is very good where it is now.
@Mostly_Positive_Reviews it has and I pray the developer keeps it going. I can't wait to try it on a 5K2K ultrawide monitor.
Can't wait to see 4090 with LS beat the 5090 with 4x FG
It’s not really possible though,the 4090 is 24gb while the 5090 is 32gb,even if everything else is the same it won’t be able to beat due to the difference in memory storage,not to mention power ratings and also the faster gddr7 memory.
Do you not think the 5090 will have LS as well?
@@Decadent_Descent Obviously it will have LS but I want to see the older gen with LS compete with the new gen with MFG.
Do you have RivaTuner locking the framerate to 60fps? Or maybe it's an in-game cap... Great video. I have Lossless Scaling; the motion clarity is something else. Ideally you want to lock the framerate to an even fraction of your monitor's max refresh rate, i.e. if you have a 180Hz monitor, that would be 45 (1/4 of 180Hz), 60 (1/3 of 180Hz), 90 (1/2 of 180Hz), and so on and so forth.
I checked everywhere for a cap and couldn't find one. The telltale sign is that I can hit 500 fps in Ratchet and Clank without Lossless, yet with it on it locks to 60 for some reason 🤷♂️
Didn't think this silly video would get the traction it did, so now planning a proper video where I'll be locking the fps, testing more games, etc.
Appreciate the kind words 🙏
@@Mostly_Positive_Reviews Did you have any luck changing the capture mode in Lossless Scaling?
@@alpermertsk1688 I tried that too and unfortunately it didn't work :( A friend made a few suggestions that I am going to be trying. Will do an updated video.
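(Following up on the "even fraction of the refresh rate" rule a few comments up, a tiny sketch that enumerates the caps it implies; the 180Hz example is the one from that comment:)

```python
# Base-framerate caps that divide evenly into the monitor's refresh
# rate, so each real frame persists for a whole number of refreshes.
def base_caps(refresh_hz: int, max_divisor: int = 4) -> list[float]:
    return [refresh_hz / n for n in range(2, max_divisor + 1)]

print(base_caps(180))  # [90.0, 60.0, 45.0] -> the 1/2, 1/3, 1/4 examples
```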
Can you test it in a game that doesn't have frame gen techs?
The RTX 5090 has become redundant for gamers. Gamers don't need an RTX 4090 anymore; an RTX 3080 will get the same job done.
May I ask what some of your favorite games are? Mine is The Witcher 3 😊
Sure. The Witcher 3 is a really awesome game, definitely up there for me too. Favourite game for me is Diablo 2, followed by Cyberpunk 2077, and then the first FEAR.
Why was the v-sync not stable? Maybe try setting it in RivaTuner to get a firmer base rate.
Follow-up video is on its way 👍
The base framerate MUST be above 70 fps to look acceptably OK and not have too many motion artifacts, because otherwise the difference between neighboring frames is too large and the in-between frame is more difficult to estimate.
If more than 1 frame is generated, changes in movement become unnatural.
Without the useless 3x and 4x frame generation, the RTX 5000 lineup is nothing but an RTX 4000 refresh with a bit faster memory.
You all who review Lossless Scaling didn't read the instruction manual.
YOU HAVE TO LOCK YOUR DAMN FRAME RATE AND CAP IT SO YOU GET CONSTANT FRAMES WITH LESS INPUT LAG
Holy moly, what's wrong with people these days. Been using this software for 2 years and it's a godsend.
Wow, so angry over a video where I wanted to see what the highest framerate is I could get. It's not that serious 🤦♂️
@@Mostly_Positive_Reviews It's not personal, but seriously, every single review of Lossless Scaling does the same wrong thing and then gives a bad evaluation of the software. I have nothing to do with the software, heck, I didn't even buy it. But it still gets so much smack in reviews from people too lazy to do a proper evaluation.
I've been very positive in this video, even said I have no idea how the developer got it to this point where it's this good.
@@Mostly_Positive_Reviews Then please do everyone a favor and cap the frame rate, then do an input lag test. The only proper review I saw was by Fabio from Ancient Gamer. I have a 3090 and I'm using LSFG in every single game to get 144 FPS @ 4K. Do a proper evaluation video about the software, then make other videos (or in the same one) to goof around with settings. The software is not made to get 600 FPS out of nowhere. You need at least a steady 60 FPS AFTER you activate it. Use RivaTuner for capping the FPS. When I'm able to achieve 74 FPS, that's where I cap it, and when below that, doing 3x, I go with 60 FPS. Then LSFG will automatically cap at 144 because I have G-sync activated in Lossless Scaling. And remember, you can always lower some settings to achieve a stable, high base framerate. Ngreedia's 5000 series is almost here. We need to raise the correct attention to this software so people can make our GPUs last longer against these greedy corporates.
Did you just delete my last reply? Lol. That says it all
Can you do some 4k reso as well?
I think the g-sync support turned on in Lossless Scaling is causing issues with frame gen; try turning it off next time.
What issue did you observe with the g-sync option on? It works great in my case, but I may have missed something ;)
An honest question: What is the point of "lossless scaling"? Especially with a 4090. It just adds a lot of latency to an already very playable fps.
Because not everyone has a 4090. It’s a 5 dollar program that’s constantly improving and can be used on any graphics card.
I was just fooling around with the 4090, but to answer your question, some games don't support frame gen natively, and you can then use this app. It's also useful for people who don't have a GPU that supports frame generation.
I have a 4090 and I tested some games like Doom 2016 with it. I could run the game at 120fps native, and using Lossless Scaling 3.0 frame gen doesn't seem as smooth as native. Maybe my settings are off, but it isn't quite there for me.
I have a 3060 Ti at the moment, a decently capable GPU, but one that struggles with some modern games. It’s great for games where I hit about 70fps or so, and just want a bit more visual smoothness. In those games, capping my FPS to 55, then using 3X LSFG, nets me 165fps in perceived performance, which, to my eyes, looks better and feels better than the native 70-80fps from before. It works on some games better than others, for example God of War and Ready or Not work wonderfully, but Cyberpunk doesn’t work as well in my experience, plus I already manage about 90-100fps most of the time, which does end up feeling and looking better to me than a generated 165fps personally. It’s also wonderful for games with fps caps, such as emulated games, or one of my favorite use cases, Elden Ring. Again, it’s certainly not perfect, but the sheer flexibility of being able to use it on every game I own, as well as any video I choose, makes it one of my favorite bits of software.
4090 = 9090 got it. Thanks Nvidia!
🤣
I can't seem to get this program running well without getting microstutters every 1-2 seconds. Which honestly is much more jarring at high fps than just having lower fps lol.
Hmm, any specific game? Maybe it's a game issue and I can test it on my side as well.
What are your specs?
@@Mostly_Positive_Reviews I've noticed it in Ghost of Tsushima, Cyberpunk, and Monster Hunter: World mostly. I do think it gets better when I cap my framerate to 60; it performs much better, but I still have microstutters for whatever reason. I currently have an RTX 3080 Ti, i7-12700K, and 32GB of RAM (can't remember the speed, I think 3200). Maybe it's the processor, but it's hard to say.
60 FPS cap was probably due to capture card?
Yeah, indeed. I didn't realize it would still cap to 60 fps even though v-sync was disabled, but the dev left a comment and said it does indeed cap at your monitor's refresh rate.
The amount of artifacting is insane even at x2; it's not as good as either FSR FG or DLSS FG.
Let's face the fact that... frame generation is nothing more than a glitchy crutch, a make-believe substitute for the real deal of real frames at high frame rates.
I've tried this and it does increase fps, but the fps feels kind of the same. I think it just "DUPLICATES" frames, not interpolating them like Nvidia does.
You are doing something very wrong. The difference is quite noticeable even in 60/120 cases.
Thanks for the video! Would you be able to do a video like this for Helldivers 2?
I honestly didn't think people would be interested in this video. It has seen some traction, so I will do a more serious follow-up.
I've been using LSFG in Helldivers 2 for a while and it works really well, always going 60 -> 120 fps.
I want to see LSFG x4 with a high base framerate (120 fps for example)
Video coming early next week 👍
$2000 vs $5
Which is even scarier as it will probably retail for around $2500, and here it would be about $3000 😔
@Mostly_Positive_Reviews 💀
Good video!
Thank you big guy, appreciate it 💪
@@Mostly_Positive_Reviews❤️
I wonder about the result if you use the main GPU for the game and the integrated GPU for generating frames.
Someone suggested it earlier too so I will definitely give it a shot 👍
YEEESSSS THIS IS THE WAY!
🤣🤣🤣
Can you use a double-GPU setup and try to render x20? Use something like a 4070 as the main GPU and set Lossless's preferred GPU to the 2nd GPU, an RTX 4090.
Or try x20 on an older game like Half-Life or Counter-Strike: Source on the 4090.
Can’t seem to get my lossless scaling working. I’ve been testing it but for some reason the frames just don’t feel smooth for me.
So if I understand, I can use this software with my RTX 3080? I didn't understand all of the video.
We need a game that supports frame generation, is that right?
You can use it with almost any GPU in any game as long as the game can be run in Windowed, or Borderless mode. So will work on your RTX 3080.
You can even use it on videos, emulators, streams, etc
@ Thx !
In Fortnite Creative, my 4060 laptop gets around 210 fps at native 1080p. When I turn the resolution to 720p windowed, upscale to 1080p, and use 7X frame generation, I get anywhere from 780-830 fps. Latency isn't an issue for me. 4060 mobile laptop = delayed 9800X3D 6090 PC
Hahahaha! That's pretty awesome!
Ayy
You didn't test or demonstrate a single scenario where this could actually be useful. But you aren't alone; all the videos I saw about LSFG were poorly put together, and nobody tested any of the graphically intensive games where latency doesn't matter much. E.g. Baldur's Gate 3 with "Ultra" settings on a low-end GPU would be a realistic use case where somebody without the best hardware could actually benefit from using LSFG.
It wasn't the point of this video... I made it very clear this video was me playing around to see what the highest framerate is I could get...
What if you don't use the capture card and just record with a camera? The 60fps lock might be an issue with the capture.
Apparently it does lock to your monitor's refresh rate for the base framerate, even if v-sync is disabled. So if you have a 144hz monitor it will use max 144 fps as a base if you can maintain it. Just tested it on my 165hz panel by disconnecting my second monitor and it caps the base framerate to 165, and then does frame interpolation from there.
@Mostly_Positive_Reviews It does not lock for me unless I set it to v-sync in the app; it just doubles whatever the current fps is.
I also don't have g-sync and my monitor is 144Hz, so if I play at 48fps x3 it goes to 144, but if I keep x3 and have a 60fps base it goes to 180.
I tried it yesterday. My MSI Afterburner shows the scaled FPS, but the 1% lows and input lag were horrible. Without LSFG, ~100FPS runs much smoother compared to 240fps with LSFG.
Fk, even using Rift Apart, the best fkin game to test this on.
Disabling v-sync (set to off in the control panel) for Lossless removed the limits for me on a 60Hz TV.
V-sync is disabled in NVCP, no other FPS cap is set anywhere but unfortunately it's still locked to 60 fps. Trying a few things now to get it sorted.
Why do you combine V-Sync with G-Sync? 🤔
V-sync to prevent the framerate from exceeding the monitor's refresh rate, and g-sync to sync the refresh rate to the framerate when below the monitor's refresh rate.
G-Sync doesn't cap the framerate, contrary to popular belief, so going over the monitor's refresh rate causes screen tearing.
@@Mostly_Positive_Reviews Do you enable V-Sync in the in-game menus or in the Nvidia Control Panel driver settings?
Nvidia control panel. In-game v-sync becomes unavailable when using DLSS Frame Generation.
@ How do you cap the framerate? You said you don't let it go over 158 fps.
On a 165Hz monitor, using V-Sync + Reflex will cap it to 158 fps. In games that don't support Reflex I use RTSS to cap at 158. That way the input latency is lower than when hitting the V-Sync limit.
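For anyone wanting to estimate that cap on a different display, here's a minimal sketch. The refresh − refresh²/3600 formula is a community-observed approximation of where Reflex + V-Sync tends to land, not documented Nvidia behavior, so treat it as an assumption:

def reflex_style_cap(refresh_hz):
    # Community-observed approximation of the Reflex/V-Sync auto cap;
    # not Nvidia-documented, just a rule of thumb.
    return round(refresh_hz - refresh_hz * refresh_hz / 3600)

print(reflex_style_cap(165))  # 157 -- close to the 158 fps cap mentioned above
print(reflex_style_cap(240))  # 224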
Does Lossless Scaling only work with games from Steam, or is it usable with any game?
You can use it in any game 👍
@@Mostly_Positive_Reviews thx.....
man it can even be used for watching movies at 60fps
just do 3x framegen and it will turn 24fps to 72fps
4090 = 7080 Super performance!!!
Nvidia doesnt want you to know this ONE trick!
Wow, the 4090 is now as powerful as the RTX 8090 Ti Super Max Pro Titan+ will be 🎉
2x and 3x are the best multipliers for fps and gaming experience. Try them next time if you can, please.
I uploaded a video about 2X and 3X about 2 hours ago 👍
Surely there is an input latency hit, right? Meaning this isn't viable for competitive FPS games, for example.
Absolutely not designed for competitive gaming at all. Even at 2X, it will add a noticeable amount of latency to your mouse movements. But for single-player games with a controller, the latency is very forgiving, even though you can still feel it at first. There are a few things you can do to help the latency, or at least make it more stable so you get used to it. The best thing to do is run LS and see what it drops your base FPS down to, e.g. if you are getting 60 FPS before LS, it will drop to approximately 47 to 53 FPS (from the testing I have done with 2X and 3X) when you enable LS. You then lock the FPS (I use RivaTuner) to just below that, e.g. lock to 45 FPS. This significantly helps keep your latency stable, and after a few minutes of playing you may not even notice the lag. I actually find 3X to be more stable than 2X, but it does depend on the game.
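A minimal sketch of that rule of thumb. The 0.95 margin is my own illustrative choice, and the numbers come from the 60 → ~47-53 → 45 example above:

def suggested_fps_cap(base_fps_with_ls, margin=0.95):
    # Lock slightly below the base framerate LS actually achieves, so the
    # interpolation always has a steady stream of real frames to work with.
    return int(base_fps_with_ls * margin)

print(suggested_fps_cap(47))  # 44 -- close to the 45 fps lock suggested above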
Latency hit isn't that bad. As long as your input framerate is locked to 60 or above, the latency hit is only about one frame time. I wouldn't use it in competitive shooters, but only because it's not adding any new data, only interpolating between existing frames. Half the point of a high framerate in competitive games is seeing things sooner, and that's not a thing with this. Latency wouldn't be the reason I avoid it in something like CS2. But it should be fine in something like Cyberpunk and more than fine in something like R&C.
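To put a number on "one frame time", a quick back-of-the-envelope calculation (the framerates are just examples):

def frame_time_ms(base_fps):
    # One frame at the base framerate, in milliseconds.
    return 1000.0 / base_fps

print(frame_time_ms(60))   # ~16.7 ms of added latency at a 60 fps base
print(frame_time_ms(120))  # ~8.3 ms at a 120 fps base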
Yeah, indeed. I do talk about the input latency. I can't measure it properly, but it is noticeable. The 2X and 3X modes are actually quite okay in single-player games, especially if you use a controller. I wouldn't recommend any frame gen for competitive shooters.
So if I use Lossless scaling 20x on my 3060, can I get better than 5090 performance?
According to Nvidia's marketing of DLSS FG X4, yes 🤣
@@Mostly_Positive_Reviews It's probably better than a 6090 and 7090 combined 😊
What about DLDSR? Can you use it at the same time?
Indeed.
🔥🔥
Fun stuff! Have you tried doing DLSS FG + LSFG x2? Any better than LSFG x4 alone?
I did and it broke 🤣 It was supposed to be part of the video but when I tested it before recording it would completely freak out.
@Mostly_Positive_Reviews Haha, oh well, thanks for trying!
I've seen videos where it "works", but for me it would just present a black screen, and a few seconds later LSFG would disable itself. Maybe it just doesn't work in the games I tested.
@@Mostly_Positive_Reviews Lossless did that to me when desktop recording specifically was ticked in the NVShare settings. Turn that off and Lossless worked again. If you use that feature, I mean...
@@SKHYJINX Going to give it a shot, thanks!
Imagine something like Lossless Scaling on consoles; maybe we could finally drop PC and start playing on console.
600fps on a 60hz monitor huh? 😂
I kid, this software is great in the right application, but it always makes me laugh to see how top-of-the-line cards with maxed scaling fare once the input lag and inconsistent 50+ ms frame times kick in. I'll deal with it to get my Legion Go to its unnecessarily high 144Hz refresh rate, but a 4090? Just play native and enjoy the game man. 😂
(not you, OP. This content is appreciated)
Unfortunately my capture card is only 60hz :( But looking at getting a 120hz one soon!
I agree with you though. Obviously what I showed is not meant for real-world use, but I'm glad you understood that, as many didn't.
The Legion Go is actually a good example as the screen is high refresh rate, but also small enough to make noticing artifacts harder. The app has its uses for sure!
Are you using a capture card? Maybe that's the reason why Lossless' counter thinks you're using a 60Hz display.
Yeah, it's a 60hz capture card, but even on my 165hz monitor it still caps the base to 60 fps. Will play around to figure it out.
@@Mostly_Positive_Reviews I have multiple monitors; before I use Lossless, I turn off screen cloning first. Hope this helps too.
Wait, you have a 4090 with no HDR monitor, like an OLED?
I do have a 165hz HDR monitor. The videos are recorded using a 4K60 capture card connected to a 4K60 monitor, but I game on my 165hz panel.
@Mostly_Positive_Reviews oh okay, I was about to recommend one cause I feel like you were wasting that 4090 🤣
I'm just mad at how bad your driving was in Cyberpunk lol
It makes me mad too when I see how well other people drive in this game!
Despite the fact that the Nvidia x4 frame gen solution will no doubt be better than Lossless Scaling in terms of image quality and latency, it's not going to be anything like £2000 better. Nvidia are basing the bulk of their pitch for the 5090 on MFG but when there's another app that does the same for almost nothing it's a pretty insane situation right now.
And that's $2000 MSRP. Seen a few listings at $3000, and will probably be $3200 here for an entry-level Zotac 5090, with the Strix and Aorus cards going for $3500+ here.
@Mostly_Positive_Reviews I would be interested to see a comparison of 4090 using its 2x FG with Lossless Scaling FG 2x layered on top Vs 5090 with 4x FG.
600 fps hahaha, an app that should cost 4,000 billion dollars
Considering you have 'THE fastest GPU in the world' and I have an RTX 4070, my Cyberpunk looks way better than yours lol
Yes, but mine is running at 600 FPS! Bigger number better, obviously!
Hahaha. Reverse uno card
If I cant have DLSS MFG 4X on my 4090 I'll just get it elsewhere!
U from SA?
I am indeed!
@@Mostly_Positive_Reviews Ha, SA accents are always so distinct.
It's so funny how people can always immediately tell 🤣
So this is how you cheat MFG 4X on "unsupported" GPUs.
lol, and it turns out it's locked to 50
Where is it locked to 50? Base framerate is locked to your monitor's refresh rate, and in this case it's a 60hz capture card. No idea where you get 50 from.
What the heck is Lossless?
Look for Lossless Scaling on Steam. That's the app I'm using here for frame generation.
Why would someone with a 4090 need Lossless Scaling? This sickness isn't going anywhere good.
I guess you missed the part where I said I'm just doing this for some stupid fun. Three times I said that...
@Mostly_Positive_Reviews I understand, and it's OK to experiment; I'm saying it in general. The need is for x10 to work well on low-end cards, not a 4090. Until then it's not a success, just marketing and profit. I tested it on a 4070 as well.
Why are you not showing a proper latency overview? It doesn't matter if you have 9 million FPS if the game doesn't feel right.
I did explain why in the video.
Why does anyone even need 500 fps?
It feels smooth in hentai. Think about it: at 24 fps the animation is all over the place, with one position in one frame, another in the next, and nothing in between. So here comes LS, multiplies it by 20, and you get 480 frames per second. Nice, right?
Damn that frametime looks ew
I think there are ways to improve that, so I will be playing around more. Didn't think people would be interested in a video like this, so will do a more serious one as well.
Just give me real frames, not fake ones.
That'll be 6 grand and 3 kidneys.
Every test using fake frames should have latency numbers or the test is pointless...
None of the usual apps that show latency work when using Lossless Scaling. I did show this and speak about it in the video...
lol
Just a bit of silly fun.
In high-end games (say maxed out, you hit 70 fps), I would only use any frame gen to hit 120 at the most, no matter the monitor.
AI is the way forward, since native is too expensive and too power-demanding, needing a cooler twice the size of what GPUs have now, but we are not there yet. There must also be hardware changes to go hand in hand with software, rather than big pushes on new cards that rely on software more than anything else (looking at you, 5070/5080).
A worry would be fast online multiplayer games: all these extra AI frames will lead to "I shot him, no way did I miss" / "No, you shot the extra frames".
We have been there for years: watercooling.
But input lag is way too bad.
On 10x and 20x, yes. But from 2x up to 4x, with Reflex enabled and a framerate cap just below your monitor's refresh rate, it's much better. Some people will notice the additional input lag and hate it, others not so much.
@Mostly_Positive_Reviews For me on a 120Hz display, when I cap to 60 fps and use 2x it's okay, but a 30 fps cap with 4x is not good.
Yeah, 30 fps base framerate is going to feel bad. 60 fps minimum is what I'd recommend, with any frame gen tech, not just Lossless.
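As a rough rule of thumb from the advice above (the 60 fps floor is this thread's recommendation; the helper function is just illustrative):

def max_multiplier(display_hz, min_base_fps=60):
    # Highest whole multiplier that keeps the base framerate at or above
    # the minimum without generating far past the display's refresh rate.
    return max(1, display_hz // min_base_fps)

for hz in (120, 144, 240):
    print(f"{hz}Hz -> 60 fps base, x{max_multiplier(hz)}")  # 120Hz: x2, 144Hz: x2, 240Hz: x4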