input lag: 7 business days
Wise
@@Diegonando64 true
input lag will not be felt if the native FPS is above 30fps
but if it is below 30fps,
you will feel input lag,
especially if it is below 20fps 💩
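The rule of thumb above can be sanity-checked with arithmetic: interpolation has to hold at least one real frame before it can blend toward the next one, so the floor on added delay is one base frame time. A rough sketch of that frame-hold term only (real LSFG latency also includes capture and presentation overhead, so treat these as lower bounds):

```python
def added_latency_ms(base_fps: float) -> float:
    """Minimum extra delay from holding one real frame for interpolation."""
    return 1000.0 / base_fps

for fps in (60, 30, 20, 10):
    print(f"{fps:>3} base fps -> at least {added_latency_ms(fps):.1f} ms added")
```

At 30 base fps the floor is ~33 ms, which most people can tolerate; at 10 fps it is 100 ms before any other overhead, which matches the "below 20 fps hurts" experience.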
@@penonton4260 is that based on your experience?
@@johnpatrickteran2825 I'm sure he's right about it, because I use it daily and can confirm the input latency only gets awful once you drop under 30 fps
See? 4090 performance in a B580!
- Intel CEO
What the dog doing
@@Spacercodefinna eat you out lil bruh
😏😏😏😏😏😏@@solarray8484
@@solarray8484 blud calm down
lol
Future gaming:
95% GPU - upscaling
5% GPU - the actual game
And then we will play fully AI generated games
@kolyacpp NPU will handle upscaling in the future. Like it does right now with PCs that have an NPU.
Finally, I can experience what it's like to play videogames from a tub of vaseline.
Actually quite nice. That means we no longer need purple pills to feel nauseous while gaming. Just slap on LSFG at 6x and we're good. For some reason everything becomes slow motion at 20x. That's some time hack we can use. 🤣🥴
Woman ☕
@@BxbdNdn-c6y WTF with this guy? Who hurt him? 🤔
@@Jakiyyyyy autism
Because the base fps is too low
I think for it to be playable you need to get at least 15-20 fps
😂@@brainlessbiscuit161
AI generated videos be like:
4:42 wtf this banner😂
For Lossless Scaling to work properly, the video card cannot be at 100% usage; it needs a decent base FPS and some headroom to work correctly. Lossless Scaling uses about 10% of the video card to generate the new frames.
I think LSFG uses more than 10% GPU, especially on mid/low tier GPUs and especially when the generation multiplier is beyond 4x
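A rough illustration of why that headroom matters: if the GPU is already pinned at 100% and the frame-gen pass takes some fraction of it, the real framerate drops by roughly that fraction. A back-of-envelope sketch (the 10% figure is the claim from the comment above, not a measured value):

```python
def base_fps_after_lsfg(base_fps: float, overhead_frac: float = 0.10) -> float:
    """Approximate real framerate left once frame generation takes a slice
    of an already-saturated GPU (linear scaling assumption)."""
    return base_fps * (1.0 - overhead_frac)

print(base_fps_after_lsfg(60))        # roughly 54 real fps at 10% overhead
print(base_fps_after_lsfg(60, 0.25))  # roughly 45 at a heavier 25% overhead
```

If the GPU has spare capacity instead, the overhead comes out of the idle headroom and the base framerate barely moves, which is exactly the advice in the parent comment.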
@@zznabil8109 If the video card is maxed out, use Resolution Scale; same idea as FSR, set it to 75%. You will not see a difference in image quality, and it will leave the card with some headroom.
@@vinidsr Well, in Lossless Scaling there is an option for preferred GPU. Never leave it set to Auto, it'll just use your integrated graphics instead of your dedicated card. Set it to the most powerful GPU in your computer.
Still, 2x-3x gives the best result. But capping the FPS is the key. For example, if you get 40 fps max, you should cap it at 30 fps; for 4x you should cap at 25 fps. They probably added these 10x-20x modes to mock scummy NVIDIA.
I guess it's useful for turn based games.
Sometime ago it seems we forgot about why we search for more FPS.
What we want is lower input ms. FPS is just something easy to understand in a graph.
If you multiply your 5 FPS by 100, you're still only sending and receiving 5 inputs per second to the program. That does not change. It's actually worse, since you're reacting to information that is fake.
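The point about inputs can be put in numbers: interpolation multiplies the frames shown, but the game still samples input only on real frames. A tiny illustrative sketch (the function name is mine, not from any LSFG API):

```python
def interpolation_effect(base_fps: int, multiplier: int) -> tuple[int, int]:
    """Returns (frames displayed per second, inputs processed per second).
    Generated frames raise the first number; the second stays at base fps."""
    return base_fps * multiplier, base_fps

shown, sampled = interpolation_effect(5, 100)
print(f"{shown} fps on screen, but only {sampled} input updates per second")
```

So 5 fps times 100 looks like 500 fps on a counter while the game still reads your mouse 5 times a second, which is why the motion feels smooth but the controls do not.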
But, the problem is Steering Sensitivity no big resolution screen.
The fact that it isn't a complete garbled mess is impressive. I can see 1 fake frame to 1 real frame being pretty useful as it hides most errors.
It is completely garbled, it's like I'm looking at a scene with Vaseline smeared over my glasses
You should use resolution scale at lower values to gain FPS; especially at 1440p and 4K the FPS gain is very high. I use 70 res scale on my 1080p, and I recommend 60 for 1440p and 50-45 for 4K. It only changes the optical flow resolution, not the in-game resolution. I recommend testing with res scale, you won't even notice the difference.
That is so weird! I played games before we had 3D, and I remember we needed 20-25 fps in 3D to make a game properly playable.
Now I see games played with just 5 real fps... that proves the generated frames matter for playability!
Yes, at factor 10-20 there are lots of artefacts. But 2-4x really seems to make sense.
DLSS is far better, especially as it has motion vectors and uses dedicated cores for that workload anyway.
Reflex 2.0 would make it actually playable (frame warp)
Its smooth but kinda trippy
Its so useful for games, I can finally play at 144fps any game with DLSS + Raytracing + Lossless (X3 or X4 is so nice...)
do dlss and lossles scaling work in a couple?
@@НазарійГривнак DLSS FG probably wouldn't work, but DLSS upscaling works
my guy needs x20 desktop space
I'm not sure if anyone else can see it, but whenever they move you see these weird traces, it kind of looks like a dream lol. Intel definitely needs to improve on this.
0:23 wtf clean your desktop
Bro still has BF Hardline 😭
DLSS 4 - more fps
DLSS 5 - play the game by itself
Frame Gen go crazy
Bro 💀
Game Devs in 2077 would just show us still images in teaser and trailer. The gameplay would be 1fps x 60 generated frames.
Wow. This is just magic at this point. How was the input lag?
Crazy
Honestly unnoticeable especially if the app is used properly
@@callmealex1247 And how you do that?
@@callmealex1247 how to use it properly?
The developer recommends at least 60 real fps as a base for a good experience. I tried it on my PC, going from 30fps to 60 since my monitor is only 60Hz, and it was acceptable, but I wouldn't play like that. 7 base fps like in the video must have had terrible input lag lol
The true savior for my GT 710 to play on cyberpunk
🙂in truth even this cant be saviour for your gt710
Looks perfect for chess games as long as you are not playing rapid games.
5:50 sounds like a fast ride, music too:
speed: 1 km per year:
the input lag is horrible compared to any other FG; yes it is very smooth, but unplayable
Generated frames don't affect physics, so having so many generated frames on a low base fps doesn't make sense
It doesn’t affect because LS doesn’t have access to the game’s internal data or mechanics, meaning it operates externally without modifying or interacting with the core game files or systems.
DLSS 4: I nEeD mOrE mOnEY
FSR 4: aI 4 EvErYoNe (every1 with 9070)
LSFG 3: best 6fps experience of your life
Can you test with it with 2x mode(which people might actually wanna use.) We want to know how it performs with b580. It might help with cpu overhead issues of b580.
Hey, great video!
I have a few questions:
Do you maybe have the possibility to add a second B580 to your rig?
Of course, technologies like SLI and Crossfire are no longer a thing, but it seems like you could offload the heavy calculations of the Lossless Scaler to a second GPU. The idea is that the main GPU used for rendering wouldn't be affected by the extra calculations.
However, it doesn’t seem to work with many GPU combinations, possibly because of the PCIe 16x split into 2x 8x, which might cause a bottleneck due to the heavy communication between the GPUs. Or maybe it's because the tested GPUs had different bandwidths and needed to adapt first - I’m not entirely sure. It would be interesting to see a test with similar GPUs to eliminate most of these variables.
By the way, some users have reported significant performance improvements with their mobile GPUs by utilizing the iGPU for the Lossless Scaler.
On the settings side, I’d say it’s crucial to get close to 60 FPS as a baseline (this can be monitored and limited with RivaTuner). From there, you can increase your gen counter up to your screen’s FPS limit for a smoother experience.
The resolution scale (of the gen pictures) can be lowered - it makes the game object's motion a bit blurrier, but is barely noticeable unless you reduce it too much. I’d say scaling it down to 80% works great. I wouldn’t use upscale unless absolutely necessary.
I’m not using this setup primarily for (be able to) gaming. Instead, I use it to cut the energy consumption of my RTX 3080 in half, without any noticeable quality differences. With some MSI Afterburner work as extra, makes it Quite and cool without a performance hit.
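The settings advice above (cap the base framerate with RivaTuner, then raise the gen multiplier up to the screen's refresh) reduces to simple division. A sketch, assuming you set the cap and pick the multiplier in LSFG by hand:

```python
def max_multiplier(base_fps: int, refresh_hz: int, cap: int = 20) -> int:
    """Largest whole frame-gen multiplier whose output still fits the display.
    cap=20 reflects LSFG's current top mode; treat it as an assumption."""
    return max(1, min(cap, refresh_hz // base_fps))

print(max_multiplier(60, 144))  # -> 2 (120 fps output on a 144 Hz screen)
print(max_multiplier(30, 240))  # -> 8
```

Going past this value just generates frames the monitor can never show, so the extra GPU load buys nothing.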
People compare NVIDIA's DLSS FG/MFG with this lol. I've got a 4090, and when I use DLSS FG it looks way better and smoother than Lossless Scaling at 2x; on top of that, with Reflex enabled I get half the latency of Lossless Scaling. This app is just a joke, meant for kids who want to boost their $50 laptop to play Roblox, not for actual gaming on high-end GPUs.
I thought it could be great for increasing fps from base 60 to reach max monitor refresh rate, especially in games without FG
it's like playing drunk
please do i5 10400f + intel arc b580 vs i5 10400f + rx 7600 or xt version
i heard that the intel arc b580 doesn't like older CPUs like 10th/11th gen
GUYS, in preferred gpu option do NOT select Auto, it just uses your weak ahh integrated graphics
i lost it at x20 mode 😭😭it's like swimming through gel
I wonder what Nvidia 8x generation will be like, would be cool if you tested this at 1440p no ray tracing to see how it looks with more real frames to go off of (obviously still terrible)
all of this to make a fried egg
Pulls out a GT 710: 4090 performance, you say?
Hope developers soon find a solution to feed the frame times through so the GPU thinks it's the actual game.
is B570 not out yet?
im using the b580 too, but when using the x2 mode I get half my real framerate boosted back up to my original framerate, meaning if I play a game that already runs at 60 fps, LSFG will turn my framerate into 30 fps and boost it to 60 fps.
i never had this issue when i was using my AMD 5700 XT, do you know what issue i may have faced?
How about latency bro?
I've seen other videos of this and x3 is the most I'd use
x6 could pass but only as a last resort and I'd only consider it if I was playing on a steam deck sized screen
I hope in the future it can use the NPU to generate frames in-game.
Is this card a good pair for i5 10400f?
The best way to get frame gen , open wallet or savings or sell a kidney and buy more powerful hardware which will get you frame gen without input latency most of the time lol
What is the software used to measure lossless scaling frames?
How did you make the stats show the generated frames?
I think the old tv i used had some kind of framegen type of thing. It felt really smooth in games with it but it gave terrible image quality on moving things
Hey, that's maybe a dumb question but how do you have your framegen FPS on your osd ?
For LSFG to show good results, the game needs to run at at least 30 FPS with GPU usage at 70-80%.
Can you pls try it on legendary GT710
I tried it on a GT 730 4GB DDR3. It ran Assassin's Creed Unity with nice frame gen, but after about 10 minutes of playing the game crashes (720p low settings with 3x frame gen), and in some games it even reduces FPS, so I don't recommend using these GPUs for LSFG.
what overlay you have that detects all this?
Ur overworking your 3D resources.. If u have room for another GPU in ur PC...
U can assign the frame gen to the other GPU and make it work kind of like SLI
The x20 mode is just a joke, lol.
I’ve already tested this software in several games, but I’ve never really been convinced.
The x2 mode with Nvidia Reflex enabled is “acceptable,” but beyond that, there are always issues (artifacts, ghosting, etc.)
For me, it’s really a last resort because, even with these fake extra FPS, and despite the latency being fine in x2 mode, you don’t really feel any improvement in fluidity.
The improvement is very noticeable, especially in old games locked at 30 fps; bringing them to 60 fps improves the experience a ton.
4:41
i played fortnite on my RTX 4060 / Ryzen 7 7735HS laptop. Without frame gen I get 180 fps native at 1440p. With 720p upscaled to 1440p and 6x frame gen, I get 560-830 fps in Reload ranked. If anything,
NVIDIA'S LOGIC: RTX 4060 mobile at 75 watts + Lossless Scaling (Snapdragon Game Super Resolution + 6x frame gen) = a delayed 9800X3D + 4090 PC. In videos where I see those specs, they only get around 500-600 fps in Fortnite.
Okay, it's cool that it's a thing, if only it didn't make you want to vomit; even just looking at 6x had me wanting to throw up xD
It's bad when you can sense the latency through a video
exactly what NVIDIA is doing with the 50-series now... low hardware with high AI... 😂
Hardware based frame generation is a superior approach, as it integrates directly with the game engine, unlike software based frame generation, which operates externally without access to game internals.
@@edwardbenchmarks my B580 steel legend is arriving this week!
How is this possible???? But I can't do it
It doesn't look that good, I could see the imperfections; it could look better
i don't understand why my FPS in CS2 drops from 45 to 9 on an i5-13500
Because your refresh rate on the monitor isn’t high enough to support 20x mode most likely
@@TheBottleneckedGamer i use 2x and 3x and my monitor is 100Hz
Please do a GTX 660 with the new Lossless Scaling update
I have a GT 635M.
Can I use Lossless Scaling to get more playable fps, or is it just useless?
Because my base fps in games like Far Cry 5 is 15.
Can you test this app with a similar card like the GT 710?
For this program to show good results, the game needs to run at at least 30 FPS with GPU usage at 60-70%.
No. The minimum spec for Far Cry 5 is five times more powerful than your card.
Honestly just play the game in 480p, or check a dumpster behind a tech store to get a card 10 to 20 times more powerful than yours for free.
Lossless scaling loses all the details
It looks very strange.. I prefer more real fps with low quality.
They need at least a consistent 30 fps and ideally 60 fps to produce stable frames; anything below that will hurt your eyes like hell and those extra fps gained will be absolutely nothing.
That looks unplayable imo
PLZ try it on gtx 750
looks like ai minecraft
It's an evolution but shit at the same time.. it produces the same effect as interpolating a 30 fps video to 60 fps in Adobe Premiere Pro. Evolution, but shit.
Why?
Because we can lol!
FG of lsfg is still crap.
180 ms lag
For $9.99 it is ok)
he downloaded all the games 😮
Ai garbage is what it looks like
yess
and now B580 = 4090
that's what Nvidia said with its latest Fake frame ~
🤣🤣
Most stupid thing I will see today
this card shit
simply trash
Looks terrible
it's very bad compared to other frame gens like FSR and DLSS, i don't see why anyone should use it over them
It's not terrible at 4x mode and only becomes trash with anything beyond that. 2x mode works excellent.
Fsr can only be used in certain games whereas lossless scaling works in every game and even in movies and videos.
@@tharusmc9177 AFMF 2 framegen from amd works on every game too.
There are games that don't support frame gen
@@Nico_0900 afmf supports every game
terrrible
play minecraft and stop doing stupid things
x2 is pretty ok, and that's as far as I would go with frame gen. Also, make sure you can render at least 50ish real frames, and your GPU must not hit 90%+ utilisation.
gl playing with that.... hah, there is no magic to make 15 fps feel like 100...
Incredible
If this program works well enough i might actually be able to run cp at 4k 60fps finally