Don't know how 32" 4K is compared to 27" 1440p but let me tell you. I had 28" 4K then got my nephew 27" 1440p. To be honest if I went from one room to the other I really cannot tell the difference immediately. 1440p is a great option. Even if I lower mine from 4K to 1440p I still don't get as clear picture as his
Hi Ariel, been watching your content since I got into HDR PC gaming. An off-topic question: do you use Limited or Full RGB in the Nvidia control panel, and do you match it on the QD-OLED? (I have the same S90D as you.)
Always on Auto, which is Full automatically because the GPU resolution is set to Full. Never mismatch them. That was an old trick I used to recommend when I didn't know better.
@@plasmatvforgaming9648 I tried Full + Full but the image has this gray haze on it; I tried Limited + Limited and it looks so much better. I'm not quite sure why.
10:20 I have been telling you: since you never experienced the RTX 4090, the RTX 5090 will blow your brains out compared to the RTX 3080. The drawing pipeline will be a lot shorter, everything will be more snappy, not to mention the utterly insane 1.79 TB/s memory speed. Der8auer said it himself in his last video: overclocking the core on the RTX 4090 did not do much for performance, but overclocking the VRAM showed a big performance gain!
@@rufus5208 I assume overclocking the core means the GPU cores; just like CPUs have cores, so do GPUs. Overclocking VRAM? Well, just as you can overclock system RAM, you can overclock a GPU's RAM (called VRAM). Some other things can be tampered with as well, but I would never do that. Too risky for my taste.
@@wikwayer Why wouldn't it be? As far as I understand, what the program is doing is running the previous frame through an algorithm, drawing the real frame, delaying the following frame, and injecting the generated one. To account for the multiple window servers on Linux, supporting Gamescope would be the universal way to support Linux (and the Steam Deck).
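For readers who want the gist of that pipeline, here is a rough Python sketch of the capture → interpolate → pace-and-present loop described above. The helper functions are placeholders, not Lossless Scaling's actual code or API; the point is only to show where the added latency comes from (the real frame is held back so the generated one can be shown first).

```python
import time

# Hypothetical helpers -- placeholders to illustrate the pipeline, not LS's real API.
def capture_frame():      # grab the game's latest presented frame
    ...

def interpolate(frame_a, frame_b):  # synthesize an in-between frame from two real ones
    ...

def present(frame):       # hand a frame to the compositor/display
    ...

def x2_frame_gen_loop(base_fps=60):
    base_frametime = 1.0 / base_fps          # e.g. ~16.7 ms at 60 fps
    prev = capture_frame()
    while True:
        curr = capture_frame()               # newest "real" frame
        mid = interpolate(prev, curr)        # generated from the two real frames
        # The real frame is delayed by roughly half a base frametime so the
        # generated frame can be shown first -- this hold-back is the extra input lag.
        present(mid)
        time.sleep(base_frametime / 2)
        present(curr)
        time.sleep(base_frametime / 2)
        prev = curr
```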
If you want even lower input lag, switch from the DualSense (DS5) controller to a Flydigi Vader 4 Pro. It has up to a 1000Hz polling rate wirelessly and also a much faster polling rate on the sticks. IIRC the DS5 is only 60Hz on PC, and the sticks themselves are polled even slower, so it can add a bit of lag. Plus Hall effect sticks instead of Sony's scam sticks that'll break after 1000 hours of use, microswitches on all of the important inputs, plus rear buttons and 2 extra ones at your right thumb. Super build quality too, and toolless adjustable joystick spring tension.
The problem with DLSS 4 MFG is that it needs a very high-Hz display. For a 144Hz TV, I believe the lowest playable scenario is 48fps x3 to 144Hz. If the base framerate is below 40fps, no FG technology can offer an acceptable result. 240Hz or even 480Hz displays are for competitive gamers, and those games are usually not very demanding, so x3 or x4 FG is useless in that scenario (it also adds input lag, which competitive gamers won't like).
@@plasmatvforgaming9648 It is a bit difficult to set up at first because you can only control ReShade via the config file directly, or by using a controller to navigate and open the ReShade menu. Or if you want, you can just set up the ReShade config and files, then drop them in the Lossless Scaling folder. If you want a tutorial, the Lossless Scaling server has guides and support.
Higher frame rates don't just need graphics power but also CPU power, and that's usually the limiting factor in hitting higher frame rates in some games, especially if the game is badly optimised.
For some reason, on my Samsung QN90D, GameMotion+ only becomes available if I set the TV to 30Hz mode. In any other mode it stays grayed out in the Game Bar, no matter whether VRR or HDR are on or off. :(
What would make this perfect for me would be no artifacts with retro games with scanlines. The app gets confused by scanlines or stair patterns in games. I freaking love this app, though. Getting better every version.
It doesn't work, and the developers are aware of it. The problem is that RTX HDR gets applied to both the game and Lossless Scaling, causing HDR on top of HDR, and that's why it looks messed up.
@ I can look past it, but that is my biggest gripe. You can't use Crosshair X though, because for some reason it registers the frames of the dot and not the game, so it goes haywire on frames.
Love your videos man! Great work. I think there's an easy way for them to make this perfect: they just need to implement depth detection, but this would require the user to point the program to the game's .exe file, and it looks like the dev isn't interested in introducing such a step; they just want the program to be nothing more than an overlay. I respect their choice but it's a missed opportunity IMO. They could easily destroy Nvidia if they wanted, because from my testing I think LSFG is smoother than DLSS 3; the only issues left come from the lack of depth detection.
Those Samsung QD-OLEDs look exactly like my SyncMaster 955DF VGA CRT (also made by Samsung). No normal camera in the world can handle the insane dynamic range and color of these displays. Maybe with a movie camera you could capture their detail.
@@plasmatvforgaming9648 By the way, no amount of BFI will beat a CRT. My 955DF at 1600x1200 at 60Hz, as an example, takes 22 microseconds to draw a line and 8 nanoseconds for a single pixel. I usually use 85Hz at 1600x900, and that gives me 6 nanoseconds. No flicker and perfectly smooth.
Rather than cropping, you might be able to edit the graphics config of certain games and set them to render at a slightly higher resolution, and then change the scaling mode in the driver settings to not do any scaling. This would remove the black bars and just crop anything outside of your native resolution.
Can you make a video of RenoDX HDR where you go through all the (important) settings and what they do? It would be nice to hear what is actually different from the original HDR implementation and the one that you prefer.
@@plasmatvforgaming9648 I tried it and only adjusted the value of peak brightness to 1230 (I believe on G2) and paper white 203. Looks much better! Are there any values you would try and change except these? Have a good week.
Hi brother, nice video! Disable vsync in LS to get way less input lag (switch the sync mode from Default to None). I play Single Player Tarkov on an RTX 3070 with a base frame rate between 40-60 FPS (the game is CPU-bottlenecked with all the bot AI), and with LS x2 mode it's really good, playing with a mouse of course. Having G-Sync/FreeSync helps. Cheers
I have absolutely no problem with generated frames as long as they work well. DLSS 3 performs exceptionally and demonstrates that visual quality is paramount over raw response time. A response time of 16 to 20ms is more than adequate for a responsive experience, especially when you're delivering a higher frame count. I love frame generation: DLSS Frame Generation has proven that range is sufficient for an incredibly smooth experience, and what truly matters is how many frames you have available in a second.
Excellent video. And you touch on why enthusiasts criticize FG (frame generation) so hard. It was meant as an "easy hack" for those that want a similar (not the same) impression of image motion as a much higher framerate would give, while accepting many of its downsides (artifacts, ghosting and latency). It was never meant to be used as a crutch for lazy game developers to counteract the absurd sh!tjob they now notoriously do with game optimization (or rather don't do; it seems they no longer care about it!). FG is basically a more intricate/complex AI version of the "soap-opera effect" (motion interpolation) that, for years, could already be enabled in many TVs and PC monitors. Nothing more.
To me, the ideal way to go about it with current technology goes a little bit like this:
OLED (+QD, or even MicroLED): concentrate on increasing the panel's native refresh rate; there is no real reason it can't go beyond what the input signal can carry. All displays should have built-in frame generation (hopefully the algorithms get better with time). For retro SDR and content that performs poorly with frame gen, implementing something similar to the CRT simulation that takes advantage of the panel's native refresh rate would be great. Bringing back rolling scan would be great too, but since it costs money for something the average consumer doesn't use, I won't be holding my breath. With a PC you can always implement software frame gen or CRT simulation, but since you're limited to what the input signal carries, I think it will be crucial to have these options built into the displays, with a higher internal refresh rate to take advantage of them. In conclusion, with OLED you concentrate on brute-forcing fps and accept the artifacts, and the fact that it will never reach CRT-level motion (or at least not for a long time, let alone without artifacts).
LCD: concentrate on cramming in a lot of bright LEDs to maximize the dimming zones, to at least compete with CRT and plasma contrast, and keep an edge over OLED when it comes to pure brightness potential. There can be all the same frame-gen trickery as on OLEDs; the problem is that it won't look as good as OLED. For an LCD to be relevant, it needs high-performance backlight strobing that allows for 1ms MPRT motion with minimal crosstalk. In this case you brute-force brightness and strobing, giving you the most brightness possible at the lowest MPRT possible (even at 50/60Hz) as a compromise for HDR gaming, and you just accept the fact that you won't have perfect black levels and pixel response times.
There's no ideal scenario, but I can see a world where both techs work side by side depending on your priorities for different types of content.
Can you make a video (if you haven't already) that goes through some nice-to-have programs and what your thought process is when making these monitors look so good? :D
It works well, and the motion clarity improves 2x, but the brightness loss is bigger than with better, more optimal flickering technologies. For example, my LG C1 OLED at 120fps using the flickering tech OLED Motion Pro, which simulates a CRT rolling scan (without the phosphor decay, and closer to a square wave), has 3.2ms of persistence; that's like 312Hz-equivalent motion clarity (1000ms / 3.2ms ≈ 312). The same tech at 60 is as bright as my Samsung S90D's 60Hz BFI and significantly clearer. Considering that the Samsung TV is way brighter, this shows that its BFI is not optimal.
The best for you will be the CRT simulation tech created by Blur Busters. I made 2 videos about it and a livestream. Also check the latest livestream, where I explained it too.
I have an RTX 3090 desktop and a 4060 laptop that comes with Frame Generation, and I also have LS. I love it, and I think it's a must-buy because it has helped me improve my handheld experience (and my 3090 experience too). But the technology is not the same as, or even close to, Nvidia's Frame Generation, unfortunately. I wish it was, because it would be cool to throw the middle finger to Nvidia. But FG is actually generating frames via hardware, something software hasn't been able to achieve, since LS is an interpolation method, just an average between frames. Even AMD's "frame gen" is not really frame gen, but interpolation. Maybe in the future it gets way better since it's software, but Nvidia's technology is wayyyyyy ahead on this one.
Happy new year! Would frame generation work if I average 45-50fps without it, and then lock to 60fps with frame gen enabled, just to enable BFI properly? In Crysis 3 Remastered I didn't want to go lower than DLSS Performance mode, and maybe I could also enable frame gen, turn RT on, and lock to 60fps with BFI? Also, good explanation of why the technology is needed and what happens without it, which I agree with. I'd add that it also means efficiency: the power consumption otherwise isn't feasible, or barely, and that's because CMOS transistor technology is almost at its limit. I think that is the bigger reason. Keep up the good work!!!
Yes, DLSS 4 upscaling works with any RTX card (Multi Frame Gen works only on the 50 series, single frame gen works only on the 40 series, and all other features work on all RTX cards). You can combine DLSS upscaling or DLAA with AMD FSR frame gen or Lossless Scaling frame gen without problems.
@@plasmatvforgaming9648 Thank you for the clarification. I researched DSC, however I couldn't find a proper comparison between a native and a compressed signal to check the loss. Do you game in 4K? What's your experience?
S90D owners, how do you watch HDR movies on PC? To enable Filmmaker Mode you need to drop the refresh rate to 24Hz, but the picture starts to twitch, as if it is not synchronized with the monitor's refresh rate. Or is it also OK in Game Mode, whose settings are similar to Filmmaker? I use the Color setting at 20 with Color Booster on Low on the advice of the channel author, and it seems not bad, but Filmmaker seems brighter (although I have not yet increased the max brightness to 1500).
All frames are fake frames, whether they are rasterized or AI-generated; they only become real frames once we see them. Frame times and latency are separate issues which continue to be addressed and improved. Imagine 20 years ago people arguing that GPU frames were fake frames because they were offloaded from the CPU and not generated by their Pentium 3... that is all that is happening now. Technology is changing and will continue to improve, and this argument will seem brain-dead in 20 years.
About FPS games and fast-paced games in general: I play Helldivers 2 on a 4090 (with a mouse) at 60fps in x2 mode, and it's still an amazing experience. It feels just as good as running the game at 120fps without Lossless Scaling (I use it to reduce heat and power consumption).
Yes, you will be able to force Reflex 2 on any game via the Nvidia App, then you can use DLSS 4 and FG, or use Lossless instead of FG if it's not supported. Reflex 2 will be there anyway for anyone with an RTX card; of course frames will still vary depending on which GPU you have and what resolution you aim at, but it's still gonna be good.
If "frame warp" reflex can take inputs during the "fake" frames, and represent that on screen, that might be something interesting. Still though, one thing it can't do is show an opposing player peeking from round a corner, it will continue to generate frames without new information from the game engine. That's not a problem in single-player games with slower action, but sort of makes little sense for CS or Valorant. Maybe the overall motion clarity will help with mentally de-cluttering the screen though, that's the principle issue with motion blur in FPS games, you don't know what to look at (or, more importantly, ignore) when moving the camera around. If frame generation helps further de-noise a moving POV, then even without more game engine info that would be a benefit. Not sure I want to be playing the games it benefits at less than 200 frames starting though, so I'd probably only go 1:1.
Only issue I have, and someone please correct me if I am wrong, is that this software cannot be used in exclusive fullscreen mode, right? That's a problem because windowed/borderless fullscreen causes some latency and fps drop.
With DLSS 4, input lag increases even more, and AI assistance will handle aiming for you, allowing you to focus on other aspects of the game. This feature is called Nvidia AI Auto Aim.
It doesn't work well because RTX HDR gets enabled on both the game and Lossless Scaling, so you have HDR on top of HDR, and that's why it looks blown out. This is an issue they're aware of but haven't been able to fix.
Lossless Scaling is extremely amazing. There will come a time when every issue will be fixed, unless AMD, Intel, or Nvidia acquires the company to kill off this software.
Yes, the explanation is long. Check out the last livestream, where I answered that. Basically, we need at least as many Hz as your individual maximum eye-tracking speed demands, and then some, to fix the stroboscopic issues of finite refresh rates.
Try Lossless Scaling on your plasma. Find out what the native subfield drive rate of your plasma is (240? 480?), because this program will eliminate the stroboscopic effect even on plasmas and CRTs. Give it a shot.
Watching these videos at 240fps is, let me tell you, simply awesome. Apart from just your videos, watching movies at 240fps is totally amazing as well. I don't mind the artifacts that much because the benefits are huge. I didn't enjoy games with it, though, because when you get 90fps and do 2x it results in only 120. That's a huuuuge performance hit and not worth it imo.
Bros intros be legendary 😭😭
Im dead
Hahaha
It’s that pause and unload 😂
He's got that non-smoker's inhale.
my pulse is increasing, as is my super inhale
Lossless Scaling makes heavily modded Skyrim playable for me. Without it at 4k on a 3090 I'm barely at 30fps in heavy parts of the game. With LS I can upscale from 80% res with NIS and FG X3 and it's a complete life saver, smooth as butter. LS is an incredible application.
Nice!
Same for me in modded RDR2 and Minecraft: 30 to 45 fps at max settings with all the graphics mods, but Lossless FG fixes it and makes it playable.
what in gods name did you do to skyrim to make it 30 fps on a 3090
@freshfrij0les Ro modlist @4k. Over 20gb vram usage too 😂
If you've got to upscale or downscale, you're not running native resolution.
If you use RivaTuner Statistics Server to cap FPS and the game supports frame gen + Reflex, you can cap FPS at 60 or whatever you can reach stably, then under Setup in RTSS, scroll down and choose the Reflex framelimiter instead of Async, and the input lag is gone. Made Stalker 2 playable for me with frame gen. Haven't tried Cyberpunk yet, but it should work the same way. Don't know if the effect is the same or as good with Lossless Scaling.
Nice! I'll give that a try, thanks for sharing 👍
This is why I look in comments lol. I didn’t even know about this.
The mouse input lag is actually MUCH better with this latest update - just make sure you cap the frames in the in-game settings. You don't need 100fps base, even with 50fps with 3X scaling, I am getting a SOLID 150fps with verrry minor input lag. Well worth the compromise for the visual improvement.
This could be amazing for older games locked at 60fps.
And then adding the CRT simulator from RetroArch, the possibilities are endless.
CRT simulation will probably interfere with Lossless Scaling.
I tried it with MGSV, notoriously capped at 60, this worked great with 2X.
It even works when watching YouTube or anime or whatever you watch. Doesn't work on Amazon Prime for some reason. And it's not fully ideal for multiplayer games: I've used it in competitive Fortnite, and although it's smooth AF I do notice the slight input lag at X2. Best used for graphics-demanding single-player games.
@@DeckTested Me too, and it looks so good. Also auto hdr with this game is amazing as well.
BREATHING IN INTENSIFIES
I think the reason people are hating on Nvidia for their frame gen is because they 1) refuse to enable it on all RTX cards and 2) demand the cost of a used car for their GPUs. Meanwhile, this application does very similar frame generation in ANY program for less than $10!
lossless scaling > fidelity super resolution
"download more fps" is very real!!!
I've had this program for, I think, a year on my 4050 laptop. I mostly used it when I first got it for open-world games where I got 90 fps, so I'd set it to 72 and get 144. This app is amazing! The 3.0 update is incredible!
For 72 to 144 it works amazingly.
My biggest issue with frame gen at 120Hz is the performance hit to reach it. At the moment I need 80-90fps base to get 120fps with frame gen. Activating it at 60 will never give you 120; it always reduces base fps by about 25%. And real 90fps with VRR feels way better than 120fps after frame gen, so there is no real reason to use it. The only exception is slow games where I can't reach 60fps and I use frame gen to get about 80fps output (Alan Wake 2). I wish there was a mode that only adds frames when needed to reach a framerate target, instead of doubling a base framerate.
What’s your graphics card?
That usually happens when your GPU is already fully loaded, running at 100% even before Lossless was enabled. To stop that from happening, you need to keep your GPU usage down to around the 85-90% range (the lower, the better) before even enabling Lossless Scaling, because LSFG uses a bit of your GPU's processing power (about 5-10% or so, depending on the settings used) to make those interpolated frames. You can do so by simply lowering your in-game settings or just using FSR, XeSS, DLSS etc. to get your usage down to around that level before enabling Lossless Scaling in borderless windowed mode.
The creator of LSFG also advises its users to disable any and all overlays they might be using before enabling Lossless (Afterburner/Radeon Metrics etc.), as they sometimes interfere with LSFG. Just turn on Draw FPS in the Lossless settings instead to be able to see your current base and interpolated frame rates.
It would also help to turn off some of the settings in Lossless Scaling to reduce the latency and GPU load, even if only slightly.
Stuff like: Scaling Type to Off, Sync Mode to Off (Allow Tearing to reduce latency slightly), Max Frame Time to 1 for Nvidia and 3 for AMD GPUs, and lastly Capture Mode to WGC (WGC is recommended as it has less latency; switch to DXGI if LSFG doesn't work).
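The "keep usage under ~85-90%" guideline above can be sanity-checked before flipping LSFG on. Here is a small sketch using the nvidia-ml-py (pynvml) bindings to average GPU utilization for a few seconds; the thresholds are just the commenter's rule of thumb, not anything official, and this assumes an NVIDIA GPU with the driver's NVML library available.

```python
# Rough headroom check for the "keep GPU usage under ~85-90%" guideline above.
# Thresholds are the commenter's rule of thumb, not an official recommendation.
import time
import pynvml

def average_gpu_utilization(samples=10, interval_s=0.5, device_index=0):
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(device_index)
    readings = []
    for _ in range(samples):
        readings.append(pynvml.nvmlDeviceGetUtilizationRates(handle).gpu)
        time.sleep(interval_s)
    pynvml.nvmlShutdown()
    return sum(readings) / len(readings)

if __name__ == "__main__":
    util = average_gpu_utilization()
    if util > 90:
        print(f"GPU at {util:.0f}% -- lower settings or use an upscaler before enabling LSFG")
    elif util > 85:
        print(f"GPU at {util:.0f}% -- borderline; expect some base-fps loss with LSFG")
    else:
        print(f"GPU at {util:.0f}% -- enough headroom for LSFG's ~5-10% cost")
```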
@@jamesFX3 "make your graphics significantly worse to have an ability to render game at 60 fps required to framegen to 120 instead of playing at native 90fps" that's hilarious
@@Hyde1415 RTX 4070Ti
@@jamesFX3 Sorry, I was talking about NVIDIA frame gen; I should have mentioned that. But there is no frame gen (NVIDIA, AMD or Lossless) that doesn't have that issue. And reducing load doesn't make sense when I get a worse experience than native... That's the issue I was talking about.
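The ~25% overhead described in this thread is easy to sanity-check with back-of-the-envelope math. A tiny sketch follows; the 25% figure is the commenter's observation, not a fixed constant, and real overhead varies by GPU and settings.

```python
# Back-of-the-envelope numbers for the ~25% base-fps cost described above.
def framegen_output(base_fps, multiplier, overhead=0.25):
    effective_base = base_fps * (1 - overhead)   # fps left after FG's own cost
    return effective_base, effective_base * multiplier

for base in (60, 80, 90):
    eff, out = framegen_output(base, 2)
    print(f"{base} fps native -> ~{eff:.0f} fps base -> ~{out:.0f} fps output with 2x FG")

# 60 fps native -> ~45 fps base -> ~90 fps output  (not 120)
# 80 fps native -> ~60 fps base -> ~120 fps output
# 90 fps native -> ~68 fps base -> ~135 fps output
```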
I use an FSR3 mod in the games I play on my 3080. Lossless is great, but its FG affects the UI, so where possible, having a more integrated solution is ideal. The fact that it works with FSR3, though, just goes to show that if AMD can do it for everyone, Nvidia could too. They haven't, since it's been their only real selling point for the last two generations.
I heard the new DLSS 4.0 upscaler is coming to the 2000 series and higher and can potentially be toggled on in any game.
@@aberkae Just the updated DLSS upscaler will come to older RTX cards (and probably at worse image quality or performance), not the frame generation component, which really should have been called something else instead of being conflated with the upscalers by NVIDIA and AMD.
Also, DLSS will only be swappable in titles with a DLSS 3.x implementation, not games with older implementations, and it's not application-agnostic at the driver level (like NIS, AMD's FSR1, or Lossless Scaling's upscaling methods).
@adriankoch964 Isn't the upscaler enhancement coming from the new Transformer model though? So the upscaling when using DLAA and lower modes will improve visual quality across the board.
Facts
But what of it? AMD and Lossless Scaling don't have a solution of the same quality as Nvidia's. You can't do everything purely in software.
I remember when Nvidia enabled G-Sync for all FreeSync monitors and added RT support via a driver update for the 10 series, so it obviously can be done. And even without tensor cores, I was able to use ray tracing in Control, for example, at around 40-ish fps. I also had a screen-flickering problem with G-Sync since my monitor wasn't officially "G-Sync compatible", but I fixed it easily by decreasing the VRR range by 1fps in CRU. So yeah, I definitely think that most of this FG, DLSS, MFG bullshit is marketing to force you to buy new hardware, which in itself proves that actual architectural advancement is incremental, at best.
It's spelled "convince" not "force".
Words matter, or is it just Nvidia we hold to that standard?
Of course it's BS; the 4090 has more AI perf than a 5070, so there's no reason it couldn't do FG x4. In fact, the tech was probably developed using the 40 series.
@Unit-kp8wm I used "force" because of "Ge Force".See? I follow the proper terminology.
I have an MSI 32" 4K QD-Oled with 240Hz and have lossless scaling since it ages (back then it was really just a scaler). I tried the new update yesterday in a few games. I used it in slower paced games where the input is not dsastically needed.
Examples:
X4 Foundations 60 /240 FPS - this made the game even in heavy battles totally smooth and made it feel amazing - never saw the game that smooth in heavy situations.
Xcom 2 60 /240 FPS - same as above since you don't have much direct input and "watch" most of the time it looked insane smooth.
Its also great for Ryujinx to get those 60 FPS switch games up to speed too. For those 30 FPS games I mod them to 60 then LLS them to 240. Also HDR works here too, Zelda and Fireemblem look amazing with it.
I also use Reshade (AutoHDR.fx shader with addons) to interpolate SDR to HDR and tonemapped to 1000 Nits. For games where there is RenoDX HDR fix availiable I use that instead.
My specs are a Ryzen 5900X, 32GB RAM and a RTX 3090. I skipped the 4090 and thought to get a 5090, but damn its way too expensive tbh - might skip this gen to and get a whole new setup then with newer X3D CPU and more RAM too then.
He's hilarious to watch just subbed for it!
60 to 120 conversion feels just great with a mouse imo; don't forget that you can inject Reflex into games now. I wouldn't use it for competitive shooters, but it's not a huge deal for nearly any other genre, and many genres are harder to run than shooters. People act as if motion clarity never helps your gameplay lmao. Also, if you want better motion clarity at 60fps you'd otherwise use 60Hz vsync with BFI, which is a ton of input lag; Lossless Scaling FG lets you use 120Hz with NO vsync instead, because yes, LSFG can take care of vsync by itself.
Personally, anything under 90fps feels like shit for me on the mouse. I can make concessions on Steam Deck (gamepad) or when streaming games over the internet from my host machine (which adds like 10ms on top due to transcoding and network latency), but generally speaking anything above 10ms latency feels awful on mouse. The only time I've ever used FG was when I temporarily had an underpowered power supply for the GPU and had to cut power draw under 300W to prevent driver timeouts, and it was a miserable experience (capped at 82fps native to get 164 on a 165Hz ultrawide monitor) I will NEVER subject myself to again. It was absolutely dogshit, to the point where I opted to do something else than play games. Not to mention the artifacts and smearing - I guess it doesn't faze people who use VA or cheap IPS panels and are used to garbage smeary vomit in front of their eyes, but on OLED it's just headache-inducing, like going back to the early LCDs of 2005.
I tried an older version of LS (~3-4 months ago) on Valheim and Viscera Cleanup. Both games were capped at 60fps and I used x3 to get a total of 180fps (1440p 180Hz monitor).
It was OK; mouse movement at 60 was good enough and there were minimal artefacts at the edges of the screen.
And yes, 90+ fps native would be better, but I couldn't keep it stable, and I'm particularly sensitive to that drop from 90 to 60.
The difference is: Lossless Scaling is pixel-based. It has no access to in-engine information like motion vectors or Z-depth. Motion Frames works at the driver level, which is a very deep integration of frame generation into the graphics card. The next best would be Nvidia's DLSS FG or FSR 3.1 from AMD. Both work at the engine level and have access to this information. That information (specifically the motion vectors) reduces interpolation artifacts to nearly zero (there is ghosting, though). Lossless Scaling, on the other hand, has none of this information, and still: it does a great job in my opinion. Only static elements sitting over fast motion in the background (like the HUD while the main character is driving a bike) show the distortions, often very visibly. Since it is pixel-based (and not engine- or driver-based), there is not much the programmers of Lossless Scaling can do to eliminate those distortions.
But for what it offers at this little price, it truly is amazing.
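To make "pixel-based" concrete, here is a toy sketch of midpoint interpolation from two frames alone, using OpenCV's Farnebäck optical flow. This is not LSFG's actual algorithm, just an illustration of why estimating motion from pixels, with no engine motion vectors or depth, struggles around HUDs and objects that occlude fast-moving backgrounds.

```python
# Toy "pixel-based" interpolation: motion is estimated from the two images alone
# (no engine motion vectors or depth), which is why HUDs and objects crossing
# fast-moving backgrounds can distort. NOT LSFG's real algorithm, just the idea.
import cv2
import numpy as np

def interpolate_midpoint(frame_a, frame_b):
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense optical flow: a per-pixel motion estimate recovered purely from the images.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

    # Crude backward warp: pull each output pixel from roughly half a motion step
    # earlier in frame A to fake the in-between frame.
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)
```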
I love when videos are in HDR 👌 thanks for this video it really helped me understand more about fake frames 🙄
Before the upscaling trend, developers were pushing toward standardizing 120 FPS gaming, even on consoles. Now we're seeing new games that barely manage 40 FPS natively. Frame generation feels like digital makeup - it masks the underlying performance issues rather than solving them. That being said, Lossless Scaling is amazing in cases where games are locked at 60fps, like Elden Ring, Dark Souls, etc. Glad that they keep refining it.
Kingdom Come Deliverance - I wanted to play it, but it runs like crap even after 8 years, and I've upgraded my PC 3 times. Lossless Scaling can make it playable for me.
If that was true why did Cerny say gamers demanding 60fps on PS5 was a big surprise.
4k basically stopped all progress..
@@bwellington3001 I get 144 frames locked on max settings with my 4090 at native 4K in Kingdom Come Deliverance...
@@bwellington3001 What hardware are you playing with? It's also important to understand that even though Kingdom Come: Deliverance was released in 2018, its Ultra settings were designed to push even newer hardware basically to its limit, kind of like the path-tracing mode in Cyberpunk 2077 where even a 4090 can only get 23 FPS at 4K if you don't use either DLSS upscaling or frame generation, or the Unobtainum mode in Avatar: Frontiers of Pandora.
Don't confuse certain concepts. Just because it works doesn't mean it will be as good as NVIDIA's solution. It won't be, and AMD has learned that with its FSR 3.0. The only advantage is that we can run it in games that don't support FG.
The x2 mode is the one to recommend here; it does a pretty good job. The other modes noticeably degrade the image and are not worth the attention.
Unfortunately, I can't compare it with Nvidia FG because my 3080 doesn't support it.
Great video! You explain everything so well, I completely agree with you.
I just purchased it. Amazing software. I have a 4080 and am actually using it in Fallout New Vegas of all games.
Pegged at 58 fps and x3 to 170+, and it's much smoother and honestly better than running it any other way that I've found.
Nice!
I have the 80-inch version of that TV, or at least a very similar model with the same tech in it, and it is absolutely amazing. Played Star Citizen with dual flight sticks and pedals on that 80-inch TV and you are so in the zone. The only problem I have is that when I play games with lower-quality textures or graphics, it really shows.
That's one of the reasons why I want to upgrade the GPU; with the size increase from 48" to 65" I can now see the defects a lot more.
Been watching since you had a few subs; you deserve more. I paid a small fortune for my Gigabyte Aorus 43-inch monitor, and although it's HDR1000 with all the bells and whistles, the screen always felt too bright and dark colours looked bad. After doing your tweaks for the LUT etc., my screen is amazing, HDR is brilliant, and local dimming works really well.
Agreed. I was shocked when I tried his preset for Elden Ring with Lilium shaders to fix the black floor level.
The difference was night and day and I never would have known about these shaders otherwise
@Rafael57YT With all this upscaling and frame generation, I might try playing Elden Ring on my 3050 notebook lol.
The biggest gripe with HDR is how much fiddling it requires in every game for it to not look like complete trash or at least make no difference to just leaving Windows in SDR mode and skip all sorts of little pain points. And that is if you actually have a monitor that has "good enough" HDR hardware to actually get high enough peak brightness and true blacks in combination with plenty of low latency local dimming zones (or OLED pixel perfect local dimming at cost of reduced full screen peak brightness) to qualify as HDR output, which isn't true for 95% of all "HDR" displays on the market, especially all of the cheap ones.
@@mathias2277 DLSS 4 is coming, idk if Elden Ring uses DLSS though. I'm playing Dark Souls 2 at 120fps with Lossless Scaling on a 3080.
Awesome! Thanks for sharing👍
Honestly I thank the guys that developed Lossless Scaling, and because of them I can play native with supersampling on while maintaining a smooth experience 😎
just bought it, life saver for some titles on my daily 3090 pc
Hello friend. Just found your channel after Nvidia announced their RTX 50 series. You are like Daniel Owen but with a funny nice to listen to accent. Subscribed X)
It's funny, I was actually a math teacher for 3 years too 😁
By 2029 Nvidia won't even sell physical graphics cards anymore; they'll go full AI. Just buy a license and download the "GPU" straight into your computer to help 2029's path-tracing-capable APUs generate as many frames as you want.
@@meb0y494 GeForce Now kinda is that already. They have the 4080 on there and will upgrade to the 5080 soon, allowing more AI frames straight from the cloud.
The thing is, at some point you can't push more power other than with more GPUs. It's already basically just buying a software licence with new GPUs rather than raw power.
@@xythiera7255 They are reaching the limits. The power usage is going up, so they are going software for performance 🤷🏻‍♂️
@@xythiera7255 Nvidia started doing it 😅
and AI is cheaper
I wish Reflex 2 was like the async warp demo, but it only updates mouse movement within the current frame, rather than up to your monitor's refresh rate, so it feels more responsive but it's not perfect input lag, and things don't really look any smoother either, the way async reprojection does. And it only updates mouse input, not keyboard.
I think the reason it hasn't been done yet is because of the severe amount of artifacts it had. Gamers don't really care that much since the latency and smoothness benefits are so massive, but NVIDIA cares a lot; they want it to look as close as possible to real frames.
It's something they're definitely working towards. Extrapolation to your refresh rate has been discussed as early as 2021, but it's going to take time. I think the 60 or 70 series will have it.
Funny, for me I'd accept reprojection a lot more easily, just because of the input lag and, from what I understand, the quite low performance cost. Because right now on a 4080, frame gen costs a lot, doesn't look good, and doesn't feel good.
My tinfoil theory is that Nvidia is scared to use reprojection because once they show us that we don't need high fps to fill out their Hz and still have good input lag, the cat is out of the bag.
I think a lot of people would stop buying these incremental upgrades. But now, since not many people even know about it, it's a lot easier to sell people on the 5080 and 5090.
What we need is interest from big players like Nvidia, because interest eventually compels game engines and game devs to include proper standard features like asynchronous reprojection.
Given that it's a tried and true technology for VR, I don't think it would take too much time to implement a barebones version. What Nvidia calls infill will remain vendor-locked, as it's all AI-based. To be honest, if I had to choose I would rather have artifacts than terrible input lag, especially in fast-paced FPS games.
For Indiana Jones and the Great Circle? Frame gen away, as it's basically a graphic novel with first-person gameplay.
@@fpgamemearray Call me a tinfoil guy, but I think Nvidia doesn't want framegen, because it would disincentivize people from constantly upgrading for tiny 10-20% jumps.
@@GFClocked I think the opposite: all the "impossible without AI" talk allows them to pump out gen after gen with minimal generational improvements and then justify it because "all the new games use RT and RT is too expensive". They leverage the monumental budget of the datacenter AI division to increase AI performance (thus the emphasis on AI TOPS) while neglecting raster improvements. I wouldn't be surprised to see the 6000 generation use neural BVH or something like that; they are already exploring it in research.
As I understand it, frame gen was never meant to be used by poor PC players to go from 30fps to 60; it was meant to be a technology that allows very high refresh rate 2K/4K monitors to be fully utilized.
It will happen, it is inevitable.
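For what it's worth, the core of the late-warp / async-reprojection idea debated in this thread can be sketched in a few lines: shift the last rendered frame by the newest mouse delta just before presenting it. A real implementation reprojects in 3D using depth and camera pose and has to inpaint the revealed edges (what Nvidia calls infill); the function name and pixels-per-count scale below are assumptions purely for illustration.

```python
# Minimal 2D sketch of the "late warp" idea: nudge the last rendered frame by the
# newest mouse delta right before presenting. Real async reprojection / Reflex 2
# frame warp works in 3D with depth and camera pose and must inpaint revealed edges;
# the pixels-per-count scale here is an arbitrary assumption.
import numpy as np

def late_warp(frame: np.ndarray, mouse_dx: float, mouse_dy: float,
              pixels_per_count: float = 0.5) -> np.ndarray:
    """Shift the frame opposite to the camera motion implied by the mouse delta."""
    shift_x = int(round(-mouse_dx * pixels_per_count))
    shift_y = int(round(-mouse_dy * pixels_per_count))
    warped = np.roll(frame, shift=(shift_y, shift_x), axis=(0, 1))

    # np.roll wraps pixels around the border; blank those strips instead, since
    # that is the region a real implementation would have to fill in ("infill").
    if shift_y > 0:
        warped[:shift_y, :] = 0
    elif shift_y < 0:
        warped[shift_y:, :] = 0
    if shift_x > 0:
        warped[:, :shift_x] = 0
    elif shift_x < 0:
        warped[:, shift_x:] = 0
    return warped
```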
Exciting closing argument in favor of producing less graphic detail in exchange for perfect performance. I remember this is what happened with most arcade games or on the Sega Dreamcast, a console that you connected via VGA to a CRT monitor and were amazed at how clear everything looked at 640x480 and 60hz/60fps.
The videogame market is huge and there will always be titles that work without DLSS or FrameGen. I personally enjoy very much 'Metroidvania' style games (Hollow Knight for example), with relatively simple graphics, but with polished gameplay and optimal performance in the old style.
Greetings, Ariel.
I don't use any upscaling tech. You get a massive performance boost by changing the aspect ratio: you add black bars to the image, but it feels significantly better to play, with lower input lag. An old CRT trick.
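As a rough illustration of why the letterbox trick helps (a back-of-envelope sketch; the cropped height below is just an example, not a recommended value):

```python
# Rough sketch: how many pixels a letterboxed (cropped) render skips.
# The crop height below is only an illustrative assumption.
full_w, full_h = 3840, 2160          # native 16:9 4K
crop_h = 1600                        # example letterboxed height (black bars top/bottom)

full_pixels = full_w * full_h
cropped_pixels = full_w * crop_h
saved = 1 - cropped_pixels / full_pixels
print(f"Pixels rendered: {cropped_pixels:,} vs {full_pixels:,} ({saved:.0%} fewer)")
# -> roughly 26% fewer pixels to shade each frame, which is where the speedup comes from
```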
A lot of people don't understand how important ray tracing and path tracing are for game development, and how long it used to take just to render one frame with them. What used to take hours for a single path-traced frame can now be done at 25 native frames per second, which is amazing.
I noticed a world of difference in LSFG 3.0 yesterday when playing NFS: Underground and Arena Football: Road to Glory on PCSX2 upscaled to 4K. It really made the ray tracing on my 4070 Ti Super stand out.
I wonder if the app works alongside the DLSS 4 upscaler coming to the 2000 series and up, and with frame gen?
@aberkae As far as I know, LS works with other frame-gen programs. I think that it states this on the LS main screen.
Man, I'm telling you.. later on, people will prefer native and start making waves about wanting to play games at native resolution without any AI upscaling crap. The thing with upscaling is that there are too many issues (at the moment at least). After I bought my 4090 ROG Strix, I played for around 7 months with Frame Generation on in any game that supported it, and then after those 7 months, without FG. It's a HUGE difference, especially in FPS games. A trained eye can tell. Me personally, I would prefer to play a well-made game at a solid 60-120 REAL fps, without blur effects or anything like that and without any micro-stutters etc., than at, let's say, 480 FAKE frames.
60 real FPS is almost unplayable to me but I understand your point
The more I think about it, the more I'm starting to be convinced that the way to achieve perfect motion clarity is not going to be 200 real frames + 800 fake ones. In my opinion, at some point a completely new type of display has to be invented that would push sample and hold displays out of the market.
Blur Busters is the authority when it comes to motion clarity, and brute-force sample-and-hold at 1000Hz+ is the future and the best we can get.
@@plasmatvforgaming9648 Not really, for the reason that a CRT at 70Hz (I believe) has no motion blur or input lag due to its black-frame-insertion-like impulse behavior. Power consumption is something to think about, and rather than changing the hardware of the computer, I believe the hardware of displays needs to be looked into; that could be the simplest way, in reality, to fix the way displays are made.
The push for trying to move away from sample and hold is odd to me. You want your display to flicker?
@@whatistruth_1 As long as they made the flicker imperceivable it would be fine to me.
Waiting on the new DLSS 4 transformer model and Reflex 2 to test out. Should look really good even with 2x frame gen.
It's going to be mind-blowing
Think I’m going to sell my 32in 4K 240hz monitor for a 27in 1440p 500hz when they come out simply because I value motion clarity & smaller size over PPI. Should be much easier to run too, especially w/ DLSS 4. Still have my 65 & 77 S90Cs for high fidelity single player w/ HDR impact.
Cannot wait for those monitors to come out. Believe frame gen’s only worth it at much, much higher frame rates. At minimum, ~60 w/ DLSS quality before frame gen so the input lag isn’t completely terrible. I play on controller, so my tolerance is much higher.
Don't know how 32" 4K compares to 27" 1440p, but let me tell you: I had a 28" 4K, then got my nephew a 27" 1440p. To be honest, going from one room to the other I really cannot tell the difference immediately. 1440p is a great option. Even if I lower mine from 4K to 1440p, I still don't get as clear a picture as his.
The PG27UCDM is a 27-inch 4K 240Hz QD-OLED monitor, if you want the smaller size with no compromise. It comes out soon!
@@mossen98 Yes, I saw! Debating on this or 500hz. Leaning toward 500hz since if I want 4K, I’d just use my TV & sit back a bit.
500Hz QDOLED is going to be insane
I really enjoyed and agree with your "TED talk" towards the end; games need to go back to what they were: optimized and clear, not all this upscaled TAA slop.
Hi Ariel, I've been watching your content since I got into HDR PC gaming. An off-topic question: do you use Limited or Full RGB in the Nvidia Control Panel, and do you match it on the QD-OLED? (I have the same S90D as you.)
Always on Auto, which is Full automatically because the GPU resolution is set to Full. Never mismatch them. Mismatching was an old trick I used to recommend when I didn't know better.
@@plasmatvforgaming9648 I tried Full + Full but the image has this gray haze on it; I tried Limited + Limited and it looks so much better. I'm not quite sure why.
10:20 I have been telling you: you haven't experienced the RTX 4090, so the RTX 5090 will blow your mind compared to the RTX 3080.
The drawing pipeline will be a lot shorter and everything will be more snappy, not to mention the utterly insane 1.79 TB/s memory bandwidth.
Der8auer said it himself in his last video: overclocking the core on the RTX 4090 did not do much for performance, but overclocking the VRAM showed a big performance gain!
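For context on where a figure like 1.79 TB/s comes from (a quick sketch; the 512-bit bus and 28 Gbps per-pin rate are assumed from the published 5090 specs):

```python
# Rough sketch of how peak memory bandwidth is derived.
# Assumed figures for the RTX 5090: 512-bit GDDR7 bus at 28 Gbps per pin.
bus_width_bits = 512
per_pin_gbps = 28                                    # effective data rate per pin

bandwidth_gbps = bus_width_bits * per_pin_gbps       # gigabits per second
bandwidth_gbs = bandwidth_gbps / 8                   # gigabytes per second
print(f"{bandwidth_gbs:.0f} GB/s ≈ {bandwidth_gbs / 1000:.2f} TB/s")
# -> 1792 GB/s ≈ 1.79 TB/s, matching the quoted figure
```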
What is overclocking the core and overclocking the VRAM? What does that mean?
@@rufus5208
I assume overclocking the core means the GPU cores.
Just like CPUs have cores, so do GPUs.
Overclocking VRAM?
Well, just as you can overclock system RAM, you can overclock a GPU's RAM (called VRAM).
Some other things can be tampered with as well.
But I would never do that.
Too risky for my taste.
@GameslordXY so that just makes your GPU more powerful? And you wouldn't overclock core and vram, or you wouldn't do the other things?
I wish Lossless was available for Linux, something like a gamescope integration for VRR- and HDR-enabled gameplay on a big TV.
I don't think that's possible.
@@wikwayer Why wouldn't it be? As far as I understand, what the program does is run the previous frame through an algorithm, draw the real frame, delay the following frame, and inject the generated one in between. To account for the multiple display servers on Linux, supporting gamescope would be the universal way to support Linux (and the Steam Deck).
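Conceptually, the loop that comment describes looks roughly like this (a simplified sketch only; capture_frame, interpolate and present are placeholder names, not the actual LSFG internals):

```python
# Simplified sketch of what a frame-interpolation presenter does conceptually.
# All functions here are placeholders, not real Lossless Scaling code.
import time

def capture_frame():
    """Placeholder: grab the game's latest rendered frame (here, just a timestamp)."""
    return time.perf_counter()

def interpolate(frame_a, frame_b, t):
    """Placeholder: estimate an in-between frame at fraction t (0..1)."""
    return frame_a + (frame_b - frame_a) * t

def present(frame):
    """Placeholder: hand the frame to the compositor/display."""
    pass

def interpolating_presenter(multiplier=2, base_fps=60, n_frames=10):
    interval = 1.0 / base_fps
    prev = capture_frame()
    for _ in range(n_frames):
        time.sleep(interval)              # wait for the next real frame
        cur = capture_frame()             # the newest real frame is held back briefly...
        for k in range(1, multiplier):    # ...so the generated frames can be shown first
            present(interpolate(prev, cur, k / multiplier))
        present(cur)                      # then the real frame, one interval late
        prev = cur

interpolating_presenter()
```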
Optimization is what we truly need; we need to make more noise about it. Games are getting worse and worse in both performance and looks.
Facts
If you want even lower input lag, switch from the DS5 controller to a Flydigi Vader 4 Pro.
It has up to a 1000Hz polling rate wirelessly and also a much faster polling rate on the sticks. IIRC the DS5 is only 60Hz on PC, but the sticks themselves are polled even slower, so it can add a bit of lag.
Plus Hall effect sticks instead of Sony's scam sticks that'll break after 1000 hours of use, microswitches on all of the important inputs, plus rear buttons and 2 extra ones at your right thumb. Super build quality too, and toolless adjustable joystick spring tension.
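For a sense of how much polling rate can matter (a rough worst-case calculation using the rates claimed above; real controllers vary by connection mode):

```python
# Worst-case added delay between an input and the next poll, at different polling rates.
# Rates taken from the claims above; actual controller behavior varies.
for hz in (60, 250, 1000):
    worst_case_ms = 1000 / hz
    print(f"{hz:>4} Hz polling -> up to {worst_case_ms:.1f} ms before the PC even sees the input")
# 60 Hz -> up to ~16.7 ms, 1000 Hz -> up to 1 ms
```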
I'll check it out. Thanks for sharing 👍
I was thinking of getting 8bitdo 2C is that good too?
ps5 controller can do 1000hz no problem
With Special K even an old DS4 controller runs at 1000hz.
The problem with DLSS 4 MFG is that it needs a very high Hz display. For a 144Hz TV, I believe the lowest playable scenario is 48fps x3 to 144Hz. If the base framerate is below 40fps, no FG technology can offer an acceptable result. 240Hz or even 480Hz displays are for competitive gamers, and those games are usually not very demanding, so x3 or x4 FG is useless in that scenario (plus more input lag, which competitive gamers won't like).
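The quick math behind that point (a small sketch; the ~40fps floor is the commenter's rule of thumb, not a hard spec):

```python
# Minimum base fps needed to exactly fill a display at a given FG multiplier,
# using the ~40fps playability floor from the comment above as an assumption.
def base_fps_needed(target_hz, multiplier):
    return target_hz / multiplier

for hz in (144, 240):
    for mult in (2, 3, 4):
        base = base_fps_needed(hz, mult)
        verdict = "ok" if base >= 40 else "below the ~40fps floor"
        print(f"{hz}Hz at x{mult}: need {base:.0f}fps base ({verdict})")
# e.g. 144Hz at x3 -> 48fps base, 144Hz at x4 -> 36fps base (below the floor)
```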
For 3x+ we definitely need more Hz
You should also try using ReShade HDR within the Lossless app, so HDR can be used on anything that you scale.
Interesting, I'll give that a try
@@plasmatvforgaming9648 It is a bit difficult to set up at first, because you can only control ReShade via the config file directly or by using a controller to navigate and open the ReShade menu. Or, if you want, you can just set up the ReShade config and files and then drop them into the Lossless Scaling folder. If you want a tutorial, the Lossless Scaling server has guides and support.
Higher frame rates don't just need GPU power but also CPU power, and that's usually the limiting factor in some games, especially if the game is badly optimised.
For some reason, on my Samsung QN90D, Game Motion Plus only becomes available if I set the TV to 30Hz mode. In any other mode it stays grayed out in the Game Bar, no matter whether VRR or HDR are on or off. :(
That's weird, I guess you're in HDMI Console Mode. Try turning off ALLM if on Console
i like this guy's videos
What would make this perfect for me would be no artifacts with retro games with scanlines. The app gets confused by scanlines or stair patterns in games. I freaking love this app, though. Getting better every version.
@Plasma TV for Gaming How can I use nvidia RTX HDR with this program??
It doesn't work, and the developers are aware of it. The problem is that RTX HDR gets applied to both the game and Lossless Scaling, causing HDR on top of HDR, and that's why it looks messed up.
@@plasmatvforgaming9648 thanks
My only problem with LS is the warping of crosshairs. If they get that figured out I will basically have no problems with anything else.
I don't believe that will be an issue, but let's see
@ I can look past it, but that is my biggest gripe. You can't use Crosshair X though, because for some reason it registers the frames of the overlay dot and not the game, so the frame rate goes haywire.
Love your videos man! great work
I think there's an easy way for them to make this perfect: they just need to implement depth detection. But this would require the user to point the program at the game's .exe file, and it looks like the dev isn't interested in introducing such a step; they just want the program to be nothing more than an overlay. I respect their choice, but it's a missed opportunity IMO. They could EASILY destroy Nvidia if they wanted, because from my testing I think LSFG is smoother than DLSS 3; the only issues left come from the lack of depth detection.
Interesting
It's no doubt a good feature, especially to fill out a high-Hz monitor. But you need real frames to begin with XD
Exactly
Those Samsung QD-OLEDs look exactly like my SyncMaster 955DF VGA CRT (also made by Samsung) does. No normal camera in the world can handle the insane dynamic range and color of these displays. Maybe with a movie camera you could capture their detail.
Nice!
@@plasmatvforgaming9648 By the way, no amount of BFI will beat a CRT.
My 955DF at 1600x1200 at 60Hz, as an example, takes 22 microseconds to draw a line and 8 nanoseconds for a single pixel. I usually use 85Hz at 1600x900, and that gives me 6 nanoseconds. No flicker and perfectly smooth.
Rather than cropping, you might be able to edit the graphics config of certain games and set them to render at a slightly higher resolution, then change the scaling mode in the driver settings to not do any scaling. This would remove the black bars and just crop anything outside of your native resolution.
Good idea
Can you make a video of RenoDX HDR where you go through all the (important) settings and what they do? It would be nice to hear what is actually different from the original HDR implementation and the one that you prefer.
If you try it, you'll see the difference is absolutely huge, not a lot of tweaking necessary
@@plasmatvforgaming9648 Ok will do! Thanks
@@plasmatvforgaming9648 I tried it and only adjusted the peak brightness value to 1230 (I believe, on the G2) and paper white to 203. Looks much better! Are there any values you would try to change besides these? Have a good week.
Hi brother, nice video! Disable VSync in LS to get way less input lag (switch the Sync Mode from Default to None). I play single-player Tarkov on an RTX 3070 with a base frame rate between 40-60 FPS (the game is CPU-bottlenecked with all the bot AI), and with LS x2 mode it's really good, playing with a mouse of course. Having G-Sync/FreeSync helps. Cheers
Thanks for the tip!
It's amazing. It's WAY better than AMD AFMF.
Set Sync Mode to Off (allow tearing) and make sure VSync is off in-game. I wanna know what your thoughts are.
I have absolutely no problem with generated frames as long as they work well. DLSS 3 performs exceptionally and demonstrates that visual quality is paramount over raw response time. A response time of 16 to 20ms is more than adequate for a responsive experience, especially when you're delivering a higher frame count. I love frame generation.
So DLSS Frame Generation (FG) has proven to everyone that a response time of 16 to 20ms is more than sufficient for an incredibly smooth experience. What truly matters is how many frames you have available in a second.
Excellent video. And you touch on why enthusiasts criticize FG (Frame Generation) so hard. It was meant as an "easy hack" for those who want a similar (not the same) impression of image motion as they'd get at a much higher framerate, while accepting many of its downsides (artifacts, ghosting and latency). It was never meant to be used as a crutch for lazy game developers, a tool to counteract the absurd sh!tjob they now notoriously do with game optimization (or rather don't do; it seems they no longer care about it!).
FG is basically a more intricate/complex AI version of the "soap-opera effect" (motion interpolation) that, for years, could already be enabled in many TVs and PC monitors. Nothing more.
Well said!
People keep forgetting an essential aspect of performance. LOWER THE GODDAMN SETTINGS. Lol, then you will get a higher base framerate.
NEVER!
To me, the ideal way to go about it with current technology goes a little bit like this:
OLED (+QD, or even MicroLED): Concentrate on increasing the panel's native refresh rate; there is no real reason it can't go beyond what the input signal can carry.
All displays should have built-in frame generation (hopefully the algorithms get better with time). For retro SDR and content that performs poorly with frame gen, implementing something similar to the CRT simulation that takes advantage of the panel's native refresh rate would be great. Bringing back rolling scan would be great too, but since it costs money for something the average consumer doesn't use, I won't be holding my breath.
With a PC you can always implement software frame-gen or CRT simulation, but since you're limited to what the input signal takes, I think it will be crucial to have these options built into the displays with a higher internal refresh rate to take advantage of them.
In conclusion, with OLED you concentrate on brute forcing fps and accept the artifacts and the fact that it will never reach CRT-level motion (or at least not for a long time, let alone without artifacts).
LCD: Concentrate on cramming in a lot of bright LEDs to maximize the dimming zones, to at least compete with CRT and Plasma contrast, and keep an edge over OLED when it comes to pure brightness potential. There can be all the same frame-gen trickery as on OLEDs; the problem is that it won't look as good as OLED. For an LCD to be relevant, it needs high-performance backlight strobing that allows for 1ms MPRT motion and minimal crosstalk.
In this case you brute-force brightness and strobing, giving you the best possible brightness at the lowest possible MPRT (even at 50/60Hz) as a compromise for HDR gaming, and you just accept the fact that you won't have perfect black levels and pixel response times.
There's no ideal scenario, but I can see a world where both technologies work side by side depending on your priorities for different types of content.
I've also used the 0 brightness trick on my monitors for recording 😂 works like a charm
Can you make a video (if you haven't already) that goes through some nice-to-have programs and your thought process when making these monitors look so good? :D
I would love to hear your thoughts on BFI for those 240Hz 4K OLEDs, using it to get 120Hz with BFI.
It works well, and the motion clarity improves 2x, but the brightness loss is bigger than with better, more optimal flickering technologies. For example, my LG C1 OLED at 120fps using the flickering tech OLED Motion Pro, which simulates a CRT rolling scan (without the phosphor decay and closer to a square wave), has 3.2ms of persistence; that's like a 312Hz equivalent. That same tech at 60Hz is as bright as my Samsung S90D's 60Hz BFI and significantly clearer. Considering that the Samsung TV is way brighter overall, this shows that its BFI is not optimal.
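The 3.2ms-to-312 figure is just the persistence-to-equivalent-refresh conversion (a quick illustration; it assumes persistence is the dominant source of motion blur):

```python
# Convert display persistence (how long each frame stays lit) into the
# sample-and-hold refresh rate that would have the same motion blur.
# Assumes persistence is the dominant blur source (MPRT-style reasoning).
def equivalent_hz(persistence_ms):
    return 1000.0 / persistence_ms

for p in (3.2, 8.3, 1.0):
    print(f"{p} ms persistence ≈ {equivalent_hz(p):.0f} Hz-equivalent motion clarity")
# 3.2 ms -> ~312 Hz, 8.3 ms (full 120Hz sample-and-hold) -> ~120 Hz, 1 ms -> 1000 Hz
```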
The best for you would be the CRT simulation tech created by Blur Busters. I made 2 videos about it and a livestream. Also check the latest livestream, where I explained it too.
I have an RTX 3090 desktop and a 4060 laptop, which comes with Frame Generation. I also have LS. I love this, and I think it's a must-buy because it has helped me improve my handheld experience (and my 3090 experience too). But the technology is not the same as, or even close to, Frame Generation, unfortunately. I wish it were, because it would be cool to throw the middle finger at Nvidia. But FG actually generates frames via hardware, something software hasn't been able to achieve, since LS uses an interpolation method, just an average between frames. Even AMD's "Frame Gen" is not frame gen but interpolation. Maybe in the future it gets way better since it's software, but Nvidia's technology is wayyyyyy ahead on this one.
I feel like with this beta I don't need to crop. It looks great at the bottom of the screen.
Happy New Year! Would frame generation work if I average 45-50fps without it, with frame gen enabled and locked to 60fps, just to enable BFI properly? With Crysis 3 Remastered I didn't want to use DLSS modes even lower than Performance, and maybe I could also enable frame gen, enable RT, lock to 60fps and use BFI? Also, good explanation of why the technology is needed and what happens without frame generation, which I agree with. I did want to add that it's also about efficiency: the power consumption is not feasible, or barely feasible, and that's because CMOS transistor technology is almost at its limit. I think that is a bigger reason. Keep up the good work!!!
It doesn't work well with low base FPS, it shines at 100+ base fps
That setup is not going to look good, a lot of artifacts
Can the RTX 2000 series use the DLSS 4 upscaler and Lossless Scaling 3.0 frame gen?
Yes, DLSS 4 upscaling works with any RTX card (Multi Frame Gen is only for the 50 series, single frame gen needs a 40 series or newer, and all other features work on all RTX cards).
You can combine DLSS upscaling or DLAA with AMD FSR frame gen or LS frame gen without problems.
Best is DLSS upscaling with Lossless Scaling, yes.
HDMI 2.1 does not support more than 120Hz at 4K. Does it still make sense to go over that?
Using DSC (Display Stream Compression) we can go 2x higher.
Also, some monitors come with 2 inputs to double it one more time: 2 outputs on the GPU into 2 inputs on the monitor, combined into one signal.
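For rough context on the bandwidth involved (a back-of-envelope sketch that ignores blanking overhead and chroma subsampling; the ~42 Gbps effective HDMI 2.1 figure is an approximation after FRL encoding overhead):

```python
# Rough uncompressed video data rate vs. HDMI 2.1's effective link budget.
# Ignores blanking intervals; 42 Gbps is an approximate usable FRL payload rate.
HDMI21_EFFECTIVE_GBPS = 42

def raw_gbps(width, height, hz, bits_per_pixel=30):   # 30 = 10-bit RGB
    return width * height * hz * bits_per_pixel / 1e9

for hz in (120, 240):
    rate = raw_gbps(3840, 2160, hz)
    verdict = "fits" if rate <= HDMI21_EFFECTIVE_GBPS else "needs DSC (roughly 3:1, visually lossless)"
    print(f"4K {hz}Hz 10-bit RGB ≈ {rate:.0f} Gbps uncompressed -> {verdict}")
# 4K120 ≈ 30 Gbps (fits), 4K240 ≈ 60 Gbps (needs DSC)
```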
@@plasmatvforgaming9648 Thank you for the clarification. I researched DSC, but I couldn't find a proper comparison between a native and a compressed signal to check the loss. Do you game in 4K? What's your experience?
Turn on Draw FPS in Lossless Scaling; it shows an FPS counter in the top left when you turn that on.
Nice! Thanks for letting me know 👍
Owners of the S90D, how do you watch HDR movies on PC? To enable Filmmaker Mode you need to reduce the refresh rate to 24Hz, but the picture starts to judder, as if it isn't synchronized with the display's refresh rate. Or is it also OK in Game Mode, whose settings are similar to Filmmaker? I use Color 20 with Color Booster on Low on the advice of the channel author, and it seems not bad, but Filmmaker seems brighter (although I haven't increased the max brightness to 1500 yet).
What GPU should I buy for 144Hz at QHD, raster only, no ray tracing, no DLSS, no frame gen, no AI, everything maxed/high?
It depends on the CPU and display
@plasmatvforgaming9648 I mean, assuming I have a powerful CPU and a good monitor, which GPU fulfills those requirements?
@@NicoPezzotti712 RX 7900 XT or RX 7900 XTX depending on your budget. I would wait for the RX 9070 XT if you can't afford the RX 7900 XTX.
@@west5385 thanks king 👑
9:80 FYI, there is a mod for Minecraft that does async reprojection. It's also really just a demo, but it's in a real game.
Interesting! Thanks for letting me know 👍
Does this frame generation also work in VR?
I'd like to use it for DCS.
I haven't tried it yet
I'll take fake frames over motion interpolation any day lol
All frames are fake frames, whether they are rasterized or AI generated. They only become real frames once we see them. Frame times and latency are separate issues which continue to be addressed and improved. Imagine 20 years ago people arguing that GPU frames were fake frames because they were offloaded from the CPU and not generated by their Pentium 3... that is all that is happening now. Technology is changing and will continue to improve, and this argument will seem braindead in 20 years.
Bro, on Cyberpunk 2077, which settings are best in LS?
It depends on your target and the performance you can get out of your system.
I'd say try different combinations
About FPS games and fast-paced games in general: I play Helldivers 2 on a 4090 (with a mouse) at a 60fps base in x2 mode, and it's still an amazing experience. It feels just as good as running the game at 120fps without Lossless Scaling (I use it to reduce heat and power consumption).
Nice!
Yes, you will be able to force Reflex 2 on any game via the Nvidia App; then you can use DLSS 4 and FG, or use Lossless instead of FG if it isn't supported. Reflex 2 will be there anyway for anyone with an RTX card. Of course, frame rates will still vary depending on which GPU you have and what resolution you aim for, but it's still going to be good.
I'm looking forward to it
If "frame warp" reflex can take inputs during the "fake" frames, and represent that on screen, that might be something interesting.
Still though, one thing it can't do is show an opposing player peeking from round a corner, it will continue to generate frames without new information from the game engine. That's not a problem in single-player games with slower action, but sort of makes little sense for CS or Valorant. Maybe the overall motion clarity will help with mentally de-cluttering the screen though, that's the principle issue with motion blur in FPS games, you don't know what to look at (or, more importantly, ignore) when moving the camera around.
If frame generation helps further de-noise a moving POV, then even without more game engine info that would be a benefit. Not sure I want to be playing the games it benefits at less than 200 frames starting though, so I'd probably only go 1:1.
We can always use Reflex 2 without FG and that will be amazing for competitive players
The only issue I have, and someone please correct me if I am wrong, is that this software cannot be used in exclusive fullscreen mode, right? That's a problem because windowed fullscreen causes some latency/fps drop.
Win11 resolved that. They changed the way windowed fullscreen is drawn, so it now has the same latency.
Full-screen doesn't work
they are cooking as the kids say
Do I need this software with an RTX 4080?
It works for Videos too
I get insane mouse latency using this app.
Increase the base FPS. Wait for Reflex 2
With DLSS 4, input lag increases even more, and AI assistance will handle aiming for you, allowing you to focus on other aspects of the game. This feature is called Nvidia AI Auto Aim.
25:00 PREACH BROTHER
How do I use RTX HDR with Lossless?
It doesn't work well, because RTX HDR gets enabled on both the game and Lossless Scaling, so you have HDR on top of HDR, and that's why it looks blown out. This is an issue they're aware of but haven't been able to fix.
Lossless Scaling is extremely amazing. There will come a time when every issue is fixed, unless AMD, Intel, or Nvidia acquires the company to kill off this software.
Do you know the frame rate limit to our visual perception?
Yes, but the explanation is long. Check out the last livestream, where I answered that. Basically, we need at least enough Hz to keep up with your individual maximum eye-tracking speed, and then some, to fix the stroboscopic issues of finite refresh rates.
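One way to picture that relationship (a simplified sketch in the spirit of the usual Blur Busters reasoning; the tracking speed and 1-pixel step target are example assumptions):

```python
# How far a tracked object "jumps" per refresh at a given eye-tracking speed.
# Example numbers only; the 1-pixel step target is an illustrative assumption.
def step_per_frame_px(tracking_speed_px_per_s, refresh_hz):
    return tracking_speed_px_per_s / refresh_hz

tracking = 2000   # px/s, a fast but plausible eye-tracking speed on a 4K display
for hz in (60, 120, 240, 1000, 2000):
    print(f"{hz:>4} Hz -> {step_per_frame_px(tracking, hz):.1f} px jump per refresh")
# To get the jump down to ~1 px (no visible stroboscopic stepping) at 2000 px/s,
# you'd need on the order of 2000 Hz.
```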
@@plasmatvforgaming9648 I'll check it! Thanks man
You don't have to use the beta to crop the display.
I don't remember seeing that setting, but maybe it was there before the beta. Try the regular one and see
RivaTuner has Nvidia Reflex, I think.
Where?
Try Lossless Scaling on your plasma. Find out what the native subfield drive rate of your plasma is: 240? 480? Because this program will eliminate the stroboscopic effect even on plasmas and CRTs. Give it a shot.
How?
@@plasmatvforgaming9648 When you are playing on your plasma, just enable the program like normal
Since there are a lot of 144, 165, and 180Hz monitors, how does the input lag feel with 72, 82, and 90 fps as a base?
It feels good, the biggest problem to solve is the artifacts
@@plasmatvforgaming9648 How are the artifacts compared to lowering in-game settings to get the equivalent frames natively?
Before that, I want that insanely gorgeous monitor.
S90D QDOLED is the best-looking display I've ever seen. The calibration is so amazing. Everything looks correct to me
@@plasmatvforgaming9648 Thank you 😊
Turbos are better but there is NO replacement for displacement. You need a minimum of Liters to get the turbo going. FG ~ TURBO
I get the car analogy
So you gonna get the 5090? its a better choice imo
Yes, I hope it doesn't sell out in seconds
I play everything at 1440p Ultra at 35/105 fps with it (3080). It costs 6 dollars, guys!
I tried Lossless Scaling last night and it's garbage. The input lag is double versus Nvidia Frame Gen, and it's full of artifacts...
I can't compare it with Nvidia FG because my Nvidia GPU doesn't support the Nvidia FG Feature
@@plasmatvforgaming9648 I get it. I have a 4090 and I tried it just to see what the fuss is about.
Watching these videos at 240fps is, let me tell you, simply awesome. Apart from your videos, watching movies at 240fps is totally amazing as well. I don't mind the artifacts that much because the benefits are huge. I didn't enjoy games with it, because when you get 90fps and do 2x, it results in only 120. That's a huuuuge performance hit and not worth it IMO.
How do you use Lossless FG in a browser?
@Superdazzu2 Same as usual: open a video, make it fullscreen, hit the shortcut, and voila!
Thanks for sharing!