Check this link to get yourself a comfortable chair this new year: www.flexispot.ca/flexispot-professional-ergonomic-office-chair-c7m?KOL&Threat+Interactive_C7M PLEASE READ FOR UPDATES & RESPONSES: 1. Watch this video in 4K (as in streaming settings) as any lower compression will hide the visual details discussed. 2. We would really appreciate it if viewers could share on giant game related subreddits. These prevent self promotion but these can really push our channel's success by the thousands. Watching the full video without skipping etc also boosts the algorithm. We know some viewers want to help us reach more people and these are the best methods to do so. 3. 11:36 consist of three 4K videos that refused to render properly with multiple tries. *4. Vote here to see who else thought which was which at 8:26! ua-cam.com/users/postUgkxdwlBoI9FKFU_1TMCFqW4CEzEYz21XiXk?si=mu3u-TiX23tXNnSG *5. Our TAA commands are pretty similar to the ones we discussed in our other videos with slight modifications and still having the same requirements such as 60FPS+, v-sync etc. *6. Please Note: At 7:40 an error listed "8 core 3.5GHZ" twice instead of once. *7. Stay Tuned
can you please explain why with pathtracing enabled an open world game like cyberpunk 2077 works on 60 series rtx card and b580 at almost playable (30-40fps@1080p upscaled)but for a linear game like indiana jones pathtracing doesnt work at all(20 fps at max)?
@@ThreatInteractivewould you discuss about tokyo xtreme racer being very optimized and even more with some community mods? it's pretty insane that the community can go optimize the game further that even a 550ti able to run it, despite the game using ue5
Thank u for your videos. I am a CTO of an IT company which specializes in commercial simulators (we're using ue5.3). We have used lumen, nanite, full dynamic lighting cause it's easy to implement and saves production time (costing framerate ofc). One day I've watched your video and said "lets try it the old way". We made LODs, optimized them, baked the lights (we do not have any destructible objects or day/night cycle), made changes to expensive materials, reduced drawcalls, etc. Boom, 60fps on 1060 on a 2x2km forest map with grass, good lighting etc, 1080p native. Triple the fps on a 4090 in 1440/4k. The simulator feels way more responsive, clients are happy, and it feels right Sry for my english btw, I'm not a native speaker
That's the core of the issue, Nvidia is pushing their worse stuff under the guise of "easier for developers", you went the old way and that exposes how unoptimized the new stuff is.
Is it a small company, because if it isn't, then unanimously deciding to dedicate time and resources to a project doesn't sound like a likely scenario.
@@parlor3115 We tested the traditional approach on the worst performing map to see what can be achieved. I am a CTO, so I took some guys to work with me on something that might work. It took around a week to do the job (the map is just a forest with some fields, river and a road). We saw the results, and now we're optimizing other maps the same way. It isn't the fastest process, but it is beneficial for us in the long run (lower PC requirements - more customers, especially with our target audience, where ok level GPU is 1070 and good GPU is 3060 or something). And yes, we're not a big company, we have only 27 people
I will pick some vaseline with path tracing than raster with the sharpest textures. Path tracing looks like a movie / real life, raster is still a fantasy game with loads of bad / incorrect lighting, shadow, reflection details that break immersion
"Threat Interactive just hates Nvidia!". Threat Interactive: "No, AMD is WORSE". This isn't about stanning brands or some shit. This about fundamental problems in the industry. Another great vid TI
Yeah, Jensen and Lisa are family. I don't get fanboys when both CEO can go to family dinners and laugh at you. Your opinion on AMD being better price to performance are irrelevant to them.
@@dingickso4098 They are like Mafia with AMD agreeing to lose desktop GPU market but get console market (outside of Switch) while Nvidia agreed to have entire desktop and laptop market, AMD dominates now CPU market but it's because of intel incompetence not their false competition. Anyway most of the money are in the AI and data centres and gpus/cpus are just a side gig (about 40-60% of revenue).
@@dingickso4098 when AMD is charging $1000 for a videocard with RTX4070 level of raytracing performance - you know it's a complete BS, like AMD is actually WORSE at price to performance. So I have no idea how people not see it.
Literally every game on UE5 drive me crazy because of the smearing, blurry, grainy stuff and ghosting effect. Hell, the games on UE5 doesn't even support MSAA anymore and you can't get a good image
@@vitaliyleopard7309Try Tokyo Xtreme Racer, it looks very good on the lowest setting. That's the only UE5 games I know that looks "good enough" on lower end hardware
One of the big promises of higher resolutions was to eventually reach a point of not needing any Anti Aliasing at all any more. No need for computing more than the actual pixels on screen again, no down or upscaling, no camera jittering, no blur, and a pixel perfect match between inputs and what you see on screen. That is the future i want
@@theanimerapper6351 what does that even mean, deferred rendering is essentially the same output, unless you start rendering stuff in the pipeline below screen resolution and upscaling it. I am currently writing a deferred renderer for a module at university and the output is identical to a forward renderer - the only specific caveat here is that all the G-buffers are at screen resolution.
@@Berserkism to get rid of aliasing it isn't about resolution per se, it's about pixel densities. Your eyes can resolve a crazy amount of detail so you would need absured levels of pixel densities to achieve it. You also end up with temporal aliasing issues as well and solving those isn't something you can do with a resolution bump either. It instead requires motion blurring, but unlike the blurring we get in video games, it instead is a compositive of many more frames condensed down into one. It's essentially the entire operating principle of TAA. TAA only works well at extremely high frame rates so it has enough frames to sample from. But I digress, there is too much nuance to fully reply in just a YT comment...
The GPU market is a perfect representation that most consumers don't understand what they want or are talking about. There is no universe in which a 24 GB GPU is needed to play video games. However, rather than people calling for better optimization, the defacto solution is "just buy a better card bro". I hate this hobby.
Literally the same thing that happened with IA and deepseek, everyone is going the "invest in more processing" way instead of the "be more efficient" way. Until someone disrupts it, the bubble keeps growing.
It’s not the same. It’s true that deepseek r1 shows very good results compared to o3. However, it was trained on openai model output. If openai hadn’t trained its billion-dollar model, the million-dollar r1 simply wouldn’t exist.
@arcarius8169 it is the same though, because we're talking about optimization. Deepseek slightly outperforming gpt isn't that wild, but what is wild is that it does that with severely less performance costs. Them saying "screw CUDA" and jumping down to PTX is literally what made it all possible. They worked within their limitations, optimized, and open sourced. That's exactly what's needed in the industry from AI to graphics.
IKR?? No one takes a moment to stop and think maybe we need to optimize shit instead of just throwing more power at it. xD Like coding but instead of writing clean efficient code you're like "eh ill fix/refactor it later" when its already huge and would take a lot of effort to do it. Yes this is faster, but its so bad... I often joke saying "we'll get it in post" when either coding or drawing, but these companies actually think that way.
@@arcarius8169 There's no concrete proof for this whatsoever. Just because the model sometimes mentions openai doesn't mean it was trained on its output. Remember that gpt 3.5 and 4 shitted up the internet irreversibly and filtering it out from the datasets completely is pretty much impossible. Also oai doesn't show their reasoning tokens so that couldn't be stolen either. Most of the stuff is explained pretty clearly in their paper and math adds up.
Deepseek isn't that low on processing as it's a tier 2 AI model that uses other models data. It's in no way more efficient to train. It is a tad more efficient to run as it a pre condensed form of the top LLMs.
Unfortunately lots of others (rich snobs) are calling you ignorant in pc gaming community. They don't want to see the games being optimized. They would rather through money at the problem
He is a bit biased, but most of his critics are often less knowledgeable. The point is to not be tribalistic about it and look at facts. Temporal techniques were also invented to solve things that classics methods couldn't handle like specular shimmering//shader aliasing. He even admits in this video that destructible walls require raytracing for good GI like Lumen, so he is partially admitting that this "bad" technology wasn't without a real need. We now have a lot of games with dynamic environments and those older rendering methods he prefers CANNOT handle them properly, so some of the stuff he hates are genuine innovations to make games better. It's more of a mixed bag than the way he describes it, but the overall premise of his videos are correct: game graphics are becoming quite a mess and many expensive techniques are used in games that do NOT need them (e.g. using Lumen in a static game is insane).
@@kazioo2I think the only reason you see people be 100% against this tech is because it’s gotten so out of hand. I mean cmon, games today can barely even run
They only valid reason for calling this guy ignorant I see can be summerized as : "what ? You mean that, with a little bit of effort, instead of relying on bloated "Auto" presets, that game could run as good on a $400 GPU than it does on my $2000 one ? Impossible ! Only suckers throw their money away, and I can't be a sucker, only everyone else can !"
Yall ever notice how games devs put more energy into attacking random people than they do corporations? Like all these devs suddenly come out the woodwork to slander threat interactive. But why are they silent when it comes to talking about the horrors of the industry? Where were they when Activision was a horror house? And where are all game dev discords and videos dedicated to talking about bad management and corporate greed? They seem to only come out when it’s regulars people. Remember when game devs attacked Larian and from soft? Mf how about yall attack an executive or something?
because these devs are the college students or fresh off of college who spend an entire time in UE. Instead of focusing on making games, they chase that "cutting edge" high which they will never going to use.
Its getting funnier when he is saying shit about RT,DLSS and FG quality and performance (which is true) and then blame AMD why they didn''t invent it first xD
They always have, it's how Gameworks was conceived, and literally how it worked for over a decade. It's always introduced effects into games that their latest cards were uniquely capable of handling, but their older models (and the competitor's models too) couldn't keep pace.
Hey there, as a 37-year-old physician with a deep passion for video games (and a strong disdain for poorly implemented antialiasing and blurriness), I rarely comment on UA-cam, but I felt compelled to chime in here. First off, I really appreciate the overall message this channel is trying to convey-it’s important and resonates with me. That said, I want to offer some sincere advice: the message might not reach as far if it’s delivered with hostility or egotistic bravado. I’ve seen it time and time again-when the tone becomes confrontational, it often pushes people away rather than drawing them in. I’m sorry to pull the ‘I’m older, so I’ve seen this play out before’ card, but it’s true. A more constructive and inclusive approach could make a world of difference in spreading your message effectively. Also, I think it’s worth mentioning that visual topics like these often land in subjective territory. When you’re younger, it’s easy to want to see everything in absolute objectivity-black and white, right and wrong. But as you get older, you start to realize how important subjectivity really is. People experience and perceive things differently, and that’s okay. Embracing that nuance can lead to more meaningful conversations and connections. Keep up the great work, and I hope this helps!"
I also made a comment advising on better choice of words and not using such a tone of voice, totally agree that it affects the number of viewers who are willing to listen.
When DOOM came out in 2016 its minimum-spec GPU was four years old and had an MSRP of $400 (GTX 670), but by then you could get 50% better performance from a $200 GPU (RX 470). When The Dark Ages drops, its min-spec GPU will be nearly six years old and also had a $400 MSRP (RTX 2060 SUPER). Do you know how many GPUs you can buy for $200 that offer 50% better performance than a 2060SUPER? Here's a hint: it's the same as the number of GPUs that you can buy for $300 that offer 50% better performance. The closest you can get is $400 with GPUs like the RTX 4060ti, almost a whole six years later. When DOOM'16 required a four-year old GPU as its min-spec, that was reasonable because GPUs were doubling in performance every 2-3 years for the same cost. That hasn't been happening these last 2-3 generations and pretending that it has by making your game require ever-burlier and more-expensive GPUs just means a smaller audience and less sales for your game. If in 2016 all I had to do was spend $200 to have a GPU that was 50% better than DOOM's min-spec, then I could afford a new GPU and the game. If today it costs $400 to get a 50% better-than-min-spec GPU to play The Dark Ages, I'm either going to buy different games or upgrade my GPU and not buy any games!
Windows and game studios seem to be conspiring with computer manufacturers to encourage people to buy new machines way faster than they need to. Or if they're not doing that, they're doing something that doesn't look like anything but that.
@@yurisei6732 Supply-side economics. The manufacturers gotta keep manufacturing or else their profits go down, so we gotta keep coming up with excuses for consumers to buy new stuff even though their old stuff isn't broken. It also generates truly astonishing quantities of electronic waste.
And my rx 5700 xt with -10% performance as rtx 4060 won't run it since i don't have a mesh shaders. Yeah amazing, FF7 Rebirth and Indiana also can't open the exe file of the game. Modern devs are ultra lazy.
Im holding the line on FB and Reddit forums that some graphics devs are lazy for relying too much on raytracing. Pre-RT, we had a lot of great games that have terrific global illumination. I'd rather have proper GI than some half-assed RT implementation. I believe RT and GI can co-exist, but it's just wrong to expect RT will be the standard in the future unless they fix the hardware performance tax to run it well.
It’s actually really profound how good developers got at lighting before ray tracing. Like I don’t know much about tech but it really seems like such a modern marvel
Literally the only thing you cant do well without raytracing is reflections in openworld (third person) big city aka spiderman games. Every other scenario rasterization is better.
Yeah, I'd say that the current global Illumination with probes, or voxels is generally good enough and can be accelerated further by utilizing RT. But throwing it away altogether isn't the way, imho.
You may call it lazy, but try doing the "right" thing in their shoes. Invest 3x as much into graphics development (which then lowers the scope of your project). Then add double the cost in labor and time to release window. Now see your "lazy" competitors run away with all the pre-sales hype while you're left optimizing your game the right way. At the end of the day, it's a job like any other, and it takes someone truly special and also financially well backed, to even consider going against the stream. Instead of calling them lazy, call them non-visionary, or just doing the work they're paid to do, etc. Ofc with the few devs who get hostile about this stuff and pretend they're doing a great job when they're not -well they can be lynched for all I care. But in general most devs are simply not paid to do good work anymore, because the alternative is too profitable for all the competitors to run them out of business real quick.
There's a massive difference between per pixel RT solutions and the olden day GI stuff, though. Especially foliage is just impossible to shade correctly. However, modern development seems to be all about optimizing cost instead of performance. And consumers don't really have a say because everything is UE5 now and using the same shitty unoptimized solutions. It's an enshittification engine.
So, Linus is speaking about something he doesn't really understand, misinforms the viewers, and shills expensive hardware? Wow. I mean, who knew? Surely, that never happened before XD
Tbf Linus's team is way too small and egotistical to do research or fact-checking beyond a few google searches. Its the Walmart of tech review channels. You'll get information, whether its good or bad is a gamble but at least it was easy lmfao.
@@mocchi3363 his team isn’t small he downsized recently but for a while he had 120 employees. But they focus on marketing to the lowest common denominator and have basically decided to market to middle school age kids who know nothing about computers
@@Akkbar21 journalistic standards that #LIEnus himself does not follow and is only using as a shield when someone exposes their lack of integrity and internal processes
journalistic standards matter to who? the same video where LTT cries about GN, they make sure to say "we aren't journalist" so he doesn't get held to the same standard
Absolutely. All their content has gotten somewhat reminiscant to those home shopping channels on TV. It feels to me like 99% of what they do nowadays is to advertise products they get sent their way. The fun/comedic content has gotten so scarce, because it simply doesn't make them enough money and LTT has long become a company like any other where making more profit than last year is the only goal. For actual hardware news (not advertisement) and benchmarks I trust only Gamers Nexus and der8auer.
I'd argue Pixar is actually a great example for how important assets are. Despite bidirectional path tracing being known since 1997 and supported by open source renderers like LuxRender since at least 2011, they only started using it in 2016 with Finding Dori, and it didn't matter for the movies before because they always had high quality assets. They're merely not photo-realistic.
It is a good card but one which is sadly left behind with newer titles , you might need to upgrade soon. If you ask me , I reckon mainstream $350 GPUs should by now be at 3080/6800XT level. I bet you the 5060/9060 won't match that bar , the 4060ti might do but not for $350
the 1080 ti is still great. even with its age it still competes favourably with brand new entry level cards and plays well optimized games beautifuly. with nvidia putting the gtx line into legacy support for security only soon, new aaa games would run less optimal. but those are badly optimized mostly in general. worthwhile titles will run well for a long time. the greatest gpu price to performance to longevity to versatility of all time.
Don't you mean response time rather than refresh rate? Not really the same thing (although most if not all high refresh rate monitors have short response time)
Bro could you please make a tutorial series purely showcasing optimisation per skill? (As in textures, materials, lighting, etc. for Unreal) I'd honestly pay a bit for something like that provided it is giving good and quality information
@ no need to be a dickhead. I watch LTT for entertainment. Not for being the most reliable source of tech benchmarking. I haven’t said anything negative about this vid
4060 is above msrp now and the 3060 has increased in price too, its like everyone knows we hit a dead end, a plateu in technological development. The 5080 is a joke and the 5090 is barely any better than the 4090. Yet devs insist that we must use dlss and frame generation and nvidia is just trolling us at this point with "multi frame generation" Come on. Its obvious that unless devs stop improving graphics, then no one will be able to play the games, the hardware just isnt powerful enough and unreal engine 5 is terribly bloated to the core. I just play old games now, they just work and are better, also finished without the need for endless updates and patches.
I got BF4, a game from 2013, that can render perfectly clear at native 4k WITH upscaling and run VERY good compared to literally every game I've played that has DLSS. It looks so much sharper than any DLSS game that it's not even a remotely fair fight. Sure the lighting, models, and textures aren't up to modern "standards" but the core rendering is so clear those details fade away fast and I'm left with a far nicer image to look at that's far easier to tell what I'm looking at when it isn't hidden behind a blur fest. It really puts into perspective just how much better rendering used to be (and I'll argue that older games than BF4 have even clearer rendering).
@@TacticalSheltie BF4 is whatever man, the real revolution was BF3 experimental MANTLE implementation - game that previously ran on integrated GPU (AMD A-series CPUs with integrated graphics) in like 720p10fps was hitting nearly stable 1080p 60fps on the same hardware. It was the base for future VULKAN development but we never seen such gains ever after.
It got to a point where it's so blurry that I thought I was looking at a video with tons of compression, and my brain turns off when I move the camera quickly to prevent nausea.
Point is, game company are unwilling to spend their development budget on optimization, because why do the work when you can tell the players to put in their own money both for the game, and for the hardware necessary in order to mask the poor optimization aspect of their product.
I really love the fact that you use true black for background. Soo good to watch on amoled. So many videos use i dont exatcly know what. Shining black i guess which looks like lcd.
LTT once again getting dunked on by someone other than GamersNexus and they think it's only GamersNexus who has a problem with how they perform their Journalism and content...
@@marcogenovesi8570GN did call himself a journalist many times without holding himself to journalist ethics which is wrong. He magically stopped calling himself a journalist after Linus called him out, strange isn't it? This video is a critic of LTT, which is not the same. There is no critics ethical code. Threat interactive could talk shit about Linus however he likes. By the way, yes right to respond is critical in journalism. Coffeezilla for example always reaches out to all the people he instigates, no exception.
I may not be able to do much besides liking and subscribing to your content, but I am very grateful that you stating this is still more than enough to keep spreading word about this matter. Give these bad graphics hell, Threat Interactive.
Honestly, considering the insane levels of optimization in old games etc, I wouldn't be mad if a modern pixel art game was using 4-10 times more compute than it would when compared to the old gen. That said, there are pixel art games that are using 50-1000 times more compute. (Or more. These aren't hard numbers, more just pulled out of nowhere to illustrate my impression. ) Heck, any 2D game should run smooth on any modern hardware that isn't extremely weak. While I think Hollow Knight might be one of the best games ever made, performance wise it was a total mess until they released the switch port where someone who knew how to program fixed a ton of their code. Hollow Knight, if coded well, should have been able to run on a potato. What's sad is that it looks like this idea of developing for ultra-high end specs is prevalent in the whole industry. I honestly think every dev team should buy a $1-1.5k laptop or have some computer with equivalent specs, and be able to run their game on it at reasonable settings. Because scaling up graphics from that baseline is far better than scaling down graphics from a broken baseline.
Core Keeper is a banger example of your point. I tried to run it on my work laptop once and got FPS in the mid-30s on an i5-1135G7. Just like, think about it. This is a pixel art game barely sustaining 1080p30 on what's effectively a midrange 700-series card.
better yet, they should buy a refurbished laptop, preferably an older model from like 2017 or so. Ensures that they cover the vast majority of potential customer hardware.
I feel so lucky to have discovered this channel and the like-minded community around the demand for more optimization - or at least a dissent against the ever-increasing complexity and price that come from laziness and greed. Thank you, Threat Interactive. And thank you, the caring gamers.
Optimization is one thing but I would really, really want games to stop looking like it has motion blur cranked up and was played on poor quality VA panel. It's just smears man... it's night and day playing recently released game vs game released in 2010-2019 on OLED...
yea, idek if amd wants to sell GPUs... they have virtued this next gen will be different but we'll see. AMD's best advantage on PC is their APUs, however, they don't necessarily have that on lock, Given a few more years, I wouldn't be surprised if Nvidia is making APUs based on ARM, then Nvidia spreads the same disease there. Intel is crashing real hard rn especially on mobile
Nvidia created a problem and offered a solution. Moores law is dead, therefore they couldnt make generational leaps in performance to stay ahead of competition like they used to. So they implemented raytracing and dlss. Now that dlss and raytracing is a must, they offer you videocards with that functionality, and they can boast how they tripled Tensor cores, while the raster performance is barely 30% more.
1:40 In all seriousness, having 2D pixel art games that don't just sit in the single digits for CPU and GPU use bugs me more than any poorly optimised 3D game. Running an emulated SNES game on my Steam Deck makes the power draw go _down_ from the Steam Big Picture view! And that's not even native code. I know, a lot of pixel art games still use 3D assets and particle effects but still, it seems absolutely unreasonable how comparatively resource intensive a lot of them are.
Yeah this one is wild. My little linux laptop has made me respect devs that use barebones engines so much more. Also take a look at PICO-8 and the upcoming Picotron, they're like programmer toys that make devs restrict themselves to a format that runs on basically everything lol
You're surprised a modern game is more demanding than a SNES emulator? I think you're underestimating just how much pixel art games have advanced since then
Love how you're shining the light to everyone in this industry! Learned so much from all the videos your team makes and it's trully stunning the differences you can make with such optimization. Keep up the amazing work
I really hope Lisa Su finally makes Radeon do something different. With Ryzen AMD did change things, like offer more cores at first, then chiplets and now the 3D V Cache. And it payed off in a big way
It's a cyclical investment. First they have to make a card that can do something cooler then get that software into major engines. Nvidia is very proactive in sticking their noses into game development, they target the most popular engine (UE) with plugins and even have their own engine.
I really enjoy your videos, however i want to give feedback what personally I think you could improve. Your videos are very fast, which is generally speaking not fast, however you put a lot of information on your slides which have to be manually paused to understand them. Also when you do comparison screens that you overlay you could switch in between a few times or mark the relevant parts. Apart from that, great videos as always!
There are some people out there that are still trying to play the newest games on 10 plus year old cards. They don't understand the importance of upgrading
Finally, someone consistently calling out the broken mindset of modern game graphics! Your deep dives into Nvidia's tactics and Unreal Engine 5's terrible optimization expose exactly what gamers have been frustrated with for years. The industry needs more voices like yours pushing for real optimization instead of pointless upscaling and performance compromises. Keep up the fight!
I always hate the excuse it gives devs more time to do other stuff optimisations don't take forever and you can still use RT in offline settings then bake that lighting for us gamers to not waste our electricity and compute on. The second excuse they give for no longer baking lights is it costs HDD space when SSDs price per gb is at an all time low.
The Thanos of video game graphics. Dread it. Run from it. Threat interactive arrives either way. “One does not usually consider fun when balancing unreal engine…. But this does put a smile on my face” -Threat interactive probably.
There's not enough history for it to become a standard just yet. 240p60 and 480i30 were standards because CRTs were manufactured to receive signals that way in most countries, but that dates back from the 60s. Only in the mid 2000s we really broke this scanline per second barrier with HD and digital standards, but those were in the making well before that decade. 60hz still is the norm, even if higher end products have been changing that. It's good to think about a future where we target 120 fps, but that's not a reality worth pursuing today simply because it's too much of a premium niche yet.
Unless the everyday consumer becomes more technically minded "like in the good ol' days" & sees through the capitalism then sadly we can just shout in the void about this, as the everyday consumer couldn't give lesser of a fuck how their game looks like as long as it launches & they can play it, framerate & graphics be damned
Consumers didnt make good decisions in past and were even more uninformed/missinformed than today. Some of the biggest companys in PC world menage to survive horrible product lines or even few horrible product lines in raw and somehow still outsold competitors who offered better products at every price level. That includes his beloved Nvidia that historically looking doesnt deserve success they have today. People were not voting with their money back then and they do now vote now eather. We live in a world were 70USD unfinished games sell over 10-20 million copys, RTX 3050 outsold RX6600, RTX 4060TI is outseling RX 7800 XT and more AIOs coolers are sold than air coolers in some markets.....just to name few examples.... thats what consumers are.... SHEEPS and always will be.
I only needed to watch until they start saying "RT helps performance" and that's all I know they either are stupid or are intentionally misleading their viewers. There is NO WAY someone who runs a tech UA-cam channel is stupid enough to believe that.
It's going to be so neat having to pay 3k USD for the 5090 for games in the next 2-4 years. Simply because games are going to get worse, optimization wise. And by fun i mean I wont and will stick with my 4070TISUPER. If these developers gave a damn, the 1080 Ti would still be running practically every game today at least 30+ fps, with 60 being viable with some settings tweaks. Also can we get rid of motion blur? I hate that.
Yeah , I can't see how it is sustainable to need $3000 600W space heaters to run your game at 4k with an ideal experience. I think that now that AI uses GPUs there will be a hard limit for how much horsepower is economically viable for gaming - and we might get there with the 10th console gen.
@@tourmaline07 I mean, at this point we are hitting the limit of household wiring (at least in the US) Computers can't really pull too much more power.
The sad reality is that a chunk of the console market still use motion interpolation on their TVs. It's hard to shift the mindset of the layman when they don't realise their devices could push out clean, clear frames at incredible quality if only the industry still prioritised making games run on broad hardware. For almost every solution raytracing claims to fix, there's been an established alternative in use by developers for years that wouldn't have brought 90 tier cards to their knees but provide an impressive presentation. I recently took another look at the Battlefield 3 F18 Hornet mission and I am still impressed at what we were able to push on 7th generation consoles.
I really respect the realistic approach to breaking down the GPU market. I had always suspected that AMD was not a serious threat to NVIDIA but didn't know exactly why. Your explanation of AMD not having a clear vision for future graphics makes a lot of sense and your criticisms were fair and not held back. I am hopeful your studio will see success because we need more leaders with a VISION to bring the whole industry forward at this point. Keep it up! I will continue to watch your videos and hope for your success.
Great content! That MUST be more popular. As a gamers we just should STOP buying games, unless game is optimized properly. Voting with your wallet is the most effective thing.
Great video man, graphics in old games like metal gear solid V : TPP were so good on my potato pc, so what I didn't understand is how 10 years later, games have worse performance and visuals even when I now have 5x the compute. Good to see I wasn't imagining it.
I like GN and LTT for what they are, but I’m no fan boy of either. I felt Steve is high on his own self righteousness and thinking he’s above the rules. Then he gets with Rossmann, the king of self righteousness and doubles down. Don’t call yourself a journalist and then think you can play fast and loose with the rules because you’re on the side of justice. Linus isn’t perfect obvs, but he was right on that one point.
You should not let anything Linus says bother you. Linus acts like he knows everything. He once said something in one of his videos about AMD engineers not knowing what they are doing or thereabout.
I can't tell if it's my own bias and you're just reinforcing my prejudice or if things are very rotten. Maybe both. It's the capitalist incentive I think, money. AAA games aren't made by creatives with a shared, singular vision. They're made by 1000 people working for a huge corporation. The trend is that things are going to get worse even though the hardware continues to improve drastically every generation. What a strange thing!
Wow. I've learned a lot. This video is very educational and has really stated how I kinda felt deep down in while, but you were able to put it into words. Thank you.
Raytracing should be a developer tool to have a reference point on how a scene's lighting should look, not a feature for hardware to brute force and cheat/lie about quality while also being expensive to run in real time.
See Source 2 engine: When creating a map in Hammer 2 editor, it uses Ray Tracing. It does not run without a hardware RT card. But this is the editor. It uses RT to preview the lighting that will be baked in when completing the map. The playable map will have very convincing fake GI and RT, but it does not require any HW RT card, and runs very well. Yes, the map is static, but the physics entities has some kind of GI on them, and I do not know how Valve did that.
What is ironic, is that Indiana Jones's modified idtech is still generations faster than UE5. Still does not look really better ten new order/new colossus, on the same engine. Shame they decided to use TAA only and not have MSAA. Also, most shadows do not seem to be really ray-traced anyway.
Do you think its possible that eventually we will have AI tools that are specifically designed to help optimize games, or is that something that's outside the scope of ai and needs to be done manually?
This is something we have been advocating for awhile. AI is already being heavily integrated into game-dev but in the wrong ways. We've discussed this topic is several of our videos. I highly suggest watching each one. We only 10 at the moment but each one is a dense presentation that gets you up to speed.
I’d love to see you and Alex Bataglia have a debate on this. I feel his coverage and dismissal of devs abusing upscaling as well as his unhealthy obsession with RT encourages this kind of behavior in the industry.
Honestly, I don't dislike the use of AI, if used right- When it was just "render the game at lower resolution and upscale it" it was fine (if it was like 50-80% native), the input lag diff was minimal and in non competitive games it was fine. But the insane input lag compared to native I've experienced so far with any "generative AI / extra / fake frames" is in my opinion absolutely unacceptable. When I move my mouse, I expect the character on screen to move along, not like a bungee that charges with my mouse movement and then follows along after. Yes, I am sitting on my 1080TI and yes that might impact the experience I have with frame gen ( I very much expect the new cards to have a lot less input lag with the gen AI). But even after seeing it on a system of a friend with a newer tier card (30-series) I felt the bungee effect and I detest it. I love the sharp feeling of my aim starting and stopping where and when I move my mouse / to. It's a clear difference in immersion. Also, why should I upgrade to a new gen card when I *TO THIS DAY* can get 60fps in 1440p in most NEW games with a little settings magic around high / medium settings. And easily 100-200fps in actual optimized games (i.e R6S / MC /... I don't really play Fortnite but that one too.., Titanfall 2 (Older but GOAT, btw, where part 3?)) The only games and scenarios where I don't get 60fps is in new releases when they rush the release. I remember Cyberpunk release (30fps). Hogwarts Legacy release (40fps + 20 minute crash cycle) and many more. A few weeks / months later, most of them run no problem on my System at 60+ on medium / high. If the "AI" frames would be skipped when the next real frame is there I think it would remove / minimize the input lag while providing smoother gameplay, maybe become a bit more choppy but still more reactive. (Not a frame gen expert obv. but maybe sth like that?) I was appalled when I downloaded cod6 2 weeks ago and the standard settings for the game were up scaling at 25% resolution + ai frame gen, at high. I got choppy frames, input lag and washed graphics.... *WHAT* . After switching to (imo) optimized settings with frame gen I got 80fps at high but ofc, the bungee effect again.. So I swapped to native and low / med and got my 60-80 fps with sharp reaction times. (The graphics still look great at med / high-low when keeping model textures at higher levels while turning environment and shadows lower) But even that they would have all that Ai and up scaling on from the start.. my game looked like a wet towel smeered over an oilpaining of the game every frame. Everything more than 10m away from the character looked blurry and text unreadable. (Compared to Native) (Again, 1080TI, this obviously makes a difference, but why would you turn all this -shit- on when the game renders just fine on native frames?) It's like they don't even test and optimize for anything lower than 30-series (and whatever amd alternative that year / release model ) anymore. It feels like the time again when my PC with all that I could afford couldn't even run Overwatch1... I get it, better graphics, larger games.. But what do the graphics help me when the game settings don't even show me the beautiful graphics the designers made, down scaling it to 640*360 and then upscale it again + fake frames? Games should be immersive, and again, input lag doesn't help that.. Idk, imo there needs to me more work done, every game release gets huge criticism for bad fps / optimization / ..in general? Stop rushing games, if the game is good you get the money, if you release it in a bad state you don't.. Why are they still releasing the games unfinished.. Are their market researchers that blind or is it the directors board that can't get their heads around polishing and optimizing? -Or do the devs just not want to do optimization?- ( I doubt the last, I'm sure the devs also want to release a good, fun and well running game) Rant over, don't mind it to much. Just annoyed by the way the games industry is going. Most games run fine and are really fun still.
@@silvershines Not native, yes. But you can use software alternatives. OFC they aren't "as good as native" but I think they should be good enough to handle them.
Nah I have a 4070 super and lossless frame gen for me is way better then the current dlss FG, I use it over it (Don't know about the transformer models FG) only issue is it takes a effort to make lossless FG run properly @@Jxdiac
@hamzakhalil-gs7oc with lossless frame gen do you mean the steam program or a certain frame gen setting exclusive to the 40-series? I only had a look at frame gen using the 30-series so I dont quite know about the options the 40 series has.
Your message is correct in its core. I am also a supporter of what you are trying to change in the industry. Though I would strongly advise to change the density of the videos themselves or also produce videos that focus on broadly understood language to explain the wrongdoings and approaches you are trying to get. I am watching your videos at 90% speed and still it’s a lot of information plus text on slides that are shown for a second or two. Tldr. I think your message and passion would reach more people if there is a simplified version of your content. Maybe increasing the team to get some proper marketing experts in, to help with formulating core value propositions and the messaging of those. Thanks for your work.
This is the exact reason why I really stopped trusting anything "Technical" from LTT. Time and time again they are called out for inaccuracies. With the reach they have it makes it multiple times worse. Even excluding all of the recent drama or this video I often listened to their content asking myself "WTF they are talking about?" They really need to fix that as nowadays they only hurt the industry instead of helping us like it used to.
I'm worried about the nex DOOM. The baked lighting in Eternal looked amazing, but now with the mandaroty ray tracing I feel like it will look so bad on low / medium settings like IIndiana Jones whereas it could have looked ok at those exact settings and run amazingly well.
Check this link to get yourself a comfortable chair this new year:
www.flexispot.ca/flexispot-professional-ergonomic-office-chair-c7m?KOL&Threat+Interactive_C7M
PLEASE READ FOR UPDATES & RESPONSES:
1. Watch this video in 4K (as in streaming settings) as any lower compression will hide the visual details discussed.
2. We would really appreciate it if viewers could share on giant game related subreddits. These prevent self promotion but these can really push our channel's success by the thousands. Watching the full video without skipping etc also boosts the algorithm. We know some viewers want to help us reach more people and these are the best methods to do so.
3. 11:36 consist of three 4K videos that refused to render properly with multiple tries.
*4. Vote here to see who else thought which was which at 8:26!
ua-cam.com/users/postUgkxdwlBoI9FKFU_1TMCFqW4CEzEYz21XiXk?si=mu3u-TiX23tXNnSG
*5. Our TAA commands are pretty similar to the ones we discussed in our other videos with slight modifications and still having the same requirements such as 60FPS+, v-sync etc.
*6. Please Note: At 7:40 an error listed "8 core 3.5GHZ" twice instead of once.
*7. Stay Tuned
can you please explain why with pathtracing enabled an open world game like cyberpunk 2077 works on 60 series rtx card and b580 at almost playable (30-40fps@1080p upscaled)but for a linear game like indiana jones pathtracing doesnt work at all(20 fps at max)?
Btw you forgot the answers to the blind test at 8:26
@@waterbottler2761 Go vote!
@@ThreatInteractivewould you discuss about tokyo xtreme racer being very optimized and even more with some community mods? it's pretty insane that the community can go optimize the game further that even a 550ti able to run it, despite the game using ue5
@@ThreatInteractive thanks!
Thank u for your videos. I am a CTO of an IT company which specializes in commercial simulators (we're using ue5.3). We have used lumen, nanite, full dynamic lighting cause it's easy to implement and saves production time (costing framerate ofc). One day I've watched your video and said "lets try it the old way". We made LODs, optimized them, baked the lights (we do not have any destructible objects or day/night cycle), made changes to expensive materials, reduced drawcalls, etc. Boom, 60fps on 1060 on a 2x2km forest map with grass, good lighting etc, 1080p native. Triple the fps on a 4090 in 1440/4k. The simulator feels way more responsive, clients are happy, and it feels right
Sry for my english btw, I'm not a native speaker
That's great to hear!
That's the core of the issue, Nvidia is pushing their worse stuff under the guise of "easier for developers", you went the old way and that exposes how unoptimized the new stuff is.
Is it a small company, because if it isn't, then unanimously deciding to dedicate time and resources to a project doesn't sound like a likely scenario.
@@parlor3115 He is the CTO so they didn't "unanimously decide" he said "do it" and the employees did it
@@parlor3115 We tested the traditional approach on the worst performing map to see what can be achieved. I am a CTO, so I took some guys to work with me on something that might work. It took around a week to do the job (the map is just a forest with some fields, river and a road). We saw the results, and now we're optimizing other maps the same way. It isn't the fastest process, but it is beneficial for us in the long run (lower PC requirements - more customers, especially with our target audience, where ok level GPU is 1070 and good GPU is 3060 or something). And yes, we're not a big company, we have only 27 people
Stop Killing Games, and stop smearing vaseline on my screen!
I will pick some vaseline with path tracing than raster with the sharpest textures. Path tracing looks like a movie / real life, raster is still a fantasy game with loads of bad / incorrect lighting, shadow, reflection details that break immersion
@@deivytrajan but you will not see it, as you will have vaseline smeared on your display.
@@deivytrajan Looks like a 720p movie. Trash
@@marceelino womp womp
You will take your vaseline and you will be happy!
"Threat Interactive just hates Nvidia!". Threat Interactive: "No, AMD is WORSE". This isn't about stanning brands or some shit. This about fundamental problems in the industry. Another great vid TI
Yeah, Jensen and Lisa are family. I don't get fanboys when both CEO can go to family dinners and laugh at you. Your opinion on AMD being better price to performance are irrelevant to them.
That is the problem with a market duopoly and running after the other doing jack sh*t to innovate
@@dingickso4098 They are like Mafia with AMD agreeing to lose desktop GPU market but get console market (outside of Switch) while Nvidia agreed to have entire desktop and laptop market, AMD dominates now CPU market but it's because of intel incompetence not their false competition. Anyway most of the money are in the AI and data centres and gpus/cpus are just a side gig (about 40-60% of revenue).
@@dingickso4098 when AMD is charging $1000 for a videocard with RTX4070 level of raytracing performance - you know it's a complete BS, like AMD is actually WORSE at price to performance. So I have no idea how people not see it.
@@selectthedeadtriopoly to be honest
TAA's biggest opp
✋🏻🥸🤚🏻
Edit: This comment was "First" and has now been edited. Just letting you know
TAA was a mistake and cancer
TAA will never sleep well as long as this guy is still breathing
people misinterpret frame gen: It does not improve performance, but maybe image smoothness
he's very clearly not anti taa, but against it being used to hide poor development and cost cutting and pro consumer choice
The ghosting and blur with Unreal Engine 5 games is UNREAL.
But not as much..@BusyWeaverBeats
Literally every game on UE5 drive me crazy because of the smearing, blurry, grainy stuff and ghosting effect. Hell, the games on UE5 doesn't even support MSAA anymore and you can't get a good image
@@vitaliyleopard7309Try Tokyo Xtreme Racer, it looks very good on the lowest setting.
That's the only UE5 games I know that looks "good enough" on lower end hardware
DLSS4 has clean up a lot of it. It's like night and day compared to what we had before.
Say that again?
One of the big promises of higher resolutions was to eventually reach a point of not needing any Anti Aliasing at all any more.
No need for computing more than the actual pixels on screen again, no down or upscaling, no camera jittering, no blur, and a pixel perfect match between inputs and what you see on screen.
That is the future i want
Here I was thinking that was the whole point of 4k. Since when did we need AA for a 4k image? Oh, yeah, since this DLSS and Ray Tracing BS started.
@@Berserkism deferred rendering will probably always require anti aliasing. Even at 8k
@@theanimerapper6351 what does that even mean, deferred rendering is essentially the same output, unless you start rendering stuff in the pipeline below screen resolution and upscaling it.
I am currently writing a deferred renderer for a module at university and the output is identical to a forward renderer - the only specific caveat here is that all the G-buffers are at screen resolution.
@@Berserkism to get rid of aliasing it isn't about resolution per se, it's about pixel densities. Your eyes can resolve a crazy amount of detail so you would need absured levels of pixel densities to achieve it. You also end up with temporal aliasing issues as well and solving those isn't something you can do with a resolution bump either. It instead requires motion blurring, but unlike the blurring we get in video games, it instead is a compositive of many more frames condensed down into one. It's essentially the entire operating principle of TAA. TAA only works well at extremely high frame rates so it has enough frames to sample from. But I digress, there is too much nuance to fully reply in just a YT comment...
Back 10 years ago I had a 4k laptop screen. I was always amazed that if I turned AA off at 4k it would look better than AA on 1080p
The GPU market is a perfect representation that most consumers don't understand what they want or are talking about.
There is no universe in which a 24 GB GPU is needed to play video games. However, rather than people calling for better optimization, the defacto solution is "just buy a better card bro".
I hate this hobby.
Literally the same thing that happened with IA and deepseek, everyone is going the "invest in more processing" way instead of the "be more efficient" way. Until someone disrupts it, the bubble keeps growing.
It’s not the same. It’s true that deepseek r1 shows very good results compared to o3. However, it was trained on openai model output. If openai hadn’t trained its billion-dollar model, the million-dollar r1 simply wouldn’t exist.
@arcarius8169 it is the same though, because we're talking about optimization. Deepseek slightly outperforming gpt isn't that wild, but what is wild is that it does that with severely less performance costs. Them saying "screw CUDA" and jumping down to PTX is literally what made it all possible. They worked within their limitations, optimized, and open sourced. That's exactly what's needed in the industry from AI to graphics.
IKR?? No one takes a moment to stop and think maybe we need to optimize shit instead of just throwing more power at it. xD
Like coding but instead of writing clean efficient code you're like "eh ill fix/refactor it later" when its already huge and would take a lot of effort to do it. Yes this is faster, but its so bad...
I often joke saying "we'll get it in post" when either coding or drawing, but these companies actually think that way.
@@arcarius8169 There's no concrete proof for this whatsoever. Just because the model sometimes mentions openai doesn't mean it was trained on its output. Remember that gpt 3.5 and 4 shitted up the internet irreversibly and filtering it out from the datasets completely is pretty much impossible.
Also oai doesn't show their reasoning tokens so that couldn't be stolen either. Most of the stuff is explained pretty clearly in their paper and math adds up.
Deepseek isn't that low on processing as it's a tier 2 AI model that uses other models data. It's in no way more efficient to train. It is a tad more efficient to run as it a pre condensed form of the top LLMs.
Unfortunately lots of others (rich snobs) are calling you ignorant in pc gaming community. They don't want to see the games being optimized. They would rather through money at the problem
Yeah f them.
He is a bit biased, but most of his critics are often less knowledgeable. The point is to not be tribalistic about it and look at facts. Temporal techniques were also invented to solve things that classics methods couldn't handle like specular shimmering//shader aliasing. He even admits in this video that destructible walls require raytracing for good GI like Lumen, so he is partially admitting that this "bad" technology wasn't without a real need. We now have a lot of games with dynamic environments and those older rendering methods he prefers CANNOT handle them properly, so some of the stuff he hates are genuine innovations to make games better. It's more of a mixed bag than the way he describes it, but the overall premise of his videos are correct: game graphics are becoming quite a mess and many expensive techniques are used in games that do NOT need them (e.g. using Lumen in a static game is insane).
@@kazioo2I think the only reason you see people be 100% against this tech is because it’s gotten so out of hand.
I mean cmon, games today can barely even run
They only valid reason for calling this guy ignorant I see can be summerized as : "what ? You mean that, with a little bit of effort, instead of relying on bloated "Auto" presets, that game could run as good on a $400 GPU than it does on my $2000 one ? Impossible ! Only suckers throw their money away, and I can't be a sucker, only everyone else can !"
throw
Yall ever notice how games devs put more energy into attacking random people than they do corporations?
Like all these devs suddenly come out the woodwork to slander threat interactive. But why are they silent when it comes to talking about the horrors of the industry? Where were they when Activision was a horror house? And where are all game dev discords and videos dedicated to talking about bad management and corporate greed?
They seem to only come out when it’s regulars people.
Remember when game devs attacked Larian and from soft? Mf how about yall attack an executive or something?
Yep they are a bunch of bottom feeders.
because these devs are the college students or fresh off of college who spend an entire time in UE. Instead of focusing on making games, they chase that "cutting edge" high which they will never going to use.
Anyone with half a brain can see this man is correct.
@@JediCore college creates dumb out of unknowing it seems
Bars
Nvidia is selling us solutions to the problems they created, and it ain't cheap.
@BusyWeaverBeats because nvidia is just the biggest player in a plan. Doesnt mean that they are alone in that plan. Everybody is.
Its getting funnier when he is saying shit about RT,DLSS and FG quality and performance (which is true) and then blame AMD why they didn''t invent it first xD
They always have, it's how Gameworks was conceived, and literally how it worked for over a decade.
It's always introduced effects into games that their latest cards were uniquely capable of handling, but their older models (and the competitor's models too) couldn't keep pace.
Classic Problem -> Reaction -> Solution that government been using for generations.
Hey there, as a 37-year-old physician with a deep passion for video games (and a strong disdain for poorly implemented antialiasing and blurriness), I rarely comment on UA-cam, but I felt compelled to chime in here. First off, I really appreciate the overall message this channel is trying to convey-it’s important and resonates with me. That said, I want to offer some sincere advice: the message might not reach as far if it’s delivered with hostility or egotistic bravado. I’ve seen it time and time again-when the tone becomes confrontational, it often pushes people away rather than drawing them in. I’m sorry to pull the ‘I’m older, so I’ve seen this play out before’ card, but it’s true. A more constructive and inclusive approach could make a world of difference in spreading your message effectively.
Also, I think it’s worth mentioning that visual topics like these often land in subjective territory. When you’re younger, it’s easy to want to see everything in absolute objectivity-black and white, right and wrong. But as you get older, you start to realize how important subjectivity really is. People experience and perceive things differently, and that’s okay. Embracing that nuance can lead to more meaningful conversations and connections. Keep up the great work, and I hope this helps!"
Dude, "inclusive"? Stop drinking soy milk. Not only are you old you are out of f*cking touch! We are pissed! And we NEED real passionate men.
Thanks. I wanted to write just that.
I also made a comment advising on better choice of words and not using such a tone of voice, totally agree that it affects the number of viewers who are willing to listen.
Effing wimpy comment.
I am 35 and I think the video is to the point. He points out everything correctly, so I don't see how it's hostile.
Sounds like "you" problem.
When DOOM came out in 2016 its minimum-spec GPU was four years old and had an MSRP of $400 (GTX 670), but by then you could get 50% better performance from a $200 GPU (RX 470). When The Dark Ages drops, its min-spec GPU will be nearly six years old and also had a $400 MSRP (RTX 2060 SUPER). Do you know how many GPUs you can buy for $200 that offer 50% better performance than a 2060SUPER? Here's a hint: it's the same as the number of GPUs that you can buy for $300 that offer 50% better performance. The closest you can get is $400 with GPUs like the RTX 4060ti, almost a whole six years later. When DOOM'16 required a four-year old GPU as its min-spec, that was reasonable because GPUs were doubling in performance every 2-3 years for the same cost. That hasn't been happening these last 2-3 generations and pretending that it has by making your game require ever-burlier and more-expensive GPUs just means a smaller audience and less sales for your game. If in 2016 all I had to do was spend $200 to have a GPU that was 50% better than DOOM's min-spec, then I could afford a new GPU and the game. If today it costs $400 to get a 50% better-than-min-spec GPU to play The Dark Ages, I'm either going to buy different games or upgrade my GPU and not buy any games!
Windows and game studios seem to be conspiring with computer manufacturers to encourage people to buy new machines way faster than they need to. Or if they're not doing that, they're doing something that doesn't look like anything but that.
@@yurisei6732 Supply-side economics. The manufacturers gotta keep manufacturing or else their profits go down, so we gotta keep coming up with excuses for consumers to buy new stuff even though their old stuff isn't broken.
It also generates truly astonishing quantities of electronic waste.
And my rx 5700 xt with -10% performance as rtx 4060 won't run it since i don't have a mesh shaders. Yeah amazing, FF7 Rebirth and Indiana also can't open the exe file of the game. Modern devs are ultra lazy.
@@Dregomz02I bought a 6600xt for that reason, even though the 5700xt is a lil better
Indiana jones runs horrifically bad on pc so ur not missing much
DOOM 2016 is built from the ground up on TSSAA, Its at it's core a temporal engine. That is why it's so performant...
Im holding the line on FB and Reddit forums that some graphics devs are lazy for relying too much on raytracing. Pre-RT, we had a lot of great games that have terrific global illumination. I'd rather have proper GI than some half-assed RT implementation. I believe RT and GI can co-exist, but it's just wrong to expect RT will be the standard in the future unless they fix the hardware performance tax to run it well.
It’s actually really profound how good developers got at lighting before ray tracing. Like I don’t know much about tech but it really seems like such a modern marvel
Literally the only thing you cant do well without raytracing is reflections in openworld (third person) big city aka spiderman games. Every other scenario rasterization is better.
Yeah, I'd say that the current global Illumination with probes, or voxels is generally good enough and can be accelerated further by utilizing RT. But throwing it away altogether isn't the way, imho.
You may call it lazy, but try doing the "right" thing in their shoes. Invest 3x as much into graphics development (which then lowers the scope of your project). Then add double the cost in labor and time to release window. Now see your "lazy" competitors run away with all the pre-sales hype while you're left optimizing your game the right way.
At the end of the day, it's a job like any other, and it takes someone truly special and also financially well backed, to even consider going against the stream. Instead of calling them lazy, call them non-visionary, or just doing the work they're paid to do, etc.
Ofc with the few devs who get hostile about this stuff and pretend they're doing a great job when they're not -well they can be lynched for all I care. But in general most devs are simply not paid to do good work anymore, because the alternative is too profitable for all the competitors to run them out of business real quick.
There's a massive difference between per pixel RT solutions and the olden day GI stuff, though. Especially foliage is just impossible to shade correctly.
However, modern development seems to be all about optimizing cost instead of performance. And consumers don't really have a say because everything is UE5 now and using the same shitty unoptimized solutions. It's an enshittification engine.
Wake up babe,
Threat Interactive uploaded
So, Linus is speaking about something he doesn't really understand, misinforms the viewers, and shills expensive hardware? Wow. I mean, who knew? Surely, that never happened before XD
Lmao exactly, they still keep fumbling the same way :DD
Sad
Tbf Linus's team is way too small and egotistical to do research or fact-checking beyond a few google searches. Its the Walmart of tech review channels. You'll get information, whether its good or bad is a gamble but at least it was easy lmfao.
@@mocchi3363 his team isn’t small he downsized recently but for a while he had 120 employees.
But they focus on marketing to the lowest common denominator and have basically decided to market to middle school age kids who know nothing about computers
Clearly not a long time LTT viewer
"We do a little gaslighting" - All AAA companies now-a-days
Prepare for Linus to cry that you didnt contact him for a response and denied him the right to reply by sending texts to your old phone numbers.
OH
MY
GOD
Pfft. Just stop. We don’t care and journalistic standards do matter regardless of what you think about LTT and GN.
@@Akkbar21 journalistic standards that #LIEnus himself does not follow and is only using as a shield when someone exposes their lack of integrity and internal processes
journalistic standards matter to who? the same video where LTT cries about GN, they make sure to say "we aren't journalist" so he doesn't get held to the same standard
@ they matter only if you are exposing LTT it seems.
love how dramatic the thumbnail is
Batman of the graphics industry.
It isn’t an easy life
The man's style feels so unintentionally funny it's great
@@Dorumin It’s definitely intentional
LTT decayed in quality years ago, but I can't really say it was ever a channel for advanced content
LTT can go suck a lemon
LTT has always been a comedy channel first and foremost
Luke showing you how to mineral cool your PC from ages ago was fairly advanced
Absolutely. All their content has gotten somewhat reminiscant to those home shopping channels on TV. It feels to me like 99% of what they do nowadays is to advertise products they get sent their way. The fun/comedic content has gotten so scarce, because it simply doesn't make them enough money and LTT has long become a company like any other where making more profit than last year is the only goal. For actual hardware news (not advertisement) and benchmarks I trust only Gamers Nexus and der8auer.
Infotainment
I'd argue Pixar is actually a great example for how important assets are. Despite bidirectional path tracing being known since 1997 and supported by open source renderers like LuxRender since at least 2011, they only started using it in 2016 with Finding Dori, and it didn't matter for the movies before because they always had high quality assets. They're merely not photo-realistic.
Agreed!
love when this guy uploads, he makes me feel validated for having a 1080 ti
The 1080Ti is still a great card
It is a good card but one which is sadly left behind with newer titles , you might need to upgrade soon.
If you ask me , I reckon mainstream $350 GPUs should by now be at 3080/6800XT level. I bet you the 5060/9060 won't match that bar , the 4060ti might do but not for $350
the 1080 ti is still great. even with its age it still competes favourably with brand new entry level cards and plays well optimized games beautifuly.
with nvidia putting the gtx line into legacy support for security only soon, new aaa games would run less optimal. but those are badly optimized mostly in general.
worthwhile titles will run well for a long time.
the greatest gpu price to performance to longevity to versatility of all time.
Our boy is still fighting the good fight.
First we were sold monitor's with more and more refresh rate to reduce smearing and now smearing is reintroduced to reach those high frame rates.
Don't you mean response time rather than refresh rate? Not really the same thing (although most if not all high refresh rate monitors have short response time)
This now my monitor and 4k 144hz qled looks blurry as shit. I gave up went back to older vr titles
Bro could you please make a tutorial series purely showcasing optimisation per skill? (As in textures, materials, lighting, etc. for Unreal) I'd honestly pay a bit for something like that provided it is giving good and quality information
Stay safe, dude. Linus's fans' toxicity is on another level compared to EPIC or any other GameDev subreddit.
there's a difference between fair reporting and coverage vs trying to smear their image like steve has been doing.
@@bryan_350_5 oh god they're here already
@ no need to be a dickhead. I watch LTT for entertainment. Not for being the most reliable source of tech benchmarking. I haven’t said anything negative about this vid
@@bryan_350_5 even outside the gamers nexus stuff the fanbase has always been super defensive against criticism
@@bryan_350_5 Entertainment? As in you find it fun to watch someone that doesn't know anything pretend to know everything as a professional?
Never stopped stuttering through a game so fast to click on a video.
4060 is above msrp now and the 3060 has increased in price too, its like everyone knows we hit a dead end, a plateu in technological development. The 5080 is a joke and the 5090 is barely any better than the 4090. Yet devs insist that we must use dlss and frame generation and nvidia is just trolling us at this point with "multi frame generation" Come on. Its obvious that unless devs stop improving graphics, then no one will be able to play the games, the hardware just isnt powerful enough and unreal engine 5 is terribly bloated to the core. I just play old games now, they just work and are better, also finished without the need for endless updates and patches.
I got BF4, a game from 2013, that can render perfectly clear at native 4k WITH upscaling and run VERY good compared to literally every game I've played that has DLSS. It looks so much sharper than any DLSS game that it's not even a remotely fair fight. Sure the lighting, models, and textures aren't up to modern "standards" but the core rendering is so clear those details fade away fast and I'm left with a far nicer image to look at that's far easier to tell what I'm looking at when it isn't hidden behind a blur fest. It really puts into perspective just how much better rendering used to be (and I'll argue that older games than BF4 have even clearer rendering).
@@TacticalSheltie BF4 is whatever man, the real revolution was BF3 experimental MANTLE implementation - game that previously ran on integrated GPU (AMD A-series CPUs with integrated graphics) in like 720p10fps was hitting nearly stable 1080p 60fps on the same hardware. It was the base for future VULKAN development but we never seen such gains ever after.
@@Micromation Doom 2016/Eternal ran really nice on Vulkan shame that pretty much nobody are using that API
I regard LTT in the same class as IGN.
Worse because IGN is widely known to be incompetent.
Some people are only just now learning that LTT is incompetent
@@EggEnjoyer It's mindboggling how people can learn this "only just now".
IGN doesn't actively damage the industry with their content (most of the time). LTT is now worse.
@@EggEnjoyer gone are the times when they dont do daily uploads like theyre chasing a quota
Everyone trolls IGN but LTT still has a cult like fanbase.
It got to a point where it's so blurry that I thought I was looking at a video with tons of compression, and my brain turns off when I move the camera quickly to prevent nausea.
Linus has been losing credibility with me for years now, unsubbed him around 2023 after watching him for 12 years prior, haven't missed a beat
Point is, game company are unwilling to spend their development budget on optimization, because why do the work when you can tell the players to put in their own money both for the game, and for the hardware necessary in order to mask the poor optimization aspect of their product.
👍 golden comment
There is a very good reason to optimize. To achieve a broader target audience.
Bro got nexus gamer energy
Yeah, I really like seeing more people being actually critical of the industry and getting traction online.
@ 100%
I really love the fact that you use true black for background. Soo good to watch on amoled.
So many videos use i dont exatcly know what. Shining black i guess which looks like lcd.
All these unoptimized games leave me with somewhat high hopes to what valve will be able to pull off with a new mainline game in their source 2 engine
LTT once again getting dunked on by someone other than GamersNexus and they think it's only GamersNexus who has a problem with how they perform their Journalism and content...
I'm sure #LIEnus will complain that he didn't ask them before posting the video, because journalism or something.
@@marcogenovesi8570GN did call himself a journalist many times without holding himself to journalist ethics which is wrong. He magically stopped calling himself a journalist after Linus called him out, strange isn't it?
This video is a critic of LTT, which is not the same. There is no critics ethical code. Threat interactive could talk shit about Linus however he likes.
By the way, yes right to respond is critical in journalism. Coffeezilla for example always reaches out to all the people he instigates, no exception.
I may not be able to do much besides liking and subscribing to your content, but I am very grateful that you stating this is still more than enough to keep spreading word about this matter.
Give these bad graphics hell, Threat Interactive.
Honestly, considering the insane levels of optimization in old games etc, I wouldn't be mad if a modern pixel art game was using 4-10 times more compute than it would when compared to the old gen. That said, there are pixel art games that are using 50-1000 times more compute. (Or more. These aren't hard numbers, more just pulled out of nowhere to illustrate my impression. ) Heck, any 2D game should run smooth on any modern hardware that isn't extremely weak.
While I think Hollow Knight might be one of the best games ever made, performance wise it was a total mess until they released the switch port where someone who knew how to program fixed a ton of their code. Hollow Knight, if coded well, should have been able to run on a potato. What's sad is that it looks like this idea of developing for ultra-high end specs is prevalent in the whole industry. I honestly think every dev team should buy a $1-1.5k laptop or have some computer with equivalent specs, and be able to run their game on it at reasonable settings. Because scaling up graphics from that baseline is far better than scaling down graphics from a broken baseline.
Core Keeper is a banger example of your point. I tried to run it on my work laptop once and got FPS in the mid-30s on an i5-1135G7.
Just like, think about it. This is a pixel art game barely sustaining 1080p30 on what's effectively a midrange 700-series card.
better yet, they should buy a refurbished laptop, preferably an older model from like 2017 or so. Ensures that they cover the vast majority of potential customer hardware.
I feel so lucky to have discovered this channel and the like-minded community around the demand for more optimization - or at least a dissent against the ever-increasing complexity and price that come from laziness and greed. Thank you, Threat Interactive.
And thank you, the caring gamers.
Optimization is one thing but I would really, really want games to stop looking like it has motion blur cranked up and was played on poor quality VA panel. It's just smears man... it's night and day playing recently released game vs game released in 2010-2019 on OLED...
LTT really is awful
It's infotainment. I've seen so many mistakes and bullshit from them.
@christophermullins7163 stop and give me a cupcake recipe
@@Strawberry_ZA that reply agreed with you you silly goose
Their writing staff seems to lack any specialists or people who actually know how to research. Who are they hiring? English literature graduates?
@@christophermullins7163 oops, sorry Christopher! I'm a silly goose
yea, idek if amd wants to sell GPUs... they have virtued this next gen will be different but we'll see.
AMD's best advantage on PC is their APUs, however, they don't necessarily have that on lock, Given a few more years, I wouldn't be surprised if Nvidia is making APUs based on ARM, then Nvidia spreads the same disease there. Intel is crashing real hard rn especially on mobile
Nvidia created a problem and offered a solution.
Moores law is dead, therefore they couldnt make generational leaps in performance to stay ahead of competition like they used to. So they implemented raytracing and dlss.
Now that dlss and raytracing is a must, they offer you videocards with that functionality, and they can boast how they tripled Tensor cores, while the raster performance is barely 30% more.
1:40 In all seriousness, having 2D pixel art games that don't just sit in the single digits for CPU and GPU use bugs me more than any poorly optimised 3D game. Running an emulated SNES game on my Steam Deck makes the power draw go _down_ from the Steam Big Picture view! And that's not even native code.
I know, a lot of pixel art games still use 3D assets and particle effects but still, it seems absolutely unreasonable how comparatively resource intensive a lot of them are.
Noticed that for PXSX2 emulator as well: total system consumption is somehow less than the main Steam menu
Yeah this one is wild. My little linux laptop has made me respect devs that use barebones engines so much more.
Also take a look at PICO-8 and the upcoming Picotron, they're like programmer toys that make devs restrict themselves to a format that runs on basically everything lol
You're surprised a modern game is more demanding than a SNES emulator? I think you're underestimating just how much pixel art games have advanced since then
@@theanimerapper6351Some have advanced. Some haven’t really as in they aren’t doing anything unique or new. They’re just super poorly optimized
@@EggEnjoyer my friend SNES games have only 256 colors. 99.99% of modern 2d pixel art games are doing way more
14:00 This is big. Nvidia sits on an unpunished monopoly, which causes all these problems.
PREACH !
A lot fo us are tired of lazy devs and ballooning specification requirements while corpo bootlickers trying to hate on this truth/message.
Smart of you to not use any LTT clips so he can't false flag you fof copyright of using his video
Yes. TI is smart.
Monsanto : How can I force farmers to pay for my seeds ?
Nvidia : How can I force players to pay for my feeds ?
Love how you're shining the light to everyone in this industry! Learned so much from all the videos your team makes and it's trully stunning the differences you can make with such optimization. Keep up the amazing work
I really hope Lisa Su finally makes Radeon do something different. With Ryzen AMD did change things, like offer more cores at first, then chiplets and now the 3D V Cache. And it payed off in a big way
It's a cyclical investment. First they have to make a card that can do something cooler then get that software into major engines.
Nvidia is very proactive in sticking their noses into game development, they target the most popular engine (UE) with plugins and even have their own engine.
i love your vids, but i don't understand why some people hate your videos, like this is a real problem
Jealous old devs pissed off they didn't do this first.
I really enjoy your videos, however i want to give feedback what personally I think you could improve. Your videos are very fast, which is generally speaking not fast, however you put a lot of information on your slides which have to be manually paused to understand them. Also when you do comparison screens that you overlay you could switch in between a few times or mark the relevant parts.
Apart from that, great videos as always!
this please
But then if he goes any slower and breaks down everything in multiple videos, certain clowns will claim even harder that he's """grifting"""
Not yet another LTT fumble for the love of god
I haven't seen the video yet (he doesn't miss so i have no reason to doubt), but I have the say, the thumbnail goes hard.
I can't believe the CEO of AMD handed her cousin- I mean her competitor another free W.
There are some people out there that are still trying to play the newest games on 10 plus year old cards. They don't understand the importance of upgrading
You completely missed the point of the video, if you even watched it at all.
Finally, someone consistently calling out the broken mindset of modern game graphics! Your deep dives into Nvidia's tactics and Unreal Engine 5's terrible optimization expose exactly what gamers have been frustrated with for years. The industry needs more voices like yours pushing for real optimization instead of pointless upscaling and performance compromises. Keep up the fight!
Thank you so much for the support and kind words! It helps so much!
you are UNFATHOMABLY based. please keep doing what ur doing
I always hate the excuse it gives devs more time to do other stuff optimisations don't take forever and you can still use RT in offline settings then bake that lighting for us gamers to not waste our electricity and compute on. The second excuse they give for no longer baking lights is it costs HDD space when SSDs price per gb is at an all time low.
The Thanos of video game graphics.
Dread it. Run from it. Threat interactive arrives either way.
“One does not usually consider fun when balancing unreal engine…. But this does put a smile on my face” -Threat interactive probably.
2:00 2D pixel art game...time for ray tracing , 32gb RAM, 16gb VRAM, Frame Gen, AVX-512 required.
60 fps target is a scam, 120 should be minimum since gaming monitors and tvs support it. And feels much better
There's not enough history for it to become a standard just yet. 240p60 and 480i30 were standards because CRTs were manufactured to receive signals that way in most countries, but that dates back from the 60s. Only in the mid 2000s we really broke this scanline per second barrier with HD and digital standards, but those were in the making well before that decade. 60hz still is the norm, even if higher end products have been changing that. It's good to think about a future where we target 120 fps, but that's not a reality worth pursuing today simply because it's too much of a premium niche yet.
You should watch this video explaining our target hardware and visual goals: ua-cam.com/video/6Ov9GhEV3eE/v-deo.htmlsi=f7IgZVPjI2DEFhSs
That supposed to be for movies that use pre-rendered video. Not games.
But then another greedy guy will want 360 fps to be the minimum. Where does it stop?
@@gearfriedtheswmas It stops at retina refresh rate witch is 1000 to 1500 Hz
Unless the everyday consumer becomes more technically minded "like in the good ol' days" & sees through the capitalism then sadly we can just shout in the void about this, as the everyday consumer couldn't give lesser of a fuck how their game looks like as long as it launches & they can play it, framerate & graphics be damned
Consumers didnt make good decisions in past and were even more uninformed/missinformed than today. Some of the biggest companys in PC world menage to survive horrible product lines or even few horrible product lines in raw and somehow still outsold competitors who offered better products at every price level. That includes his beloved Nvidia that historically looking doesnt deserve success they have today. People were not voting with their money back then and they do now vote now eather. We live in a world were 70USD unfinished games sell over 10-20 million copys, RTX 3050 outsold RX6600, RTX 4060TI is outseling RX 7800 XT and more AIOs coolers are sold than air coolers in some markets.....just to name few examples.... thats what consumers are.... SHEEPS and always will be.
I only needed to watch until they start saying "RT helps performance" and that's all I know they either are stupid or are intentionally misleading their viewers.
There is NO WAY someone who runs a tech UA-cam channel is stupid enough to believe that.
*The King Von Of Anti-Aliasing*
No fucking way 😂 these comments are wild, I wonder if TI even knows what an opp is.
LTT is an entertainment channel.
This isn’t a LTT attack video. Calm down kiddos
They should be giving us tech tips like the name suggest but they fail at that very often, now it's just Linus Terrible Thoughts.
@@Akkbar21 every video is an LTT attack video
It's going to be so neat having to pay 3k USD for the 5090 for games in the next 2-4 years. Simply because games are going to get worse, optimization wise.
And by fun i mean I wont and will stick with my 4070TISUPER.
If these developers gave a damn, the 1080 Ti would still be running practically every game today at least 30+ fps, with 60 being viable with some settings tweaks.
Also can we get rid of motion blur? I hate that.
5090 isnt a gaming card
Yeah , I can't see how it is sustainable to need $3000 600W space heaters to run your game at 4k with an ideal experience. I think that now that AI uses GPUs there will be a hard limit for how much horsepower is economically viable for gaming - and we might get there with the 10th console gen.
@@tourmaline07 I mean, at this point we are hitting the limit of household wiring (at least in the US) Computers can't really pull too much more power.
5090 = Titan GPU for workstations that got rebranded to catch suckers with more money than iq
He's finally back! The man, the myth, the legend, the annihilator of trashy modern rendering, Threat Interactive!
When competition means you get to choose between bad and worse.
im getting dunked on for being poor by owning a AMD card 😭
LTT is off about various things, this being one.
Good video thank you.
LTT is just a Mr Beast clone channel at this point.
The sad reality is that a chunk of the console market still use motion interpolation on their TVs. It's hard to shift the mindset of the layman when they don't realise their devices could push out clean, clear frames at incredible quality if only the industry still prioritised making games run on broad hardware. For almost every solution raytracing claims to fix, there's been an established alternative in use by developers for years that wouldn't have brought 90 tier cards to their knees but provide an impressive presentation.
I recently took another look at the Battlefield 3 F18 Hornet mission and I am still impressed at what we were able to push on 7th generation consoles.
I really respect the realistic approach to breaking down the GPU market. I had always suspected that AMD was not a serious threat to NVIDIA but didn't know exactly why. Your explanation of AMD not having a clear vision for future graphics makes a lot of sense and your criticisms were fair and not held back. I am hopeful your studio will see success because we need more leaders with a VISION to bring the whole industry forward at this point. Keep it up! I will continue to watch your videos and hope for your success.
Real time path tracing games at 13fps that should have stayed static images or pre-rendered scenes for another 20 years .
The video just popped up in my recommended.
Interesting video, I'm subbed now :)
Loved the bit where you put on a leather jacket
What I want is max settings, with ray tracing, and no AI frames or dlss.
Pure native.
and I want anti-gravitational flying cars and an end to a world hunger, lol
we eating good today boys
lol!
Great content! That MUST be more popular. As a gamers we just should STOP buying games, unless game is optimized properly. Voting with your wallet is the most effective thing.
Great video man, graphics in old games like metal gear solid V : TPP were so good on my potato pc, so what I didn't understand is how 10 years later, games have worse performance and visuals even when I now have 5x the compute. Good to see I wasn't imagining it.
Oh, LTT and being wrong and misrepresenting is one heck of a thing.
It is quite common
Don't worry, Linus will find a way to ignore all the criticism and instead interpret this as a personal attack
I like GN and LTT for what they are, but I’m no fan boy of either. I felt Steve is high on his own self righteousness and thinking he’s above the rules. Then he gets with Rossmann, the king of self righteousness and doubles down. Don’t call yourself a journalist and then think you can play fast and loose with the rules because you’re on the side of justice. Linus isn’t perfect obvs, but he was right on that one point.
@@Akkbar21 lol leave the "playing fast and loose" to the king #LIEnus
You should not let anything Linus says bother you. Linus acts like he knows everything. He once said something in one of his videos about AMD engineers not knowing what they are doing or thereabout.
I can't tell if it's my own bias and you're just reinforcing my prejudice or if things are very rotten. Maybe both. It's the capitalist incentive I think, money. AAA games aren't made by creatives with a shared, singular vision. They're made by 1000 people working for a huge corporation. The trend is that things are going to get worse even though the hardware continues to improve drastically every generation. What a strange thing!
It’s a combination of ignorance, laziness, and the *excuse of “just throw hardware at the problem.”*
The gamer purist priest has spoken again, rejoice!
I'm glad it wasn't just me being dumb and not understanding what that video was saying. lol
Wow. I've learned a lot. This video is very educational and has really stated how I kinda felt deep down in while, but you were able to put it into words. Thank you.
Raytracing should be a developer tool to have a reference point on how a scene's lighting should look, not a feature for hardware to brute force and cheat/lie about quality while also being expensive to run in real time.
See Source 2 engine: When creating a map in Hammer 2 editor, it uses Ray Tracing. It does not run without a hardware RT card. But this is the editor. It uses RT to preview the lighting that will be baked in when completing the map. The playable map will have very convincing fake GI and RT, but it does not require any HW RT card, and runs very well. Yes, the map is static, but the physics entities has some kind of GI on them, and I do not know how Valve did that.
REALLY good point
Keep going my man, thank you for your work, support you wholeheartedly, you keep me sane 🙏
People think they need 240 fps. Then they’ll think they need 500 fps. How do you change that?
call them stupid, with proof
What is ironic, is that Indiana Jones's modified idtech is still generations faster than UE5. Still does not look really better ten new order/new colossus, on the same engine.
Shame they decided to use TAA only and not have MSAA. Also, most shadows do not seem to be really ray-traced anyway.
Bro I applaud your work, I have to watch at lower video speed to keep up 😂!
Hoping to see some hardware tech youtubers will acknowledge this video, good job like always
Do you think its possible that eventually we will have AI tools that are specifically designed to help optimize games, or is that something that's outside the scope of ai and needs to be done manually?
This is something we have been advocating for awhile. AI is already being heavily integrated into game-dev but in the wrong ways. We've discussed this topic is several of our videos. I highly suggest watching each one. We only 10 at the moment but each one is a dense presentation that gets you up to speed.
I’d love to see you and Alex Bataglia have a debate on this. I feel his coverage and dismissal of devs abusing upscaling as well as his unhealthy obsession with RT encourages this kind of behavior in the industry.
Honestly, I don't dislike the use of AI, if used right-
When it was just "render the game at lower resolution and upscale it" it was fine (if it was like 50-80% native), the input lag diff was minimal and in non competitive games it was fine.
But the insane input lag compared to native I've experienced so far with any "generative AI / extra / fake frames" is in my opinion absolutely unacceptable.
When I move my mouse, I expect the character on screen to move along, not like a bungee that charges with my mouse movement and then follows along after.
Yes, I am sitting on my 1080TI and yes that might impact the experience I have with frame gen ( I very much expect the new cards to have a lot less input lag with the gen AI).
But even after seeing it on a system of a friend with a newer tier card (30-series) I felt the bungee effect and I detest it.
I love the sharp feeling of my aim starting and stopping where and when I move my mouse / to.
It's a clear difference in immersion.
Also, why should I upgrade to a new gen card when I *TO THIS DAY* can get 60fps in 1440p in most NEW games with a little settings magic around high / medium settings.
And easily 100-200fps in actual optimized games (i.e R6S / MC /... I don't really play Fortnite but that one too.., Titanfall 2 (Older but GOAT, btw, where part 3?))
The only games and scenarios where I don't get 60fps is in new releases when they rush the release.
I remember Cyberpunk release (30fps).
Hogwarts Legacy release (40fps + 20 minute crash cycle) and many more.
A few weeks / months later, most of them run no problem on my System at 60+ on medium / high.
If the "AI" frames would be skipped when the next real frame is there I think it would remove / minimize the input lag while providing smoother gameplay, maybe become a bit more choppy but still more reactive.
(Not a frame gen expert obv. but maybe sth like that?)
I was appalled when I downloaded cod6 2 weeks ago and the standard settings for the game were up scaling at 25% resolution + ai frame gen, at high.
I got choppy frames, input lag and washed graphics.... *WHAT* .
After switching to (imo) optimized settings with frame gen I got 80fps at high but ofc, the bungee effect again.. So I swapped to native and low / med and got my 60-80 fps with sharp reaction times. (The graphics still look great at med / high-low when keeping model textures at higher levels while turning environment and shadows lower)
But even that they would have all that Ai and up scaling on from the start.. my game looked like a wet towel smeered over an oilpaining of the game every frame.
Everything more than 10m away from the character looked blurry and text unreadable. (Compared to Native)
(Again, 1080TI, this obviously makes a difference, but why would you turn all this -shit- on when the game renders just fine on native frames?)
It's like they don't even test and optimize for anything lower than 30-series (and whatever amd alternative that year / release model ) anymore.
It feels like the time again when my PC with all that I could afford couldn't even run Overwatch1...
I get it, better graphics, larger games..
But what do the graphics help me when the game settings don't even show me the beautiful graphics the designers made, down scaling it to 640*360 and then upscale it again + fake frames?
Games should be immersive, and again, input lag doesn't help that..
Idk, imo there needs to me more work done, every game release gets huge criticism for bad fps / optimization / ..in general?
Stop rushing games, if the game is good you get the money, if you release it in a bad state you don't..
Why are they still releasing the games unfinished..
Are their market researchers that blind or is it the directors board that can't get their heads around polishing and optimizing?
-Or do the devs just not want to do optimization?-
( I doubt the last, I'm sure the devs also want to release a good, fun and well running game)
Rant over, don't mind it to much.
Just annoyed by the way the games industry is going.
Most games run fine and are really fun still.
30 series cards do not have frame gen......
@@silvershines Not native, yes. But you can use software alternatives. OFC they aren't "as good as native" but I think they should be good enough to handle them.
Nah I have a 4070 super and lossless frame gen for me is way better then the current dlss FG, I use it over it (Don't know about the transformer models FG) only issue is it takes a effort to make lossless FG run properly @@Jxdiac
@hamzakhalil-gs7oc how is the input lag with the 40-series comparing with and without fame gen?
@hamzakhalil-gs7oc with lossless frame gen do you mean the steam program or a certain frame gen setting exclusive to the 40-series? I only had a look at frame gen using the 30-series so I dont quite know about the options the 40 series has.
Your message is correct in its core. I am also a supporter of what you are trying to change in the industry. Though I would strongly advise to change the density of the videos themselves or also produce videos that focus on broadly understood language to explain the wrongdoings and approaches you are trying to get. I am watching your videos at 90% speed and still it’s a lot of information plus text on slides that are shown for a second or two.
Tldr. I think your message and passion would reach more people if there is a simplified version of your content. Maybe increasing the team to get some proper marketing experts in, to help with formulating core value propositions and the messaging of those.
Thanks for your work.
This is the exact reason why I really stopped trusting anything "Technical" from LTT. Time and time again they are called out for inaccuracies. With the reach they have it makes it multiple times worse. Even excluding all of the recent drama or this video I often listened to their content asking myself "WTF they are talking about?"
They really need to fix that as nowadays they only hurt the industry instead of helping us like it used to.
Commenting for the algorithm 🫶🏼
I hope the industry changes!
Thanks for bringing all of this to the surface
I'm worried about the nex DOOM. The baked lighting in Eternal looked amazing, but now with the mandaroty ray tracing I feel like it will look so bad on low / medium settings like IIndiana Jones whereas it could have looked ok at those exact settings and run amazingly well.