@@alanandfriends-s2o Even high-end GPUs (like the 4090) struggle to deliver good 4K framerates without resorting to DLSS. 8K at 100+ fps: ETA 2035 on flagship cards (x90), 2040 on mainstream cards (x60).
I had a QN900D for a couple of weeks before returning it. 8K on a 65"+ screen is definitely noticeable vs 4K on a separate OLED TV; unfortunately it's pointless outside of the desktop experience. 8K 60Hz after having the option of 4K 240Hz... also negates any need for it until 8K 120Hz is also a thing...
8K would probably be nice for me, but I have a 55" TV on my desk. It's not needed, but it would be a nice-to-have bonus. For gaming, though, it would need algorithmic changes for graphics and the ability to drive the display with two DisplayPort cables (the fastest HDMI is about half the bandwidth of a DisplayPort cable). The resolution benefit at a given screen size is not about display size alone; it's about display size divided by distance. To get a benefit, the screen just needs to fill a large percentage of your vision, and that can be accomplished by making screens bigger or by having a big screen close enough. It's about pixels per degree of your visual field rather than pixels per inch on the panel. 144Hz and 10-bit colour should add more to the experience than 8K resolution, but you are limited by cable bandwidth when trying to get a reasonable refresh rate with 8K and 10-bit colour, and even with 8-bit colour DisplayPort would be in a range where you'd probably still want to improve the refresh rate. 8K probably is very likely to eventually be the ideal gaming resolution, but there are many hurdles to overcome before it's worth it.
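A rough back-of-the-envelope sketch of the cable-bandwidth point above, comparing uncompressed video data rates against the nominal link rates of HDMI 2.1 (48 Gbit/s) and DisplayPort 2.1 UHBR20 (80 Gbit/s). The figures ignore blanking and encoding overhead, so treat them as approximations rather than exact cable limits.

```python
def raw_gbps(width, height, fps, bits_per_channel=10, channels=3):
    """Uncompressed pixel data rate in Gbit/s (blanking intervals ignored)."""
    return width * height * fps * bits_per_channel * channels / 1e9

modes = {
    "4K60  10-bit": raw_gbps(3840, 2160, 60),
    "4K120 10-bit": raw_gbps(3840, 2160, 120),
    "8K60  10-bit": raw_gbps(7680, 4320, 60),
    "8K120 10-bit": raw_gbps(7680, 4320, 120),
}
links = {"HDMI 2.1 (48G)": 48.0, "DP 2.1 UHBR20 (80G)": 80.0}  # nominal link rates, Gbit/s

for name, need in modes.items():
    fits = [cable for cable, cap in links.items() if need <= cap]
    print(f"{name}: ~{need:.0f} Gbit/s raw, fits uncompressed on: {fits if fits else 'nothing (needs compression)'}")
```

By this rough measure, 8K60 at 10-bit needs about 60 Gbit/s raw, which fits the nominal DisplayPort figure but not HDMI, and 8K120 exceeds both, which is consistent with the two-cable or compression point in the comment.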
An outlier here but, in my case, my 55" TV is FHD (1080p) and I don't see how 4K (even less 8K) would improve my enjoyment when playing. Moreover, my AVR is 1080p, and adding to that, the electronics market where I live lacks middle-range TVs (just crappy or premium) and AVRs are non-existent (just soundbars) so I would have to import, so it would be costly for me to upgrade my setup.
I just got a 4K OLED, and I can't see why I'd upgrade from that. I don't think movies/TV shows like Breaking Bad change that much from 1080p to 4K, and many games struggle to hit native 4K.
I don't know, but I don't have enough pixels on my 28-inch 4K screen for video games; it's not enough for micro-details on models. It depends a lot on the rendering engine though.
hi, I'm a time traveler. Here in 2077 we don't really use screens anymore, but you could say games are in 8K.. or even 16K. They call it "relive" and you can experience the world as it was before the 3rd war, play "games" (which is the only way of interacting with another human), and even eat a meal.. oh, and D4 still sucks
Unless TV channels and streaming services get to the point where they agree with TV manufacturers to slowly discontinue 4K options. I don't think PCs and consoles would need to push it, but it needs a push towards becoming the new norm.
8K makes no sense for media or gaming. 8K should be pushed for desktop use instead - sharp desktop with tons of space to work with. Something like the Samsung ARK 55" in 8K would be great. You'd game at 4K or less thanks to integer scaling at 1080p, 1440p or 4K.
I have a 65” 4K OLED and there are still days where I'm truly blown away by how good it looks, and I've owned it for 3 years. To me, I don't see a need for it, especially given the cost of the computing power necessary to push 8K graphics and content. Imagine playing Alan Wake or Cyberpunk 2077 at 8K with path tracing on; a 4090 would net you negative frame rates 😂
I think 4k will be sufficient for a long time. Other specs will be more important such as framerate, HDR & color accuracy, raytracing etc. Right now QHD is the sweetspot and this might remain the case for many years to come.
Usually I would have said yes, but after discovering DLDSR it really is not so urgent at all. Maybe in 10-15 years it will be more than standard, but DLDSR or other good anti-aliasing options like it are getting us most of the way to that pixel-perfect sharpness.
@@datsneakysnek Did you come here just to flex? The man has a very valid point. Most people game further away from their TV than you do. 1.5m away from a 48" would give me a headache.
Unless there is a massive breakthrough in computing tech, 8K native at high refresh will never be a thing. Moore's law is dead and progress is already grinding to a halt. Combined with the push for path-traced rendering and even higher refresh rates, the processing power will simply not be there for it. Upscaling might take you there, but how much better will that look over 4K native with DLDSR? Probably not noticeable on a monitor smaller than 40".
Yeah, I wouldn't mind if console resolutions stopped at 4k and then they just focused on picture stability, anti-aliasing, etc. Because, for movies & TV shows... 8k TV is gonna be pointless. Pretty much all content out there is 4k maximum. No real digital 8k movies and even for the old movies, you would need the ones that were shot on film stock bigger than 35mm (which was VERY rare).
You're sitting anywhere between 10 and 20 times further from a TV than from a monitor; that is completely not comparable to a monitor where you may be 1-2 feet from your screen, meaning your ability to perceive higher resolutions is way higher on a monitor. It makes barely any sense to broadcast at higher resolutions since the perceivable difference is negligible; the same can't be said about monitors.
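To illustrate the distance argument above, here is a small sketch that estimates horizontal pixels per degree of visual angle. The specific screen sizes and viewing distances are assumptions chosen only for illustration.

```python
import math

def pixels_per_degree(h_pixels, diag_inches, distance_inches, aspect=16 / 9):
    """Horizontal pixels per degree of visual angle for a flat 16:9 display."""
    width_in = diag_inches * aspect / math.sqrt(aspect ** 2 + 1)
    h_fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_inches))
    return h_pixels / h_fov_deg

# 27" 1440p monitor at ~60 cm (24") vs a 65" 4K TV at ~3 m (118")
print(round(pixels_per_degree(2560, 27, 24)))    # ~49 px per degree
print(round(pixels_per_degree(3840, 65, 118)))   # ~142 px per degree
```

At couch distance the 4K TV already packs far more pixels into each degree of view than a desktop monitor does, which is why extra broadcast resolution buys so little there.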
Pretty interesting Steam stats around this as of May 2024: 58% still on 1080p, 18% on 1440p and 3.7% on 2160p. For a PC I think a 4K 144Hz 27" monitor is an absolute no-brainer, because it's not just about gaming. In the TV space, though, a console (always released with old hardware) won't run 8K until long after PC does; e.g. if PCs can run 8K in 2030, don't expect it on consoles until 2035-40, and when it does arrive you're sitting so far back you won't even notice.
We rarely have even native 4k gaming since game developers expect us to use upscalers nowadays so I'd say if it does happen it won't be in the near future.
I hate that everything needs upscalers. And everyone blames consoles for being too weak, but that logic just means last-gen consoles were more powerful, because they didn't use upscalers, and when they did it was minimal.
As someone who sits 1.5 meters away from a 4k OLED, I will tell you that I would prefer more pixel density...but the GPU cost is insane and even my 4090 wouldn't do 8K. Maybe if we get a big jump in gpu tech/DLSS gets much better I would be interested in an 8k display
Seems people are falling into the same trap as when 4K was new. You're not supposed to see the individual pixels; if you do, the resolution should be higher. Anti-aliasing hides a lot of flaws, but it's also adding "blur". A maximum "usable" resolution should have no visible stair-stepping or noticeable crawling without relying on AA. That said, I personally have a 55" 4K TV and at the distance I'm sitting I don't feel the need for a higher resolution. The 40" 1080p TV I used to have was much worse. I don't know how I'll feel if I replace my current TV with a 75" 4K though. I also have a 32" 21:9 1440p monitor, and there the resolution is definitely too low IMO. The pixel density is about the same as a 16:9 27" 1440p monitor. The main problem with the resolution isn't in games; it's with e.g. programming. Blurry/blocky text is a pain. I can see myself upgrading to 8K in another 6 years, but not mainly for gaming.
Personally I think 50"+ TVs should have a slightly higher resolution; at most I would say 6K, but 5K would probably be enough to smooth out the edges.
I play my games in 4K at up to 120 fps on my PC and I love it. I think it's pointless to go 8K. I want more detail, not resolution. Higher draw distance.
At some point manufacturing 8k monitors and TVs will cost the same as 4k monitors and TVs. At that point, manufacturing 4k monitors and TVs will be superfluous and will be slowly dropped in favor of streamlining TV production. So yes, I think "8k gaming" will definitely happen, but not necessarily because the demand for 8k will be huge, but because progress never stops and it'll just slowly creep up on us as lower resolution displays are phased out.
Making 8K displays is not the issue. Monitor tech progress has far surpassed graphics processing progress. Moore's law is dead. The push for future diminishing gains is being aimed towards path tracing and higher refresh rates. There will be no leftover power to give a 4x increase in pixels. With upscaling from 4K, sure, it might be possible with the very best PC hardware, but the majority of that base are not playing their games on massive TVs. Consoles have no chance.
I would love an 8K display. I'm using a 43-inch 4K display right now with 100% text scaling, and its pixel density is not much better than my old 24" 1080p monitor. For productivity, 8K would be amazing. For gaming though??? Maybe for something like Anno? I think the second half of the 2030s might be realistic for 8K gaming to enter the realms of the enthusiast mainstream.
I want 8K gaming to be a thing so badly, because then you won't need anti-aliasing in games. So many games today have terrible forced blurry anti-aliasing and it's an eyesore; if 8K becomes the norm, there won't be a need for it. While I would rather game devs just make good anti-aliasing, this is the alternative.
Eventually it will become a thing, but not for a very long time. The cost of screens, not to mention the compute power needed to drive that many pixels with modern effects, is nuts. I wait for the day when there's a 4K game running on an 8K screen and someone says, "Damn, that looks blurry". We're still in an era where 4K TV broadcasts are rare, so the content that drives the technology outside of gaming has to get there too, and as I said, that won't happen for a long time.
Big difference between a "thing" and "mainstream". 8K has been a "thing" since the first 8K monitor came out, which was in 2017... It has been realistically possible, in terms of getting 30+ fps in games, since the 3090's release in 2020. Mainstream-wise, it probably won't be until the 7090 that every single game can be maxed out without DLSS, with ray tracing on, at 8K while still maintaining at least 30fps at all times.
I feel like physics is something companies have just stopped caring about. We need better hair/fur physics and textures, movement in clothes, air and movement in trees, and interaction with objects within the game itself. Right now everything is so stiff.
Exanima
We really need better AI. I wonder why it couldn't be an online AI doing the job, since your console is always on the internet, with an option for regular AI if you want and, of course, for offline players as well.
Imagine an AI like we had in F.E.A.R. but evolving into an apex predator by learning the player's patterns, mistakes and successes.
@@Agent-mb1xx the AI stuff is coming for NPC dialogue etc... we saw this with the Microsoft leaks.
In case anyone's curious, 1080p is still top on the steam hardware survey (58%), 1440p is growing slowly but surely (19%), and 4K is basically in the same place it was about 4 years ago (3.7%).
Which totally makes sense, because you need pretty much a 4080 Super or a 4090 to even enjoy 4K, and how many people have those?
@Goblue734 That's an overstatement; a stable 30fps is all you need on a 4K TV, and it can be done with cards like an A750.
@@Goblue734 Yes, although there's a subset of people who might have a 4K screen for productivity as well as gaming, and the needle hasn't even moved on *that* in the last few years.
@@Goblue734 even with a top-tier gaming GPU, some games struggle to hit 60 fps unless you use upscaling, frame generation, or reduced graphics settings, because, you guessed it, incompetent developers or publishers
@@Goblue734 and most people don't change their monitors often
8k is only needed in VR
Why VR?
@@pixeljauntvr7774 Put a magnifying 🔍 glass against a TV and you will see the pixels. In VR the FOV gets wider, so more pixels are needed to eliminate the screen-door effect.
Not true, try Red Dead Redemption 2 on the Dell UP3218K and then we'll talk again
@@pixeljauntvr7774 Because the displays are right next to your eyes. When the viewing distance is that short, you need a lot more detail.
Because the screen is directly beside your eye and you expect to have clarity far into the distance as well as to the sides (as you do in reality). 4K isn't enough!
Only if my whole living room wall is a TV.
True😂
8k will be good for emulating CRTs shadow masks and scanlines.
I have been hearing this about 4k for years 😂
Yeah it's just that 8k would be even better
4K already does this perfectly, and even 1440p does it decently. At this point the resolution isn't as important as the effect CRT shaders have on brightness and colour. OLED is fine on the colour front, but with shaders applied it's very dim compared to my Trinitron, especially since you have to change the gamma curve.
And simulating non-square pixels.
@@wikwayer 4K isn’t even good if you want high quality fonts on a screen larger than 27”.
4K gaming is still hard, forget 8K gaming. Most people are still at 1080p or trying to move to 1440p, and an entry-level 4K graphics card is very expensive and still doesn't provide the performance to play properly.
A PS5 is only 400
@@romeocalmo it's barely holding native 1440p in big titles, and 4K done with FSR... well, if shimmering is your thing I'm not gonna judge you 😇
Update: since this is heating up a bit, I'll make a correction: I'm talking about games that came out in the past 12 months or so, and narrowing this down to only current-gen makes things too sad.
First-party launch games are sharp looking, but they always are - just look at Killzone, which was released at the launch of the PS4. Same with cross-gen games in most cases.
@@paskudne AAA games like Demon's Souls and Horizon Forbidden West are native 4K
@@paskudne Your comment is complete nonsense
@@mitsuhh a launch title (that uses dynamic resolution and is 4 years old) and a past-gen game (good looking, but still a PS4 game) can hold 4K? Great.
Enjoy your FSR in every other game!
I'm still rocking a 24-inch 1080p monitor; I'd take the very high frame rate and responsive gameplay over more pixel density.
And in motion, a high frame rate does look better than a higher resolution.
Just say you're poor lol
@@Fablemahn he's not wrong though
motion clarity looks amazing on my 360 Hz 1080p monitor(there are even faster monitors out there)
@@Fablemahn idiocy at its finest
24-27 inch 1440p is way better though and 100% worth it, trust me, I was just like you.
@@yarincool1237 1440p is quite pointless, just get a 4K screen, then you get way more benefits with game/movie scaling.
The game industry, and especially the film/TV industry, really doesn't care about 1440p.
It's basically an awkward middle resolution between two industry standards.
We can barely get 4K 60 fps, even with upscaling from 1440p
Yes, you need a minimum of a 4080 if you want to play the latest AAA games at native 4K @ 60fps. There’s a reason why the majority of PC gamers still game at 1080p 😂
@Krypto121 You are not running RE4 on a 4070 at native 4K with RT @ 60fps, unless you are planning to tweak everything in the settings to low. What's the point in that?
Source: watch?v=EhZ090tTKA8
@Krypto121 Elden Ring is not even good looking and you are wrong too.
Elden Ring can not hit 60fps on any system
@@electrikoptik you shifted the goalposts to using RT in your argument; he did not mention anything about using ray tracing. It's also mediocre in the majority of games anyway.
RT is not the reason the majority of PC gamers play at 1080p, as you claimed.
@Krypto121 You know what he means...... 🙄
The biggest downside for me using 8K was that it really exposes inconsistent assets, or just low-quality ones in general.
If you play a game at 1080p then everything blends together, but 8K really exposes everything.
If you have a high-quality game that is extremely consistent in its visual delivery, then yeah, 8K will look nice, but your framerate will die, and that tiny extra sharpness just isn't worth the massive drop in FPS and/or reliance on fake frames/resolutions.
It's eventually going to be a problem: once 4K is the standard and half of that population is on 8K screens, the market will push us to move up to 16K. It's a vicious cycle. 16K is going to take more than a decade to become popular, I imagine. I think companies that make displays are going to have to start thinking about a different route for display tech in the future, and maybe not rely on increasing resolution like they did in the past.
@@Finger112 I really don't think so. You're forgetting the fact that human vision is fixed. Past a certain point (pretty much 4K), there is just no real benefit. 4K with a solid antialiasing method is already pixel-less, pretty much. Makes no sense to go higher.
Maybe by the end of the 2030s we can start getting 8K gaming, movies and TVs for real, but not in 2030. People want 120 fps instead of 8K. If they can make games that good, then we can talk again. Focus on making 4K the best there is first. Don't rush 8K.
Not me. I'm gaming at 8K 60fps with my 4090. (upscaling from 4K with DLSS of course)
8K would be nice for more screen real estate. I use 4K at 100% scaling on a 48-inch, so I can see the use of a higher resolution.
Fps is heavily overstated in importance in places outside of competitive games
@@badpuppy3 how big is your screen?
I think people overstate the number of people obsessed with higher framerates. I can appreciate it, but many people have difficulty discerning beyond 60, and some people claim they can’t see the difference between 30 and 60 (which is madness to me, as it’s a night-and-day difference). If the only people you interact with are in communities like this or others with higher end PCs it distorts your perception of wider tastes.
4K the sweet spot?
I would say so
Not for Xbox and Nintendo players, the sweet spot is around 1080p 😂@@keithmichael112
8K upscaling would be viable with AI chips and stuff nowadays. It would be great if the panels could do this with minimal input lag.
It depends on the size of your monitor. At 28" I'd say 1440p is the sweet spot, personally. At 32" and above, 4K is going to be more enticing, and by 40" 4K is the real sweet spot.
Eventually 4K will be the same as 1080p is today. I don't see 1080p screens being sold in 10 years.
It's really simple: to be able to differentiate pixels, they cannot be closer than approx. 1 arc minute apart, as this is the resolution of the human eye. To truly appreciate a higher resolution you have to actually move closer to the screen to be able to see the difference, i.e. to make the investment worth it. It's also quite intuitive: imagine a 4K display at the right distance for every pixel to be exactly 1 arc minute in size; quadrupling the resolution at the same distance with the same screen size means that surplus resolution is completely wasted on our eyes' inability to resolve any finer.
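A minimal sketch of the 1-arc-minute rule described above: it computes the viewing distance at which a single pixel of a given panel subtends one arc minute, using the small-angle approximation. The 55" panel size is just an example.

```python
import math

def one_arcminute_distance_m(diag_inches, h_pixels, v_pixels):
    """Distance (metres) at which one pixel spans 1 arc minute of visual angle."""
    diag_m = diag_inches * 0.0254
    pixel_pitch_m = diag_m / math.hypot(h_pixels, v_pixels)
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_m / one_arcmin_rad  # small-angle approximation

for label, w, h in [("4K", 3840, 2160), ("8K", 7680, 4320)]:
    print(label, round(one_arcminute_distance_m(55, w, h), 2), "m on a 55 inch panel")
# 4K: ~1.09 m, 8K: ~0.55 m. Sit farther than ~1.1 m from a 55" screen and the
# extra pixels of 8K fall below the ~1 arc-minute acuity limit described above.
```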
And yet you’ll still have people swearing by 8K monitors despite it being unlikely they have the vision/panel size to discern a difference. Post-purchase rationalization is strong in the PC hardware space.
So, are you saying that the most you'll ever need is 1 pixel per arc minute, times 60 arc minutes in a degree = 60 pixels per degree, times 180 degrees = 10800x10800 pixels? With good antialiasing (so maybe downscaled from a 40-50k x 40-50k render, to have perfect edges, thin lines and stability)? Well, it would be per eye, so a bit less than double that for horizontal pixels (less than double rather than exactly double, because there's a good amount of overlap between the eyes).
That would mean a tall 16K would be the last resolution we'll ever need. Given how demanding it will be, it might be the most we'll reasonably be able to do at good framerates and stream over the Internet before we get capped by how much we can shrink transistors. But that will surely not be in 2030, other than maybe some lab demo.
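Re-running the arithmetic from the comment above. The 180-degree field and the "a bit less than double" overlap factor are the commenter's assumptions, not established figures.

```python
px_per_degree = 60                    # 1 pixel per arc minute, 60 arc minutes per degree
single_eye = px_per_degree * 180      # assuming a full 180-degree field, as in the comment
print(single_eye)                     # 10800

overlap_factor = 1.5                  # placeholder for "a bit less than double" horizontally
print(int(single_eye * overlap_factor))  # 16200, i.e. roughly a "16K"-wide display
```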
The only reason I'd want 8K would be for retro scalers and creating a perfect replacement option for high-end CRTs.
This. Basically every common resolution is covered by 8K with an integer scale; it would be bliss.
Can't wait for the $2000 RetroTink 8K.
When we're done pushing for nonsensical framerates maybe we can go back to pushing for more nonsensical resolutions.
It's the TV and GPU marketing teams at their best selling people overpriced plastic they don't really need.
The average person can distinguish frame rates up to around 500; trained eyes (mostly fighter jet pilots) up to 1000.
So we are hitting the peak of what can be perceived by 99% of people.
8K is a little high, but it would happen just to make anti-aliasing obsolete.
I’m glad we’ve hit a practical resolution limit; now we can focus on fidelity & frame rate.
8K is likely to remain niche except for perhaps VR and enterprise.
People had the same questions and takes about 4k and even 1080p before that.
It’s called diminishing returns
@@le4-677 that too had been said previously.
@@le4-677 you could say the same about all graphical pushes now, or even about pushing FPS beyond 60, certainly beyond 120. Absolutely everything in video games is diminishing returns now. Is the PS5 much better than a PS4? Will the PS6 be leaps and bounds better than a PS5?
Except Moore's law was alive and well during those times. It's very dead now. Progress is slowing down significantly and costs are increasing. It ain't gonna happen. We don't even have 8K movie or TV content yet, and that's far easier to achieve.
@@mojojojo6292 Far easier to achieve but very expensive for streaming services.
Focus the power on raytracing. Global path tracing, shadows and reflections at stable 4K60/120. True lighting is the way to make games look better, not more resolution.
Ray/path tracing is stupid.
It looks disgusting.
Better to use high-quality rasterisation (for example RDR2 from 2018 - imagine how good games could look now in 2024 using the same techniques).
RT/PT is the reason no one can comfortably max out games anymore, except MAYBE when relying on fake frames/resolutions.
Video games are losing clarity due to these -tracing techniques and the temporal upscalers.
@@Koozwad well, ray tracing is the future of lighting, rather than faking it. And in 2030, which was the year in the OG question, ray tracing should be able to run at better frame rates than it does today.
@@coyley72 It still has a lot of downsides, FPS aside. As a 30 series user, I turned it off pretty much since day 1 as it's just too blurry/flickery and whatnot. I prefer clarity with ZERO side-effects.
One reason I can see for 8K displays and beyond is Virtual Reality, and perhaps Augmented Reality (for redundant clarity of pass-through in real-world situations). I'm still gaming with a 1080 Ti at 1440p/60 and it's the sweet spot for me, even when rendering at 4K with 50% resolution scaling. I can't use all settings or hardware RTX, but my immersion doesn't need those things if the gameplay is worthwhile. I can still play DooM once I settle in.
I gotta imagine there's a limit for what most people can even see. Like increasing resolution is surely gonna have diminishing returns?
4K is already great for living room setup all we want is affordable great HDR display.
I still have "8K" labelled on my PS5 box lol, ridiculous
Monitor manufacturers should try doing 5K monitors; it'd be like the 1080p and 1440p situation. I find in games like Battlefield 1, for example, that 4K looks great, but at 130-135% resolution scale (I think 5K would be 133%) it becomes noticeably sharper and less jagged; then going up to 200%, which is 8K, makes basically no difference. The hit from 4K to 5K is still pretty big though.
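A quick check of the render-scale numbers in this comment, assuming "5K" means 5120x2880:

```python
base_w, base_h = 3840, 2160  # 4K
for name, (w, h) in {"5K": (5120, 2880), "8K": (7680, 4320)}.items():
    per_axis = w / base_w                       # what games report as "resolution scale"
    pixel_factor = (w * h) / (base_w * base_h)  # actual rendering cost factor
    print(f"{name}: {per_axis:.0%} per axis, {pixel_factor:.2f}x the pixels of 4K")
# 5K: 133% per axis but 1.78x the pixels; 8K: 200% per axis and 4x the pixels
```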
1440p 27" isn't enough for me unless the AA is really good, like 4x MSAA or SGSSAA. It's not a problem of detail retrieval; it's a problem of image stability and clarity. The problem is that most games made today will never have good AA, and so the only way to fix that is with brute-force resolution and PPI.
I think devs should be focusing on the FRAME RATE rather than resolution.
Eventually I'm sure it will like in 2050 but no time soon
The 5090 will be able to handle 8k 144hz with previous gen titles
Honestly a niche opinion, but 1800p with upscaling is the prime target for 4K OLED screens. Trying to get 120 frames in games is, I think, a good goal to shoot for when building a PC.
Probably when the only television you can buy to replace your dead one is an 8K display. Plenty of people still out there perfectly happy with 720p/1080p no frills televisions.
8k is beautiful and will be quite relevant with RTX 5090. I can’t wait to enjoy games in beautiful 8K
Playing 10-year-old games, maybe. The 5090 is projected to be only 50% more powerful than the 4090, with even smaller gains as you go down the stack. You need roughly a 4x increase in power to drive 8K, not to mention memory and memory-bandwidth increases, and data cable transfer rates can't possibly deliver that without heavy compression. 8K with DLSS, sure; not native.
@mojojojo6292 couple-year-old games will be able to be handled by the 5090. 8K 144Hz is coming.
@@xpodx CS2 and Doom already do 100 fps at 8K today on a 4090; the 5090, if we're lucky, does 150-160, meaning the day could be close...
@@skandiaart for sure lighter titles
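For context on the "4x increase in power" figure in this thread, a small sketch of per-frame pixel counts, plus a naive fps estimate that assumes performance scales purely with pixel count (a simplification; real scaling varies by game and bottleneck):

```python
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}
megapixels = {name: w * h / 1e6 for name, (w, h) in resolutions.items()}
for name, mp in megapixels.items():
    print(f"{name}: {mp:.1f} MP per frame")

fps_at_4k = 60  # hypothetical GPU that manages 60 fps at native 4K
print("naive native-8K estimate:",
      round(fps_at_4k * megapixels["4K"] / megapixels["8K"]), "fps")  # ~15 fps
```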
1440p PC monitor, 1080p living room TV here.
If you could integer scale, wouldn't a 1080p output on a 2160p screen look roughly the same (not better or worse) than the same content on the same size 1080p screen?
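A minimal sketch of what integer (nearest-neighbour) scaling does, which is why the answer to the question above is plausibly yes: each source pixel becomes an exact block of identical pixels, so no new colours or blur are introduced. The NumPy helper below is illustrative, not any particular display's scaler.

```python
import numpy as np  # assuming NumPy is available

def integer_scale(image: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour integer upscale: each pixel becomes a factor x factor block."""
    return image.repeat(factor, axis=0).repeat(factor, axis=1)

src = np.array([[10, 20],
                [30, 40]])
print(integer_scale(src))
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]
# Every 2x2 block on the 2160p panel covers the same physical area as one native
# 1080p pixel would on a same-sized 1080p panel, so the image should look the same.
```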
It will definitely become relevant; it will be marketed and advertised and hyped when TVs switch to 8K as standard, and they will. It will be something like what we have today: some games will run nicely at 8K 60fps with DLSS and upscaling tech, more demanding games will be at 4K.
I'm looking forward to playing Pac-Man in 8K on my new PS7
I think if it becomes a common resolution, it'll be for mainstream TVs where people think "bigger number better" (and then continue to stream 1080p content anyway). I doubt that many PC gamers will bother with 8k over running 4k at a higher res or better graphics settings. I daily drive a 42" 4k display as my monitor and, even with my glasses on, I have to lean in and be looking for pixels/aliasing to really see it at all. Add DLAA and aliasing is basically non-existent to my eyes and everything looks incredibly crisp.
I really can't see myself buying into 8k unless I get something like an 85" TV and sit like 6 feet away all the time while watching a futuristic 8k digitally-mastered bluray. Even then, a lot of movies are recorded on film, which would not benefit much from going above 4k. Even 1080p to 4k isn't hugely different for movies.
@@federicocatelli8785 let's be realistic. It might be the case on your PS9
@@justitgstuff5284
Right now very few 8K TVs can be sold in Europe because they have a very bad EEI (Energy Efficiency Index)... the maximum allowed is 0.90 and most 8K TVs are well over 2.
It ain't gonna happen. There is still zero movie or TV content in 8K and the studios have no intention of making it. Shit, they can't even get half of the 4K remasters right. It's a monumentally bigger task for games. With the push for path tracing and high-refresh 4K, the processing power is simply not there to increase the pixel count by 4x, and it won't ever be there on silicon. Moore's law is dead. Of course I'm talking about native. You can always upscale your 4K content to 8K, which is the only way it's going to be relevant.
I want 2880p monitors to come to the market somewhat soon. I can still see pixels at 32 inches at 4k. 8k just seems like a huge waste for at least another decade
By 2030 we better at least be at 32K. I'm tired of the blocky 4K experience.
We aren't even remotely near mass consumer 4K gaming yet 🤣
Honestly we had the same conversation with 4K back in the day. Even going from 720p to 1080p, the argument was that we could probably benefit from 1080p HUDs, but 720p games are good enough... I guess the more common TV size was 40 inches back then, where now we are at 50-60. Still, we had all these graphs saying you can't really see the difference between 4K and 1080p unless you're X number of feet away from the TV. Yet here we are, where 1080p and even 1440p isn't playable enough for some gamers anymore.
Unless something drastically changes in gaming in the next 6 years, 8K will be a thing. We'll have been through 4 GPU generations by that point. At the very least we'd be supersampling from 6-8K by then.
… I mean, 8K with my 7900 XTX looks beautiful for games like Dave the Diver and Manor Lords… a low refresh rate is necessary, and I'm sitting roughly 10 ft from a 75" screen.
Can I play Cyberpunk on it at a high refresh rate? Not at all, but I can flip between 4K and 8K and immediately notice a difference in the edges.
I'm using a Panasonic Cannon.
Those models have smooth-screen technology that removes all scanlines.
Heck, even a 1280x720 screen looks 4K when the picture is squeezed into 50 inches or so.
It may be useful when glassless 3D becomes popular. You'll get 6K in each eye. We've currently got 1440p for each eye on existing 4K displays.
The massive display idea is a misunderstanding: you'd still be sitting too far back to see the individual pixels, otherwise the image is outside your FOV.
The 4090, even with DLSS, can't consistently hit 60 at 4K without FG in games with PT. Great points about bandwidth and encoding artifacts.
8k could be very relevant in VR, I feel you should discuss this as it's becoming more mainstream lately. I regularly play in 6k in VR because this makes games look the cleanest, if 8k was viable by the hardware i'd definitely use it.
So the thing to keep in mind is that every single aliasing artifact you ever see is caused by insufficient resolution. We know 4K isn't good enough because you can't turn off anti-aliasing at 4K and still have a good image. Of course it's questionable whether it's worth pushing 4 times as many pixels for a slightly better image, but it's not the case that you wouldn't be able to see the difference.
The other thing is that with an 8K panel you can do a pixel-perfect upscale of 1440p (3x on each axis), which you can't do with 4K. So it adds 1440p to the list of static resolutions which will look the same as they would on a native panel of the same size.
Neither of these is a huge deal, but I just felt it was important to point out that it's not the case that there's no point at all in going for an 8K panel, and the idea that you can't see the difference because the pixels are too small is frankly nonsense.
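A quick check of the integer-scaling claim above: which common vertical resolutions divide evenly into a 4K panel versus an 8K panel.

```python
targets = {"4K (2160 lines)": 2160, "8K (4320 lines)": 4320}
sources = [480, 720, 1080, 1440, 2160]
for name, lines in targets.items():
    exact = [f"{s}p (x{lines // s})" for s in sources if lines % s == 0]
    print(name, "->", ", ".join(exact))
# 4K (2160 lines) -> 720p (x3), 1080p (x2), 2160p (x1)
# 8K (4320 lines) -> 480p (x9), 720p (x6), 1080p (x4), 1440p (x3), 2160p (x2)
```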
I've seen 8K games and a big gain is the sharpness of details. It's surprisingly easy to see; it's like using a sharpen filter but without the artifacts. But needing 4x the performance of 4K? Good luck. Sometimes it's more than a 4x loss, since the rendering pipelines can't handle that level of data.
One day we'll get there, but for now 4K @ 32 inches is ultra sharp looking.
I am using a 65-inch 8K TV as my main PC monitor. It is a ginormous upgrade over 4K. You just need bigger displays...
The single benefit of 8K screens, 4-player split screen rendering effectively at 4K for each player on a large screen… still seems like overkill.
I think the industry should alternate the focus on framerate vs. resolution.
In other words, aim for 4k@120fps next (for gaming on all next-gen console). Once that becomes available for everyone at all price levels by 2030, aim for 8k@60fps next and then 8k@120fps by 2040.
I do want a dual-mode OLED monitor for 8K@60hz + 4k@120hz tho, mainly to run 200% scaling for clear text during work and switch to 4k@120hz for gaming.
8K isn't a good anti-aliasing method because it's not anti-aliasing at all. Anti-aliasing always has some drawbacks that never seem to be resolved, but with 8K the only drawback is the software and hardware, which are improving every day. I believe one day graphics will be so good that we won't notice any cutbacks, and that day seems closer than expected.
Now, this argument is based entirely on whether you prefer the blur from anti-aliasing or the pure, natural result of a smooth and clear 8K resolution.
As the crew has said: higher refresh rates and framerates are becoming more important. At 4K I'd sooner jump from 60 fps to 120 fps than from 4K to 8K; I believe the pleasing effect is more pronounced. Btw, 1440p resolution user right here. Sharper than 1080p, but still efficient and effective.
To me it doesn’t even make sense below a certain screen size
Have you guys not seen Samsara or Baraka? Both were shot on 70mm film and demand to be seen in 8K. For a true IMAX experience, 8K is the only way to have the film equivalent of that. I agree that for most things it's not needed, but for pixel-peeping goodness I hope it becomes a thing for gaming in the future; it's just fun to look at things in such detail. I'll wait until the performance is there, but I eagerly await it.
I want an 8K 55-70" TV for a decent price, and hardware that can reasonably drive a desktop at that resolution. My 1060 is running great with my 4K 55" desk monitor, but text sharpness leaves something to be desired.
4K + OLED + 244Hz + 250$ price tag = pure Bliss
Well, that is actually a good case for 8K: to lower 4K prices. Otherwise those specs stay at $600-800+ new forever, since most if not everyone considers them high-end monitors, so manufacturers don't really have an incentive to lower the price, only to make them cheaper to produce on their end.
The sad thing about 8K is that the one feature it was most useful for in gaming is long dead now. I'm talking, of course, about split-screen multiplayer.
Displays are getting so large now, with more densely packed resolutions. Imagine a video game that could draw 4 to 8, maybe even 12, split windows on a huge area from an 8K projector. People could have a hell of a blast with a local multiplayer party on that, like the old GoldenEye and Perfect Dark days but much better.
Sure it will. Eventually it will. At some point even if it’s unnecessary manufacturers will turn to it to try to one up each other. Just how things work. Even Apple went higher than their Retina ppi thing even though they were saying they would never need to.
Using smaller TVs (42-55") as gaming monitors would benefit from 8K.
You sit as close as you would to a normal monitor, with your peripheral vision taken up by the TV, and 8K would provide a huge PPI improvement.
From the Steam hardware survey, May 2024: 58% of users have 1080p as their primary display, 17% have 1440p. Around 9% of users have primary displays with resolutions lower than 1080p. 1% of users have 1200p (16:10), 3% have 1600p (16:10). Only 3.7% of users have actual 4K primary displays, and the resolutions not mentioned are very wacky. 4K displays have been around for years now; from what I am seeing, it is not really a thing for PC gaming. I don't think 8K will ever be a thing, and maybe not even 4K.
8K would only be for anti-aliasing. There might even be techniques that avoid the heavy utilization required to render it.
They can't even make good games worth playing. What are we even talking about? What were the last 5 universally praised games?
Gimme 600 FPS @ QHD over 30 FPS @ 8K anytime
GTA 5 on PS5 runs at native QHD with ray-traced shadows at 60fps, and still people complain and call the console weak lol
@@RonniSchmidt-mi7pd GTA 5 is an ancient game. Slapping ray tracing on it doesn't make it graphically outstanding. People call consoles weak because they are mostly limited to 30 fps while PCs can easily go above 60 in the same games.
@@AdamMi1 You are very stupid 💀 most of these games are PS4 games, like Red Dead Redemption 2, and didn't receive PS5 enhancements
The Steam survey shows the most common GPU is the RTX 3060; the PS5 is way more powerful than that.
Consoles are weak b/c they’re using 7-8 year old hardware and labeling it “next-gen” 😂
@@ericcarlton87 Pretty sure an RTX 2080 Ti was the best GPU in 2020, until they released the 3080 and 3090
The PS5 is roughly between an RTX 2080 Super and a 2080 Ti.
I'm sure we'll get there eventually, I still remember when 640 x 480 aka 480p was "overkill". I think it is a case of diminishing returns though
I think the move to 8K will be a significantly slower march than the move to 4K.
OTA broadcast is still in 1080i for most of the world.
And the bandwidth demands of 8K are still too high for the current global Internet infrastructure, even once we get a newer, more efficient codec than HEVC.
But I know we will eventually get there, as the TV manufacturers will eventually force the inevitable 8K upgrade, but I do not see it happening by 2030.
Based on the performance of 9th-gen consoles, we're only now moving towards 1440p. Native 2160p rendering is still few and far between, especially if we want 60fps or 120fps. I don't even think the hypothetical PS6 can hit 4K60 consistently.
Is this really a question? 😄 Of course it will be eventually, just like 720p was, then 1080p, then 4K, and eventually 8K, and so on. Technology doesn't stand still, and yes, 8K is needed, because even 4K isn't high-res enough once you hit 40"+. I have a 43" monitor and at 3840x2160 it's only 102 PPI, so yes, 8K and double that density is needed.
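Quick sanity check on that 102 PPI figure, a minimal sketch (the 43-inch size comes from the comment above; the 8K line is added just for comparison):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 43)))  # ~102 PPI at 4K on a 43" panel
print(round(ppi(7680, 4320, 43)))  # ~205 PPI at 8K, double the density
```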
*looks at console games running at 720p in 2024*
Hahahahahaha 😂😂😂😂😂😂
Series S?
@@Matt-jc2ml many games are 720p or near that in their 60 fps modes on Series X and PS5. Immortals of Aveum, Alan Wake 2, etc.
@@leospeedleo I didn't know that about Alan Wake 2. I was just playing that on my 65" OLED and it looks sharp. While I know some people aren't big on FSR, it looks really good to me for such a low resolution game.
@@stuckintheinbetween you aren't sitting 60 cm (2 feet) from that screen though, I take it, or maybe you're just used to low res?
Ironically, I have an overclocked PS3 (750/1000) and many games still look fine on my 4K LED, especially the three Arkham games and Killzone 2, to name a few. AA is what really matters.
Also, with calculated viewing distances, these bigger resolutions aren't going to matter as much; 8K won't be noticed until the screen is even larger, but then that's an additional variable that recommends a farther viewing distance.
Also, given the economy for most middle-class people, we're not going to be investing in 8K any time soon. Luckily 4K TVs are super cheap, and deep blacks with MicroLED/QLED/OLED are a better focus. Resolution is fine; contrast and refresh are going to keep being catered to more for now, IMO.
In 20 years 😂
Nice joke 😂
@@FantasyNero I mean, it is not a joke. 4K has barely become relevant yet.
@@chy.0190 4K is mainstream, and eye cancer
@@chy.0190 For gaming monitors with high refresh rates? Sure. As a living room TV? 4K is so affordable at this point it's kind of dumb not to.
@@alanandfriends-s2o Even high-end GPUs (like the 4090) struggle to deliver good 4K framerates without resorting to DLSS.
8K at 100+ fps: ETA 2035 on flagship cards (x90),
2040 on mainstream cards (x60).
I had a QN900D for a couple of weeks before returning it. 8K on a 65"+ screen is definitely noticeable vs 4K on a separate OLED TV.
Unfortunately it's pointless outside of the desktop experience.
8K 60Hz, after having had the option of 4K 240Hz, also negates any need for it until 8K 120Hz is a thing too...
8K would probably be nice for me, but I have a 55" TV on my desk. It's not needed, but it would be a nice bonus to have. For gaming, though, it would need algorithmic changes to the graphics and the ability to drive the display with two DisplayPort cables (the fastest HDMI is about half the DisplayPort bandwidth).
The resolution benefit isn't about display size on its own, it's about display size divided by viewing distance. To get the benefit, the screen just needs to fill a large percentage of your vision, and that can be accomplished by making screens bigger or by putting a big screen close enough. It's about pixels per degree of your view, not pixels per inch of the panel surface.
144Hz and 10-bit color would add more to the experience than 8K resolution, but you are limited by cable bandwidth if you want a reasonable refresh rate at 8K with 10-bit color; even at 8-bit, DisplayPort would be in a range where you'd probably still want to improve the refresh rate (rough numbers sketched below).
8K very likely will eventually be the ideal gaming resolution, but there are many hurdles to overcome before it's worth it.
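To put rough numbers on the cable-bandwidth point above, a back-of-the-envelope sketch; it ignores blanking overhead and DSC compression, and the quoted link rates are approximate:

```python
def uncompressed_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    """Raw pixel data rate in Gbit/s (RGB, no chroma subsampling, no blanking, no DSC)."""
    return width * height * hz * bits_per_channel * 3 / 1e9

print(round(uncompressed_gbps(7680, 4320, 144, 10)))  # ~143 Gbit/s for 8K 144Hz 10-bit
print(round(uncompressed_gbps(7680, 4320, 60, 10)))   # ~60 Gbit/s for 8K 60Hz 10-bit
# For comparison, DisplayPort 2.1 UHBR20 carries roughly 77 Gbit/s of payload and
# HDMI 2.1 roughly 42 Gbit/s, so high-refresh 8K needs DSC, two cables, or both.
```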
An outlier here, but in my case my 55" TV is FHD (1080p) and I don't see how 4K (let alone 8K) would improve my enjoyment when playing. Moreover, my AVR is 1080p, and on top of that the electronics market where I live lacks mid-range TVs (just crappy or premium ones) and AVRs are non-existent (just soundbars), so I would have to import; upgrading my setup would be costly for me.
That Mr. Huang guy said the almighty 3090 is an 8K GPU
IMO there is really no point in going above 4K unless you are playing on a HUGE TV. Other than that, 1440p and 4K are more than good enough.
I just got a 4K OLED, and I can't see why I'd upgrade from that. I don't think movies/TV shows like Breaking Bad change that much from 1080p to 4K, and many games struggle to hit native 4K.
I don't know; I don't have enough pixels on my 28-inch 4K screen for video games. It's not enough for micro-details on models, though it depends a lot on the rendering engine.
Someone turned up the subsurface scattering on Rich a bit too high; he looks very red in the skin tone.
hi, I'm a time traveler. Here in 2077 we don't really use screens anymore, but you could say games are in 8K... or even 16K. They call it "relive" and you can experience the world as it was before the 3rd war, play "games" (which are the only way of interacting with another human), and even eat a meal... oh, and D4 still sucks
1080p with good AA on 4K is good enough for me
Native 4K for older titles
1440p for me.
Unless TV channels and streaming services get to the point where they agree with TV manufacturers to slowly discontinue 4K options, I don't think PCs and consoles would need to push it out.
But it needs a push for it to become the new norm.
Give it about a decade and it will be fairly standard, at least in new hardware (consoles, GPUs, TVs etc), although 4k/2k will still be commonplace.
Why not 4k with good AA instead of 8k?
8k is sharper and looks better.
Look, it will become a thing one day for sure. Be it 2030 or 2040, we will get there.
And yeah, in 2032 I can see it being part of High-End gaming.
8K makes no sense for media or gaming. 8K should be pushed for desktop use instead: a sharp desktop with tons of space to work with. Something like the Samsung ARK 55" in 8K would be great. You'd game at 4K or less thanks to integer scaling at 1080p, 1440p or 4K.
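The integer-scaling point is easy to verify: 7680x4320 divides evenly by 1080p, 1440p and 4K, so each source pixel maps to an exact NxN block with no interpolation blur. A minimal check:

```python
# An 8K panel divides evenly into the common gaming resolutions.
panel_w, panel_h = 7680, 4320
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    assert panel_w % w == 0 and panel_h % h == 0  # exact integer ratios
    print(f"{w}x{h} -> {panel_w // w}x integer scale")
# 1920x1080 -> 4x, 2560x1440 -> 3x, 3840x2160 -> 2x
```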
I have a 65" 4K OLED and there are still days when I'm truly blown away by how good it looks, and I've owned it for 3 years. I don't see a need for 8K, especially given the cost of the computing power necessary to push 8K graphics and content. Playing Alan Wake or Cyberpunk 2077 at 8K with path tracing on a 4090 would net you negative frame rates 😂
I think 4K will be sufficient for a long time. Other specs will be more important, such as framerate, HDR & color accuracy, ray tracing, etc.
Right now QHD is the sweet spot, and this might remain the case for many years to come.
Usually I would have said yes, but after discovering DLDSR it really doesn't feel urgent at all. Maybe in 10-15 years it will be more than standard, but DLDSR and other good anti-aliasing options like it are getting us most of the way to that pixel-perfect sharpness.
Hot take: even 4k is not worth the GPU cost, heat, noise and power draw in gaming. The visual difference compared to 1440p is minor.
Only if you are sitting far away. The difference between 1440p and 4K is very obvious to me when I'm sitting 1.5 meters away from my 48-inch 4K screen.
@@datsneakysnek Did you come here just to flex? The man has a very valid point. Most people game further away from their TV than you do. 1.5 m away from a 48" would give me a headache.
Might as well just game at 720p with AA on, going by your logic
totally makes sense for vr, not for a tv for sure
It's just the nature of technology.
People will say the same of 20k or whatever the next few benchmarks are.
Unless there is a massive breakthrough in computing tech, native high-refresh 8K will never be a thing. Moore's law is dead and progress is already grinding to a halt. Combined with the push for path-traced rendering and even higher refresh rates, the processing power simply will not be there for it. Upscaling might take you there, but how much better will that look than native 4K with DLDSR? Probably not noticeable on a monitor smaller than 40".
Yeah, I wouldn't mind if console resolutions stopped at 4k and then they just focused on picture stability, anti-aliasing, etc.
Because, for movies & TV shows, an 8K TV is going to be pointless. Pretty much all content out there is 4K maximum. There are no real digital 8K movies, and even for old movies you would need the ones shot on film stock bigger than 35mm (which was VERY rare).
As long as cable TV providers are still broadcasting in 720p...
8K won't be needed until the average resolution goes up.
You're sitting anywhere between 10 and 20 times further from a TV than from a monitor. That's completely not comparable to a monitor, where you may be 1-2 feet from your screen, meaning your ability to perceive higher resolutions is way higher on a monitor. It makes barely any sense to broadcast at higher resolutions, as the perceivable difference is negligible; the same can't be said about monitors.
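A minimal sketch of that distance argument in terms of pixels per degree; the screen sizes and viewing distances below are just assumed examples, not anything from the thread:

```python
import math

def pixels_per_degree(h_pixels: int, screen_width_in: float, distance_in: float) -> float:
    """Horizontal pixel count divided by the horizontal field of view in degrees."""
    fov_deg = math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# Assumed setups: 27" 16:9 4K monitor (~23.5" wide) at ~24" on a desk,
# vs a 65" 16:9 4K TV (~56.7" wide) at ~96" (8 ft) on a couch.
print(round(pixels_per_degree(3840, 23.5, 24)))  # ~74 pixels per degree
print(round(pixels_per_degree(3840, 56.7, 96)))  # ~117 pixels per degree
# 20/20 vision resolves roughly 60 pixels per degree, so a 4K TV at couch distance
# is already well past that threshold, while the desk monitor sits much closer to it.
```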
Pretty interesting Steam stats around this: as of May 2024, 58% are still on 1080p, 18% on 1440p and 3.7% on 2160p. For a PC I think a 4K 144Hz 27" monitor is an absolute no-brainer, because it's not just about gaming. But in the TV space, a console (always released with old hardware) won't run 8K until long after PC does; e.g. if PCs can run 8K in 2030, don't expect it on consoles until 2035-40, and when it does arrive you're sitting so far back you won't even notice.
We rarely have even native 4k gaming since game developers expect us to use upscalers nowadays so I'd say if it does happen it won't be in the near future.
I hate that everything needs upscalers. And everyone blames consoles for being too weak, but by that logic last-gen consoles were more powerful, because they didn't use upscalers, and when they did it was minimal.
As someone who sits 1.5 meters away from a 4K OLED, I will tell you that I would prefer more pixel density... but the GPU cost is insane and even my 4090 wouldn't do 8K. Maybe if we get a big jump in GPU tech and DLSS gets much better, I would be interested in an 8K display.
Yeah, I have a 4090, a 4K 144Hz 28" and a 5K 27", and the 5K is way sharper. I'm just waiting for an 8K 144Hz 28" and a 5090.
Seems people are falling into the same trap as when 4k was new. You're not supposed to see the individual pixels. If you do, the resolution should be higher.
Anti-aliasing hides a lot of flaws, but it also adds blur. A maximum "usable" resolution should have no visible stair-stepping or noticeable crawling without relying on AA.
That said, I personally have a 55" 4K TV and at the distance I'm sitting I don't feel the need for a higher resolution. The 40" 1080p TV I used to have was much worse. I don't know how I'll feel if I replace my current TV with a 75" 4K though.
I also have a 32" 21:9 1440p monitor, and there the resolution is definitely too low IMO. The pixel density is about the same as a 16:9 27" 1440p monitor.
The main problem with the resolution isn't in games. It's with e.g. programming. Blurry/blocky text is a pain.
I can see myself upgrading to 8k in another 6 years, but not mainly for gaming.
Screw anything above 576i.
Wasn't it 50Hz?
@@Agent-mb1xx Yep.
Personally, I think 50"+ TVs should have a slightly higher resolution. At most I would say 6K, but 5K would probably be enough to smooth out the edges.
You are not going to notice it if you sit further away.
I play my games at 4K up to 120 fps on my PC and I love it. I think it's pointless to go 8K. I want more detail, not resolution. Higher draw distance.
At some point manufacturing 8k monitors and TVs will cost the same as 4k monitors and TVs. At that point, manufacturing 4k monitors and TVs will be superfluous and will be slowly dropped in favor of streamlining TV production. So yes, I think "8k gaming" will definitely happen, but not necessarily because the demand for 8k will be huge, but because progress never stops and it'll just slowly creep up on us as lower resolution displays are phased out.
Making 8K displays is not the issue. Monitor tech has far outpaced graphics-processing progress. Moore's law is dead. The diminishing gains we do get are being aimed at path tracing and higher refresh rates; there will be no leftover power to drive a 4x increase in pixels. With upscaling from 4K it might be possible on the very best PC hardware, but most of that user base isn't playing on massive TVs. Consoles have no chance.
I would love an 8K display. I'm using a 43-inch 4K display right now with 100% text scaling, and its pixel density is not much better than my old 24" 1080p monitor. For productivity 8K would be amazing. For gaming though? Maybe for something like Anno? I think the second half of the 2030s might be realistic for 8K gaming to enter enthusiast-mainstream territory.
In 10 years, if 8K becomes reasonably priced and graphics hardware keeps progressing as it is, I can see it being a thing, even if rather niche.
This generation is still relying on upscalers. 8K is not happening.
I want 8K gaming to be a thing so badly, because then you won't need anti-aliasing in games. So many games today have terrible forced blurry anti-aliasing and it's an eyesore; if 8K were the norm there would be no need for it. While I would rather game devs just make good anti-aliasing, this is the alternative.
Eventually it will become a thing, but not for a very long time. The cost of screens, not to mention the compute power needed to drive that many pixels with modern effects, is nuts. I await the day when a 4K game running on an 8K screen makes someone say, "Damn, that looks blurry." We're still in an era where 4K TV broadcasts are rare, so content that drives the technology outside of gaming has to get there too, and as I said, that won't happen for a long time.
Big difference between a "thing" and "mainstream". 8K has been a "thing" since the first 8K monitor came out in 2017. It has been realistically possible, in terms of getting 30+ fps in games, since the 3090's release in 2020. Mainstream-wise, it probably won't be until the 7090 that every single game can be maxed out at 8K without DLSS or ray tracing and still maintain at least 30fps at all times.