The 30FPS VS 60FPS Debate - Luke Reacts
- Published 10 Mar 2024
- 👉🏼 LINKS: linktr.ee/lukestephens 👈🏼
--
Luke Stephens is a content creator and streamer who co-streams on YouTube and Twitch. This channel has live stream clips and highlights in addition to the streams themselves. Check out Luke's main channel & other content by going to: linktr.ee/lukestephens !!
--------------------------------------------------------------------
Subscribe and I'll be your bestest friend!
#gaming #LukeStephens #ps5 #xbox - Entertainment
Would rather see games go to 1080p and 60 fps than back to 30 fps
480p 30fps, take it or leave it
Agreeeeeeeeeeed! I do not get the 4k thing……
True
4k interlaced 40fps with VRR, 1080p 60fps, 1080p upscaled from 720p at 120fps.
Absolutely true, 60 should be the minimum standard in 2024. So tired of seeing these top graphics moving at 30, it's just really lame
Framerate is so much more important than resolution. I would rather play a game at 1080p 60fps than 4k 30fps. It is unacceptable when modern games don't at least have a 60fps performance mode.
What's the importance of it? How does it affect you?
Yea I can do 4k 30fps with most games. 1080p bugs my eyes and makes everything blurry and hazy to me. The only games I can't do at 30fps are racing games, obviously
@@alexwilliams4729
But 1080p is still considered good resolution.
@@kiefline3785 1080p is bad. Especially when you play on big screens.
@@kiefline3785 it's not a good resolution. Look at Digital Foundry's TAA video. 1080p sucks for modern gaming
Watching a clip at 30fps is much different than holding a controller and playing at 30fps. It's more about the way it feels, not necessarily the way it looks. 30fps is most of the time a complete deal-breaker for me.
For me it depends on what type of game it is. I think some games can work with 30 FPS just fine but when it comes to shooters like a cod or doom yea it’s gotta be 60 or it feels like you’re stuck in mud when you move around. Or like certain character action games such as devil may cry those games feel so good at 60. But on the other hand if I look at something like resident evil 2 remake I don’t necessarily need 60 FPS for a game like that, 30 feels fine to me so it just depends. It’s also about what you are personally used to. Let’s say you have a nice setup with a 144hz monitor and play most all games at 120 FPS or above then yea I could see how 30 would feel unplayable after that
I'd rather play gta 6 at 1080p 60fps than 4k at 30fps
100%
I would rather 1440 p 45 fps
@@omega458 you don't know how bad that would look. 40 fps only looks good on 120hz displays. 45 fps is such a weird number, incompatible with 60hz monitors; it would look choppy and inconsistent.
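The divisibility point in this reply is real and easy to check. Below is a minimal sketch (not from the video, just illustrative arithmetic) that simulates a steady frame stream presented on a vsync'd display: each frame is shown at the first refresh tick at or after its deadline, so a framerate that divides the refresh rate evenly gets uniform frame times, while one that doesn't produces alternating hold times, i.e. judder.

```python
import math

def frame_durations(fps, hz, frames=None):
    """Simulate presenting a steady `fps` stream on a display refreshing at `hz`.

    Each frame is shown at the first vsync tick at or after its deadline.
    Returns per-frame display durations in refresh intervals: a single
    repeated value means even pacing, mixed values mean visible judder.
    """
    frames = frames or fps  # simulate one second by default
    # vsync tick index at which frame i is presented
    shown_at = [math.ceil(i * hz / fps - 1e-9) for i in range(frames)]
    return [b - a for a, b in zip(shown_at, shown_at[1:])]

# 40fps on a 120Hz display: every frame held for exactly 3 refreshes -> smooth
print(set(frame_durations(40, 120)))  # {3}
# 45fps on a 60Hz display: frames alternate between 1 and 2 refreshes -> judder
print(set(frame_durations(45, 60)))   # {1, 2}
```

This is why 40fps modes are typically offered only on 120Hz screens (120/40 = 3 exactly), and why 45fps on a 60Hz panel looks choppy despite the higher average rate. VRR sidesteps the whole issue by letting the display wait for the frame.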
Imagine playing a modern fighting game at 30fps 💀
But it is possible, right? Didn't Guilty Gear Xrd do it? And the game looks and plays great
I can tell a HUGE fucking difference between 30 and 60FPS. PC has ruined 30 for me, I can’t fucking do it.
Big time, and wait until u pop ur cherry with 120fps. like an eyegasm, provided u have the 120hz monitor.
@@FullyShadow Honestly I have a 144hz monitor and can go back to 60 fps without even batting an eye, anything lower than that is visual torture tho
30fps done right with correct frame-pacing and double-buffer VSYNC with low input-latency feels great. I'll take it over highly inconsistent 60fps with stutters every second, a mountain ridge for a frame-pacing graph and a resolution of 450p upscaled to 1440p. Bottom line: Developers have to put actual work to make their games feel good to play.
Agreed. I've been a laptop gamer before, and was always at 30FPS, but after graduating and getting enough money for a high-end PC, it's only natural to ask for 60FPS. You've spent time and effort to save up money and play a game on a good PC. Anything that's lower than 30FPS isn't a big deal to me, but it definitely felt lacking.
Absolutely agree, going back to 30 genuinely does feel unplayable. For me what's ideal is having most games run at 60 & any competitive games at 120. The difference between 60 & 120 is noticeable when you jump between the two but it's a far smaller gap than 30 to 60. I wonder if a lot of the people that are perfectly comfortable with 30 are just people stuck on console, because aiming or just moving the camera around with a thumbstick is usually a lot slower and smoother than aiming with a mouse. Mouse aiming is more erratic & fast & sharp and when you're flicking around, 30 is simply unacceptable.
Performance matters more than graphics. I wouldn't mind games being 1080p but I want 60fps minimum. You can tolerate low graphics but low performance/laggy performance can literally make a game unplayable & frustrating even if its a good game.
Y'all are the same ones saying there haven't been any true next-gen games yet. If performance is all you care about then it's been plenty lol.
@@tyler-hp7oq ??? Classic throw out a random topic because you have no real reply, like seriously, what was the point of this reply?
@@tyler-hp7oq Can the PS5/Xbox actually run 4k with 60fps? No. It makes you choose between graphics & performance mode. Not a single "next gen" game you are talking about can run 4k60fps on consoles. Not even one.
So my comment was actually about that issue. If I have to choose between two, then I will always choose Performance over graphics. And if games can't even give me 60fps despite making me choose between graphics & performance, then guess what? F*** that game.
It's not like I want worse graphics. If I can get both, I would take both. But the reality is right now many so called "next gen" games make you choose between graphics & performance & they are not delivering on the performance front at all. Uncapped 30fps, seriously?
@@ShinjiIkari007 Do you not understand hardware limitations? Consoles will always be pushed to their limits in one direction. Either performance or graphics. You can't have one without the cost of the other. Any game that actually decides to push the scope of what's possible will limit the game to 30fps on console. That's why games like GTA6 will definitely be 30fps. If you're someone who only cares about performance then don't cry about games this generation not being next-gen. In terms of performance, you are already getting tons of next-gen games with 60fps modes. Saying you want the best possible graphics and 60fps on consoles shows you have zero understanding.
@@tyler-hp7oq When GTA6 ends up looking like it did in the trailer and actually renders that many NPCs, I think a lot of people would understand a 30fps cap.
However, most games that struggled with performance this generation also looked shit. It's not like the games that target 30 look incredible.
60 > 30 I ain’t gonna waste my time watching a bunch of Masters debatering each other
So playing RDR2 for first time ever after playing GOW Ragnarok, if you think I couldn't notice the difference between 30 fps vs 60 fps ur high
100%, that exact reason is why I'm holding off for that rumoured current-gen update/port for RDR2 before I play it.
@lukeloggedin I don't blame ya I thought about it but didn't want to wait any longer. Once you get into it you don't recognize the difference much but at first for a lil while it's super noticeable
@@LukeLeafOnline Max Payne 3 needs a 60fps port.
@@Ronuk1996 no it should be 120fps, my weak 1650 laptop can run MP3 over 100fps, the consoles should do even better
I'm pretty sure actually that you can't ;D
Maybe for those that can't tell the difference between 30 fps and 60 fps should try turning off motion blur.
inb4 "If you need to turn off motion blur to tell the difference in framerate, then just keep motion blur on". Honey, motion blur is disgusting. I'd rather play on 30fps without motion blur than on 60fps with it, so yeah it's a moot point.
Never understood the hate for motion blur 😂😂 I think it’s looks pretty tbh.
@@henrixrahe They are fine in movies for effect. But in gaming you have to make out detail for survival or to find things to achieve objectives. The latter you can do by coming to a complete stop. For survival that is not an option.
@@henrixrahe btw, having to come to a complete stop to make out detail takes longer to find things. But most of the time you just miss seeing things. How I know that is I've played a game numerous times on a console, which was 30 fps, then played it again on a PC and discovered things I didn't know were there. The higher frame rate on the PC helped. But being able to turn off motion blur and depth of field helped as well. And yes, let's not forget depth of field. It is a good effect for movies, because that is what the director wants you to focus on, and is natural for your eyes to do when you focus on things, but in gaming you are the director and you can focus your eyes on anything you want.
@@0x8badbeef ohh I totally see what you mean. I usually play story games mostly so I can see why it hasn’t been too big of an issue for me
30 to 60 is a massive jump, I don’t understand why it’s even a debate. 1080p 60fps and higher is objectively better
NO TF IT AIN'T! STOP. FUCKING. LYING.
there never will be leaps in performance if we keep targeting 4k....at least not for a while
It is absolutely noticable. Especially in fps games. And the clip is moving the camera very slow.
The person who posted this tweet also claims Astral Chain is the best action combat game of all time. Their proof is a screenshot of a metacritic review. I wouldn’t trust a word they say.
I love Astral Chain it’s underrated at best
@@thetruestar6348 yeah even looking at Platinum's own catalog it doesn't hold a candle to other action combat games they've done
@Mandingy24 it's a gem of a game
Is that Nintendo guy high? 90% don't notice? 90% WOULD notice the difference between 30 and 60
chill out
He's an ass, always has been. A newcomer to gaming can play both and tell the difference
his name explains all, he plays Nintendo games so he's used to playing at 30fps
Play a game at 60fps for half an hour to an hour. Then switch to 30 and do the same. Yeah you'll switch right back to 60
30 is not acceptable anymore sorry. I'll go back to 720p if I have to
I think more games on PS5 specifically should add a 40fps mode for that balance of resolution/graphics and framerate. That’s how i played Horizon Forbidden West, A Plague Tale: Requiem, and Spider-Man 2
Agreed. It's especially important for action games where input are really important.
40fps mode? Lol Spider-Man 1 runs fine at 100+ fps on PC. Y'all are getting shafted hard on console, damn.
@@stonaraptor8196 How much does that pc cost? doubt it costs less than a ps5.
@@firasdrass943 You can have a fine PC for about 1000 EUR. But the more important part is that if you like to play games, there are a lot more discounts and sales for PC games throughout the year. So in the long run you will be getting your games for a bargain.
And then there is the possibility of modding your game, including with performance enhancing or visual mods.
@@stonaraptor8196You got shafted into spending $1500 just to have something run better. Imagine being that bad with your money. 😂
40-50 fps should be the new 30.
45fps is a nice middle point
@@adk4986 yeah 45 is a really good sweet spot
If 30FPS was that much of a deal breaker, I simply would not play many games at all these days.
Huh? Are consoles that bad nowadays? Such a weird perspective to have anyways, accepting 30fps as a standard in 2024... Completely the opposite for me. I am used to playing every game at 100+ fps. In the rare case a game I'd like to play (Zelda on Switch) can't run above 30, it's gonna be a hard pass. Hell, even Breath of the Wild was great on PC at 60fps. But generally, 30fps is a broken game in my eyes.
@@stonaraptor8196 How games run just isn't a priority for me if the game is fun, I guess. Games like the original FF7, Dragon Quest 9, or Ocarina of Time aren't exactly running at what people would consider acceptable nowadays, but I return to those more than any game I've bought in the last decade or so.
@@stonaraptor8196 the Nintendo ninjas will get you for playing BOTW on PC.
30 fps is okay on consoles. Have recently played RDR2 and TLOU2 on my trusty ol ps4 pro and it's fine. If the visuals justify going with the 30 fps then fine by me. However, whoever says they can't see the difference is lying or delusional.
90 is the sweet spot for my eyes, less than that I can see jank, more than that I can't really tell the difference. I just lock everything at 90.
same
My issue with the 30/60 fps debate is that those who can't tell the difference KNOW there is a difference, but they can't tell you WHAT the difference actually is. It isn't obvious until you've had someone explain what to look for. Once you get it, you can't un-see it.
When i play story driven games I don’t mind upscaled 4K at 30-45 fps (ryzen 5600x with a rtx 3070) but multiplayer games I will turn down settings to hit 120-144 on my 1440p monitor
Thats quite fair
Going from TLOU part 1 or 2 and then to RDR2 really hurts my eyes but after half an hour, my eyes adjust and then I'm back having fun
I couldn't care less as long as it's stable.
30 fps aren’t stable
@@PassportBro_ No offense, but it seems like you don't understand the meaning of stable framerate, bro.
@@Sum_Yousah I do understand it.
What I meant is that the 30 fps in Dragon's Dogma 2 on console isn't stable, as confirmed by the devs.
The game is going between 20 and 40 fps, which is everything but stable.
@@PassportBro_ Oh ok. I obviously misunderstood you, sorry.
@@Sum_Yousah no problem bro
Always the voice of reason/skepticism. Great video, Luke!
I can tell the difference between 30 and 60 and thought it was a game changer when I first started playing at 60+, but since I'm mostly a console player I still often choose to play games in quality mode, but I know that many can't go back to 30.
Ps5 DID feel like a big leap at the beginning of the console. Demons Souls was amazing. They just stopped trying and made games for old systems too
That stalker trilogy release has a menu toggle, and no amount of motion blur can save the 30 fps image destruction.
I'm used to 144fps, anything below 60fps is unplayable for me. This "debate" is a bit weird. Like plebs fighting over scraps. This sounds extremely bad, but it's kinda hilarious to watch.
Upscaling is used heavily by consoles. Both the PS5 and Series X render at about 1440p or lower for most games to hit 4K 30 fps. In 60fps performance modes, render resolution can sometimes dip to 720p or lower, upscaled to 1440p.
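To put those render resolutions in perspective, here is a quick back-of-the-envelope sketch (my own arithmetic, assuming the standard 16:9 resolutions) of how many pixels each target actually requires:

```python
# Pixel counts for common 16:9 render resolutions, to put the
# console upscaling figures in perspective.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} px ({count / pixels['4K']:.0%} of 4K)")
# 720p:  921,600 px (11% of 4K)
# 1080p: 2,073,600 px (25% of 4K)
# 1440p: 3,686,400 px (44% of 4K)
# 4K:    8,294,400 px (100% of 4K)
```

So a 1440p render is under half the pixel load of native 4K, and 720p is about a ninth of it, which is why dropping the internal resolution buys so much frame time in GPU-bound games.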
30fps is super noticeable now that I'm used to 60fps+ since I bought a PC. But it doesn't make or break a game for me.
I switched to PC gaming because I'm tired of 30 fps on consoles
Yeah yeah yeah let’s all pretend we can actually see the difference between 30 and 60. Guess what, there isn’t one
@@TheCephalon there is a difference between 30 and 60 but 120 fps seems excessive
@@TheCephalon stick to ur 30fps then, we will enjoy our games on pc with 120+fps :)
@pwyll63627 I play at 165 and I notice the difference
@@TheCephalon There is a fucking difference, otherwise you're blind
I upgraded my pc few years ago, since then I left all my consoles behind gathering dust. Once you go 60, it's really hard to go back to 30.
I use a big 4K TV but play everything at 60; I can hardly even tell the difference from 1080p to 4K in games that have the option
I started saying this years ago, the push for 4K was gonna be highly detrimental to the industry and here we are. For rasterization media 4K absolutely should not be the standard if it means sacrificing actual graphical fidelity and performance.
The difference between frame rates gets less noticeable as you go higher because of frame timing. It's diminishing returns. The difference in frame time between 30 and 60 fps is about 17ms, but from 60 to 120 it's only about 8ms
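The diminishing-returns claim above is just the arithmetic of per-frame time budgets; a quick sketch of it:

```python
def frame_time_ms(fps):
    """Time budget per frame in milliseconds."""
    return 1000.0 / fps

# Each doubling of fps halves the frame time, so the absolute
# improvement shrinks with every step up.
for lo, hi in [(30, 60), (60, 120), (120, 240)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} fps: frame time drops by {saved:.1f} ms")
# 30 -> 60 fps: frame time drops by 16.7 ms
# 60 -> 120 fps: frame time drops by 8.3 ms
# 120 -> 240 fps: frame time drops by 4.2 ms
```

This is why 30 to 60 feels transformative while 120 to 240 is subtle: the absolute reduction in motion-to-photon delay per step keeps halving.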
Majority of people cannot FACTUALLY tell the difference from 30 to 60 FPS. Most gamers feel a difference before they SEE it. Your eyes can't even process 120 fps, but our brain feels a difference and that's why plenty of us get headaches with fps that high.
Totally false. Most people can tell the difference just by looking. A good example is television recorded with the "soap opera effect" which is just motion interpolation from 24 fps to 60 fps. Almost anyone will pick up on it.
I started gaming on PC so I started with 60FPS, and if I play something at 30 fps for more than 10 minutes I get a headache and motion sickness.
Having a 240Hz monitor, going from 60fps to 120fps is also a substantial shift and a welcome one; gameplay becomes buttery smooth.
However, going from 120 to 240fps the smoothness increase is noticeable, but it's not a substantial shift, and it needs my PC to work twice as hard.
Honestly… I used to think I’d be unable to play games at 30 fps, and for the most part I still do - on a monitor. When I’m up close, it makes me motionsick. But if I’m further away, even if the screen is larger and takes up the same percentage of my view, I’m largely fine with it as long as it’s stable. If frame times are all over the place and there’s spikes and all sorts of anomalies, then it’s problematic, aiming is more difficult at 30, but that’s about it. I vastly prefer 60, but 4k is just sooo KRISP it hits different.
3 years ago I went out and bought a 120Hz TV for this generation. Looking back I should have bought a 1080p screen and an XSS. Still waiting for consoles to catch up to my TV
debate? not really
What I have come to understand is it's far more to do with the 1% lows than the average. I used to be 120fps or die, but now with my current rig being so well rounded it's so smooth and I'm very happy with 80+. If you're on a synced and locked 30fps, a long way from a heavily motion-blurred TV, playing with a controller, it can be acceptable in some genres.
I'm a person who got used to 30 fps; the moment I put it to 60........ I got motion sickness the first time I played a game at a 60fps setting. It took me a while to get used to it; I couldn't stand getting motion sickness every 15 minutes way back before.
He is turning the camera slowly enough to fit his narrative. In quick turns of the camera it causes dizziness.
I find I kind of get used to 30 FPS if I play long enough and "power though" the noxious phase. It's probably also less of a problem for people who don't have 4090s and 270hz VRR monitors.
Imagine being a console peasant who thinks he's better than the PC master race.
It's 2024. I can't do 30fps. Switch is the only one I can do. Otherwise it's a no. Never. Never again.
It really depends on the game. A lot of games are fine at 30. But adding in the option for stronger hardware is a good idea. I have always had mid-range to low end hardware. So, I tend to lock to 30-40 fps.
Recent Doom games use heavy (but high quality) motion blur; maybe that’s why they give you motion sickness. Try disabling that if you haven’t next time.
There's no argument, we had 60 fps games on N64 and PS1 for Christ's sake. If a PS5 with a GPU on par with a 5700XT/6600XT can't do 60 in all games in 2024, that's a yikes on the devs' dog.
Games that make graphical leaps and bounds aren't coming out as much; I think they are coming, but it's taking much longer with every new game. I think Ghost of Tsushima 2, Uncharted 5, and GTA6 will do that, they'll just come late in the gen. And that will probably happen from now on: all these transformative games coming late in the new console gen.
I've been playing 60 since 2020 w the ps5, didn't think I could ever go back to 30 when any time I try, my eyes hurt and I get a headache. However- I recently decided to start and finish an old AC game, sucked at first but then my eyes adjusted and 30 is fine for me now. Don't get me wrong tho, ideal world is a console that can deliver true 4k with locked 60fps, even 120. RT I can totally do without
I have never played on PC, I play on PS5. Ever since I got it I just can't do 30fps anymore, I hate it. I always play on performance and if there's no 60 I most likely won't buy
I would rather play a game at a locked down 30fps than a shaky 60fps with frequent drops or stutters. I have also found that, as I’ve gotten older I care more about image quality than I used to.
I don’t like jagged edges, I don’t like fuzzy or patchy textures or upscale ghosting. These issues are usually best solved by upping settings and or resolution. This usually means playing at 40 or even 30fps sometimes so I can actually enjoy what I’m looking at, and I’m okay with that. I will say however I do prefer high fps 120+ in shooters.
Baldur's Gate was because of the split screen, which runs badly even on the high-end consoles. And the Series S is kind of a One S, yes, but the target res is completely different (One X was targeting 4K, One S realistically full HD)
The Xbox Series X is what ruined 30fps for me. Their back compat program and its FPS boost is insane.
Almost every new game I play has a 60fps option and basically every old game I play has FPS boost. In the rare case where a game comes out locked at 30, I'll happily just skip it, there's already way too much to play.
I used to be 60fps or nothing. After getting a Steam Deck, however, my thoughts have changed a bit. 30fps doesn't bother me all that much, as long as the frame pacing isn't horrendous. I still like 60 of course, but I don't feel robbed if I'm unable to play that way.
This is why games should offer performance mode at 60 fps and fidelity at 30 fps
My problem is the inconsistency of the devs. Certain devs can run 60fps on console easily while others make it seem more complicated than it should be.
After switching to PC many years ago 30 fps feels like 15 fps to me now. Like watching a slideshow. Completely unplayable for me.
As long as it's not under 30, then the game is completely fine and playable. There, I solved the debate.
You haven't solved anything though. This is sort of the starting point of the debate. Fine and playable are not optimal and great. And shouldn't we strive for that? Some say fine is enough others won't settle for that.
@@TheHidalgo99 There is nothing to be solved. Let's be honest here, it's just another debate everybody will forget about, that's what you fricking apes do all the time xD xD
@@TheHidalgo99 Yes, they will settle, that's the nice thing. Every time people say they care about something, they actually don't. It's the truth, just accept it.
1440p at 60 is what every game should target. I think that would give the most room for players to adjust to their liking based on their setup. Then if someone has a 4K TV with VRR they can play at 4K 45 fps without it looking choppy. However, if they're rocking a 1080p 144Hz monitor they can adjust the resolution down for the extra fps.
I don’t really care about 60 vs 30 fps so long as it’s stable, but if I’m given the choice between 4K 30 or 1080p 60 I’m choosing the latter every time
Why does *_Rebirth_* require *_a full quartering of its resolution_* to go from 30 to 60 FPS?
I can't tell the difference unless it's side by side
Many people don't realise the 30fps trend started with the PlayStation 2 era. Before that everything was 60 fps.
30 fps with high motion blur to mask it actually makes me feel motion sick. I am used to 144fps these days, but yeah. I'd say 95% of people who have played 30fps as their standard won't notice the difference until they spend some time comparing and getting used to 60fps. If you're used to 60fps it's IMMEDIATELY obvious.
For the type of games I usually play I don't care about the framerate at all and can't really tell the difference. So for me personally, as long as I don't have fps drops it's fine
I find that I sit in the middle. Either frame rate is great, depending on the genre you're playing. But frankly, I find I can play almost anything at 48fps. Less choppy than 30, more cinematic than 60. It's a nice experience.
not having at least 60 fps is far more unacceptable than a game not having 4k resolution
all games should just come with a performance mode that can run 60/120 fps with 1080p
It was a standard back in the day … after the Dreamcast it was all about fidelity and resolution and only in a few instances about performance … with the current generation I had high hopes to have 60 back as a standard
And I also think optimization is just not in the budget anymore, and usually only experienced devs really care about it
I bought a Series S when it came out (which is like 4 years ago) and I can't say anything really good about it. It doesn't matter the game: it's super blurry, FPS are inconsistent and can be pretty bad, and idk, now it's somewhere in my tech storage shelf because I don't use it and I'm too lazy to sell it for a hundred bucks.
These are the same people who in 2030 will say “60 fps in 2030 is criminal, standard should be 256fps anything under that is literally unplayable” and have no clue what goes into making a game. If you’re willing to miss out on a great game because of it, go ahead.
I have a monitor that does 75Hz. I can tell the difference from 60 to 75. But the thing is, if games are being charged the same price, why not get the better version? The new Contra game for example. Why get it on the Switch?
The thing is that once you try higher FPS or higher Hz you can't go back. 60 fps is just not smooth for me, I need at least 90+
Can't we all just agree that not having the option of choosing res and fps just sucks? For some games I want 4K even if it's at 30, at others I was 60fps even if it's at 1080p. Not getting to choose is what sucks
I honestly like both
As someone who appreciates the classic 80,90s era of games and obviously the current day 2000s to now era
I don't necessarily see the issue; I prefer something functional or at the least enjoyable.
Personally 30fps is fine for cinematic styled games
And 60fps or higher for true competitive/pvp games seems logical.
Both are playable and nowhere near as bad as the games from the PS3/360 era that were mostly inconsistent within the 20-30fps range.
In the end, I believe player choice is the best option for all. I have never heard of a person getting mad because devs gave them choices for their visual/audio options, so clearly the majority of players like that
30 is fine for Xcom or something, anything that has action based gameplay is rough at 30.
I personally don't mind playing 30fps if the trade off is worth it. 30fps cap with good frame pacing is better imo than 60 but stutters and freezes. 1080p60 is ideal for competitive or fast paced games.
Exception would be games where high frame rate doesnt matter as much like persona. I would rather get 4k30 vs 1080p60 there. The trick is not to go back and forth because your eyes need to adjust.
I really want to say that the Series S is holding AAA gaming back, but it's difficult when there are so many games that still chug on the Series X and on the PS5.
Because part of me hoped that developers would aim for the Series S, the weakest console, and the stronger ones would benefit simply by being stronger, but that hasn't been the case.
It is, but it's also helping the Series X and PS5 perform better than they should; a lot of stable 60fps modes were only possible because games were developed for PS4, Xbox One, and Series S first
I've been playing on performance mode when available on my PS5; you can't be honest and say it is not noticeable. One of the last games I played was the Dead Space remake. I put that thing on graphics mode to test it, and I felt input lag suddenly came to the table. Had to change it back immediately.
30fps chugs on vrr monitors/TVs
When I was in university one of my idiot professors said: How should profit at the company be? Only wrong answers! Profit should be higher! Maximum! Only that is good!! 😊 720p or 1080p? With VRR or without VRR? Maximum! 4K native with VRR on at 80+ fps! We are in 2024! 🫣🫣🫣🫣😮‍💨😮‍💨😮‍💨😑 Of course, not by paying $400-500 for a console. It is impossible at this price range!
1:50 Saying 4K is more important because it is advertised for the consoles and TVs is inaccurate imo. Current gen consoles and TVs are also advertised as being able to utilize 60-120 fps so shouldn’t all games do that too by that logic?
Also for me the difference between 30 and 60 fps is huge compared to say 2K and 4K.
Still waiting for a 60fps mode for RDR2
Yep, I think that way too. If we don't get graphical leaps, we should at least get good performance. I still believe GTA6 and the next Naughty Dog game could bring us a graphical leap AND 60 fps. Still haven't felt like "oh now that's next gen!" (maybe Phantom Liberty in some areas), but these developers for me personally are the best if we talk about technical stuff.
Someone gave me the analogy that playing a game is like driving a car through a safari. Would you rather be in a shitty jeep with no suspension and sheet metal for a chair, or a seat with a seatbelt that has decent suspension to dampen the shock. Either way you're getting an experience of a lifetime, but one makes the experience more enjoyable!
I've seen 60fps enough that it *can* bother me to play at 30...but if the proper visual tricks are used to slightly smooth the 30 then I'm fine. Like I never notice 30fps on Skyrim, older COD games, and even Far Cry 5...but Far Cry New Dawn, Fallout 4, Spider-Man Remastered, etc. seem really stuttery at 30fps
When playing games like Overwatch I can even tell the difference between 300 and 240 fps on a 240Hz monitor. Maybe I couldn't tell 120 from 144, but 144 to 240 is a massive difference.
I literally switched to PC 8 years ago because I saw my cousin playing a game at 60 fps on his computer. And I noticed it immediately! I actually commented on it before he said anything about frames.
30 FPS: It's fine if that's what you're used to.
60 FPS: Holy crap, 30 is slow as hell.
The big reason why I love 60 FPS over 30 is the smooth camera movement. 30 FPS camera movement, when per-object motion blur (FF7 Remake) is not utilized, feels really jittery even though it isn't. Panning the camera around at 60 FPS is just lovely.
For me all games should have two modes, performance and graphics: one at 1080p or 1440p 60, one at 1440p or 4K 30. None of this 720p upscaled or 900p upscaled. And looking at Dragon's Dogma 2, it has no modes and runs at 20ish fps in combat based on previews, so that game is not gonna be playable.
Dragon's Dogma is 30fps on consoles; I'm surprised there isn't a bigger uproar.
I think a big part of the problem is that we're still seeing new games being released on both current-gen AND last-gen consoles to maximise sales, so games are still being made to have minimum viable performance on 13-year-old hardware. We're nearly 4 years into this generation of console; it's well past time to say goodbye to PS4 and Xbox One.
That crack about not telling the difference between 30 and 60 is absolute BS. To me, anything over 120 gets harder to tell apart as you go higher.
I don't get motion sick but I do feel like I'm walkin on quicksand
It's funny how the video he posted was slow moving with slow panning of the camera 😂
Anyone who says they can't tell the difference between 30 and 60fps is either lying or needs an eye exam.
The whole 30 fps vs higher thing is that it was a bad-faith marketing bullet point from ~two console generations ago, when devs were hard-locking 30 fps in games like AC and were marketing the games on graphics. 30 fps is fine, but it's more that a publisher doesn't want to spend on optimization, or the dev tied the physics to frame rate, vs "oh no, this game doesn't need 60+ fps or performance/graphics modes"
Better graphics go out the window when it's stuttering and blurry. 60 fps should now be the standard.
I genuinely can't play at 30 fps anymore. 45 fps is about the lowest I can take. Some games hide lower fps well with motion blur, but most of the time 30 fps feels jarring. I just can't do it anymore.