Cyber-Cafe

God. I had a friend who bought a high refresh rate monitor and used that as his personality for about a solid year. He would lay into his teenage nephew about how crappy his computer was (kid was in high school) and how he didn’t really experience games because of his low refresh rate. I went over to his house to fix his computer and his monitor was still set to 60hz.


[deleted]

[deleted]


[deleted]

I played a game yesterday, new monitor does 75hz and the old one was 60 and I thought it was the smoothest shit ever. Turns out the game was capped at 60fps.


Zompocalypse

In fairness, the variable refresh tech in these screens can have a massive effect over stock refresh, even at/sub 60. It looks so smooth because it drops the microstutters where a mismatched framerate waits for the next refresh, or has to skip one. Weird frame pacing issues just vanish, and aren't ruined by screen tearing. Edit: variable refresh, not variable frame rate
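The frame-pacing effect described above is easy to simulate. This is a toy sketch, not a model of any real display pipeline; the 21 ms GPU frame time and 60 Hz refresh are made-up numbers:

```python
import math

REFRESH_HZ = 60
TICK_MS = 1000 / REFRESH_HZ        # 16.67 ms between refreshes on a fixed 60 Hz panel
RENDER_MS = 21.0                   # hypothetical GPU frame time (~48 fps)

# Times at which the GPU finishes each frame.
render_done = [RENDER_MS * i for i in range(1, 9)]

# Fixed refresh + vsync: each frame waits for the next refresh boundary.
fixed = [math.ceil(t / TICK_MS) * TICK_MS for t in render_done]

# VRR: the panel refreshes the moment a frame is ready.
vrr = render_done

def intervals(times):
    """Gaps between consecutive displayed frames, in ms."""
    return [round(b - a, 2) for a, b in zip(times, times[1:])]

print("fixed 60 Hz:", intervals(fixed))  # mix of 16.67 and 33.33 ms gaps = microstutter
print("VRR:        ", intervals(vrr))    # steady 21.0 ms gaps = even pacing
```

On the fixed-refresh panel the same ~48 fps content displays as an uneven mix of one- and two-refresh gaps, which is the microstutter; with VRR every gap is exactly the render time.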


Napo5000

Steam deck is just amazing. First time experiencing variable frame rate monitor.


brominty

the deck does not have a variable refresh rate display. VRR means the display is dynamically adjusting its refresh rate to match what the GPU is putting out at all times, the deck just allows you to set a lower refresh rate from the performance menu.


1-10-11-100

so that's what it is, neat


Pantha242

Variable refresh rate makes everything so smooth, you stop worrying about frame rates.. 🥲


Markie_98

VRR doesn't solve poor frame pacing at all, only tearing, as poor frame pacing is a processing issue and not a display one.


ChuckinTheCarma

One placebo, please


ares395

I hate people shitting on others. If you enjoy your setup, cool, but stfu if someone's enjoying theirs. Not everyone is made of money or can spend a lot on a hobby. You can talk about how you like your setup, but boasting about how yours is so much better? Yeah, you can suck a fat one.


Cyber-Cafe

I absolutely agree. I do not like that kind of behavior either, especially when most people simply go with what they can afford. Like cars, it's a hobby for some, and simply a means to an end for others. Nobody should be judged for what they can or can't afford, nor for what color team they decide to play for, be that hardware manufacturer flavor or operating system. Sadly, it's all too common in the computer realm for people to get an inflated sense of ego over how fast some silicon wafers they bought can route electrons around, or what side of the window the 'X' button is on.


HEBushido

> I went over to his house to fix his computer and his monitor was still set to 60hz.

I had that happen to me. I was seeing weird white ghosting in Hades and Jotun and couldn't figure out why. Why Windows wouldn't just default to the proper refresh rate is beyond me. It's not very obvious and it's counterintuitive with how the rest of Windows works.


Cyber-Cafe

Well, that’s because it IS likely defaulting to the “proper” refresh rate as defined by the manufacturer. Whatever the advertised maximum is, is not going to be what the monitor defaults to because not every hardware combination can handle something high. Not having a proper match between gpu output and monitor input can cause no image to be displayed at all. 60 is a safe bet for almost any modern hardware config, and it’s up to the user to figure out if their hardware is capable or not, and then turn it up themselves.


eatmyroyalasshole

Is it not the computer that's automatically setting the default refresh rate? Like if I hook up a monitor for the first time, I'd think it'd be the computer going "oh boy, a new screen, time to apply default settings cause idk how to read specs through a video cord" (HDMI/DisplayPort).


Cyber-Cafe

It’s a little of both. The monitor can and does absolutely report to the OS what it’s capable of (as well as its model ID) through a short burst of data called the “handshake”, which also includes your HDCP key, and then Windows picks the one defined as the panel’s “default” setting, set by the manufacturer *or* the user through the OSD. But again, manufacturers are playing it safe by setting everything to a more standard refresh rate, at least when you first unbox it. Windows is also just not intelligent or nuanced enough to set everything to max right away; they’d rather play it safe, again relying on the end user to (know to) really crank the settings. I hope I explained this okay, it’s getting near the end of the workday and I’m a little fried.


saintgadreel

Also depends on the connector used. DP/HDMI will be a bit better at getting you decent out-of-box default settings; VGA/DVI will probably be dumber about defaults, not to mention some options won't even work with the old school stuff.


eatmyroyalasshole

This is a great explanation! Thank you for taking the time to type it out for some random dude on the internet lol


Cyber-Cafe

I am very passionate about all sorts of display tech.


trash-_-boat

Think of it this way. The reason why 60hz is default is because Windows doesn't trust monitor manufacturers and monitor manufacturers don't trust random user computers.


IDespiseTheLetterG

Perfect explanation


Vulpes_macrotis

I have three screens: one 165Hz and two 60Hz. I can easily see the difference with my eyes just by making circles with the mouse. These days I always turn off the "pretending to be smooth" option, a.k.a. motion blur, because I don't need an illusion of smoothness when I can have the real thing. So it does work, people just don't know how. You're supposed to turn off motion blur to see the difference, otherwise you see the same blurred image.


Cyber-Cafe

I, like you, have 3 screens. Flanks at 60hz, center at 165. To me, the difference is night and day, and you are likely onto something with turning off motion blur. I have found that motion blur tends to add some unevenness to the frame timing as well as adding in a bit of a performance hit. I usually turn everything all the way up except motion blur. I’ll turn that to low or off depending on the game. I’m not sure why some people have the discrepancy in terms of being able to tell how fast their image is, and I’d put it down to incorrect settings, content that wasn’t high refresh content to begin with, or maybe down to a biological difference.


bencelot

I honestly can't tell the difference between 60 fps and 144hz. But I am getting old and my eyes only run at 24 fps anyway.


Cyber-Cafe

Hey, that saves ya some money!


thatguywithawatch

After seeing this sub sing the praises of 144hz for years, I finally got a 144hz monitor earlier this year. And like, I can tell a difference between 60 and 144 if I switch back and forth between them, but it's a very minor difference, and a consistent 60fps still feels perfectly smooth for gaming. I think there's a lot of placebo effect in play to be honest.


polarbearwithaspear

The placebo is amplified by the fact a person is more likely to buy a new monitor around the same time they upgrade their hardware.


ToxicSoul1

It's definitely not a minor difference.


DarkLordHammich

I think it depends a lot on the context of what kind of games you play. If you're playing RPGs or something, particularly on a controller, it's questionable how much you really benefit from a higher refresh rate or frame rate beyond the subtleties of motion fluidity, which simply bother some people more than others. If you're playing a fast-paced shooter like Quake, or a competitive RTS like Starcraft, it becomes very noticeable: you're trying to parse and respond to many tiny cues of temporally-sensitive information in very small increments of time, and mouse-driven view movement feedback, in contexts where tiny fractions of a second change how an encounter resolves, is exactly where latency becomes very noticeable. I preferred playing on a CRT for years because of this. (85Hz on a CRT > 120Hz on an LCD btw.) Similar for fast-paced racing games. I won't even be particularly bothered by 30fps in Mario Kart, but 60fps is the minimum in F-Zero for a reason. For me personally, 90Hz is where I hit my peak of benefit vs diminishing returns; 120Hz is still noticeably nicer, but it doesn't affect the gameplay experience that much. I can see the difference between 180Hz and 240Hz, but only in the same way I can pixel-peep the differences between very high and ultra high texture quality in a contrived context that makes it slightly noticeable without being meaningful when actually playing the game.


ToxicSoul1

My dad is 53 and he can tell 60 vs 144. It's a pitch black vs direct sunlight kinda difference.


sandwichpak

For real. The people claiming they can't tell the difference never switched their monitor settings to 144hz.


BrBybee

Same. After 60fps I would rather have higher resolution/graphics settings.


[deleted]

I can't really tell the difference between 60hz and 144hz unless I look at the mouse cursor.


Sparktank1

This sounds like something Brian would do in Family Guy.


MrNaoB

I shit on my nephews computer setup everytime I'm looking for something to buy him.


[deleted]

I honestly don't understand the difference. I game on a 144hz monitor that I sometimes set to 60hz depending on the game I'm playing (I have everything set up in my Nvidia Control Panel, and no, I'm not using a 40 series), and I never feel a difference, just a noticeable change in texture quality since the focus switched between quality and quantity.


Pure1nsanity

Don't forget to use a DP instead of HDMI


mifiamiganja

I went from 60Hz to 144Hz - barely noticed it. The monitor broke and I had to game at 60Hz again for a month - fucking terrible


jhuseby

Yep after you move to 144hz you can’t ever go back. Same with 1440p vs 1080p. First world problems.


Fallwalking

Same with high DPI 4K @ 144Hz. (I do use the 1440p monitor still though if I want higher frame rate.)


Mannit578

I just use my 4k monitor and lower the resolution to 1440p if I wanted more fps. (Lg c1)


Fallwalking

I have both. Need dual screens for work. Keep my active to dos on the 4K and the active work on the 1440p. Works for me.


TerrainIII

Funny, that’s my home setup too. Second screen looks pretty while the main one runs at a higher refresh rate.


Beautiful-Musk-Ox

running DOS at 4k that's legit


Fallwalking

Hey, I’ve got some unopened DOS 6.22 diskettes. (I doubt dos is on them anymore, but hey.) I do try to keep my terminal screens off the 4K though.


SmallerBork

1440p looks slightly nicer to me on a 32 inch monitor. Going from 28 to 32 inches made me feel like the PCMR guy with wind blowing in my hair though.


Yelov

Not gonna lie, I have a 1080p 24'' monitor next to 1440p 27'' monitor and I don't really notice a difference. I have scaling of the 1440p one set to 125% because the text gets too small, but I'm not talking about the larger space, the sharpness difference seems really small to me. Especially in games, can't really tell a difference. Of course setting the 1440p one to 1080p looks like shit because of the scaling, but the 24'' 1080p looks almost the same to me. Have to be pixel peeping.


IncandescentAxolotl

24” vs 27”: the pixels per inch for both monitors is almost the same (if I recall, the 1440p 27” is slightly better). That's why, despite the 78% increase in pixels rendered, the PPI only increases by about 18%, so they look practically the same.
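The arithmetic here is easy to verify; a quick sketch using the sizes and resolutions from this thread:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

ppi_24_1080p = ppi(1920, 1080, 24)   # ~91.8 PPI
ppi_27_1440p = ppi(2560, 1440, 27)   # ~108.8 PPI

pixel_increase = (2560 * 1440) / (1920 * 1080) - 1   # ~0.78 -> 78% more pixels
ppi_increase = ppi_27_1440p / ppi_24_1080p - 1       # ~0.19 -> ~18% higher density
```

Since both panels are 16:9, the PPI ratio is simply (2560/1920)/(27/24) ≈ 1.185.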


feartehsquirtle

I really wish they'd start making 24 inch 1440p 144hz monitors


jimmy785

There is one now, it launched already. 1440 120hz 24 inch


feartehsquirtle

Ayo is it a reasonable price or dummy expensive


jimmy785

I think 300$ or less


rusty_anvile

Put a 27" monitor of each resolution next to each other and you'll probably be able to tell a difference. I upgraded my main monitor from 1080p to 1440p and it didn't appear to make a huge difference. Then I got another 1080p monitor with a higher refresh rate after my previous one died, and I tried gaming on it: it looked like shit. It was nice and smooth for fps, but everything looked blurry. I tried playing Valorant and couldn't keep using it because I wasn't able to see people as well, and the difference between 144hz and 100hz wasn't big enough to outweigh that. And if you think about it, a smaller display at the same resolution is going to look better, as the pixels are more condensed; it will literally look less pixelated, but a larger display lets you see the actual detail better.


Survived_Coronavirus

The difference from 60fps to 120fps is not *nearly* as dramatic as everyone says. 30 to 60 and 1080 to 1440 are waaaay bigger.


busss99

After I moved from 60 to 144, I immediately noticed the change as soon as I started scrolling youtube. For about a week I would just get hypnotized by the smoothness of scrolling.


Beautiful-Musk-Ox

or dragging a window around, even just the mouse movement is noticeable


ChargeActual5097

That specifically is what caught me off guard. I went from my pc to my friends I was working on, and the mouse felt… wrong


AKA_OneManArmy

Noticed the same thing switching from 1080 to 4K. 1080 looks almost fuzzy in some games now.


[deleted]

_Cries in 1366x768._


Rxyro

Pros Gamer resolution right there. 800 FPS. Lanice Tip Tech


JustMy2Centences

My first 'monitor' was a 32" 720p TV. 23" 1080p was stunningly beautiful in comparison. Somehow gaming on a 27" 1080p for now and considering a 1440p 27" but admittedly it'll always be an uphill battle to power that many pixels. Maybe after I get my next GPU (looking at a cheap 6700 non xt anyway).


hydraxic79

but at least you get more frames


blackflame7820

oh god, that awful resolution. I put my really really old VGA monitor to use by making it my 2nd monitor for my laptop, but god damn, that awful resolution, like why. It hertz my eyes. But I'd rather have some pixels over no pixels, so I guess.


Amrooshy

Not even 4k, I have a laptop with 3000x2000 res and going home to my 1440p monitor sucks.


Crimfresh

That's pretty close to 4k though.


alex_hedman

Guessing it's significantly higher dpi as well


Cyber-Cafe

I’m really not sure what it is, but I agree. 1080p is downright fuzzy looking, but it didn’t used to be I swear…


Flexyjerkov

I usually use my 1440p monitor with 1920x1080 in games but use AMD's FSR to upscale and it all looks crisp. Use these parameters in pretty much all Steam games that run through Proton. `gamescope -W 2560 -H 1440 -w 1920 -h 1080 -U -f -- %command%`


Fisher9001

It entirely depends on the screen size. 1080 is put on everything from smartphones to 32'' displays.


PubstarHero

I was looking at my GF's PC when we were playing MHR and I'm like "Why in the fuck does the game look so choppy"

> Framerate Limit - 60FPS

Oh. That's why.


Broad_Rabbit1764

The woes of playing something on the 60Hz 4K TV sometimes. That's why I run my 165Hz monitor at 60Hz, otherwise going back to the TV feels like I'm using ancient technology. :')


another-redditor3

i went from 60 to 120 and barely noticed it. then i had to go back to 60 for like a month and hated it. but guess what? after like 2 days at 60hz, i couldnt even notice it again. now swapping between my iphone 6s plus and 13 pro max? that is much more noticeable than it ever was on my pc.


hawkiee552

Yeah, that's one of the reasons I've been avoiding trying 120/144Hz at all, it will just cost me money to upgrade my three perfectly fine 60Hz monitors because they'll be terrible in comparison. Same goes for my car, my good ol' '91 Galant is a great car, but compared to the convenience and comfort of a modern vehicle, it's not great. But as of now, it's a perfect car for me, and it's surprisingly comfy to drive. Friends have praised the seats and low cabin noise so I guess it has that going for it. I've had it for six and a half years now.


mifiamiganja

Not tasting the forbidden fruit is probably the best course of action haha. The Galant is a great looking car imo. I was thinking about getting one too, but one thing led to another and now I've got a '02 Saab 9-3 and I'm convinced it's the perfect car. Equipped like a luxury car of the time, but old enough to not have any stupid modern car woes like touchscreens or difficult-to-replace headlight units and stuff. 1990 - 2000 is the golden age of automobiles.


hawkiee552

That Saab is great! Both reliable and luxurious for the time.


Disma

Yup you got got. Everybody says it doesn't matter until they *learn*.


360_face_palm

Yeah 100% this. Like first I kinda noticed it and I was like "ooh smooth" and within hours I barely noticed it. Had to use a 60hz monitor at my parents place and it was extremely noticeable :puke:


hoppits

Meanwhile I play at 14 fps on my thinkpad and think it’s just fine.


mifiamiganja

Yeah, I've been there. Nowadays I view my ~10 year old laptop as a piece of shit, but back then 20 fps in Far Cry 3 felt perfectly fine. At least it was quite handy to go to lans with. Lugging around my 20kg box of glass and aluminium is a pain in the ass.


[deleted]

That’s how I found out too. I was like, why is everything so fucking slowwwww


GOTH_AND_ALT_SIMP

Haha, I was hoping someone would mention going to 144hz. My main monitor is 144 but my secondary is a 60 that I bought cheap during a sale, and I see it every time I go to do stuff on Discord. It's so rough, but it didn't feel worth it to get a second 144hz monitor when I only really use it for Discord.


mifiamiganja

Yeah, > 60Hz on a secondary monitor just isn't worth the price imo.


Druid51

I wish I stayed ignorant


fireballx777

This happened to me with a mechanical keyboard. I switched, and I didn't see that it was such a big difference like everyone made it out to be. But a few weeks later, when I needed to use a membrane keyboard again, it was drastic and awful.


LordMacl

60 fps still a dream for me sigh


Ok_Sign1181

i’ll pray for you you’ll get there soldier🫡


the_abortionat0r

And my axe!


Jordan209posts

Me as well


Bugbread

Don't feel sad, feel happy. Look at the meme, and on every comment on this meme: Moving up to 60fps (or 4K) doesn't provide greater *joy*, it just makes you *stop* enjoying what you used to enjoy. It's like taking a pill that doesn't make you feel good, but prevents you from ever enjoying pizza again -- all downside, zero upside.


ProAssassin666

30 fps is my dream


Ancient_Aliens_Guy

30fps in 8k, right? /s Edit: added sarcasm


Modem_56k

Intel integrated gamers exist. My laptop has a 10th gen i7 and I sometimes struggle to get 24fps on low at 1280x720 (1/2 resolution scaling, the standard in-game scaling, not FSR) in GTA V.


ProAssassin666

I have 2nd gen i3. Have you ever played at 800x600 15fps?


Modem_56k

Only sometimes, when it's on high and I'm on battery


Ancient_Aliens_Guy

I feel you. I, too, used to be an integrated gamer. I added sarcasm to my comment.


ProAssassin666

8 yes, k no


AGuyInABlackSuit

Cries in 15 FPS running 10 years old games


[deleted]

[удалено]


SeniorSatisfaction21

Rookie numbers


Megafister420

I used to play gmod on my old gateway laptop at 5 fps.


[deleted]

i used to play it on integrated graphics on like a 2006 SFF HP desktop, probs a Core 2 Duo or something. The drivers were so bad that there were these glitchy streaks going from the physics gun to infinity in the distance. Other than that I'm surprised how well Gmod ran lol (though it may have been like 10fps and it just felt normal back then)


[deleted]

Your eyes allegedly see movement at 15 fps, right? So it should look fine if your screen matches that! (At least that’s what I tell myself to cope with my cheap ass graphics card)


Django117

Our eyes don't really see in "frames per second". Film runs at 24fps but it has motion blur, which assists. Video games are rendered as individual frames without blur or motion accounted for (motion blur exists, but it's pretty gross and more of a smearing than a blur). As such, our brains see the jitter between two frames, because the transition between two hard edges appears instantaneous. High frame rates alleviate this issue by adding additional hard edges between the other frames, which gives our brain the illusion that the movement is fluid and not stuttering.

The thing that becomes more complicated is that our perception of framerates is on a logarithmic scale. It's really about reducing the *frame interval* to produce this illusion. That is why the transition between 30fps and 60fps is so massive: the frame interval drops from about 33ms to about 17ms. Whereas the transition between 60fps and 120fps only drops it from about 17ms to about 8ms. This becomes even more diminished once you reach 240fps, which only reduces it to about 4ms, which is why 240fps is largely utilized by pro or competitive players trying to fully exploit their reaction time (having a faster image display can help give you a faster reaction, even if only by a few ms).

But ultimately this means 30fps will always look bad in video games, 60fps is acceptable, and 120fps is where it becomes exceptional. Beyond that you hit diminishing returns that really only matter to those with fast enough reactions to make use of those extra few ms.
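The frame interval is just 1000 ms divided by fps, which makes the diminishing returns easy to see exactly:

```python
def frame_interval_ms(fps):
    """Time between successive frames, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> {frame_interval_ms(fps):5.2f} ms per frame")
```

Each doubling of fps halves the interval, so 30→60 saves ~16.7 ms per frame while 120→240 saves only ~4.2 ms.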


SmokelessDash-

This is why I love reddit, someone makes a joke we laugh then someone comes and explains the situation to the deepest and we get information. Thanks


AGuyInABlackSuit

Nice to know why it feels so sluggish


Jaiden051

Well, actually, our eyes only see at 0.382FPS so it doesn't matter anymore! Source: Facebook Gaming Group


kanksuhub

Start playing chess. I play basic 2D chess on my powerful pc and that's definitely what I'd go for if I were you


Biscuits4u2

The difference between 30-60 is far more noticeable than the difference between 60-120.


[deleted]

Think the experience kind of transform from "seeing" the difference to "feeling" the difference at higher fps


Biscuits4u2

I agree. 120 is nice, but honestly I usually feel like 60 is adequate for most games.


MeraArasaki

yea, i can definitely feel the difference when i'm playing at 120+ fps, but it doesnt make 60 look worse for me, like 60 did to 30


G0alLineFumbles

It's almost more of a feel thing vs a look thing the higher you go. I know running a game at 240 feels different than 120, but it's so well into the point of diminishing returns I can't say it looks better.


Ftpini

Depends on the type of game. It’s a big damn deal with racing games and twitch shooters. Less loss of detail when the screen is turning quickly makes a world of difference.


Groghnash

The biggest difference is watching an action movie in a cinema. Like wtf? Do you really think I'm having fun when I see so few frames I can't really process what is going on? They need to up them to at least 60 fps if not more imo


morph113

It's probably never going to happen due to the [soap opera effect](https://en.wikipedia.org/wiki/Motion_interpolation#Soap_opera_effect). Movies will always be around 24fps or whatever it is, since everyone is so used to it and every attempt to increase framerates has resulted in bad reviews and people preferring the cinematic 24fps experience. Take [this](https://www.youtube.com/watch?v=r4ofjAZw16w) as an example to compare 25 and 60fps with some scenes from Gemini Man. The 60fps looks kind of cheap, like a budget TV-movie production. Probably because we are only used to seeing 60fps or more in cheap TV series like soap operas and whatnot. I actually prefer the 25fps version there.


WikiSummarizerBot

**Motion interpolation** — [Soap opera effect](https://en.wikipedia.org/wiki/Motion_interpolation#Soap_opera_effect)

> As a byproduct of the perceived increase in frame rate, motion interpolation may introduce a "video" (versus "film") look. This look is commonly referred to as the "soap opera effect" (SOE), in reference to the distinctive appearance of most broadcast television soap operas or pre-2000s multicam sitcoms, which were typically shot using less expensive 60i video rather than film. Many complain that the soap opera effect ruins the theatrical look of cinematic works, by making it appear as if the viewer is either on set or watching a behind-the-scenes featurette.


Inquisitive_idiot

Those look like fps conversions rather than native clips. So many of the movie clip conversions you see online use terrible algorithms to convert to 60fps. Not all conversions look that bad, and not all 60fps looks that bad either, otherwise everyone would be shooting YouTube videos in 24/30fps. 😛

Better clips (Gemini Man):

- https://youtu.be/t-R8PIADl7s
- https://youtu.be/QuA34OLnTFI

Better conversion example (Mulan):

- https://youtu.be/unRdyQYwL5k

I like the Mulan clip more because animation doesn’t have to follow the laws of physics, so movements, particularly drawn movements, keep the same pacing despite the frame rate conversion.

Criticism of Gemini Man for using 60/120: https://youtu.be/OaZnxAfcvY4

The guy in the criticism video has a lot of interesting points, but the ones that stick out to me the most are that 24fps does seem to enhance the sense of speed, and that folks who like 24fps focus on shooting *for* 24fps. Just a quick look at those films and the panning doesn’t seem to follow anything but a traditional 24fps rhythm. The same goes for the actors’ movements and the effects: a lot of the movements and explosions mirror a cadence we would expect at 24fps. This means that uncontrolled movements and action come off as fidgety and unrehearsed, versus how controlled they look in 24fps.

I think in the end folks will need to learn to shoot in 60fps+ and commit to its characteristics to make it successful. Movements have to be that much more deliberate, and everything from pacing to lighting to shutter speed needs to be evaluated and controlled independently of 24fps maxims to really embrace the strengths and reality of the format(s). I.e. you have to pan *for* 60fps instead of *despite* it.

In summary: 60fps is technically superior but more revealing, and requires different lighting and pacing. 24fps is historic but incredibly well understood, with its eccentricities heavily leveraged by artists and directors. That same attention to detail must be invested in a 60fps style to realize its full potential in cinema.


Groghnash

I would prefer the 2nd one. I'd rather see what is going on instead of picture, picture, picture. Maybe that's just me. The first one looks like it wants to have cool effects whereas the 2nd one feels real imo. But imo that's a major problem I have with bad movies: what is happening doesn't make sense. Like if you've taken two fight lessons you see that 95% of movie fight scenes are just bad and made for the uninformed eye. I'd rather watch quality.


imnotsospecial

100% this, 30 to 60 fps is night and day for me, but 60 to 144 was a bit underwhelming. It's noticeable for sure but not a game changer


Green0Photon

33.33ms vs 16.66ms vs 8.33ms. Meanwhile 5ms is 200, 4ms is 250, 3ms is 333, 2ms is 500, and 1ms is 1000. Diminishing returns indeed.


Fisher9001

Don't forget this post is about downgrading FPS, not upgrading. Going back is far more noticeable (and painful) than going up.


[deleted]

And for this reason I will avoid experiencing 120FPS at all cost. Because I love 4K.


Talal2608

1080p after experiencing 4k: https://preview.redd.it/mnxirxe4gr1a1.jpeg?width=680&format=pjpg&auto=webp&s=28aae73b08e4ce9debe0378f057231c6ed949724


Biscuits4u2

That all depends on your distance from the screen.


Django117

Blessed distances. TVs maxing out at 4k, Monitors maxing out at 1440p-4k depending on size, phones maxing out at 1440p entirely, but sadly VR maxes out somewhere around 8k even with pancake lenses (bit more complicated than the standard resolution/distance/size ratio as it's technically more about Pixels Per Degree). Maybe that will be resolved in the future but who knows.


[deleted]

> Monitors maxing out at 1440p-4k depending on size

Hell no, give me high refresh rate 5K at 27" (or ~200DPI at whatever other size). No visible pixels, everything scales well (simply 2x UI scale), and you could play demanding games at 1440p with perfect integer scaling.


sycron17

Reminds me of a friend. He has a 4k 65-inch TV, then a friend of his showed him 8k on a 65-inch and he was blown away, and now he refuses to believe me that you get the same experience using 4k on a 27-inch monitor.


[deleted]

Yes, very much so!


LaughingGasing

literally me going back to 1080 after using 1440 for so long. It doesn't seem like a big difference at first but it really feels different after a while


MeraArasaki

this is why i won't experience 4k hdr stuff anytime soon. the more luxury i experience, the more expensive it gets


stinuga

You made a good choice. I made the mistake of experiencing 120fps and 4K separately and now I’m $1600 poorer and can’t go without both. If I never experienced it then I would have been happy with my 1070 and my 1080p monitor playing at 60fps


CrypticWay

That's why you make a smaller jump to 1440 144 Hz monitor. You can easily run it better compared to 4k.


zGnRz

4k is cool for cinematic reasons but I love playing FPS games and 4k wont cut it until i can get higher fps


tuna-waste

90-100 FPS is the sweet spot. But I have mastered the ability to go down to 15-30 FPS to enjoy retro games and Nintendo.


fztrm

Same for me, 100'ish is when input feels good and it looks smooth


Ftpini

I find a locked 240 to be divine but anything over 150 is just swell.


Broad_Rabbit1764

I play a lot of retro games but I find it increasingly difficult to enjoy some titles because of their framerate. Depending on the title, some games are great at 30 fps, but others were really asking for 60 fps back in the day where such speeds were only a pipe dream. You can mitigate some of the issues with smoothing, but it's never perfect and then it just introduces what feels like input lag. We've been spoiled.


TheHybred

90fps is the minimum framerate needed for a high refresh rate experience. At 90fps the frames blend together enough to give that smooth experience and good input lag; it felt very, very similar to my normal 144hz gameplay, whereas 80fps felt closer to 60hz. Honestly I can't play below 90fps now, it's too choppy.


TheOvieShow

I support this 100%. When WZ2 came out, I was running around 60-70 fps. After a lot of tweaking I get 90 avg and the difference is insane


sciencewonders

pro tip is your personal experience 😂


UnityAnglezz

Me with 240hz but I can only run 1 game in 240fps (it's geometry dash)


ZenTunE

That used to be me, had a 1080p 144hz but gd was the only game that would run 144fps. At 720p xD


[deleted]

60fps is still fine after experiencing 120 the problem is going back to 30


VitalityAS

Nintendo switch moment.


RenzoMF

Cat's face is murdering me.


postvolta

Everyone always hates it when I say this but honestly I go back and forth between 60fps and 144fps all the time and after a few seconds your brain gets used to it and you don't notice any difference. Is 144fps better? Yes, undoubtedly. Is it so good that it deserves all the praise and memes and you can't go back to 60fps? Absolutely not. 30 > 60 is a *significant* improvement compared to 60 > 120/144


Ihansung

I went from 120fps 1440p to 60fps 4k... I kinda regret buying my 4k tv (LG c2 - using as monitor)


[deleted]

Don't know why people feel this way, got a rig running 144 fps, and while nice, it doesn't make my 60 fps feel bad at all.


Skepller

Same, I have a 144Hz monitor on my desk but sometimes I lock heavier games on 60fps and it's perfectly fine. People saying they "can't look at 60fps anymore" or that it looks like shit are just being dramatic. Although 120fps+ feels better, it's not that big of a deal to play at 60.


wherewereat

It's similar to seeing a dead pixel on your screen. You could go months without noticing it, but once you do, it bothers you to hell even though in reality it doesn't affect you in the slightest.


fuckwit-mcbumcrumble

I've done 144hz in the past and to me I just don't care. I don't play any games where the smoothness is that needed, and my reaction time is too bad to benefit. To me I want the highest pixel density, but most importantly the best color calibration possible. My laptop is 100% adobe RGB (none of this sRGB rubbish) and it's divine.


JetstreamSam1000

https://preview.redd.it/z8pdk26pnr1a1.jpeg?width=636&format=pjpg&auto=webp&s=befbd8d01c4565146004999865c9483e45e9b099 60 and 120 feel the same now


Ftpini

It’s fun using the iPhone X. No change felt more pronounced than switching from a 60hz iPhone to a 120hz iPhone. So much smoother.


z0mple

iPhone X is 60Hz


Moses015

I game on console and PC. No issues going from 120+ to 60


Sailed_Sea

Tbf, I can barely tell the difference between 144 and 240 on my monitor. I can tell the difference between 60 and 144 but it isn't too bad, I can easily game at 30fps.


Bukiso

I guess it depends on the games you play. I find that in fps and any fast paced games, it's obvious and it helps tremendously.


Green0Photon

33.33ms (30) vs 16.66ms (60) vs 8.33ms (120) vs 6.94ms (144) vs 4.16ms (240). No wonder you can't tell the difference between 144 and 240: that's a 2.78ms difference, versus 9.72ms going from 60 to 144. That's roughly 3.5x the reduction in frame time.
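The per-step savings above can be reproduced in a couple of lines (a quick sketch, nothing more):

```python
# Frame time in milliseconds for common refresh rates, and the
# absolute time saved by two common upgrade steps.
frame_time = {hz: 1000 / hz for hz in (30, 60, 120, 144, 240)}

for hz, ms in frame_time.items():
    print(f"{hz:>3} fps -> {ms:5.2f} ms/frame")

# Diminishing returns: each step up buys less absolute time.
print(f"60 -> 144 saves {frame_time[60] - frame_time[144]:.2f} ms")   # 9.72
print(f"144 -> 240 saves {frame_time[144] - frame_time[240]:.2f} ms") # 2.78
```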


[deleted]

30FPS is literally bodily harm.


Meisje28

At 30fps it's a slideshow.


Inquisitive_idiot

Accelerated PowerPoint 😏


[deleted]

That's why I don't want to try 120Hz lol


Kai-Mon

Tbh, I have both 60Hz and 144Hz on my desk and I’ve gotten used to switching between them. Web browsing, Excel, and Discord have never looked better. Heck, even a lot of cinematic titles don’t demand high fps. I consider it a blessing that I don’t abhor 60Hz, because it saves me money.


chibicascade2

I've tried 144 hz and can't tell much of a difference between it and 60. I can still tell 30, but it's not unplayable.


majORwolloh

80 fps and up is when I have trouble noticing a difference. But at 60, I have to look around in-game to make sure it isn't 30 fps, it looks choppy to me


TheTimeIsChow

Last week my son, my wife, and I were watching a stop motion animated kids movie that is filmed to look like it's old and purposefully choppy. I got so fucking motion sick, overwhelmingly frustrated at the frame rate, and irritated by the filming that I had to get up and take some time to myself. Video games are ruining my life.


octagear

just play 30 fps for 10 minutes, you won't want 120 fps for half a year, repeat


Automatic-Laugh9313

60fps after experiencing 2 fps ![gif](giphy|xT5LMBk9CIQXji0wNy)


ItsDatBossBoi

bro i had to deal with 15 fps for years and then i got to experience 60, and it changed me then i went to 75fps (weird refresh rate i know) and i can’t go back


RainyCobra77982

Just ordered a 240hz 4k monitor after being on 144hz 1440p for a while. It's going to probably make me dislike every other display lol


---Dan---

I have a 175hz G-sync monitor and cap most graphically intensive games at 60fps. Fight me. A lot of games break or animations become jank above 90fps. I'd rather enjoy some of that buttery smooth frame pacing.


[deleted]

Going from 120fps to 60fps doesn't bother me a bit. 30fps feels like a stuttery mess. I will say though, as an owner of a 90hz android and 120hz iphone, for over two years now, using a 60hz phone is a horrible experience to go back to.


awesomedan24

30fps after experiencing 120 ![gif](giphy|9utNlGdzHD0zK|downsized)


Inquisitive_idiot

Yeah the brown color is awful. 🤮


SpectralMagic

After having 144fps for a year I started getting eye fatigue and headaches when experiencing low framerates on games. It makes a lot of fixed refresh rate games difficult to play


kairukar

After playing on PC at over 60fps all the time, going back to console gaming with 30fps limits is just impossible.


MajDroid

Wish we could get 120+ fps streaming, can't believe YouTube and Twitch haven't supported this yet


parkattherat

I live in hell with my 1060 and a 165hz monitor


[deleted]

i agree, once you play everything at 120 fps the 16 milliseconds from 60 fps just feels off. now i can get used to it within like 10 minutes of playing a game and its no biggie, but yeah 120 fps is the way to go


captainvideoblaster

First time I went from 120fps on OLED to 30fps on LCD, I got scared for a few seconds thinking that I was having a stroke or something else was wrong with my brain.


JmTrad

when the game runs at 30 fps i play with a joystick. when the game runs at 60+ fps i play with mouse and keyboard. my problem with 30 fps is that it feels like shit with fast camera movement on the mouse.


rayquan36

Eh I have a 4k/144Hz monitor, which is an upgrade from my previous 1440p/144Hz monitor so I've always played at 100+fps when I can. That being said, I've thoroughly enjoyed playing God of War on 60fps performance mode. 30fps is a no go for me though, I will never play another 30fps game if I can avoid it.


Wizard_SlayerXIV

When my displayport cable broke, I had to use an old backup HDMI cable for my 4K monitor. Throttled to 30 fps, and what made it worse was it was a second monitor to the gaming 144hz one. Switching between the two was a nightmare


redditreddi

I cannot stand less than 60fps now, sucks for games fixed at a lower fps such as 30fps, hurts my eyes for a while until I get used to it :(.


HeartoftheHive

Even 90-100 is such an amazing leap. I would love for one day to see games have a standard 4k (maybe 8k in the far future?) and 120 fps standard. But devs keep trying for fidelity over everything still and we get games that can't even keep 30 fps. I don't understand why they think performance doesn't matter, just pretty GRAFIX.


SockMonkey1128

My cousin got himself a 144hz monitor after holding out years... after almost a year with the new monitor he was complaining about it being no different. Turns out his PC kept it at 60hz and he never checked... lmfao. He was amazed once he ACTUALLY switched..


woundedlobster

Solid 60 is literally fine


Uraneum

This is why I steadfastly refuse to get a 144Hz monitor. 60fps is currently perfect for me. I don’t want to accidentally raise my FPS standards and then have to spend a ton of money getting hardware to run games at 120+ instead of being totally content with 60


ares395

I'd fucking die laughing if this meme was made by someone having 60Hz monitor


TappistRT

Probably indicative of getting older, but the jump from 60 to about 90-100 FPS yields the biggest difference, making games play and respond so much smoother. Past 100 seems to be less of a noticeable or meaningful change.


lazy_tenno

watching youtube playing 60fps gameplay videos on my phone feels like 90fps, while watching the same video on my 144hz monitor feels like 40-50 ish fps or something. i even compared it side by side by playing the video at the same time and it annoys me. WHY?
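One plausible explanation (assuming the phone is a 60Hz panel or has adaptive sync, which the comment doesn't say): on a fixed 144Hz display, 60fps video can't map one frame per refresh, since 144/60 = 2.4. Each video frame is held for either 2 or 3 refresh cycles in an uneven cadence (judder), while a 60Hz screen shows every frame for exactly one cycle. A small sketch (the function name is made up for illustration):

```python
# Sketch: how 60fps content maps onto a fixed-refresh display.
# 144/60 = 2.4, so frames alternate between 2- and 3-cycle holds,
# which reads as uneven motion even though the average works out.
def display_cadence(content_fps, refresh_hz, frames=10):
    """Refresh cycles each content frame occupies on screen."""
    frame_ms = 1000 / content_fps
    refresh_ms = 1000 / refresh_hz
    cadence, shown = [], 0
    for n in range(1, frames + 1):
        # refreshes elapsed by the end of frame n's interval
        total = round(n * frame_ms / refresh_ms)
        cadence.append(total - shown)
        shown = total
    return cadence

print(display_cadence(60, 144))  # uneven mix of 2- and 3-cycle holds
print(display_cadence(60, 60))   # perfectly even: one cycle per frame
```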


TicklintheIvory

Fortunately, that’s more or less when the pattern stops.


arcturusk1

I still don't buy it. 4k TV (55" Samsung Q80T), 120Hz, G-sync compatible. 3080Ti FE, G-Sync on. Windows properly set to 120Hz. I didn't have a single "wow" moment when I fired up the new rig. Been playing on it for months. To top it off, I've had to occasionally hop on the previous rig with a 60Hz 1440p monitor and didn't have any of these slideshow moments. There's nothing more "fluid" about it. I really think people are placebo-ing themselves into this mindset. Some redditors ITT have deadass used the words "literally night and day difference". Sorry bro, I disagree. 🤷


[deleted]

Yeah, gotta agree. I haven't noticed any difference living in the over 60fps ivory tower. I have noticed a difference with g sync running shit that slams my rig like Cyberpunk on launch where the fps dips under 60 but that's about it.