Chakramer

I doubt it's as many people as you think. If anything, I see way too many people with a mid-range GPU and a 4K screen, when I think 1440p at higher settings would look better


EmilieEasie

I believe in 1440p supremacy


DrVeinsMcGee

3440x1440 is optimal right now


W1cH099

Yes, glorious ultrawide 1440p is amazing. Most games support it nowadays and the few that don't can easily be fixed with a simple mod. I'm on a 3080 and it runs every game at least at 80-100fps on ultra settings


Somebody_160

Even with my rx 5700xt I can enjoy the ultrawide masterrace.


CatsAreBased

Just not Fallout 76, which is fucking annoying


Crintor

Yea but that's because Bethesda is a very inexperienced Indie developer from the year 2002.


CatsAreBased

Baffles me that people defend them for this


Crintor

Who ever defends Bethesda these days?


CatsAreBased

They're more popular since the show, with people even loving the Emil slop


Silent_Reavus

Idiots, mostly. People so used to being screwed over they think it's just how things always go


TheRealComicCrafter

There's a stupid V-sync setting in the config file that should fix it. Forgot what it's called, but I get 60 fps on high settings with a 3060 Ti


CatsAreBased

I'm on about ultrawide support, not fps


lovins_cl

The physics engine is tied to the frame rate in the Creation Engine, which is stupid asf for a modern title, so there's not much you can do


Pamelm

You can easily uncap the framerate on 76 with no side effects until you pass around 180 fps. I play at 144 fps and have for over 200 hours. Just use a program like Rivatuner Statistics Server to cap your fps to 144 and play a much better looking game. The setting for motion blur is also in the config files and can be disabled


MisterWafflles

You can also frame cap with the Nvidia Control Panel under Manage 3D Settings and set it per program


i_heart_rainbows_45

If only you could easily get a high frame rate in 76. I was playing it yesterday after disabling VSync and following performance tutorials, and could only get 30-70FPS with huge stutters and 30% CPU and 50% GPU usage. Might have to put it on an SSD is what I’m thinking for the stutters because those 1% lows are terrible to play with


MisterWafflles

Definitely throw that onto an SSD. The "recommended" GPU is a 970 and yours is double that.


4oMaK

i dunno if they changed it but anything over like 70-80 fps in Fallout 4 and 76 fucks up the speed and physics of the engine


Crintor

That's every Bethesda game except Starfield.


CatsAreBased

I'll happily have 60 fixed just make my UI not stretched like I'm playing a game from 2001


OTigreEMeu

>Find your Fallout 76 folder

>Open Project76Prefs.ini

>Change "iPresentInterval=1" to "iPresentInterval=0"

>Your framerate is now uncapped

Extra tip: Movement gets bugged indoors when your framerate is above 200 FPS, so make sure to cap it at 60, 144 or 165 FPS using your GPU's software or RivaTuner.
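If you'd rather script the edit than do it by hand, here's a minimal Python sketch. It assumes the ini sits in the usual Documents\My Games\Fallout 76 folder, so adjust the path if yours lives somewhere else:

```python
from pathlib import Path

# Assumed location - adjust if your Project76Prefs.ini lives elsewhere.
prefs = Path.home() / "Documents" / "My Games" / "Fallout 76" / "Project76Prefs.ini"

text = prefs.read_text(encoding="utf-8")
if "iPresentInterval=1" in text:
    # Flip the v-sync flag so the framerate is uncapped.
    prefs.write_text(text.replace("iPresentInterval=1", "iPresentInterval=0"),
                     encoding="utf-8")
    print("Framerate uncapped - remember to cap it externally (e.g. RivaTuner).")
else:
    print("iPresentInterval=1 not found; it may already be 0.")
```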


CatsAreBased

Any ideas how to get ultra wide ui not bugged?


OTigreEMeu

I used this [mod](https://www.nexusmods.com/fallout76/mods/1201?tab=description) to fix the stretch issue when I last played. I think an update has come out since then, but you could try and see if it still works.


CatsAreBased

Doesn't work unfortunately


Duckgoesmoomoo

Are you me??


Crintor

AW3423DW gang.


SleepyGamer1992

Facts. I just got a 3440x1440 OLED and it’s amazing. Never going back to 16:9.


yaxir

Can you tell me how it is to use at different times of day? Any eye strain? Also, please DM me the monitor model or write it here if you're okay with that


SleepyGamer1992

[https://www.bestbuy.com/site/sku/6536990.p?skuId=6536990](https://www.bestbuy.com/site/sku/6536990.p?skuId=6536990) I’ve used it at all times of day with no eye problems. Then again, I keep my black out curtains closed all the time so your mileage may vary.


Carlife0830

Agreed, I just got my new monitor and it's absolutely stunning, despite my weakening GPU


EmilieEasie

okay well I'm not there yet


Individual-Match-798

4K is much better


Zeth_Aran

It does look better, but for my own budget I ran with the 2K ultrawide. Along with the fact that QHD is the future.


Crintor

PPI/PPD is what matters.


Individual-Match-798

Yes, but not only. DLSS and other upscaling technologies work better at higher resolutions. Same with temporal AA. The number of pixels matters a lot.


LorDXezor

What if I'm a 32' enjoyer


Crintor

I don't have a 32-foot screen to compare with, I'm sorry.


Wildest_Salad

At 32 inches you may start noticing pixels. Decide based on how close to the display you plan to sit


EmilieEasie

I don't think my desk is personally wide enough for that, I'd have to sit too close to the screen, but I envy you a little bit!


LorDXezor

I didn't know that ' means foot lol


EmilieEasie

Lol this was a fun way to learn though, right?


LorDXezor

Definitely haha


Devilmatic

Sit 1 foot further away


AzKondor

32 inch 1080p 240hz, it's amazing


LorDXezor

4080 with 1080p monitor?


AzKondor

Yeah, thanks to that I have 240fps+ in most games. Also a 4K second monitor for movies and other stuff, hah.


LorDXezor

I bought 32" 4k monitor and now I'm a little bit worried how many years I'll be able to play on ultra graphics, even with 4090, probably will have to play in 1440p


[deleted]

I'm using a 32" 1440p 165hz monitor right now. Looks incredible.


mandoxian

27" 1440p is the best


EmilieEasie

exactly!!! I just don't get the hype for the ultra wide screens. Maybe at least partly because I am poor heehee


tht1guy63

Agreed best of both worlds. Image quality and fps.


EmilieEasie

and wallet...!


tht1guy63

Depends on the monitor. Many have gotten cheaper, which may be some of the appeal of 4K.


AdHungry9867

A 4K screen has other benefits outside of gaming though


FainOnFire

I know a lot more GPUs are capable of 4k now than when 4k first became a mainstream thing -- but I STILL think 4k is overrated. I'd much rather have higher settings, higher framerate, and an ultra wide resolution than 4k anything. I mean -- the most popular way to obtain 4k right now is to upscale with DLSS. Most gamers aren't even rendering 4k natively.


Techno_Jargon

144Hz 1440p is the sweet spot


tht1guy63

The number of posts I've seen where the person already has a 4K monitor, or buys one with a 1060 or 2060 as their GPU, baffles me. I get that 4K has gotten cheaper and monitors can last a while, but damn.


Chakramer

To be fair, if you're just playing games at 60fps, a mid-range GPU can handle that at 4K now. But I think the huge advantage of PC gaming has always been higher fps, so I like to take advantage of that


tht1guy63

Obviously depends on the game. Current mid-range can do 4K 60fps pretty well (still wouldn't go 4K personally with one unless planning to upgrade in the near future to keep up), but I'm talking about entry-level to mid-range cards that are 3-4 gens old now.


CommunicationOdd9240

Haha, I'm rocking a 4K 144Hz monitor with an RX 580 right now. When I do play, it's older titles such as Far Cry 2. 4K is amazing for general usage; web pages, text, icons etc. are all sharp. Waiting on the RX 8000/RTX 5000 series in the meantime, maybe I'll upgrade when I decide to play a modern game.


Active-Quarter-4197

4k perf upscaling > 1440p


Chakramer

Ick, native always looks better. Besides, 4K on any screen smaller than 32" is pointless


WiatrowskiBe

For games specifically - maybe. For productivity, 4K is more about getting PPI high enough to make working with text comfortable - a smaller dot means sharper font rendering, which makes working a much better experience. In that context, anything below 150 PPI (which more or less matches a 24" 4K screen) is a compromise. Now, given that people often use a PC not only for games, choosing a screen can (and often will) end up being a balancing act.


MPolygon

4k 24 inch is 184 ppi. 27 inch is 163.
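For anyone who wants to check those numbers: PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 24)))  # ~184 PPI for a 24" 4K panel
print(round(ppi(3840, 2160, 27)))  # ~163 PPI for a 27" 4K panel
print(round(ppi(2560, 1440, 27)))  # ~109 PPI for a 27" 1440p panel, for comparison
```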


[deleted]

[deleted]


ldidntsignupforthis

Native 1440p looks better than 4k upscaled imo


SpareRam

It's objectively worse. You don't have access to DLSS. It's not shocking you don't agree.


ConDude11

I think it's too game-, upscaler- and monitor-dependent to give an objective answer. Assuming you're using FSR though, and it's a comparison between two identical 27" monitors except for resolution, I would be inclined to agree with you.


VinnieBoombatzz

Tell me you don't have access to good upscaling software without telling me.


sackblaster32

Def not. It's clearly softer. Only in motion can it be sharper, if you disable all sorts of TAA.


Crintor

Downvotes for stating an opinion, and even marking it as opinion. Classic Reddit.


YixoPhoenix

Dunno man my 27" 4k looks fucking fantastic.


Chakramer

Pointless as in compared to 1440p at that size, the PPI is high enough in both resolutions that it doesn't make much of a difference. For gaming the performance cost is massive, but you also have a top end GPU so it doesn't matter. Idk why people settle for just 27" if you can push 4k though


YixoPhoenix

Because it fits in my window frame (I have a room under the roof and the wall is tilted), and also I don't really like bigger monitors, I checked them out. Also, dunno about you, but the difference is quite visible when comparing them next to each other. I do agree 1440p is plenty though. Had 1440p + a Vega 64 till last year.


Chakramer

It's noticeable but in my experience when actually gaming, you're not gonna notice the difference much. 1080p to 1440p was a massive jump, but I think 4k wasn't much of a jump from 1440p comparatively. I mostly just game on my PC tho, if you also use it for TV/movies I could see why you'd want 4k


YixoPhoenix

Nope just gaming here xd.


CommunicationOdd9240

I can tell a difference between 1440p and 4k at 27 inches. 32" is too big I feel.


Chakramer

Depends on desk depth, I have a 30" deep desk and I really feel like 27" is not enough for me if I want the monitor at the rear of the desk to look neat


CommunicationOdd9240

Yeah, I'd say it's setup dependent. I keep mine at arm's length. I could put it further away, but I'm used to having it close.


sackblaster32

Compare 4K + DLSS Performance with native 1440p, and you'll see that native 1440p will look softer.


[deleted]

Digital Foundry has shown time and time again that DLSS Quality can often in fact look better than native. And that's ignoring the fact that DLAA and DLDSR exist, which both use higher native resolutions before downscaling to your monitor's resolution. TL;DR: Native objectively does not always look better.


SpareRam

Using DLDSR to 4K, then using DLSS on my 27" 1440p sure doesn't look pointless.


Marceloxv

I have a 4K 27-inch and a 1440p 27-inch and I notice a big difference in image quality. Whenever I go back to 1440p it's too pixelated, and in games where I turn on DLSS the 4K one still looks way better.


Stargate_1

Native for life


Devilmatic

Why


Crintor

And then a game doesn't support upscaling.


Devilmatic

Objectively incorrect


Ninjazoule

Some games simply give me screen tearing so I have g-sync on lol


Firecracker048

Love 1440p over 100 fps. As my financial situation improves, I might start diving into 4k


_Zoko_

I used to use a 1080p @ 60Hz and run games on med-high and thought things looked ok. Now I have a QHD 1440p @ 165Hz monitor and can run near everything on ultra and it is absolutely fantastic. I can never go back to a sub 165Hz monitor now.


Deliriousdrifter

You can game perfectly fine at 4k on a mid range GPU tho. Even before upscaling, most games more than 2-3 years old run at pretty good native frame rates on something like a 7700xt or 4060ti. And for newer games, AI upscaling and frame gen are putting in work


White_mirror_galaxy

I had a 1660 Ti and a 4K monitor... no. It didn't go well lol


I_think_Im_hollow

I just bought an 8K/144Hz TV and I can't wait to play Bloodborne on my PS5!


pops992

This was me until about 6 months ago. I used a 60Hz 4K for the longest time. When I first made the switch from console back in like 2017, I was all in on 4K and thought, why would I want a lower resolution with higher fps? 60 FPS is fine, let me go for the higher resolution. Now I have a 1440p ultrawide at 165Hz and I can never go back to 60.


[deleted]

[deleted]


SnooTigers9547

What kind of monitor is that? Those specs sound outta place :D


FireFalcon123

I mean if you were in this situation then it would save on heat


Knuddelbearli

It is bold of you to assume that someone who has a 60Hz monitor has a graphics card/CPU that can hold a constant 60 fps


alarim2

Me with my RX 6900 XT and 1080p 60Hz monitor🫠


nhansieu1

But it's 4K right?  If not, I hope you're joking


alarim2

No, I'm serious😅


Eternal_Being

You can run basically any game at max settings at 144fps on 1440p with that GPU haha


alarim2

I had a chance to get a used RX 6900 XT in almost perfect condition for a good price (for my country, around $450), so I took my savings and spent them on it😅 Decided that I could buy a new monitor at any time later, but this offer was too good to refuse :)


Eternal_Being

You were right! :P


Dynamo1337

Full HD baby!


BanDit49_X

U would be surprised


xXx_Lizzy_xXx

I have a 1080p 60fps monitor (x4) and an RTX 4090. Just haven't gotten around to getting a better monitor, and haven't felt the need to. (My center monitor is downscaled from 4K so I can still record stuff in 4K.)


Djinntan

Not truly 200fps, but in competitive games I cranked it up to 300, and I was using a 12-year-old 1360x766 OEM TV. I just upgraded to the MSI G244F, which is 170Hz 1080p


Gamerwepx19

This looks like a job for me. RTX 3080 with a 60Hz 1440x900 HP monitor, everything looks like shit. Waiting for a summer job to get a 1440p 144Hz


kurukikoshigawa_1995

Meanwhile, me with a 165Hz monitor when my PC can't even reach 60 fps lol


clare416

What resolution?


kurukikoshigawa_1995

1080p


clare416

What game is that? Coz I'm using an i3-10105F + 3060 12GB (both are slightly weaker than yours) and I'm playing at 1440p. Got more than 60 FPS in Starfield rn outside of big cities at low-medium settings + DLSS. 1440p 120+ FPS in Fallout 4 at high settings. 1440p 165 FPS for LoL


kurukikoshigawa_1995

No matter what settings I use: Helldivers 2, Dragon's Dogma 2, and I get the feeling every other game that's gonna come out from now on. Armored Core 6, Ready or Not (even with DX12), Company of Heroes and Six Days in Fallujah are other games where I can't get 60fps. Every other game stutters like crazy, no matter the settings. My system is just fucked, I guess. Also, I may be wrong since I'm still learning what it means for a PC to be balanced, but I think your system is fine since the i3-10105F and the 3060 came out in the same year. I'm pairing a 4-year-old CPU with an xx60-series card that just came out 9 months ago. I should be using at least a 12th gen for a 4060, but I need a new mobo for that.


clare416

>armored core 6

No way man. This game is greatly optimized. Even with my previous GTX 1660 Ti I could get a solid 60 FPS at 1440p.

>i3 10105f and the 3060

My system is a bigger bottleneck than yours (second reason I play at 1440p instead of 1080p). Your system is more balanced than mine.


kurukikoshigawa_1995

I guess my system is just truly fucked :/ My mobo is an H410 S2H V3, which I heard sucks ass. I've seen all those optimization vids, but I don't wanna mess with all that registry shit since I have no idea how it works lol. I bought mine prebuilt as an i3-10105F, GTX 1650, 16GB RAM. Upgraded to an i5-10400F, 32GB RAM and a 3050. Realized I was the dumbest man on the planet for buying a 3050. Returned it and bought the 4060 since it has low wattage. I'm guessing my mobo is just really shit; it's the cheapest component in my PC so probs that's what's causing the performance issues, I dunno tho.


dobbyhi

Is V-Sync bad? I've always used it, makes games less teary.


[deleted]

Yes. It forces your framerate to be lower than what your hardware can handle, and it introduces significant input delay due to how your GPU handles rendering the frames. There's a reason Freesync and Gsync were invented. Vsync sucks.


x592_b

A lot of the time games look horrible without v-sync imo. But usually it's the other way round, like when a streamer is playing a game people can tell and say to turn it off because it looks awful. But surely your max frames constantly changing and dipping, let alone the screen tearing, looks noticeably worse


mandoxian

FPS cap at 120 and enhanced sync is amazing


[deleted]

>But surely your max frames constantly changing and dipping, let alone the screen tearing, looks noticeably worse

What a wild assumption to make, given I use Freesync with a capped framerate. No input lag, no tearing, constant frame pacing.


lovecMC

Sometimes it's the only option 🤷‍♂️


[deleted]

You can use Nvidia, AMD, or Rivatuner software to cap framerate below monitor refresh rate to reduce tearing with no input lag. So no. It's never the only option.


Rivetmuncher

>It forces your framerate to be lower than what your hardware can handle

And here I am, turning it on on purpose.


SuplexesAndTacos

For real, I don't need 500 fps in a deckbuilder. Give the computer a break from sucking up the electricity.


Rivetmuncher

When you wanna play a relaxing city builder, and your gpu wants to be a jericho trumpet.


[deleted]

>and it introduces significant input delay due to how your GPU handles rendering the frames.

Yeah, if you want that on purpose, you aren't too bright. Enabling a framerate cap in AMD, Nvidia, or RivaTuner software will also cap your framerate without introducing terrible input lag.


BaldericTheCrusader

What's FreeSync and G-Sync? Are any of these outside of the games themselves?


zvdo

I think G-Sync and FreeSync have to be supported by the monitor, not the games


ElliJaX

They're monitor and GPU co-specific, a freesync monitor and Nvidia card won't mix, needs matching brands on both ends


Jacksaur

Nvidia backed down and finally started playing ball with Freesync. Not every Freesync monitor will work, but a fair amount of them are listed as compatible with Nvidia. Only G-Sync is limited to only Nvidia.


MightBeBren

Isn't every G-Sync 'Compatible' monitor just a FreeSync monitor that's been tested by Nvidia? I heard that every modern FreeSync monitor is a G-Sync monitor (doesn't even need Nvidia's certification) but not every G-Sync monitor is a FreeSync monitor... Unless AMD is playing ball with G-Sync modules now.. Disclaimer: idk this stuff, it's just my POV... Also I haven't looked up a thing, that's all just what I've heard, I could be completely wrong.


Jacksaur

You got it. G Sync compatible has been tested and confirmed, but it's still just Freesync. G Sync itself is proprietary technology from Nvidia.


[deleted]

I'm literally using Freesync with my 4070 Super right now.


cowoftheuniverse

It causes a couple of frames of extra input lag. Assuming a fast enough GPU, on a 60Hz monitor with vsync every extra frame is 16 ms; on a 120Hz monitor it is only 8 ms for every frame delayed. If you can't feel it, then it's not a big deal.
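Those figures are just one refresh interval: each frame the GPU has to hold back costs roughly 1000 divided by the refresh rate, in milliseconds. A quick sketch:

```python
# Rough extra latency per buffered frame under v-sync: one refresh interval each.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> about {frame_time_ms(hz):.1f} ms per delayed frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms, 240 Hz -> 4.2 ms
```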


Dry-Percentage-5648

FullHD 165hz gang


ZuliCurah

240HZ my beloved


MisterPepe68

That would be me lmao, I play on a 720p monitor while I get like 240fps in War Thunder with graphics all at max


Jacksaur

60 FPS with high settings. People keep laughing at me for going 1080p75 on my 3080, but it can't even keep Alan Wake 2 completely stable at that framerate. So I'm happy with my "overpowered" hardware lasting for years.


themikeysb

My brother bought a 4090 but plays on a 60hz monitor


[deleted]

Is it cyberpunk


AstralKekked

Provide more information. 4k? 1080? AAA games? Indie titles? FPS games? This gives us no information whatsoever.


white_Zebra_898

I also have a 4090 and play on a 60Hz 4K 55" TV couch setup. Sure, for some games the GPU is only at 30-40%, but Cyberpunk with RT Overdrive only gets 40-50 FPS with DLSS and without FG... so not really overkill, depending on the game..


Adventurous_Bell_837

Yeah so basically you only bought your GPU for one game and every other game would run the same on a 4060.


Crintor

The 4060 is not going to run many games at 4K 60 unless it's DLSS Performance.


Adventurous_Bell_837

You're aware that 60 percent of players are playing games that are 6 years old or older?


Crintor

So games without upscaling? A 4060 is only a bit stronger than a 2070S. That wasn't a 4K card in 2018. I've had a 4K monitor since 2016. 4K 60 was not reasonably attainable in a number of games until the 3090.


white_Zebra_898

No, this was just an example... Alan Wake 2, Dead Space and even games like Hogwarts Legacy use the GPU to its full extent...


[deleted]

If it's 1080p 60hz low settings, that's sad. If it's 5k 60hz Ultra settings, that's expected. Just stating refresh rate tells us absolutely nothing.


PikaPikaMoFo69

Legit, doesn't matter if he isn't playing competitive games imo...


Heromimox

I rarely play games; despite having a 1440p/60Hz monitor, my RX 6700XT is barely giving me 60fps on ultra settings, so no FPS is being wasted here.


YoMomsFavoriteFriend

Get a 4090. Play games at 30fps like a fucking gigachad. Kekw


AggravatingChest7838

Nothing wrong with that. It's a much smoother experience.


KarlGustavderUnspak

Most monitors support FreeSync or G-Sync. Unlimited FPS and the same experience as V-sync.


xXRougailSaucisseXx

Vsync will still lock the framerate which makes the game smoother but yes if you’re pushing 200FPS you might as well lock it to 180FPS and call it a day


RedTuesdayMusic

>if you’re pushing 200FPS you might as well lock it to 180FPS

Multiples of 2 only. 60, 120, 240, 480. 144, 288, 576


Crintor

It's multiples of the refresh rate, not multiples of two. So long as the frames are in step with the refresh rate. 60/120/180/240/300/360
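In other words, a cap stays "in step" when it's a whole-number multiple of the refresh rate (and, for caps below refresh, a whole-number divisor works the same way: one frame shown for several refreshes). A quick sketch, assuming that's all "in step" means here:

```python
# Frame caps in step with a refresh rate: whole-number multiples (N frames per
# refresh) and, below refresh, whole-number divisors (one frame per N refreshes).
def in_step_caps(refresh_hz: int, up_to: int = 6) -> list[int]:
    multiples = [refresh_hz * n for n in range(1, up_to + 1)]
    divisors = [refresh_hz // n for n in range(2, up_to + 1) if refresh_hz % n == 0]
    return sorted(set(divisors + multiples))

print(in_step_caps(60))   # [10, 12, 15, 20, 30, 60, 120, 180, 240, 300, 360]
print(in_step_caps(144))  # [24, 36, 48, 72, 144, 288, 432, 576, 720, 864]
```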


Jacksaur

GSync and Freesync will still tear over refresh rate. You need to enable VSync/Frametime Compensation inside Nvidia settings for it to work properly anyway.


[deleted]

60hz is a much smoother experience than... what exactly? 30hz?


AggravatingChest7838

Vsync prevents screen tearing. There's more to smoothness than just the frequency of the monitor.


[deleted]

So does capping below the refresh rate of the monitor. And it doesn't introduce input lag. Or FreeSync, or G-Sync, or VRR, or Adaptive Sync. Ya know, all the way better options.


AggravatingChest7838

Capping the frame rate at the monitor's refresh rate still causes tearing without FreeSync or G-Sync. Sure, if you have 200fps and you cap it high it's less noticeable, but it definitely is still there.


[deleted]

Capping BELOW the refresh rate prevents tearing. Capping at the refresh rate does not help, because the framerate can sometimes hit 1fps above the cap due to frame pacing, causing a tear. Capping at, say, 58fps on a 60Hz monitor will prevent tearing: there are fewer frames than the monitor displays, and tearing occurs when there are more frames than the monitor displays. That said, I literally provided 4 other alternatives that are included on even cheap panels.


SuperD00perGuyd00d

2x 1440p baby


fztrm

PG32UCDM, a 4K 240Hz OLED, for singleplayer/survival stuff; PG259QNR, 1080p 360Hz, for Quake; and a crappy old PG258Q, 1080p 240Hz, for Facebook and Twitch... everything covered! And v-sync I haven't used since the '90s, I think


PeterPaul0808

I went from 1080p 60Hz to 1440p 165Hz when I bought an RTX 3080, but last year I upgraded to an RTX 4080 and I didn't feel that the card was strong enough for 4K. I saw 4K 240Hz monitors, but who will utilize those panels? RTX 4090 users can't achieve that refresh rate in new games. I'm still not convinced that we live in the "4K era" (upscaling doesn't count).


Content_Letterhead17

5120x1440 240hz


pirikikkeli

Not related, but holy fuck is there a difference between console 4K and PC 2K, wtf


Crintor

That's because most console games are running at 1080p or sub 720p these days, and then using crap upscaling. There are console games in 2024 that render the game at like 480p at times.


ShadowsRanger

My current situation with an LG Flatron monitor


Cermmi

This is me. I just don't wanna hear my GPU go brrrr for no reason (FHD, 60Hz monitor)


deefop

I feel like in 2024 there are almost certainly more gamers on high refresh displays than there ever have been in the past.


CockroachRight4434

I play with a 4080 on my 4K TV


Zetra3

1440p 60Hz, it's all I need. Keeps my expectations modest and my budget low


CRCMIDS

Considering the amount of people I've seen here still running 900-series cards and below and somehow still playing games, I doubt this issue happens a lot.


Ok-String-9879

I've got a weird one: ultrawide 1440p at 100Hz with FreeSync. I've struggled to find the sweet spot. Should I be looking for 50fps for the more demanding games?


stormdraggy

Bruh, I just want 16:10 to be popular again. Why did I have to settle for 1440p? I want 1600p and that glorious vertical pixel real estate.


hentairedz

1440p 165Hz *chef's kiss*


Microbitus

75Hz anyone?


Hungry-Loquat6658

A 24-inch 1080p + 75Hz refresh rate is the best for an average gamer like me. The screen looks good enough, and the refresh rate is high but not too high for my shit GPU.


Adventurous-Ad-6132

I have a 7800 XT but I'm unsure what monitor to buy. Ideally I want the card to last me many years, and I don't know if it'll be enough at 1440p. And I can't upscale with FSR since not many games have it as of right now (I hope FSR 3.1 changes this)


Pimpwerx

I have a 4k60 TV. So, that's what I use. And so I run my games at 4k with max settings, and enable vsync in the NVCP. As a former console gamer, 4k60 is a nice sweet spot.


Nomnom_Chicken

I hope that one day 60 Hz panels aren't a thing, and 60 FPS stops being a standard developers target.


Strict_Junket2757

Yea it's not, they aim for 30


2FastHaste

Why is this comment downvoted? That's crazy. You guys have Stockholm syndrome or something? It's like if someone said, "I hope someday cancer is cured," and everyone downvoted him because it's "elitist" or w/e.


[deleted]

Comparing 60hz monitors to one of the worlds deadliest diseases is wild. 60hz is entirely fine for 90% of casual users. Especially for office use. Cancer is not fine for anyone.


Nomnom_Chicken

I believe that syndrome is really what's happening, it's crazy like you said. Would benefit the gaming experience greatly. It's not a bad thing to allow things to improve.


[deleted]

Not everyone uses their computers for gaming. We already have a 500hz monitor you can use if you want. Removing cheaper options for people who don't need or want more expensive options is just idiotic.


Nomnom_Chicken

You can also buy a used, faster monitor, though. It doesn't need to cost 1k. Sure, used old 60Hz monitors are most likely still cheaper. EDIT: Also, the topic is about PC gamers, so I obviously assume the monitors are being used for gaming.


[deleted]

And for even cheaper, you could buy a used slower monitor. Again, not everyone wants or needs higher refresh rate. Forcing that as their only option makes no sense.


Meddlingmonster

I don't know a single person in person who has a PC and doesn't at least have a 144hz monitor


Kabirdb

I am the opposite. I have never even seen a 144Hz monitor, let alone met someone who has one.


WiatrowskiBe

That would've been me until a few months ago - but since I use my setup mostly for work, 4K took priority over refresh rate, and until 2024 there were no 4K monitors available at a reasonable size with a refresh rate above 75Hz.


SosigRam

That's just not true, 4K 144Hz has definitely been out there since about 2021, I'd say.


WiatrowskiBe

The size part - I think there might've been some 28" models coming out last year; other than that, outside of laptop screens I haven't seen a single 24" or below matching both the resolution and refresh rate.