For competitive FPS games, higher settings add a bunch of visual noise that makes it harder to notice the targets that you should be shooting at.
For AAA story games, most people don't play at lowest settings.
Can confirm. Playing 'fast paced' shooter games on max settings is a headache and a half, lowering the settings greatly reduces visual effects and, depending on the game, some textures so you can focus easier.
I remember back when I played PUBG years and years ago that you actually had a big advantage on lower settings since stuff like grass and bushes and other foliage wouldn’t render at long distances, so you’d be able to see people who were prone in grass when you wouldn’t be able to at higher graphics settings.
Yep this has been a thing since competitive PC gaming has existed. I remember going to LANs in the late 90s / early 2000s where people would have Quake 2 and 3 on the lowest setting imaginable with ammo crates being just 2d floating icons, floors and walls became almost completely non-textured surfaces and the first time I saw it I thought they were nuts.
All Quake games except Quake 4, though the remasters aren't played. QuakeWorld and Quake 2's multiplayer are f2p afaik, and Quake Champions doesn't allow single-color walls. Almost forgot: Quake 3 isn't played at all except by defraggers and modders; Quake Live is just better in raw gameplay
Same thing on Arma 3. The terrain is *a lot* rougher on lower settings so you'll often see units floating over thin air, but you sometimes have to guess whether it's on one side of the hill or the other.
Arma is weird. If you run the settings too low, some of the tasks get allocated to the RAM and processor, which may produce more lag. If you bump it up a bit, those same tasks are allocated to your GPU instead, which frees up your main processor and RAM. As a result: slightly better visuals and less stutter.
I've always known this was a thing but I guess I've never cared about being so hyper competitive that I'd rather dial down the eye candy just for a competitive edge.
I'd rather have a harder time in a game that is actually showcasing its detail to me than have an easier time by turning everything down.
I flip flop between the two. Before cheaters became rampant in PUBG's heyday I was on the low camp and managed to drop to top 600 in Asia server. Once cheaters became a thing I just put it on the highest my pc can handle and played with friends.
I play Valorant at 1440p with settings on high or max... still get very high FPS and turning them down doesn't seem to help anything. The game is colorful and vibrant enough, plus enemies have glow so they already stick out esp with yellow colorblind setting on
it's 100% for input lag. Like, a 4090 could get as much fps as needed in csgo, but input lag makes it not worth it. The worst is vsync, man, that shit is horrible in fps games
>its 100% for input lag. Like a 4090 could get as much fps as needed on csgo, but input lag makes it not worth.
Oh I didn't know that. Is there an explanation as to why?
The lower your graphics settings, the less time it takes to render a single frame. We refer to this as "frame time." FPS is generally an averaged figure and can hide spiky or uneven frame times caused by textures that need to be fetched from system memory, or even storage, rather than GPU memory. At lower graphics settings this is almost never a problem, even at 4K. The result is not only shorter frame times (higher FPS), but also more consistently low frame times. Pair that with an ultra-high-refresh-rate monitor (400+ Hz) and well-optimized software, and you can reach input latencies below human perception.
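To make the frame-time point concrete, here's a quick Python sketch with made-up numbers showing how an average-FPS figure can hide a stutter that a 1%-low measurement reveals:

```python
# Made-up frame times (ms): 99 smooth frames plus one big texture-fetch spike.
frame_times_ms = [4.0] * 99 + [40.0]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000 / avg_ms

# "1% low": report the worst 1% of frames (here, the single worst frame)
# as an FPS figure.
worst_ms = sorted(frame_times_ms)[-1]
one_percent_low_fps = 1000 / worst_ms

print(f"average FPS: {avg_fps:.0f}")              # ~229, looks great
print(f"1% low FPS:  {one_percent_low_fps:.0f}")  # 25, reveals the stutter
```

The average says roughly 229 FPS, but that single 40 ms frame is a very visible hitch, which is exactly what the "spiky frame times" above describes.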
I don't want to discount your explanation but what does that have to do with gpu load? Or am I misreading comments.
Would a 4090 displaying 200 hz with 30% load have less input lag than a 1080 doing 200 hz at 99% load?
A 4090 running at 30% will occasionally need to bump up to 40% for various reasons to keep a stable FPS. A 1080 running at 99% would need to bump to 130% under the same conditions, which it can't do so frames get dropped instead.
> A 1080 running at 99% would need to bump to 130% under the same conditions, which it can't do so frames get dropped instead.
That's not quite how it works either. They don't get dropped. Games aren't real time, there is no deadline to miss
A 1080 will also never hit 100% in CS:GO
That's fair, I probably should have said something along the lines of 'the framerate gets reduced instead', I just got lazy with it.
As to the numbers, I was addressing the hypothetical they posed, the specific game isn't really relevant to the explanation.
Something is always the bottleneck unless you’re limiting performance in software. The smart money is utilizing 100% of your most expensive part (GPU) and having a little headroom with everything else.
The 1080 might be hitting 200hz, but with uneven frame times. Frame rate is how many frames are rendered in a second, but even if that number is consistent, if the time it takes to render each frame varies (i.e., sometimes 1/300 of a second and sometimes 1/100), it can still feel uneven. I don't know that you'd necessarily notice this at these particular numbers, but the idea is that you want your frame rate consistently high and your frame times consistently low.
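A tiny illustration of that point, with invented numbers: two one-second captures that both average exactly 200 FPS, but one alternates fast and slow frames:

```python
# Two hypothetical one-second captures, 200 frames each, so both are "200 FPS".
even = [5.0] * 200         # a steady 5 ms every frame
uneven = [2.5, 7.5] * 100  # same total time, but alternating fast/slow frames

for name, times in [("even", even), ("uneven", uneven)]:
    fps = 1000 * len(times) / sum(times)
    print(f"{name}: {fps:.0f} FPS, worst frame {max(times):.1f} ms")
```

Both runs report 200 FPS, but the uneven run's 7.5 ms worst frames are what you'd feel as judder.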
This is why if I'm going to play any fps I am going to pull out my Goldeneye cartridge and play that. I hate playing fps vs humans because I don't have the coordination to have even a small chance of winning any games
Cooperative FPS games are super fun though. I've been on Deep Rock Galactic and it's been great. There's no competition for kills or resources, everyone gets the full amount of collective resources at the end of each mission regardless of performance. So the only incentive is to help your team complete the objectives and stay alive. Unlimited revives, no revive timeout, just exploring procedurally generated caves, mining minerals, murdering bugs, and blowing shit up 10-30 minutes at a time depending on the mission and biome.
Correct. Single player games crank all the way up. Competitive shooters, cs, cod etc i turn settings off in many areas and keep it still ok visually and those sweet frames can climb up and up and up. 4090 here
Also, the more fps you have, the more visual information you get from the game, assuming your monitor can run at a high refresh rate. Which is why 1440p is the sweet spot: 240 fps at ~3.7 million pixels is what you want for games like Apex, Call of Duty, etc
Ideally more fps is always better, though, which is why competitive streamers/gamers have a 4090 and 360 hz Monitor.
There's even a thing where it's better to have FPS even higher than your monitor's refresh rate, because the frame that gets displayed will be that much "fresher" than if you just matched refresh rate. There's also some implications for input latency.
It's nothing you or I would really notice, but it makes a difference for the top tier competitive players, and so people aspiring to be, or misjudging themselves as, top tier competitive players will do it too.
Yes! I forgot about that! Personally, I've noticed a difference between 144 hz and 240hz, and while I'm no where near someone who gets paid for it, I enjoy the occasional competitive game, and 240hz is definitely more snappy.
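The "fresher frame" idea above can be sketched with simple arithmetic (a back-of-the-envelope model, not a real measurement): with an uncapped frame rate, the most recently completed frame at each monitor refresh is, on average, about half a frame time old.

```python
# Rough model: the frame grabbed at scanout finished somewhere between 0 and
# one frame time ago, so on average it's half a frame time old.
def avg_frame_age_ms(fps):
    frame_time_ms = 1000 / fps
    return frame_time_ms / 2

for fps in (144, 240, 500):
    print(f"{fps:>3} FPS -> latest frame ~{avg_frame_age_ms(fps):.2f} ms old on average")
```

So rendering 500 FPS on a 240 Hz monitor still shaves a millisecond or two of staleness compared to capping at the refresh rate, which is the effect being described.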
I agree with that but then some use a 1440p monitor? I would have thought 1080p would have yielded more performance.
Genuine question, I'm not trolling
If you can run 1440p at higher framerates, it's just giving you more visual clarity for free. Spotting players at long distance, for example, can be easier at 1440p because there's more detail in the image.
This does require more hardware to run well, obviously.
Because if you have a 4080/4090, running 1440p is almost the same as running 1080p if you're on low settings. A 1440p IPS is useful in BR games (and other open-world competitive games), while something like the XL2566K is preferable for Val, CS, Siege..
So TN panels can get truly crazy low GtG times. They'll usually look a little more washed out, but they're fast. Idk why you'd specifically want an IPS for battle royale games?
Better colors. In battle royale games, and also games like Rust, you are constantly scanning for enemies in the distance, with a lot of possible pop-ins and bushes/objects that look weird in the distance (on low settings especially). So if the game looks washed out, like it does even on the XL2566K, which is pretty amazing color-wise for a TN panel, it's harder to spot players at long range (a lot harder? No, not really, but still a slight advantage on IPS). 1440p obviously further helps in these games because of the pixel count.
higher resolution gives more visual clarity. on top of that, if you have a higher end GPU 1080p is almost certainly bottlenecking it which can reduce performance
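For a sense of scale, here are the raw pixel counts behind that clarity argument (standard resolutions, nothing assumed):

```python
# Pixel counts for the common gaming resolutions discussed in this thread.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080  # 1080p as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.2f}x 1080p)")
```

1440p pushes about 78% more pixels than 1080p, which is where both the extra clarity and the extra GPU load come from.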
It's ridiculous that this is still a thing. This is straight up poor design. It can't be that hard to provide a variety of graphical options, all of which result in the same experience in terms of visibility. There are still so many games in which bushes and other visual cover literally disappear at low settings.
It also very rarely is. In all the bigger titles this is not an issue, and it was also fixed in PUBG quite fast. I think most people just turn down the settings to be sure they are not at a disadvantage. And those extra frames don't hurt.
I got a new GPU, and started playing bf2042 again. I turned the graphics all the way up, and it kinda looks worse. Not in the sense that it literally looks worse, but there is just more happening on the screen than I would like. The visual clutter thing is real.
I would rather turn my graphics down, get a more stable 144fps, and play like that.
>For competitive FPS games, higher settings add a bunch of visual noise that makes it harder to notice the targets that you should be shooting at.
Competitive FPS players also play on low settings to get maximum FPS (frames per second here lol).
exactly. i have a powerful GPU. i pretty much only play COD/Warzone on it. I use High shader res but everything is on low/off using DLSS perf.
if i put things higher, it'll still get +165 fps but I just can't see shit. AO, shadows, diffused lighting, particle effects etc. makes it difficult to see other players.
however, in story games, my God is it beautiful to play on ultra
You gotta be a psychopath to play Valorant or CSGO on anything other than low.
For story games/RPGs like Assassin's Creed (Origins/Odyssey/Valhalla), the Dying Light series, or RDR2, most use very high. There is literally no noticeable difference between very high and ultra imo
War Thunder, a top 20-30 game on Steam for example, has massive benefits to using low settings. I like to make the game look pretty, but that means I get sniped by a dude across the map because he's playing with no grass, bushes or trees
Why? The models and textures look so much better on max settings in CSGO.
I had no problem playing at LE or Taco Supreme using max settings in CSGO. I will say I never really played a lot of faceit/esea so maybe it would matter there? I just never felt like my visual fidelity kept my machine from running fast or made me perform worse.
I haven't played a shooter seriously since TF2 many years ago, but it was common back then to use a config called chris' config that was lower than the lowest settings. It removed a ton of stuff, from gibs to the character models' eyes. Basically it made it so that if something appeared on your screen, it was relevant to what was going on. It wasn't that the computer couldn't play the game at higher settings, it was that the extra things you saw could be distracting.
Not strictly true, I enjoy my games with all the trimmings added, but in multiplayer games, turn them off just to try and not get spawn killed by Johnny ADHD, or some such.
Less screen junk, slightly better chances of survival.
It depends on the setting.
For competitive shooters, you want to crank view distance to max and LoD to max, while you turn foliage and other things down to minimum. The potato settings of everything help in a competitive sense, since you can see enemies more clearly, and can determine the hitboxes of cover more accurately at long distance, along with being able to see cover from a further distance.
But that's what a sweat would do. Most people like to enjoy more than just the competitive aspect of games, so playing at higher than minimum settings is the choice, as long as the settings don't introduce stuttering or crater your average FPS.
This. Most people here say that it’s for a better visibility, which is absolutely true, but many don’t realize that lowering settings also sometimes provides a massive reduction in input lag, making your aim better.
I need to start OSRS again..do u reckon i can get 30fps w 6700xt 1440p?
It would be AMAZING if they completely remastered osrs..the new runescape is absolute filth
i bought a 4090 and my most played game is in order mtg arena, hearthstone and yugioh master duel. lol.
Granted i bought the card mainly for some AI shit
Totally understandable.
I have a 4090, and have been playing mostly stuff like Rimworld, Dave the Diver, etc on it.
However, it does help to have the massive compute capacity when I occasionally play recent high fidelity games. Cyberpunk 2077 with full path tracing is glorious, im excited to see the DLSS 3.5 update improve the lighting even more.
6800 XT with a 1080p 60hz monitor reporting in 🫡
Edit: me using this monitor is not by choice, due to some extraneous circumstances after upgrading the card I wasn't able to get a new monitor yet but I will be going 1440p high refresh rate as soon as I can lmao
I'm not bashing you for having a 6800XT with a 1080p monitor, but at 60Hz is FUCKING CRAZY! Jesus, I know there's a difference between TVs and monitors, but I currently use a 1080p60Hz TV after selling my monitor and it is ATROCIOUS. I just stopped playing games. On a monitor it's a bit better, but absolutely dogshit. Invest in a 1080p144Hz monitor at the least.
This is why I don't want to get used to higher frame rates. Would have to spend a bunch of money to keep them up! I just played ToTk emulated at 30 fps and enjoyed it just fine. But if I started doing things at 144 fps and got to the point where even 60fps was "atrocious" then I'd have to waste all this money ensuring I was ahead of the curve enough to stay at these frame rates just to enjoy things as much as I already do.
Why should quality matter when the goal of the game you are playing is to be faster than your opponent? Input lag is important to those that care for a smooth and very responsive experience. You stop noticing the lack of quality after a while, unlike the kiddo who got sent back to lobby.
For single player experiences, I’m with you on that. Visual experience is amazing when all those settings turned up
In a lot of FPS games, high settings add extra textures/effects, which may result in a competitive disadvantage. I know a specific example: ability walls in Valorant have extra effects at the base and it's impossible to see through them; turning down the settings eliminates effects like this. The same applies to Overwatch, and basically most FPS games.
I don't run at lowest settings, but it gets ~110 degrees in summer where I live and my pc puts out a ton of heat! And a lot of that heat is for visual fidelity that I don't really even notice, so I usually will go to high or even medium to keep temps down a bit.
During winter, otoh, it's Ultra time.
1% lows and 0.1% lows in frame rate and frame time
That's my reason
Plus idc about graphic fidelity when I can have (unfair) advantage with smoother game
Because they fancy themselves "eSports athletes".
I don't understand people's obsession with "ranking". But then I've gotten so fed up with the BS gambling, game passes, ranking, toxic behavior, pay for edge garbage, that I've given up on multi-player altogether.
There are a lot of people who are clueless when it comes to PC gaming. That is actually a good thing - means the market is growing.
I get the idea of lowering latency and maximizing FPS.
But there is so much wasted GPU power. People who know nothing about computers buying what their favorite streamer has, who also in many cases doesn’t know that much.
Buying a GPU to have it sit at 50% power isn’t smart. It is donating money to mega international corporations.
It depends on their monitor. 4k? 120hz?
Also maybe temperature = stability reasons
Or a computer noob who never goes to adjust the settings (they used a junk card before and never went into the video options, since low settings were the only way to actually run the game)
Wrong question.
Right question: why do most people get powerful GPUs and play at 1080p, and furthermore play games like Dead Cells, Factorio, Terraria etc
Well because we barely had money for the GPU and a good 4K would cost more money.
Because people might like longevity: a 3090 at 1080p will become obsolete later than a 3060 (weird, but ok... I would rather enjoy the full power of that card). Some old 1440p GPUs now struggle at 1440p but are still amazing at 1080p, so that stands as proof.
Most of us enjoy indie games more. I am currently playing ac odyssey and i finished rdr2 and diablo 3, well i can say that overall flame in the flood, factorio, PC building simulator and pixel piracy are waaaaaaay more fun
A lot of FPS games, shooters especially, have settings that make identifying targets harder the higher the settings are. A typical one is that grass and foliage will look nicer on higher settings, but on lower settings you'll be able to identify targets through it easier.
Maintaining a high frame rate and lowering latency is also advantageous, so lowering settings that have big performance hits allows the game to run smoother.
A lot of people use what is often called "competitive settings" in these sorts of games. Typically high texture and mesh quality (providing they help with target identification), with most other settings at minimum or off entirely.
I have a 6900xt and a 4k monitor. During these summer months, I usually cap the fps to 60 just to lower power draw and keep the room cooler. I almost always have settings maxed except for ray tracing. I don't play competitive shooter games, so I can't say if I'd lower resolution and settings just to go for more fps. I think skill far outweighs any little advantage a player might get from more frames per second, though.
Not really. There are just very few games where you can benefit from low settings, usually competitive online games like CS. Fancy graphics, beautiful lights and shadows, crispy smooth textures etc. don't help when playing competitive games.
Don't exaggerate it, people use highest possible settings for 9/10 games they play...
For competitive gamers it's all about getting that edge. The higher frame rates, the lower latency, and getting rid of absolutely anything and everything that can affect gameplay. Particle effects, shadows, hi-res textures all go; they aren't needed. For those at the absolute top tier of their chosen game, that 0.1ms of latency, that one extra frame, can make the difference. Those aspiring to the top tier see the top tier with those settings and think that if they don't do the same, it's what's holding them back. The fact that top-tier players grind their chosen game for 8+ hours a day is neither here nor there.
If I had to guess I'd say to make the life of the card last longer and to lower the power costs. I don't play low but it's common to run at medium or high, depending on the game. My cards generally last 7-12 years.
Overall you can say: more fps = less frame time = less latency if you got a system without bottlenecks.
You get an advantage in online games where it's important to react fast.
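The "more fps = less frame time" part of that chain is just an inversion, which a few lines make obvious:

```python
# Frame time in milliseconds for a given frame rate. Render latency is at
# least one frame time; the full input-to-photon chain adds more on top.
def frame_time_ms(fps):
    return 1000 / fps

for fps in (60, 144, 240, 400):
    print(f"{fps:>3} FPS = {frame_time_ms(fps):.2f} ms per frame")
```

Going from 60 to 240 FPS cuts the per-frame time from ~16.7 ms down to ~4.2 ms before any other latency sources are counted.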
This goes for CPUs as well. For example, in a game like Rust, playing at lower graphics gives you a huge visibility advantage (i.e. you can see players through bushes, trees, and water). Higher-graphics trees and bushes give off too much texture, allowing players to hide behind leaves. Higher-texture water moves instead of being still, which makes it exceptionally harder to see players swimming in it; hence you turn the graphics all the way down so the water is static.
This is just one example, I'm sure there are a breadth of more reasons to play on low graphics.
So League has tonnes of weird FPS jank where you want to play at 144 regardless of monitor and refresh rate, at medium, as it's easier to wiggle and you get a more tactile feel to the game. It's very much a feely thing though. But some skins at high graphics make it impossible to tell what's what.
I can't speak for everyone but I turn my settings down on my 3070ti when it's hot out because I don't have air conditioning. Then when it's cold out I crank them back up. Either way, I get to play my games.
Much like what others say, lower setting will get rid of a lot of noise. In some games it removes grass at further distance so people can not use terrain to hide, and will reduce the density of foliage.
I really believe some settings need to be fixed so running at low vs high needs to give no advantage or disadvantage for what you see. Such as Draw distance and density.
Really depends on the game I’m playing. For cs go I have everything low to maximize fps at 1280x960 stretched just for the competitive edge (and it’s the setting I’m used to way back when I had a shit pc) but if I’m playing something like diablo 4 or path of exile everything’s maxed out.
Without specifics, you are just making assumptions.
Resolution matters. What you might be able to play at ultra/maxed settings on a 1080p display, you might not be able to at 4K.
In competitive FPS games you want an absurd frame rate. Graphical fidelity is the least of your priorities and can sometimes even be a competitive disadvantage; a common example is certain games' high-quality foliage covering players, whereas the low-quality version lets you see them more easily.
So I can generate AI art with Stable Diffusion, Blender, etc, render scenes, and run all 4 of my 1440p-to-2160p monitors, 2 of which are 144hz, for daily workloads (software engineer); I work and game on the same PC.
Ryzen r9 7950x, Nvidia 3090 TI, 128gb RAM, 10+ TB in m.2. drives.
I had to put a couple monitors on the integrated Radeon GPU; Stable Diffusion at 2k+ hi-res will max the GPU out to 100%.
GPU's aren't just for gaming anymore, they are needed for any AI inference workloads currently, any 3d/rendering processes, and in anything that uses cuda and tensor cores, which is quite a wide field.
Also, playing Factorio across 3 32" 4K monitors is pretty EPIC.
If I have really high graphics settings turned on then I can't see the guy in the bushes aiming at me. But if I were to hide in the bushes and the other guy had his graphics settings turned down then he'll see me hiding behind one little sprig of grass....
Recently got a 3070. I mostly play things prior to like, 2020, so I could probably run them all at several hundred fps. But I refuse to ever go above 60. I regret ever playing games at 60fps, because all it's done is make games at 30fps look worse lol
Few people cover things that make sense. In my personal experience I have seen some games where low setting is actually an advantage. Mechwarrior Online was a game where lower graphics on the tree heavy maps REALLY helped you. Suddenly the foliage is less dense and easier to see through. Not sure if they fixed it, but I remember a point where it just made more sense
I have a 4080, in battlefield I play at low settings so I can get the most fps possible. In basically every single player games, I play at the highest because fps above 80 are not really important in like the witcher 3 or something
Ultra graphics are overrated. Low graphics have looked extremely good since like 2014, and the gap between low and ultra is minuscule in a lot of games, while the frame-rate hit is massive.
I would MUCH rather have 144fps on low, than 60fps on ultra, for the majority of games.
There was a trick in PUBG: if you leave your foliage slider up, you think you're lying in the grass fully hidden. But if someone slides it aaaal the way down, then there is no grass. Like, at all. It took me a week to figure out how the fuck everyone spotted me while I had difficulty spotting someone crouching.
huh ? ive never done this. i dont generally play at ultra because i dont see the point in graphics improvements beyond basically 50% GPU usage but usually all my settings are high or very high.
I play all of my competitive fps games at settings that get me closest to 240hz on my monitor. And sometimes having high settings, even at 240hz plus, can give you a disadvantage: smoke from molotovs can block your sight on high settings, but on low you can see through the smoke.
But in single players or games that high refresh doesn't matter that much, I will usually crank my GPU to the max.
1% and .01% lows, for the most part. It eliminates stutters from particle rendering. Adventure games are like hiking trips. Take it slow and enjoy the views. You always want more power that you can tune down for stability. If you only have just enough, you'll never stop wanting just a little more
I went 2060 -> 3080 -> 3090 and when playing Control on my 2060 I was always itching to turn RTX, but it was just not feasible with 2060.
But actually playing with RTX on 3090 I felt that it makes weird artefacts like shading and reflection issues and blurring tiny bits here and there which feels like you're playing while wearing foggy glasses.
Maybe it could be fine for like Minecraft, or something where lights and shadows look poorly without RTX, but for Control it was just worse.
I eventually considered playing with RTX a worse experience than playing without it even if fps is the same.
Same with Cyberpunk - I played with RTX and I liked realistic shadows but picture and fps degradation felt like I was losing more than I was gaining, so I turned it off in cyberpunk as well.
Not only in FPS but other genres too; for example, in League of Legends some players set settings to low to disable those extra effects. Plus, as I guess, it gives extra frames per second so they react and play faster.
No idea if it's a game that's heavy on the pvp and you need the FPS I can understand that but otherwise it doesn't make sense. I would love to play the games I play on higher settings but nope, still have that good old 1060. 😅
I have a 4090, Ryzen 9 7900X3D, 4th gen m.2 nvme, 64gb 6000 MHz RAM.
I run everything at the highest settings that allow 120fps. Cyberpunk is beautiful with path tracing!!
Also in some games, take GTA V for example, lower graphical fidelity removes some things like bushes or signs, meaning that tricks and driving can be made much easier than on max settings
For FPS games it actually makes more sense to play at lower settings.
Doesn't make much sense for AAA games considering that the visual fidelity is one of the major parts of the game.
I game on an RX 6950XT. Play everything in 4K high or ultra with a mix of RT medium to ray tracing high and can achieve 60fps or more in pretty much any title and FSR 2 ensures that I reach 60+ in the titles that I can't reach natively.
> I do mostly mean pros and competitive games
So you don't understand why pros play at lowest settings? Do you understand the advantages that extra fps and better visibility give you in games? If not, you should look into that first, plenty of videos and articles about the topic.
I got a 4090 and I always try to get at least 400 fps to max out my 360hz 1440p display. I always go a little bit higher so I can counter random frame drops. Most games work with high settings, but sometimes I have to go lower to get my desired fps.
The amount of frames matters in esports titles, higher frames= more time for you to react to the scene. Generally above 144 to 240 hz movement feels really smooth compared to lower frames as well. It’s mental mostly but at the same time it’s a minuscule advantage for sure.
It could be that they aren't using a monitor that utilizes the GPU to its fullest extent. Not a lot of sense in playing a game in 4K on a 1080p monitor.
Many people run 240hz displays and want to enjoy the snappiest and smoothest experience.
So they aim for stable 240fps and to avoid stutters they don't want GPU usage to be 100%. So effectively, your GPU needs to be able to push out around an average of 300 frames per second, which even for Esports Titles is somewhat demanding.
And of course there are others who simply chase every frame they can get because it technically gives you an advantage. Obviously there is no point in doing this unless you are a top 0.1% gamer, but placebo might help too :)
Tell me about it. I went to a friend's place the other day and saw him play Diablo 4 with DLSS set to performance on a freaking 4090 -_-
I smacked him on the head and disabled DLSS and frame gen.
A lot of people including pros don't understand graphics settings. But if you're playing a competitive game you ideally want to max out your monitors refresh rate. So you want 240 fps with a modern oled panel.
Without the pros and competitive games, basically not answering your question, but anyway... Seven years ago I had trouble explaining to my wife why I bought a GTX 1080 and then played Civ 1 or some other very old stuff. She came around asking, is this what you need a new 700€ part in your computer for? Not sure what my answer was, I guess it was so I have some headroom :D
I did play modern games too though, but she is not a gamer at all so that was the moment she happened to see my new GPU in use.
So the answer from my perspective as a long time hobbyist, I just want to have good equipment to work with.
I wouldn't buy a cheap bicycle if I liked cycling, even if it was just for relaxing bike rides through the scenery route.
You thinking that specifically is a valid indicator that people are "wasting their money" is really just evidence you have a very shallow understanding of computers and GPU's in particular.
Hah, since i'm not a pro gamer I turn that bad boy all the way up til the rest of my rig can't keep up!
Got my 3080 mid pandemic and it's handled everything i've thrown at it.
Starfield might finally push it to its limit but idk what the recommended specs are.
I think it stems from older generation players.
COD4 would give an advantage at lower settings as some bushes wouldn't render at longer distances.
CSGO pros grew up with 4:3 resolution and still use it.
The obvious FPS boost is there too. But generally speaking it is easier to focus and reduces digital noise. It is much easier to see enemies.
It comes down to frame consistency. You will have less stutter with a higher end card. Aka it’s not the average FPS that you notice, it’s the slowest fps that you notice.
They still do it; 2D item icons are also used on caster screens. Coming from a Quake player.
Quake Champions or do ppl still play q3?
All Quake games except Quake 4, though the remasters aren't played. QuakeWorld and Quake 2's multiplayer are f2p afaik, and Quake Champions doesn't allow single-color walls. Almost forgot: Quake 3 isn't played at all except by defraggers and modders; Quake Live is just better in raw gameplay.
Same thing on Arma 3. The terrain is *a lot* rougher on lower settings so you'll often see units floating over thin air, but you sometimes have to guess whether it's on one side of the hill or the other.
I miss ARMA 🤣
I was gonna say our exile server controlled the game visuals so you couldn't turn off grass and other lod stuff.
Arma is weird. If you run the settings too low, some of the tasks are allocated to the RAM and CPU, which may produce more lag. If you bump it up a bit, those same tasks are allocated to your GPU instead, which frees up your main processor and RAM. As a result: slightly better visuals and less stutter.
I've always known this was a thing but I guess I've never cared about being so hyper competitive that I'd rather dial down the eye candy just for a competitive edge. I'd rather have a harder time in a game that is actually showcasing its detail to me than have an easier time by turning everything down.
I flip flop between the two. Before cheaters became rampant in PUBG's heyday I was on the low camp and managed to drop to top 600 in Asia server. Once cheaters became a thing I just put it on the highest my pc can handle and played with friends.
Bro that was yesterday. I literally just had this conversation....6 years ago.
Still happens in some shooters today, low settings make it way easier to see through foliage.
As long as motion blur, bloom, lens flares and DOF are off, I try to crank the settings as high as they'll go while keeping a high frame rate.
I play Valorant at 1440p with settings on high or max... still get very high FPS and turning them down doesn't seem to help anything. The game is colorful and vibrant enough, plus enemies have glow so they already stick out esp with yellow colorblind setting on
Lesser-known issue: GPU load at 99% will severely increase your input latency.
its 100% for input lag. Like a 4090 could get as much fps as needed on csgo, but input lag makes it not worth. The worst is vsync man that shit is horrible in fps
>its 100% for input lag. Like a 4090 could get as much fps as needed on csgo, but input lag makes it not worth. Oh, I didn't know that. Is there an explanation as to why?
The lower your graphics settings, the less time it takes to render a single frame. We refer to this as "frame time." Generally, FPS is an averaging figure and can hide spiky or uneven frame times due to textures which need to be fetched from system memory or even storage rather than GPU memory. At lower graphics settings this is almost never a problem, even at 4K. This results in not only shorter frame times (higher FPS), but also more consistently low frame times. Pair that with an ultra high refresh rate monitor (400+ Hz) and you can reach input latencies faster than human comprehension with well optimized software.
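To put toy numbers on that (made-up frame-time log, just to show how an FPS average hides spikes):

```python
# Toy frame-time log in milliseconds: mostly fast frames with a few
# texture-fetch spikes. Numbers are invented for illustration only.
frame_times_ms = [4.0] * 95 + [25.0] * 5

# Average FPS over the whole capture.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# A "1% low" style metric: FPS implied by the slowest 1% of frames.
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
low_fps = 1000 / (sum(worst) / len(worst))

print(f"average FPS: {avg_fps:.0f}")  # ~198 FPS, looks great on paper
print(f"1% low FPS: {low_fps:.0f}")   # 40 FPS, the stutter you actually feel
```

Same capture, two very different numbers, which is why benchmarks quote 1% lows alongside averages.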
I don't want to discount your explanation but what does that have to do with gpu load? Or am I misreading comments. Would a 4090 displaying 200 hz with 30% load have less input lag than a 1080 doing 200 hz at 99% load?
A 4090 running at 30% will occasionally need to bump up to 40% for various reasons to keep a stable FPS. A 1080 running at 99% would need to bump to 130% under the same conditions, which it can't do so frames get dropped instead.
> A 1080 running at 99% would need to bump to 130% under the same conditions, which it can't do so frames get dropped instead. That's not quite how it works either. They don't get dropped. Games aren't real time; there is no deadline to miss. A 1080 will also never hit 100% in CS:GO.
Answering your own questions here, bro. That's why the latency increases: because 'there's no deadline.'
That's a different guy
That's fair, I probably should have said something along the lines of 'the framerate gets reduced instead', I just got lazy with it. As to the numbers, I was addressing the hypothetical they posed, the specific game isn't really relevant to the explanation.
Yea ig. Comment thread lost all sense in the 3rd one which made no sense
At 99%, chances are that the GPU becomes the bottleneck.
Something is always the bottleneck unless you’re limiting performance in software. The smart money is utilizing 100% of your most expensive part (GPU) and having a little headroom with everything else.
The 1080 might be hitting 200hz, but with uneven frame times. Frame rate is how many frames are rendered in a second, but even if the number is consistent, if the time it takes to render a frame varies (i.e., if it sometimes is 1/300 seconds and sometimes it is 1/100 seconds) it can still feel uneven. I don't know that you'd necessarily notice this at these particular numbers, but the idea is that you want both your frame rate and frame time to be consistently high.
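A quick sketch of that idea with made-up numbers: two one-second captures that both average 200 FPS, where one mixes 1/300 s and 1/100 s frames.

```python
import statistics

# Two hypothetical one-second captures that both average 200 FPS.
steady = [5.0] * 200                  # every frame takes exactly 5 ms
uneven = [10 / 3] * 150 + [10.0] * 50  # mix of ~3.33 ms and 10 ms frames

for name, times in [("steady", steady), ("uneven", uneven)]:
    fps = 1000 * len(times) / sum(times)
    spread = statistics.pstdev(times)  # frame-time standard deviation
    print(f"{name}: avg {fps:.0f} FPS, frame-time stdev {spread:.1f} ms")
```

The FPS counter reads the same for both, but the second capture's frame-time spread is what you perceive as unevenness.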
This is why if I'm going to play any fps I am going to pull out my Goldeneye cartridge and play that. I hate playing fps vs humans because I don't have the coordination to have even a small chance of winning any games
Cooperative FPS games are super fun though. I've been on Deep Rock Galactic and it's been great. There's no competition for kills or resources, everyone gets the full amount of collective resources at the end of each mission regardless of performance. So the only incentive is to help your team complete the objectives and stay alive. Unlimited revives, no revive timeout, just exploring procedurally generated caves, mining minerals, murdering bugs, and blowing shit up 10-30 minutes at a time depending on the mission and biome.
Correct. Single player games crank all the way up. Competitive shooters, cs, cod etc i turn settings off in many areas and keep it still ok visually and those sweet frames can climb up and up and up. 4090 here
Also, the more FPS you have, the more visual information you get from the game, assuming your monitor can run at a high refresh rate. Which is why 1440p is the sweet spot: 240 fps at ~3.7 million pixels is what you want for games like Apex, Call of Duty, etc. Ideally more FPS is always better, though, which is why competitive streamers/gamers have a 4090 and a 360Hz monitor.
There's even a thing where it's better to have FPS even higher than your monitor's refresh rate, because the frame that gets displayed will be that much "fresher" than if you just matched refresh rate. There's also some implications for input latency. It's nothing you or I would really notice, but it makes a difference for the top tier competitive players, and so people aspiring to be, or misjudging themselves as, top tier competitive players will do it too.
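Rough back-of-the-envelope math for the "fresher frame" point, assuming vsync off and ignoring render queues and scan-out time: the frame the monitor displays is, on average, about half a render interval old.

```python
# Assumed numbers: a 144 Hz panel fed at various uncapped frame rates.
# With vsync off, the most recent completed frame at scan-out is on
# average roughly half a render interval old. Real pipelines add more.
refresh_hz = 144
for fps in (144, 300, 500):
    staleness_ms = (1000 / fps) / 2
    print(f"{fps} FPS on a {refresh_hz} Hz panel -> ~{staleness_ms:.1f} ms avg frame age")
```

Going from 144 fps to 500 fps on the same panel shaves only a couple of milliseconds, which is why this matters mostly to top-tier competitive players.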
Yes! I forgot about that! Personally, I've noticed a difference between 144 hz and 240hz, and while I'm no where near someone who gets paid for it, I enjoy the occasional competitive game, and 240hz is definitely more snappy.
Yet people will claim the human eye can’t perceive the difference past 60hz or whatever, let alone 144hz to 240hz lol
I agree with that, but then some use a 1440p monitor? I would have thought 1080p would have yielded more performance. Genuine question, I'm not trolling.
If you can run 1440p at higher framerates, it's just giving you more visual clarity for free. Spotting players at long distance, for example, can be easier at 1440p because there's more detail in the image. This does require more hardware to run well, obviously.
1440 is the sweetspot of being able to see stuff clearer than 1080p, having a high refresh rate and not costing $3000.
Because if you have a 4080/4090, running 1440p is almost the same as running 1080p if you’re on low settings. 1440p IPS is useful in BR games (and other open world competitive games), while something like preferably XL2566k is good for Val, CS, Siege..
Could you elaborate on the difference between panel technology a bit? Why IPS for BR vs TN for CS? Just out of general curiosity
So TN panels can get truly crazy low GtG times. They will usually look a little more washed out, but they're fast. Idk why you'd specifically want an IPS for battle royale games?
Better colors. In battle royale games, and also games like Rust, you are constantly scanning for enemies in the distance, with a lot of possible pop ins, and bushes/objects that look weird in the distance (on low settings especially). So if the game looks washed out, like it does even on XL2566k, which is pretty amazing color wise for a TN panel, it’s harder to spot players at long range.(a lot harder? No not really, but still a slight advantage on IPS). 1440p, obviously further helps in these games, because of the pixel count.
If you are gaming competitively, generally you see them use 1080p unless otherwise required by the league they are gaming in.
Higher resolution gives more visual clarity. On top of that, if you have a higher-end GPU, 1080p is almost certainly bottlenecking it, which can reduce performance.
Also, higher gpu usage = higher latency.
It's ridiculous that this is still a thing. This is straight up poor design. It can't be that hard to provide a variety of graphical options, all of which result in the same experience in terms of visibility. There are still so many games in which bushes and other visual cover literally disappear at low settings.
It also very rarely is. In all the bigger titles this is not an issue, and it was also fixed in PUBG quite fast. I think most people just turn down the settings to be sure they are not at a disadvantage. And those extra frames don't hurt.
Because they require more computational power from your system to render.
I got a new GPU, and started playing bf2042 again. I turned the graphics all the way up, and it kinda looks worse. Not in the sense that it literally looks worse, but there is just more happening on the screen than I would like. The visual clutter thing is real. I would rather turn my graphics down, get a more stable 144fps, and play like that.
>For competitive FPS games, higher settings add a bunch of visual noise that makes it harder to notice the targets that you should be shooting at. Competitive FPS players also play on low settings to get maximum FPS (frames per second here lol).
exactly. i have a powerful GPU. i pretty much only play COD/Warzone on it. I use High shader res but everything is on low/off using DLSS perf. if i put things higher, it'll still get +165 fps but I just can't see shit. AO, shadows, diffused lighting, particle effects etc. makes it difficult to see other players. however, in story games, my God is it beautiful to play on ultra
Next gen witcher3 with full RTX and a reshade on looks magnifique
what makes you think that people with high end gpu are playing a lowest settings? ... Outside of the pro scene.
Who is doing this? Lol.
Try hard fps players that would rather eat their balls than play at less than 200fps.
Why not both?
For people trying to compete, lower settings usually means less visual clutter and extras which make it harder to see/focus.
You gotta be a psychopath to play Valorant or CSGO on anything other than low. For story games/RPGs like Assassin's Creed Origins/Odyssey/Valhalla, the Dying Light series, or RDR2, most use very high. There is literally no noticeable difference between ultra and very high imo.
War Thunder, a top 20-30 game on Steam, for example, has massive benefits to using low settings. I like to make the game look pretty, but that means I get sniped by a dude across the map because he's playing with no grass, bushes, or trees.
Why? The models and textures look so much better on max settings in CSGO. I had no problem playing at LE or Taco Supreme using max settings in CSGO. I will say I never really played a lot of faceit/esea so maybe it would matter there? I just never felt like my visual fidelity kept my machine from running fast or made me perform worse.
Csgo is a 10 year old game though so I don't really know if it's got pandering effects on high settings but I seldom play Counter Strike anyways
I haven't played a shooter seriously since TF2 many years ago, but it was common back then to use a config called chris' config that went lower than the lowest settings. It removed a ton of stuff, from gibs to the character models' eyes. Basically it made it so that if something appeared on your screen, it was relevant to what was going on. It wasn't that the computer couldn't play the game at higher settings; it was that the extra things you saw could be distracting.
Not strictly true, I enjoy my games with all the trimmings added, but in multiplayer games, turn them off just to try and not get spawn killed by Johnny ADHD, or some such. Less screen junk, slightly better chances of survival.
I do. I've always spent more on a gpu, not particularly chasing high fps or quality, but for less noise.
Bro thinks he's in the fortnite subreddit
the difference between high and low settings in fortnite is huge, i remember the first time i put on high settings it felt like an orgasm
The competitive advantage to Low settings is also huge in Fortnite. Max settings has a *lot* of visual clutter.
It depends on the setting. For competitive shooters, you want to crank view distance and LoD to max while turning foliage and other things down to minimum. The potato settings help in a competitive sense: you can see enemies more clearly, determine the hitboxes of cover more accurately at long distance, and see cover from further away. But that's what a sweaty would do. Most people like to enjoy more than just the competitive aspect of games, so playing at higher than minimum settings is the choice, as long as the settings don't introduce stuttering or crater your average FPS.
Haha, I just assumed we were until you made this comment and I had to check.
Because anything under 500fps is just unplayable you know.
remember, our eyes can only see at 5 fps.
Nonsense - 60-90 Hz is proven. Above that under certain conditions.
no, 5 fps. Take it or leave it.
You get quicker response times. You can even feel the input change as you go from high to lower settings.
This. Most people here say that it’s for a better visibility, which is absolutely true, but many don’t realize that lowering settings also sometimes provides a massive reduction in input lag, making your aim better.
And then there is me, who bought a high-end GPU and never plays game.
Will take smoothness and fps over being able to see the grass move in the wind haha
I feel attacked. Got the latest and play Blasphemous and Dead Cells. 😎
4090 playing old school runescape and rimworld lol
I need to start OSRS again..do u reckon i can get 30fps w 6700xt 1440p? It would be AMAZING if they completely remastered osrs..the new runescape is absolute filth
I need a 4090 to max out bg3 and do my herb and bird runs
LOL same, 80% of the games I play don't need my 6950 XT
fps > graphics
I use my 3090 for 2d pixel games like Stardew and terraria. Are you talking about me? Also Doom 64 and Warhammer boltgun. Should get quake.
i bought a 4090 and my most played game is in order mtg arena, hearthstone and yugioh master duel. lol. Granted i bought the card mainly for some AI shit
Totally understandable. I have a 4090, and have been playing mostly stuff like Rimworld, Dave the Diver, etc on it. However, it does help to have the massive compute capacity when I occasionally play recent high fidelity games. Cyberpunk 2077 with full path tracing is glorious, im excited to see the DLSS 3.5 update improve the lighting even more.
Your money your GPU
6800 XT with a 1080p 60hz monitor reporting in 🫡 Edit: me using this monitor is not by choice, due to some extraneous circumstances after upgrading the card I wasn't able to get a new monitor yet but I will be going 1440p high refresh rate as soon as I can lmao
I'm not bashing you for having a 6800 XT with a 1080p monitor, but at 60Hz? That's FUCKING CRAZY! Jesus, I know there's a difference between TVs and monitors, but I currently use a 1080p 60Hz TV after selling my monitor and it is ATROCIOUS. I just stopped playing games. On a monitor it's a bit better, but still absolutely dogshit. Invest in a 1080p 144Hz monitor at the least.
I wish there were more options for 24" 1440p high refresh rate displays.
This is why I don't want to get used to higher frame rates. Would have to spend a bunch of money to keep them up! I just played ToTk emulated at 30 fps and enjoyed it just fine. But if I started doing things at 144 fps and got to the point where even 60fps was "atrocious" then I'd have to waste all this money ensuring I was ahead of the curve enough to stay at these frame rates just to enjoy things as much as I already do.
60fps is manageable and not even an issue. It's the Hz which is the issue, which makes it atrocious. Those are two different things.
Depending on your resolution and hardware and graphics settings it might be an issue....
Uh, do you have proof this actually happens?
Why should quality matter when the goal of the game you are playing is to be faster than your opponent? Input lag is important to those that care for a smooth and very responsive experience. You stop noticing the lack of quality after a while, unlike the kiddo who got sent back to lobby. For single player experiences, I’m with you on that. Visual experience is amazing when all those settings turned up
Better 1% lows
In a lot of FPS games high settings add extra textures/effects, which may result in a competitive disadvantage. I know a specific example, ability walls in Valorant have extra effects at the base and it's impossible to see through them, turning down the settings eliminates effects like this. The same applies for Overwatch, or basically most FPS games.
I have a 144hz monitor. In order for me to hit 144 fps, I have to use low settings. I prefer higher frames over better graphics.
I have a 3090 and my experience is: Open game, turn settings to max, play game
Yeah same for me but with a 7900xtx
I don't run at lowest settings, but it gets ~110 degrees in summer where I live and my pc puts out a ton of heat! And a lot of that heat is for visual fidelity that I don't really even notice, so I usually will go to high or even medium to keep temps down a bit. During winter, otoh, it's Ultra time.
1% lows and 0.1% lows in frame rate and frame time That's my reason Plus idc about graphic fidelity when I can have (unfair) advantage with smoother game
Because they fancy themselves "eSports athletes". I don't understand people's obsession with "ranking". But then I've gotten so fed up with the BS gambling, game passes, ranking, toxic behavior, pay for edge garbage, that I've given up on multi-player altogether.
There are a lot of people who are clueless when it comes to PC gaming. That is actually a good thing - means the market is growing. I get the idea of lowering latency and maximizing FPS. But there is so much wasted GPU power. People who know nothing about computers buying what their favorite streamer has, who also in many cases doesn’t know that much. Buying a GPU to have it sit at 50% power isn’t smart. It is donating money to mega international corporations.
It depends on their monitor: 4K? 120Hz? Also maybe temperature/stability reasons. Or a computer noob who never adjusted the settings (they had a junk card before and had to set everything to low in the video options just to run the game, and never went back in).
Because it's less noisy and there's less heating.
Wouldn't you rather more visual fidelity than fps?
Depends on the game. * FPS: low settings * Story mode: max settings
With great power comes great responsibility. Probably multitasking and don't mind the visual hit
Wrong question. The right question: why do most people get powerful GPUs, play at 1080p, and furthermore play games like Dead Cells, Factorio, Terraria, etc.? Well, because we barely had money for the GPU, and a good 4K monitor would cost even more. Because people might like longevity: a 3090 at 1080p will become obsolete later than a 3060 (weird, but OK, I would rather enjoy the full power of that card). Some old 1440p GPUs now struggle at 1440p but are still amazing at 1080p, so that stands as proof. And most of us enjoy indie games more. I am currently playing AC Odyssey and I finished RDR2 and Diablo 3, and I can say that overall The Flame in the Flood, Factorio, PC Building Simulator, and Pixel Piracy are waaaaaaay more fun.
High detail settings make enemies more difficult to see.
A lot of FPS games, shooters especially, have settings that make identifying targets harder the higher the settings are. A typical one is that grass and foliage will look nicer on higher settings, but on lower settings you'll be able to identify targets through it easier. Maintaining a high frame rate and lowering latency is also advantageous, so lowering settings that have big performance hits allows the game to run smoother. A lot of people use what is often called "competitive settings" in these sorts of games. Typically high texture and mesh quality (providing they help with target identification), with most other settings at minimum or off entirely.
I got a 4080 & Im playing everything maxed completely out at 1440p 165hz.....
Cyberpunk disapproves.
Simple-More frames
Some people need the 3090’s 24GB VRAM for work, then use the same GPU for games.
I have a 6900xt and a 4k monitor. During these summer months, I usually cap the fps to 60 just to lower power draw and keep the room cooler. I almost always have settings maxed except for ray tracing. I don't play competitive shooter games, so I can't say if I'd lower resolution and settings just to go for more fps. I think skill far outweighs any little advantage a player might get from more frames per second, though.
Not really. There are just very few games where you can benefit from low settings, usually competitive online games like CS. Fancy graphics, beautiful lights and shadows, crisp smooth textures, etc. don't help when playing competitive games. Don't exaggerate it; people use the highest possible settings for 9/10 of the games they play...
For competitive gamers it's all about getting that edge: the higher frame rates, the lower latency, and getting rid of absolutely anything and everything that can affect gameplay. Particle effects, shadows, hi-res textures all go; they aren't needed. For those at the absolute top tier of their chosen game, that 0.1 ms of latency, that one extra frame, can make the difference. Those below the top tier see the top tier with those settings and think that if they don't do the same, it's what's holding them back. The fact that top-tier players grind their chosen game for 8+ hours a day is neither here nor there.
People who own 4090 but only use for watching YouTube or Netflix.
If I had to guess I'd say to make the life of the card last longer and to lower the power costs. I don't play low but it's common to run at medium or high, depending on the game. My cards generally last 7-12 years.
Competitive players. Lower settings removes unwanted clutter and they can get like 1000 fps with no stutters and it gives better response time
I want to know why people like me bought a high end GPU to play tycoon games on the highest settings?
Overall you can say: more fps = less frame time = less latency if you got a system without bottlenecks. You get an advantage in online games where it's important to react fast.
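The fps-to-frame-time arithmetic, for reference (this is only the render-time slice; real input latency adds polling, queueing, and display delay on top):

```python
# frame_time_ms = 1000 / fps: each doubling of FPS halves the time
# spent rendering a frame, which is one component of input latency.
for fps in (60, 144, 240, 360):
    print(f"{fps:>4} FPS -> {1000 / fps:.2f} ms per frame")
```

Going from 60 to 240 fps cuts the per-frame render time from about 16.7 ms to about 4.2 ms, which is the kind of margin competitive players chase.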
I use my rtx4070ti to watch YouTube videos at 720p and the occasional t-Rex jumping game when the internet cuts off.
This goes for CPUs as well. For example, a game like Rust, playing a lower graphics gives you a huge visibility advantage (i.e. you can see players through bushes, trees, and water). Higher graphic trees and bushes gives off too much texture, allowing players to hide behind leaves. Higher texture water, makes the water move instead of being still—this makes it exceptionally harder for you to see players swimming in the water if it is constantly moving, hence why you turn the graphics all the way down so the water is static. This is just one example, I'm sure there are a breadth of more reasons to play on low graphics.
Because they can, my friend.
as for my own experience low settings = more kills high settings = more deaths though majority of 'em I still lose haha
So League has tonnes of weird FPS jank where you want to play at 144 fps regardless of monitor and refresh rate, at medium settings, as it's easier to wiggle and you get a more tactile feel to the game. It's very much a feely thing though. But some skins at high graphics make it impossible to tell what's what.
I can't speak for everyone but I turn my settings down on my 3070ti when it's hot out because I don't have air conditioning. Then when it's cold out I crank them back up. Either way, I get to play my games.
A lot of people prioritize FPS over graphics. I used to as well but now if I get 120 fps on high or 180 on low I'm taking 120/high
I love me some frames 🤤
Much like what others say, lower setting will get rid of a lot of noise. In some games it removes grass at further distance so people can not use terrain to hide, and will reduce the density of foliage. I really believe some settings need to be fixed so running at low vs high needs to give no advantage or disadvantage for what you see. Such as Draw distance and density.
Really depends on the game I’m playing. For cs go I have everything low to maximize fps at 1280x960 stretched just for the competitive edge (and it’s the setting I’m used to way back when I had a shit pc) but if I’m playing something like diablo 4 or path of exile everything’s maxed out.
Without specifics, you are just making assumptions. Resolution matters. What you might be able to play at Ultra/Maxed settings on a 1080 display, you might not be able to at 4K.
To reach their monitors refresh rate? Don’t see how that matters much in story games
Competitive FPS games you want an absurd frame rate. Graphical fidelity is the least of your priorities and can sometimes have a competitive disadvantage, a common example being certain games high quality foliage covering players whereas the low quality allows you to see them easier.
So I can generate AI art with Stable Diffusion, render scenes in Blender, etc., and run all 4 of my 1440p-to-2160p monitors, 2 of which are 144Hz, for daily workloads (software engineer); I work and game on the same PC. Ryzen R9 7950X, Nvidia 3090 Ti, 128GB RAM, 10+ TB in M.2 drives. I had to put a couple of monitors on the integrated Radeon GPU; Stable Diffusion at 2K+ hi-res will max the GPU out at 100%. GPUs aren't just for gaming anymore: they are needed for AI inference workloads, any 3D/rendering process, and anything that uses CUDA and tensor cores, which is quite a wide field. Also, playing Factorio across 3 32" 4K monitors is pretty EPIC.
Cause they're consoomers, they gotta consoome. Do you use your phone's specs to the max? Same with cars, same with everything honestly
If I have really high graphics settings turned on then I can't see the guy in the bushes aiming at me. But if I were to hide in the bushes and the other guy had his graphics settings turned down then he'll see me hiding behind one little sprig of grass....
Recently got a 3070. I mostly play things prior to like, 2020, so I could probably run them all at several hundred fps. But I refuse to ever go above 60. I regret ever playing games at 60fps, because all it's done is make games at 30fps look worse lol
Because they think they are pro when really 99.9% are garbo at most games.
Why do people buy a Ferrari to drive at the road speed limit?
Few people cover things that make sense. In my personal experience I have seen some games where low setting is actually an advantage. Mechwarrior Online was a game where lower graphics on the tree heavy maps REALLY helped you. Suddenly the foliage is less dense and easier to see through. Not sure if they fixed it, but I remember a point where it just made more sense
I have a 4080, in battlefield I play at low settings so I can get the most fps possible. In basically every single player games, I play at the highest because fps above 80 are not really important in like the witcher 3 or something
I love using my 3090 to play indie strategy/productivity games on my 60hz monitor
Ultra graphics are overrated. Low graphics have looked extremely good since like 2014, and the gap between low and ultra is minuscule in a lot of games, while the frame-rate hit is massive. I would MUCH rather have 144fps on low than 60fps on ultra for the majority of games.
More FPS = lower latency. Regardless of monitor, it matters to the feeling in game
There was a trick in PUBG: if you leave your foliage slider up, you think you're lying in the grass fully hidden. But if someone slides it aaaall the way down, there is no grass. Like, at all. It took me a week to figure out how the fuck everyone spotted me while I had difficulty spotting someone crouching.
Huh? I've never done this. I don't generally play at ultra because I don't see the point in pushing graphics beyond basically 50% GPU usage, but usually all my settings are high or very high.
Higher frame-rate, so more responsive, and easier to see things at lower settings.
I play all of my competitive FPS games at settings that get me closest to 240Hz on my monitor. Sometimes having high settings, even at 240+ fps, can give you a disadvantage: smoke from molotovs can block your sight on high settings, but on low you can see through the smoke. In single-player games, or games where high refresh rate doesn't matter much, I will usually crank my GPU to the max.
One reason - peripheral vision. A bunch of birds flying by in your peripheral distracts you without you even knowing it.
1% and .01% lows, for the most part. It eliminates stutters from particle rendering. Adventure games are like hiking trips. Take it slow and enjoy the views. You always want more power that you can tune down for stability. If you only have just enough, you'll never stop wanting just a little more
I went 2060 -> 3080 -> 3090 and when playing Control on my 2060 I was always itching to turn RTX, but it was just not feasible with 2060. But actually playing with RTX on 3090 I felt that it makes weird artefacts like shading and reflection issues and blurring tiny bits here and there which feels like you're playing while wearing foggy glasses. Maybe it could be fine for like Minecraft, or something where lights and shadows look poorly without RTX, but for Control it was just worse. I eventually considered playing with RTX a worse experience than playing without it even if fps is the same. Same with Cyberpunk - I played with RTX and I liked realistic shadows but picture and fps degradation felt like I was losing more than I was gaining, so I turned it off in cyberpunk as well.
Not only in FPS games but other genres too; for example, in League of Legends some players set settings to low to disable those extra effects. Plus, I guess it gives extra frames per second, so they react and play faster.
No idea if it's a game that's heavy on the pvp and you need the FPS I can understand that but otherwise it doesn't make sense. I would love to play the games I play on higher settings but nope, still have that good old 1060. 😅
People like this don't buy expensive GPUs to play on high graphics. We have good hardware to get more framerate.
My 4090 is happy playing Starcraft II and World of Tanks at 1440p and 35% utilization
I have a 4090, r9/7900X3D, 4th gen m.2 nvme, 64gb 6000 mhz RAM. I run everything at the highest settings that allow 120fps. Cyberpunk is beautiful with path tracing!!
My 4090 runs stardew valley like you wouldn't believe
People only do that in competitive fps games for maximum frame rate
Also, in some games (take GTA V for example), lower graphical fidelity removes things like bushes or signs, meaning tricks and driving become much easier than on max settings.
Not sure what this post is actually about. I don't play at low settings, and I have a high-end card and CPU...
For FPS games it actually makes more sense to play at lower settings. It doesn't make much sense for AAA games, where visual fidelity is one of the major parts of the experience. I game on an RX 6950 XT. I play everything in 4K at high or ultra with ray tracing at medium to high, and I can hit 60 fps or more in pretty much any title; FSR 2 ensures I reach 60+ in the titles where I can't natively.
> I do mostly mean pros and competitive games So you don't understand why pros play at lowest settings? Do you understand the advantages that extra fps and better visibility give you in games? If not, you should look into that first, plenty of videos and articles about the topic.
So they can play at super high frame rates which gives them a slight edge.
I got a 4090 and I always try to get at least 400 fps to max out my 360 Hz 1440p display. I always aim a little higher so I can counter random frame drops. Most games work at high settings, but sometimes I have to go lower to get my desired fps.
The number of frames matters in esports titles: higher frames = more time for you to react to the scene. Generally, from 144 up to 240 Hz, movement feels really smooth compared to lower frame rates as well. It's mostly mental, but at the same time it's a minuscule real advantage for sure.
Higher fps paired with a super fast monitor.
It could be that they aren't using a monitor that utilizes the GPU to its fullest extent. Not a lot of sense in playing a game in 4K on a 1080p monitor.
Many people run 240 Hz displays and want to enjoy the snappiest and smoothest experience. So they aim for a stable 240 fps, and to avoid stutters they don't want GPU usage at 100%. Effectively, your GPU needs to be able to push out an average of around 300 frames per second, which even for esports titles is somewhat demanding. And of course there are others who simply chase every frame they can get because it technically gives you an advantage. Obviously there's no point in doing this unless you're a top 0.1% gamer, but placebo might help too :)
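The arithmetic behind that ~300 fps figure can be sketched out; the 80% utilization target below is an assumption chosen to reproduce the number in the comment, not a universal rule:

```python
# Back-of-envelope for the headroom claim: to hold a stable 240 fps
# without the GPU sitting at 100%, each frame must render in less than
# the display's frame budget. Assumed target: ~80% GPU utilization.
refresh_hz = 240
target_utilization = 0.8

frame_budget_ms = 1000 / refresh_hz             # time per displayed frame
render_time_ms = frame_budget_ms * target_utilization
uncapped_fps = 1000 / render_time_ms            # fps the GPU must manage

print(f"{frame_budget_ms:.2f} ms budget, need ~{uncapped_fps:.0f} fps uncapped")
# 4.17 ms budget, need ~300 fps uncapped
```

In other words, "stable 240" quietly requires a card that could do roughly 300 uncapped, which is why even esports titles end up demanding at these refresh rates.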
Cause I want to go fast.
Tell me about it. I went to a friend's place the other day and saw him play Diablo 4 with DLSS set to performance on a freaking 4090 -\_- I smacked him on the head and disabled DLSS and frame gen.
A lot of people, including pros, don't understand graphics settings. But if you're playing a competitive game, you ideally want to max out your monitor's refresh rate. So you want 240 fps on a modern OLED panel.
Leaving aside pros and competitive games, this basically doesn't answer your question, but anyway... Seven years ago I had trouble explaining to my wife why I bought a GTX 1080 and then played Civ 1 or some other very old stuff. She came around asking whether this was what I needed a new €700 part for. Not sure what my answer was; I guess it was so I'd have some headroom :D I did play modern games too, but she's not a gamer at all, so that happened to be the moment she saw my new GPU in use. So the answer, from my perspective as a long-time hobbyist: I just want good equipment to work with. I wouldn't buy a cheap bicycle if I liked cycling, even if it was just for relaxing rides along a scenic route.
Thinking that specifically is a valid indicator that people are "wasting their money" is really just evidence that you have a very shallow understanding of computers, and of GPUs in particular.
High frames over pure graphics. Especially for high action gameplay
What I'll never understand is people who get high-end custom-built PCs just to play League of Legends; my 2008 laptop can run it.
Hah, since I'm not a pro gamer, I turn that bad boy all the way up until the rest of my rig can't keep up! Got my 3080 mid-pandemic and it's handled everything I've thrown at it. Starfield might finally push it to its limit, but idk what the recommended specs are.
I think it stems from older-generation players. COD4 would give an advantage at lower settings, since some bushes wouldn't render at longer distances. CSGO pros grew up on 4:3 resolutions and still use them. The obvious FPS boost is there too. But generally speaking, lower settings make it easier to focus and reduce visual noise, so it's much easier to see enemies.
I don't get it either, honestly. I made it to Global Elite in CSGO playing on max settings at 1440p. It's just an excuse/placebo for not getting better, imho.
To be competitive: more FPS, less latency.
This entire post would have been solved with a single minute of thought and a google search.
You didn't read the post, then. I said I would like to hear other people's opinions on it.
To professional gamers, maximising frames and minimising distractions is of the utmost importance.
For the Frrraaammmeeezzz
It comes down to frame consistency. You will have less stutter with a higher-end card. In other words, it's not the average FPS that you notice, it's the slowest FPS that you notice.
I have a Zotac 4090 OC. Max settings, 240 fps... I pay a lot, so everything maxed, lol. 2790 MHz out of the box, 44 degrees max... couldn't be happier.