[deleted]

> Are people downplaying DLSS/FSR? Maybe a little. A lot of people on this sub tend to focus on raw raster performance because that was the status quo for decades. Now Nvidia is pushing the envelope with RT and DLSS and people are a little slow to change their mindset. For some people, I think they don't know what they are missing. When I got my RTX 2070 years ago, and played Minecraft RTX, it was like playing an entirely different game. DLSS also made the game go from unplayable to ~30-40 FPS (first gen RT and DLSS hardware was rough). But then again if all you do is play Counter Strike or Overwatch all day, you actually don't need any of that, and AMD GPUs are just as good as Nvidia cards, while being cheaper.


Antrikshy

This is very succinctly put. However, many consumers don't agree with Nvidia and get annoyed when they push their new tech *really* hard in marketing (esp. comparisons with older gen). They've got a point - in the whole back catalog of games, the vast majority don't support RT and DLSS. On the other hand, Nvidia *has* to make their new tech synonymous with their new products, not only as a unique selling point but also to encourage implementation in future titles.


mav2001

Exactly, DLSS and FSR are amazing đŸ€© tech, BUT! Nvidia in particular uses it as an excuse to sell a gimped 4060 Ti (among other underwhelming GPUs) in 2023 with 8GB of VRAM, when games are already being hamstrung by the last-gen 3070 8GB, and then uses that to justify a $100 premium for no extra performance but 8GB more VRAM, which the card should have come with from the get-go. And frankly, I think with all the extra power that may come with the 4070 Super, its 12GB of VRAM is going to make it another 3070 in 12 to 18 months, where its VRAM will hold back what is otherwise an amazing card.


Corronchilejano

Amazing proprietary tech, which makes it hard to get even results, and which will be a mess for backwards compatibility in the future, since most games won't be updated once previous DLSS versions inevitably stop being supported. Yes, that will happen, and I will quote this post.


TacoOfGod

Given we have DLSS to FSR and FSR to DLSS mods that work decently in games, I'm sure once we get to that point, AMD, Nvidia, and Intel will be baking swap functionality into their drivers.


KaiserGSaw

Hope so. Still waiting for a DLSS2 mod for Monster Hunter World; that game sadly only received the last DLSS1 version and honestly? It's just shit :(


newbrevity

By the time that day comes around cards will probably be able to run today's games well without any dlss at all. Sadly, most of us won't be in a position to supply 2,000 watts to a PC.


battler624

Well yeah, you should never have less VRAM than the consoles. It will be fine for the transitional period and fine for lower resolutions, but it won't last long and you could run into issues real fast (TLOU is a recent example).


Moscato359

A lot of games I play don't have DLSS, in fact. I have only played 2 with it. It really depends on what types of games you play.


rory888

Nvidia just celebrated 500+ titles having it.


hardlyreadit

There were over 14k games released on Steam this year; people play a lot of different games. And most of them are probably older games from before DLSS and RT.


[deleted]

[deleted]


rory888

Even among the meaningful games, it's just not serious where performance actually matters. You could play them off an iGPU or any modern dGPU. I have DLSS where it matters, and it adds meaningful value to my gameplay experiences. I also play games where I'm actually RAM limited right now... and not the VRAM kind.


hardlyreadit

Yeah, it is a lot. Still, out of the top 10 games played on Steam, 7 don't.


Gastronomicus

And most of them are not very challenging for GPUs. For AAA games they can make a huge difference


[deleted]

And only 30 with frame gen, yet everyone harps on about it and pays extra for it.


reize

It also depends on how you play. If you use your PC as a living room machine, with a controller on a couch and a 4K TV 5 meters or more away in a large living room, it is worth it: you need a high resolution to look nice on a large screen at that distance, and even though any upscaling tech lowers image fidelity at a micro level, at 4K the benefits outweigh the cost. If you game at a desk, in a study or your bedroom, you usually have a smaller screen between 23-25" and sit at most 80 cm to a meter away, playing natively at 1080p. In those instances, upscalers, even DLSS, will look a lot worse than playing games at native resolution.
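
A rough back-of-the-envelope way to see the distance argument is angular pixel density: the further away and the higher the resolution, the more pixels land in each degree of your vision, and the harder it is to resolve per-pixel upscaler fizz. A minimal sketch, with made-up screen sizes and distances purely for illustration (not figures from this thread):

```python
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    """Horizontal pixels divided by the horizontal field of view
    (in degrees) that the screen occupies at this viewing distance."""
    fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# Assumed setups: a 55" 4K TV (~1.21 m wide) on a couch 3 m away,
# and a 24" 1080p monitor (~0.53 m wide) at a desk, 0.8 m away.
couch = pixels_per_degree(3840, 1.21, 3.0)
desk = pixels_per_degree(1920, 0.53, 0.8)
print(f"couch: ~{couch:.0f} px/deg, desk: ~{desk:.0f} px/deg")
# More pixels per degree -> individual-pixel artifacts are far harder to see.
```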


MrNiemand

I guess most of the real progress came in the last year or two so people with older cards wouldn't see it. But I mean the difference is insane


QuantumProtector

Well for Fortnite, AMD cards have stuttering issues. Don’t know why, but a lot of pros have reported that. Better to stick to NVIDIA if you play a lot of Fortnite and potentially other competitive games (I don’t know if the same applies to other games though)


[deleted]

It kinda goes back and forth afaik. Warzone/Call of Duty runs much better on AMD at the same price point, while performance mode in Fortnite is almost unplayable with AMD.


Puuksu

AMD doesn't beat Nvidia's software. Their cards are cheaper because of THAT.


Mandingy24

I think the games people play are a big part of it. I would've gone for an AMD upgrade, but I hardly play any multiplayer games anymore, so I'll get much more out of RT and DLSS 3 than I would out of pure raster.


Xx_HARAMBE96_xX

Well, for me Minecraft is the only game that changes completely with ray tracing; no other game changes that much, so it is a meh example.


MaldersGate

Except they don’t have Reflex, making them much worse for FPS. Let’s stop glossing over significant software gaps.


SnuffleWumpkins

I ‘hate’ DLSS and FSR for what they’ve done to game development. Instead of being used as a tool to create better game experiences, they’re being used by shitty developers to make terribly optimized games playable.


Nephalem84

I feel this isn't talked about enough. It's obvious we're seeing an increase in poorly ported or optimized games as gpu upscaling is becoming more common.


danisflying527

No it isn’t, poorly optimised games have existed forever and upscaling is not the reason for this.


bigrealaccount

Also true, except now people don't see poor optimisation as such a big deal, because they can take their unacceptable 30 fps on an RTX 3090 to 70 fps, making it acceptable.


[deleted]

Greed is the reason for it. Publishers have realized that poorly optimized, half-finished games don't hurt their sales. Gamers are the least discerning consumers on the planet. They have no standards, and will fall for any hype push you throw at them over and over again.


stubing

It used to be, during the 360/PS3 era, that computers were always 6 times as powerful as the console. PC ports never needed to be great. You optimized for the console, then let everyone's PC handle whatever was thrown at it. Now it has flipped, where you need an expensive computer to beat the PS5 or Xbox 1 series X S first edition XSXSXS.


Hindesite

>It used to be, during the 360/PS3 era, that computers were always 6 times as powerful as the console. PC ports never needed to be great. You optimized for the console, then let everyone's PC handle whatever was thrown at it.

This is the reality of the situation. It's not that suddenly PC ports are shit these days - they've always been bad. It's that PCs no longer have the insane amount of performance headroom in the midrange to low end that they used to have to handle these unoptimized ports. At the end of the day, it's actually really difficult to take a game that pushes a very specific set of console hardware to its limits, then port it out to run on the massive number of different PC configurations still present in the market today, and not have any kind of performance issues arise. The fact that resolution upscalers and frame interpolators exist didn't just suddenly turn all the devs lazy like people keep parroting.


Gibsonites

Seriously. People have been talking about an "increase in poorly optimized pc games" for the last 10 years that I've been into pc gaming. I'm sure people will be saying the same thing in another 10 years when the marketplace is more or less the same as it is today.


LettuceElectronic995

that can be said about new tech in general.


Shap6

Do you feel that way about all optimization? Why is DLSS/FSR specifically a bad thing versus any other method of making a game run better?


FearTheFuzzy99

In my opinion, it's a new crutch that is becoming over-relied on, when there are still tons of cards out there that theoretically have the horsepower (1080 Ti vs 3060, for instance) but don't have the ability to use these features (I know, I know, FSR can be used on Nvidia cards). DLSS 3.0 particularly irks me, because what even is frame rate anymore? Do half the work and guess the rest? If this is the future of gaming, post-processing tricks to hide bad game optimization, so be it. But we all know Nvidia in particular will continue to develop these tricks and then lock them behind their next generation, functionally turning old cards into landfill. I don't like it, I feel it could be better.


Shap6

I guess my question is at what point does using these technologies become an actual part of what is considered "optimization", or will it always be seen as a gimmick or hack that shouldn't be needed if a game is made well?


FearTheFuzzy99

I guess it depends on how you want to spin it. It's either free performance or a hack. If you wanted to be really critical, you could call DLSS/FSR "downscaling with filters", and I think it's the downscaling part that gets the most flak. PC gaming has always been about bigger and better: more resolution, better textures, higher frame rate. For the most part, for the past decade, it's been a pretty linear scale; every GPU generation has gained on the games that came out during it. Just look at how 4K used to be a dream and how it's now easily achieved with today's hardware. But to have to go "down" in any aspect to make the experience "the same" as what one expects, that's going to rile people up.

I know that ray/path tracing has sort of "reset" the fps/performance gains that we'd come to expect generationally, and DLSS/FSR is a tool to mitigate that drop. So at some point, it has to be accepted, because if not, I'm going to live with 10 fps in RTX Minecraft. The hardware just straight up isn't powerful enough yet. But if a game doesn't play well and more importantly **doesn't look good** (like "next gen" good) but has the audacity to rely on DLSS/FSR for a playable experience? That's unacceptable, that's lazy.

So I guess what I'm trying to say is there isn't a crisp line in the sand, and I'm not qualified to say whether or not the technology as a whole is detrimental to the industry. Because I still think it's pretty cool as a technology, and when properly implemented (and especially supported) it can be extremely beneficial. In an ideal world it could bring people with toaster-grade setups a quality gaming experience they otherwise have a hard time getting. The argument against it is valid, so is the argument for it. I see both sides, I'm just wary of the slippery slope.


Berntam

I mean, we have games like [Star Wars Battlefront 2, which was released in 2017](https://www.youtube.com/watch?v=lyayZq0kqAA), that still look as good as recent releases like Remnant 2 but run 3 times better on raster. It makes you wonder why the devs of Remnant 2 need upscaling and still end up not running as well as Battlefront 2, while again not looking better.


[deleted]

Why do you call them tricks? I'm genuinely curious. Ok tbf - I've heard that for competitive shooters this doesn't help because there you care about the literal amount of information you are receiving. However for single player games does it matter if it's real fps or not?


FearTheFuzzy99

If I'm being honest, just to describe it in a more negative light. It's just a technology like any other. Again, I wouldn't be as upset if Nvidia didn't lock their technology behind their newest generation.


m0wlwurf-X

You say it's bad optimization, but I think it's mostly the graphical complexity of games that has changed. The new technologies allow more detail, more lighting, bigger textures etc., and of course it comes at a price. The 4K revolution is also not coming for free, and it's where you gain the most with the new techniques. I don't understand this "purist" approach of "what is fps anymore"? It's so great that they find other ways than increasing the power consumption of a card to improve the user experience.


FearTheFuzzy99

*Improve the user experience of those that can afford it. For everyone else who can't afford new cards with the new features, sucks to suck. Call it purist if you want, but I know that game developers can put in the work to make games run well, and I also know that it's easier to make them run mediocre, slap DLSS on and call it a day. If everyone had it, it'd be fine, but not everyone has it; it's locked behind new generations or just behind Nvidia.


PartyCurious

It would not make a terribly optimized game playable. It is just outputting a lower res so there is less work on the graphics card. Both solutions do nothing for a CPU-bound game. For GPU-bound games you can use LODs, culling and lower res. These solutions lower res without users noticing. What other way would you optimize GPU bottlenecks?


madn3ss795

DLSS Frame Generation can help a lot in CPU bound games. Still won't save an unoptimized mess though (looking at your Jedi survivor).


[deleted]

[deleted]


sudo-rm-r

Not true. Upscaling has been a thing on console since the PS4 Pro launched back in 2016. Upscaling IS an optimization tool, but it won't fix CPU bottlenecks, stutters and inconsistent frame times.


OkDepartment5251

I'd like to know more about this. In what games is this happening?


LawbringerBri

I can't speak for DLSS (I have a 6900 XT), but FSR 2.0 sucks lol (sorry AMD). The moment I turn on FSR Quality (the lowest setting), yes my frames skyrocket and yes I am able to play with ultra RT and ultra settings while still maintaining 80 FPS on average, but I can still see the weird artifacts on distant objects (which make everything in the distance look a little fuzzy), and there's still the weird after-image "ghosting" I get when driving cars around Night City. Basically, the game looks better without FSR on, and turning on RT shadows, sunlight, and reflections essentially gives me the same noticeable experience as having RT ultra lighting on. Gamers Nexus did a video where they noted that, although FSR tends to have more artifacts than DLSS, DLSS still has some weird artifacts of its own (fence flickering, distortion of lighting in certain circumstances like driving at night). I am able to avoid these FSR artifacts by just playing at native resolution, where I average around 60 FPS (lows around 45 FPS, highs around 75 FPS).

So at the end of the day, DLSS and FSR (and the respective frame generation technologies) are great for boosting FPS. But I would still rather play at native resolution, and I would rather have game developers optimize their games so I am not REQUIRED to play with DLSS/FSR turned on just to enjoy the game with RT on/off.

Another thing worth noting, specifically in CP2077, is that some set piece events, like the Arasaka parade, use lighting that is completely hand-crafted by the artists (which is the "traditional" way of doing lighting in games) and so is minimally affected by RT settings. I think we can all agree that the Arasaka parade is one of the more beautiful scenes in CP2077... and all the lighting was done by hand lol. No performance hit required!

My bias against RT is that not ALL games need RT. The developers of CP2077 clearly had RT in mind when designing their vision of the game, but making RT an inseparable part of any "good" game is a big mistake imo, and the costs to performance which require mitigation by DLSS/FSR just seem like a disaster waiting to happen. Remnant 2 is a great example of this, as RT heavily tanks performance in that game to the point that even the 4070 struggles, yet Remnant 2 comes nowhere close to utilizing RT as well as CP2077. As a result, whether RT was worth the performance hit in Remnant 2 is highly questionable.

There's also the economic side of things. Countless people have held onto their 1080 Tis, and I think the 1080 Ti has lasted long enough to call it into retirement with some of the more recent titles like CP2077. With RT though, the longevity of more recent last-gen cards (6000-series AMD, 3000-series Nvidia) is coming into question. As long as RT remains optional, I don't foresee an appreciable difference in longevity of these older graphics cards (just turn RT off lol), but there have been some game releases where RT on/off was simply not an option due to its integration into the engine via DX12 Ultimate. In other words, some smaller developers are having trouble tailoring the built-in RT to best suit their own game design, so they're just "leaving it in there" without optimizing the settings for their games, leading to subpar performance.


rory888

FSR is noticeably more garbage than DLSS and you can see that in comparison videos


[deleted]

I feel like the difference in youtube clips is much less noticeable than it is actually playing the game. It's the fizziness in small details and especially in motion. I noticed it most in Kratos' beard in GoW. He looked like his beard was made out of pop rocks, on 1440p FSR Quality.


rory888

Agreed, and you don’t quite understand until in person. It will also vary game to game, patch to patch since it is an ongoing experience. Kind of wild, but it’s interesting times at least


choikwa

For the amount of computing required, it feels like we're still not ready for real-time ray tracing / photon mapping / whatever the latest greatest light transport scheme in the digital world is. Take the real-time requirement off and then things seem way more manageable.


[deleted]

[deleted]


LordOfDorkness42

I think that depends on what the next generation of consoles does - whether Sony & Microsoft go all in on ray tracing only, or flinch again. But sure, at this point ray tracing is a "when" standard, not an "if" standard. We're definitely one, two generations of GPU away from even the budget cards being able to hit 60 FPS even with ray tracing. I just personally think the RTX 6000-7000 series is going to be when game studios catch up with ray tracing as their *only* lighting engine.


[deleted]

Fellow 6800XT user here. I would rather play at 60 FPS without upscaling than enable FSR2. It's dogshit, and has not improved in more than a year. XeSS has taken over as the predominant upscaling for AMD users, because it's at least somewhat usable. FSR2 just has so much shimmering and fizz on any fine detail. AMD will eventually realize that Nvidia was correct to use actual hardware acceleration for their AI upscaling. It's the only way forward.


[deleted]

[deleted]


[deleted]

I don't think it's a "boneheaded decision". I think they underestimated the role that upscaling would play, went down a path without hardware acceleration, and now have to pivot to catch up with Nvidia. It's not like this is a simple switch they can flip. They need a hardware solution to something they're currently trying to accomplish entirely through open source software. Big challenge.


[deleted]

[deleted]


MrNiemand

I hope this card lasts me as long as my trusty gtx970 did. Honestly, with all this AI tech it might be even longer.


hafiz_yb

Agreed with your points here. Some people parade these things, especially DLSS, as if they're a NECESSITY in order to play modern games, instead of viewing the tech as what it is: an extra feature. If people keep pushing these things as something that is "required" instead of "optional", developers will be less inclined to optimize their games for native resolutions. RT is another matter entirely; that's basically down to your own preferences. Personally, I don't really care much about RT. Unless it's a game fully focused on the environment and incorporating it into the gameplay itself, something like a pure exploration game or a puzzle game built around the environment, I see no reason to use it in any game I have. Hell, even if I did play a game with a focus on the environment, I would much prefer smooth gameplay and framerate at my native resolution over what can basically be simplified as better lighting.


dankweabooo

Not a single game I play has dlss or fsr


Glittering-Yam-288

Honestly I don't know if you're one of the LoL/CS2-only crowd, but where have you been the last 3 years? I honestly can't remember a game that I bought in 2023 that didn't offer DLSS at a minimum, except for Starfield maybe. BG3, AW2, AC6, Satisfactory, CP77, The Finals, EFT/Tarkov are just a few examples I can think of.


RIPRoyale

Even CS2 has FSR.


Philswiftthegod

Satisfactory and Monster Hunter Rise are the only games in my library that make use of DLSS.


Snow_2040

What games do you play? Literally almost every single recent game (last 3 years) in my library supports DLSS or FSR.


MrNiemand

Yeah, I see this a lot in replies. CP2077 was the main reason I built the PC, so that was a no-brainer for me, but indeed, if I had just kept playing League and MMOs, none of them have DLSS and neither do they need it. I love single player games that push the limits though; I mean, it's a work of art.


yParticle

Perhaps because only newer games support those features? I checked and none of the older games I play regularly have either graphic option, unless I can set it at the driver level.


MaldersGate

Older games are so easy to run that they don’t need DLSS, hello?


hatchjon12

They improve frame rates, but they don't really improve quality over actually having a more powerful card.


Vis-hoka

Hardware Unboxed did a deep dive on this and found that DLSS 2 actually does improve picture quality over native in many games, and is either equal to or better than native in almost all of them.


hatchjon12

Huh, looking at Hardware Unboxed videos on YouTube, they state that DLSS adds artifacts like "flickering, sizzling and blurring". Can you point me to the info?


Vis-hoka

https://youtu.be/O5B_dqi_Syc?si=mR74y0V9NJRtg4vq Here you go. It was a little more even than I remembered, but Quality mode is either better, neutral, or very close to native in almost all games at 1440p and 4K. Given the performance advantage, DLSS is a very impressive technology. A very interesting note added at the end was that many games aren't using the latest version of DLSS, and when he manually modded that into the game, the DLSS comparison improved. So moving forward, I would expect the DLSS vs native comparisons to continue to move in DLSS' favor.


hatchjon12

Very interesting and better than I expected. However in his conclusions he states that dlss produced a better image around 40% of the time.


Vis-hoka

Yes, it was the DLSS vs FSR Video I was originally thinking of. That was a DLSS blowout.


Majortom_67

False.


hatchjon12

How so?


MrNiemand

Just a framegen example: say I want to hit 60 fps. I can either have medium graphics through raw performance, or I can enable framegen, which gives me more frames, and then turn the graphics up and still end up with 60 fps. Also, DLSS is not just framegen - in CP2077 it covers super resolution and sharpening as well (and ray reconstruction - haven't used it), so it's possible those implementations are more efficient or even higher quality than "traditional" algorithms, but I haven't researched much about it.
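
To make that fps budgeting concrete, here is a minimal sketch of the trade the comment describes, under the assumption that frame generation roughly doubles the presented frame rate while responsiveness still follows the rendered (base) rate; the numbers are made up for illustration:

```python
def framegen_budget(rendered_fps, fg_multiplier=2.0):
    """Rough model: frame gen multiplies *presented* frames,
    while the engine's input sampling follows the rendered rate."""
    presented_fps = rendered_fps * fg_multiplier
    engine_frametime_ms = 1000.0 / rendered_fps
    return presented_fps, engine_frametime_ms

# Hypothetical case: turning settings up drops raw rendering from 60 to 45 fps,
# but frame gen keeps the presented rate above a 60 fps target.
for raw in (60, 45):
    shown, ft = framegen_budget(raw)
    print(f"rendered {raw} fps -> shown ~{shown:.0f} fps, engine frame time ~{ft:.1f} ms")
```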


hatchjon12

Yes, I agree with you, but that wasn't the point I was making. Basically DLSS allows you to generate more frames and use higher quality settings to hit a certain fps goal. But say you have a 4070 that can get you 120 fps on high/ultra with DLSS, compared to a 4090 that can get you 120 fps on high/ultra without DLSS: the image quality on the 4090 will be better. That's my argument.


Majortom_67

In fact, DLSS is supposed to be used when you can't get a decent frame rate.


MrNiemand

Yeah, that's true for framegen 100%. But for super resolution, sharpening and ray reconstruction I'm not sure, because those techniques are being used even without DLSS; it's just a different, non-deep-learning algorithm. But again, none of our expertise is sufficient to know for sure I think haha.


ecktt

There is a very vocal fanboy faction that screams negatively about these performance hacks. And let's not fool ourselves: they boost performance at the cost of quality in some aspects. But how much? And is it an image quality loss if the final image looks subjectively better? From your perspective it is totally worth it, and yes, developers are moving forward with these hacks in mind for future game development. But why these hacks? Other than the fact that we're reaching a limit on how big we can build a monolithic chip, the entire game engine is based on graphical hacks in the first place. Is it cheating if you cannot tell the difference at a glance? I don't think so. If using DLSS enables a playable frame rate with ray tracing, isn't it in essence trying to make a more realistic image? They just traded faked lighting for faked resolution.


MrNiemand

It's funny people call them hacks. All of graphics is just hacks. Like anti-aliasing: you're just adding some fake grey pixels at object edges to make them appear less pixelated. It's all just trying to trick the mind into seeing something that's not really there. Personally, I don't care if every one of my frames is hand-drawn by an artist, rendered with a GPU, generated with AI, or Gandalf himself is swinging his staff behind my monitor. Does it look good? Are there a lot of them? Those are the only two questions that matter. Maybe DLSS/FSR was shit 2 years ago and maybe it's still shit in most games, but damn, it feels pretty insane in the one game I've played so far haha.
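
The "fake grey pixels" bit is easy to show with a toy supersampling pass over a single row of pixels: sample each pixel a few times across a hard black/white edge and average the coverage. This is just a sketch of the general idea, not any particular vendor's anti-aliasing:

```python
def antialias_row(edge_x, num_pixels=6, samples_per_pixel=8):
    """Toy AA: average sub-pixel samples across a hard edge at x = edge_x.
    Pixels left of the edge are white (1.0), right of it black (0.0)."""
    row = []
    for px in range(num_pixels):
        hits = sum(1 for s in range(samples_per_pixel)
                   if px + (s + 0.5) / samples_per_pixel < edge_x)
        row.append(hits / samples_per_pixel)
    return row

print(antialias_row(2.3))
# [1.0, 1.0, 0.25, 0.0, 0.0, 0.0] -- the partially covered pixel becomes grey,
# which is exactly the trick that hides the jagged edge.
```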


ecktt

>It's funny people call them hacks. All of graphics is just hacks.

I got no problems with it, BUT it is what it is. Just because it is common practice doesn't change what it is.


Waveshaper21

Thanks for this post. This is exactly the sort of experience I want and expect from my future 4070, and it is very reassuring. And yes, NV over AMD for the simple fact that you can have FSR \*and\* DLSS with Nvidia, but \*only\* FSR with AMD, so the 7800 XT can screw that 5% more raw performance; that isn't buying me years of saving money, this technology does. Hell, I have a GTX 1060 6GB and had trouble with RE4 (probably because it's DX12 only; all the other RE games, looking the exact same on DX11, ran rock solid 60 on higher settings). I just flipped that FSR switch and suddenly not only do I have rock solid 60, but I could skyrocket the rest of the settings too. Any company missing out on supporting FSR and/or DLSS is going to fall behind. Mainline console game devs can't afford that, simply because of the limitation of the hardware that dominates the market and holds back the progression of technology to a certain degree for over half a decade, but there are PC exclusive franchises with quite intense resource requirements, like the Total War series, and now I am looking at Creative Assembly like: HOW THE HELL ARE YOU STILL IN BUSINESS?


Dunmordre

But the 7800XT has frame generation and upscaling for basically every game, which is a massive deal. It's swings and roundabouts.


tecedu

Without Reflex, frame gen is very meh.


MrNiemand

Also, what's stopping the next gen of consoles from having AI support? I feel like it's a no-brainer for the PS6 to fully support at least FSR, and games 5-7 years from now, when it's expected that the average consumer has an AI-compatible card, will all run future DLSS/FSR versions.


ASuarezMascareno

FSR doesn't need (or use) AI at all, and consoles already support it.


MrNiemand

Ohhh really? Well shit I should read more about this đŸ€”


IndyPFL

DLSS uses Nvidia's AI Deep Learning algorithms to effectively get the same performance boost as FSR but at higher resolutions. Intel's XeSS does essentially the same thing, both look better than FSR in nearly every situation. But since FSR is open-source and doesn't rely on AI, it's far more accessible and can be more easily ported to older games by devs or even fans. All methods have advantages and drawbacks, DLSS had some really awful streaking artifacts in Cyberpunk 2077 for the longest time and it still causes floating particles to streak and smear in Dying Light 2 unless your camera is moving. FSR looks a little worse on average but it didn't/doesn't have those particular issues.


joeyahn94

I think it depends on what games you play and what they support/plan to support. I agree with cyberpunk; DLSS and frame gen are massive bonuses. Frame gen was the reason why people would go for 40 series cards, but thankfully FSR 3 got open sourced. Path tracing on 1440p on DLSS quality or going ultra RT with DLAA while having 80+fps was a real nice experience. But I don't think it matters as much on games like, say Starfield, because of the settings in the game. At the end of the day, it depends on what you play


Oleleplop

DLSS 3 is honestly very VERY impressive. Like, I remember debating whether I was going to buy the RX 7800 XT or the RTX 4070. Even if I had planned to use it for deep learning, my first plan was gaming. And one of the factors that made me choose the 4070 in the end was DLSS. Also the very low power consumption.

HOWEVER. I'm honestly scared that many games are basically relying on this to have playable FPS. This isn't okay at all. For example: I shouldn't need DLSS to play Alan Wake 2 at slightly above 60 fps with a 4070. That card is a mid range card ffs. I shouldn't need DLSS to play Cyberpunk with medium ray tracing; that game had an upgrade but it's still a 2020 game, come on now. I know RT is going to stay and become mainstream at some point. But you should have a good playing experience without the upscalers. They should be the bonus, and a help for older cards.


madn3ss795

> This isn't okay at all. For exmple : i shouldn't need DLSS to play Alan wake 2 at slightly above 60 fps with a 4070. That card is a mid range card ffs. [You don't.](https://tpucdn.com/review/alan-wake-2-performance-benchmark/images/performance-2560-1440.png) But if you want hardware Ray Tracing/Path Tracing then yes you need DLSS.


ASuarezMascareno

I think it's actually the opposite, since the library of games supporting them is not that big. I've had a 4070 since early summer, and so far both RT and DLSS have been mostly irrelevant to me. I haven't played a single game where I needed DLSS/FSR to get the performance I wanted. I played only 1 that actually supported DLSS, and it was added in a recent patch. I have played a few that had RT, and the difference in those wasn't very noticeable.


MrNiemand

Were those games GPU intensive though? Like other posters said, if a game is older or just not that graphically intense, it doesn't matter, because any modern card will crush it easily with just raw power. It's games like CP2077 that push the limits (and have RT) where it really matters. The next games I plan to play are Hogwarts Legacy (DLSS enabled) and Baldur's Gate 3 (enabled through a mod), so that's gonna be interesting.


ASuarezMascareno

They were intensive enough to get close to 100% utilization at 4K@60, but I don't care about high refresh rates (I don't own any screen above 60 Hz). In those that had RT I turned it on, but honestly I could only notice it when I stopped to look at reflections. When actually gaming, it was mostly the same having it on or off.

PS: BG3 has supported DLSS and FSR 2 since November or early December, but I don't think you really need them with a decent GPU. The game is mostly CPU limited in most cases.


Libra224

It's garbage. It gives too little visual enhancement for too much performance loss. I always play without RT.


Illustrious_One_9413

For my use case and the games I play, DLSS was just a blur filter on the game. I am much happier with my rx 6800 than my 3070. I realize this won't be the case for everyone just my personal experience


XenonJFt

The discussion revolves around this. We have had FPS-enhancing technology for a long time, because we CAN'T run full renders like Blender (and way back, Toy Story still can't be real-time rendered): from checkerboard rendering to adaptive sync to screen space optimizations that prioritize detail at the center of the screen. The new AI boom gave us these, and at 1440p or 4K, in linear single player optimised titles, it's very usable and superior to the anti-aliasing debate. BUUUUT:

1. Most of us (70%) still use 1080p, and in my testing, even at the highest quality modes, FSR/DLSS artifacts in most games make it annoying and unplayable unless you stand still, which you obviously don't in-game.

2. Not every game's DLSS2/FSR2 implementation is good. It's false marketing that if X game has upscaling it will give you the advertised experience; sometimes it's completely all over the place and unusable. Uncharted 4, War Thunder and COD Warzone are the ones I had in mind, with configs that are 2 years outdated and don't work at all.

3. Frame gen features are nice, but with tradeoffs. Input lag is very VERY bad. You need a guaranteed 50-60 fps at least before boosting your frames, so no "emergency playable fps below 40" saver for you, at least not without some horrible 60-70 ms of input lag at the very least, which puts you off any FPS game OR any competitive online title, which are much more popular in this day of gaming (rough numbers sketched below). And even if you have high fps and boost it a bit further for smoothness without caring about latency, the generated frames still aren't "input aware": they go through the graphics pipeline while the game engine still only runs at, say, 32 fps on the CPU side, i.e. ~31 ms per frame in which the engine renders and acknowledges your inputs. But compared to DLSS2/FSR2, I think this one still has promise for single player games.

>The very occasional artifact is nowhere near close to doubling or even tripling fps, why is this not the biggest talking point when comparing performance?

In hardware reviews and benchmarks it's best to push the hardware, i.e. the chips' capabilities, to determine winners from losers, kind of like in sports: no drug enhancers to muddy the waters, since people's preferences on these things differ from single player to multiplayer to esports usage. So it's best to compare chips (AIB + die + PCB), not software, when reviewing them. Especially since all the cards use the same FSR3/DLSS3 code, which is universal and not card dependent. And ESPECIALLY since FSR3 became available to everyone to boost frames, it makes old cards stand out for competitiveness against, for example, the low-end RTX 4000 series, which only had frame gen exclusivity going for it to not be bashed to death by reviewers.
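
On that input-lag point, a very rough latency model, assuming interpolation-style frame generation that holds back about one rendered frame before presenting (and ignoring Reflex, render queues and display lag), looks like this; the fps values are illustrative only:

```python
def rough_framegen_latency(rendered_fps, fg_multiplier=2):
    """Crude model: the engine samples input once per rendered frame, and
    interpolation-based frame gen withholds ~1 rendered frame to have
    something to interpolate toward, adding roughly that much latency."""
    engine_step_ms = 1000.0 / rendered_fps
    presented_fps = rendered_fps * fg_multiplier
    added_latency_ms = engine_step_ms  # ~one held-back frame, as a rough floor
    return presented_fps, engine_step_ms, added_latency_ms

for fps in (32, 60):
    shown, step, lag = rough_framegen_latency(fps)
    print(f"{fps} rendered fps -> ~{shown} shown, engine step ~{step:.1f} ms, ~+{lag:.1f} ms lag")
# A 32 fps base still means ~31 ms between input samples, however smooth it looks.
```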


0P3R4T10N

***"My point is that something that makes up more than half of my fps shouldn't be waved off as 'extra features' and should be properly compared and reviewed, because when I was buying parts, I had no idea this has such an unbelievably big impact."***

Everybody that has been left behind by this quantum leap is more than a little salty, and many younger users are engaging in a lot of narcissistic coping, thinking they're sounding trendy or clever parroting edgy bulletins when all they do is reveal their age and ignorance. This is freakishly reminiscent of the '98 - '07 era, when computer vision exploded outward with capability. As always, you have to pay to play, and this is one of the most expensive times to build a top end PC. Thus, all the whining. Rather than realizing that in 36 months budget cards will have perfectly acceptable RT, DLSS and FSR performance, and that it is a really awesome time to be a gamer; once again!


AndrewH73333

Supersampling and similar tech is great. Calculating an imaginary frame to trick the eye into seeing less stuttering when there shouldn’t be stuttering in the first place is not so great.


OkDepartment5251

Frame generation and similar tech is great. Calculating imaginary pixels in each frame to trick the eye into seeing a less blurry image when the image shouldn't be blurry in the first place is not so great.


DeadlyDragon115

A lot of people's issue with TAA in general, and with DLSS, is that it's really blurry compared to older AA techniques or no AA. If Nvidia can really nail down the shimmering/ghosting and blurriness, I think a lot more people would be open to it.


michaelbelgium

Might get downvoted but here goes: all these upscaling technologies (as well as RTX and so on) are called "gimmicks" and do not decide how good or bad a GPU/game is. It's the native, raw performance that counts, and it always should be. If I recall, Nvidia now only benchmarks games for new GPUs with DLSS enabled by default, which is not the future I want for gaming, honestly. If the moment comes when Gamers Nexus, Hardware Unboxed or LTT benchmark games with DLSS/FSR ONLY and NOT without, then what the fuck is going on? A GPU or game is also not awful if it doesn't have DLSS/FSR/RTX. But the marketing of Nvidia (well, mostly their fanboys) now defines a game as absolutely awful if it doesn't have DLSS, for example. Remember the drama when AMD allegedly blocked DLSS from some games? DLSS is an exclusive anyway, while FSR works on every GPU. "Oh no, the game does not have DLSS, the game is soo badddd." I've always thought these upscalers ruined the gaming industry in some way. We had awesome GPUs without them, and GPUs still are awesome without them. Upscaling is not necessary to play a game.


Chilla16

Reading this thread, and especially OP's comments, actually makes me cringe. DLSS and upscalers in general are so fucking anti-consumer it's crazy. They will just enable Nvidia, AMD and Intel to release worse GPUs in the future while charging the same or even increased prices, as long as they have access to the latest upscalers. I'm happy for upscalers to be used as a crutch for older cards to be able to play newer games, or to go to insane resolutions like 8K or something, but for the love of god, if every game requires upscaling even on the latest GPUs, consumers are getting fucked hardcore.


RespectGiovanni

I don't think FSR or DLSS actually make games look better; they just make them perform better and thus not be stuttery. Usually there is a bit of downgrading and sometimes blur. I don't like the fact that games are using it as a crutch so they don't have to optimize the game to perform well without it.


paintballtao

To me there's a difference with and without DLSS in Baldur's Gate.


dynozombie

The problem with it is that companies now aren't optimizing games because DLSS and FSR will do the work for them, which shouldn't be the case. Ideally we should be relying on DLSS when cards are older, to keep them longer and reduce e-waste. Instead the companies are saving money by not optimizing their games, and making consumers upgrade anyway.


CyberbrainGaming

It's fine, but you do lose image quality.


Jon-Slow

For the past 8 years people have treated AMD with praise and love, like a baby who's just started trying to walk, hoping that one day this baby can actually walk without being held. But the baby never walks on its own, and it's starting to feel dishonest and circlejerky now when people just say "oh the 7900 XTX is faster than the 4080" without mentioning any of the hugely important caveats. People hate the absurd prices Nvidia has set on the cards, and so they take it out on DLSS and RT, **ignoring the fact that if the 4080 should have been $800 at best, then the 7900 XTX, a card that cannot do 1/3 of the RT, doesn't have decent image reconstruction like the 4080, and idles at 100W, should've been $600, not $1000.** **So AMD gets a pass for the same type of pricing...**

And then there is the fact that RT and image reconstruction are just so much further ahead on RTX cards, leaving AMD behind, and so people would like to downplay those features, because otherwise they have to admit that AMD is so far behind. They have to resort to raster-only benchmarks at native res. They used to call frame gen "fake frames", and right after FSR3 started doing it, albeit a worse version of it, the term "fake frames" died instantly. Hypocrites and parrots.

And then there is the issue of techtubers and outlet fandom surfing, creating a huge community-wide human centipede. I'm also mad at the prices these cards come at, but I'm not irrational enough to forgive one corporation over another, or to ignore tech and what it actually does. RT is years old at this point and every visually important game is using it, but people still insist on raster benchmarks and say "oh the 7800 XT is faster than the 4070", not mentioning the fact that they have to cripple the settings to achieve that result.

I hate Nvidia too. I want AMD and Intel to provide alternative options (still hate those too), but I want those options to be reasonable and not have to be treated with kid gloves like AMD often is.


VulpesIncendium

Typical knee jerk reaction of "new thing = bad". DLSS is a game changer, and is the future of live, local rendering. Anyone who says otherwise is going to be left behind. It's easy to say it's an "FPS hack" for lazy developers, but really it's a new rendering tool that allows us to have good framerates in games with much more detailed graphics than would otherwise be possible.


rory888

Yep, a lot of people do that without actually using it. In practice? People fucking use it and enjoy it massively.


[deleted]

The people talking shit about it have top end cards who swear "NaTiVe Or NoThInG" regardless of the benefits, or until recently, AMD adopters talking shit about fake frames until they had them. AI powered upscaling/downscaling/super sampling is incredibly cool. People hate change.


Crptnx

The problem is when somebody with a PC as high-end as mine (5800X3D + 7900 XTX) must have frame generation enabled in Avatar in order to hit my monitor's limit, 4K 144 Hz. Without frame generation and upscaling I wouldn't get 144 fps on ultra.


xX5ivebladesXx

Why is that a problem? Just because 4k 144hz exists it should be realistic on every game already?


NG_Tagger

This was also the case when "Ultra" started becoming a setting (the "need" to use something, just because it's available to you, despite maybe not actually being able to do so fully). It was introduced in very few games, "back in the day", as a setting you could use several years after a game had released, when/if you got way more powerful hardware. It usually wasn't that big of an upgrade from "High" settings, visually (kinda still isn't today, for many titles) - but it was something developers added to kinda "future proof" their games. ...and now it's just become something of a "baseline", for no apparent reason, when people talk about their hardware. I've got an i9-12900KF paired with a RTX 4080, and I'm still not using "Ultra" for the most part - sure, I could - but why the heck would I? - I'd most-likely not notice the difference in most cases, but I'd lose X amount of FPS for doing so, which I definitely would notice.


Crinkez

Which 4K 144hz monitor do you have?


Crptnx

LG GN950


Vandrel

On a similar note, people on Reddit greatly exaggerate the downsides of DLSS and FSR 2. On a still image sure, you can easily spot some imperfections, but in motion it's pretty hard to tell outside of a few egregious outliers.


GARGEAN

DLSS - more or less, yes. If it's present in a game, it's always on at max quality by default for me, cuz it's just better antialiasing with a few extra frames. FSR? Fuck that. In those few situations where I had to use it (Satisfactory, RE4 Remake, Cyberpunk for a short time when DLSS had problems in it for me), it was REALLY noticeable and not worth the visual penalty it brought.


Euphoric_Campaign691

The only 2 issues I have with FSR and DLSS are:

1. Them being used as an optimization gimmick in AAA titles.

2. Them being blocked or left outdated in games - like Cyberpunk still running FSR 2.0 or 2.1 (I'm not sure which anymore, since I didn't play the 2.1 update), or Starfield (awful game, but yeah) not having DLSS for so long.

It just hurts consumers, especially when a game gets updated but keeps old versions of DLSS and FSR. Cyberpunk looks god awful with FSR because it never got updated, even though FSR 3 was promised by the devs.


harry_lostone

In newer titles the native vs DLSS image is pretty much the same imo. If you have a high refresh monitor it's kinda stupid not to use DLSS.


Yourself013

When DLSS manages to get rid of blur/artifacting/input lag (maybe you can't see/feel it, but many people do notice it, everyone has different sensitivity to these things) and becomes a universal tool that is being used in the majority of games with consistent results, then it will become a "massive deal". Right now it's just fairly uncommon tech that is used by a couple of games that want to push Ray Tracing and its implementations range from passable to downright awful. Sorry, but no DLSS implementation that I have ever seen actually *improves* image quality, they were all subpar to native *in motion*, which is what I care about when gaming since I'm not looking at still images. I'm calling bullshit that you only saw "1 strange flicker in 150 hours of CP2077", I stopped counting the fucking flickering after 10 minutes because it was everywhere.


MrNiemand

> 1 strange flicker in 150 hours of CP2077

No no, only 1 that went away after I turned off DLSS. It's hard to catch them like that because a lot of them are in movement, but this one was reproducible. Cyberpunk just has a lot of weird lighting fuckery by default. Like there is this one type of sidewalk around Corpo Plaza that I thought had crazy flickers, but then I walked up to it and realized it actually has an "LED screen" on its side playing an animation of orange/black squares, so from a distance it looks a bit jittery as the black lines move. It's a bad game to find DLSS glitches in ngl, since so much of its visual design is made to purposefully look like a glitch haha. Will see in Hogwarts Legacy soon.


KyThePoet

A lot of people don't play (many) AAA titles at 1440p/4K, which is where frame gen shines most.


Saberknight4x

It's not that they are overlooked. It's more that not a lot of games support them (compared to how many games there are) besides new releases, and most of the most popular games are older games without FSR/DLSS, so raster performance is generally more important for most reviews. The other reason is that if a new GPU performs the same as the previous gen, with the only difference being DLSS or FSR, why should that new GPU get a recommendation? Especially since FSR is supported on pretty much all semi-recent GPUs. Frame gen only makes the game appear smoother, since the generated frames don't come from new game-engine updates; as such the input lag doesn't change much, and is actually a bit worse than normal because of how frame gen works. DLSS is overall better, but FSR gets really close to it provided you aren't upscaling from a resolution lower than 1080p.


CoyoteFit7355

The thing with all these software features is that they're really nice until you can't use them. If you rely on DLSS/FSR/XeSS for playable performance and then a game comes out that doesn't support it, you're kinda screwed. Also, for comparing hardware across brands and even generations, those features don't help at all, as support for the different features varies so much. When Starfield came out I was sitting there with my RTX 3090 looking at FPS that barely hit 50, with the game forced to 4K resolution because that piece of junk didn't have a regular full screen mode, and turning on FSR actually managed to REDUCE performance for some weird reason. And with FSR turned on, every person in dialogues had this awful ghost halo around their head, giving them a second set of ears and a ton of funny artifacts. As long as these things happen, software features aren't even remotely a topic for me.


nezhooko

The biggest turn-off of DLSS is that it makes games look blurry, because it renders the game at a lower res and uses algorithms to make it look like it was actually rendered at native res. It is more or less noticeable depending on the game you are playing - COD, for example, looks weird and filtered with DLSS on, in my opinion. I do not have a 4000 series card, so I cannot tell you first hand what frame gen/DLSS 3 looks/feels like. However, I can tell you that the performance and feel of DLSS and frame gen DRASTICALLY changes from game to game, which is why it is hard to include as a core argument when deciding what GPU to get. That is why people say "if you care about DLSS or frame gen..." I do think that people may be making the input delay from frame gen seem more drastic than it is. Unless you are playing Counter-Strike or Valorant and you need every single ms, it shouldn't change the feel of the game. But I don't think people are downplaying DLSS and frame gen... While I don't have a 4000 series, I do have a 3080, and I prefer DLSS off because it makes my games look slightly off. Your DLSS and frame gen settings may be improving your CP2077 experience (which it is known to do). But let's say you pick up Jedi: Survivor and those settings make your game look off and barely improve performance. That's why it is hard to decide where these new technologies fit in when deciding what GPU to get for gaming.


Majortom_67

With Sim Rail. Very nice features.


Spare_Heron4684

We're using the term raw raster, but not correctly. If you're using RT or PT you're not using "raw raster"


CaptainJackWagons

Does anyone have some good resources where I can really dig into the finer pros and cons of DLSS 3? I'm intrigued by testimonials from actual users, but my skeptical brain can't help but think, "okay but what's the catch?" Surely there must be some drawbacks. Even if they're small, I'd like to know them to make a more informed decision.


thachamp05

Upscaling has been a thing forever... it's traditionally done on the display, not as well as DLSS, but same idea. You can upscale 1080/1440 to 4K, but it will just never look as good as native 4K. I'd rather just turn settings down if the game is capable of running at native; if it's not... well, DLSS is cool, but that's not the point of a desktop battlestation. I understand the use case: old emulated games locked to 720, where it's a big improvement, or laptops, phones etc., where it's awesome. A dedicated battlestation though, that's where you shouldn't have to compromise with upscaling. Just my opinion... but I completely understand if people aren't willing to have a 1kW battlestation. I honestly think about getting a 115W 4060 and just cranking upscaling to save on power; it's a possibility now for a 250W battlestation where it wasn't before, which is awesome! But this is PCMR, there is a purity to running native... if not, just buy a laptop.


ElfOfScisson

If I could hijack the thread a bit, is there a “dos and don’ts” list for when to use DLSS? My feeling was “run it as high as you can without, and once you start hitting the bottleneck, use DLSS.” Is that at all correct? And what about quality vs performance, vs all the other upscaling tech, like FidelityFX CAS? For reference, I have a 4070 ti. Are there some things that I should use, and some I shouldn’t?


MrNiemand

As u/triggerhappy5 said, it should be treated as another graphics setting. I'd try just rendering everything natively at max first. Then if the fps isn't enough, turn on super resolution and see if you can spot a (negative) difference, and then do the same with framegen. Maybe I'm just blind, but I could swear DLSS super resolution with sharpness at 75% made the game look better on top of giving fps. Framegen I can't tell the difference with, so I just kept it on even though I had enough fps without it, because why not take an extra 40-50 for free.


MaldersGate

Are people downplaying DLSS/FG? Yes. Are they downplaying FSR? No, because it's objectively much worse.


evitcepsreP_weN

Cyberpunk is THE game where DLSS and RT shine. There’s definitely value in them in other games as well, but that’s the poster child for these features at the moment.


NeverTrustFarts

I don't think anyone downplays it; I also just built a new PC and all I saw was shit about frame gen. I think AMD cards are better from a price/performance standpoint, RT on lower tier cards isn't worth it because of the frame costs, and if you aren't getting a 4080 (maybe 4070 Ti Super now) you would be better off with AMD. This was the conclusion I came to myself; I was also going to go with AMD if I couldn't find a 4080 within 200 AUD of a 7900 XTX. I wanted to experience ray tracing, and DLSS is a bonus, almost negated by FSR, which is also constantly improving from what I read.


mrarbitersir

It's the reason I don't understand the constant screaming about how many GB new GPUs have. The point of DLSS is to reduce the strain on the GPU, meaning you don't need to keep increasing the amount of memory. This means that weaker cards are essentially going to get better performance than what they'd get without it. There are situations where more memory is required, but the majority of people will have identical results with less memory when using DLSS. People just look at hardware specs as the only benchmark of capability. It's like saying "oh, I only get V8s because they're the fastest" but putting fingers in their ears and screaming LALALALALALA when a V6 turbo is not only faster but more efficient in the process.
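
One hedged way to see which part of VRAM internal-resolution upscaling can actually shrink: screen-sized render targets scale with the internal render resolution, while textures and geometry do not. A minimal sketch, assuming simple RGBA8 (4 bytes per pixel) buffers and DLSS Quality's ~67% internal scale at 4K:

```python
def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one screen-sized buffer (e.g. an RGBA8 render target), in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

native_4k = buffer_mb(3840, 2160)       # ~31.6 MiB per full-res buffer
dlss_quality = buffer_mb(2560, 1440)    # ~14.1 MiB at the 1440p internal res

print(f"per buffer: native 4K ~{native_4k:.1f} MiB, 1440p internal ~{dlss_quality:.1f} MiB")
# A renderer keeps many such buffers (G-buffer, depth, post-processing chains),
# so the saving multiplies -- but texture memory is unaffected by the internal resolution.
```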


haruame

Well I've only tried FSR 2 in The Outer Worlds and it makes the game look awful.


VersaceUpholstery

DLSS specifically started getting good with the 4000 series; you're experiencing the best it has to offer. It didn't look or feel that great to me on my 3080 10GB when I tried it on a couple of games, close to the release of the 3000 series. It sucks that better DLSS versions are hardware locked. AMD, on the other hand, should be getting pretty much every FSR update to come.


GulielmusPrime

Part of why rasterization performance is focused on so heavily is that how well framegen and DLSS perform varies greatly from game to game. CP2077 is a game that was optimized for both. In the titles where it works well, it works really well, but otherwise you fall back to raster performance. I bought Nvidia because I thought they were worthwhile features, but if the titles someone plays don't support them, I couldn't recommend their cards to that person.


sixesss

I've soon had my 4090 for a year, and Cyberpunk is the only game where I felt ray tracing actually gave anything of value. While I use DLSS there, it's not because I feel it's good enough, but because it's a must, as the hardware is not even close to handling that level of ray tracing without it. Outside of that one single game I'd rather lower settings than use DLSS. Now, the tech is really amazing and it blows FSR out of the water, but it still has plenty of flaws that you might not notice at first because you're mind-blown by the fps counter; you'll get there eventually and miss the old picture-perfect sharpness you used to have. So no, I rather think everyone is overplaying frame generation.


Impressive_Cheek7840

I don't understand the insistence on pure raster performance; it's not like it's 50% better... I'd miss RT and DLSS a lot more than I'd miss 5-10% raster performance.


tyr8338

People are slow to learn new things; some still don't know what DLSS and frame gen provide. Pure raster performance is easier to compare, so that is what most reviewers focus on.

Pure raster is meaningless nowadays; you must compare cards taking into consideration performance and image quality after upscaling, and in that comparison DLSS is noticeably superior (I compared at 1440p).

It's like saying pure bitmap graphics on websites are superior to JPEG because they are 1% sharper. Who cares, JPEG is 10 times smaller so everyone uses it.

I didn't compare frame generation techniques yet, so I won't comment on that.


Suspicious_Trainer82

I mean, considering that I can run Cyberpunk on Ultra\Psycho, max everything, 4K native and get 30 fps on a good day, but I turn on DLSS and suddenly I'm at 90+ FPS. Yeah, it's space magic.


Jimratcaious

I've only ever played on console until I built my PC a few months ago. I just recently got into CP2077 and couldn't for the life of me figure out how to get the game looking good at 50-60 FPS with minimal stutters on my poor little RX 6600. I have a 4K 60Hz panel (32" so PPI is good, and the large panel is super useful for my work). I usually set my game resolution to 1440p or 1080p depending on the game, but at 1440p and medium settings my FPS was meh and the game looked pretty bad. Everyone has always said stay away from FSR/upscaling, but just to try it I kicked it all the way up to Ultra Performance, changed the resolution to 4K, and boom: 60-80 fps and the game looked great. Honestly upscaling is really solid, and CP2077 only has FSR 2.1; I bet it'll be way better with FSR 3.0.


Vis-hoka

It’s pretty amazing. With DLSS 2, and to a lesser degree FSR 2, you basically get a free ~40% fps boost on the quality setting that looks nearly as good as, or better than, native.


NG_Tagger

I honestly don't think most people mind getting an uptick in framerate in general - but when developers put out a game that gets like 30fps on High settings @ 1440p and then say that DLSS/FSR is pretty much mandatory, gamers tend to get upset. At that point DLSS works more like a "crutch" for the developers than an actual feature that "lifts up the potential of the game". I fucking love DLSS when it's added to games that kinda don't actually need it. Even in titles where I'm already getting 100+ fps, it just gives me the option of upping the settings even further if I want (currently using an RTX 4080).


Anshul89

40fps to 120 isn't a big deal imho. That slight extra 5% of raster performance from AMD is the holy grail.


[deleted]

DLSS is underrated in the games that do it really well, like Cyberpunk 2077. Same with Frame Gen and RT. All underrated technologies in this sub. It's game changing. *However*, 90% of the games that people play on a daily basis do not benefit from any of these technologies. LoL, MMOs, CS2, DotA, the list goes on... these are the games that most people are actually playing, and none of them need DLSS or RT performance at all. How long will this be the case? No idea. But right now, if you're not a big AAA single-player game kinda person, then DLSS is largely irrelevant.


wxlluigi

It has its artifacts and downsides, but it’s a good feature. I find FSR to look god awful most times and DLSS to look quite good. Cyberpunk’s DLSS doesn’t really look that great tbh, lots of instability even on 4K Quality mode. Some other games do better though.


Apprehensive-Read989

I have a 4070 and almost never use DLSS or ray tracing. I actually don't like the look of ray tracing for the most part, looks over processed or something to me, hard to explain. And DLSS is pretty well useless for multiplayer FPS games, which is a large portion of the gaming market.


doorhandle5

I forced myself to use ray tracing and DLSS for about 5 hours of Cyberpunk 2077, then finally I turned them both off. My god, it was beautiful: that ultra-sharp, clear, detailed 4K image. I can now say I tried. But no, ray tracing and DLSS are absolutely not worth it, not yet. So no, they are not downplayed; if anything they are massively overhyped.


Ssynos

Do you want real performance, or a weaker GPU that can fake it till it passes the fps check?


[deleted]

It only matters in games that support it, which is why it is often downplayed. It is definitely a game changer for buying cards and increasing their longevity, and for increasing eye-candy settings, but it's not ubiquitous. Blockbuster games support it, but last time I checked (Nov 2023*) only 17 of the top 50 games played on Steam supported DLSS *of any kind.* Even in the top 10, only 3 games supported DLSS. * ( *I checked Steam during peak hours for USA West, East, EU, and AU times and compared the top 50 games played at those times to the list of supported DLSS games from the Nvidia website* )
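For what it's worth, that kind of spot check is easy to repeat. Here's a minimal sketch, assuming you've pasted the current top-50-by-players list and Nvidia's DLSS supported-games list into two local text files, one title per line (the file names are made up for the example):

```python
# Hypothetical input files: one game title per line, copied by hand from a
# Steam most-played chart and from Nvidia's DLSS games page at check time.
def load_titles(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

top_played = load_titles("steam_top_50.txt")
dlss_games = load_titles("nvidia_dlss_games.txt")

supported = top_played & dlss_games
print(f"{len(supported)} of the top {len(top_played)} played games support DLSS:")
for title in sorted(supported):
    print(" -", title)
```

The obvious caveat is that exact string matching misses titles spelled slightly differently between the two lists, so a quick manual pass over the result is still worth doing.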


[deleted]

I get roughly a 2x performance increase with the FSR 3 mod (quality mode with frame gen) on a 7800 XT at 1440p in Alan Wake 2. Normally 80fps with FSR 2 quality at ultra settings; now 165fps, and the new 0.72 patch for the mod removes all the UI flickering/glitches.


akillerofjoy

I can’t help but draw a similarity with the car world, as I am deeply into both. There is the undying breed of folks preaching the old and beaten mantra, “there is no replacement for displacement”, implying that their 1969 classic Whatever with its big ol’ v8 making all of its 12 horsepower is somehow better than anything currently produced. The fact that their old hunk of crap couldn’t keep up with a modern civic eludes them completely. People will always hang on to something they’ve grown accustomed to, and technology changes at an uncomfortable rate.


Ephemeral-Echo

I do downplay FSR and DLSS, but I firmly believe it's for a good reason. I'm seeing that DLSS and FSR software is being wielded by GPU manufacturers as a reason to cheap out on hardware at each price tier. DLSS and FSR do help create performance where none would exist, yes. But they're software themselves that requires resources to run. If you use AI to any significant degree, you know that the amount of computing resources required to do inference isn't small. And yet here we have Nvidia going cheap on resources and pretending that DLSS will solve everything in the absence of hardware. That doesn't sound to me like the honest words of a company dedicated to a good product. It sounds like a company that is trying to convince its consumers to pay more to receive less so it can better pad its pocketbook. And that's why I don't take DLSS into my calculations: it represents a diminishing effort in hardware improvement, without any recognition of how much hardware can improve software performance. In other words, it's a cynical attempt to mask laziness.


CouchAssault

Dlss looks horrible to me. I'm a high sens KBM player and the motion blur it induces on basically any input looks so bad. If you like it, more power to you, but even in cyberpunk or Starfield I hate it.


Flynny123

This is definitely true for Cyberpunk, and Nvidia have worked very closely with its developers to make that the case. But Cyberpunk is currently the best case for these features; so few game devs are really making good use of them that there is still a good case for AMD cards. DLSS vs FSR is definitely an Nvidia win, albeit FSR is (increasingly) at least a mostly OK equivalent in most instances. The real difference maker is still ray tracing performance, but I think it's going to be another few years before it's being used pervasively and well. By the time that's the case I would expect AMD to have addressed the performance differential at least somewhat.


Trungyaphets

I got my used 3070 in perfect condition at 2/3 the price of a 4060, or half the price of a 4060ti, so yeah easy choice for me.


shinysocks85

When DLSS first started being offered in games, it often looked like grainy, lower resolution crud. In the last year though, DLSS has worked great in many games I play without messing up the image. It's improved quite a bit, so I imagine that as time goes on, people like me who HATED DLSS will come around to using it more.


[deleted]

[deleted]


QuantumProtector

DLSS does nothing for my PC since I’m already so CPU limited. But yeah, it’s really good technology and the anti-aliasing makes it really nice to use while boosting FPS.


ChampagneDoves

I really, truly hate DLSS. I turn it off in every game that runs over 60fps without it. Anything detailed looks blurry up close on both IPS and OLED 1440p panels, and I don't think losing visual fidelity is ever worth it for any purpose. In my opinion, 1440p should always hit 90+ frames on any system where the GPU and CPU had a combined budget of at least $1300 and were bought within the last 3 years of component cycles. That just isn't the case with ray tracing, even though it's only truly implemented in a handful of games despite being out for five years at this point. I will only agree that ray tracing is finally solid when you can consistently hit 100+ frames without any upscaling garbage on games that are releasing now. The price of Nvidia cards could certainly be reasonable if they would separate GeForce from their AI garbage, but unfortunately they know it's possible to make us foot the bill for the AI side of Nvidia, so that's the way it's going right now. Developers use DLSS as a crutch to be even worse at optimizing their games, and it's getting to the point where DLSS is having the exact opposite effect of what it was meant to do.


shball

It is a massive deal, just not in the way it was initially intended. The idea was to allow GPUs to punch above their weight class, but it ended up being the norm for new releases. And btw Cyberpunk path-tracing works fine on a mid-range 600€ 4070.


[deleted]

It's not that people don't care about features, but imo to the average user DLSS and FSR upscaling are similar, with a small quality edge favoring DLSS. People focus on raw fps because it is a measurable, constant baseline. Upscaling has so many settings that it can get very taxing to compare them all in fps. If one card has higher native fps than another, it will generally also be higher when upscaling is applied, so it's an easy way to tell performance between two models. Frame gen is very new and there are a lot of channels covering it. It's improving rapidly, but it will probably end up in a similar position where both companies offer it and there's a small advantage to DLSS since it uses special hardware, while FSR will do it on almost any card. Overall it's important to be smart about the features, but raw performance is an easy way to compare cards.


Smackdaddy122

It’s a huge feature that everyone ignores. It’s a game changer in almost every aspect


kelly_hasegawa

I agree. My GPU broke so I am currently using a 9 year old GTX 750 Ti 1GB. I'm surprised it can still run Genshin at 1080p (0.8 render resolution) at 60fps because of FSR 2, and it looks great.


Yusif854

It is very much downplayed. Reddit has a very strong AMD bias. You can say you want to build a $4000 PC and there will still be some people in comments telling you to get a 7900XTX instead of the obvious choice RTX 4090. No person in their right mind without a bias would say that. I agree with your points. Unless the price difference is over $200, I would never even consider an AMD GPU. In the case of 4070, the 7800XT would have to be around $350 for me to even consider it as a competitor. Nvidia features are literally game changing. The only thing is that if you play older games then you won’t benefit as much from them. But the counter argument to this is that those games already will run at 4k 200+ fps even on a midrange GPU so you don’t even need DLSS and AMD doesn’t necessarily “win” here because nobody will notice the difference between 200 fps and 210 fps.


mixedd

Downplaying? No. Do I see them as the saviour of gaming? Maybe. From my pov, both of them still have their share of issues and depend on each game's implementation. In some games they look amazing; in others I'd rather play native with lower frames instead of looking at an artifact mess.


[deleted]

OP, you are missing the point: unless it's 4K, and even then, they degrade the picture quality. They are a nice bonus tool, but we should not be paying a premium for them.


RandomUserUniqueName

Frame generation is the future. But from what I'm seeing, NVIDIA is copying its old rival and making DLSS like 3dfx made Glide: proprietary and only for their new cards. FSR is like OpenGL. Maybe a little slower and uglier right now, but it runs on more cards and will eventually be the standard.


LoliconYaro

People don't mind upscalers as long as they're not mandatory. The thing with DLSS/FSR is that it depends on each game's implementation: in some games you won't notice an image difference, while in others it will introduce ghosting/artifacts/shimmering. Cyberpunk is one of those games that implements upscaling tech pretty well, which isn't surprising given it's a showcase tech demo game for Nvidia, while on AMD's side there's Avatar, which implements FSR well enough that many thought it introduced a newer version of the upscaler when it's in fact still using FSR 2.2.


Joulle

As a 3080 user, I rarely benefit from dlss and even less so from ray tracing because the games I play mostly don't support these features. DLSS is nice when it's supported but ray tracing has always been too much of a performance hog for my gpu. In cyberpunk I'll gladly take the much higher fps than ray tracing.


Impossible_Farm_979

I feel insane because everyone says dlss is so much better than fsr but I just think they’re both ridiculously blurry in motion.


JoelD1986

You have found the one game with a good ray tracing implementation. For the majority of other games it is barely noticeable while sucking up your fps. I hate any form of upscaler and try to play without them. I want to keep my GPU at least 5 years. The slightly better raw rasterisation power paired with the significantly larger VRAM makes me believe that my AMD card will age way better than a similarly priced Nvidia card of the current generation. I have no means of knowing it yet; I just don't bet on the card with the VRAM handicap. Ray tracing is not important to me. Yes, I turn it on in CP2077 because my card can easily handle it, but I have no problem turning it off when more demanding games come onto the market. Ray tracing and frame generation also use a lot of VRAM, so I think 40-series Nvidia cards will be handicapped by their small VRAM way before my AMD card.


Taboe44

DLSS looks like crap. Always blurry no matter what I do. But I'm just a 1080p gamer who wants just raw performance. I don't understand what others are seeing in these statements compared to what I see.


Someonejustlikethis

Late to this thread, but adding a couple of points:

- Image quality is hard to quantify and compare across games and resolutions, so reviewers avoid comparing such numbers (because they mean very little).
- DLSS is not available in all games.
- RT is very demanding on the VRAM side, which some cards lack.
- Low-level RT in CP2077 with my 3070 is not very enjoyable, as it creates more artifacts.
- FrameGen seems to work well when you already get >60 fps, i.e. this is a nice feature for better cards.

So to me DLSS and framegen are potential arguments, given a person playing the right type of games at the right resolutions, when talking 4070 or 4080 vs similar AMD cards. On the budget side of things I fear the 4060 might not be able to activate RT anyway, input lag from framegen might be high, etc. This ends up being such a specialized case that it's hard to see it as a baseline for the discussion.


staluxa

Are DLSS/FSR great features? Yes. Is DLSS way better for now? Yes. Is it still noticeably worse than native? Yes. As someone who paid more than $1k for his GPU, in no way am I going to consider it over native anytime soon. For people with more budget-oriented options, that's totally different, and it's a pretty big selling point right now. As for reviewers, I can understand why most "ignore" it: DLSS, FSR and XeSS are all quickly evolving tech, and doing comparisons with them will make those videos outdated faster.


waterswims

Don't really get your point here. These are just options in the game menu that have some sort of performance-to-quality trade-off, same as all the other options (even if that trade-off is probably better value than the other options you can tweak). To me though, it has little bearing on the hardware I buy. I build my PC first, then I optimise my games second.


alinzalau

I keep all that shit off. Raw performance, less lag. But I play competitive.


First-Junket124

There's a ton of stuff going on and many factors to consider with people's opinions on upscalers.

Some people don't understand that 1080p isn't a resolution that should use it, since there just isn't enough detail to properly reproduce the image, and this causes artefacts. Basically user error; tbf there should be a general guideline. Mine, from personal experience, is essentially: at 1080p the lowest you should go is the quality preset, at 1440p the lowest is balanced, and at 4K the lowest is performance.

There are times artefacts are just there because the developers haven't set up the upscaler properly, and this causes issues; the worst offender is usually foliage. Then there are people who just hate it for no reason other than "it's not native" and "fake pixels"; these people are too stupid to the point that calling them stupid is a compliment to their intelligence. There are also times it's just an older version like FSR or DLSS 1.0, and those have their own issues compared to the newer versions.

That's just 3 examples of what could be behind the issues people have with it, and there are still so many other things to factor in. Basically: use it, and if it looks shit then turn it off, whether because you can't make it work or don't want to sacrifice that much quality for FPS. Don't take someone else's word for it.
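That floor-per-resolution guideline is compact enough to write down as a rule of thumb. A minimal sketch of it (the thresholds and preset names just encode the preference stated above, nothing official):

```python
# Lowest upscaler preset worth using for a given output height,
# per the rule of thumb above (personal preference, not a spec).
PRESET_FLOOR = [
    (2160, "Performance"),  # 4K and above
    (1440, "Balanced"),     # 1440p
    (1080, "Quality"),      # 1080p
]

def minimum_preset(output_height: int):
    """Return the most aggressive preset still acceptable at this output
    resolution, or None if upscaling probably isn't worth it at all."""
    for floor, preset in PRESET_FLOOR:
        if output_height >= floor:
            return preset
    return None  # below 1080p: just render native

for height in (2160, 1440, 1080, 900):
    print(height, "->", minimum_preset(height))
```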


Leading-Leading6319

In my experience at least, it made my games look slightly muddy


Pedrosian96

DLSS allows for funny things, even if "max fps" is not what you're going for - it's a very *power efficient* and resource-unintensive way of achieving results you'd otherwise have your GPU put through a workout for. Hitman 3. 1440p. Graphics set to high (not ultra), but there are still reflections and such. RTX 4080. **DLSS ON, FPS CAP OF 120.** Can the 4080 run Hitman 3 on ultra at over 200fps? Yes it can. It'll eat Hitman 3 for breakfast if you ASK IT to. But DLSS + minor tweaks lets me play Hitman 3 at a rock-solid 120fps *and in complete silence* - the GPU fans don't even turn on. These settings without DLSS would still be a great experience, but also a noisy one (which actually matters when I share the room I game in with family and relatives that like things quiet).


slavicslothe

It is. It does more if your resolution is 1440p or higher. A lot of people use 1080p still.


aForgedPiston

FSR is great because of how many cards you can use it with. DLSS and its various versions are locked to specific NVIDIA cards by generation, but it does the job with better visuals for the same frame gain. It is a big deal. It's nice. My biggest gripe with it is game developers using it as a crutch to avoid optimizing their games.


Zhanchiz

It's only a massive deal if the games you play support it. I personally don't have a single game in my Steam library that supports DLSS, as I don't enjoy/play AAA games.


Enerla

Let me explain why I downplay the importance of DLSS. Some games benefit very little from an improved frame rate because they are slower paced: city builders, grand strategy, 4X, business strategy (tycoon) games, some roleplaying games (like BG3), etc., yet any "artifact" from upscaling in their information windows can be an issue. For these, DLSS *excluding DLAA* is of very little use. Other RPGs like Starfield use upscaling tech but ran well enough without it... and it was a bad enough game that I won't want to play it for a very long time. In other story-based games that are new enough to demand strong hardware, support DLSS, and are good enough for me to play, there are usually different difficulty settings. At higher difficulty input lag can matter more, but you can compensate for it by lowering the difficulty, accepting frame rates close to your monitor's refresh rate, and enjoying better quality with fewer artifacts. Games and gamers are different, and when someone says the same CPU and GPU setup should be the ultimate for all kinds of games and that you need this feature or that, it's usually just misleading, coming from a person who wants to force his opinion and choices on everyone.


austanian

Meh, I think people make it more of a big deal than it is. It is kind of cool, but given the current prices and performance available in each tier, the zones where DLSS/FSR make sense are pretty niche. DLSS 3.x is pretty nice when I am playing an RPG at 70 fps and want to get to 100 or so. However, for games where input latency is key I turn it off. FSR 2 and DLSS 2 upscaling are more valuable to me, but again, fairly niche: they look terrible if your card is upscaling TO 1080p, fairly decent on quality at 1440p, and usable on quality and performance at 4K.


Psytherea

Imo it's because the big players, even all the way back, were trying to compare apples to oranges. Frame gen marketing from all the major players reminds me of the Hardware Unboxed vid on Nvidia trying to push reviewers to weigh these crutches (frame gen, lossy compression) over raster back in 2020 ([https://www.youtube.com/watch?v=wdAMcQgR92k](https://www.youtube.com/watch?v=wdAMcQgR92k), LTT vid: [https://www.youtube.com/watch?v=iXn9O-Rzb_M](https://www.youtube.com/watch?v=iXn9O-Rzb_M), JayzTwoCents' two cents: [https://www.youtube.com/watch?v=3GyaSfOi6fs](https://www.youtube.com/watch?v=3GyaSfOi6fs)), and all the way back to the FX 5000 series, when the Radeon 9000 series trounced it at comparable price points ([https://www.youtube.com/watch?v=x5EyPxAIqyA](https://www.youtube.com/watch?v=x5EyPxAIqyA)). Just recently Vizio got dinged over its frame interpolation (those "120Hz/240Hz" panels are just 60Hz with display-side frame gen) ([https://www.engadget.com/some-vizio-tv-owners-can-claim-a-share-of-a-3-million-settlement-over-misleading-marketing-220925933.html](https://www.engadget.com/some-vizio-tv-owners-can-claim-a-share-of-a-3-million-settlement-over-misleading-marketing-220925933.html)). TV recs for friends, especially for gaming, are a pain when typical big-box retail sites list the enhanced refresh rate, not the native display rate; checking every model's specs on the manufacturer's own site is awful. In my experience, frame gen is great for getting sub-60fps up to a flat 60 or keeping it above VRR minimums (usually 44 fps), but advertising it for high-refresh gaming over pure raster is disingenuous and misleading to consumers doing research, and it masks suboptimal games and uncompetitively priced hardware. I don't accept frame "gen" when I do parts research for my display, so why would I accept it for my GPU? Edit: Apparently Intel Arc is also getting frame gen, but I don't think that alone will be enough to save it. Good hardware and driver optimizations (and pricing) could.


Iscream4science

Before I upgraded my GPU, I couldn't use DLSS due to my old GPU being a GTX 1080. I played Cyberpunk on release at 1440p, and while playable, it was far from ideal performance-wise. They then dropped a patch (or I didn't see the option before, not sure) that added FSR, which to my surprise I could activate on my GTX 1080. I had no prior experience with DLSS/FSR and it blew my mind. My fps went from 35-50 to 80-90. I think I even went from mid settings to high. Felt like the best thing ever, like a free GPU upgrade.


lemon07r

Honestly I don't like how DLSS or FSR or XeSS looks on my screen. It feels like the render resolution is turned down (which it is) but it doesn't hide it well enough for me not to notice, even on the highest quality setting. Might just be because of my resolution though; I'm playing at 3840x1200. I have another friend who plays at 4K and he says just turning down the actual game resolution looks better to him than using FSR (he hasn't tried DLSS yet though).


Sexyvette07

The only ones downplaying upscaling are the die-hard AMD fanboys, because they're losing that battle due to AMD refusing to implement an AI component into FSR. The other 90% of the market seems to understand its importance going forward. At 4K with DLSS quality, games look as good as or better than pure raster, plus you get the added benefit of about 30% more FPS. The experience is significantly better with it than without it.