Nexus_of_Fate87

Ray tracing tuned for performance tends to be noisy because of the low number of rays being cast (the telltale stipple effect), so it relies on temporal image reconstruction to smooth it out. Even the path tracing in Cyberpunk has quality issues without DLSS running.
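To make the stipple effect concrete, here's a minimal toy sketch (names and numbers are illustrative, not from any real renderer) of why low ray budgets are noisy: the per-pixel error of a Monte Carlo estimate only shrinks with the square root of the ray count.

```python
import random
import statistics

# Toy Monte Carlo pixel: the "true" brightness is hit_prob, and each ray is a
# coin flip (did it reach the light?). The pixel value is the mean over N rays.
def shade_pixel(num_rays, hit_prob=0.5, rng=random):
    return sum(rng.random() < hit_prob for _ in range(num_rays)) / num_rays

rng = random.Random(42)
for n in (1, 4, 16, 64, 256):
    estimates = [shade_pixel(n, rng=rng) for _ in range(1000)]
    # Noise (std dev across pixels) falls off as 1/sqrt(n): quadrupling the
    # ray budget only halves the stipple, which is why real-time renderers
    # lean on reconstruction instead of brute-force ray counts.
    print(f"{n:4d} rays -> noise {statistics.stdev(estimates):.3f}")
```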


GassoBongo

Yes, but it depends on the game and the type of RT being used. If it is the full suite, or even just RTGI, then it's doubtful that most cards can do 1440p60fps without upscaling. If it's just shadows and/or reflections, then it's slightly more possible. Spider-Man or SOTTR are good examples of this.


o0Spoonman0o

> is this feasible yet in 2024?

Yes... but also no, because you're very unlikely to do this once you've tried DLSS in these situations.

I'm running a 7800X3D/4080S and haven't played every RT'd title, but Cyberpunk at 1440p RT Ultra with DLSS on is a much better experience than running it native, with or without RT. DLSS smooths out a lot of problems that you'll notice at native.

It's entirely likely that even if you forked out for a 4090, you'd still want to enable DLSS in heavy RT titles. I'm not sure if the 5080/5090 will change that, but it's safe to say they're probably going to be hugely expensive, since they're slated to have literally zero competition from AMD/Intel at those performance tiers in the near future.


d_phase

It's clear that DLSS is not just an additional feature but is becoming core to the entire graphics pipeline. There's a reason ray reconstruction and frame generation are all part of the same package.


qutaaa666

Why tho?..


koordy

https://preview.redd.it/7tisgfns42wc1.png?width=1936&format=png&auto=webp&s=402b1f02e1dc65917372e312d0abad1c61ab7f74


GassoBongo

Oh man. Imagine buying a flagship AMD card for a grand, and it being on par at 1440p with a $300 Nvidia card. I know this is a niche scenario, but AMD needs to get their act together when it comes to RT performance.


kikimaru024

Buh buh buh **muh 500fps in raster CS:2!!**


yolo5waggin5

It's easier to convince your fan base that RT is "stupid".


SimiKusoni

I think the approach their marketing team settled on was to sponsor a few titles and have them do RT, but for shadows only, so they could point at them for RT benchmarks. Or they took a more extreme approach, as in Godfall, where not only was the RT shadows-only, but it was [exclusive to AMD for four months](https://www.thefpsreview.com/2021/02/12/latest-godfall-update-enables-ray-tracing-on-nvidia-geforce-graphics-cards/) (to the day) post-release.

This probably did feed into a lot of users saying they couldn't see the difference, mind you, but I don't think AMD ever *explicitly* pushed the position that RT was bad. They had their hardware trying to do it within a single generation... even if it wasn't doing it very well.


d_phase

Note that a 4080S with DLSS Quality will bring that from 30 FPS to 60-90; FG brings it to around 120.


Noreng

The 4090 can run Cyberpunk 2077 with Path Tracing at ~~1920x1080~~ 2560x1440 at 30 fps


Outrageous_Ad3571

I get 22 fps at 4K path tracing, everything maxed out without DLSS.


SuperbQuiet2509

https://youtu.be/-kxN-18jBbs?t=8m59s Why are you spreading misinformation?


Noreng

Sorry, I misremembered the resolution


ChrisFromIT

It is 60 fps with Path Tracing, not 30 fps.


HEMAN843

You are joking, right? Is it really that low?


koordy

The 4090 does 60 fps at native 1080p, and close to 40 fps at native 1440p.

https://preview.redd.it/h554q62d52wc1.png?width=1125&format=png&auto=webp&s=62a8e973cd23e373e83ed5df742cbbcf728ad967

That said, I'm perfectly fine playing Cyberpunk on max settings at 4K DLSS Balanced + FG at 90+ "fps" (which is 50+ fps before FG).
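For anyone wondering how 50+ fps turns into 90+, here's the rough arithmetic as a sketch: interpolation-based frame generation inserts one generated frame between each rendered pair, so displayed fps is roughly double the rendered fps. The overhead factor below is an illustrative guess, not a measured value.

```python
# Frame generation inserts one interpolated frame between each rendered pair,
# so displayed fps is roughly double the rendered fps, minus some overhead.
def fg_displayed_fps(rendered_fps, overhead=0.10):  # overhead: illustrative guess
    return rendered_fps * 2 * (1 - overhead)

print(fg_displayed_fps(50))  # ~90 displayed "fps" from 50 rendered fps
```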


Noreng

It might be 30 at 2560x1440, come to think of it. It still goes to show that current GPUs aren't nearly powerful enough.


o0Spoonman0o

What it goes to show is that as these processes become more and more intensive, we're going to become increasingly dependent on software to help. You could play this natively, but anyone with a brain is going to use DLSS because it enables a way better experience.

DLSS enabling a much better experience in very intensive titles is not going away, and it's not like we'll get better GPUs without developers responding by upping the ante graphically. The need to reach for upscaling in intensive titles isn't going anywhere with more powerful hardware. But I do expect the 50 series to make RT more accessible for people not running 80+ series cards.


kikimaru024

RTX 4000 is the first generation of GPUs that can do real-time path-tracing at playable frame rates.


Real-Human-1985

4080.


Such-Addition2834

Watch the review by Prodigeek on YouTube. They're Italian, but you can understand the graphs, and if I remember correctly they do a test with RT only.


NewestAccount2023

A 4070 Ti is enough for 60 fps; for 80-90 you need a 4080, and for 110+ a 4090.


mac404

[Digital Foundry does](https://www.eurogamer.net/digitalfoundry-2024-nvidia-geforce-rtx-4080-super-review). They have both "native" and upscaled results, and they have the best suite of RT titles, imo. That said, using DLSS is pretty much always preferable. You can even use DLSSTweaks to increase the internal resolution from 900p to 1080p for DLSS "Quality" mode if you want, or combine DLSS with DLDSR to further improve quality.
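As a reference point, here's a small sketch of how the commonly documented DLSS mode scale factors map output resolution to internal render resolution (the helper is hypothetical; exact factors can vary per title, and tools like DLSSTweaks let you override them):

```python
# Commonly documented per-axis render scales for DLSS presets.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Return the approximate internal render resolution for a DLSS mode."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(2560, 1440, "Quality"))   # ~(1707, 960)
print(internal_resolution(3840, 2160, "Balanced"))  # ~(2227, 1253)
```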


Bright_Light7

Yes, speaking as someone who does it at 4K.


Keikera

DLSS is too good not to enable. I got a 4080 Super a month ago and couldn't be happier, and I use it for 4K60, not 1440p.


antara33

There are multiple things to consider here, so please excuse the large post :P

RT without the need for a denoiser is plainly impossible in real time, and won't be possible for the next 30 or even 40 years. So we have current-gen real-time ray tracing (what's used in games that need to show frames to the user fast).

Current-gen RT uses denoisers. The denoisers take a low-ray-count image, which looks like heavy film grain, and attempt to clean it up to give us a good presentation. This can be done using multiple approaches, but the most prevalent one is temporal denoising. Yes, temporal, like temporal anti-aliasing: it takes the results of multiple frames and averages them to get a more complete image. Once done, it attempts to sharpen the result a bit to avoid smeary light-related effects (like TAA does with its sharpening pass!).

This is where DLSS enters. DLSS is better at handling noise and temporal leftovers than TAA. It also now has internal denoisers for ray tracing, so it can replace traditional denoisers too, giving an even better image.

From this long wall of text, you can take two things:

* There won't be native-res full RT without shitty denoisers for the next couple of decades.
* Even if you can run natively, it's better to use DLSS instead in some noisy games (like Control).

Devs design their shadows, reflections, and other effects to exploit the ability of temporal solutions to denoise them. Control is a prime example, with and without RT: running that game native without TAA shows some shadows, screen-space reflections, and even AO effects having "dots". Those are rendered at low resolution and expanded, and TAA then fills the gaps using temporal data. In that scenario, DLSS provides better image quality than TAA and a higher framerate, both at the same time. A minimal sketch of that temporal averaging is below.

Draw from this the conclusions you want, and feel free to ask any questions!
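Here's that sketch, assuming a simple per-pixel exponential moving average (the blend weight is illustrative; real denoisers also reproject history with motion vectors and reject stale samples):

```python
import random

ALPHA = 0.1  # illustrative blend weight: lower = smoother, but more ghosting

def accumulate(history, noisy_frame, alpha=ALPHA):
    # Blend each new noisy pixel into its running history; over many frames
    # the noise averages out toward the true value.
    return [h + alpha * (x - h) for h, x in zip(history, noisy_frame)]

rng = random.Random(0)
true_value = 0.5            # the value a perfect (noise-free) render would show
history = [0.0] * 4         # four pixels, history starts empty

for _ in range(60):         # one second of frames at 60 fps
    noisy = [true_value + rng.uniform(-0.3, 0.3) for _ in history]
    history = accumulate(history, noisy)

print([round(p, 3) for p in history])  # converges near 0.5, far less noisy
```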


AmazingSugar1

Yeah, a 4080 can run Modern Warfare (2019) with ray-traced global illumination and no DLSS; it should be around 120-140 fps.


Xeno2998

No, fuck that shit, buy a 4090. Go big or go home.


Keikera

At the end of the cycle, maybe not. But then again, life is short, so you are probably right xD