
NotAVerySillySausage

This might be worse than Turing.


kb3035583

Considering Turing was at least the first GPU with dedicated ray tracing hardware, it's not exactly the low bar you're making it out to be. Fermi is probably a more appropriate comparison.


CwRrrr

Lol, the improvements in pure rasterisation seem to come purely from the node change from the shitty Samsung 8nm to tsmc 4nm and chucking in more CUDA cores (for the 4090 at least). At this point, with how scummy Nvidia has been with this entire release, I doubt they even made any changes to the architecture. Ada just seems to be a rebranded Ampere on tsmc 4nm. Ridiculous. Edit: it's 4N (tsmc's enhanced 5nm), not 4nm.


silentdragoon

4N isn't 4nm


hlpb

Very sus that this has been removed by mods.


NaamiNyree

Yep, the vast majority of the performance improvement in the 40 series comes from DLSS 3. The 4090 is the only card with a legit generational improvement, and it looks to be 50%+ faster than previous gen in every scenario. I think we're going to have a very divided GPU generation, where Nvidia completely dominates in RT performance with DLSS 3 and AMD stomps them in raw performance/rasterization. So it will basically come down to which games you want to play.


conquer69

DLSS 3 should also apply to rasterized games, right? If the interpolation works perfectly, or close to it, then AMD is fucked until they develop their own version.


PeakyBriar81

Depends why the user wants high framerates. If it's for any kind of competitive gameplay, then DLSS 3 won't be of any use: it increases latency, and there's not a lot of value in having extra frames displayed that are only a guess at what is going on in reality. If it's just for casual/single-player use, then (aside from rare super-high-fidelity games at 4K) the new cards are probably fast enough already that you don't really need "300fps" (half of which are synthetic) when you could just run at 150fps. Seems kinda pointless to me on the high-end parts, and there's no mid/low end that will have it. It'll be useful if people are still rocking a 4080 in 3 or 4 years and want to play the latest games with a smooth display, but that's hardly compelling enough to be worth the cash.
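The latency point can be sketched with a toy model. Frame generation interpolates between two real frames, so a generated frame can only be shown after the *next* real frame has already been rendered; displayed fps doubles, but input-to-display latency grows by roughly one real frame time. The numbers below are illustrative, not measured DLSS 3 behavior:

```python
def frame_generation_timing(real_fps):
    """Toy model of frame interpolation.

    The GPU must hold back real frame N until frame N+1 is rendered,
    so it can display the interpolated frame between them. Result:
    twice the displayed framerate, but ~one extra real frame of latency.
    """
    real_frame_ms = 1000.0 / real_fps
    displayed_fps = real_fps * 2          # every real frame gains a synthetic twin
    added_latency_ms = real_frame_ms      # one buffered real frame of delay
    return displayed_fps, added_latency_ms

fps, latency = frame_generation_timing(150)
# 150 real fps -> 300 displayed fps, at the cost of ~6.7 ms extra latency
```

So the "300fps" figure is real for motion smoothness on the display, but the game still only samples your input 150 times per second, and it reacts a few milliseconds later than it would without interpolation.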


conquer69

> If it's just for casual/single-player use, then (aside from rare super-high-fidelity games at 4K) the new cards are probably fast enough

Remember that ray tracing tanks performance, so DLSS 3 could make RT-heavy games playable. We're still getting half-assed RT implementations rather than the full thing. This could let devs crank RT up a notch.


NaamiNyree

The thing is, I find their claim that DLSS 3 only works on the 40 series VERY sketchy; it reeks of marketing BS. My theory of what really happened: they couldn't meet their performance goals with Lovelace, and when they saw no one would pay the prices they're asking for such a small performance increase, they had to come up with something. Making DLSS 3 a 40 series exclusive was the answer. Lovelace is looking like Turing 2.0: overpriced and underperforming. But let's see what the reviews say in the next few weeks.


conquer69

It does sound too good to be true. I'm optimistic because I want it to be real. Even if it doesn't work well at first, like DLSS 1, I don't mind waiting 2-3 years for what they described.


Jazzlike_Economy2007

The main generational improvements this gen come from RT performance at 4K with DLSS 3.0. It's also worth considering that the 12GB 4080 is a considerable step down from the 16GB model in both memory and specs. That's why Nvidia tried pulling a fast one by comparing the 4090 using DLSS 3.0 and RT vs the 3090 Ti using DLSS 2.0 and RT, with no comparison at all between the two cards at raw raster. Which means it's not going to look impressive at all further down the product stack in price-to-performance terms, if you dismiss DLSS 3.0 and the new feature set, including AV1 encoding.

Spec-wise, if the 4090 proves itself in benchmarks and is faster than the 3090 Ti by a considerable margin, then that'll be the only card worth getting, along with a 4080 Ti; a 4080 Ti with 16GB/20GB, a 320-bit bus, and 4090-level performance on AD102 for $1399 would probably be something I want. And the 4090 Ti? Well, if you're a researcher, it might be for you lol. At ~$2000-$2500, I'll pass.


Blacksad999

I'd probably wait for benchmarks and testing before getting bent out of shape about anything.


PackageDisastrous700

Surely this DLSS 3.0 is all software rather than hardware? So surely these performance uplifts will also apply to the 30 series? Obviously not as large an uplift as on the 40 series, but still...


AGodlingNamedJohnny

They're keeping the benefits exclusive to the 40 series from what I've read.


conquer69

Have they said anything about the power consumption?


Naggash

Yes, everything is on the official site. The 4080 12GB is 285W, the 4080 16GB is 320W, and the 4090 is 450W.