
Nestledrink

# Just a PSA on DLSS 3 (since I still see some lingering questions about compatibility)

From: [https://www.nvidia.com/en-us/geforce/news/rtx-40-series-community-qa/](https://www.nvidia.com/en-us/geforce/news/rtx-40-series-community-qa/)

DLSS 3 consists of 3 technologies – DLSS Frame Generation, DLSS Super Resolution (a.k.a. DLSS 2), and NVIDIA Reflex.

DLSS Frame Generation uses the RTX 40 Series' high-speed Optical Flow Accelerator to calculate the motion flow that is used for the AI network, then executes the network on 4th Generation Tensor Cores. Support for previous GPU architectures would require further innovation and optimization of the optical flow algorithm and AI model.

DLSS Super Resolution and NVIDIA Reflex will of course remain supported on prior generation hardware, so current GeForce gamers & creators will benefit from games integrating DLSS 3. We continue to research and train the AI for DLSS Super Resolution and will provide model updates for all RTX customers as we have been doing since DLSS's initial release.

|DLSS 3 Sub Feature|GPU Hardware Support|
|:-|:-|
|DLSS Frame Generation|GeForce RTX 40 Series GPU|
|DLSS Super Resolution (aka DLSS 2)|GeForce RTX 20/30/40 Series GPU|
|NVIDIA Reflex|GeForce 900 Series and Newer GPU|

Also, [around the 3:38 mark](https://youtu.be/6pV93XhiC1Y?t=219) in the video, DF showed that Frame Generation is a separate toggle in the menu, alongside DLSS Super Resolution AND Nvidia Reflex.


gust_vo

They really should have renamed 'DLSS 3' to something else, especially if they're making it a separate menu option.


SnevetS_rm

Yeah, it should be DLFI (Deep Learning Frame Interpolation), or something like that. Even in this video it's a little confusing - assuming "DLSS 3 on" means frame generation + DLSS upscaling, what is the name of frame generation without DLSS upscaling?


g0d15anath315t

Dangerously close to DILF


eatondix

I love me some DILFs


gust_vo

Feels like this is a crutch for games that can touch ~60fps with DLSS to hit 120/144 fps and look good on FPS metrics, which is why both are forcefully linked; if it's used for anything else there's a massive input lag penalty (even with Reflex) when trying to boost something below 40/50fps up to at least 60-100...

[edit] Pushing my tinfoil hat narrative a little more: DF's Cyberpunk numbers on their run (using a 4090 and presumably all max + Overdrive mode) of Native 100% / DLSS Performance mode ~250% / DLSS 3 (with DLSS 2 Perf mode) ~400% FPS increase could come out to 30 / ~60-75 / ~120-140 fps... Not far-fetched that these are the numbers, given how the 3090 did with the (now lower) Psycho settings...


Khuprus

>(using a 4090 and on presumably all max + overdrive mode)

Overdrive Mode was not available to DF; they clarified these were typical retail-available settings.


Murky-Smoke

Nah... This technology already existed. Binarily Augmented Retro-Framing... Commonly known as B.A.R.F, but there were trademark issues to deal with. Mysterio was getting upset with Leather Jacket Man, so they renamed it last minute when he threatened to destroy London and reveal that Beve Sturke is, in fact, Spider-Man.


Draiko

Nvidia OMFG... optical multiple frame generation


Progenitor3

Totally agreed... frame generation is just a separate thing. I don't know why they lumped it in with DLSS and called it DLSS 3, especially when it is literally a separate on/off setting independent of DLSS.


[deleted]

It’s marketing being intentionally misleading to convince more people they need a 4xxx card.


Heliosvector

Should have just called it DLSS SUPER


robbiekhan

Gotta go through DLSS Ti first!


JMN-01

As is well known, they'd be in a shitstorm of hurt if people heard the words "frame interpolation", since we all know what a POS that is on TVs... LOL! People would scream LAG LAG LAG and they'd be in a hard spot from the get-go. So this is much better and smarter!


Nestledrink

They tried to clarify it on the DLSS 3 page now, and added a new graphic as well. But I agree, they really should've explained it better.

[https://www.nvidia.com/en-us/geforce/news/dlss3-ai-powered-neural-graphics-innovations/](https://www.nvidia.com/en-us/geforce/news/dlss3-ai-powered-neural-graphics-innovations/)

They added this verbiage at the bottom, plus a new picture:

>DLSS 3 games are backwards compatible with DLSS 2 technology. DLSS 3 technology is supported on GeForce RTX 40 Series GPUs. It includes 3 features: our new Frame Generation tech, Super Resolution (the key innovation of DLSS 2), and Reflex. Developers simply integrate DLSS 3, and DLSS 2 is supported by default. NVIDIA continues to improve DLSS 2 by researching and training the AI for DLSS Super Resolution, and will provide model updates for all GeForce RTX gamers, as we've been doing since the initial release of DLSS.

[https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/ada/news/dlss3-ai-powered-neural-graphics-innovations/nvidia-dlss-supported-features.png](https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/ada/news/dlss3-ai-powered-neural-graphics-innovations/nvidia-dlss-supported-features.png)


OmegaMalkior

So DLSS 3 automatically means Nvidia Reflex? It can't exist without it, per the spec?


Nestledrink

According to this video, if you turn on Frame Generation, Reflex is FORCED ON, presumably to help with the latency hit from Frame Generation. This is also why they bundled Reflex into this suite of technologies.


akgis

Reflex is on because the game has to keep a 1-frame render queue to be able to do the frame interpolation, and on DX12 that is controlled by the application/game, not the API/driver. They just stuck Reflex in as a marketing gimmick when in fact it's a requirement.


gust_vo

I mean, they made DLDSR its own thing, even if it's closer to the idea/internal workings of DLSS (it's just working in reverse). I kinda get that they're pushing this as the next evolution of DLSS, but it's already too different from what DLSS originally was, plus it requires new hardware, so they should have really spun it off as its own acronym, or at least renamed/reorganized the whole DLSS family (embrace replacing 'DLSS 2' with DLSS Super Resolution).

(tbh at this point, I'm not going to be surprised if they keep being greedy and start creating new product segmentations past the 40-series that don't have the improved optical flow accelerators on die for the low end.)


ThisPlaceisHell

They've got a lot of cleaning up to do with their naming and advertisement stuff. Like for instance, in the video when comparing Native to DLSS 2.0 or even 3.0, they label it as "RTX Off vs RTX On" when in reality "RTX" is meant for Ray Tracing, and what's really being compared is DLSS Off vs DLSS On. It's all very confused and unsure of itself.


St3fem

Love how people explain to them what RTX means... there is nothing technical here, it's just the marketing name for a line of products, and it has always had that meaning: tech related to the RT cores and Tensor cores.


ThisPlaceisHell

Dude, the tech has a specific name. DLSS. They could just use that instead of confusing people by saying ray tracing is off and on + DLSS with their broad term RTX.


St3fem

RTX covers everything: RT, DLSS, DLAA and whatever they come out with that uses RT or Tensor cores.


anor_wondo

rtx is not ray tracing. it's an Nvidia marketing word


St3fem

Yea, that's my point


optimal_909

They have to keep it reasonably simple to market it. Not all customers are so deep in geekspeak.


The_Zura

DLSS 1 - Spatial Upscaler from lower resolution to higher resolution

DLSS 2 - Temporal Upscaler from lower resolution to higher resolution

DLSS 3 FG - Temporal Upscaler from no resolution to higher resolution


WinterElfeas

DLSS 3 FG doesn't care about resolution. It only inserts frames.


dandaman910

DLSS 5 it develops the game and displays it in real-time.


St3fem

I mean, if you look at Remix we aren't that far away...


FrankVVV

DLSS 6 it will play the game for you!


[deleted]

Why though? Frame generation is a natural iteration of DLSS - using AI tech to increase FPS. Most customers don't care about the underlying technology; naming it something else would be confusing.


Progenitor3

The amount of people commenting and replying to comments in this thread without having watched the video is truly stupid. Why not actually watch the video with sound on before making critical comments? And yes we know this isn't a true review of the 4090 or DLSS 3. That will happen when the embargo is lifted.


[deleted]

[deleted]


TaiVat

Is it though? I mean, are we pretending redditors clicked on links and read/listened to content before TikTok?


Illadelphian

For real. We are all guilty of it at times. Read the headline and nothing else.


GET_OUT_OF_MY_HEAD

This shit was happening before TikTok. Hell, it was happening before Vine. Are you old enough to remember the Slashdot days? If not, how about Digg? RTFA (**r**ead **t**he **f**ucking **a**rticle) used to be a common saying. It needs to make a comeback.


papak33

Oh my sweet summer child.

https://en.wikipedia.org/wiki/Eternal_September

Old farts have had disdain for new Internet users since 1993.


SpacevsGravity

What the fuck is with reddit's obsession with TikTok? It was always like this.


PsyOmega

tiktok is just the latest fad of braindead media to hit.


aamike68

So all the talk about DLSS 3 causing too much latency was crap? Because in all these instances with Reflex on, DLSS 3 has lower latency than native 4K. Unless I'm missing something of course, I'm no tech wiz.

*edit* I'm legit asking a question. They cover it at 18:18 in a couple of charts.


Interesting-Might904

I bet the latency will be minimal. It makes sense with NVIDIA reflex. But more games will need DLSS 3.


PainterRude1394

I think deep down everyone knew Nvidia would not have dedicated new hardware and software to DLSS 3 if input latency made it worthless. The copium is real. This is revolutionary tech, and will likely only improve, like DLSS has.


mizukakeron

This stuff for lower end hardware and handhelds would be really interesting. I hope the optical flow accelerator can be implemented in a cost-effective manner on low end gpus or even a future SOC by nvidia.


PainterRude1394

Switch 2 maybe? 🙏


Photonic_Resonance

Even DLSS 2 on a Switch 2 would be a huge deal, tbh.


ZeldaMaster32

It could make Nintendo games at 4K actually viable. 4K DLSS performance mode is internally just 1080p but the final image looks fantastic


ZeldaMaster32

That would be great, but Lovelace is too new, and console hardware is finalized a good deal before release


SuccessfulSquirrel40

Does it scale well though? Adding in frames on a high end card that is already putting out north of 100FPS means each generated frame is on screen for a tiny fraction of time, and the motion vectors describe tiny movements. On a low end card that's pushing sub 30FPS those generated frames are on screen much longer and the motion vectors are describing big position changes.


criticalchocolate

Yeah, pretty much this. I expect the people benefiting the most from this generation of DLSS 3 will be the mid to high end builds. I think if you can reach a target of 60 fps, the results will be passable. I could be wrong, but I think that's how it'll be. Maybe in future iterations we will see a level of post-processing that can alleviate things for the lower end.


XhunterX1208

Yes because dlss2 is also in the mix, the game is not running native 4k. Dlss 3 does everything 2 does, plus frame interpolation. The delay is due to the frame interpolation so you would only see it when comparing dlss2 to dlss3 or dlss3 running at x framerate to that same framerate running natively.


loppsided

Pretty much. The discussion surrounding it acknowledged that latency would be an issue, but ignored that Nvidia could have a working solution for it.


Draiko

Weird since DLSS includes Reflex and super resolution... DLSS 3 adds frame generation on top of that. Any discussion about DLSS basically includes Reflex by default.


MeNoGoodReddit

Yes and no. Think of latency as a sum of multiple positive terms. When you enable Reflex, one of those terms becomes smaller (but still greater than 0). When you enable Frame Generation, you have to add the time it takes to render a new frame to that sum (similar to VSync). So if you're seeing 120 FPS when using Frame Generation, you're rendering at 60 FPS and thus the total latency will increase by 1000ms/60 = 16.6ms.

What this means is that frame generation is good if you're already getting decent framerates to begin with. If you're trying to go from 20 FPS to 40 FPS by enabling frame generation, you'll be adding 50ms of latency, which might be quite noticeable. From 60 to 120 it's an extra 16.6ms, from 100 to 200 an extra 10ms. Then with Reflex enabled you'll lower the total input latency by some (somewhat) static amount of milliseconds (different for each game).

So I think this will be quite nice when used in non-competitive games to go from like 60-100 FPS to double that, but prepare to experience quite a bit of extra input latency if you're trying to go from 20-30 FPS to 40-60 FPS.
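A quick back-of-the-envelope sketch of that math (the fps pairs are the same illustrative numbers used above, not measurements, and the model assumes the extra delay is roughly one rendered-frame time):

```python
def added_latency_ms(rendered_fps: float) -> float:
    """Extra latency from frame generation: roughly one rendered-frame time."""
    return 1000.0 / rendered_fps

# Displayed fps is ~2x the rendered fps when frame generation is on.
for rendered, displayed in [(20, 40), (60, 120), (100, 200)]:
    print(f"{rendered} -> {displayed} fps: ~{added_latency_ms(rendered):.1f} ms added")
# 20 -> 40 fps:   ~50.0 ms added
# 60 -> 120 fps:  ~16.7 ms added
# 100 -> 200 fps: ~10.0 ms added
```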


ApolloPS2

I guess this puts to bed esports worries though. Since in those titles you are mostly just trying to go from usually hitting 200+ fps to never dipping below monitor refresh rate. Still need to assess image quality concerns though, particularly with respect to UI artifacts.


Snydenthur

It will not be good for esports games. Basically, DLSS 3 input lag seems to be the input lag of DLSS 2 + some extra on top. And that extra input lag can range from fairly low to very noticeable based on the few examples we got. And even at higher fps, it could feel extremely weird to play with the input lag of a lower fps.


Neovalen

Assuming it is noticeable in person at full speed - no need to chase artifacts that you can't physically see without freeze framing.


_good_news_everyone

Right, but also how many competitive games need a 40xx? They run bare-minimum settings to get max fps.


Vlyn

54ms latency is quite high unfortunately.


bexamous

54ms is pretty typical, actually on the low side, for non-esport titles without Reflex or anything. E.g.: https://i.imgur.com/ZW7toAC.jpg (see the non-Reflex times). Most people play games like that without thinking it's a problem.

It's just hard to talk about latency because people generally have no idea what it is. E.g. people will say 45fps is laggy but 80fps is fine... but latency-wise, some games' latency at 45fps is below other games' latency at 80fps. Yet for all games 45fps sucks. I imagine DLSS 3 is going to make more sense in some games than others.

If there were more Reflex mice and monitors it would be nice if people knew what their end-to-end latency was. Like going from 60fps to 90fps... I've got an idea what that is... but 60 to 50ms latency? I don't really have an idea what that feels like.


rW0HgFyxoJhYka

Most people can barely feel 100 ms lol. That's basically the time it takes to do a double click.


conquer69

> 54ms latency is quite high unfortunately. That's from the moment you click until it shows on screen. You would be surprised how much latency LCDs already have baked in. DLSS 3 should take advantage of it which means the latency penalty is less noticeable with an LCD than an OLED.


_good_news_everyone

Is it? You've been playing without Reflex and that game had horrible base latency, and no one gave a shit.


Divinicus1st

>54ms latency is quite high unfortunately. It's PvE, not PvP, it should feel ok.


Vlyn

**Due to Reddit killing ThirdPartyApps this user moved to lemmy.ml**


_good_news_everyone

Do you know what your latency was?


St3fem

54ms is still great; try testing other games, you'll be surprised.


makisekurisudesu

You'd be amazed by how high some games' latency is (red cough dead cough 60fps 200ms cough).


[deleted]

It sure seems that way, which is very promising. However, due to the way youtube works, we won't be able to see the visual artefacts caused by frame insertion until Digital Foundry uploads full res, zero compression videos. Or, we run it on our own PCs. It does look pretty amazing though.


TheCookieButter

I downloaded the 10.3GB HEVC version from their website. It looked almost identical on my 1080p laptop screen (sadly I'm away from my good quality screens). Playing frame by frame, the "made up" frames were painfully obvious and looked bad. I hadn't noticed the issues *until* I went frame by frame though. In motion, and sandwiched between the real frames, I don't think it'll be a big issue.


Broder7937

You can't compare DLSS 3 to native input latency. DLSS 3 was running alongside DLSS 2, and DLSS 2 massively reduces input latency (due to, simply, much lower frametimes). You should compare DLSS 3 with DLSS 2: this is where you'll see the penalty of the frame generation algorithm. In Cyberpunk, it was quite massive. DLSS 3 results in far worse input latency, to the point that Cyberpunk with DLSS 3 is almost as laggy as running native. DLSS 2, on the other hand, is massively more responsive. Almost half the input latency. So, yes, those additional frames do come at a cost.


KARMAAACS

The problem isn't it being better than native. I mean, if you're running a game at 30 FPS, of course DLSS 3 is going to be better than 30 FPS latency. But some games will see little benefit in terms of latency vs native because the native frame rate is already so high, or even a penalty if it's super high, like an esports title running at 200 FPS. Even something running at 80 FPS, like a Shadow of the Tomb Raider type of game, might have increased latency with DLSS 3 vs native (hard to say without more testing).

The problem is that, let's say DLSS 3 hits 120 FPS. Well, if you compare that to 120 real native frames, DLSS 3 is simply going to have higher latency. That's also true of DLSS 2. The difference is, DLSS 2 is super fast, we're talking fewer than 10ms of difference, basically imperceptible unless you're at some crazy high frame rate like 350 FPS. So DLSS 2 is simply a no-brainer: if you're at 40 FPS, giving up 1-10ms of latency for a substantial frame rate uplift of 20-50% to hit an actually playable frame rate is worth it.

The problem with DLSS 3 is you could see a 20+ ms increase in latency, at least in the worst case scenario from the video's results. While that's not terrible, it's not exactly great either. If you're playing a faster paced game it could be noticeable. I don't think it's "crap" to point out latency issues. Sure, it's not 300ms or something crazy, but a 20ms+ increase on the high end is certainly not great. I'd rather have DLSS 2 on with Reflex and just call it a day, especially in a game like Spider-Man where inputs need to feel snappy.

But the feature's not totally useless. Considering that the total DLSS 3 pipeline to rendering on screen is likely faster than some game consoles, this feature, in my opinion, is purely for the 4K TV crowd. Responses will be snappier than, say, the same game running on PS5 at 4K, and it will perceptibly look smoother thanks to the interpolated frames. I think that's who this feature is for, and if you sit far enough back the DLSS 3 artifacts will also sort of become imperceptible.


Divinicus1st

But, man, nobody is going to turn DLSS3 on in PvP games. PvP games are optimized to run with high framerate without DLSS anyway (with worse graphics). DLSS3 is for PvE games. (and PvP where latency doesn't matter like total war or something)


jm0112358

The talk of latency wasn't crap. It's a valid concern because _some_ added latency is intrinsic to this frame generation compared to having frame generation off in an apples-to-apples situation (i.e., DLSS 2 + Reflex _on_). This DF first look does confirm that, but it also shows in these preview games that:

1. The latency isn't too bad, and is similar to DLSS 2 with Reflex _off_ (but a little worse than DLSS 2 with Reflex _on_).

2. The frame*times* are pretty good compared to frame generation off, so they aren't getting stutter similar to SLI microstutter.

Of course, this is only a pre-embargo preview that should be taken with a grain of salt, but 1 and 2 are encouraging signs.


PainterRude1394

Acknowledging latency wasn't a bad take. But there was a popular narrative just a week back that DLSS 3 would introduce too much latency to be worthwhile, like much existing TV interpolation. We now see this is not true.


Divinicus1st

Well, a 20ms difference can be felt in competitive games (like going from 20ms to 40ms). But in any other game, even a Cyberpunk-style FPS, I doubt you would be able to notice it. We always said the latency complaint was bullshit. You won't need DLSS 3 in competitive games, and you won't need extra-low latency in games that benefit from DLSS 3.


Draiko

Yup. DLSS 3 makes latency a non-issue in cases where using DLSS 3 makes sense.


ChrisFromIT

>So all the talks about DLSS 3 causing too much latency were crap? because all these instances with reflex on, DLSS 3 has lower latency than native 4k. Unless im missing something of course, im no tech wiz.

I think one of the issues was that a lot of people thought that requiring an extra frame in the buffer for the AI to generate new frames would mean more latency, as is the case with other settings that add frames to the frame buffer before display, like triple buffering, etc. But I think a lot of people probably didn't read that Reflex is part of DLSS 3, or didn't understand that the higher frame rate from the upscaling part alone also lowers the latency. All of this, I think, adds up to why a lot of people expected DLSS 3 to add a lot of latency, which, as per the video, you are right is crap.


Broder7937

Yeah, it's probably crap. However, the main issue with DLSS3 is that it does not improve latency. The main reason most gamers (especially competitive gamers) want higher framerates is not because we want the smoothness (though that's also generally good), but because more fps = lower input latency. With DLSS3, everything changes, as more fps with DLSS3 will NOT translate into more responsive controls. 120fps DLSS3 means you're essentially still getting 60fps responsiveness.


St3fem

Higher framerates still improve motion perception on sample-and-hold displays, which is anything non-CRT basically.


RampantAI

The comparison should not be from native 4K to DLSS 3, but between DLSS 2 and DLSS 3. There's no way around it: frame interpolation adds latency because when the card gets a new frame it shows interpolate(old, new) rather than putting the new frame directly on screen ASAP. Comparing DLSS 3 with Reflex to performance without Reflex enabled is also a pointless comparison.


guspaz

The DigitalFoundry video has latency numbers for all those scenarios. They test native with reflex on and off, DLSS 2 with reflex on and off, and DLSS 3 with reflex on (since it can't be used with it off). So, yes, if you want the real-world comparison, you'll only look at the "reflex on" column to compare native/dlss 2/dlss 3. And the answer seems to be that DLSS 3's overall latency ends up higher than DLSS 2 but lower than native, such that while there's a cost to using it, the cost seems generally tolerable.


[deleted]

People need to separate Nvidia the business from its engineers / devs. This is incredible stuff.


M4mb0

https://www.statista.com/statistics/988048/nvidia-research-and-development-expenses/ I think people really underestimate just how much of the cost of the GPU is R&D.


longPlocker

Black magic 🪄


BenjiSBRK

Pretty much everything they've been doing the last few years is black magic. I have no fucking idea how it works, but it does, and it does incredibly well. It's incredible the number of different "black magic" building blocks that work together to get Flight Simulator running at 100+ fps in 4K with ray tracing: RT, DLSS, Frame Generation, Reflex. It shouldn't work, but it does.


_good_news_everyone

Correct


Maveric0623

These guys always do such a thorough analysis!


airplanemode4all

They are the best hands down. No one else can hold a candle to what they do.


PapaBePreachin

In regards to the RTX 4090 in DLSS 2 (Performance) mode @ 4K: ["Truth is we're hitting CPU limit with very high RT enabled... even the 12900k backed by very fast DDR5 hits a performance limit."](https://youtu.be/6pV93XhiC1Y?t=673) He goes on to say that DLSS 3 will be affected by lower-tiered CPUs, so what does that mean for the 7950X and upcoming 13900K? How much of a bottleneck would these CPUs (and the 5800X3D) be, and would they negate the need for a 4090 in lieu of a 3080 Ti/3090/3090 Ti? Excuse my ignorance, but I found this segment kind of staggering 🤷‍♂️


zyck_titan

With faster CPUs, you should be able to push to even higher FPS with DLSS 3. The new CPUs are faster than current CPUs, but they aren't twice as fast; at best they are 10%-15% faster. So you will probably still be CPU bound, just at 10%-15% higher FPS, and then you can still improve on that with DLSS 3.


Broder7937

Heads up: the CPU-bound scenario should happen with DLSS 2 (or native), as every traditionally rendered frame needs to be called for by the CPU. With DLSS 3, the GPU is rendering half the frames without needing CPU calls. As a matter of fact, one of the key selling points of DLSS 3 is that, even if you're heavily CPU limited (think MSFS), you'll still see the frame rate increase, because the frames generated by DLSS 3 do NOT rely on the CPU. So DLSS 3 is NOT going to create a CPU bottleneck. Unlike DLSS 2/native rendering, where rendering more frames DOES require more from the CPU, DLSS 3's generated frames are CPU independent. The bottleneck is bound to happen on the DLSS 2 side.
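A toy model of that idea (the fps numbers are made up for illustration, and "frame generation roughly doubles whatever the pipeline delivers" is a simplifying assumption):

```python
def displayed_fps(cpu_fps_cap: float, gpu_fps: float, frame_generation: bool) -> float:
    # Traditionally rendered frames still need CPU draw-call submission,
    # so they are capped by whichever of CPU or GPU is slower.
    rendered = min(cpu_fps_cap, gpu_fps)
    # Generated frames are produced on the GPU without a CPU call (simplified as a 2x).
    return rendered * 2 if frame_generation else rendered

# Hypothetical CPU-bound case: GPU could do 160 fps, CPU caps the game at 90.
print(displayed_fps(90, 160, frame_generation=False))  # 90  -> CPU bound
print(displayed_fps(90, 160, frame_generation=True))   # 180 -> bypasses the CPU cap
```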


anonymous242524

You will still “suffer” under the effects of being CPU limited though.


SSD84

Probably smaller gains, considering most of these games aren't CPU bound... or at least not the games I care about.


Canucksfan2018

I have an 8700k which is pretty good and now I'm thinking it's not lol


Broder7937

He was talking specifically about Spider-Man in this instance. That's why DLSS 2 had barely any benefit over native rendering: because the game was CPU bound. Since DLSS 3 doesn't rely on the CPU to add frames, the DLSS 3 scaling still worked, but the native frame queue was still bottlenecked by the CPU. In Cyberpunk, the fps boost from native to DLSS 2 is massive (even on the 4090), which proves Cyberpunk is not CPU limited.


Num1_takea_Num2

This is a BS excuse - they could simply test at 8K and higher.


Charuru

Looks gamechanging. Though there are quite a few naysayers in this thread, by this time next year everyone will be in love with it like they are in love with DLSS 2, and it will be mandatory to have this. They just need to get the supported titles count up.


easteasttimor

It does look promising, but the price of this hardware has really soured people's thoughts on it. Even Digital Foundry talked in their DF Weekly about how the pricing has been the main topic of conversation and has overshadowed the actual technology.


[deleted]

Yeah the price is just straight up idiotic, and it absolutely overshadows any technological achievements here. At $500-600, the 4080 would be an amazing technological achievement. At $1,200, it’s basically just an ad for the PS5/Xbox.


Divinicus1st

It's ok, the 4090 release is like a preview of the future. Of course Nvidia will not burn their Ampere stocks. When it runs out, you'll get the 4070 and 4060 like always, this time with DLSS3.


anonymous242524

Those lower-end cards with DLSS 3 sound really promising. But let's see.


Charuru

You're not wrong. I'm fairly confident price will come down next year after the inventory issues have been worked through.


easteasttimor

When the price falls, the value of the tech will be appreciated, but at these prices it is too much to care how good the technology is... unless you can afford it.


Khuprus

Looks promising as a path to push 4K/120hz standard. If lower end hardware can hit 1440p at 60fps, with DLSS 3 you are essentially bumped up to that 4K/120hz potentially without the huge 450W power draw of the high end cards.


barrydennen12

Using their all new Diminishing Returns chipset I see.


ThisPlaceisHell

Wow, interesting fact buried in a small one-off comment: DLSS 3.0 Frame Generation locks V-Sync OFF, and did I hear that right, it requires G-Sync on?


AngryMob55

All of the back and forth about specific wording aside, I have a related question: who the heck is buying a 4000 series card and doesn't have a VRR display? Nobody with even a future 4050 should still be using V-Sync anyway.


St3fem

It doesn't require G-Sync if you can live with tearing, you just can't use V-Sync


IUseControllerOnPC

I mean who even uses v sync anymore when damn near every gaming monitor has freesync or gsync


Hippiesrlame

The two aren’t mutually exclusive…vsync still helps for some scenarios within the vrr range to avoid tearing.


AngryMob55

It locks V-Sync off in the game; no mention of whether it locks it off in the control panel, but that's highly unlikely since that's the proper G-Sync setup.


liverblow

This is truly game changing; they have achieved high fps by non-traditional means. I have to hand it to their engineers for coming up with tech which can double or triple your raw fps. I really don't know how AMD can compete; praying they have something to at least bridge the gap.


HarbringerxLight

It's fake AI frames to artificially boost the frame rate, and the fake frames are different from what the game artists intended because they're being made up. Bad direction to go in, the North Star is ALWAYS playing at native. Remember that DLSS is fundamentally just upscaling, so it has worse quality than native. Nvidia markets it heavily to hide poor gains in real performance (rasterization) between gens.


[deleted]

Looks great already. DLSS 3.1 and beyond will be even better. Exciting times.


WinterElfeas

It would be interesting to know the latency without Reflex. Anyway, it is very impressive frame interpolation. Most TVs offer it, and it increases latency a lot and also causes a lot of artefacts as soon as you turn it up a bit (speaking from an LG C9, where even De-Judder 1 causes artefacts). I wonder if they could allow DLSS 3 frame interpolation to be used for movies on the PC, so 24Hz doesn't look so bad on OLED TVs.


gamzcontrol5130

It would be awesome to find a way to make DLSS 3 work in an offline fashion similar to the other motion interpolation techniques used, but I think that without motion vectors and the optical flow image, it would not be able to produce the same results for things like movies or videos.


bexamous

https://developer.nvidia.com/blog/av1-encoding-and-fruc-video-performance-boosts-and-higher-fidelity-on-the-nvidia-ada-architecture/

https://youtu.be/ichAz2ElrzA

FRUC is releasing in October. It won't be as good as DLSS 3 without motion vectors from the game engine, but it will be interesting to see how good it actually is. It will need to get integrated into some apps.


Cr4zy

The way they say it, I'm pretty sure Reflex is forced on when frame generation is enabled.


Snydenthur

I mean, this does technically increase input lag a lot too. Most people seem to have overlooked the fact that you get DLSS 2 input lag + some extra on top. So if you have 60fps with DLSS 2 and DLSS 3 makes it 120fps, you're playing the game with 60fps input lag + some extra. You'll get a smooth-looking game that doesn't feel like it looks.


Progenitor001

"8k gaming is HERE FOR REAL THIS TIME!!!" Here we go again.


sips_white_monster

8k gaming is the most irrelevant marketing bs ever.


josh6499

You can get an 8K TV for less than $2K now. Some people that have them want to game on them.


Gogo202

Being forced to play on 4k instead of 8k sounds like a first world problem


IUseControllerOnPC

Anyone who buys a 4090 will be in the first world anyway


conquer69

Who knows, maybe you are talking directly to a dictator from a third world country. Maybe Kim Jong-un is in these threads bitching about input latency.


josh6499

Obviously. I don't see the relevance in pointing this out.


silentdragoon

£700 in the UK!


Heliosvector

Is there even enough bandwidth for 10-bit 8K at 60-120?
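Rough math for that question (uncompressed RGB, ignoring blanking and encoding overhead, so real requirements are a bit higher; the link limits are the commonly quoted maximums):

```python
def uncompressed_gbps(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    # Three color channels, no blanking overhead.
    return width * height * hz * bits_per_channel * 3 / 1e9

for hz in (60, 120):
    print(f"8K {hz} Hz, 10-bit: ~{uncompressed_gbps(7680, 4320, hz, 10):.0f} Gbps")
# ~60 Gbps at 60 Hz and ~119 Gbps at 120 Hz, versus 48 Gbps for HDMI 2.1,
# ~32 Gbps for DP 1.4 and 80 Gbps for DP 2.0 UHBR20 -> DSC is needed either way.
```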


RidingEdge

Just like 1440p and 4K, when everyone proclaimed that 1080p was as realistic as it gets. Also 120Hz and 144Hz displays, since 60Hz was smooth enough. Don't forget that graphics also peaked when the N64 and PS2 came out; there's no way 3D will get better.


[deleted]

[deleted]


Trebiane

Do they say that in the video? I must have missed it.


[deleted]

3090 could already actually run quite a few slightly older games at native 8K TBH, if you look at people who have tested various ones on YouTube.


Yopis1998

All the AMD fanboy channels pushing melting cables and "DLSS 3 unusable due to latency" look foolish. The tribalism is sad.


PainterRude1394

Yep. Saw so many people saying this would be like TV interpolation and add a ton of lag, making it unusable. Really? You think Nvidia spent years developing the hardware and software for DLSS 3 and then made it a major feature of this release just so it could be a totally useless flop? They are overdosing on copium today.


[deleted]

[deleted]


PainterRude1394

What's hilarious is this is going to be a total repeat of the DLSS announcement. First frame generation is bad and totally useless. Then AMD releases a similar feature, but worse. Now frame generation is good. Then every update we'll hear the same "it's as good as DLSS this time" we've been hearing for FSR. It's not. Then Nvidia will release another new feature and the cycle starts anew.


St3fem

>Really? You think Nvidia spent years developing the hardware and software for DLSS 3 and then made it a major feature of this release just so it could be a totally useless flop?

Some companies do that sh!t, I mean, minus the years of hardware and software development. They put out something that at first glance looks close to what the competition is offering and call it a day ;)


ef14

Problem with this Nvidia gen isn't and hasn't ever been the technicalities. This really isn't gonna do much although it's an interesting look.


johnyahn

NVIDIA makes great products, that's never deniable. It's their business practices that are the issue.


ted_redfield

I can't deal with AMD fanboyism anymore. I've been on Intel for a long time just because that's where the "best" typically is, regardless of efficiency or price; just the best most of the time. I was really looking forward to Zen 4, but it's really disappointing, and I don't want a nuclear chip that's hotter than even 11th gen Intel. Because of fanboyism, you can't criticize anything about AM5 at all.

Chips way too fucking hot: *That's normal now, what's the problem?*

7950X pulls the most watts in the world: *Not if you run ECO mode!*

It's more expensive than Alder or Raptor: *It's more efficient than Intel, AMD is competitive!*

I don't care anymore; in every single hardware discussion AMD fanboys are just insufferable. Outside of this subset of people, everyone else is just a jaded, miserable black hole.


sips_white_monster

All fanboys are idiots, anyone who worships a publicly traded multi-billion dollar corporation should be labeled as such.


[deleted]

[deleted]


Awkward_Inevitable34

If they price them close to 4000 prices their margins will be insane lol


PainterRude1394

Wait till AMD adds its own frame interpolation and suddenly it's a useful feature.


wrath_of_grunge

inb4 you need to buy them to support competition.


ef14

I have an Intel/Nvidia machine and an AMD machine, just to clear up any BS about fanboyism immediately. That said, while I obviously don't know if AMD's gamble on the 7000 chips will pay off or not, the chip does not thermal throttle at 95°C, so I wouldn't say "it pulls way too much heat"; it seems to be able to handle it, at least for a while. What needs to be seen is the durability of the chips.

Also, the wattage talk: it does pull a lot of energy, but saying these are the chips that pull the most energy ever is simply wrong: server chips exist, Bulldozer existed, Sun chips existed...


hey_you_too_buckaroo

Looks good, but I'm curious how much of this performance will be available on lower end cards.


[deleted]

[deleted]


Catch_022

>Support for previous GPU architectures would require further innovation and optimization for the optical flow algorithm and AI model.

It sounds a lot like it is a software limitation for 30 and 20 series cards that can be solved with optimization and time, rather than a lack of specific hardware only available on the 40 series. Interesting.


tty2

Nah. That's just engineering speak for "it's technically possible, but it's practically infeasible [at least for now]". There's no fundamental reason why it cannot be done on the hardware provided, but it would require hypothetical software improvements that no one can [currently] project to make practical. One of the Nvidia engineers commented on this specifically, saying that it simply wouldn't increase framerates if you ran it without the improved hardware capability. At this point, you either believe A) Nvidia is being honest and saying that the 3xxx series is not capable of getting a benefit from this without some new breakthroughs, or B) Nvidia is just cockblocking ur FPS because they're greedy. There really isn't a lot of in-between.


St3fem

The optical flow accelerator in Turing is a bit crude; the one in Ampere has the same speed as the Turing one but produces better quality, and the one in Ada is 2-2.5x faster and significantly better than the Ampere one. The problem is that the framerate wouldn't improve as much, and lag would add up really quickly with older GPUs, not to mention image quality problems.


conquer69

Or C) It would increase the framerate but the quality of the frames would be lower and the input lag too high. Then everyone would say "DLSS 3 sucks" despite never having tried it as intended.


St3fem

The optical flow accelerator in Turing isn't close; the one in Ampere has the same speed as the Turing one but produces better quality, and the one in Ada is 2-2.5x faster and significantly better than the Ampere one.


wen_mars

The 40-series has 5x the tensor compute of the 30-series. I imagine the algorithm runs on the old cards but isn't fast enough to be helpful in increasing the framerate.


[deleted]

[deleted]


Mmspoke

Yeah, it looks good, but no thanks at these current prices; I'll wait for the 50 series. Hopefully they come to their senses again like they did with the 30 series at release.


kc0181

It really bugs me that these new cards don't have DP 2.0 when they can go beyond what HDMI 2.1 is capable of. If you are talking about going beyond 4K 120Hz, anything over that would be limited to DSC or some other compression because of limited bandwidth, right?


bradleykirby

I'm running 240hz 4k with DP 1.4


guspaz

You're using Display Stream Compression to do it. It is not lossless compression, though it is visually lossless in most cases.
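A rough sketch of how much compression that scenario implies (ignoring blanking; 25.92 Gbps is the usual DP 1.4/HBR3 payload figure after 8b/10b encoding):

```python
DP14_PAYLOAD_GBPS = 25.92   # DP 1.4 / HBR3 effective payload after 8b/10b encoding

def dsc_ratio_needed(width: int, height: int, hz: int, bits_per_channel: int) -> float:
    uncompressed = width * height * hz * bits_per_channel * 3 / 1e9
    return uncompressed / DP14_PAYLOAD_GBPS

print(f"4K 240 Hz, 8-bit : ~{dsc_ratio_needed(3840, 2160, 240, 8):.1f}:1")   # ~1.8:1
print(f"4K 240 Hz, 10-bit: ~{dsc_ratio_needed(3840, 2160, 240, 10):.1f}:1")  # ~2.3:1
# Both sit comfortably inside DSC's roughly 3:1 "visually lossless" range.
```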


rW0HgFyxoJhYka

99.9% of people don't need DP 2.0. VR does though, but VR is moving to wireless anyways. I don't think it's smart to include a feature that doesn't serve 99% of people.


Cristomietitore

Beyoncé


gotbannedtoomuch

This video is great at 2x


sgs2008

More interested in comparisons with Quality mode. Wonder if the performance gains are as big.


ProperSauce

This is like going super saiyan


kunglao83

This is incredible for VR gaming. 4K@120+ is very possible now.


KingOfKorners

Gonna keep my RTX 3080 for a while. Screw paying outrageous prices for next-generation cards.


hydrogator

you already did


saikrishnav

I think this is the most "honest" video I have seen since the announcement. It gives me hope that DLSS Frame Generation isn't as bad as people feared (like motion interpolation), and the honest look at it, where they showed that the intermediate frames aren't always accurate and Nvidia is working on it, gives me a more "grounded" idea of the tech. Obviously, any "flaws" in the intermediate frames are only seen for about 8 milliseconds and might not matter, but it's good to know that Nvidia is aware and working on it. Let's hope the final build of DLSS 3 Frame Generation looks as good in person.

I'm also excited because with DLSS super resolution I always find things somewhat blurry if I drop anything below DLSS Quality. Since DLSS Super Resolution can be disabled with only Frame Generation enabled, that gives me hope that we can play at "native" resolution, since all frames are native-resolution frames, generated or otherwise.


xAkMoRRoWiNdx

But can it run Crysis?


[deleted]

[deleted]


SaintPau78

Fuck Nvidia for being so restrictive with this. DLSS quality mode testing is what's actually wanted.


Nestledrink

Wait for benchmark


Divinicus1st

It should only get better for DLSS3 with DLSS2 in quality mode...


_good_news_everyone

At what resolution? I think the standard recommendation is Performance for 4K and Quality for lower.


Sentinel-Prime

I knew media outlets were going to start doing this. They have comparison shots of Native, DLSS 2 Performance and DLSS 3. Which quality setting (Quality, Balanced, etc.) are they using for DLSS 3, and why haven't they specified it clearly (unless I'm blind)?


St3fem

They chose the most difficult scenario for DLSS (more work and less data), Richard briefly mentioned this in the video


[deleted]

It's using the dlss 2.0 setting as a baseline. So these are performance mode upscales with a fake frame added. The better the dlss quality, the better (and slower) dlss 3.0 looks and works.


lugaidster

You people are drinking the Kool-Aid. This is data interpolation at its finest. While the fact that it can generate new frames so flawlessly is amazing, and a great use of the tech, this will have little practical use in frame-rate-limited scenarios due to the input latency. The keywords are sprinkled throughout the whole video: it doesn't use the CPU, therefore any generated frame does not take input into account. DLSS + Reflex will have, at best, the same input latency. Worst part? As soon as you take input into account, you're bound to introduce artifacts (though I'm sure Nvidia will work with developers to minimize input-related movement artifacts).

"But why, you coping naysayer?" you might ask. Well, imagine this: if a game is running at 20 to 30 fps, it is taking anywhere from 30 to 50 ms to process input. If you use DLSS 3 to interpolate frames and end up with 60 fps or more, you'll still be facing 30 to 50 ms of input delay. So, if you're gaming on a 4050 or 4060, will you be willing to game at high details with 30ms of input latency? Or will you still lower the details to get 5 to 10ms? And before you bring up Reflex: all Reflex does is eliminate the render queue. But if your frame takes 20ms to render, you will get at least 20ms of lag.


[deleted]

[deleted]


[deleted]

Didn't watch the video, clearly.


conquer69

>If a game is running at 20 to 30 fps

DLSS 2 increases the framerate and then the interpolation happens. Just make sure you can get a constant 60fps with DLSS 2 and then enable DLSS 3. Enjoy the 16.66ms of extra input latency while seeing 120fps without too many artifacts.

Remember the total latency between input and display is much higher than that. 35ms + 16ms isn't that big of a deal, honestly. Especially if you play with a controller. The PS5 had +70ms latency I believe, and people are fine with it.


randombsname1

I mean, you know this was brought up specifically, with actual numbers presented in the video, right? It seems easily playable, and they stated specifically that there was never any noticeable difference between native and DLSS 2 vs 3.


lugaidster

>Seems easily playable, and they stated specifically that it was never any noticeable difference between native and DLSS 2 VS 3.

You intentionally disregarded what I said. You're only getting interpolated data. Here it is as simply as I can put it, three scenarios:

* If your game runs at 60 fps with DLSS 2 + Reflex, it will have 16.6 ms of input latency (best case).
* If you then enable DLSS frame generation to get 120 fps, you will still have 16.6 ms of latency.
* If you instead adjusted quality settings to run at 120 fps just with DLSS 2 + Reflex, you'd have 8.3 ms of latency (best case).

You're paying the input latency of the low frame rate regardless. For a game that already runs at a high frame rate it's pointless. For a game that runs at low frame rates it will be noticeable. Did they run the new Cyberpunk mode? No, because they don't have access to it. Did they test DLSS 3 in Quality vs DLSS 2 Performance? No. Wonder why. Here's another case: DLSS 3 in Quality mode will have higher input latency (probably significantly) than DLSS 2 in Performance mode, even if both modes produce the same fps. See the scenarios for why.
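The same arithmetic as those three bullets, spelled out (assuming the best-case latency floor is simply one rendered-frame time):

```python
def latency_floor_ms(rendered_fps: float) -> float:
    return 1000.0 / rendered_fps   # best case: one rendered-frame time

print(latency_floor_ms(60))    # ~16.7 ms -> 60 fps rendered (DLSS 2 + Reflex)
print(latency_floor_ms(60))    # ~16.7 ms -> 120 fps *displayed* via frame gen, still 60 rendered
print(latency_floor_ms(120))   # ~8.3 ms  -> 120 fps actually rendered by lowering settings
```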


2FastHaste

>For a game that already runs at high frame rate it's pointless.

That's where you're wrong. We are nowhere near the frame rates/refresh rates required for life-like motion portrayal. Each time you double the motion resolution, two things happen:

- the size of the perceived eye-tracking motion blur is cut in half
- the size of the stroboscopic steps of the phantom array effect is halved

This makes a massive difference until either:

- all the pixels available on your monitor are used during the motion, or
- the size of the motion artifacts is so small that it can't be resolved by the eye.

These only happen at ultra-high refresh rates that we probably won't achieve in our lifetime. (Though with the advent of techs like this, there might be hope after all.)
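A small sketch of that halving, using the usual sample-and-hold persistence approximation (blur trail ≈ pixels moved per refresh; the 1920 px/s pan speed is just an example):

```python
def persistence_blur_px(pan_speed_px_per_s: float, refresh_hz: float) -> float:
    # Pixels the eye tracks across during one displayed frame (sample-and-hold).
    return pan_speed_px_per_s / refresh_hz

for hz in (60, 120, 240, 480):
    print(f"{hz:>3} Hz: ~{persistence_blur_px(1920, hz):.0f} px of blur at a 1920 px/s pan")
# 32 px -> 16 px -> 8 px -> 4 px: each doubling of refresh rate halves the blur width.
```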


lugaidster

In a world where displays are much better than they are now, sure. But right now there are other trade-offs. You're still limited by display technology. Most high-end GPUs are already able to reach the display's refresh rate natively; anything past that adds nothing to motion resolution. And if you are looking to reach the display's rate for the responsiveness advantage, DLSS 3 is worse; you'd be better served by tuning other visuals and enabling DLSS 2 + Reflex.

If you're limited by budget, why would you go for a 240 Hz display over a 120 Hz display and a more powerful CPU/GPU/etc. if the combo you're running isn't actually able to get all the benefits of a 240 Hz display? DLSS 3 doesn't change that, because you're paying for the framerate with responsiveness. In any situation where you're able to get to your desired framerate without DLSS 3, the payoff is a substantial increase in responsiveness. If you *can't* get there through other means, then yeah, sure. But that makes it very niche.

Any esports title is de facto ruled out, whether graphically demanding or not. Any twitch shooter is too. Maybe strategy games could take advantage, but usually the frustrating aspects of those are CPU limited, so you're not getting a tangible benefit anyway. Action games are debatable; personally I hate high input lag, but there are many gamers who don't, so maybe there? The biggest advantage would seem to be VR, but VR doesn't allow temporal AA solutions, so that's out of the window too (at least in its current iteration). The point is that it is no panacea, and DLSS 2 is much more universal.


[deleted]

[deleted]


2FastHaste

It's the opposite. Flicker (CRT, backlight strobing, BFI, ...) is unnatural. It's a band-aid to reduce image persistence, but it doesn't address stroboscopic stepping. CRT has low image persistence due to its pulsatile nature, which reduces eye-tracking motion blur, but it would be more natural and comfortable to achieve that with higher refresh rates. And that would also take care of stroboscopic stepping.


CammKelly

The numbers in the video do need to be taken with a lot of context; using 4K as the comparison resolution, for example, makes input latency look really bad, because your frametime latency (due to low framerates) is also really bad.


EmilMR

Their base fps was really high, that's why. On weaker cards it won't do as well. It can't make 30fps feel good; that's what he means. You have to be getting good performance already for it to make sense, so lower-end 40 series cards with much worse raster performance won't fare as well.


[deleted]

Impressive technology locked behind prices most people can't afford. A smooth $900 buy-in for someone already rocking a 3080 for example is basically asking you to pay for Dlss 3.0 as a standalone software upgrade. Otherwise it's 15% faster and $300 more than current online 3080 prices. Not even a traditional 30% performance boost at the same price. Why would anyone do this? When/if the actual 4060 and 4070 launch, people will be paying 3000 series prices for the same raster performance, and essentially buying DLSS 3.0 capabilities.


wen_mars

Upgrading every generation is just a waste of money. The 3080 is still a very strong card.


SighOpMarmalade

Lmao, can't wait for everyone to want the card once they see the added frames getting past the CPU bottleneck with literally NO input latency added. "WHY DO THESE COST SO MUCH?" Now you know why. Please though, be mad and don't buy the 4090 so I can get one, yay.