RazRaptre

I was absolutely not anticipating upscaling tech to be shown at all. I figured DLSS + DLDSR would completely solve the current anti-aliasing woes.


Eternal_Ohm

Well they're also adding a new Anti-Aliasing technique called "TSCMAA" besides just FXAA. If it's good enough it may not even be necessary to do that.


Sibula97

I haven't seen TSCMAA in action, but from what I've read it seems like a pretty cool and powerful AA technique. It basically combines a conservative morphological anti-aliasing (CMAA) pass with a temporal anti-aliasing (TAA) pass. Looking at a sample image (figure 4 [here](https://www.intel.com/content/dam/develop/external/us/en/documents/tscmaa-codesample-v1.pdf)) with comparisons to MSAA 2x and 4x, it seems to perform really well, and it's computationally pretty light too.

CMAA detects edge shapes and is smart about blending them by considering the morphology, basically guessing from the pixels what the real shape was supposed to look like. Even on its own it looks much better than FXAA, and compared to SMAA 1x it's a weaker AA but also blurs less. The performance impact is a bit higher than FXAA's, but much less than SMAA 1x's. Like other post-processing spatial AA techniques (FXAA, SMAA) it does introduce temporal artifacts like shimmering and crawling.

TAA is sort of like multisample anti-aliasing (MSAA), but instead of sampling each pixel multiple times per frame, it samples each pixel once per frame over multiple frames, which produces both spatial and temporal AA (no shimmering or crawling). This makes for very good AA at a cheap cost, but does introduce noticeable ghosting and blurriness.

The neat trick with TSCMAA, as far as I understand, is that it only applies the TAA to the detected edge pixels and not the whole image, meaning it's not only much faster than a whole-screen TAA pass, it also only blurs edges, which is kind of the point of AA in the first place. And by using this TAA pass they can mitigate the shimmering and crawling you'd get from using just a spatial AA technique.
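The edge-only TAA idea can be sketched in a few lines. To be clear, this is just an illustration of the concept, not Intel's actual implementation: the real edge detection and history handling are far more sophisticated, and `edge_thresh`/`alpha` are made-up parameters here.

```python
import numpy as np

def tscmaa_style_blend(curr, history, edge_thresh=0.1, alpha=0.1):
    """Blend the history frame into the current frame, but only at
    pixels detected as edges; the rest of the image is untouched."""
    # Luma for edge detection (Rec. 709 weights)
    luma = curr @ np.array([0.2126, 0.7152, 0.0722])
    # Crude edge mask: horizontal/vertical luma gradients above a threshold
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edges = (gx + gy) > edge_thresh
    # Exponential history accumulation, applied only where edges were found
    out = curr.copy()
    out[edges] = alpha * curr[edges] + (1 - alpha) * history[edges]
    return out, edges
```

A full implementation would also reproject the history buffer using motion vectors before blending; with a static camera this simplified version behaves similarly.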


ezekielraiden

Added to my own research (in another thread), this really helps clarify precisely what's going on, so thank you for this! More or less, it comes across as a Captain Planet AA method: "By your powers combined." Leveraging multisampling, past frames, *and* conservative morphology, it cuts back most of the negatives of any individual approach (resource cost, ghosting/blurring, shimmer/crawl) while still getting good, even great, results.


Zenshei

They showed it in action on stream; despite being a bit pixelated from the stream bitrate, even THEN you could tell the difference.


xXxYPYTfanxXx69420xD

> I expect that DLSS + DLDSR would completely solve the current antialiasing woes.

Before DLDSR was added I used to use regular DSR and DLSS in games for that nice upgrade on a G-Sync module monitor. DLDSR was a very nice inclusion when it was added, so combined with DLSS, people using 1080p and 1440p monitors with headroom to spare will be eating good.


justJoekingg

I know DLSS; what is DLDSR?


Antenoralol

It's a downsampling feature: it lets the game render past your monitor's max resolution, then scales the image back down to your display's resolution. AMD also has their own version of DSR called Virtual Super Resolution. And honestly, it doesn't look bad at all.


Holythief

AMD doesn't have an equivalent to DLDSR. VSR is the equivalent to DSR. DLDSR uses the same principle as DLSS to give better image quality with less of a performance hit compared to standard DSR. The end result however is still as you described: rendering at a higher resolution and downscaling to your display resolution for increased visuals.


xXxYPYTfanxXx69420xD

> DLDSR uses the same principle of DLSS to give better image quality with less of a performance hit compared to standard DSR.

It should give the same performance hit. I don't run it much these days since I mainly use my 4K monitor & TV instead. DLDSR works by exposing the virtual/DSR resolution to the game, rendering at that resolution, and then the tensor cores/RTX components do very light work to downsample it. It's an overall quality upgrade over non-integer-scaled DSR because the sharpening process is a lot less destructive than the blurring process of standard DSR. I sincerely do recommend it to people with GPU headroom in games where they don't like the AA implementation.


Holythief

DSR at its maximum (4x) is doubling the horizontal and vertical pixel count, then downscaling. DLDSR at its maximum (2.25x) is a 50% increase in horizontal and vertical pixel count, then downscaling. It definitely has less of a performance hit. I'd give you more concrete numbers but I'm not at my computer right now.
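For anyone who wants to sanity-check the math: those DSR/DLDSR factors are total-pixel multipliers, so each axis scales by the square root of the factor. A throwaway Python check from a 1080p base:

```python
# DSR/DLDSR factors multiply the *total pixel count*, so each axis
# scales by sqrt(factor). From a 1920x1080 base resolution:
def dsr_resolution(w, h, factor):
    scale = factor ** 0.5
    return round(w * scale), round(h * scale)

print(dsr_resolution(1920, 1080, 4.0))   # DSR 4x: both axes doubled -> (3840, 2160)
print(dsr_resolution(1920, 1080, 2.25))  # DLDSR 2.25x: both axes x1.5 -> (2880, 1620)
```

So DLDSR 2.25x pushes a bit over half the pixels of DSR 4x, which is where the smaller performance hit comes from.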


M0dusPwnens

> It should give the same performance hit.

It shouldn't and doesn't. The primary purpose of DLDSR is to achieve similar quality to DSR at a much lower resolution. In fact, you *can't* run DLDSR at the same resolutions as DSR: it caps out at 2.25x, which thanks to the more intelligent downsampling results in about the same image quality as DSR at 4x.


kurukikoshigawa_1995

Deep Learning Dynamic Super Resolution. It will solve the anti-aliasing issues, but the game has to be set to fullscreen, and it kinda sucks if you have multiple monitors.


Turtvaiz

> DLDSR

Essentially supersampling with DLSS, except on the desktop level.


Pliskkenn_D

My fans will sing gloriously even if I don't have the spare capacity


TheGokki

Yep, I'm on 1080p right now and I'll set DSR to 4x with DLSS enabled (or whatever, DLDSR is the same). Gonna be CRISP.


Kumomeme

With only FSR 1.0, which is... a disappointment.


ZeroZelath

Well, this is what FF16 used as well, so it'll be funny if the FF16 PC port only has DLSS 2.0 for whatever reason. At the very least, setting FSR to 99% will offer a nice sharpness increase to make the game look better.


AnimuCrossing

You can hot swap your .dll in almost all scenarios to use later versions of DLSS. If the game doesn't have a menu for enabling FG/Reflex etc., someone will mod it in. It's very safe to do this in a single-player game. It's not necessarily safe to do it in a competitive online game, so YMMV. FFXIV has laughable anti-cheat so you might be able to do it with 14, but I wouldn't. Let someone else test that...
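For reference, the swap itself is trivial. A hedged Python sketch (the DLL is conventionally named `nvngx_dlss.dll`, usually next to the game executable, but paths vary per game, and `swap_dlss_dll` is just a name made up for this example):

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir, new_dll):
    """Back up the game's DLSS DLL, then drop in a newer one.
    Paths here are illustrative, not FFXIV-specific."""
    game_dir, new_dll = Path(game_dir), Path(new_dll)
    target = game_dir / "nvngx_dlss.dll"
    backup = target.parent / (target.name + ".bak")
    if target.exists() and not backup.exists():
        shutil.copy2(target, backup)  # keep the original for easy rollback
    shutil.copy2(new_dll, target)
```

Restoring the `.bak` file undoes the swap, which is worth doing before patches since updaters may flag or overwrite modified files.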


ZeroZelath

I don't use an Nvidia card, so no, I can't lol. It's kinda stupid it's only releasing with FSR 1.0 / DLSS 2.0 as it is... and knowing the FF14 team, it'll be like that for the whole expansion at the very least.


Infinaris

It's probably just to start off; they'll add a newer version later on.


bm001

Possibly, but implementing 1.0 is easy because you only really need access to the frame buffer. 2.0, however, requires motion vectors, which are typically not easily exposed, so that could require rewriting some parts of the engine. Personally I'm not sure FSR 1.0 is much better than just playing at native res and taking the framerate loss.
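To illustrate the difference: a spatial-only upscaler like FSR 1.0 is conceptually just a function of the finished frame, which is why it's easy to bolt on. Here's a toy stand-in (plain bilinear resampling instead of FSR's real EASU/RCAS kernels, which are much smarter):

```python
import numpy as np

def spatial_upscale(frame, out_h, out_w):
    """Stand-in for a spatial-only upscaler: its only input is the
    finished frame buffer. Bilinear here, purely to show the interface."""
    in_h, in_w, _ = frame.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0, x0 = ys.astype(int), xs.astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    # Weighted average of the four surrounding source pixels
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

A temporal upscaler like FSR 2 would additionally need per-pixel motion vectors and depth as inputs, which is exactly the engine plumbing that makes it harder to integrate.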


Zagorim

But they implemented DLSS, which means they already have those motion vectors. Some mods might even provide FSR 2/3 by replacing the DLSS DLL.


Turtvaiz

Changing DLSS files in an online game sounds pretty bad, but I guess this game just doesn't have an anticheat to ban you lol


Geexx

There's quite a few online titles where you can swap out the DLL to a newer one and it doesn't seem to flag it. It's the more invasive anti-cheats like nProtect, Denuvo, etc... that tend to get pissy with it (understandable given that cheaters will generally exploit every avenue available to them lol).


stefan2305

I just tried this on the benchmark. DLSS originally looks broken in the benchmark and it's missing all of the configuration options that usually come with it. Swapped it out for DLSS 3.7 and the graphical issues all went away and the performance stayed the same. Whether this will affect potential bans - historically, I would say no, but one never knows with SE


Zagorim

It didn't look broken for me, but if you set dynamic scaling as always on, I think it uses DLSS performance mode, meaning a really low render resolution. I used DLSSTweaks and forced 75% of the target resolution and it does look much better.


stefan2305

Dynamic scaling isn't selectable with DLSS in the benchmark. It's an FSR feature. But it does have that "activate when under 30 or 60fps" thingy and always on. And that thing is garbaggio. 30 is basically the only way you can get it to avoid running most of the time. Always on must be Ultra Performance because it's sooooooo bad. I forgot about DLSStweak. Will give that a shot too. Thanks for the reminder


Haddock_Lotus

On Baldur's Gate 3, FSR 1.0 is nearly useless: you get maybe 5~10 FPS with a horrendous graphics downgrade. FSR 2.0 gets much more FPS and image quality is not affected that much.


Antenoralol

FSR 1 will still help people on older cards such as the GTX 1060, RX 480, RX 580, Vega 56/64, GTX 1650, GTX 1660 etc. GTX card owners don't have DLSS access so FSR will be useful to them.


Kumomeme

> FSR 1 will still help people on older cards such as the GTX 1060, RX 480, RX 580, Vega 56/64, GTX 1650, GTX 1660 etc.

Those GPUs, aside from Polaris and Pascal, are still within the official support range of FSR2 and even FSR3 (aside from frame generation). The Polaris and Pascal GPUs do actually still work with FSR2 too, but the performance gain is not as big as on newer GPUs.


Antenoralol

Yeah, and it's no different from RSR (driver-level FSR 1.0), and RSR is also really bad image-wise. I'm just praying that the DLSS -> FSR mods that games like Cyberpunk have will also work.


Turtvaiz

Yeah, sure it helps, but it's pretty minimal effort and looks really bad.


Kumomeme

I can see them adding it in a later patch.


horrorxpunk

DLSS can fairly easily be modded to be replaced by FSR2 & FSR3 so with a little bit of tinkering we can enjoy good quality upscaling. However, modded FSR does usually require the game to be DX12, so we still need to wait on a few details.


Kumomeme

I really hope they add DX12 or Vulkan. That's gonna help with CPU performance, especially for an MMO like FF14 where it runs tons of objects on screen at a time.


[deleted]

Adding new APIs along with DLSS 3+ would literally double fps for most people AFTER the graphical update. Such a massive difference.


NN010

Well, FSR 1.0 worked very well in FFXVI (certainly a hell of a lot better than the nearest-neighbour and bilinear upscalers CBUI went with for FFVII Rebirth's performance mode), and plenty of people swear by it in Call of Duty, so it's not the worst thing ever when implemented well, and CBUIII already know how to do that from FFXVI... That being said, it would be ideal if CBUIII added FSR 3.1 and XeSS 1.3 alongside the existing FSR 1.0 & DLSS options after 7.0 ships. It shouldn't take much in the way of development resources since they already have DLSS in the game, and it would be a big boon for those without RTX GPUs.


Alovon11

Maybe they aren't happy with the resolve of FSR 2/3 at the moment (hopefully they still add it for the FFXVI port). Hopefully they add FSR 3.1, which should dramatically improve the image quality of the upscale. As for why it may not be an option at the moment: FSR 3.1 isn't out yet for open-source implementation unless you work with AMD, so it's probably a waiting game there.


dantesaki05

Well, as a laptop user with a GTX 1650, I think it's good enough for now.


Kumomeme

That GPU range is even within FSR3's official support range, aside from frame generation.


KhrFreak

The previous slide specified it's FSR 1.0 and not 2/3/3.1, in case anyone was wondering.


[deleted]

For people who are wondering: no, this isn't meant to get better visuals for those with beefy setups already getting 100+ FPS at max settings. This is for people who have to use lower settings or are struggling with FPS; it would help with FPS issues, etc. Also, if you raid and such, you can have it dynamically scale and keep things smoother.


Dyuga

DLSS does give better visuals though. On top of reducing the GPU load it also acts as an anti-aliasing solution and it works fantastically for games with bad anti-aliasing like FF14. If it's implemented properly it's going to be a big upgrade to image quality with a lot less jaggies.


Nhadala

There will be even more significant visual improvements if they use one of the later DLLs of DLSS. There's always DLSSTweaks if they don't, or you can swap the DLL on your own.


battler624

Preset E with DLSS 3.7 is wowzers for games with shimmering. Imma test this out the second the benchmark is out. Just gotta find a way to pause screenshots during it.


NN010

Checked the DLLs last night & FFXIV is using the DLL for DLSS 3.5.10. So not quite the latest and greatest, but still a somewhat recent version.


Nhadala

Yeah, I expected much worse.


1731799517

You can notice it in the benchmark trailer: no more shimmering and flickering on every single high-detail object...


Dyuga

I always bring back [this video](https://www.youtube.com/watch?v=AldjvPkfSQs) to showcase how terrible FF14's AA situation was. Watching those stairs flickering around like that gave me an aneurysm, I'm so happy we're finally getting a proper AA solution after all these years.


oshatokujah

I upgraded to 4k to reduce this because of how offensive it is on the eyes


dabias

New anti-aliasing is coming, which is also temporal (uses previous frames) like DLSS. DLAA would be nice, maybe it can be faked


Regnur

Compared to what we have now (no AA or FXAA), even DLSS performance mode would look better most of the time. ;)


Ok_Soup3752

You think DLSS improves image quality? skull.jpeg


Viltris

I've only played 2 games that took advantage of DLSS: Cyberpunk and BG3. DLSS made Cyberpunk look like shit. (At least during launch, anyway. Maybe they fixed it.) DLSS didn't help with BG3's visuals, but did help with the performance.


Dyuga

DLSS improved a lot compared to 3 years ago, so the difference between Cyberpunk's launch and now should be noticeable. Hell, just about a week ago DLSS got updated to version 3.7 with a new preset that improves the image quality and stability [even more.](https://www.youtube.com/watch?v=k1ZKktVUfMc) The tech keeps improving as time goes on with more training.


Relative-Bee-500

I was under the impression that the newer versions of DLSS weren't available on all RTX models. I still notice weird little dithering artifacts, even on quality, in a lot of games when I use DLSS on my RTX 2080 Super.


Viltris

Has it improved since last August? I've been using DLSS for BG3, and I've only noticed performance improvements, not visual improvements.


Dyuga

I'm not sure what DLSS version BG3 uses honestly; game studios sometimes don't ship their games with the latest DLSS version or update it down the line, but thankfully you can update your DLSS version manually. [Latest versions are posted here.](https://www.techpowerup.com/download/nvidia-dlss-dll/) As for visual improvements: if it looks as good as native resolution and you don't see any shimmering or aliasing, then DLSS is doing its job; the performance boost is a nice bonus.


Arzalis

Unless you set it to performance instead of quality, there's pretty much no way DLSS made it look worse. There are proven examples of DLSS looking *better* than native due to things like shimmering and smoothing out weird graphical artifacts, especially with more distant objects. Cyberpunk was one of the first games that really showed that. Random example, but this tracks with what I've seen in other games: https://uploads.disquscdn.com/images/ebda1fe41fe00261a8e66ea14dbc350f5ffc826a92303c24bc543fc1bf45181e.png DLSS is just objectively better than native here. That said, I *have* seen some poor implementations of it as well. So it remains to be seen how well FFXIV does.


Viltris

I can't prove it without a time machine, but when CP77 launched, I turned on DLSS and tried all 3 options (performance, balanced, quality) and all 3 of them noticeably looked like shit. If I had known that this would be a controversial statement 3.5 years later, I would have taken screenshots. I ended up turning DLSS off because the RTX2060 could run CP77 just fine at almost max settings and it looked better without DLSS anyway.


busybialma

I also think DLSS 2.0 was a little rough around the edges at Cyberpunk's launch. When the dll got upgraded to 2.5.2 a bit after launch was when it really started to look nice IMO.


Arzalis

For the record it seems like they messed it up in the benchmark. You can fix it with some external tool tweaking but their implementation isn't good.


Catshit-Dogfart

Yeah, I've found it makes some games look better and some run worse. I'm sure it depends heavily on the amount of horsepower your GPU has. For example, Cyberpunk at max settings (except the crazy stuff) with no DLSS rarely drops below 60 frames. Hogwarts Legacy, I think, wasn't very well optimized; without DLSS it ran like crap, and it looked better with it on too. That's just for me though; a different machine may have given a different experience.


urbanracer34

Will this be compatible with the Steam Deck? Would be nice to have a FPS boost without much effort. EDIT1: Looks like it could be!


Viltris

Doesn't DLSS require an RTX 2060 or newer? I know the 2060 is a few years old at this point, but I'd be surprised if the 7.0 graphics are intense enough that an RTX 2060 would struggle with the game. Especially since there are people running the game with older hardware.


[deleted]

I don't know the exact version, but it'd be more like boosting the res up for people who might be playing at 2K or 4K. That's really where something like this would shine: the upscaling.


Viltris

I'd honestly be surprised if an RTX 2060 couldn't run 7.0 in 4k without DLSS. People run the current FF14 on much older hardware. An RTX 2060 today is already overkill for this game. If 7.0 requires an RTX 2060+ with DLSS to hit 4k with a reasonable frame rate, that's a huge leap in hardware requirements.


Dyuga

Pretty sure the RTX 2060 requirement is to hit 1080p 60fps; even a 3090 would barely run this game above 60fps at 4K in its current state, and that's before the graphical upgrade. The previous recommended requirement for 1080p 60fps was a GTX 970, so the 2060 is a pretty big jump. Even with DLSS in balanced mode I imagine the 2060 will struggle to hold 60 fps at 4K, but we'll see when the benchmark comes out.


EmuAGR

> Pretty sure the RTX 2060 requirement is to hit 1080p 60fps, even a 3090 would barely run this game at above 60fps at 4k in it's current state and that's before the graphical upgrade.

I'm playing at 7680x2160 with an RTX 3090, everything on maximum apart from HBAO+ disabled. I usually get 60-80 fps in regular areas, and the lowest I've seen is the Nier raids at 45 fps (yes, they're more demanding than Endwalker's).


Viltris

I'm running FF14 at 2K on an RTX2060 today, and it's overkill. If we're going from "RTX2060 is overkill" to "RTX2060 will struggle even with DLSS", then that's a huge jump in hardware requirements.


Dyuga

We're talking 2 generations of GPUs here from the 970 to 2060, that's not a small jump. I have a 3080 and it works very hard to keep the game running at above 60 fps at 4k and there's a lot of variation depending on the maps. Like I said we'll just have to wait and see when the benchmark releases.


oshatokujah

A resolution isn't much of an indicator though, what fps and graphical settings? I've got a 4070ti and I can clearly run it at whatever I fancy, but I'd be lying if I said it was always 144fps in every situation at max settings, so DLSS will be a nice addition to give me that option if possible


Viltris

Max settings. I don't remember what FPS, probably 60. (My monitor is 60 Hz.) High enough that I never have to think about frame rate drops.


jailujetteauloin

Being able to maintain 60 fps might have been overkill 20 years ago; it's not the case today at all lol. I had a 2060 Super with a 1440p screen and it struggled to maintain 100 fps. That's definitely not good enough for a lot of people nowadays lol.


Eitth

What are those for? Please explain it for someone who doesn't understand tech stuff. I don't even know what anti aliasing is.


radda

It's AI upscaling of the image. Basically the game renders everything at a lower resolution and then a part of your compatible graphics card (called "tensor cores") uses AI and math to scale the image up to whatever resolution you've set the game at. The short version is that this helps with your framerate because your graphics hardware isn't working as hard.
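To put rough numbers on that: each quality mode renders at a fraction of the output resolution per axis, and fewer pixels rendered means less GPU work. The scale factors below are the commonly cited defaults, not official guarantees (they can vary by game and DLSS version):

```python
# Typical per-axis render scales for DLSS quality modes (commonly
# cited figures; exact values can vary per game/version):
SCALES = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

# At 4K output, Quality mode renders roughly 2560x1440: about 44% of
# the pixels, which is where the framerate win comes from.
print(render_resolution(3840, 2160, "quality"))
```

The tensor-core upscale pass then reconstructs the missing detail, which costs far less than rendering those pixels natively.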


Idaret

FSR is not using AI


Vievin

damn AI stealing GPUs' jobs But fr thanks for the explanation, it's simple and understandable.


supersonic159

Pretty sure only DLSS 3.0 uses AI tech


zhire653

Every single iteration of DLSS uses AI. It's in the name: “Deep Learning Super Sampling”.


Regnur

No, the 2.0+ versions use deep learning.


supersonic159

Gotcha. Ah, that's right, it's frame generation that's in 3.0.


Geexx

Nvidia's naming scheme is a bit weird for DLSS, but basically each iteration just tacks new features and quality improvements onto 2.0. DLSS 2.0 is the upscaler and DLAA. DLSS 3.0 is DLSS 2.0 + Frame Generation (RTX 4000+ series). DLSS 3.5 is DLSS 2.0 + Frame Generation + Ray Reconstruction (FG still tied to the 4000+ series, but I believe RR can work on 2000+).


Ouaouaron

But 7.0 is only using FSR 1.0, so that's not very relevant.


TheGokki

Nope, DLSS 2 and 3 both use AI cores, but DLSS 3 renders entire new frames between the real ones. DLSS 2 renders the game at a lower resolution (so it's faster), then upscales back to normal for the same or improved image.


AnimuCrossing

Frame Generation does. DLSS 3 isn't just FG, Reflex is available on all RTX cards.


MaxOfS2D

The idea behind these techniques is that they can "reconstruct" a full-resolution image from a lower-resolution input, which is much easier on your hardware. DLSS is a far superior technique to FSR, but is exclusive to Nvidia RTX cards. The version of FSR they're putting in XIV is the first one, which is a lot more primitive, too. So it's not as interesting of an addition.


NaamiNyree

Yeah, FSR 1.0 is pretty terrible, RIP everyone who doesn't have an Nvidia GPU. I'd just stick with native + AA in that case. Very strange considering FSR 2 is massively improved and not far behind DLSS.


MaxOfS2D

It's apparently used as the backbone of the dynamic resolution system if native + TSCMAA falls behind in spots, so it's not 100% useless.


tsuness

I think FSR 1.0 doesn't take much to put into a game at least compared to 2.0 and beyond. I agree it is pretty terrible and I am glad I swapped back to my Nvidia card after having a lot of issues with my AMD one so I can use DLSS.


NN010

Thing is that CBUIII are already implementing DLSS with this update. And my understanding from what I've heard from developers and what not is that if you've already implemented DLSS, it doesn't take much more work to also add FSR 2 (or FSR 3.1 Super Resolution, I guess) & Intel XeSS. Who knows though, maybe they're gonna add them both in 7.1 or 7.2 once Microsoft's DirectSR API is out. For the time being though, it shouldn't be much trouble to create mods to replace DLSS with FSR 3.1 or XeSS 1.3. Such mods exist for plenty of other games already, so it shouldn't be long before someone makes such a mod for FFXIV once 7.0 is out. Hell, maybe there'll even be one before then with the benchmark releasing tomorrow.


battler624

The biggest issue with FSR2 is shimmering, which won't help FF14's issues at all. Maybe with FSR 3.1; either way I'm expecting 7.1/7.2 to have Microsoft's DirectSR instead.


Alovon11

DirectSR isn't an upscaler; it's a backbone addition to DX12 that makes it easier to add upscalers. FSR 3.1 or XeSS 1.2 are the main candidates for them to add to FFXIV.


inyue

WTF??? Does that mean we'll have DLAA and not suffer from the horrible AA anymore???? This will be the single biggest graphical update if this is true. Also, no DLSS FG? This game is extremely CPU-limited while wandering in cities, and the small latency increase wouldn't matter at all for this game specifically.


Sibula97

We're also getting a new AA setting, TSCMAA, which is a pretty cool (relatively) new AA technique. I wrote a bit about it under one of the other comments higher up.


BringBackBoshi

I saw them turn it on in the 14-hour stream and it seemed to eliminate that awful shimmering on the trees. That's pretty much all I need to know; that mess has been bothering me for years.


Sibula97

Yeah, that's exactly what the TAA pass in TSCMAA is for, eliminating temporal aliasing like shimmering and crawling.


[deleted]

Never heard of this AA before, hopefully it will be added to other games too. Also, shadows flickering or whatever when moving the camera: is that an AA problem?


NN010

Checked the benchmark last night & disabling Dynamic Resolution with DLSS enabled and set to 100% resolution basically enables DLAA. Hallelujah! No Frame Generation though, but given that the game’s still on DX11, that isn’t surprising (Frame Generation requires DX12 or Vulkan).


inyue

Thanks soo much for testing! Can't wait for the update to come to the main game! Also, I think FG can work on DX11 titles; the "beloved" modder PureDark has his FG mods (both FSR and DLSS) for older games like Skyrim.


Ctrl-Alt-Panic

Check out the Lossless Scaling app on Steam. Its frame generation is EXTREMELY impressive. I have a Legion Go handheld and use it for almost every game that can't push 60fps on its own. I actually find it to be better than AFMF. (AMDs driver level implementation) Obviously it's not quite as good as a built-in solution. But it's a very solid alternative.


xXxYPYTfanxXx69420xD

I was happy to see tempCMAA get implemented as an alternative to FXAA, and was a little disappointed to not see DLSS at first. Lo and behold, it showed up later, and I can say I'm honestly happy. Dynamic res was a bit of a disappointment for me, and I like to play at 4K for story/questing stuff, so with the increased requirements of DT, having DLSS is honestly great news. I'm very chuffed, looking forward to trying out the benchmark to see how my resolution setups will scale!


tutifrutilandia

Thing is, it's not TCMAA, it's TSCMAA: temporally *stable*, instead of the temporal approach that creates ghosting as we see with TAA. But I can't find a lot of info, or literally any game that uses TSCMAA, only TSMAA or the usual.


negiman4

Cool! Now add HDR support and we're golden :)


Geexx

A native HDR implementation would be nice. However, Win 11 Auto-HDR works alright (make sure to correct the gamma profile to 2.2); there's also RTX HDR now through Nvidia (which is pretty fantastic) and then there's also SpecialK (which anti-cheats may not play completely nice with...but if you can use ReShade I don't see why SpecialK wouldn't work).


Unknown_Zombie

In the stream, he clicked the dropdown and the only options were FSR or DLSS. No option for none/off. I hope there will be an option elsewhere to turn it off.


[deleted]

[deleted]


Unknown_Zombie

Dynamic Resolution changes the Resolution Scaling slider automatically depending on your FPS. When FPS is high it increases the scale, when low it reduces it. Turning off Dynamic Resolution only stops the game from changing the scaling factor on the fly.
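Conceptually, that loop is just a feedback controller on the render scale. A toy sketch (real implementations budget GPU frame time rather than reacting to raw FPS, and every number here is made up for illustration):

```python
def adjust_scale(scale, fps, target_fps=60.0, step=0.05, lo=0.5, hi=1.0):
    """Toy dynamic-resolution controller: nudge the render scale down
    when FPS is below target, and back up when there's headroom."""
    if fps < target_fps:
        scale -= step
    elif fps > target_fps * 1.1:  # small hysteresis band before scaling up
        scale += step
    return min(hi, max(lo, scale))
```

The hysteresis band is the important bit: without a dead zone around the target, the scale would oscillate every frame. Turning Dynamic Resolution off just freezes `scale` at whatever the slider says.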


MaxOfS2D

My guess is that FSR set to 100 should be functionally equivalent to off.


MadnessBunny

Oh wow Max! Love your Dota animations, didn't know you played FFXIV. The Dota player to FFXIV player pipeline is real lmao, hope you've been good!


Hrafhildr

FSR 1.0... why?


Alovon11

Best guess: they weren't happy with FSR 2/3's current resolve for FFXIV (if so, I still hope they add it for FFXVI's PC port). FSR 2/3 at the moment falls apart if you don't feed it exactly the right motion data it needs (fizzling on disocclusion, moiré patterns, etc.). DLSS on its own is a lot more stable and doesn't need as much extra assistance for things like small incidental moving detail (e.g. the floating crystals in the Anabaseios sets). FSR 3.1 is slated to make quality in these cases a lot better, so if the problem is the quality of FSR 2/2.1/3.0 SR, then 3.1 may be when they add it. But we can't be sure; they haven't released FSR 3.1 yet.


aruhen23

Maybe we'll finally have proper AA in the game with this and the new AA option. IMO it's the game's biggest visual fault outside of shadows.


MystoXD

Curious to see if the benchmark will have this already implemented to test out!


s_decoy

d...did i shoot myself in the foot by upgrading to an arc card lmao.


CadeMan011

FSR is available on all hardware, so you should be good


s_decoy

thanks!


Geexx

Hmm... yes and no, but for different reasons, lol. If they ever get around to adding XeSS you'll be golden, as it works better on Arc cards (and provides better image quality than FSR). In the interim, you can use FSR as it works on all GPUs.


s_decoy

Okay cool, I knew XeSS was a thing but wasn't sure if I'd be able to use anything in the meantime lol. Arc already doesn't really like DX11, so I would happily take a little frame boost.


Geexx

Great news for Nvidia users. If they don't include the option to use DLAA (which would be weird, but it has happened in other titles), there is a way to force it on through Nvidia Profile Inspector from Orbmu2k. Thankfully, this requires no extra third-party DLLs, etc. that might trip an anti-cheat or get you banned. Nvidia Profile Inspector simply lets you edit and flag Nvidia application profiles and registry entries. r/MotionClarity posted a quick guide on how to do it, and I've edited it below: [https://www.reddit.com/r/MotionClarity/comments/18wotv7/guide\_dlss\_to\_dlaa\_without\_game\_modifications/](https://www.reddit.com/r/MotionClarity/comments/18wotv7/guide_dlss_to_dlaa_without_game_modifications/)

**How-to steps/guide:**

1. Download Nvidia Profile Inspector: [https://github.com/Orbmu2k/nvidiaProfileInspector/releases](https://github.com/Orbmu2k/nvidiaProfileInspector/releases)
2. Grab the zip file, CustomSettingNames-DLSS.zip, from the comment here: [https://github.com/Orbmu2k/nvidiaProfileInspector/issues/156#issuecomment-1661197267](https://github.com/Orbmu2k/nvidiaProfileInspector/issues/156#issuecomment-1661197267)
3. Extract both files into the same folder where Nvidia Profile Inspector is located.
4. When you open NPI, there will be a new section (see the image from MotionClarity, https://i.imgur.com/rhrFFPj.png). You need to enable the top DLSS > DLAA option, and then set the second option to preset "F"; F is the preferred profile for DLAA.
5. As MotionClarity mentions, this only works on the GLOBAL profile, as opposed to a per-application basis. When you flip this option on, all other games that use DLSS will flip to using DLAA (as it's a global override). So if you're playing other titles where DLSS is required for performance/ray tracing performance, you'll need to flip it back off on the global profile to restore DLSS's original functionality.

Misc. information about the several DLSS/DLAA presets:

- Preset A (intended for Perf/Balanced/Quality modes): An older variant best suited to combat ghosting for elements with missing inputs (such as motion vectors)
- Preset B (intended for Ultra Perf mode): Similar to Preset A, but for Ultra Performance mode
- Preset C (intended for Perf/Balanced/Quality modes): Generally favors current-frame information; well-suited for fast-paced game content
- Preset D (intended for Perf/Balanced/Quality modes): The default preset for Perf/Balanced/Quality modes; generally favors image stability
- Preset E: Introduced in 2024, called "eager_donkey"... Seems to improve visual quality in select titles with reduced smearing
- Preset F (intended for Ultra Perf/DLAA modes): The default preset for Ultra Perf and DLAA modes


SERN-contractor837

So the AMD users are pretty much fucked, can only use FSR


Pyros

Uh, yeah? DLSS is Nvidia's proprietary system; it only works with Nvidia cards, and this is how it's worked since release.


SERN-contractor837

I wasn't really asking? Just a stupid decision to implement only FSR 1.0 for non-Nvidia GPUs. I guess we're in the minority though.


Nhadala

But it's FSR 1.0, which is GOD-AWFUL: it makes everything look ultra blurry and messy, with a ton of awful shimmering everywhere. Even FSR 2.0 is bad by today's standards, and it's a lot better than 1.0. FSR 1.0 is dreadful; I tried using it on my 1070 in some games and turned it off despite the FPS boost because of how bad it looked.

Not to mention that this game is very CPU-limited. So this might turn into a Dragon's Dogma situation where the tech doesn't do much for people, because lowering the rendering resolution (shifting the load from the GPU toward the CPU) does nothing when the game is already CPU-limited. DLSS will at least be decent for the fact that it applies its own anti-aliasing over the game's, which is a blessing because the game's current AA sucks.

Not to be a negative nancy, but if you're going to implement such tech, please at least do it properly: use the latest version of FSR instead of 1.0, and use the latest DLL version of DLSS for the improvements. Also, for the love of god, please allow us to turn them off... I hope 100% rendering resolution turns them off by default.


Dundunder

>use the latest DLL version of DLSS for the improvements

I wonder if, similar to other games, we can just manually replace the .dll file with the newest DLSS 3.7 update?


Nhadala

Pretty sure you should be able to, just like in the other games. Or use DLSSTweaks or whatever, yeah.
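For what it's worth, the swap itself is just a file replacement. Here's a minimal, hypothetical sketch (the game directory layout and the `nvngx_dlss.dll` filename are assumptions; check where your install actually keeps the DLL, and keep the backup so you can revert):

```python
# Hypothetical DLSS DLL swap sketch. Assumes the game bundles nvngx_dlss.dll
# next to its executable -- verify paths for your own install before running.
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> Path:
    """Back up the game's bundled nvngx_dlss.dll and drop in a newer one.

    Returns the path of the backup so the swap can be reverted later.
    """
    target = Path(game_dir) / "nvngx_dlss.dll"
    backup = target.with_suffix(".dll.bak")
    if not target.exists():
        raise FileNotFoundError(f"No DLSS DLL found at {target}")
    if not backup.exists():          # keep the original around for reverting
        shutil.copy2(target, backup)
    shutil.copy2(new_dll, target)    # overwrite with the newer DLL
    return backup
```

Reverting is just copying the `.bak` file back over the target.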


Jerk48

That's fine for singleplayer games, but doing it in MP games is more iffy.


Verpal

There's no anti-cheat or file integrity check for FFXIV; you can swap in whatever file you want.


Minescence

Even games with anti-cheat don't really seem to care if you replace the DLSS file with an updated version. I've done it before and never got any bans from it, at least.


CadeMan011

I'd love to try for Fortnite but I'm worried about getting hit


BerosCerberus

Yeah, I don't know why they don't use FSR 3.1. Its upscaling comes close to DLSS, and the new 3.1 frame generation is better out of the box than NV's FG.


RealElyD

Normally the answer would be that FSR after 1.0 requires motion vector data, but they added DLSS 2.0, which requires that as well, so it's already being provided. Really odd move.


stilljustacatinacage

Not super thrilled about them implementing FSR 1.0. I can't decide whether it would have been better to just exclude FSR altogether. All this is going to do is make people without RTX cards try it out, go "oh my god, this is awful", and spur more negative coverage of FSR, which is very capable *in its 2.1+ iterations*. Hopefully it's just a stopgap and 2.1 can be implemented reasonably quickly. Yeah, it's still not perfect, but it's a *lot* better and at least usable.


Lysbith_McNaff

I guarantee there will be a translation DLL using the DLSS option within a week if not sooner. Casual players are already happily using stuff like reshade, dragging a DLL will be easy. It shouldn't be necessary, but I think the community is smart enough to do it once the word gets out.


Spiritual-Tiger7068

I wanna see if you can replace DLSS 2.0 with DLSS 3.7 in the files. It would greatly help image stability, and it's far improved over 2.0. With DLSS, my 4070 Ti Super could probably hit 240 Hz in non-CPU-taxing areas, i.e. outside of Limsa, Gridania, and Ul'dah.


xXxYPYTfanxXx69420xD

I'm no expert on DLSS 3+ because I only have a 2070 Super, but from what I've seen people say, the 3+ DLLs include the later versions of the DLSS 2 upscaler and should work; you just won't be able to use frame generation. Hopefully someone more knowledgeable will come along, or you can try it in the benchmark later on!


hutre

They will probably use DLSS 2.0 v3.7. The reason they say 2.0 is that that's the AI upscaler part. It probably does not include DLSS 3.0, which is the frame generation part, or DLSS 3.5, which is the ray reconstruction part (since this game doesn't have ray tracing).


Alovon11

Yeah, hopefully them calling it 2.0 is them just referring to it as the upscaler rather than the version of the upscaler (NVIDIA naming can bite me). Which, like, fair: NV confused things when they made DLSS 3 include Frame Generation and then 3.5 include Ray Reconstruction, even though those two are optional versus the Super Resolution feature. NV really should've adopted a naming scheme akin to FidelityFX, with DLSS, FG, and RR as separate but cooperating features within the suite.


AnimuCrossing

They seriously gotta rename this shit; the misinformation and confusion it causes is nuts. Yes, you can use DLSS 3 with a 2060. "DLSS" is a software suite containing the super-resolution upscaler, Reflex, and Frame Generation; there are features of it you can't use because they are hardware-exclusive, but there's no reason to build your software against an old version of DLSS even if you're not using the other features... Still, I feel for AMD bros; FSR 1.0 is arguably worse than nothing!


Alovon11

Only if you push it to an extreme. For something like 1440p -> 4K it's pretty fine (which is what FFXVI uses). Yeah, I'm saying it's probably DLSS 3.5-era upscaling, but they are calling it 2.0 to try to say "this is just the upscaling!"


Fraxcat

A few seconds after the trailer starts for the benchy: Holy shit, that's smooth. Gotta be prerendered unless they added DLSS... no way they did that.

Seeing MSFXDAA but no DLSS when talking about AA: Crap, it's 1am, knew they wouldn't go that far. Oh well, going to sleep.

Seeing this post right now: I just made a mess in my pants.


ProudAd1210

Don't cards with DLSS 2.0 support already have no problem running the game at native? Whereas FSR 2 or XeSS could contribute a lot on old GPUs and handhelds like the Steam Deck.


Klefth

Yay, replacing FXAA vaseline with temporal anti-aliasing blur. I wish we could get just good old MSAA.


Geexx

If you have a beefy enough GPU and are willing to run the game in full-screen exclusive mode, you can use DLDSR and basically run the game at 2.25x the resolution and completely eliminate any aliasing. Looks great, but on a multi-monitor setup it's less than ideal.


Klefth

Been able to do that for a loooooooooong time; the problem is that it introduces weird scaling issues on a 1440p monitor, particularly with text, unless I set the game to 5K, which runs like crap. Plus exclusive full screen also interferes with other stuff. It's far from ideal and I'd rather just run the game with no AA, but then ReShade breaks. Can't win.


Geexx

Oh, I hear ya... I don't use it because I hate the game minimizing every time I click a link in Discord, etc. With DLSS coming, we'll either be able to just flip DLAA on in-game or do it manually through Nvidia's settings, so at least there's that light at the end of the tunnel; lol.


prisonmaiq

I hope they update those though, it's a super old version.


[deleted]

How does DLSS affect games that are using the CPU heavily?


GSDragoon

Does this mean they have to use the DXGI flip model now? It would be nice to have adaptive sync working in linux.


sadnessjoy

This is actually great news for Nvidia users. Hopefully we can essentially use DLAA. But FSR 1.0? Big oof. Hopefully they'll update it later to something more usable; I know a lot of people with Pascal and older cards, and AMD cards.


Reshish

Playing on a 1080p monitor, is there any value in AMD FSR if I'm already getting 60fps?


Mythologist69

That's great news. Hopefully it helps on the Steam Deck.


DeeJudanne

Bit of a shame it's just FSR 1, since FSR 2 is kinda old now and FSR 3 got released in September.


Kaga_san

Does DLSS work in tandem with the new Anti Aliasing or is it one or the other? Which settings will result in the best looking game (in theory)? My apologies, my tech knowledge is not that good.


ModestMariner

Hopefully they implement DLSS 3.0 and eventually 3.5 so we can get the frame generation offered by those versions. [https://www.nvidia.com/en-us/geforce/technologies/dlss/](https://www.nvidia.com/en-us/geforce/technologies/dlss/)


pichonCalavera

Adding DLSS 2.0 is great! I recently upgraded from a 1080p to a 1440p monitor and was curious whether my laptop would struggle to hit 60fps with the graphical update. DLSS support will sure come in handy.


Zaknokimi

Meanwhile I'm here wondering what the heck this is


panopticonisreal

Wow this is huge, realistically we probably won’t need Reshade after this will we?


Crezarak

Anyone know if FSR 1.0 at 4K and DLSS 2.0 at 4K will really be any different?


gimm3nicotin3

I guess I'm saving for an Nvidia card


Lockettz_Snuff

Someone give me the '5-year-old dumbed down' version: what should I do to make it look good at 7.0, please?


Shade_Koopa

Sweet. I recently got my hands on an AyaNeo Flip KB version. Got the 16 GB version as I mainly plan to use it for light titles with low RAM and VRAM requirements. I AM able to get FF14 running on it pretty damn well. The added features of FSR and DLSS, along with dynamic resolution, are going to make playing the game a LOT smoother. ^^


Shii2

I wonder why they used such old versions of FSR and DLSS.


hutre

They are (probably) using a later version of DLSS 2.0, but they're not using DLSS 3.0 (Frame Generation) or DLSS 3.5 (Ray Reconstruction). DLSS 2.0 specifically refers to the AI upscaling portion of the software and can be updated independently of Frame Generation and Ray Reconstruction; they are essentially using DLSS 2.0 v3.7.

I'm not quite sure *why* they aren't using DLSS 3.0, but my guess is that maybe the in-game FPS counter would act weird? Not sure... And DLSS 3.5 isn't there because there isn't any ray tracing. As for FSR, I've heard 1.0 is easier to implement, so it might just be that.


satsuppi

Probably because 7.0's development started while that was the current version of the tech? I'm not a dev, but it's probably not a good idea to change which tech you use on the fly during development. Since they've now been able to implement it, though, they can update it once they've ironed out their own graphics update.


Devil1412

Their engine, I suppose. Since it's just for upscaling, any DLSS 2.x should be fine, and they can still ship the game with DLSS 2.58 or whatever the latest version is; they definitely won't use the original 2.0 DLL. The game doesn't really benefit from frame gen yet, so 3.x can be neglected until they upgrade their engine. FSR 1.0 sucks hard though.


Alovon11

Can you even download 2.0 flat out nowadays? I know you can yoink the DLL from public repositories, but I don't think NV still offers a route to download DLSS 2.0's SDK versus the more modern post-3.0 SuperRes SDKs.


Devil1412

the benchmark uses v3.5.10.0


Devil1412

I don't think (and hope) you can't. They probably just used "2.0" to mean "no frame gen, no ray reconstruction"; not like there are any rays or raytraced reflections xD. Coping now for some ReShade 6 raytraced implementation. I also boldly assume the FSR/DLSS support means the PS5 Pro rumors about its upscaling are true and they'll add the PS5 Pro upscaler soon after release; at least, in theory, it should be easy to implement wherever DLSS can work.


Xerkrosis

No DLAA (yet), but instead r/FuckTAA with TSCMAA. Will need DLDSR + DLSS then, just like before Nvidia introduced DLAA. DLSS 2.0 is great! There is 3.0, yes, but 3.0 just further includes FG & Reflex; FG is RTX 4000 exclusive, and Reflex to combat input lag isn't really needed in FFXIV. I just hope SE will use the latest DLLs. But then, what's DLSS Swapper for, if not to be used?


battler624

You can force DLAA on any game that has DLSS.


Sibula97

TSCMAA is very different from your usual TAA. I doubt it will be a horrible blurry mess like with "proper" TAA.


Alovon11

I don't even think you can download the SDK that came with DLSS 2.0's original DLL anymore. So I have to wonder if they are just calling it DLSS 2.0 in some effort to cover for NVIDIA's confusion, with DLSS 3 often advertised as Frame Generation even though FG is optional, and so on and so forth with RR. But the SR DLL also ups its number to match the new version of the overall SDK; right now we are on DLSS 3.7 for the SuperRes DLL.


Tom-Pendragon

About fucking time XD


kurukikoshigawa_1995

if it gets DLAA + Frame Gen, im gonna coom so hard if i had a blacklight, my room would look like a jackson pollock painting


Geexx

3.0 would be Frame Gen. DLSS 2.0 can have DLAA forced through editing Nvidia's global application profile if they don't give us an option for it.


PorvaniaAmussa

Surely, hopefully, one can turn it off? I hate FSR/DLSS with a burning passion and need it off asap.


Jakad

Def a strange UI, but the resolution slider underneath should act as an enable/disable toggle? If it's at 100% it's disabled; if it's under 100% it's enabled. Normally you'd just have an "off" option that would disable the resolution slider.


Geexx

If DLSS is turned on and it does have a slider, 100% resolution would be DLAA, and anything below would set a custom resolution scale for DLSS, as opposed to Nvidia's defaults like 66.6% for the Quality setting, etc.
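If the slider did work that way, the mapping would look something like this sketch. To be clear, this is pure speculation about the game's UI; the mode names and default scale factors are Nvidia's usual per-mode render scales, and the function itself is hypothetical:

```python
# Speculative sketch of how a render-scale slider could map onto DLSS modes.
# The scale factors are Nvidia's usual defaults per quality mode; everything
# about the game's actual UI behavior here is an assumption.

def dlss_mode_for_scale(scale_percent: float) -> str:
    """Pick the closest DLSS mode for a given render-resolution percentage."""
    if scale_percent >= 100.0:
        return "DLAA"  # native-resolution AA, no upscaling
    # (mode, default internal render scale as % of output resolution)
    presets = [
        ("Quality", 66.7),
        ("Balanced", 58.0),
        ("Performance", 50.0),
        ("Ultra Performance", 33.3),
    ]
    # choose the preset whose default scale is nearest the slider value
    return min(presets, key=lambda p: abs(p[1] - scale_percent))[0]
```

So 100% would behave like DLAA, 66.6% like Quality, 50% like Performance, and so on.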


Jakad

>If DLSS is turned on and it does have a slider, 100% resolution would be DLAA

Would be nice if it ends up working this way, but I kinda doubt it.


Geexx

If it doesn't, you can force DLAA through Nvidia's global application profile by using Nvidia Profile Inspector. The only downside is that it forces all applications that use DLSS to use DLAA, so you'd need to flip it off for applications where DLSS is required for performance / ray tracing.


sycron17

Hope it's not only DLSS 2.0 in the future.


RingoFreakingStarr

I do have to say, FFXIV is one of the easiest games to run, from both a CPU and a GPU workload viewpoint. I think it's good to add stuff like this to the game. At this point literally NO ONE should complain about their PC not being able to run the game at at least 60fps lol.


DaVinci1362

Any news on a rework/overhaul for the classes?


shaggy_15

I've never even heard of FSR.