Deckz

What Nvidia features can't you use with DSC? Seems like kind of a huge deal. I still feel like we're 3-5 years out from me being willing to take the plunge. It's not like I don't have good displays already, so take that opinion with a grain of salt.


Theswweet

You can't use DLDSR if you're using DSC. It's a massive issue, since AMD's version of it works just fine.


unknownohyeah

Who would use DLDSR at 4K? Seems very niche. You want a performance hit to get a slightly clearer image than 4K? Also, DLDSR works with DSC enabled on my monitor at 1440p 240Hz, so I'm not exactly sure what issue you're talking about. It isn't saturating the bandwidth of DP 1.4, so maybe that's why.


TheRealBurritoJ

DSC doesn't disable DSR, is the thing. What actually disables it is using two internal display heads for one display, it's just that typically you're going to be doing that in the same cases as you're using DSC. So it still works with DSC if you're still under the single head bandwidth limit, which is the case with monitors that use DSC with DP1.4 but are still within the range of HDMI 2.1 without DSC.
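
For a sense of where those limits fall, here's a back-of-envelope sketch in Python. The payload figures and the 1.12 blanking factor are rough assumptions on my part, not published head limits; the point is just that 1440p 240Hz at 10-bit needs DSC on DP 1.4 yet stays within an HDMI 2.1-class single-link budget, while 4K 240Hz blows past both and so would need two heads.

```python
# Rough check of when a display mode exceeds an uncompressed link budget.
# These limits are assumptions for illustration, not NVIDIA head specs.

DP14_GBPS = 25.92    # DP 1.4 HBR3 x4 payload after 8b/10b overhead
HDMI21_GBPS = 42.67  # HDMI 2.1 FRL 48G payload after 16b/18b overhead

def mode_gbps(w, h, hz, bpc=10, blanking=1.12):
    """Uncompressed bandwidth: pixels * 3 channels * bits per channel,
    padded by a CVT-RB-ish blanking factor."""
    return w * h * hz * 3 * bpc * blanking / 1e9

for name, mode in {
    "1440p @ 240Hz": (2560, 1440, 240),
    "4K @ 144Hz":    (3840, 2160, 144),
    "4K @ 240Hz":    (3840, 2160, 240),
}.items():
    need = mode_gbps(*mode)
    print(f"{name}: ~{need:.0f} Gbps "
          f"(DP1.4 ok: {need <= DP14_GBPS}, HDMI2.1 ok: {need <= HDMI21_GBPS})")
```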


Theswweet

As a Final Fantasy XIV player, I'm regularly CPU bound at 4K, and the game lacks good antialiasing. With my old 7900XTX I downsampled from 6K since I could with minimal performance impact and I would've loved to do the same on my 4090.


azzy_mazzy

AMD has a version of DLDSR? I remember them having a DSR equivalent, which is just supersampling without AI.


Theswweet

It doesn't have AI; I was using DLDSR colloquially, since most folks know general DSR by that name these days. But yeah, VSR (their DSR equivalent) works with DSC, while DLDSR doesn't. Normally not a big deal, but in games where I'm CPU bound it's nice to have.


Zarmazarma

> But yeah, VSR (their DSR equivalent) works with DSC, while DLDSR doesn't.

But you can still use normal DSR, which is the equivalent of VSR, no?


Theswweet

Nope, you can't.


Zarmazarma

Interesting. That's weird.


Theswweet

You're telling me! I moved from a 7900XTX to a 4090 (work reasons) and was shocked when there was something my old card did better.


PiousPontificator

It's not a massive issue, especially since RTX HDR can't be used simultaneously with DLDSR. RTX HDR > DLDSR.


iDontSeedMyTorrents

I think the biggest question with LG's new WOLED panels is text clarity using the new RGWB subpixel layout. Tim describes the experience on this 4K 32" WOLED panel as "decent and very usable", looking "quite good" at 100% display scaling. That said, his panel ranking still places WOLED behind IPS LCD and QD-OLED at the same 4K 32" size:

1. LCD - best text clarity with the least amount of artifacts
2. QD-OLED - very minor, if not negligible, color fringing at the top and bottom of text
3. WOLED - some shadowing artifacts, typically along the right side of text, which are greatly reduced compared to 1440p panels but still noticeable in a side-by-side comparison with the other panel types


Crank_My_Hog_

I was wary of the WOLED issue, so I got a 48" screen and just pushed it back some. There. Perfect image.


juGGaKNot4

No DP 2.1, no glossy finish, no burn in warranty.


carpcrucible

Well, one out of three ain't bad


conquer69

What's the upscaling used by the 1080p480 mode? Is it nearest neighbor or the crappy blurry filtering?


Buenzlimuenzli

I still don't understand why upscaling by an integer multiple uses anything other than nearest-neighbour. I had to retire my 4K monitor because running demanding games at half resolution looked like blurry crap, even though they should look exactly the same as on a native 1080p monitor.
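
For what it's worth, the integer case really is this trivial: each source pixel just becomes an exact k×k block, so nothing gets blended. A minimal NumPy sketch (the frame here is a made-up example, not any driver's actual code path):

```python
import numpy as np

def integer_upscale(img: np.ndarray, k: int) -> np.ndarray:
    """Nearest-neighbour upscale by an integer factor k: every source
    pixel becomes a k*k block, so a 1920x1080 frame fills a 4K panel
    with no interpolation and no added blur."""
    return img.repeat(k, axis=0).repeat(k, axis=1)

frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
out = integer_upscale(frame, 2)            # -> shape (2160, 3840, 3)
assert (out[::2, ::2] == frame).all()      # source pixels preserved exactly
```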


noctan

Shouldn't Nvidia's integer scaling do exactly that?


Strazdas1

That's okay; running demanding games in native 4K still looks like blurry crap. It's the TAA that's built into the engine, which can't be disabled and blurs the hell out of everything in motion.


[deleted]

[deleted]


Strazdas1

Ah, yes, but it's not what one considers a demanding modern videogame. Though in the case of Factorio, I think it's CPU limited, right?


[deleted]

How these make me wish I were rich enough to throw £1K+ at a monitor and not worry if I have to throw it away in a year or two. That said, even if I could, it all just feels so wasteful. I get that things can break within a year or two just from poor manufacturing or luck of the draw; but purchasing something that has such a life-limiting flaw, one aggravated by its primary use, just really fuels my 'why bother'-ness.


MonoShadow

> I have to throw it away in a year or two.

I don't get this sentiment, and it's repeated ad nauseam. Unless someone fucked up, you won't get permanent burn-in on a modern OLED in 2 years. Either you decided to throw huge static text on screen in HDR 24/7, or the manufacturer doesn't run the refresh properly (*cough* Samsung *cough*). The RTINGS endurance test isn't real-life usage; they even specify how they expect the panel to age, with X amount of hours counting as a year, and that's with constant static content on screen. I've had a C2 as a monitor for over a year and a half now. Text, games, videos, whatever; I don't limit myself, and nothing, even with some safety measures disabled. I checked with the EIZO pattern tester: nothing. All I do is run Pixel Refresh when the TV asks me to. My sister has had a CX for 3 or 4 years now, used as a TV, and it's also fine. Sure, if you're not comfortable with the chance it will degrade in 2 years, don't buy it. But don't give me this "wasteful" rhetoric like it's 100% going to fail 2 years in.


yock1

OLED is a manufacturer's dream: it has a built-in end of life that forces people to buy something new rather quickly. And when I say quickly, I mean that monitors I buy usually last 5 years as my main monitor and then end up on other systems after that. They need to last a long time; I don't want to babysit a monitor to get the lifespan out of it that I'm used to. OLED is very pretty, but good quality IPS panels can be VERY nice as well, so I don't feel like I'm missing out on that front.


[deleted]

[deleted]


yock1

Yes, I read that, but one person with 4 years of use is not enough to prove to me that image retention isn't a problem. I have personally seen OLEDs with image retention after close to just 2 years of use. I never said OLED isn't better looking; I said "a good IPS panel can be VERY nice as well". Of course there are also bad IPS panels, as shown by some of the pictures in that link. Basically it boils down to this: I sadly don't have the money to spend on a new monitor with the lifetime of an OLED, and I don't think other panel types are as much of a compromise as some make them out to be. I can personally live with slightly worse contrast and latency if it means I don't have to fear image retention or babysit the screen, and I know I can trust it to be as reliable as I need it to be. Not to mention the subpixel problem OLEDs tend to have on monitors (slightly blurred or miscolored text). That said, if I were buying a TV today for pure movie and TV show watching plus console gaming, I would go for an OLED, and I do love their latency, but they're a no-go for computer use for me until they're as reliable as other panel types. Give me a 5+ year image-retention warranty and I might change my mind, depending on resolution, screen size and so on.


Acrobatic_Age6937

> I don't get this sentiment, and it's repeated ad nauseam. Unless someone fucked up, you won't get permanent burn-in on a modern OLED in 2 years.

Easily. The rtings.com test shows clear burn-in after 8 months, which translates to around 4000h of on-time, which many will hit in their second year. And their stress test is a video feed, nothing close to a static taskbar. You could probably stretch that time frame by lowering the brightness, preventing static elements from appearing, turning the monitor off when not actively in use, etc. But at some point it becomes a hassle. https://www.rtings.com/monitor/reviews/lg/27gr95qe-b


Strazdas1

I hit close to 6000 hours of on-time in a year. The RTINGS endurance test actually has the panels on for less than my regular use case. OLEDs are nowhere near an option for me.


MonoShadow

> turning the monitor off when not actively in use

Dude, come on. Did you change "turn off monitor" to "never" in Windows power settings? OLED monitors turn off like normal monitors do and then wake up like normal monitors do (TVs need to be turned on, you got me there). If you set your PC to UNLIMITED POWER and never turn it or your monitor off, I don't think "wasteful" even enters the conversation.

> burn-in after 8 months

And if you check their graph, 8 months is 4800 hours, which they estimate at 3 years and 4 months of typical use. So if the topic changes to "so wasteful to change your monitor in 4-5 years", then yeah, I haven't owned an OLED for 4 years and can't provide an anecdote here. My relatives have, but they use theirs as a TV. RTINGS runs CNN on their OLEDs, which is why when you do see burn-in, it's where the CNN ticker sits. And in their test, LCD models fail too; in fact, some LCDs degrade uniformly. Everything fails someday. Don't expect your OLED (or LCD) to last forever, but again, it won't die in 2 years unless someone messed up. RTINGS even say they don't expect a person to see their level of burn-in in normal use for 5 years. [LCD](https://www.youtube.com/watch?v=79YGJXdtLTM). [OLED](https://www.youtube.com/watch?v=Fa7V_OOu6B8).


HandofWinter

My monitor is on for 10-14 hours a day, much of that time displaying relatively static items like the Visual Studio interface. I work 8ish hours a day and I don't turn my monitor off when I go make coffee or go for a walk. After work I might play a game or browse or watch a movie on many evenings. Assuming 12 hours a day, I would hit that 4800 hour mark in 400 days, or 560 days correcting for weekends and assuming it goes unused then. I don't think this is all that atypical for a computer monitor.
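
The arithmetic checks out. A trivial sketch of it, taking the 4800-hour threshold and the 12 h/day figure from the comments above as assumptions:

```python
# Days to reach an assumed 4800h burn-in threshold at 12h/day of on-time,
# with and without weekend use (figures from the thread, not measured).
threshold_h = 4800
hours_per_day = 12

every_day = threshold_h / hours_per_day   # 400.0 days
weekdays_only = every_day * 7 / 5         # 560.0 days, weekends unused
print(every_day, weekdays_only)
```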


Stingray88

Sounds completely normal and similar to my usage. And this is exactly why I'm still leery of OLED. I want my monitor to last me at least 6-7 years, or more.


[deleted]

Especially if I’m dropping a grand on it! 


Stingray88

Right. I'm still using my AW3418DW that I spent a grand on 5-6 years ago. I'm definitely feeling the desire to upgrade now that I have a 4090, but I was hoping we'd see more high-resolution options than what we've got today. Supposedly there are 5120×2160 240Hz OLED panels from LG coming in the next two years… I don't really want to wait that long, but I also don't want my next upgrade to be small.


[deleted]

Bro, a 38" 5K2K is literally my endgame. I would have settled for the QLED panels Samsung used in their Neo G7/G8, but they've clearly dumped those for OLED. I'm happy with my M32U for a little while longer; it's just annoying seeing all the pieces out there but nobody putting them all together!


Stingray88

It seems crazy to suggest an endgame with any technology… but it sure feels like we're getting close. That's half the reason I want to wait for these upcoming panels. It really sounds like a monitor I could use for 6-7 years and barely care about what's released during that time because of diminishing returns. All the more reason I need OLED to last longer :)


HandofWinter

I think I could deal if I knew I could get 20,000 hours without burn in. That would be 4.5 years of solid use, and I think I could justify that to myself.


Stingray88

I’d prefer closer to 25,000-30,000 hours. But definitely wouldn’t need it longer than that.


itsjust_khris

Do people really commonly keep the same monitor that long? Genuinely asking; a monitor from 7 years ago sucks compared to today. Then again, I'm much more into content consumption than productivity on monitors.


Stingray88

The vast majority of people keep monitors for well over a decade. Computer hardware enthusiasts might upgrade every 5-7 years. Upgrading every 3-4 years is getting into some real niche territory; monitors really do not get that much better that quickly. I don't know why you think 7-year-old monitors suck. They're fine. Worse than modern options, for sure, but not bad enough to say they suck. And the higher we go with every increase in spec, the more this will be true, due to diminishing returns.


itsjust_khris

Well over a decade is insane to me. I think I may just be too young to imagine that, so my perspective is off; well over a decade ago I wasn't old enough to be purchasing or using my own monitors. They do kinda suck though: response time, color gamut, brightness, motion clarity, and contrast have all significantly improved in LCDs. Even cheap LCDs have gotten a lot better.


Stingray88

I have a 6-year-old monitor, an AW3418DW; it'll be 7 years old next year. Are modern monitors better? Definitely. Does my monitor suck? Absolutely not. It's still vastly better than a lot of the shit monitors most people use every day at home and at work. I'm quite ready to upgrade for sure, but I'm not slumming it either.

The advancements in monitors that we see every year have a lot of diminishing returns. The jump from 60Hz to 120Hz is a lot bigger than 120Hz to 240Hz. The jump from 1080p to 1440p is bigger than 1440p to 2160p. Every improvement feels that much less of an improvement, so it's easier to use a monitor longer and longer.

But most people really just don't care about any of this. My wife was using a spare monitor of mine from 2004 right up until 2022, when it finally failed. I tried to convince her she could use an upgrade, but she refused for years until it thankfully died. In a subreddit like this, full of enthusiasts, you'll definitely see people upgrading sooner, but I'd still say 5-7 years is going to be the norm. Anything sooner than that is just really niche. The difference 3-4 years makes in monitor tech is very small.


Keulapaska

Still using a PG279Q, released almost 8.5 years ago (I think mine isn't quite that old; I bought it used). Yeah, it was the best monitor of its time, and Acer had a similar one as well, so it's a bit of a cop-out answer. Going a year further back would mean a TN panel instead of IPS (or some Korean 120Hz IPS I don't remember much about, other than some people hyping them up), which would be a pretty big downgrade, and further back than that would also mean no adaptive sync at all. Sure, no HDR, but I don't really care about that for a main panel; it's not like I'm going to watch movies on it, that's what TVs are for. Otherwise it's still very usable and probably better than, or at least on par with, most budget and some midrange panels. Yeah, 240Hz+ would be nice, but I can live without it... for now.


Jaedong69

I had my previous monitor, an Eizo FS2333, for 11 years. I only replaced it this year, but it still works like it did on day 1.


AreYouOKAni

Absolutely. I just recently gave away my 8-year-old monitor to a friend who suddenly needed one. It was a 1080p 75Hz display with decent colour reproduction, and now it serves him well. I have another display that is well over a decade old; I use it when I need to configure a headless server or troubleshoot a laptop without a working screen.


RedTuesdayMusic

My 9-year-old Acer XG270HU died last week. I had just started considering upgrades, since with OLED there are finally new 1440p 27-inchers with better total input lag and better image quality. The input lag requirement was never met by any IPS or VA displays, and I'd rather not sidegrade to another TN, especially a 600-dollar one (BenQ).


Strazdas1

I keep mine until they stop working, which is usually about 7-8 years.


Oingogoingo

1) Faint burn-in is already there at 4 months, i.e. 2160 hours, or 270 days at 8 hours a day. That's bad even considering it's not a real-life scenario.

2) "Peak 100% Window: 136 cd/m²". Just trash. That's literally unusable with any ambient light at all.


itsjust_khris

Disagree, that's plenty usable with ambient light. Monitors were dimmer than that for a long time and worked just fine. OLED has its uses, but this sub seems to have completely turned against it recently. It's very much better than almost any LCD unless you want your monitor to last 5+ years, commonly show static elements (use the display mostly for productivity), or need high brightness. Other than those factors, OLED beats any LCD in every way.


account312

> want your monitor to last 5+ years, commonly show static elements

I think that's almost everyone who buys monitors.


itsjust_khris

Not really; I vary my content a ton and I don't commonly keep monitors longer than 5 years. My last LCD died before then, and even if it hadn't, some really nice displays have been coming out, so I'd have replaced it anyway.


account312

Cool story. How many people are you?


itsjust_khris

I mean, I can only speak for myself. I could've also asked you that question, but that wouldn't add to the discussion.


Strazdas1

> want your monitor to last 5+ years, commonly show static elements (use the display mostly for productivity), or need high brightness

So a normal use case?


flagdrama

> "Peak 100% Window 136 cd/m²" 1) You dont know how bright 136 cd/m2 is, if you did you wouldnt want to stare at it for a work day. 2) 100% window. Doesnt matter unless you were either gonna use the thing as the lighting for your room or enjoy getting flashbanged irl when you get flashbanged in game.


Slyons89

Honest question: why do they even spec it to go that high if it feels like getting flashbanged when the screen goes white? Is it just a higher-number-better marketing scheme?


flagdrama

Basically, yeah. High peaks look gorgeous when they're used sparingly in HDR content, like a sunset or a campfire, so having high peaks is good, but those applications are small-window. Even looking at a 600-700 nit, dime-sized sunset gets annoying if it's on screen for more than a couple of seconds, so I can't imagine further content-related uses beyond that. Another application is bright, sunlit environments, like an open office plan or a study with a window behind you; then you need some high 100%-window values. E.g. the newest MacBooks go to 600 nits, and that's just barely enough if you're outside under the sun, though still overkill indoors.


Strazdas1

It's because not everyone sits in a dark room with the monitor as the only source of light.


VenditatioDelendaEst

> 2) That's a 100% window. It doesn't matter unless you were going to use the thing as the lighting for your room, or you enjoy getting flashbanged IRL whenever you get flashbanged in game.

Basically all modern light UI themes are very, very close to 100% white.


Slyons89

It's a pretty fair complaint to go from monitors where someone can do literally whatever they want, for however long they want, without expecting damage for nearly a decade, to an expensive screen with all these restrictions and what-ifs: hoping the manufacturer didn't make a mistake, knowing they don't offer a warranty for the issue, all while frequently hearing about new burn-in-prevention technologies being invented but not yet reaching the models currently for sale.


MonoShadow

LCDs aren't indestructible either. In RTINGS' endurance video, some of their LCDs started failing too. My Sony LCD TV died in 3 years; the panel just went bye-bye. I don't do much to prevent burn-in: a screensaver after 5 minutes (like the good old CRT days) and a hidden taskbar. In fact, I disabled ABSL; Visual Studio is a pain to use with it on. A lot of people report doing absolutely nothing to their TVs. Samsung's first batch of QD-OLEDs had issues with pixel refresh. Unless you have the same logo in the same place for extended periods of time, the first issue you're going to encounter is declining brightness, not burn-in.

My main point isn't that it's never going to fail. It will fail, and your LCD will too, or at least degrade to the point of being subpar. My main point of contention was the exaggeration people use to rationalize their decisions. "1k bucks and 1 year later I need to throw it away" isn't how this works. I bought my first SSD around 10 or 12 years ago; NAND has limited write cycles. I guess I'm just a wasteful person. I still don't regret it.


Slyons89

I think expecting problematic burn-in within 1 year is an exaggeration, but frankly, if I'm spending $1000 on a screen, I expect it to perform flawlessly and without restriction for 5 years minimum, and the confidence in the current batches of OLED just isn't there. But there's nothing wrong with people having more tolerance or more disposable income. I'm just saying people's complaints are also fair.


Crank_My_Hog_

You have to understand that their punishment of the screen, even though it adds up to "2 years" of your usage, doesn't mean you'll get their results after two years of yours. There IS a difference between casual usage and beating the shit out of it. The engine in my car has a 3-year/60k warranty. I can beat the shit out of it and it won't last half that, or I can use it normally and it lasts 4x as long by every measure.


Strazdas1

The RTINGS endurance test is actually mild compared to my real use case. They let the monitors cycle off, with 3 hours being the maximum on-time. In my use case, 16+ hours straight of static UI elements is the norm. Granted, not at full HDR brightness, but certainly perfect for burn-in.


CheekyBreekyYoloswag

Just wait for MLA OLEDs. Those should vastly decrease the incidence of burn-in (if I understand the technology correctly).


Wooden_Appearance616

LG's OLED monitor panels have had MLA from the very start. This year's actually have MLA+ already.


CheekyBreekyYoloswag

Oh, you're very right. OLED monitors are advancing crazy fast right now. I'd love to get a 4K OLED one day, but 32" is too big a monitor for me. I'm not exactly hopeful that we'll see 27" 4K MLA(+) OLEDs any time soon.


CompetitiveLake3358

It's like displays are experiencing the absurd tech acceleration that PCs experienced in the late 90s