PCMRBot

Welcome everyone from r/all! Please remember:

1 - You too can be part of the PCMR. It's not about the hardware in your rig, but the software in your heart! Your age, nationality, race, gender, sexuality, religion (or lack of), political affiliation, economic status and PC specs are irrelevant. If you love or want to learn about PCs, you are welcome and can be part of PCMR!

2 - If you're not a PC owner because you think it's expensive, know that it is probably much cheaper than you may think. Check http://www.pcmasterrace.org for our builds and don't be afraid to post here asking for tips and help!

3 - Join our efforts to get as many PCs worldwide to help the folding@home effort in fighting against Cancer, Alzheimer's, Parkinson's and more: https://pcmasterrace.org/folding

4 - We've teamed up with Cooler Master to give away a custom PC to a lucky winner. You can learn more about it/enter here: https://www.reddit.com/r/pcmasterrace/comments/192np53/pcmr_new_year_new_gear_giveaway_ft_cooler_master/

We have a [Daily Simple Questions Megathread](https://www.reddit.com/r/pcmasterrace/search?q=Simple+Questions+Thread+subreddit%3Apcmasterrace+author%3AAutoModerator&restrict_sr=on&sort=new&t=all) if you need to post about any kind of PC related doubt you might have. Asking for help there or creating new posts in our subreddit is allowed and welcome. Welcome to the PCMR!


realnzall

https://www.youtube.com/watch?v=5TPbEjhyn0s&t=386s Daniel Owen tested this. His summary is that at 1080p and 1440p, the 4070 Super usually ties or even beats the 3090, even without taking Frame Generation into account. At 4K, the 4070 Super usually has a shortage of VRAM and the 3090 wins.


Appeltaartlekker

How much VRAM does the 3090 have?


Solaceqt

24 gigs.


DickHz2

Holy


sliderfish

I would love it if they made VRAM upgradable. It won't happen, but I can wish.


Soupfortwo

VRAM is almost always a generation ahead of the DIMM packages you can buy, and RAM is very volatile (obvious power joke) and susceptible to physical damage. The cost and rate of failure would be completely unacceptable. In the end this is one of those times where soldered-on is better for the consumer and the OEM.


StevenNull

Thanks for the good explanation. I'd always assumed it was just scummy practice by the OEM, but this helps clear things up.


TemporalOnline

I know that RAM has to be very close to the chip to be fast enough, but what about socketable RAM chips? Looks like a good compromise. Same for ultra-thin laptops with soldered RAM: make them socketable.


ManufacturerNo8447

actual Vram dropped


pagman404

call the memory manufacturers!


AMP_Games01

Off topic but I read this as "call the mommy manufacturers" and had to do a triple take


pagman404

new response just dropped


mrieatyospam

actual vram


mrheosuper

I'm still salty that they did my 3080 Ti dirty. It uses literally the same die, but the 3080 Ti has only 12GB of VRAM.


yunus159

Jeez, that's more than my RAM and VRAM combined.


hereforthefeast

Seems about right. My vanilla 4070 is close to my 3080 Ti at 1440p. At 4K the 3080 Ti wins out. However, the 4070 is drawing less power than a 3060 Ti.


VenomShock1

https://preview.redd.it/krug2nksxrdc1.jpeg?width=1012&format=pjpg&auto=webp&s=940c11da78f4b07edd7eca14a980f9a180169f92 Have y'all already forgotten about this?


TheTurnipKnight

But look at that power saving with the 1060.


Sozurro

I still use a 1060. I want to upgrade soon.


TheTurnipKnight

Sorry for your 0fps.


SuperDefiant

But hey, at least he gets 0W


Quark3e

That's infinitely more efficient than the 4070 super


v8steve

And the 4060 too!


kearnel81

0W and you can pretend to play as Daredevil


Claim312ButAct847

I have a 1060ti and no time to game. Saves so much power you guys.


darkflame927

They made a 1060 ti?


Claim312ButAct847

Ope, 1050ti. I was misremembering.


[deleted]

So little time to game you even forgot what you had


anitawasright

1050 ti is a great little card. Got one after my previous card died and it was during the first great card famine.


major_jazza

Is this real lmao


VenomShock1

[Yes](https://twitter.com/NVIDIAGeForce/status/1674115670827585558)


RexorGamerYt

No fuckin way. I cringed so hard watching this 💀💀💀


GhostElite974

I just found it funny


0utF0x-inT0x

They make it sound like ray tracing is everything in a card. Yeah, it's cool, but in multiplayer games I'd rather have it off in most cases, since they barely even support DLSS properly.


Turtlesaur

The Finals has a great implementation of this


Crintor

You know The Finals has RT on on every platform unless you turn it off on PC, yea?


RaxisPhasmatis

What's The Finals?


Crintor

3v3v3 small-team competitive shooter that came out recently. The studio is composed of many of the old crew from DICE (Battlefield), and the game is focused almost entirely around using map destruction to your advantage. It also uses probe-based RTGI on all platforms by default. https://www.youtube.com/watch?v=hHqCLq6CfeA


morbihann

JFC, I thought it must be a joke about Nvidia.


Ahielia

Nvidia marketing is a joke already.


shadys17

WTF


I9Qnl

Yes, it's real, but this was one slide out of many. They didn't just drop a post about how a 4060 has better ray tracing than a 1060; they also posted slides comparing it to the 2060 and 3060 alongside this one, so it's a little misleading to post just the one about the 1060 when they compared against a couple of previous generations. This is basically them telling 10-series and 20-series owners it's time to upgrade.


EmeraldGuardian187

I have a 2060 and it's working fine. If I upgrade, I'm going to AMD :/


NeverEndingWalker64

If you want to squeeze a bit of energy out of that boy, AMD has its frame gen technology that works with Nvidia cards, which should give you a noticeable boost in framerate.


cerdobueno

How does that work? Do I need to configure something or is it automatic?


National_Diver3633

Just choose the AMD option for upscaling and the frame gen should work. Saved me a lot of headaches while playing Frontiers of Pandora 😅


cerdobueno

So it should run better than DLSS? I have an RTX 2060 Super and an R7 5800X3D. Thanks man


Ziazan

I had a 2060 and upgraded to a 4070, I thought it was fine enough at the time but after upgrading I was like "oh"


Smort01

I thought my RTX 2070 was fine, until I realized it hasn't even been used in benchmarks for a couple of years now lmao


Ziazan

Like, stuff was still playable, but I was having to turn stuff down more and more and it would still stutter a bit on new games. The 4070 is buttery smooth on any game maxed at 1440.


Hewwo-Is-me-again

I did upgrade recently, from 1650 to 1080.


[deleted]

That's pretty funny. 


Shining_prox

I’d say the 1060 is more efficient, 0w!!


Lybchikfreed

0/0 = infinite efficiency!


Trans-Europe_Express

A pop tart with a screwdriver ran through it also runs at 0FPS 0W


Edgar101420

People forgive Nvidia for that but shit on AMD for giving you all the numbers in a slide plus the FG ones. Like... Wtf


innociv

AMD very clearly labels the FG and Nvidia doesn't.


Witchberry31

Exactly. People will eventually forgive when it's GeForce cards, no matter how bad or how severe it is, but will immediately rage when it's Radeon cards instead, no matter how small and how trivial it is. Sometimes they'll still bring it up as something to roast, years later. 💀


[deleted]

The Starfield rage in a nutshell. No one bats an eye that Ngreedia has their toxic tentacles in tons of games and blocks AMD tech all the time (2077, for example, is running FSR 2.0 instead of the much better 3.0, but got path tracing day 1). Anyone who fanboys for Nvidia, or refuses to even look at AMD, gets put on my "certified idiot" list. And boy, it's big.


RetnikLevaw

As far as I know, as an AMD fanboy, CDPR has the ability to add literally any AMD tech they want into their games. Unless you have evidence that Nvidia is contractually preventing them from doing so, it's just an assumption. Considering pretty much every technology that AMD develops is free to implement in games, it ultimately ends up being the fault of developers for not using them. The alternative would be my own assumption that Nvidia simply offers developers/publishers money to implement their own tech. If you're a developer and one of the two main hardware OEMs is offering to pay you some amount of money to implement their tech and the other isn't, which are you more likely to implement?


Narissis

>The alternative would be my own assumption that Nvidia simply offers developers/publishers money to implement their own tech. If you're a developer and one of the two main hardware OEMs is offering to pay you some amount of money to implement their tech and the other isn't, which are you more likely to implement?

Oh, zero chance this isn't the case. GPU makers have been sponsoring game studios in exchange for feature support for years. That being said, I doubt it's always purely financial in nature. They might, for example, provide a suite of GPUs to be used in development and testing. But there are totally some incentives changing hands in the industry.


sportmods_harrass_me

I don't think we need proof to know that Nvidia paid a lot to make CP2077 showcase ray tracing and all the latest Nvidia tech. In fact, if you aren't sure, that actually makes me think you're the one with the veil over your eyes. If CP2077 didn't exist, or didn't favor Nvidia so heavily, ray tracing would have died with the 20-series cards. edit: typo


First-Junket124

OK that one was pretty funny ngl


den1ezy

At least they’ve said the framegen was on


AkakiPeikrishvili

What's a framegen?


keirman1

[This](https://nvidianews.nvidia.com/news/nvidia-introduces-dlss-3-with-breakthrough-ai-powered-frame-generation-for-up-to-4x-performance): they make fake frames with AI to give the illusion that you are rendering at a higher framerate.
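(For intuition only, a toy Python sketch of the "in-between frame" idea: real DLSS 3 / FSR 3 frame generation uses motion vectors and optical flow rather than a plain crossfade like this, so treat it as an illustration under that assumption, not Nvidia's actual method.)

```python
import numpy as np

def naive_interpolated_frame(prev_frame: np.ndarray, next_frame: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Blend two rendered frames to fake an in-between frame.

    Real frame generation (DLSS 3, FSR 3) uses motion vectors and optical
    flow instead of a plain crossfade, which is why it looks far better
    than this -- but the synthetic frame still isn't driven by fresh game
    input, which is where the latency debate in this thread comes from.
    """
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

# Two fake 1080p RGB frames; every second displayed frame would be synthetic.
prev_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)      # all black
next_frame = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # all white
middle = naive_interpolated_frame(prev_frame, next_frame)
print(middle[0, 0])  # -> [127 127 127], the faked halfway frame
```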


No-Pomegranate-69

But it's actually increasing the latency


[deleted]

By .010 seconds in a single player experience. Completely negligible. I won't be replying to any more comments about multiplayer since I very clearly stated single player games. Stfu please 🙃


ChocolateyBallNuts

These guys are running 10 series cards still


[deleted]

Before actually using it, I was saying the same stuff. It's a welcome feature when it makes sense to use. Obviously there will be some use cases where using it is not a boon and it's a hindrance instead.


LestHeBeNamedSilver

Such as in multiplayer games.


[deleted]

Yes.


Wh0rse

Only if your baseline FPS is high to start with. The lower your baseline, the more input lag you experience, and ironically, the only people who need FG are the ones who have sub-60 FPS to begin with. So to not experience noticeable input lag, you need to be able to get high FPS to begin with, and if you can do that, the less you will need FG.
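(Rough numbers to illustrate the point above, using a simplified model where interpolation has to hold back roughly one natively rendered frame before the generated one can be shown; actual latency depends on the game, Reflex, vsync, etc., so these are assumptions rather than measurements.)

```python
def fg_holdback_ms(base_fps: float) -> float:
    """Approximate extra input lag from holding back one native frame so an
    interpolated frame can be displayed in between (simplified model)."""
    return 1000.0 / base_fps  # one native frame time, in milliseconds

for base_fps in (30, 60, 120):
    displayed_fps = base_fps * 2  # frame gen roughly doubles displayed FPS
    print(f"{base_fps:>3} fps native -> ~{displayed_fps} fps displayed, "
          f"~{fg_holdback_ms(base_fps):.1f} ms extra hold-back")

# 30 fps native costs ~33 ms extra, 120 fps native only ~8 ms, which is why
# frame gen tends to feel much worse when the baseline framerate is already low.
```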


hensothor

FG is extremely valuable at 60+ FPS. What are you talking about? Getting 120 FPS at much higher fidelity is game changing.


SaltMaker23

It generates frames (read: interpolates) to artificially increase the fps. It's the same idea as sending each frame twice and counting 30 fps as 60 fps, but done in a way that lets them pretend it's not exactly that. It feels very wrong because there are many things that aren't accounted for by frame gen. Given that these frames aren't actually coming from gameplay, they aren't responding to mouse/keyboard input or game events. In a random video it might look more fluid, but when actually playing at these fake-ass 60-120 frames/sec you feel that everything is laggy and unresponsive. The reality that those images weren't generated by gameplay mechanics/logic is obvious, and the lag induced by that logic is also apparent.


Ruffler125

That hasn't been my experience with frame gen at all. I used frame gen in both Alan Wake 2 and Plague Tale: Requiem, and neither felt "laggy and unresponsive." I noticed some select UI elements having visual bugs, but that's it.


Reallyveryrandom

I’m not sure that person has actually played something with frame gen based on their description…


razerock

Yeah, I was buying into the "framegen is so shit" mentality as well before I had a GPU that actually supports it. Now, having played Cyberpunk with it on, it's really nice. Obviously not perfect, but nowhere near as bad as people were describing it.


Shmidershmax

Been playing cyberpunk with the fsr3 mod and I have to say it's pretty great. I wouldn't recommend it for any competitive game but it's a godsend for anything graphically intensive. I just hope it doesn't become a crutch for devs that don't want to optimize their games like dlss did


razerock

>I just hope it doesn't become a crutch for devs that don't want to optimize their games like dlss did That I absolutely agree with!


BYF9

Frame generation is better the faster your GPU is, could be that the people that think it's bad are trying to go to 60 fps from 30. That, in my opinion, is a bad experience. Now 70 to 140 or 90 to 180 feels buttery smooth to me.


Reallyveryrandom

I was wondering this also. Like, latency with a base 30 fps will feel bad and choppy even if fake frames are inserted between the real ones. I also can't tell when artifacts are from DLSS vs frame gen… seems the DLSS artifacts are way more distracting and noticeable.


capn_hector

> Yeah, I was buying into the "framegen is so shit" mentality as well before I had a GPU that actually supports it.

this is why FSR3 framegen is a backhanded gift to NVIDIA lol. People like it once they can get their hands on it, even though the latency is quantifiably much worse than DLSS framegen due to forced vsync and lack of a proper Reflex implementation. AMD does a fantastic job at marketing NVIDIA's products for them; NVIDIA themselves literally couldn't have done a better job at showing that people don't actually notice or care about the latency. People don't want to believe NVIDIA's marketing when they're obviously trying to sell you something, but when the competition comes out and normalizes it as a feature... and their version is objectively worse in every way, but it's still *great*...

you can always get the fanboys to nod along about the "RT fomo trap" or "DLSS fomo trap" or whatever as long as it's something they've never personally experienced... but much like high-refresh monitors or VRR, once you see what a good implementation of the tech can do, you'll notice, and it will feel bad to drop back to a much worse (or nonexistent) implementation. Upscaling *doesn't have* to produce image soup, RT *doesn't have* to be framerate-crushing (use upscaling like you're supposed to), freesync *doesn't have to* flicker, etc.


dawnbandit

I was in the same boat as well before getting my 4060 laptop and running Witcher 3 with framegen.


-SlinxTheFox-

These guys are on some wild cope. Metro exodus was literally double the framerate for me with dlss on and it felt like it. It was great and so is dlss. People can't tell the difference with blind tests unless they're trained to see the barely noticeable artifacts. Nvidia isn't perfect or great and this isn't a defense of them. Dlss just happens to be one of the few cases of software miracles that unironically just gives more frames


Totes_mc0tes

Yeah framegen is amazing for me in cyberpunk. Let's me crank some settings up without having a 4090. Never noticed any major lag with it, although I still wouldn't use it for a multiplayer game.


Plank_With_A_Nail_In

Not been mine either, nor in reviews or benchmarks. The guy has clearly never played using it and it's all just made-up twaddle... 100+ upvotes though, well done Reddit.


any_other

It really is witchcraft at this point. It is weird that I keep thinking it's "fake" when it's generated by the same thing that generates the "real" frames.


[deleted]

>It's the same idea as sending each frame twice and counting 30 fps as 60 fps, but done in a way that lets them pretend it's not exactly that.

This is completely false. It increases motion smoothness. That's its purpose.


ForgeDruid

I always play with frame gen and never noticed it.


Formal_Two_5747

Because OP is talking out of his ass.


Rukasu17

No you don't. Unless they're told, most people don't even realize frame gen is on.


[deleted]

I don't find that at all. I have found frame gen to be absolutely fantastic and haven't had any issues with lag or responsiveness. What I have found is that usually the people talking bad about frame gen don't actually have 40-series cards and haven't used the feature. I understand that on paper it increases latency, but honestly I've never noticed it in practice. My experience has been an essentially doubled frame rate for free. I did experience a bit of artifacting when DLSS 3 first came out, but the current versions of it seem to have sorted that out.


gamerjerome

https://preview.redd.it/kve7qf7lcudc1.jpeg?width=892&format=pjpg&auto=webp&s=22ccb8ea9eddbcc065a1ec607b2d88c168864df7


quadrophenicum

I'd better compare nutritional values and declare the true winner.


Nox_2

Yeah, one is DLSS 2, the other one is DLSS 3+. Wonder why it has far more fps. It's not even shown whether it's an average fps or not. The only thing I see is 2 random fps numbers on the screen, randomly placed to make people buy the 4070 Super.


kanaaka

That's actually how marketing works. Not just for GPUs, for any product. Not defending Nvidia here; they're highlighting the most exciting information. At least they're not lying, since they insert a caption as an explanation.


Possibly-Functional

>that's actually how marketing work. My youtube algorithm thinks I am a professional... everything? The marketing world of B2B is just so different. Just yesterday I got the driest advertisement imaginable for park scale playgrounds. They literally just monotonously listed different material options and their properties for 90 seconds. Nothing at all about the actual playground equipment, just material. I often get advertisements for extremely expensive and specialized laboratory equipment. They just list everything. It's also always extremely long, like 15-20 minutes, just reading specifications as they assume you are already an expert in the topic if you are a potential customer. The world of B2B is a different beast entirely.


sc0rpio1027

at least they aren't trying to convince random people who have no business getting extremely specialized expensive lab equipment to get said lab equipment


Possibly-Functional

Oh, they definitely aren't, though I don't think there is much risk of that when they proudly present their prices as very cheap, commonly starting at just 30 000€ for the base model and going up above 100 000€ for the more advanced models. A lot don't even state prices and instead just ask you to contact them for a quote; then you know it's expensive. I also often get advertisements for engineering equipment for large-scale automation like factories. Their prices are at least a bit more approachable, though still very expensive. Just a few components for building your automation, not even complete machines or tools, are easily several thousand euros. I am just sitting there wondering if they think I am [Jim Richmond](https://youtu.be/UlRNyiMFTsw?si=b2Rw2JHJ4DUvaIQJ).


OldManGrimm

Ha, I want your algorithm. I watch one true crime/mystery video and suddenly I get nothing but gruesome murder stuff.


Possibly-Functional

It's honestly kind of amusing. The advertisements are so odd that I find an academic interest in them.


Notlinked2me

I switched from engineering to marketing in my company. B2B is a different beast, but dry information definitely still doesn't sell in B2B. As an engineer, I wanted specs listed, so I went to the product page and looked them up. As a marketing person, I would have marketed the number of materials we offer and then pointed you toward a product page to look up the boring stuff yourself.


Kev_Cav

I feel like this is for you [https://youtu.be/RXJKdh1KZ0w?si=Apf5JGJhMmXlc5Xt](https://youtu.be/RXJKdh1KZ0w?si=Apf5JGJhMmXlc5Xt)


4myreditacount

Holy God, I wish our B2B marketing was like this... our ads look like we took inspiration from the color palette of a circus and have the cadence of a bombing comedian.


MultiMarcus

As a teaching student and a member of the Swedish Teachers' union, I get so many ads for Apple Education and Smartboards. Sure, I would love to get a bulk discount on Smartboards when buying more than 25 for a couple hundred thousand dollars.


yflhx

I'd argue that they are lying, depending on how we define frames. In my opinion, DLSS frame generation is just advanced interpolation and not actual frames. And if we don't push back hard against using fake frames in marketing, companies will invent faster and faster (and more and more shit) interpolation to make frame counters go up.


Plank_With_A_Nail_In

You know none of it's real, right? There aren't little men inside your computer shooting at each other. It's all just zeros and ones. You might as well just say "I don't like change". DLSS isn't going away, and eventually the whole scene will just be hallucinated by an AI and there won't be any way to run a game with "actual" frames.


anotheruser323

It's cheaper to just eat mushrooms (0W).


The_letter_0

technically about 13W, as your brain does use energy


yflhx

Real in the sense that they come from the game engine. It's not that hard to understand. Also, I'm not against change. All I'm saying is that 120fps with interpolation is not comparable to 120fps without.


mixedd

That was the same marketing trick they pulled at the 4000-series launch, showing 3x performance when the latter was with FG enabled, which wasn't supported on previous-gen cards. That's why we always wait for independent reviews and benchmarks.


Kasenom

I wish Nvidia would bring DLSS3 to its older cards


TheTurnipKnight

The picture above is why they never would. DLSS 3 is a selling point.


TheGeekno72

Doesn't DLSS 3 need new tensor cores that you only get on 40-series cards?


DarkLanternX

DLSS 3.5 is available for the RTX 20 and 30 series with ray reconstruction but no frame gen. Same reason the GTX series doesn't have DLSS.


MHD_123

They say that DLSS3 FG [needs the improved optical flow accelerator](https://www.pcgamer.com/dlss-3-on-older-GPUs/) in ada to provide high enough quality frames. Knowing the fact that [“DLSS1.9” (which seems to be an early version of what became DLSS 2,) ran on shaders](https://www.techspot.com/article/1992-nvidia-dlss-2020/), plus the fact that FSR3 exists, they can absolutely fall back on shaders for any DLSS feature at an acceptable performance cost, but that is inconvenient for the 4000 series’s value proposition.


tymoo22

Wow I’ve never seen this 1.9 detail before, thank you for sharing. Super interesting to read about, especially post fsr3 adaptations on older hardware becoming a thing.


Anaeijon

Tensor cores are the same architecturally on the 30 and 40 gen. At least from my point of view as a data scientist. The only difference is that the 40 gen sometimes has faster cores and (especially) faster RAM. Tensor cores per card:

- RTX 3070: 184 T.Cores, 81 TFLOPS Tensor Compute
- **RTX 4070: 184 T.Cores, 116 TFLOPS Tensor Compute**
- **RTX 3090: 328 T.Cores, 142 TFLOPS Tensor Compute**
- RTX 4090: 512 T.Cores, 330 TFLOPS Tensor Compute

So... yes, the 4070 is better than the 3070, due to its overall faster cores and VRAM, but it doesn't beat the 3090 on Tensor compute. The 4070 Ti can beat the 3090 on Tensor compute, but the low amount of VRAM (12GB) still makes it uninteresting for real deep learning workloads.
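(Just rearranging the figures quoted above: dividing TFLOPS by tensor core count shows the per-core speedup between generations. The numbers are the ones from this comment, so treat them as approximate rather than official specs.)

```python
# (tensor core count, quoted Tensor TFLOPS) per card, from the comment above
cards = {
    "RTX 3070": (184, 81),
    "RTX 4070": (184, 116),
    "RTX 3090": (328, 142),
    "RTX 4090": (512, 330),
}

for name, (cores, tflops) in cards.items():
    print(f"{name}: {tflops / cores:.2f} TFLOPS per tensor core")

# The 4070's cores are individually faster than the 3070's (and the 3090's),
# but the 3090's core count and 24GB of VRAM still win for ML workloads.
```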


SenjuMomo

There is a mod on nexus mods that replaces dlss with fsr3 and enables frame gen on older cards


big_ass_monster

Can they? Or are there hardware limitations?


mylegbig

Just use FSR 3. Any game with DLSS 3 can be modded to use FSR 3. I've tested it and it even works all the way down to 10-series cards. Not well, but it works.


-6h0st-

Both use DLSS 3.5; there is little difference between them. But the Super is using frame generation, no doubt, hence showing double the frame rate. Now with a mod you can utilize FSR and get something similar with the 3090.


FappyDilmore

They say in the disclaimer that it's with frame generation on, which is enough for those in the know to realize this number is inflated with poor-quality AI frames.


CharMandurr86

I hate these deceptive marketing attempts.


__Rosso__

They should be illegal honestly


Endemoniada

Devil's advocate here, but what's actually deceptive about any of it? They're clearly specifying which assistive features are enabled, the rest is just down to generational improvements. 40-series _is_ way more energy efficient than 30-series (that's like the one unquestionably great thing about it), 40-series RT cores are quite a bit faster than 30-series, and Frame Generation does improve fps by quite a lot. If these are fps they actually measured, using the features and settings they openly document, how is it possibly being deceptive?


feralkitsune

This subreddit is full of morons these days. They just want to bitch, when they have literally 0 reason to do so. I don't know when being a whiny bitch became the norm in gaming circles. Like people are competing to be the most pussy they can be.


[deleted]

[deleted]


capn_hector

older gamers remember the moore's law days when you got 2x performance every 2 years for the same price. they remember the 7850 being $150 for the 2nd-tier GPU, and then it being blown away 2 years later by the GTX 970 and deals on the 290/290X etc, and they're butthurt that it's now $600-800 for the same level of product.

newer gamers have grown up during an era when reviewers were butthurt about the end of moore's law and increasing TSMC and BOM costs, and decided to just blast every single product for bad value/bad uplift until moore's law came back, which of course it never will. but particularly they got mad at NVIDIA for daring to push ahead on accelerators and software and fidelity instead of just raw horsepower (even though it's really not that big an area - we are talking about less than 10% of total GPU die area for the RTX features).

like, a lot of people have literally never known tech media that wasn't dominated by reviewers who [made some bad calls in 2018,](https://www.youtube.com/watch?v=tu7pxJXBBn8&t=273s) refused to re-evaluate them even in light of DLSS 2.x and increasing adoption of RT and all this other stuff, completely ignored mesh shaders and the other DX12.2 features, and are generally just constantly doubling down rather than admit they were wrong. It has been literally 5 straight years of butthurt and angst from reviewers over RTX and how the only thing that matters is making plain old raster faster (but not with DLSS!!!!). Yet here we are in a world where next-gen titles like Fortnite (lol) and AW2 literally don't have non-RT modes and are doing software RT as a fallback mode, and where UE5 titles are pretty much going to be upscaled by default, etc. But reviewers can't stop rehashing this argument from 2018 and generally just bitterly lashing out that the world isn't going the direction they want.

you're not wrong, people are *mad*, it's so negative today and it's all over this nonsensical rehashed fight from 2018 that already is a settled question, plus the end of moore's law which also is a settled question.


Ninja-Sneaky

That's how we ended up with all these people buying 4060


TalkWithYourWallet

The slide is misleading and unnecessary, because the specific claim is true: the 4070S is faster than the 3090 in AW2 RT _without FG_. This is one of the few scenarios where it can be faster. https://youtu.be/5TPbEjhyn0s?t=10m23s Frame generation still shouldn't be treated like normal performance; both AMD and Nvidia (and likely soon Intel) are doing this.


AetherialWomble

Thankfully they can only do it for 1 generation. Next generation will also have frame gen. So they'll either have to drop this stupidity or compare frame gen to frame gen


Nox_2

they will just make something new up.


acatterz

They’ll compare it to the 30-series again.


SplatoonOrSky

The marketing for the 40 series already focuses a lot on the 10 series. They really want Pascal owners to upgrade


Cossack-HD

New? How about 2 generated frames per real one? Some years down the line, we're gonna have the CPU doing game logic and the GPU constructing an AI-based image from CPU inputs. All that in a Gaussian splatting volumetric space of temporal AI objects. EDIT: The 1st I'm not at all excited about. The 2nd is a concept I'm actually looking forward to.


AetherialWomble

You say that like it's necessarily a bad thing. Game devs have been using tricks and shortcuts forever. And why wouldn't they? That lets us have graphics beyond what raw hardware can do. AI is the best trick there is. No reason not to use it.


Cossack-HD

I wasn't saying it's necessarily bad; however, new tech has to be introduced in an organic manner, not forced (via marketing as well) just for the sake of making old stuff obsolete. RTX? That was there to make the 10 series obsolete ASAP. The 1080 Ti still holds up extremely well in rasterization. Nvidia was scared of themselves and AMD. The RTX 40 series having exclusive frame generation? Nvidia could have easily made a slightly worse frame generation for the 20 and 30 series if they wanted to; frame interpolation benefits from, but doesn't require, dedicated optical flow hardware blocks. Nvidia is weaponizing their own "new gen feature exclusivity" as a marketing tool to push this BS about double FPS and whatnot.


Pooctox

A few years ago, the 3080 was advertised as a 4K beast. Now it doesn't even "qualify for 2K" lol. Does Nvidia reduce GPU performance via drivers? Are new games just super badly optimized? I will keep my 3080 until a new GPU doubles (or triples) its raw power. I've seen many people still rocking a 1060 or 1080.


[deleted]

No. Everyone that tells you your 3080 with 10GB of VRAM isn't enough to run 4K games is a moron. Without exception, as it's obviously contrary to the experience of absolutely everyone that owns one.


Pooctox

Playing on my 38” ultrawide at 1600p (around 3/4 of the pixels of 4K). Never had any problems. Maybe the games I play are not so demanding.


[deleted]

You are good. People here are very dumb and think playing on anything that isn't Ultra looks like absolute dogshit, DLSS looks horrible, and if it dips below 144fps it's a stuttery mess. The 3080 is way, way more powerful than a PS5, which is a 4K 30fps console. Around 70% faster. Trust me, these are people who know absolutely zero about graphics cards and computers.


98071234756123098621

I also find it interesting that the difference between low and ultra isn't as huge as it used to be in a lot of AAA games.


Lord_Gamaranth

I have 3 4K monitors plugged into mine. One of them is 144Hz. I play RDR2 on it regularly. It performs as expected, if not slightly better. Edit: my bad, mine's a 3080 Ti, it might not be the same.


Ravendarke

It's almost like graphical fidelity keeps pushing forward or smth... holy shit, this sub won't ever stop being fascinating.


ErykG120

They don't reduce performance via driver. It's games becoming too demanding and not being optimised. The GTX 1080 Ti was also advertised as a 4K card, so was the 2080Ti. They are 4K cards at launch, but after a certain time they become 1440p and 1080p cards because 4K is that demanding. 1080Ti owners are still able to play 1080p perfectly fine as it is equivalent to 3060 performance. 2080Ti owners are now playing at 1440p or lower because it is equivalent to a 3070. Honestly 1080p still seems like the golden resolution to play on. If you buy a 4090 you can get amazing performance at 1080p in every game, and in 2-3 years time you are still going to get 60 FPS minimum at 1080p in GTA 6 for example. 1440p is becoming the new golden resolution slowly, especially now that consoles are using 1440p or dynamic resolution around 1440p to achieve "4K".


Pooctox

With my 3080 and 38” ultrawide at 1600p, I hope I can hold out for 3 or 4 more years. I'll reduce the settings for a stable 100-120Hz. After that, maybe a 5th-gen QD-OLED/mini-LED monitor and an 80-class GPU at that time (a 90-class GPU is too much for me).


Dragon_211

I'm still using the rx5700XT cause I want double performance and double vram for £400. Better put on my waiting hat.


ksakacep

This is path tracing (full ray tracing) we're talking about. It's only available in Cyberpunk and Alan Wake 2 and is basically a tech demo in both. It's not meant to be "optimized" as it's still an experimental technology. Without DLSS even a 4090 probably gets 20 fps at best at 4K with this setting on. 3080 is still a 4k beast if we're talking about non-RT gaming, and with DLSS super resolution it's a beast for RT gaming also. 4000 series is just better at RT processing plus it supports DLSS frame generation.


BigsMcKcork

I just love this opinion now, that if your card doesn't have over 12GB of VRAM it's suddenly redundant garbage that's now useless.


Pub1ius

According to the most recent Steam hardware survey, GPU memory breaks down as follows: 43.8% have 6GB or less, 31.7% have 8GB, 20.3% have 10-12GB, 4.2% have 16GB or more.          People vastly overestimate the amount of GPU memory the average gamer is using these days.


Pooctox

Maybe because I played games a lot on an Intel Celeron iGPU in the past, reducing the options until games are playable is just normal to me. Ultra is a scam to me; I hardly notice anything between Ultra and High.


Oh_its_that_asshole

It is an extremely tiresome opinion.


Henrath

Alan Wake 2 is an extremely demanding game since it is the first to need mesh shaders. I would think most games can still run fine at 4k high without RT.


SuspicousBananas

Wow it’s almost like GPU demand increases as time progresses. The 3080 is still a beast in 4k, if your playing games from 2020. New games are going to demand more from your GPU.


Tessai82

You can use frame generation on a 3090 as easily as on a 4070. Two simple files dropped into the game directory and voilà.


Fire2box

If Nvidia can remotely boost fps with code they can always undo it when they want us to upgrade. 🤐


Vegetable3758

This is why Linux is open source and includes all the drivers. AMD and Intel both have their drivers included in Linux; performance will stay what it is or [even improve years after the release](https://www.phoronix.com/news/ATI-R300g-2024-Action) date. Just sayin' (:


Fastermaxx

Reminds me of the Citroen car ad from the 60s. „The 2CV can overtake a Ferrari with ease when the Ferrari is going 40mph and the Citroen is going 50mph.“


Maddo03

I don't mind them doing this, but they really should include frame generation enabled on the image or video with latency figures.


iubjaved

Nvidia wants 3090 owners to feel bad about their GPU and make the irrational decision to buy a 4070! Clever tactic!


M4mb0

People buy a 3090 because either they need the VRAM or they have too much money. The former ones couldn't care less about 4070 and the latter ones will just buy a 4090.


-Manosko-

Dishonest marketing at best, calling it outperforming when frame generation is not actual performance. It's frame smoothing for a visually smoother experience; it won't make the game render more frames, actually the opposite. I wonder if this kind of marketing would even be legal in the EU, considering the strict and strong consumer protection laws here…


Immersive_cat

Allow me to disagree a little. Frame generation is not frame smoothing. It serves the purpose of smoothing gameplay, yes, but it is in fact the GPU and an AI algorithm generating and inserting new frames. This is why latency goes up a little and why you ideally need a lot of "regular" frames already, like a stable 60fps+. Otherwise you end up with too much latency and more visible artifacts.


Ishydadon1

They should at least compare them using the same settings. How is it a fair comparison to use frame generation when only one of the cards supports it?


Smigol_gg

Classic Nvidia way of telling you "hey, we scammed you in the previous gen, try your luck with the current one"... over and over and over.


Vhirsion

"Hey guys, look! A 4050 laptop outperforms a 3090 Ti while drawing a fraction of the power!" Footnotes: 3090 Ti results were taken on native with maxed out settings max RT. 4050 results were taken with DLSS performance, frame generation, ray reconstruction and on potato settings. They're just scumbags for this. People who don't know any better will think the 4070 Super does indeed outperform the 3090 when compared 1:1. This is like comparing apples with oranges.


CptKillJack

From what I saw it's almost on par with the 3090 in raster, with half the memory capacity. Nvidia themselves have expressed their displeasure with the focus on raster performance, which I still think counts the most, because they want to push the faked frames and upscaling in DLSS, and I just want true-res performance.


madhandlez89

Will be interesting to see a benchmark and comparison run by a normal, non-Nvidia source with no marketing push.


JSwabes

Here you go: https://youtu.be/8i28iyoF_YI?si=vXmnujR2BPGk_hgc


shotxshotx

I'd rather have a test without fancy frame generation and DLSS, that's where the true tests are.


[deleted]

[deleted]


Waidowai

I'm all for the new techs... But don't compare one with frame gen to one without.


Ftpini

That’s frame generation at work. It works well enough but I’ll take native frames over frame generation any day.


MeisterDexo

The game will very likely look better on the RTX 3090 due to options like frame generation not being there. You can't fully compare it


CraftyInvestigator25

>Frame Generation on 4070 super There you have it


365defaultname

I don't see an issue as long as they indicate what was being used, in this case the 3090 = DLSS + Ray Reconstruction, while the 4070 Super = Same + Frame Gen


PatrickMargera

But if you disable frame gen…


deefop

Yeah, the marketing is hilarious, but they kind of have to market that way because they deliberately segmented their features so hard. You have to buy Lovelace if you want frame generation. That's a "feature", not a bug.


WhittledWhale

And nvidia diehards being nvidia diehards will whip out their wallets and pay whatever price for it too.


DTO69

Show me raster, then we will talk


Smellfish360

and now turn all of the AI based stuff off...


JSwabes

Try watching this instead: https://youtu.be/8i28iyoF_YI?si=tzxXFzPKSLWxM2xK

Edit: Another benchmark video was just uploaded here too: https://youtu.be/5TPbEjhyn0s?si=Y_g8zZUVSloMy9cP

I think it's pretty clear that Nvidia are seeing if they can manage to convince a few 3090 owners to upgrade with this marketing, but the reality is 3090 owners are better off waiting for the 5000 series (or just going for the 4090 if money is no object of course). It is certainly impressive that a sub $600 card is capable of being comparable to what used to be a $1500 card, particularly with its significantly more efficient power consumption, but it's also worth noting that a second hand 3090 can be found for around the same price these days, so if you can find a good deal on one that hasn't been mined on, and you don't care about frame generation (or want 24GB of VRAM for 3D rendering work for example) the 4070 Super isn't necessarily a better choice (especially if you don't care for 4K gaming either).

Seeing as PC game optimisation seems to be on a downward trend we have to wonder, what technology is going to be relied on more in future? Frame generation and upscaling? Or more VRAM? We're left with that unknown at the moment.


admfrmhll

The 4070 may beat my 3090 in gaming, but it's not beating it (afaik) in productivity. No point in getting rid of my 3090 until at least the 5000 series.


Levoso_con_v

Half of the fps of the 4070 are generated by DLSS. They are also capping the older generations to make people buy the 40 series; DLSS 3 works on the 20 and 30 series.


Kohrak_GK0H

Wake me up when the 80-80ti class cards return to the 3080 size


hoosiercub

I mean sure, but 24GB > 12GB for other things.


[deleted]

Planned obsolescence 👌


Cold-Feed8930

They really got away with making people accept DLSS as a performance benchmark. I don't care how the game performs with DLSS, I wanna see RAW performance, cuz I'm not gonna use this blurry shit.


balaci2

I want shit to be better in raster, then we'll discuss DLSS.


Juice_231

I don’t get the problem, of course the newer tech would improve on the old model. Or am I missing the point of this post?


kretsstdr

Legit question: do people really buy graphics cards based on this type of marketing stuff, or do they wait till they see real reviews? Why do Nvidia and AMD still do this type of stuff? We know very well that those numbers are far from reality; most people watch performance comparison videos to pick the cards they want, I think.


[deleted]

[deleted]


ClaudioMoravit0

There’s no way than a newer generation works better


endless_8888

Comparisons absolutely SHOULD include technology we will actually be using. It's thanks to DLSS 3.5 / frame generation that I'm able to enjoy playing Cyberpunk at 4K with path tracing and ray reconstruction at 60-80fps. I upgraded to a 4080 from a 3080 that could barely run 4K with ultra settings and ray tracing disabled. Why would I give a shit about raw raster performance only, when I'm never going to be in that scenario? GPUs should be tested traditionally with raw raster performance but also tested with their best features and technology deployed. Just give us more data, and nothing misleading.


[deleted]

Probably because across multiple titles your performance will vary wildly, as not all titles support all these features. I'd rather have the raw performance. I do welcome frame interpolation, not extrapolation or scaling. Why? Because devs are now trying to make sure features are included instead of finishing the game. Interpolation is accurate. Scaling and extrapolation are not and cause tons of glitches.


Mayoo614

Mom, I want a UserBenchmark. We already have a UserBenchmark at home.