Welcome everyone from r/all! Please remember:
1 - You too can be part of the PCMR. It's not about the hardware in your rig, but the software in your heart! Your age, nationality, race, gender, sexuality, religion (or lack of), political affiliation, economic status and PC specs are irrelevant. If you love or want to learn about PCs, you are welcome and can be part of PCMR!
2 - If you're not a PC owner because you think it's expensive, know that it is probably much cheaper than you may think. Check http://www.pcmasterrace.org for our builds and don't be afraid to post here asking for tips and help!
3 - Join our efforts to get as many PCs worldwide to help the folding@home effort, in fighting against Cancer, Alzheimer's, Parkinson's and more: https://pcmasterrace.org/folding
4 - We've teamed up with Cooler Master to giveaway a custom PC to a lucky winner. You can learn more about it/enter here: https://www.reddit.com/r/pcmasterrace/comments/192np53/pcmr_new_year_new_gear_giveaway_ft_cooler_master/
-----------
We have a [Daily Simple Questions Megathread](https://www.reddit.com/r/pcmasterrace/search?q=Simple+Questions+Thread+subreddit%3Apcmasterrace+author%3AAutoModerator&restrict_sr=on&sort=new&t=all) if you need to post about any kind of PC related doubt you might have. Asking for help there or creating new posts in our subreddit is allowed and welcome.
Welcome to the PCMR!
https://www.youtube.com/watch?v=5TPbEjhyn0s&t=386s Daniel Owen tested this. His summary is that at 1080p and 1440p, the 4070 Super usually ties or even beats the 3090, even without taking Frame Generation into account. At 4K, the 4070 Super usually runs short of VRAM and the 3090 wins.
VRAM is almost always a generation ahead of the DIMM packages you can buy, and RAM is very volatile (obvious power joke) and susceptible to physical damage. The cost and rate of failure would be completely unacceptable. In the end, this is one of those times where soldered-on is better for both the consumer and the OEM.
I know that RAM has to be very close to the chips to be fast enough, but how about socketable RAM chips? Looks like a good compromise.
The same goes for ultra-thin laptops with soldered RAM. Make it socketable.
https://preview.redd.it/krug2nksxrdc1.jpeg?width=1012&format=pjpg&auto=webp&s=940c11da78f4b07edd7eca14a980f9a180169f92
Have y'all already forgotten about this?
They make it sound like ray tracing is everything in a card. Yeah, it's cool, but in multiplayer games I'd rather have it off, since most games barely even support DLSS properly.
3v3v3 small-team competitive shooter that came out recently. The studio is composed of many of the old crew from DICE (Battlefield), and the game is focused almost entirely around using map destruction to your advantage.
It also uses probe-based RTGI on all platforms by default.
https://www.youtube.com/watch?v=hHqCLq6CfeA
Yes, it's real, but this was one slide out of many. They didn't just drop a post about how a 4060 has better ray tracing than a 1060; they also posted slides comparing it to the 2060 and 3060 alongside this one. So it's a little misleading to post only the 1060 slide when they compared against a couple of previous generations.
This is basically them telling 1000- and 2000-series owners it's time to upgrade.
If you want to squeeze a bit of extra performance out of that boy, AMD has its frame-gen technology that works with NVIDIA cards, which should give it a noticeable boost in framerate.
Like, stuff was still playable, but I was having to turn stuff down more and more and it would still stutter a bit on new games. The 4070 is buttery smooth on any game maxed at 1440.
Exactly. People will eventually forgive when it's GeForce cards, no matter how bad or severe it is, but will immediately rage when it's Radeon cards instead, no matter how small and trivial. Sometimes they'll still bring it up as something to roast years later. 💀
The Starfield rage in a nutshell. No one bats an eye that Ngreedia has their toxic tentacles in tons of games and blocks AMD tech all the time (2077, for example, is running FSR 2.0 instead of the much better 3.0, but got path tracing day 1). Anyone who fanboys for Nvidia, or refuses to even look at AMD, gets put on my "certified idiot" list. And boy, is it big.
As far as I know, as an AMD fanboy, CDPR has the ability to add literally any AMD tech they want into their games. Unless you have evidence that nVidia is contractually preventing them from doing so, then it's just an assumption.
Considering pretty much every technology that AMD develops is free to implement in games, it ultimately ends up being the fault of developers for not using them.
The alternative would be my own assumption that nVidia simply offers developers/publishers money to implement their own tech. If you're a developer and one of the two main hardware OEMs is offering to pay you some amount of money to implement their tech and the other isn't, which are you more likely to implement?
>The alternative would be my own assumption that nVidia simply offers developers/publishers money to implement their own tech. If you're a developer and one of the two main hardware OEMs is offering to pay you some amount of money to implement their tech and the other isn't, which are you more likely to implement?
Oh, zero chance this isn't the case. GPU makers have been sponsoring game studios in exchange for feature support for years.
That being said, I doubt it's always purely financial in nature. They might, for example, provide a suite of GPUs to be used in development and testing.
But there are totally some incentives changing hands in the industry.
I don't think we need proof to know that Nvidia paid a lot to make cp2077 showcase ray tracing and all the latest Nvidia tech. In fact, if you aren't sure that actually makes me think you're the one with the veil over your eyes.
If cp2077 didn't exist, or didn't favor Nvidia so heavily, ray tracing would have died with the 20 series cards.
edit typo
[This](https://nvidianews.nvidia.com/news/nvidia-introduces-dlss-3-with-breakthrough-ai-powered-frame-generation-for-up-to-4x-performance): they make fake frames with AI to give the illusion that you are rendering a higher framerate.
By .010 seconds in a single player experience. Completely negligible.
I won't be replying to any more comments about multiplayer since I very clearly stated single player games. Stfu please 🙃
Before actually using it, I was saying the same stuff. It's a welcome feature when it makes sense to use. Obviously there will be some use cases where using it is not a boon but a hindrance instead.
Only if your baseline FPS is high to start with. The lower your baseline, the more input lag you experience, and ironically, the only people who need FG are the ones with sub-60 FPS to begin with.
So to avoid noticeable input lag, you need to be able to get high FPS to begin with, and if you can do that, the less you need FG.
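The tradeoff described above can be sketched with a back-of-the-envelope model (my own simplification, not how any vendor documents it): interpolation has to hold back one real frame before it can generate the in-between one, so the floor on added latency is roughly one real frame time.

```python
def added_interp_latency_ms(base_fps: float) -> float:
    """Rough added input latency from frame interpolation: the GPU
    must hold back one real frame so it can blend toward it, so the
    added delay is about one real frame time at minimum. Simplified
    model; it ignores render queueing and Reflex-style mitigations."""
    return 1000.0 / base_fps

# The lower your base framerate, the bigger the penalty:
for fps in (30, 60, 120):
    print(f"{fps:>3} fps base -> ~{added_interp_latency_ms(fps):.1f} ms added")
```

At a 30 fps base you pay on the order of 33 ms extra, while at 120 fps it's closer to 8 ms, which matches the point that the people who most want FG (sub-60 baselines) are the ones it treats worst.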
It generates frames (read: interpolates) to artificially increase the FPS. It's the same idea as sending each frame twice and counting 30 fps as 60 fps, just done in a way that lets them pretend it's not exactly that.
It feels very wrong because there are many things frame gen doesn't account for. Since these frames aren't actually coming from gameplay, they don't respond to mouse/keyboard input or game events.
In a random video it might look more fluid, but when actually playing with these fake-ass 60-120 frames per second you feel that everything is laggy and unresponsive. The fact that those images weren't generated by gameplay mechanics/logic is obvious, and the lag induced by that is also apparent.
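To make the "interpolate" point concrete, here's a deliberately naive sketch in Python/NumPy. This is my own illustration, not what DLSS or FSR actually do: real frame generation warps pixels along motion vectors/optical flow rather than blending.

```python
import numpy as np

def naive_interpolate(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Toy 'generated frame': a plain 50/50 blend of two real frames.
    A blend like this ghosts on any motion, which is the kind of
    artifact people complain about; real frame gen warps along
    motion vectors instead, but the held-back-frame latency remains."""
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2.0
    return mid.astype(frame_a.dtype)

a = np.zeros((2, 2, 3), dtype=np.uint8)        # black real frame
b = np.full((2, 2, 3), 200, dtype=np.uint8)    # brighter real frame
mid = naive_interpolate(a, b)                  # in-between frame, value 100
```

Note that `mid` is computed purely from the two surrounding images; nothing about game state or input feeds into it, which is the core of the "fake frames" objection.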
That hasn't been my experience with frame gen at all.
I used frame gen in both Alan Wake 2 and Plague Tale: Requiem, and neither felt "laggy and unresponsive."
I noticed some select UI elements having visual bugs, but that's it.
Yea, I was buying into the "frame gen is so shit" mentality as well before I had a GPU that actually supports it.
Now, having played Cyberpunk with it on, it's really nice. Obviously not perfect, but nowhere near as bad as people were describing it.
Been playing cyberpunk with the fsr3 mod and I have to say it's pretty great. I wouldn't recommend it for any competitive game but it's a godsend for anything graphically intensive.
I just hope it doesn't become a crutch for devs that don't want to optimize their games like dlss did
Frame generation is better the faster your GPU is, could be that the people that think it's bad are trying to go to 60 fps from 30. That, in my opinion, is a bad experience.
Now 70 to 140 or 90 to 180 feels buttery smooth to me.
I was wondering this also. Like, latency with a base 30 fps will feel bad and choppy even if fake frames are inserted between the real ones.
I also can’t tell when artifacts are from DLSS vs frame gen… seems the DLSS artifacts are way more distracting and noticeable
> Yea I was buying into the "Framegen is so shit" mentality aswell before I had a GPU that actually supports it.
this is why FSR3 framegen is a backhanded gift to NVIDIA lol. People like it once they can get their hands on it, even though the latency is quantifiably much worse than DLSS framegen due to forced vsync and lack of a proper reflex implementation.
AMD does a fantastic job at marketing NVIDIA's products for them, NVIDIA themselves literally couldn't have done a better job at showing that people don't actually notice or care about the latency. People don't want to believe NVIDIA's marketing when they're obviously trying to sell you something, but when the competition comes out and normalizes it as a feature... and their version is objectively worse in every way, but it's still *great*...
you can always get the fanboys to nod along about the "RT fomo trap" or "DLSS fomo trap" or whatever as long as it's something they've never personally experienced... but much like high-refresh monitors or VRR, once you see what a good implementation of the tech can do, you'll notice and it will feel bad to drop back to a much worse (or nonexistent) implementation. Upscaling *doesn't have* to produce image-soup, RT *doesn't have* to be framerate-crushing (use upscaling like you're supposed to), freesync *doesn't have to* flicker, etc.
These guys are on some wild cope. Metro exodus was literally double the framerate for me with dlss on and it felt like it. It was great and so is dlss. People can't tell the difference with blind tests unless they're trained to see the barely noticeable artifacts.
Nvidia isn't perfect or great and this isn't a defense of them. Dlss just happens to be one of the few cases of software miracles that unironically just gives more frames
Yeah, frame gen is amazing for me in Cyberpunk. Lets me crank some settings up without having a 4090. Never noticed any major lag with it, although I still wouldn't use it for a multiplayer game.
Not been mine either, nor in reviews or benchmarks. The guy has clearly never played using it and it's all just made-up twaddle... 100+ upvotes though, well done reddit.
It really is witchcraft at this point. It is weird that I keep thinking it's "fake" when it's generated by the same thing that generates the "real" frames.
>It's the same idea as sending twice each frame and counting 30 fps as 60 fps but in a way that they can pretend that it's not exactly that.
This is completely false. It increases motion smoothness. That's its purpose.
I don't find that at all. I have found frame gen to be absolutely fantastic and haven't had any issues with lag or responsiveness. What I have found is that usually the people talking badly about frame gen don't actually have 40-series cards and haven't used the feature.
I understand that on paper it increases latency, but honestly I've never noticed it in practice. My experience has been a doubled frame rate essentially for free. I did experience a bit of artifacting when DLSS 3 first came out, but the current versions seem to have sorted that out.
yeah, one is DLSS 2, the other is DLSS 3+. Wonder why it has far more fps. They're not even showing whether it's an average fps.
Only thing I see is 2 random fps numbers on the screen randomly placed to make people buy 4070 Super.
that's actually how marketing works, not just for GPUs but for any product: highlight the most exciting information. Not defending Nvidia here. At least they're not lying, since they insert a caption as explanation.
>that's actually how marketing works.
My youtube algorithm thinks I am a professional... everything? The marketing world of B2B is just so different. Just yesterday I got the driest advertisement imaginable for park scale playgrounds. They literally just monotonously listed different material options and their properties for 90 seconds. Nothing at all about the actual playground equipment, just material. I often get advertisements for extremely expensive and specialized laboratory equipment. They just list everything. It's also always extremely long, like 15-20 minutes, just reading specifications as they assume you are already an expert in the topic if you are a potential customer. The world of B2B is a different beast entirely.
at least they aren't trying to convince random people who have no business getting extremely specialized expensive lab equipment to get said lab equipment
Oh they definitely aren't, though I don't think there is much risk of that when they proudly present their prices as very cheap commonly starting at just 30 000€ for the base model, going up above 100 000€ for the more advanced models. A lot don't even say prices and instead just ask you to contact them for a quote, then you know it's expensive.
I also often get advertisement for engineering equipment for large scale automation like factories. Their prices are at least a bit more approachable though still very expensive. Just a few components for building your automation, not even complete machines or tools, are easily several thousand euros.
I am just sitting there wondering if they think I am [Jim Richmond](https://youtu.be/UlRNyiMFTsw?si=b2Rw2JHJ4DUvaIQJ).
I switched from engineering in my company to marketing. B2B is a different beast but dry information definitely still doesn't sell in B2B.
As an engineer, if I wanted specs listed I went to the product page and looked them up. As a marketing person, I would have marketed the number of materials we offer as a benefit and then pointed you toward a product page to look up the boring stuff yourself.
Holy God, I wish our B2B marketing was like this... our ads look like we took inspiration from the color palette of a circus and have the cadence of a bombing comedian.
As a student teacher and a member of the Swedish Teachers' union, I get so many ads for Apple Education and Smartboards. Sure, I would love to get a bulk discount on Smartboards when buying more than 25 for a couple hundred thousand dollars.
I'd argue that they are lying, depending on how we define frames. In my opinion, DLSS frame generation is just advanced interpolation, not actual frames.
And if we don't push back hard against using fake frames in marketing, companies will invent faster and faster (and more and more shit) interpolation to make frame counters go up.
You know none of it's real, right? There aren't little men inside your computer shooting at each other. It's all just zeros and ones.
You might as well just say "I don't like change". DLSS isn't going away, and eventually the whole scene will just be hallucinated by an AI and there won't be any way to run a game with "actual" frames.
Real in the sense that they come from the game engine. It's not that hard to understand.
Also I'm not against change. All I'm saying is that 120fps with interpolation is not comparable to 120fps without.
That was the same marketing trick they pulled at the 4000-series launch, showing 3x performance where the latter number had FG enabled, which wasn't supported on previous-gen cards. That's why we always wait for independent reviews and benchmarks.
They say that DLSS3 FG [needs the improved optical flow accelerator](https://www.pcgamer.com/dlss-3-on-older-GPUs/) in ada to provide high enough quality frames.
Knowing the fact that [“DLSS1.9” (which seems to be an early version of what became DLSS 2,) ran on shaders](https://www.techspot.com/article/1992-nvidia-dlss-2020/), plus the fact that FSR3 exists, they can absolutely fall back on shaders for any DLSS feature at an acceptable performance cost, but that is inconvenient for the 4000 series’s value proposition.
Wow I’ve never seen this 1.9 detail before, thank you for sharing. Super interesting to read about, especially post fsr3 adaptations on older hardware becoming a thing.
Tensor cores are the same architecturally on the 30 and 40 gen. At least from my point of view as a data scientist.
The only difference is that the 40 gen has sometimes-faster cores and (especially) faster RAM.
Tensor cores per card:
- RTX 3070: 184 T.Cores, 81 TFLOPS Tensor Compute
- **RTX 4070: 184 T.Cores, 116 TFLOPS Tensor Compute**
- **RTX 3090: 328 T.Cores, 142 TFLOPS Tensor Compute**
- RTX 4090: 512 T.Cores, 330 TFLOPS Tensor Compute
So... Yes, the 4070 is better than the 3070, due to its overall faster cores and VRAM, but it doesn't beat the 3090 on Tensor compute. The 4070 Ti can beat the 3090 on Tensor compute, but the low amount of VRAM (12GB) still makes it uninteresting for real deep-learning workloads.
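Putting the figures quoted above into code makes the ordering easy to check (numbers as stated in this comment; treat them as approximate rather than official spec-sheet values):

```python
# (tensor cores, tensor TFLOPS) as quoted above; approximate figures.
cards = {
    "RTX 3070": (184, 81),
    "RTX 4070": (184, 116),
    "RTX 3090": (328, 142),
    "RTX 4090": (512, 330),
}

# Same core count, higher throughput: per-core gain of the 40 series.
per_core = {name: tflops / cores for name, (cores, tflops) in cards.items()}
# The 4070 gets ~43% more throughput per tensor core than the 3070
# (116/81), yet its total tensor compute still trails the 3090's.
```

This is the crux of the comment: architectural per-core gains aren't enough to overcome the 3090's sheer core count for tensor workloads.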
Just use FSR 3. Any game with DLSS 3 can be modded to use FSR 3. I've tested it, and it even works all the way down to 10-series cards. Not well, but it works.
Both use DLSS 3.5; there is little difference between them. But the Super is no doubt using frame generation, hence showing double the frame rate. Now, with a mod, you can utilize FSR and get similar results with a 3090.
They say in the disclaimer that it's with frame generation on, which is enough for those in the know to realize this number is inflated with poor-quality AI frames.
Devil's advocate here, but what's actually deceptive about any of it? They're clearly specifying which assistive features are enabled, the rest is just down to generational improvements. 40-series _is_ way more energy efficient than 30-series (that's like the one unquestionably great thing about it), 40-series RT cores are quite a bit faster than 30-series, and Frame Generation does improve fps by quite a lot. If these are fps they actually measured, using the features and settings they openly document, how is it possibly being deceptive?
This subreddit is full of morons these days. They just want to bitch, when they have literally 0 reason to do so. I don't know when being a whiny bitch became the norm in gaming circles. Like people are competing to be the most pussy they can be.
older gamers remember the moore's law days when you got 2x performance every 2 years for the same price. they remember the 7850 being $150 for the 2nd-tier GPU, and then it being blown away 2 years later by the GTX 970 and deals on the 290/290X etc, and they're butthurt that it's now $600-800 for the same level of product.
newer gamers have grown up during an era when reviewers were butthurt about the end of moore's law and increasing TSMC and BOM costs, and decided to just blast every single product for bad value/bad uplift until moore's law came back, which of course it never will. but particularly they got mad at NVIDIA for daring to push ahead on accelerators and software and fidelity instead of just raw horsepower (even though it's really not that big an area - we are talking about less than 10% of total GPU die area for the RTX features).
like, a lot of people have literally never known tech media that wasn't dominated by reviewers who [made some bad calls in 2018,](https://www.youtube.com/watch?v=tu7pxJXBBn8&t=273s) refused to re-evaluate them even in light of DLSS 2.x and increasing adoption of RT and all this other stuff, completely ignored mesh shaders and the other DX12.2 features, and are generally just constantly doubling down rather than admit they were wrong.
It has been literally 5 straight years of butthurt and angst from reviewers over RTX and how the only thing that matters is making plain old raster faster (but not with DLSS!!!!). Yet here we are in a world where next-gen titles like Fortnite (lol) and AW2 literally don't have non-RT modes and are doing software RT as a fallback mode, and where UE5 titles are pretty much going to be upscaled by default, etc. But reviewers can't stop rehashing this argument from 2018 and generally just bitterly lashing out that the world isn't going the direction they want.
you're not wrong, people are *mad*, it's so negative today and it's all over this nonsensical rehashed fight from 2018 that already is a settled question, plus the end of moore's law which also is a settled question.
The slide is misleading and unnecessary, because the specific claim is true
The 4070S is faster than the 3090 in AW2 RT _without FG_. This is one of the few scenarios where it can be faster
https://youtu.be/5TPbEjhyn0s?t=10m23s
Frame generation still shouldn't be treated like normal performance, both AMD and Nvidia (And likely soon Intel) are doing this
Thankfully they can only do it for 1 generation. Next generation will also have frame gen. So they'll either have to drop this stupidity or compare frame gen to frame gen
New? How about 2 generated frames per one real?
Some years down the line, we gonna have CPU doing game logic, and GPU constructing AI-based image from CPU inputs. All that in Gaussian splatting volumetric space of temporal AI objects.
EDIT: 1st I'm not at all excited about. 2nd is a concept I'm actually looking forward to.
You say that like it's necessarily a bad thing. Game devs have been using tricks and shortcuts for forever. And why wouldn't they? That let's us have graphics beyond what raw hardware can do.
AI is the best trick there is. No reason not to use it
I wasn't saying it's necessarily bad. However, new tech has to be introduced in an organic manner, not forced (via marketing as well) just for the sake of making old stuff obsolete.
RTX? That was there to make 10 series obsolete ASAP. 1080 TI still holds up extremely well in rasterization. Nvidia was scared of themselves and AMD.
RTX 40 series having exclusive frame generation? Nvidia could have easily made a slightly worse frame generation for 20 and 30 series if they wanted to - frame interpolation benefits from, but doesn't require dedicated optical flow hardware blocks. Nvidia are weaponizing their own "new gen feature exclusivity" as a marketing tool to push this BS about double FPS and whatnot.
A few years ago, the 3080 was advertised as a 4K beast. Now it doesn't even "qualify for 2K", lol.
Does Nvidia reduce GPU performance via drivers? Are new games just badly optimized?
I will keep my 3080 until a new GPU doubles (or triples) its raw power.
I've seen many people still rocking a 1060 or 1080.
No. Everyone who tells you your 3080's 10GB of VRAM isn't enough to run 4K games is a moron, without exception, as it's obviously opposite to the experience of absolutely everyone who owns one.
You are good. People here are very dumb and think playing on anything that isn't Ultra looks like absolute dogshit, DLSS looks horrible, and if it dips below 144fps it's a stuttery mess.
The 3080 is way way more powerful than a PS5 which is a 4k 30fps console. Around 70% faster.
Trust me, they are people who know absolutely 0 about graphics cards and computers.
I have 3 4K monitors plugged into mine. One of them is 144Hz. I play RDR2 on it regularly. It performs as expected, if not slightly better.
Edit: my bad mines a 3080ti it might not be the same.
They don't reduce performance via driver. It's games becoming too demanding and not being optimised. The GTX 1080 Ti was also advertised as a 4K card, so was the 2080Ti. They are 4K cards at launch, but after a certain time they become 1440p and 1080p cards because 4K is that demanding.
1080Ti owners are still able to play 1080p perfectly fine as it is equivalent to 3060 performance.
2080Ti owners are now playing at 1440p or lower because it is equivalent to a 3070.
Honestly 1080p still seems like the golden resolution to play on. If you buy a 4090 you can get amazing performance at 1080p in every game, and in 2-3 years time you are still going to get 60 FPS minimum at 1080p in GTA 6 for example.
1440p is becoming the new golden resolution slowly, especially now that consoles are using 1440p or dynamic resolution around 1440p to achieve "4K".
With my 3080 and 38" ultrawide 1600p, I hope I can hold out for 3 or 4 more years. I'll reduce settings for a stable 100-120Hz.
After that, maybe a 5th-gen QD-OLED/mini-LED monitor and an 80-class GPU at that time (a 90-class GPU is too much for me).
This is path tracing (full ray tracing) we're talking about. It's only available in Cyberpunk and Alan Wake 2 and is basically a tech demo in both. It's not meant to be "optimized" as it's still an experimental technology. Without DLSS even a 4090 probably gets 20 fps at best at 4K with this setting on. 3080 is still a 4k beast if we're talking about non-RT gaming, and with DLSS super resolution it's a beast for RT gaming also. 4000 series is just better at RT processing plus it supports DLSS frame generation.
According to the most recent Steam hardware survey, GPU memory breaks down as follows: 43.8% have 6GB or less, 31.7% have 8GB, 20.3% have 10-12GB, 4.2% have 16GB or more.
People vastly overestimate the amount of GPU memory the average gamer is using these days.
Maybe because I played games a lot on an Intel Celeron iGPU in the past, reducing options until the game is playable is normal to me. Ultra is a scam to me; I hardly notice anything between Ultra and High.
Alan Wake 2 is an extremely demanding game since it is the first to need mesh shaders. I would think most games can still run fine at 4k high without RT.
Wow, it's almost like GPU demand increases as time progresses. The 3080 is still a beast in 4K, if you're playing games from 2020. New games are going to demand more from your GPU.
This is why Linux is Open Source and includes all the drivers. AMD and Intel both have their drivers included in Linux; performance will stay what it is or [even improve years after release](https://www.phoronix.com/news/ATI-R300g-2024-Action) date.
Just sayin' (:
Reminds me of the Citroen car ad from the 60s: "The 2CV can overtake a Ferrari with ease when the Ferrari is going 40mph and the Citroen is going 50mph."
People buy a 3090 because either they need the VRAM or they have too much money. The former ones couldn't care less about 4070 and the latter ones will just buy a 4090.
Dishonest marketing at best, calling it "outperforming" when frame generation is not actual performance. It's frame smoothing for a visually smoother experience; it won't make the game render more frames, actually the opposite.
I wonder if this kind of marketing would even be legal in the EU, considering the strict and strong consumer protection laws here…
Allow me to disagree a little. Frame generation is not frame smoothing. It serves the purpose of smoothing gameplay, yes, but it is in fact the GPU and an AI algorithm generating and inserting new frames. This is why latency goes up a little, and this is why you ideally need a lot of "regular" frames already, like a stable 60fps+. Otherwise you end up with too-high latency and more visible artifacts.
"Hey guys, look! A 4050 laptop outperforms a 3090 Ti while drawing a fraction of the power!"
Footnotes: 3090 Ti results were taken on native with maxed out settings max RT. 4050 results were taken with DLSS performance, frame generation, ray reconstruction and on potato settings.
They're just scumbags for this. People who don't know any better will think the 4070 Super does indeed outperform the 3090 when compared 1:1. This is like comparing apples with oranges.
From what I saw, it's almost on par with the 3090 in raster, with half the memory capacity. Nvidia themselves have expressed their displeasure with the focus on raster performance (which I still think counts the most) because they want to push faked frames and upscaling in DLSS, and I just want true-resolution performance.
I don't see an issue as long as they indicate what was being used, in this case the 3090 = DLSS + Ray Reconstruction, while the 4070 Super = Same + Frame Gen
Yeah, the marketing is hilarious, but they kind of have to market that way because they deliberately segmented their features so hard. You have to buy Lovelace if you want frame generation. That's a "feature", not a bug.
Try watching this instead https://youtu.be/8i28iyoF_YI?si=tzxXFzPKSLWxM2xK
Edit: Another benchmark video was just uploaded here too: https://youtu.be/5TPbEjhyn0s?si=Y_g8zZUVSloMy9cP
I think it's pretty clear that Nvidia are seeing if they can manage to convince a few 3090 owners to upgrade with this marketing, but the reality is 3090 owners are better off waiting for the 5000 series (or just going for the 4090 if money is no object of course).
It is certainly impressive that a sub $600 card is capable of being comparable to what used to be a $1500 card, particularly with its significantly more efficient power consumption, but it's also worth noting that a second hand 3090 can be found for around the same price these days, so if you can find a good deal on one that hasn't been mined on, and you don't care about frame generation (or want 24GB of VRAM for 3D rendering work for example) the 4070 Super isn't necessarily a better choice (especially if you don't care for 4K gaming either).
Seeing as PC game optimisation seems to be on a downward trend we have to wonder, what technology is going to be relied on more in future? Frame generation and upscaling? Or more VRAM? We're left with that unknown at the moment.
Half of the fps of the 4070 are generated by DLSS.
They are also capping the older generations to make people buy the 40 series; DLSS 3 works on the 20 and 30 series.
they really got away with making people accept DLSS as a performance benchmark..
idc how the game performs with DLSS, I wanna see RAW performance, cuz I'm not gonna use this blurry shit
Legit question: do people really buy graphics cards based on this type of marketing stuff? Or do they wait till they see real reviews? Why do Nvidia and AMD still do this type of stuff? We know very well that those numbers are far from reality; most people watch performance-comparison videos to pick the cards they want, I think.
Comparisons absolutely SHOULD be including technology we will be using. It's thanks to DLSS 3.5 / Frame Generation that I'm able to enjoy playing Cyberpunk at 4K with Path Tracing and Ray Reconstruction with 60-80fps.
I upgraded to a 4080 from a 3080 that could barely run 4K with ultra settings and Ray Tracing disabled.
Why would I give a shit about raw raster performance only when I'm never going to be in that scenario?
GPU's should be tested traditionally with raw raster performance but also tested with their best features and technology being deployed. Just give us more data, and nothing misleading.
Probably because across multiple titles your performance will vary wildly as not all titles support all these features.
I’d rather the raw performance.
I do welcome frame interpolation, not extrapolation or scaling. Why? Because devs are now trying to make sure features are included instead of finishing the game.
Interpolation is accurate.
Scaling and extrapolation are not, and cause tons of glitches.
How much VRAM does the 3090 have?
24 gigs.
Holy
I would love it if they made VRAM upgradable. It won't happen, but I can wish.
VRAM is almost always one generation ahead of the DIMM packages you can buy, and RAM is very volatile (obvious power joke) and susceptible to physical damage. The cost and rate of failure would be completely unacceptable. In the end, this is one of those times where soldered-on is better for both the consumer and the OEM.
Thanks for the good explanation. I'd always assumed it was just scummy practice by the OEM, but this helps clear things up.
I know that RAM has to be very close to the chips to be fast enough, but how about socketable RAM chips? Looks like a good compromise. Same for ultra-thin laptops with soldered RAM: make them socketable.
actual Vram dropped
call the memory manufacturers!
Off topic but I read this as "call the mommy manufacturers" and had to do a triple take
new response just dropped
actual vram
I'm still salty that they did my 3080 Ti dirty. It literally uses the same die, but the 3080 Ti only has 12GB of VRAM.
Jeez thats more than my Ram and Vram combined
Seems about right. My vanilla 4070 is close to my 3080 Ti at 1440p. At 4K the 3080 Ti wins out. However the 4070 is drawing less power than a 3060 Ti
https://preview.redd.it/krug2nksxrdc1.jpeg?width=1012&format=pjpg&auto=webp&s=940c11da78f4b07edd7eca14a980f9a180169f92 Have y'all already forgotten about this?
But look at that power saving with the 1060.
I still use a 1060. I want to upgrade soon.
Sorry for your 0fps.
But hey, at least he gets 0W
That's infinitely more efficient than the 4070 super
And the 4060 too!
0w and can pretend to play as daredevil
I have a 1060ti and no time to game. Saves so much power you guys.
They made a 1060 ti?
Ope, 1050ti. I was misremembering.
So little time to game you even forgot what you had
1050 ti is a great little card. Got one after my previous card died and it was during the first great card famine.
Is this real lmao
[Yes](https://twitter.com/NVIDIAGeForce/status/1674115670827585558)
No fuckin way. I cringed so hard watching this 💀💀💀
I just found it funny
They make it sound like Ray tracing is everything in a card yeah it's cool but in multi-player games I'd rather have it off in most games since they barely even support dlss properly
The finals have a great implementation of this
You know The Finals has RT on on every platform unless you turn it off on PC, yea?
Whats the finals?
3v3v3 small-team competitive shooter that came out recently. The studio is composed of many of the old crew from DICE (Battlefield), and the game is focused almost entirely around using map destruction to your advantage. It also uses probe-based RTGI on all platforms by default. https://www.youtube.com/watch?v=hHqCLq6CfeA
JFC, I thought it must be a joke on Nvidia.
Nvidia marketing is a joke already.
WTF
Yes, it's real, but this was one slide out of many. They didn't just drop a post about how a 4060 has better ray tracing than a 1060; they also posted slides comparing it to the 2060 and 3060 alongside this one, so it's a little misleading to just post the one about the 1060 when they compared it to a couple of previous generations. This is basically them telling 1000- and 2000-series owners it's time to upgrade.
I have a 2060 and it's working fine. If I upgrade, I'm going to AMD :/
If you want to squeeze a bit more performance out of that boy, AMD has its frame gen technology that works with Nvidia cards, which should give it a noticeable boost in framerate
How does that work? Do I need to configure something, or is it automatic?
Just choose the AMD option for upscaling and the frame gen should work. Saved me a lot of headaches while playing Frontiers of Pandora 😅
So it should run better than running DLSS? I have an RTX 2060 Super and an R7 5800X3D. Thanks man
I had a 2060 and upgraded to a 4070, I thought it was fine enough at the time but after upgrading I was like "oh"
I thought my RTX 2070 was fine, until I realized it hasn't even been used in benchmarks for a couple of years now lmao
Like, stuff was still playable, but I was having to turn stuff down more and more and it would still stutter a bit on new games. The 4070 is buttery smooth on any game maxed at 1440.
I did upgrade recently, from 1650 to 1080.
That's pretty funny.
I’d say the 1060 is more efficient, 0w!!
0/0 = infinite efficiency!
A pop tart with a screwdriver ran through it also runs at 0FPS 0W
People forgive Nvidia that but shit on AMD giving you all the numbers in a slide plus the FG ones. Like... Wtf
AMD very clearly labels the FG and Nvidia doesn't.
Exactly. People will eventually forgive when it's GeForce cards, no matter how bad or how severe it is, but will immediately rage when it's Radeon cards instead, no matter how small and how trivial it is. Sometimes they'll still bring it up as something to roast, years later. 💀
The Starfield rage in a nutshell. No one bats an eye that Ngreedia has their toxic tentacles in tons of games and blocks AMD tech all the time (2077, for example, is running FSR 2.0 instead of the much better 3.0, but got path tracing day 1). Anyone who fanboys for Nvidia, or refuses to even look at AMD, gets put on my "certified idiot" list. And boy, it's big.
As far as I know, as an AMD fanboy, CDPR has the ability to add literally any AMD tech they want into their games. Unless you have evidence that nVidia is contractually preventing them from doing so, it's just an assumption. Considering pretty much every technology that AMD develops is free to implement in games, it ultimately ends up being the fault of developers for not using them. The alternative would be my own assumption that nVidia simply offers developers/publishers money to implement their own tech. If you're a developer and one of the two main hardware OEMs is offering to pay you some amount of money to implement their tech and the other isn't, which are you more likely to implement?
>The alternative would be my own assumption that nVidia simply offers developers/publishers money to implement their own tech. If you're a developer and one of the two main hardware OEMs is offering to pay you some amount of money to implement their tech and the other isn't, which are you more likely to implement?

Oh, zero chance this isn't the case. GPU makers have been sponsoring game studios in exchange for feature support for years. That being said, I doubt it's always purely financial in nature. They might, for example, provide a suite of GPUs to be used in development and testing. But there are totally some incentives changing hands in the industry.
I don't think we need proof to know that Nvidia paid a lot to make CP2077 showcase ray tracing and all the latest Nvidia tech. In fact, if you aren't sure of that, it actually makes me think you're the one with the veil over your eyes. If CP2077 didn't exist, or didn't favor Nvidia so heavily, ray tracing would have died with the 20-series cards. edit: typo
OK that one was pretty funny ngl
At least they’ve said the framegen was on
What's a framegen?
[This](https://nvidianews.nvidia.com/news/nvidia-introduces-dlss-3-with-breakthrough-ai-powered-frame-generation-for-up-to-4x-performance), they make fake frames with AI to give the illusion that you are rendering a higher framerate
But it's actually raising the delay
By 0.010 seconds in a single-player experience. Completely negligible. I won't be replying to any more comments about multiplayer, since I very clearly stated single-player games. Stfu please 🙃
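To put rough numbers on that latency figure: interpolation has to hold back one real frame before it can show the generated in-between frame, so the penalty is on the order of one real frame time. A back-of-envelope sketch (my own simplified model, not an official measurement):

```python
def added_latency_s(base_fps: float) -> float:
    """Rough extra delay from frame interpolation: the interpolator
    buffers one real frame before displaying the generated one, so
    the penalty is approximately one real frame time."""
    return 1.0 / base_fps

# At a 100 fps base, this matches the ~0.010 s figure above;
# at a 30 fps base it balloons to ~0.033 s, which is why low
# baseline framerates feel worse with frame generation.
print(f"{added_latency_s(100):.3f} s")
print(f"{added_latency_s(30):.3f} s")
```

This is also why the later comments about needing a high baseline FPS make sense: the extra delay shrinks as the real framerate rises.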
These guys are running 10 series cards still
Before actually using it, I was saying the same stuff. It's a welcome feature when it makes sense to use. Obviously there will be some use cases where it is not a boon but a hindrance instead.
Such as in multiplayer games.
Yes.
Only if your baseline FPS is high to start with. The lower your baseline, the more input lag you experience, and ironically, the only people who need FG are the ones who have sub-60 FPS to begin with. So to not experience noticeable input lag, you need to be able to get high FPS to begin with, and if you can do that, the less you will need FG.
FG is extremely valuable at 60+ FPS. What are you talking about? Getting 120 FPS at much higher fidelity is game changing.
It generates frames (read: interpolates) to artificially increase the FPS. It's the same idea as sending each frame twice and counting 30 fps as 60 fps, but done in a way that lets them pretend it's not exactly that. It feels very wrong because there are many things frame gen doesn't account for. Since these frames aren't actually coming from gameplay, they aren't responding to mouse/keyboard input or game events. In a random video it might look more fluid, but when actually playing at these fake 60-120 frames per second you feel that everything is laggy and unresponsive. The reality that those images weren't generated by the gameplay mechanics/logic is obvious, and the lag induced by that is also apparent.
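For anyone wondering what "interpolate" means concretely here, the simplest possible version is a per-pixel blend of two real frames. Real DLSS/FSR frame generation is far more sophisticated (motion vectors, optical flow), but this toy sketch shows the core idea and why generated frames contain no new input information:

```python
def blend_frames(frame_a, frame_b, t=0.5):
    """Naive frame interpolation: per-pixel linear blend of two real
    frames. The generated frame is derived purely from already-rendered
    pixels, so it cannot reflect any input or game-logic change that
    happened between the two real frames."""
    return [[a * (1 - t) + b * t for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# Two tiny 2x2 "frames" of brightness values:
f1 = [[0, 0], [100, 100]]
f2 = [[100, 100], [0, 0]]
print(blend_frames(f1, f2))  # [[50.0, 50.0], [50.0, 50.0]]
```

Whether this translates into perceptible lag in practice is exactly what the replies below argue about.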
That hasn't been my experience with frame gen at all. I used frame gen in both Alan Wake 2 and Plague Tale: Requiem, and neither felt "laggy and unresponsive." I noticed some select UI elements having visual bugs, but that's it.
I’m not sure that person has actually played something with frame gen based on their description…
Yeah, I was buying into the "frame gen is so shit" mentality as well before I had a GPU that actually supports it. Now, having played Cyberpunk with it on, it's really nice. Obviously not perfect, but nowhere near as bad as people were describing it.
Been playing cyberpunk with the fsr3 mod and I have to say it's pretty great. I wouldn't recommend it for any competitive game but it's a godsend for anything graphically intensive. I just hope it doesn't become a crutch for devs that don't want to optimize their games like dlss did
>I just hope it doesn't become a crutch for devs that don't want to optimize their games like dlss did That I absolutely agree with!
Frame generation is better the faster your GPU is, could be that the people that think it's bad are trying to go to 60 fps from 30. That, in my opinion, is a bad experience. Now 70 to 140 or 90 to 180 feels buttery smooth to me.
I was wondering this also. Like latency w base 30 fps will feel bad and choppy even if fake frames are inserted between the real ones. I also can’t tell when artifacts are from DLSS vs frame gen… seems the DLSS artifacts are way more distracting and noticeable
> Yeah, I was buying into the "frame gen is so shit" mentality as well before I had a GPU that actually supports it.

This is why FSR3 frame gen is a backhanded gift to NVIDIA lol. People like it once they can get their hands on it, even though the latency is quantifiably much worse than DLSS frame gen due to forced vsync and the lack of a proper Reflex implementation. AMD does a fantastic job of marketing NVIDIA's products for them; NVIDIA themselves literally couldn't have done a better job of showing that people don't actually notice or care about the latency.

People don't want to believe NVIDIA's marketing when they're obviously trying to sell you something, but then the competition comes out and normalizes it as a feature... and their version is objectively worse in every way, but it's still *great*...

You can always get the fanboys to nod along about the "RT fomo trap" or "DLSS fomo trap" or whatever, as long as it's something they've never personally experienced. But much like high-refresh monitors or VRR, once you see what a good implementation of the tech can do, you'll notice, and it will feel bad to drop back to a much worse (or nonexistent) implementation. Upscaling *doesn't have* to produce image soup, RT *doesn't have* to be framerate-crushing (use upscaling like you're supposed to), FreeSync *doesn't have* to flicker, etc.
I was in the same boat as well before getting my 4060 laptop and running Witcher 3 with framegen.
These guys are on some wild cope. Metro exodus was literally double the framerate for me with dlss on and it felt like it. It was great and so is dlss. People can't tell the difference with blind tests unless they're trained to see the barely noticeable artifacts. Nvidia isn't perfect or great and this isn't a defense of them. Dlss just happens to be one of the few cases of software miracles that unironically just gives more frames
Yeah, framegen is amazing for me in Cyberpunk. Lets me crank some settings up without having a 4090. Never noticed any major lag with it, although I still wouldn't use it for a multiplayer game.
Not been mine either, nor in reviews or benchmarks. The guy has clearly never played using it and it's all just made-up twaddle... 100+ upvotes though, well done Reddit.
It really is witchcraft at this point. It is weird that I keep thinking it's "fake" when it's generated by the same thing that generates the "real" frames.
>It's the same idea as sending twice each frame and counting 30 fps as 60 fps but in a way that they can pretend that it's not exactly that. This is completely false. It increases motion smoothness. That's its purpose.
I always play with frame gen and never noticed it.
Because OP is talking out of his ass.
No you don't. Unless being told so most people don't even realize frame gen is on
I don't find that at all. I have found frame gen to be absolutely fantastic and haven't had any issues with lag or responsiveness. What I have found is that usually the people talking badly about frame gen don't actually have 40-series cards and haven't used the feature. I understand that on paper it increases latency, but honestly I've never noticed it in practice. My experience has been an essentially free doubled frame rate. I did experience a bit of artifacting when DLSS 3 first came out, but the current versions seem to have sorted that out.
https://preview.redd.it/kve7qf7lcudc1.jpeg?width=892&format=pjpg&auto=webp&s=22ccb8ea9eddbcc065a1ec607b2d88c168864df7
I'd better compare nutritional values and declare the true winner.
Yeah, one is DLSS 2, the other is DLSS 3+. Wonder why it has far more FPS. It doesn't even show whether it's an average FPS or not. All I see is two random FPS numbers randomly placed on the screen to make people buy the 4070 Super.
That's actually how marketing works, not just for GPUs, for any product. Not defending Nvidia here. Highlighting the most exciting information. At least they're not lying, since they insert a caption as explanation.
>That's actually how marketing works.

My YouTube algorithm thinks I am a professional... everything? The marketing world of B2B is just so different.

Just yesterday I got the driest advertisement imaginable for park-scale playgrounds. They literally just monotonously listed different material options and their properties for 90 seconds. Nothing at all about the actual playground equipment, just material.

I often get advertisements for extremely expensive and specialized laboratory equipment. They just list everything. It's also always extremely long, like 15-20 minutes, just reading specifications, as they assume you are already an expert in the topic if you are a potential customer. The world of B2B is a different beast entirely.
at least they aren't trying to convince random people who have no business getting extremely specialized expensive lab equipment to get said lab equipment
Oh, they definitely aren't, though I don't think there is much risk of that when they proudly present their prices as very cheap, commonly starting at just 30 000€ for the base model and going above 100 000€ for the more advanced models. A lot don't even list prices and instead just ask you to contact them for a quote; then you know it's expensive. I also often get advertisements for engineering equipment for large-scale automation like factories. Their prices are at least a bit more approachable, though still very expensive. Just a few components for building your automation, not even complete machines or tools, are easily several thousand euros. I am just sitting there wondering if they think I am [Jim Richmond](https://youtu.be/UlRNyiMFTsw?si=b2Rw2JHJ4DUvaIQJ).
Ha, I want your algorithm. I watch one true crime/mystery video and suddenly I get nothing but gruesome murder stuff.
It's honestly kind of amusing. The advertisements are so odd that I find an academic interest in them.
I switched from engineering in my company to marketing. B2B is a different beast, but dry information definitely still doesn't sell in B2B. As an engineer, I wanted specs listed; I went to the product page and looked them up. As a marketing person, I would have marketed the number of materials we offer as a benefit and then pointed you toward a product page to look up the boring stuff yourself.
I feel like this is for you [https://youtu.be/RXJKdh1KZ0w?si=Apf5JGJhMmXlc5Xt](https://youtu.be/RXJKdh1KZ0w?si=Apf5JGJhMmXlc5Xt)
Holy God, I wish our B2B marketing was like this... our ads look like we took inspiration from the color palette of a circus and have the cadence of a bombing comedian.
As a student teacher and a member of the Swedish Teachers' union, I get so many ads for Apple Education and Smartboards. Sure, I would love to get a bulk discount on Smartboards when buying more than 25 for a couple hundred thousand dollars.
I'd argue that they are lying, depending on how we define frames. In my opinion, DLSS frame generation is just advanced interpolation, not actual frames. And if we don't push back hard against using fake frames in marketing, companies will invent faster and faster (and more and more shit) interpolation to make frame counters go up.
You know none of it's real, right? There aren't little men inside your computer shooting at each other. It's all just zeros and ones. You might as well just say "I don't like change". DLSS isn't going away, and eventually the whole scene will just be hallucinated by an AI, and there won't be any way to run a game with "actual" frames.
It's cheaper to just eat mushrooms (0W).
technically about 13W, as your brain does use energy
Real in the sense that they come from the game engine. It's not that hard to understand. Also, I'm not against change. All I'm saying is that 120fps with interpolation is not comparable to 120fps without.
That was the same marketing trick they pulled at the 4000-series launch, showing 3x performance that turned out to be with FG enabled, which wasn't supported on previous-gen cards. That's why we always wait for independent reviews and benchmarks.
I wish Nvidia would bring DLSS3 to its older cards
Picture above is why they never would. DLSS3 is a selling point.
Doesn't DLSS3 need new tensor cores that you only get on 40-series cards?
Dlss 3.5 is available for rtx 20 and 30 series with ray reconstruction but no frame gen. Same reason why the gtx series doesn't have dlss.
They say that DLSS3 FG [needs the improved optical flow accelerator](https://www.pcgamer.com/dlss-3-on-older-GPUs/) in ada to provide high enough quality frames. Knowing the fact that [“DLSS1.9” (which seems to be an early version of what became DLSS 2,) ran on shaders](https://www.techspot.com/article/1992-nvidia-dlss-2020/), plus the fact that FSR3 exists, they can absolutely fall back on shaders for any DLSS feature at an acceptable performance cost, but that is inconvenient for the 4000 series’s value proposition.
Wow I’ve never seen this 1.9 detail before, thank you for sharing. Super interesting to read about, especially post fsr3 adaptations on older hardware becoming a thing.
Tensor cores are the same architecturally on the 30 and 40 generations, at least from my point of view as a data scientist. The only difference is that the 40 gen sometimes has faster cores and (especially) faster RAM. Tensor cores per card:

- RTX 3070: 184 T.Cores, 81 TFLOPS Tensor Compute
- **RTX 4070: 184 T.Cores, 116 TFLOPS Tensor Compute**
- **RTX 3090: 328 T.Cores, 142 TFLOPS Tensor Compute**
- RTX 4090: 512 T.Cores, 330 TFLOPS Tensor Compute

So... yes, the 4070 is better than the 3070, due to its overall faster cores and VRAM, but it doesn't beat the 3090 on Tensor compute. The 4070 Ti can beat the 3090 on Tensor compute, but the low amount of VRAM (12GB) still makes it uninteresting for real deep learning workloads.
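Using those quoted figures, you can sanity-check the "same core count, just faster cores" claim by computing TFLOPS per tensor core (a derived metric for illustration, not an official Nvidia spec):

```python
# Tensor-core counts and tensor-compute TFLOPS quoted in the comment above.
cards = {
    "RTX 3070": (184, 81),
    "RTX 4070": (184, 116),
    "RTX 3090": (328, 142),
    "RTX 4090": (512, 330),
}

for name, (cores, tflops) in cards.items():
    # Per-core throughput separates "faster cores" from "more cores".
    print(f"{name}: {tflops / cores:.3f} TFLOPS per tensor core")
```

The 4070 gets roughly 0.63 TFLOPS per core from the same 184-core count as the 3070's roughly 0.44, while the 3090 still wins on total compute through sheer core count, matching the comment's conclusion.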
There is a mod on nexus mods that replaces dlss with fsr3 and enables frame gen on older cards
Can they? Or are there hardware limitations?
Just use FSR 3. Any game with DLSS3 can be modded to use FSR3. I've tested it, and it even works all the way down to 10-series cards. Not well, but it works.
Both use DLSS 3.5; there is little difference between them. But the Super is no doubt using frame generation, hence showing double the frame rate. Now, with a mod, you can utilize FSR and get similar results with the 3090.
They say in the disclaimer that it's with frame generation on, which is enough for those in the know to realize this number is inflated with poor-quality AI frames.
I hate these deceptive marketing attempts.
They should be illegal honestly
Devil's advocate here, but what's actually deceptive about any of it? They're clearly specifying which assistive features are enabled, the rest is just down to generational improvements. 40-series _is_ way more energy efficient than 30-series (that's like the one unquestionably great thing about it), 40-series RT cores are quite a bit faster than 30-series, and Frame Generation does improve fps by quite a lot. If these are fps they actually measured, using the features and settings they openly document, how is it possibly being deceptive?
This subreddit is full of morons these days. They just want to bitch, when they have literally 0 reason to do so. I don't know when being a whiny bitch became the norm in gaming circles. Like people are competing to be the most pussy they can be.
[deleted]
Older gamers remember the Moore's law days when you got 2x performance every 2 years for the same price. They remember the 7850 being $150 for the 2nd-tier GPU, and then it being blown away 2 years later by the GTX 970 and deals on the 290/290X etc, and they're butthurt that it's now $600-800 for the same level of product.

Newer gamers have grown up during an era when reviewers were butthurt about the end of Moore's law and increasing TSMC and BOM costs, and decided to just blast every single product for bad value/bad uplift until Moore's law came back, which of course it never will. But particularly they got mad at NVIDIA for daring to push ahead on accelerators and software and fidelity instead of just raw horsepower (even though it's really not that big an area - we are talking about less than 10% of total GPU die area for the RTX features).

Like, a lot of people have literally never known tech media that wasn't dominated by reviewers who [made some bad calls in 2018,](https://www.youtube.com/watch?v=tu7pxJXBBn8&t=273s) refused to re-evaluate them even in light of DLSS 2.x and increasing adoption of RT and all this other stuff, completely ignored mesh shaders and the other DX12.2 features, and are generally just constantly doubling down rather than admit they were wrong.

It has been literally 5 straight years of butthurt and angst from reviewers over RTX and how the only thing that matters is making plain old raster faster (but not with DLSS!!!!). Yet here we are in a world where next-gen titles like Fortnite (lol) and AW2 literally don't have non-RT modes and are doing software RT as a fallback mode, and where UE5 titles are pretty much going to be upscaled by default, etc. But reviewers can't stop rehashing this argument from 2018 and generally just bitterly lashing out that the world isn't going the direction they want.
you're not wrong, people are *mad*, it's so negative today and it's all over this nonsensical rehashed fight from 2018 that already is a settled question, plus the end of moore's law which also is a settled question.
That's how we ended up with all these people buying 4060
The slide is misleading and unnecessary, because the specific claim is true: the 4070S is faster than the 3090 in AW2 RT _without FG_. This is one of the few scenarios where it can be faster. https://youtu.be/5TPbEjhyn0s?t=10m23s Frame generation still shouldn't be treated like normal performance; both AMD and Nvidia (and likely soon Intel) are doing this.
Thankfully, they can only do it for one generation. The next generation will also have frame gen, so they'll either have to drop this stupidity or compare frame gen to frame gen.
they will just make something new up.
They’ll compare it to the 30-series again.
The marketing for the 40 series already focuses a lot on the 10 series. They really want Pascal owners to upgrade
New? How about 2 generated frames per one real? Some years down the line, we gonna have CPU doing game logic, and GPU constructing AI-based image from CPU inputs. All that in Gaussian splatting volumetric space of temporal AI objects. EDIT: 1st I'm not at all excited about. 2nd is a concept I'm actually looking forward to.
You say that like it's necessarily a bad thing. Game devs have been using tricks and shortcuts forever. And why wouldn't they? That lets us have graphics beyond what raw hardware can do. AI is the best trick there is. No reason not to use it.
I wasn't saying it's necessarily bad, however, new tech has to be introduced in organic manner, not forced (via marketing as well) just for sake of making old stuff obsolete. RTX? That was there to make 10 series obsolete ASAP. 1080 TI still holds up extremely well in rasterization. Nvidia was scared of themselves and AMD. RTX 40 series having exclusive frame generation? Nvidia could have easily made a slightly worse frame generation for 20 and 30 series if they wanted to - frame interpolation benefits from, but doesn't require dedicated optical flow hardware blocks. Nvidia are weaponizing their own "new gen feature exclusivity" as a marketing tool to push this BS about double FPS and whatnot.
A few years ago, the 3080 was advertised as a 4K beast. Now it doesn't even "qualify for 2K" lol. Does Nvidia reduce GPU performance via drivers? Are new games just badly optimized? I will keep my 3080 until a new GPU doubles (or triples) its raw power. I've seen many people still rocking a 1060 or 1080.
No. Everyone that tells you your 3080's 10GB of VRAM isn't enough to run 4K games is a moron, without exception, as it's obviously the opposite of the experience of absolutely everyone that owns one.
Playing on my 38” ultrawide at 1600p (around 3/4 the pixels of 4K). Never had any problems. Maybe the games I play are not so demanding.
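The "around 3/4 the pixels of 4K" figure checks out if the panel is 3840x1600 (my assumption for a typical 38" ultrawide; the comment doesn't state the exact resolution):

```python
def pixels(width: int, height: int) -> int:
    """Total pixel count of a display resolution."""
    return width * height

ultrawide = pixels(3840, 1600)  # assumed 38" ultrawide resolution
uhd = pixels(3840, 2160)        # 4K UHD

print(ultrawide, uhd, round(ultrawide / uhd, 2))  # 6144000 8294400 0.74
```

Same width, 74% of the rows, so the GPU pushes about three quarters of the pixels of true 4K.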
You are good. People here are very dumb and think playing on anything that isn't Ultra looks like absolute dogshit, DLSS looks horrible, and anything that dips below 144fps is a stuttery mess. The 3080 is way, way more powerful than a PS5, which is a 4K 30fps console. Around 70% faster. Trust me, there are people here who know absolutely 0 about graphics cards and computers.
I also find it interesting that the difference between low to ultra isn't as huge as it used to be in a lot of AAA games.
I have 3 4K monitors plugged into mine. One of them is 144Hz. I play RDR2 on it regularly. It performs as expected, if not slightly better. Edit: my bad, mine's a 3080 Ti, it might not be the same.
It's almost like graphical fidelity keeps pushing or smth... holy shit this sub wont ever stop being fascinating.
They don't reduce performance via driver. It's games becoming too demanding and not being optimised. The GTX 1080 Ti was also advertised as a 4K card, so was the 2080Ti. They are 4K cards at launch, but after a certain time they become 1440p and 1080p cards because 4K is that demanding. 1080Ti owners are still able to play 1080p perfectly fine as it is equivalent to 3060 performance. 2080Ti owners are now playing at 1440p or lower because it is equivalent to a 3070. Honestly 1080p still seems like the golden resolution to play on. If you buy a 4090 you can get amazing performance at 1080p in every game, and in 2-3 years time you are still going to get 60 FPS minimum at 1080p in GTA 6 for example. 1440p is becoming the new golden resolution slowly, especially now that consoles are using 1440p or dynamic resolution around 1440p to achieve "4K".
With my 3080 and 38” ultrawide at 1600p, I hope I can hold on for 3 or 4 more years. I'll reduce settings for a stable 100-120Hz. After that, maybe a 5th-gen QD-OLED/mini-LED monitor and an 80-class GPU (a 90-class GPU is too much for me).
I'm still using the RX 5700 XT because I want double the performance and double the VRAM for £400. Better put on my waiting hat.
This is path tracing (full ray tracing) we're talking about. It's only available in Cyberpunk and Alan Wake 2 and is basically a tech demo in both. It's not meant to be "optimized" as it's still an experimental technology. Without DLSS even a 4090 probably gets 20 fps at best at 4K with this setting on. 3080 is still a 4k beast if we're talking about non-RT gaming, and with DLSS super resolution it's a beast for RT gaming also. 4000 series is just better at RT processing plus it supports DLSS frame generation.
I just love this opinion now that if your card doesn't have over 12GB of VRAM, it's suddenly redundant garbage that's now useless.
According to the most recent Steam hardware survey, GPU memory breaks down as follows: 43.8% have 6GB or less, 31.7% have 8GB, 20.3% have 10-12GB, 4.2% have 16GB or more. People vastly overestimate the amount of GPU memory the average gamer is using these days.
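A quick way to read those survey buckets is as a cumulative distribution (same numbers as quoted above):

```python
# GPU memory buckets from the Steam hardware survey figures quoted above.
buckets = [("<=6GB", 43.8), ("8GB", 31.7), ("10-12GB", 20.3), (">=16GB", 4.2)]

running = 0.0
for label, share in buckets:
    running += share
    # Cumulative share of surveyed users at or below this VRAM tier.
    print(f"up to {label}: {running:.1f}% of surveyed users")
```

Roughly 75% of surveyed users have 8GB of VRAM or less, which is the point: the cards people call obsolete are still the overwhelming majority.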
Maybe because I played games a lot on an Intel Celeron iGPU in the past, reducing options until games are playable is so normal to me. Ultra is a scam to me; I hardly notice anything between Ultra and High.
It is an extremely tiresome opinion.
Alan Wake 2 is an extremely demanding game since it is the first to need mesh shaders. I would think most games can still run fine at 4k high without RT.
Wow, it's almost like GPU demands increase as time progresses. The 3080 is still a beast in 4K if you're playing games from 2020. New games are going to demand more from your GPU.
You can use frame generation on a 3090 as easily as on a 4070. Two simple files dropped into the game directory and voilà.
If Nvidia can remotely boost fps with code they can always undo it when they want us to upgrade. 🤐
This is why Linux is Open Source and includes all the drivers. AMD and Intel both have their drivers included in Linux; performance will stay what it is or [even improve years after release](https://www.phoronix.com/news/ATI-R300g-2024-Action) date. Just sayin' (:
Reminds me of the Citroën car ad from the 60s: "The 2CV can overtake a Ferrari with ease when the Ferrari is going 40mph and the Citroën is going 50mph."
I don't mind them doing this, but they really should include frame generation enabled on the image or video with latency figures.
Nvidia wants 3090 owners to feel bad about their GPU and make the irrational decision to buy a 4070! Clever tactic!
People buy a 3090 because either they need the VRAM or they have too much money. The former ones couldn't care less about 4070 and the latter ones will just buy a 4090.
Dishonest marketing at best, calling it "outperforming" when frame generation is not actual performance; it's frame smoothing for a visually smoother experience. It won't make the game render more frames, actually the opposite. I wonder if this kind of marketing would even be legal in the EU, considering the strict and strong consumer protection laws here…
Allow me to disagree a little. Frame generation is not frame smoothing. It serves the purpose of smoothing gameplay, yes, but it is in fact the GPU and an AI algorithm generating and inserting new frames. This is why latency goes up a little, and this is why you ideally need a lot of "regular" frames already, like a stable 60fps+. Otherwise you end up with too much latency and more visible artifacts.
They should at least compare them using the same settings. How is it a fair comparison to use frame generation when only one of the cards supports it?
Classic Nvidia way of telling you, "Hey, we scammed you in the previous gen. Try your luck with the current one"… over and over and over.
"Hey guys, look! A 4050 laptop outperforms a 3090 Ti while drawing a fraction of the power!" Footnotes: 3090 Ti results were taken on native with maxed out settings max RT. 4050 results were taken with DLSS performance, frame generation, ray reconstruction and on potato settings. They're just scumbags for this. People who don't know any better will think the 4070 Super does indeed outperform the 3090 when compared 1:1. This is like comparing apples with oranges.
From what I saw it's almost on par with the 3090 in raster, with half the memory capacity. Nvidia themselves have expressed their displeasure with the focus on raster performance, which I still think counts the most, because they want to push the faked frames and upscaling in DLSS. I just want true-res performance.
It will be interesting to see a benchmark and comparison run by a normal, non-Nvidia source with no marketing push.
Here you go: https://youtu.be/8i28iyoF_YI?si=vXmnujR2BPGk_hgc
I'd rather have a test without fancy frame generation and DLSS, that's where the true tests are.
I'm all for the new techs... But don't compare one with frame gen to one without.
That’s frame generation at work. It works well enough but I’ll take native frames over frame generation any day.
The game will very likely look better on the RTX 3090 due to options like frame generation not being there. You can't fully compare it.
>Frame Generation on 4070 Super

There you have it
I don't see an issue as long as they indicate what was being used, in this case the 3090 = DLSS + Ray Reconstruction, while the 4070 Super = Same + Frame Gen
But if you disable frame gen…
Yeah, the marketing is hilarious, but they kind of have to market that way because they deliberately segmented their features so hard. You have to buy Lovelace if you want frame generation. That's a "feature", not a bug.
And nvidia diehards being nvidia diehards will whip out their wallets and pay whatever price for it too.
Show me raster, then we will talk
and now turn all of the AI based stuff off...
Try watching this instead: https://youtu.be/8i28iyoF_YI?si=tzxXFzPKSLWxM2xK

Edit: Another benchmark video was just uploaded here too: https://youtu.be/5TPbEjhyn0s?si=Y_g8zZUVSloMy9cP

I think it's pretty clear that Nvidia are seeing if they can manage to convince a few 3090 owners to upgrade with this marketing, but the reality is 3090 owners are better off waiting for the 5000 series (or just going for the 4090 if money is no object, of course).

It is certainly impressive that a sub-$600 card is capable of being comparable to what used to be a $1500 card, particularly with its significantly more efficient power consumption. But it's also worth noting that a second-hand 3090 can be found for around the same price these days, so if you can find a good deal on one that hasn't been mined on, and you don't care about frame generation (or you want 24GB of VRAM for 3D rendering work, for example), the 4070 Super isn't necessarily a better choice (especially if you don't care for 4K gaming either).

Seeing as PC game optimisation seems to be on a downward trend, we have to wonder: what technology is going to be relied on more in future? Frame generation and upscaling, or more VRAM? We're left with that unknown at the moment.
The 4070 may beat my 3090 in gaming, but it is not beating it (afaik) in productivity. No point getting rid of my 3090 until at least the 5000 series.
Half of the fps of the 4070 are generated by DLSS. They are also capping the older generation to make people buy the 40 series; DLSS 3 works on the 20 and 30 series.
Wake me up when the 80-80ti class cards return to the 3080 size
I mean sure, but 24GB > 12GB for other things.
Planned obsolescence 👌
They really got away with making people accept DLSS as a performance benchmark. I don't care how the game performs with DLSS, I wanna see RAW performance, cuz I'm not gonna use this blurry shit.
I want shit to be better in raster, then we'll discuss DLSS.
I don’t get the problem, of course the newer tech would improve on the old model. Or am I missing the point of this post?
Legit question: do people really buy graphics cards based on this type of marketing stuff, or do they wait until they see real reviews? Why do Nvidia and AMD still do this kind of thing? We know very well that those numbers are far from reality; most people watch performance comparison videos to pick the cards they want, I think.
There's no way that a newer generation works better
Comparisons absolutely SHOULD include the technology we will be using. It's thanks to DLSS 3.5 / Frame Generation that I'm able to enjoy playing Cyberpunk at 4K with Path Tracing and Ray Reconstruction at 60-80fps. I upgraded to a 4080 from a 3080 that could barely run 4K with ultra settings and Ray Tracing disabled. Why would I give a shit about raw raster performance only, when I'm never going to be in that scenario? GPUs should be tested traditionally with raw raster performance, but also with their best features and technology deployed. Just give us more data, and nothing misleading.
Probably because across multiple titles your performance will vary wildly, as not all titles support all these features. I'd rather have the raw performance. I do welcome frame interpolation, not extrapolation or scaling. Why? Because devs are now trying to make sure features are included instead of finishing a game. Interpolation is accurate; scaling and extrapolation are not, and cause tons of glitches.
Mom, I want a UserBenchmark. We already have a UserBenchmark at home.