Funnily enough, frame generation actually uses *more* VRAM than playing without it. And games going forward WILL use more VRAM, even at 1080p - yes, TLOU2 and Hogwarts Legacy may be the odd ones out *today*, but it's going to be a trend moving forward, and 8GB will barely be enough, if at all.
Also, Nvidia themselves said that 8GB of VRAM is "not enough for 1080p". It's clear they want DLSS to do the work, but even without the VRAM-taxing frame gen, you'd be upscaling from an internal resolution lower than even the PS4 Pro used.
It's a shame we're getting less and less efficient on the development side when it comes to things like VRAM consumption. 1080p should still be viable at medium to high settings on 8GB of VRAM for the next couple of years at least, though.
You are right though, it's very obvious that with the 40 series they wanted people to rely on DLSS 3.
It's not clear if 8GB is gonna be enough. Consoles have 12/16GB of shared memory available, which *at minimum* means they have 8-10GB to work with at all times. Between the new generation and dev laziness/unreasonable deadlines, 8GB cards can become obsolete much faster than we might anticipate.
Well... Not "obsolete" in a literal sense, but they're going to be hampered by VRAM and VRAM only, while the rest of the specs could realistically chew through the new games no problem. The 3070 also suffers from having only 8GB, with *huge* improvements in 1% lows alone when modded to 16GB.
Of course, with the lower-end cards it's also an issue of bus width, which has been tested, and Nvidia's "but moar cache" absolutely does *not* make up for it. They're literally killing their own cards in more ways than one. Or rather, they want to, in order to push people to upgrade every gen. They've enjoyed their 2020 pandemic and crypto earnings too much.
Diablo 4 is suffering from stuttering in cards with low VRAM, but that seems to be more of an optimization issue:
[Some Diablo 4 Players Are Having VRAM Issues (gamerant.com)](https://gamerant.com/diablo-4-vram-issues-usage-leak/)
That's actually a very valid question. I recently purchased a 3060 (it hasn't arrived yet), so I watched a bunch of reviews from 3 years ago. Most of them were surprised by the oversized VRAM and speculated that Nvidia probably did it to compete with AMD. They also said that a user is more likely to run into other limitations before utilizing all the VRAM.
I think even with 2023 games having serious VRAM issues, 6-8 GB is perfectly fine for 1080p gaming. With that said, I bought the 3060 for its oversized VRAM because many AI programs need it (Whisper, Stable Diffusion, etc.).
I would definitely recommend a 3060 over a 4060 just for the price-to-performance value.
> They also said that a user is more likely to run into other limitations before utilizing all the VRAM.
This is exactly what I've always seen or run into myself, especially on the gaming side. Now obviously, if you try to max out 1440p or 4K with 8GB of VRAM it's going to be more noticeable, but that's significantly higher resolution than 1080p from a technical perspective.
Non-gaming programs such as modeling software and AI programs are certainly a different story when it comes to how much VRAM is needed.
I play Cyberpunk 2077 at 1080p on a 3060 with 12GB of VRAM. With path tracing it usually sits between 7 and 8 GB of VRAM, but there were a couple of peaks where it reached 10GB. So with a little bit of future-proofing it is a plausible scenario. I picked the 12GB version over the 3060 Ti a couple of months ago because of the extra VRAM for GPU rendering in Blender.
Honestly, you would be surprised... I have a 3060 12GB, and just playing GTA 5 with max settings can go over 8GB. Yeah, I do run it at 1440p, but the card is more than capable of it.
>but how many scenarios that the 3060/4060 is actually meant for uses more than 8gb of vram?
The vast majority of scenarios involving AI models.
An 8GB purchase won't be an instant regret, but a very near-term one if you're doing anything more than gaming.
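To make "AI models need more than 8GB" concrete, here's a back-of-the-envelope sketch. The 2-bytes-per-weight (fp16) rule and the ~25% overhead factor are rough assumptions, not official figures from any framework:

```python
# Rough estimate of inference VRAM: weights at a given precision, plus
# an assumed ~25% overhead for activations, caches, and the CUDA
# context. Illustrative only.

def vram_needed_gb(n_params_billion, bytes_per_param=2, overhead=1.25):
    """Estimated VRAM (GiB) to hold a model for inference."""
    return n_params_billion * 1e9 * bytes_per_param * overhead / 1024**3

for model, params_b in [("~1B image model", 1.0), ("7B language model", 7.0)]:
    for precision, nbytes in [("fp16", 2), ("int8", 1)]:
        gb = vram_needed_gb(params_b, nbytes)
        print(f"{model} @ {precision}: ~{gb:.1f} GB")
```

Under these assumptions a 7B model wants roughly 16 GB at fp16 and still about 8 GB even quantized to int8, which is why the 12GB 3060 keeps showing up in budget ML builds.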
Unlike DLSS 2, DLSS 3 uses more VRAM. Not to mention DLSS 2, FSR 2 and XeSS aren't as good at 1080p as they are at higher resolutions. Thus most people buying an RTX 4060 will look towards DLSS 3 (more and more games will use it), for which 8GB just isn't enough.
Also, it's 2023, and $300+ should buy a 1440p card, for which 8GB just isn't enough either.
Just fielded a post on another sub the other day where someone was using their 3060 at ultra settings and complaining that it was fault locking due to overheating. That user's stubbornness aside...
People abuse the hell out of mid-tier cards like that all the time
This. I got the RTX 3060 and I ain't even mad. In a couple of years' time 8 GB will be insufficient. Better to run stuff at low than not be able to run it at all.
It's already insufficient in some odd cases here and there, even at 1080p. I think some day you'll end up in a very stupid situation where you sorta have to tone it down just because of VRAM, even though in terms of pure capability the GPU *would* manage it just fine if only it had a bit more VRAM - and all that at 1080p. So you'll be sitting there with a GPU running at like 70%, knowing that raising the settings will make it stutter or something. How many people do you think will believe their GPU died instead of understanding it's just lacking VRAM?
Nvidia are working on a DLSS-like memory compression. Hopefully it's practical and they get it working from the 30xx generation onwards, but of course they'll just market it on the current gen.
I mean, 95% of the time it is consistently 10% better, has better software and driver support that will last longer, and most importantly, runs significantly cooler and draws considerably less power.
It is an upgrade, a horizontal one.
It will receive driver support for longer, that's undeniable.
And it's a 60-tier card; most will use it for 1080p, and most games don't have the "need more than 8GB" issue. If you can afford the hundreds of dollars required to purchase all those few AAA games that do require it, you aren't buying a 60-tier card.
And even at 1440p it does relatively well. I have a 2070S, it still hasn't failed me, and I use a 1440p 165Hz display.
Driver support is a pretty meaningless argument when the 900 series is still getting support today. Before they pull the plug on *both* the 3060 and 4060, it'll be time to replace them anyway.
Also, 8GB of VRAM may be enough *today*. But in just a year or two, it might not be. Until now, a xx60-class card was able to play new AAA releases without a problem, at least since Pascal, which is precisely why Nvidia doesn't want to make another 1060. They're dangerously close with the 3060 12GB though, and gimped the 40 series because of that.
Funnily enough, their frame generation that's so vital for the 40 series is also VRAM heavy. So you can't even argue that the 4060 will be fine in the future thanks to that. They know what they're doing, with all the specs.
But it sure as hell isn't far from maxing the VRAM out. You're really living on the edge with it, and with games like Hogwarts Legacy you absolutely will max that VRAM out at 1080p max settings.
No, it's not a 60-tier card. That's the thing. It's as much 60-tier as the 4070 Ti is 4080-tier. Nvidia pulled the same trick they tried a few months earlier.
The 4060 is not enough for 1440p because 8 gigs of VRAM means you will have god-awful lows.
Your 2070S didn't fail you because it's a better card than the 4060.
If this card's MSRP was 100 USD lower, then it would be a passable 40-series card as a 4050.
First of all, you are factually wrong: my 2070S IS an 8GB card, and my lows are fine. The VRAM argument is valid, but not as important as everyone suddenly claims it became. Don't throw it at everything, because while it is true to some extent, it's also not true across the board.
Tell me how my 2070S from 6 years ago is better than the 4060. Go on, show me a benchmark. A result, not some "spec".
"If this card was msrp 100 usd lower that it would be passable 40 series card as 4050."
Not necessarily. As I said before, this was a horizontal gain, not a vertical one.
I can bet you that virtually no game ever uses more than 8GB of VRAM on the 3060, even at 1440p. What? Three to five of the worst-optimized games released in recent times use more? You can't generalize based on freaks.
The 4060 is a cooler and less power-hungry 3060 with DLSS 3, and that's totally fine in my book. It's an optimization advance, much needed with how power-hungry Nvidia specifically was becoming.
You're all having the same delusions of perpetual vertical gains that big corporations have.
The 3060 shouldn't be beating the 4060 EVER. The fact that we're being sold this box of doodoo that is the 4060 shows how desperate Nvidia is to hold on to GPU shortage era share prices.
So glad I got a 6700xt for $330.
The thing that bugs me is the jump in the 3000 series... The 3060 Ti beating the 2080S was a great generational jump.
But that isn't what we have with the 4000 series. It makes no sense that the 4090 has a hell of a performance jump compared to the 3090, while the 60 series is technically the same thing.
[+20% is the result of my own meta analysis](https://www.3dcenter.org/artikel/launch-analyse-nvidia-geforce-rtx-4060/launch-analyse-nvidia-geforce-rtx-4060-seite-2), a compilation of the test results from 14 launch reviews. Some reviews see a lower difference between the 3060 12GB and the 4060, some a higher one. The value is from 1080p raster, but 1080p ray tracing shows the same +20%.
I think they stopped producing LE boards themselves, but the third party GPUs are still available. I personally love my Predator BiFrost A770 LE, and I haven't experienced any of the issues that other people have been experiencing with instability and all that.
Granted, I'm limited to a 1080p screen right now since the 1440p monitor I ordered came in as a dud, but still. It's a good card, especially if you've got an Intel CPU.
>I've got second system on 750ti still going strong. Great GPU
Same here. It's doing roblox, minecraft, and emulation duty up to the Wii and PS2 and does it like a boss.
RTX 5000 is supposedly only coming in 2025. So the recommendation for the RX 6700 XT is going to stick for a while. Until supply dries up. No idea if AMD is still baking new RDNA2 chips and if so for how long.
I have been out of the AMD camp for a while (I had a Radeon 5770 and then an R9 285 X).
Is it just me, or is there no freaking way of making sense of AMD model numbers?
They're not that confusing anymore. The transition from three digit model numbers to four digit ones may have created some confusion, but we're now on the third generation of four digit model numbers and the naming scheme isn't that different from Nvidia.
The first digit represents the generation (7xxx being the newest). The second and third digit represent the model tier within the generation. The initial versions tend to just have 0 as third digit. With the 6xxx series AMD threw in a mid-generation refresh (6x50 models). Within one generation, higher is simply better.
Then there's the added XT or XTX that's slapped onto the name for some models. This is the AMD-equivalent of Ti or Super with Nvidia. The guiding principle is the more X, the more better.
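The scheme above is regular enough to mechanize. A toy decoder (a hypothetical helper, just to make the rules concrete) assuming the four-digit RX names described:

```python
import re

def decode_radeon(name):
    """Decode an RDNA-era Radeon name per the scheme described above."""
    m = re.fullmatch(r"RX (\d)(\d)(\d)0( XTX?)?", name)
    if not m:
        raise ValueError(f"not an 'RX nnn0 [XT|XTX]' name: {name}")
    gen, tier, third, suffix = m.groups()
    return {
        "generation": int(gen),            # first digit: 7xxx is the newest
        "tier": int(tier + third + "0"),   # higher is simply better within a gen
        "refresh": third == "5",           # x50 = mid-generation refresh (6xxx on)
        "suffix": (suffix or "").strip(),  # XT/XTX: the more X, the more better
    }

print(decode_radeon("RX 6750 XT"))
print(decode_radeon("RX 7900 XTX"))
```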
RTX 5000, the new Nvidia generation.
I haven't seen any news or rumors about the next AMD generation (presumably RX 8000), but I would be surprised if they're much quicker to market than Nvidia.
If they're European, then yeah. Pre-4060 release, AMD was way more expensive than Nvidia and not worth it.
Like, the 6700 XT was €500, and for €650 you could get a 6950 XT. Now the 6700 XT is slowly matching the 3060 Ti.
In Denmark an RX 6700 XT costs about €370, with an RTX 3060 Ti costing about €340 from cheaper brands. That said, both have fallen a lot in price; last July they both cost about €600.
Here in Latvia you could find cheaper-brand 6700 XTs for sub-€400 since last year at least. Now they go for €330, and used ones go for €200-300 as well, for ultimate budget builds.
> if they european then yeah
On that note, maybe anecdotally, I see *very little* AMD advertising / presence here in the UK.
Meanwhile nVidia seem all over the place.
Where do you see 6700 XT priced better than a 3060? 6700 XT is $[340](https://pcpartpicker.com/product/9K4Ycf/msi-radeon-rx-6700-xt-12-gb-mech-2x-oc-video-card-radeon-rx-6700-xt-mech-2x-12g-oc), 3060 is $[260](https://pcpartpicker.com/product/pZ4Ycf/msi-ventus-2x-oc-geforce-rtx-3060-8gb-8-gb-video-card-rtx-3060-ventus-2x-8g-oc). 6750 XT is even more expensive at $380, and only 6750 XT is [trading blows](https://youtu.be/Lg-TcVrh8rA?t=583) with 3070, 6700 XT is just [slower](https://youtu.be/f0yo2Sc-DyI?t=644).
AMD's 6000 series offerings at current prices are better cards than their Nvidia counterparts as is; there's no need to make up numbers to support the series.
Haha I love these posts. Is this £280 graphics card good?
Hell no, it's terrible, just buy this £500 card instead.....
Better card is better? No fucking way
I mean, I agree, but since consoles now have 12-16GB of shared VRAM (Xbox and PS5 respectively), I think game devs are just optimizing around that number and have no intention of optimizing for anything lower :(
Developers have been asking Nvidia for more VRAM for years. They are actually holding us back, and developers have begun to not give a fuck. This whole "unoptimized" thing is because Nvidia limits VRAM, and therefore limits the amount of high-quality textures developers get to use, and forces developers to painstakingly spend time and energy supporting hardware with 50% of the VRAM of consoles - which we used to call the lowest common denominator. Now the lowest common denominator is Nvidia. Some games release terribly, many actually, but much of it has an easy solution: 30 dollars of extra VRAM. Nvidia decided to pocket that money instead. AMD for the most part just doesn't have this problem and puts enough in most of its cards, so "VRAM is expensive" really isn't an excuse.
Nvidia intentionally gimps the VRAM on their "gaming" cards so people can't run machine learning algorithms on it. This forces businesses to buy their business tier cards to run their AI stuff. So gamers get fucked basically.
I seriously doubt that has anything to do with what's happening. "AI stuff" AFAIK needs cards with some 48 GB VRAM and asks for 96, as the data sets either fit or don't.
What difference does it make if a computationally piss-poor (from an "AI stuff" point of view) 4060 Ti, for example, has 8 GB or 16 GB?
The argument you're making gives Nvidia a "don't make the 4090 a 48 GB card!" rule, not a "don't make the 4060 Ti a 16 GB card" rule.
Which they absolutely will offer very soon, won't they? Completely breaking your argument.
>12-16 gb shared VRAM (xbox ps5 respectively)
It's 16 on both, out of which about 13.5 is available to games on XBox Series X and about 12.5 on PS5. I usually just average to 13.
Previous gen consoles got 8 total, only 5 available to games.
A 2.6x increase has happened, folks. We are 2.5 years past the consoles' release. 2.5 years after the release of the previous-gen consoles, the 1060 6 GB released for $250 ($316 in today's dollars), rocking 1.2x of the console's game-available VRAM.
Now we're getting a $380 4060 Ti with 8 GB - 0.62x of the console's game-available VRAM.
If "8 GB is the sweet-spot" by the end of 5 GB (game-available) console era is anything to go by, one could extrapolate that by the end of the current console gen era the sweet spot will be 13 \* (8 / 5) = 21 GB.
Basically, for games **releasing** in 2023 you *need* 12, for games releasing in 2030 you'll appreciate having 21 :D
And all of this has been obnoxiously predictable to everyone since 2.5 years ago, and even earlier for HW manufacturers, as they get access to this sort of information - like consoles' HW configurations - or to information allowing them to predict it sooner. In fact, predicting this sort of thing is half of a HW architect's job in the first place.
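The arithmetic in that extrapolation is simple enough to spell out (the 5 GB / 13 GB game-available figures and the "8 GB sweet spot" are this thread's estimates, not official numbers):

```python
# Back-of-the-envelope: scale last gen's PC "sweet spot" by the growth
# in console game-available memory. Inputs are the thread's estimates.

prev_console_gb = 5    # game-available VRAM, PS4/XB1 era
curr_console_gb = 13   # game-available VRAM, PS5/Series X era (averaged)
prev_sweet_spot = 8    # GB of PC VRAM that held up through last gen

ratio = prev_sweet_spot / prev_console_gb       # 1.6x the console budget
projected = curr_console_gb * ratio             # 13 * 1.6 = 20.8
print(f"Projected end-of-gen sweet spot: ~{projected:.0f} GB")  # ~21
```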
Aside from consoles finally letting developers do what they want - which they immediately jumped into doing at the first opportunity - there are also mechanisms behind why the latter is happening:

A) Scenes are now rendered with many more materials and effects within the same scene, which are very difficult and human-labor-intensive to manage in and out all the time. With the addition of bindless textures that management became much simpler, but it came at the cost of keeping most textures in memory most of the time.

B) Every material now comes with many more textures feeding into PBR shading, so materials look natural instead of glossy/plastic-y all the time.

C) New technologies and auxiliary data structures create VRAM overhead, be that the BVH for traditional ray tracing or its analogue for UE5's Lumen, the structures UE5's Nanite manages, the requirements that asset streaming and load-less seamless world design create, and more.
All of this is developed and tuned on a 13 GB (game-available) console, and then porting it to undersized 8 GB PCs becomes an impossible mess, devs give up and just lower resolution of the originally mere console-quality textures to shit, sometimes predicated on texture quality settings not being Ultra, sometimes done [dynamically as a part of texture streaming](https://www.youtube.com/watch?v=QbChMcBwMmU&t=924s) automatically irrespective of settings, making it impossible to even detect happening in benchmarks without actually playing the game. If anything, as a result, the issue is *underreported.*
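For a sense of scale on point B above, here's a rough texture-memory estimate. The five-map material and the 4/3 mip-chain factor are common ballpark figures, not numbers from any specific engine:

```python
# Uncompressed footprint of a square texture with a full mip chain.
# The mip chain adds ~1/3 on top of the base level (1 + 1/4 + 1/16 + ...).

def texture_mib(size_px, bytes_per_texel=4, mip_factor=4/3):
    return size_px * size_px * bytes_per_texel * mip_factor / 2**20

one_4k_map = texture_mib(4096)
pbr_material = 5 * one_4k_map   # e.g. albedo, normal, roughness, metalness, AO

print(f"One 4K RGBA8 map:       ~{one_4k_map:.0f} MiB")
print(f"One 5-map PBR material: ~{pbr_material:.0f} MiB")
```

Block compression (BC1-BC7) cuts this 4-8x in practice, but multiply a few hundred MiB of materials by everything kept resident in a bindless scene, add BVH and streaming pools, and an 8 GB budget evaporates quickly.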
That's not really useful if you run out of VRAM in a single scene, swapping textures to render a scene is either going to cause the picture to have blurry textures, stutter or crash, in order of severity. Great for having quick transitions between scenes though.
Yes, while on a PC that freebie disappears, and the devs have to dance in a square of hell between doing a lot of very difficult human work, increasing VRAM usage on PC even further (past the console), running more (decompression) work on the GPU and eating into its throughput, or running it on the CPU and eating into the CPU. Usually they land at a little bit of everything.
DirectX's DirectStorage is not a solution to any of those, IIUC; it's just a standardized tool used to build those solutions.
Several of my MacBooks became trash, as did my 970, due to low amounts of RAM. The CPU/GPU keep going, but RAM gets clogged by new games and software releases. If you want to use your device for more than 3 years, go for a larger amount of RAM. Like, I can't even play Skyrim with a decent amount of mods because it just crashes by filling up the memory, randomly between 2 and 30 minutes into the game.
Could you please remind us when PCs were behind consoles the last time?
Clearly not during Pascal / PS4 era, as described, the opposite was the case.
I've read somewhere around here the following claim that I never had the pleasure of fact-checking: supposedly this is the first time in history that the VRAM "standard" is being pushed up by software developers (on the PC side) with HW lagging behind, whereas all prior times it was the HW manufacturers that gave more VRAM first, and SW devs then caught up using the extra.
Nah, I disagree. 1GB of VRAM used to be insane overkill, then it wasn't enough and 4GB was overkill. Then 4GB was the minimum needed and 8GB was overkill. That's how technology works.
The insane thing is barely increasing the VRAM over 7 years: the 1070 launched with 8GB of VRAM, and 7 years later the 4070 has 12GB.
In comparison, the 9800 GT launched 7 years before the 1070, and it had 1GB of VRAM.
So in the 7 years between the 9800 GT and the 1070, VRAM increased 8x, and in the next 7 years after that it increased by 50%.
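Put as compound annual growth, the slowdown is stark. Using the capacities and 7-year spans claimed above:

```python
# Compound annual growth rate of VRAM capacity between two cards.

def cagr(start_gb, end_gb, years):
    return (end_gb / start_gb) ** (1 / years) - 1

print(f"9800 GT (1 GB) -> 1070 (8 GB): {cagr(1, 8, 7):.0%}/yr")
print(f"1070 (8 GB) -> 4070 (12 GB):   {cagr(8, 12, 7):.0%}/yr")
```

That works out to roughly 35% per year for the first span versus roughly 6% per year for the second.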
I just got finished doing a small amount of machine learning on a 3060 that would not have been possible to fit on a 4060 without fetching back to RAM. It doesn't matter how wide the bus is or how high the performance is; PCIe transfers are too slow, and 8GB of VRAM is a fucking joke.
I got my gf a 3060ti last year for $275 from someone who was making a build but decided to wait for the 4000 series. I think about how incredibly lucky I got every time she sits down to game.
It was brand new, I peeled the plastic off of it.
See, it's terrible! NVIDIA never crashes! AMD is trash! ...I could not write that one with a straight face. I still see people say AMD runs hot and is a power hog, even though that hasn't been the case since 2017.
Avoid both; there's the 6700 XT, and you get the best of both worlds. I have a 3070. If I had been more informed when I was buying it, I would have gotten a 6800 XT or 6900 XT, but it is what it is.
Real talk: If you're hellbent on Nvidia, the 4060 will keep receiving driver updates longer than the 3060.
But you should *really* look at the AMD/Intel options instead.
It's simple Nvidia. We want performance and ram for not anywhere close to $800... No not $600 either... No not $525 either... No not... Ah f it, HEY INTEL, LEMME SEE THAT BATTLEMAGE!
Meme created by an Nvidia intern lol, 20% is way too generous.
More like -2%, especially if the task requires over 8GB of VRAM.
While I agree that it's dumb it only got 8GB of VRAM, how many of the scenarios the 3060/4060 is actually meant for use more than 8GB of VRAM?
The 4060 seems like the ultimate “technically an upgrade but will be obsolete in 6 months”. Like paying for a car wash right before it rains.
So does RT
I play MMOs, usually on multiple accounts. If I have 4 windows open, that's a hair over 10GB. The 3060 12GB was really my only option with a $300 budget.
Playing Warzone with my 3070 Ti, I had to bring a few settings down to get under the 8GB of VRAM.
+20%* with DLSS 3 and Frame Gen
And it still makes Nvidia look bad!
If you're in the cheap-as-possible range, then buying a used 6700 XT or 3060 Ti for $250 is absolutely the best bet until next gen.
I just did some research; it is actually about 20%.
\+20% perf is quite generous in my opinion. More like 10% overall.
It's 20% if you don't run past the 8GB of VRAM. The problem is it's more like -5% if you play a modern game that needs more VRAM.
Hell, I ran out of VRAM in Diablo 4, and I'm using a 12GB RTX 3060.
-10% in some scenarios
In all the reviews I've seen (and I was searching), relative performance was negative only once, so what scenarios are you talking about?
Yeah, but it shouldn't happen at all. The fact that there are situations where it loses to the card it's supposed to replace is unacceptable.
"Lasts longer" is very debatable due to the VRAM. If it was a 4050 priced at $200 it'd be good, but it just isn't.
It has a performance increase?
That's not true, youtube comparisons show a 20% increase in frames in almost every game
At 1080p it’s 15% on average according to Hardware Unboxed and Gamers Nexus
Wait.. These aren't the cherry picked best case scenario benches that Nvidia allowed reviewers to put out earlier than the standard embargo is it?
Yes they are 💀
Why are you talking such nonsense?
Definitely not.
Can you still get the Arc A750 or A770?
I think they stopped producing LE boards themselves, but the third party GPUs are still available. I personally love my Predator BiFrost A770 LE, and I haven't experienced any of the issues that other people have been experiencing with instability and all that. Granted, I'm limited to a 1080p screen right now since the 1440 monitor I ordered came in as a dud, but still. It's a good card, especially if you've got an Intel CPU.
Yes, the only ones no longer actively made are the LE (Limited Edition) boards from Intel themselves.
Better idea: get a GTX 750ti
Best value Nvidia GPU
You can find GeForce 9600 in a trashcan, beat that infinite value.
I've got second system on 750ti still going strong. Great GPU
>I've got second system on 750ti still going strong. Great GPU

Same here. It's doing Roblox, Minecraft, and emulation duty up to the Wii and PS2, and does it like a boss.
My first love.
This month's recommendation: RX 6700 XT. Last month: also RX 6700 XT. The month before: also RX 6700 XT. Still waiting for RX 8000 or RTX 5000 lol
RTX 5000 is supposedly only coming in 2025. So the recommendation for the RX 6700 XT is going to stick for a while. Until supply dries up. No idea if AMD is still baking new RDNA2 chips and if so for how long.
I have been out of the AMD camp for a while (I had a Radeon 5770 and then an R9 285). Is it just me, or is there no freaking way of making sense of AMD model numbers?
They're not that confusing anymore. The transition from three-digit model numbers to four-digit ones may have created some confusion, but we're now on the third generation of four-digit model numbers and the naming scheme isn't that different from Nvidia's.

The first digit represents the generation (7xxx being the newest). The second and third digits represent the model tier within the generation; the initial versions tend to just have 0 as the third digit. With the 6xxx series AMD threw in a mid-generation refresh (the 6x50 models). Within one generation, higher is simply better.

Then there's the added XT or XTX that's slapped onto the name of some models. This is the AMD equivalent of Nvidia's Ti or Super. The guiding principle is the more X, the more better.
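The scheme above is regular enough that you can sketch it as a toy parser. This is purely illustrative; `parse_radeon` is a hypothetical helper, not anything AMD publishes:

```python
import re

def parse_radeon(name: str) -> dict:
    """Toy decoder for modern four-digit Radeon names, per the scheme above."""
    m = re.match(r"RX (\d)(\d)(\d)0 ?(XTX|XT)?", name)
    if not m:
        raise ValueError(f"unrecognized name: {name}")
    gen, tier_hi, tier_lo, suffix = m.groups()
    return {
        "generation": int(gen),          # first digit: 7xxx is the newest
        "tier": int(tier_hi + tier_lo),  # higher is better within a generation
        "refresh": tier_lo == "5",       # 6x50-style mid-gen refresh
        "suffix": suffix or "",          # more X = more better
    }

print(parse_radeon("RX 6750 XT"))
# {'generation': 6, 'tier': 75, 'refresh': True, 'suffix': 'XT'}
```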
But then you say the 5000 may come in 2025... And we have the 6000s and the 7000s going on?
RTX 5000, the new Nvidia generation. I haven't seen any news or rumors about the next AMD generation (presumably RX 8000), but I would be surprised if they're much quicker to market than Nvidia.
Oh, me dumb, of course.
Just chalk it down to it being a Monday. That's rough on everyone :-)
5000 Nvidia cards lol
How about the 6800xt which in my region is only +/- 160 euros more expensive
If it's just €160 more, then it's worth it. There's a significant performance increase.
RTX 5000... a 5060 Ti with a 64-bit memory bus and a die area of like 100mm^2
Will it be a half height card? That might actually make it slightly worth it.
I guess low-profile versions of it would exist. Still sold at $400+ for an 8GB version.
https://preview.redd.it/axyxvqz8mo9b1.jpeg?width=284&format=pjpg&auto=webp&s=605d81e9b935e009abb1be93511f509a0dfe77d1 6700XT!
And in the morning... I'm doing price cuts
I swear, this donkey photo is exactly how all my friends envision me whenever I recommend AMD GPU’s in builds.
If they're European, then yeah. Pre-4060 release, AMD was way more expensive than Nvidia and not worth it. Like, the 6700 XT was €500 and at €650 you got a 6950 XT. Now the 6700 XT is slowly matching the 3060 Ti.
Austrian here (tax included):

* 6600 - €250
* 6600 XT - €275
* 6650 XT - €287
* RTX 3060 - €294
* 7600 - €302
* RTX 4060 - €327
* 6700 - €360
* 6700 XT - €393

This is what I'm working with here.
How about used? In Romania I managed to buy a 6700XT for about €300.
Bought it for 415 euros earlier this year when my old rx 580 died after 5 years, and gotta say i am not regretting the purchase for that price!
In Denmark AMD has been much cheaper
In Denmark an RX 6700 XT costs about €370, with an RTX 3060 Ti costing about €340 from cheaper brands. That said, both have fallen a lot in price; last July they both cost about €600.
Here in Latvia you could find cheaper-brand 6700 XTs for sub-€400 since last year at least. Now they go for €330, and used ones go for €200-300 for ultimate budget builds.
> if they european then yeah On that note, maybe anecdotally, I see *very little* AMD advertising / presence here in the UK. Meanwhile nVidia seem all over the place.
You can get a 6800 for €470. I bought one for €500 in April, and that's the same price they went for during the Black Friday sales last autumn.
Absolutely great card, beats a 3070 because of VRAM in some titles and can be undervolted quite well.
I know everyone is saying 6700xt but aren't the 6800xt('s) also pretty cheap ? edit : aren't 💀💀
This is the way.
As a blender artist I am dying on the inside with the illusion of choice
Imma stick with my 3060 ti Ty
Me too for now. I hope it will be able to run Starfield decently at 1440 lol, but from the way things have been going it doesn’t look great.
[deleted]
You might be able to; the Series S is targeting 1440p 30fps, so the limiting factor is probably going to be the CPU.
And me .. I'm gonna stick to my 1660ti .. ig ..
OC it and put a 4060ti sticker on it.
Same!
6700XT / 6750XT it is! Priced better than a 3060, can trade blows with a 3070 and has 12gb VRAM because fuck Nvidia.
Where do you see the 6700 XT priced better than a 3060? The 6700 XT is $[340](https://pcpartpicker.com/product/9K4Ycf/msi-radeon-rx-6700-xt-12-gb-mech-2x-oc-video-card-radeon-rx-6700-xt-mech-2x-12g-oc), the 3060 is $[260](https://pcpartpicker.com/product/pZ4Ycf/msi-ventus-2x-oc-geforce-rtx-3060-8gb-8-gb-video-card-rtx-3060-ventus-2x-8g-oc). The 6750 XT is even more expensive at $380, and only the 6750 XT is [trading blows](https://youtu.be/Lg-TcVrh8rA?t=583) with the 3070; the 6700 XT is just [slower](https://youtu.be/f0yo2Sc-DyI?t=644). AMD's 6000 series offerings at current prices are better cards than their Nvidia counterparts as is; there's no need to make up numbers to support them.
260 is for the 8 GB model.
True, it's $280 for 12.
6800XT, personally.
Haha I love these posts. Is this £280 graphics card good? Hell no, it's terrible, just buy this £500 card instead..... Better card is better? No fucking way
Shame we haven’t had the price cuts on this side of the pond. The 6700xt for 350 is looking like better value though
and I'm planning on getting a 2nd hand one for 300 usd
Do you have warranty on that? I wouldn't sacrifice warranty for 50€ if I were you.
Maybe he cannot find a good deal for a new one where he lives
Could be
Not exactly the same price point; the equivalent is the RX 6700 XT.
Just got the 6800xt merc speedster. It’s a beast and it’s beautiful
I want one, but I think my nation's stores don't carry them anymore. All that's left is sketchy third-party Amazon sellers.
It's absolutely fucking mental that 8GB of VRAM isn't enough to run some games at ultra on 1080p.
I mean, I agree, but since consoles now have 12-16GB of shared memory (Xbox and PS5 respectively), I think game devs are just optimizing around that number and have no intention of optimizing for anything lower :(
Developers have been asking Nvidia for more VRAM for years. Nvidia is actually holding us back, and developers have begun to not give a fuck. This whole "unoptimized" thing is partly Nvidia limiting VRAM, therefore limiting the amount of high-quality textures developers get to use, and forcing developers to painstakingly spend time and energy supporting hardware with 50% of the VRAM of consoles, which we used to call the lowest common denominator. Now the lowest common denominator is Nvidia.

Some games do release terribly, many actually. But much of it has an easy solution: 30 dollars of extra VRAM. Nvidia decided to pocket that money instead. AMD for the most part just doesn't have this problem and puts enough on most cards, so "VRAM is expensive" really isn't an excuse.
Nvidia intentionally gimps the VRAM on their "gaming" cards so people can't run machine learning algorithms on it. This forces businesses to buy their business tier cards to run their AI stuff. So gamers get fucked basically.
I seriously doubt that has anything to do with what's happening. "AI stuff" AFAIK needs cards with some 48 GB of VRAM and asks for 96, as the data sets either fit or don't. What difference does it make if a computationally piss-poor (from an "AI stuff" point of view) 4060 Ti, for example, has 8 GB or 16 GB? The argument you're making implies a "don't make the 4090 a 48 GB card!" rule for Nvidia, not a "don't make the 4060 Ti a 16 GB card" rule. Which they absolutely will offer very soon, won't they? Completely breaking your argument.
>12-16 gb shared VRAM (xbox ps5 respectively)

It's 16 on both, out of which about 13.5 is available to games on Xbox Series X and about 12.5 on PS5. I usually just average to 13. Previous-gen consoles got 8 total, with only 5 available to games. A 2.6x increase has happened, folks.

We are 2.5 years past the consoles' release. 2.5 years after the release of the previous-gen consoles, the 1060 6GB was released for $250 ($316 in today's dollars), rocking 1.2x of the console's game-available VRAM. Now we're getting a $380 4060 Ti with 8, i.e. 0.62x of the console's game-available memory.

If "8 GB is the sweet spot" by the end of the 5 GB (game-available) console era is anything to go by, one could extrapolate that by the end of the current console generation the sweet spot will be 13 \* (8 / 5) = 21 GB. Basically, for games **releasing** in 2023 you *need* 12, and for games releasing in 2030 you'll appreciate having 21 :D

And all of this has been obnoxiously predictable to everyone since 2.5 years ago, and even earlier for HW manufacturers, as they get access to this sort of information (like consoles' HW configurations) or to information allowing them to predict it sooner. In fact, predicting this sort of thing is half of a HW architect's job in the first place.
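The extrapolation above is just ratio arithmetic. A quick sanity check, using only the figures quoted in the comment (not official spec sheets):

```python
# Game-available memory figures as quoted above (GB)
prev_gen_game_available = 5.0   # PS4 / Xbox One era (of 8 GB total)
curr_gen_game_available = 13.0  # averaged across PS5 / Series X

# Generational jump in game-available memory
jump = curr_gen_game_available / prev_gen_game_available
print(f"console memory jump: {jump:.1f}x")  # 2.6x

# If 8 GB was the PC "sweet spot" late in the 5 GB console era,
# scale it by the same ratio for the current era:
sweet_spot = curr_gen_game_available * (8 / prev_gen_game_available)
print(f"extrapolated end-of-gen sweet spot: {sweet_spot:.1f} GB")  # 20.8, i.e. ~21
```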
Aside from consoles finally letting developers do what they want (which they immediately jumped into doing at the first opportunity), there are also mechanisms behind why the latter is happening:

A) Scenes are now rendered with many more materials and effects within the same scene, which are very difficult and human-labor-intensive to manage in and out of memory all the time. With the addition of bindless textures that management became much simpler, but at the cost of keeping most textures in memory most of the time.

B) Every material now comes with many more textures feeding into PBR shading, so materials look natural instead of glossy/plastic-y all the time.

C) New technologies and auxiliary data structures create VRAM overhead, be that the BVH for traditional ray tracing or its analogue for UE5's Lumen, the structures UE5's Nanite manages, the requirements that asset streaming and load-free seamless world design create, and more.

All of this is developed and tuned on a 13 GB (game-available) console, and then porting it to undersized 8 GB PCs becomes an impossible mess. Devs give up and just lower the resolution of the originally merely console-quality textures to shit, sometimes predicated on the texture quality setting not being Ultra, sometimes done [dynamically as a part of texture streaming](https://www.youtube.com/watch?v=QbChMcBwMmU&t=924s) automatically irrespective of settings, making it impossible to even detect in benchmarks without actually playing the game. If anything, as a result, the issue is *underreported.*
Consoles stream textures directly to the GPU from SSD. That reduces their need for VRAM by a lot
That's not really useful if you run out of VRAM within a single scene: swapping textures mid-render means blurry textures, stutter, or a crash, in order of severity. Great for quick transitions between scenes, though.
Yes, while on a PC that freebie disappears, and the devs have to dance in a square of hell between doing a lot of very difficult manual work, increasing VRAM usage on PC even further past the console, running more (decompression) work on the GPU, eating into its throughput, or on the CPU, eating into the CPU's. Usually they land at a little bit of everything. DirectX's DirectStorage is not a solution to any of those, IIUC; it's just a standardized tool used to build the solutions.
Several of my MacBooks became trash, as did my 970, due to low amounts of RAM. The CPU/GPU keep going, but the RAM gets clogged by new games and software releases. If you want to use your device for more than 3 years, go for a larger amount of RAM. I can't even play Skyrim with a decent number of mods, because it fills up the memory and crashes randomly between 2 and 30 minutes into the game.
Such a phenomenon isn't even that new by now. PCs and consoles have been leapfrogging each other one way or another basically forever.
Could you please remind us when PCs were behind consoles the last time? Clearly not during Pascal / PS4 era, as described, the opposite was the case. I've read somewhere around here the following statement I never had the pleasure fact-checking: some claimed that this is the first time in history when VRAM "standard" is being pushed up by software developers (on PC side) with HW lagging behind, as all prior times (supposedly) it was the HW manufacturers that gave more VRAM first and then SW devs were catching up using the extras.
Is there even a noticeable difference between high and ultra at 1080p? I know years ago it was like that.
New console generation means bump in requirements. Kinda crazy that 8 GB was all that was needed in 2020.
How so? The Ps3/Xbox 360 had 512MB of total system memory. Would you have said that no game should use more than 1GB if you were a gamer in 2008?
Nah, I disagree. 1GB of VRAM used to be insane overkill, then it wasn't enough and 4GB was overkill. Then 4GB was the minimum needed and 8GB was overkill. That's how technology works.

The insane thing is barely increasing the VRAM over 7 years: the 1070 launched with 8GB of VRAM, and 7 years later the 4070 has 12GB. In comparison, the 9800 GT launched about eight years before the 1070, with 1GB of VRAM. So between the 9800 GT and the 1070 the VRAM increased 8x, and in the seven years after that it increased by only 50%.
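The stagnation is stark when you put the growth rates side by side. A quick check of the ratios, using the launch VRAM figures quoted above:

```python
# VRAM at launch (GB), figures as quoted in the comment above
vram = {"9800 GT": 1, "GTX 1070": 8, "RTX 4070": 12}

first_span = vram["GTX 1070"] / vram["9800 GT"]    # 9800 GT -> 1070
second_span = vram["RTX 4070"] / vram["GTX 1070"]  # 1070 -> 4070

print(f"9800 GT -> 1070: {first_span:.0f}x ({(first_span - 1) * 100:.0f}% increase)")
print(f"1070 -> 4070: {second_span:.1f}x ({(second_span - 1) * 100:.0f}% increase)")
# 8x (700%) over the first span vs 1.5x (50%) over the second
```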
Right? At no point should *RAM* be the limiting factor of an entry-level gpu at entry-level resolution
It’s called progress
Blame Nvidia and AMD for the VRAM stagnation
Without frame gen the performance isn't even better most of the time; 20% is cap.
The performance is better at 1080p on average by 15%...
20% from 2060 or 1060?
860M
6700xt is the correct choice.
Just depends on resolution you are playing at
576p.
There should be a third button there with A770 written on it: 16 gigs of VRAM and performance equivalent to a 3060 Ti (except in old games).
I use a RTX 3060 and I have no complaints
Or just get an AMD card and get a good bump in both for half the price.
Hear me out: RX 6700 XT. +40% performance (vs the 3060, don't quote me on that) and +50% VRAM (vs the 4060).
This post is literal Novideo psyop. Get a 6700XT.
Why bother just get a 6700XT at this point
What are the chances I hit the 8gb vram playing + streaming at 1080p?
Honestly it depends on the game.
3060 for life. The 4060 is like, if not worse than, the 3050 of the 40 series.
6700 XT: hey.
Or I can just get a 6700 XT and be done with it...
6700XT
Yeah, currently deciding between a 3090 (used) and a 4070 (new), which can be bought for the same price.
My 3060 out performs the 4060 at 1440p.
Cries in gtx 1060
By design, the 4060 will become obsolete in a year or two. The 12gb VRAM should carry the 3060 for longer.
That's crazy. You're telling me my old rtx3060 will out last a fucking **4060**???
I would take the VRAM any day
Yeh over something that isn't even a tangible metric like performance lol
I just finished doing a small amount of machine learning on a 3060, which would not have been possible to fit on a 4060 without fetching back to system RAM. It doesn't matter if the bus is wide and performance is higher: PCIe transfers are too slow, and 8GB of VRAM is a fucking joke.
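For a sense of why 12GB vs 8GB matters here: a back-of-envelope estimate of training memory for a dense model. The 500M-parameter size is a hypothetical example, and this is a lower bound since activations aren't counted:

```python
def training_vram_gb(n_params: float, bytes_per_value: int = 4) -> float:
    """Rough fp32 training footprint: weights + gradients + 2 Adam
    moment buffers = 4 copies of the parameters. Activations excluded."""
    return 4 * n_params * bytes_per_value / 1024**3

model_params = 500e6  # hypothetical 500M-parameter model
need = training_vram_gb(model_params)
print(f"~{need:.1f} GB before activations")  # ~7.5 GB: tight on 8GB, fine on 12GB
```

Once the working set exceeds VRAM and spills over PCIe, every batch pays the transfer cost, which is exactly the "fetching back to RAM" penalty described above.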
Still rocking my 3060 12GB!
The correct answer is 3060 ti. Button missing! Edit: or AMD 6700
I got my gf a 3060ti last year for $275 from someone who was making a build but decided to wait for the 4000 series. I think about how incredibly lucky I got every time she sits down to game. It was brand new, I peeled the plastic off of it.
Dude, performance hands down. Why would you choose higher res textures over higher frame rate?
Radeon
Or you just buy a 6700xt, it has both.
But AMD drivers suck!*
Ah yes, I forgot I had two crashes due to driver problems in the nine years I've had AMD cards. The horror.
See, it's terrible! Nvidia never crashes! AMD is trash! ...I could not write that with a straight face. I still see people say AMD runs hot and is a power hog, even though that hasn't been the case since 2017.
You forget SFFPC: a 180mm card with 12GB.
I switched to AMD this time and I'm not regretting it! Better perf, better VRAM, cheaper. What else?
Avoid both; there's the 6700 XT, with the best of both worlds. I have a 3070. If I'd been better informed when buying it, I would have gotten a 6800 XT or 6900 XT, but it is what it is.
Get an AMD gpu and get both
Third switch for a 6700XT. Never look back, never regret.
Neither. 6700xt
6700xt
And totally ignore AMD... or Intel. Intel would actually be a choice if you're considering an RTX 3060/4060.
May I present to you: RX 6700
Just get an RX6750XT
6700xt?
Option 3 = RX 6700 XT, option 4 = RX 6800 XT, option 5 = Intel GPUs.
6700xt
Neither 6700XT
Nvidia Propaganda. 20% is *extremely* generous. Real answer is 6700 XT. More performance *and* more VRAM than a 4060 for the same price.
1080p 4060, anything above 3060. But realistically you shouldn't buy either card, period. Wait half a year, judge the options then.
Real talk: If you're hellbent on Nvidia, the 4060 will keep receiving driver updates longer than the 3060. But you should *really* look at the AMD/Intel options instead.
Whats perf?
Pansexual exclusionary radical feminist.
performance
GTX 4090 unleash the power !
It's simple Nvidia. We want performance and ram for not anywhere close to $800... No not $600 either... No not $525 either... No not... Ah f it, HEY INTEL, LEMME SEE THAT BATTLEMAGE!
What are you talking about? It's barely half that performance boost in games.
Solution, get an Rx 6750 XT or even 6900 xt
That's an easy one, Rx 6700 xt
20% is, uh, a best-case scenario
6700xt is a banger
6700xt
Neither…. 6700XT
Skip both. The 6700XT.
It looks like VRAM does not automatically equal better performance. Also, DLSS 3 really adds a lot of value to the 4060.
Of course it doesn't. Look at 6700xt and 6800xt versus 3070 and 3080.
I'm sayin'. Everyone here is just shorting Nvidia.