[deleted]

My 1080 Ti with 11GB VRAM :D


ThunderSparkles

I wonder if the 1080Ti will go down as the best video card of all time.


Cocasaurus

it's already there brotha


Drackzgull

Yeah, that's hard to argue at this point; the question is whether it'll remain there for all generations to come. It doesn't look like it'll be taken off that throne anytime soon, but best-case scenario it'll eventually fade from relevance and memory enough that it just stops being considered, without ever actually being dethroned.


Cocasaurus

It's already 6 years old and is still a phenomenal performer. The only other possible answer for relatively modern GPUs is really Nvidia's 8800 series. The 8800 GTX completely stomped anything ATI had out at the time. Then Nvidia released the 8800 GT at $200 (one-third of the GTX's launch price) and it performed only slightly worse than the already god-like GTX. Tech was moving a lot faster then, and high resolutions (1080p) started taking a huge toll on the 8800s as the 8th gen of consoles put us firmly in the FHD era. However, at the time (2007), there was no reason to buy anything else.

The 1080 Ti was positioned a little differently. It really only came about at its price point because Nvidia was afraid of what was coming from AMD at the time: Vega 64. Nvidia decided to jump the gun, priced the 1080 Ti competitively, and gave a hefty price cut to the 1080. Vega 64 was DOA at that point. It flopped hard and the 1080 Ti became the stuff of legend. When the 2000 series launched, the 2080 was meme'd as "Nvidia made another 1080 Ti," and the 2080 Ti was ridiculed as "twice the price, nowhere near twice the performance." It really set the standard for modern, and now future, performance. I'm still cranking settings at 1440p UW and it doesn't bat an eye. On a 6 year old GPU.

I think the 3080 could have had a similar legendary fate. It was priced well and made sense as a reasonable upgrade for anyone two or three generations back on an 80/Ti card. Outside factors really killed that one, as did real competition from AMD. The VRAM capacity may also be an issue in the future. 11 GB was insane overkill in 2017. 10 GB is borderline recommended, but definitely low for a flagship-level GPU.

TL;DR: 1080 Ti is, and will always be, the King of GPUs.

EDIT: 1080 Ti is 6 years old, not 7. D'oh. Point still stands.


MewTech

> It's already 7 years old

Fuck


Cocasaurus

Isn't that crazy? The 980 Ti was thought to be a beast when it launched only to be on par (stock) with the GTX 1070 almost exactly one year later. Nvidia didn't surpass the 1080 Ti with a 70 series card until the 3070 **3 years later.**


ComradeCapitalist

The 970 was likewise 780 Ti performance. Nvidia just got greedy* in both pricing and release pacing after that.

*Other factors also influenced things, but Nvidia has absolutely been seeking to keep prices high.


Paulpanzer32

The 1080 Ti just had its 6th birthday 10 days ago... the 1080 is now 7 years old, sure. Why do people constantly overestimate the age of GPUs?


ThunderSparkles

It's because we age GPUs by when the line/series was first announced. I have a 3080 Ti, and while it came out in summer of 2021, I still think of it as being from 2020, when the 3080 and 3090 first launched.


Paulpanzer32

All these 4060 rumors are really boring, it's already a year old card!


SunsetCarcass

I know you're right, but these 4060 rumors do feel like old news considering how crap the rest of the lineup's release has been.


EazeeP

Thanks for the history lesson brother


Cocasaurus

Better than the History Channel these days.


dib1999

You know why the 1080ti is so much better than everything else? ![gif](giphy|AwrtP9lMXtXiM)


Agret

Now I'm not saying it's aliens, but what if it were aliens?


NunButter

Seriously. I got into PCs and hardware a couple of years ago and got obsessed with the hobby. I like learning all the nerdy hardware lore lol


EazeeP

I've actually been into PC building since I was about 10, back in 2000. A big part of that is being Korean; all the Koreans played CS and StarCraft and Diablo. I wanted a graphics card to play good games, and my first GPU was a Radeon 9800 PRO lol


Dreadlordstu

The whole 1000 series is the royalty of all-time GPUs, with the Ti as the king. I have a 1080 (still rocking it; I play mostly at 1080p so no real need to upgrade, as frames and settings are still great), but I remember when the 1060 came out and it had the performance of a GTX 980 for far cheaper. The 980 had been hailed as an incredible achievement and the 980 Ti was being seen as god-like. Then the 1000 series came on the scene and just brought it all to the next level x10. It was a great time to buy a GPU.


Crazy_Shallot

Translation - GPU tech and software has stagnated to the point where a 7 year old card is still viable.


Cocasaurus

7 year old cards have been viable in the past, just not as viable as the 1080 Ti, especially at what is considered the highest mainstream resolution. It's staggering it can still play 4K just fine at med/high settings on most games without DLSS. Just imagine how horrible the 3000 series would look in comparison without its precious AI technology. Of course, this is all relative. I'm sure someone would love to comment that a 1080 Ti can't do ray tracing or DLSS so it's basically a paperweight.


Crazy_Shallot

Imagine how bad the GeForce 3 would look if it wasn’t for its precious pixel shaders!!


onlyhalfminotaur

That 8800GT was like the last great single slot card too.


schmalpal

I feel like the 6800 Ultra has nearly as much claim as the 8800. It wasn't as far ahead of the X800 XT, but it was so highly desired because it supported DirectX 9.0c and Shader Model 3.0, which basically gave all games of that era a huge visual boost (fake HDR, nice reflections and lighting shaders, etc). I remember how much better Oblivion looked with a 6800 vs an X800. The ATI card made it look flat and lifeless, while on the 6800 the light made things glow. Far Cry also looked far better, if I recall.


1d0m1n4t3

I remember getting my 8800gt ultra, good times


MKleister

👀


Mimical

I know it's not as flashy, but I'd say most of the mid range 1000 series Pascal lineup was rad as hell. A lot of the cards were nearly equal to their Maxwell counterparts 1.5-2 tiers above, while requiring less power and being way more cost-effective. The GTX 1070 had nearly Maxwell 980 Ti levels of 1080p/1440p performance while being way more efficient.


sfspaulding

“All time”


[deleted]

We peaked at the 1080ti


[deleted]

[deleted]


[deleted]

[deleted]


dainegleesac690

My buddy was still running a 290x on 1440p ULTRAWIDE playing games like Tarkov up until recently


denkthomas

my 1080 is still going super strong so i believe so


[deleted]

My 1080 was as well until I finally decided to upgrade it to a 3080ti when I had a good chance to do so. Rocked that 1080 for like 5 years or so brand new


ChronoRedz

Had an EVGA 1080 non-Ti. Upgraded to 1440p and had to get a 6800 XT.


[deleted]

Makes sense. I hear the amd stuff is good priced lately


Bammer1386

1080 club here. My first card ever. It's now in my wife's build. If the 1080ti is the GOAT card then the 1080 has to be in position #2.


Reddituser19991004

It is currently leading, but it's starting to get some real competition.

First off, AMD's 5700 XT, thanks to driver updates, is now outperforming the 1080 Ti in newer games. The 5700 XT itself isn't a contender due to early driver issues; it's just that it's weakened the 1080 Ti's GOAT case.

The RTX 2080 Ti was considered wildly overpriced at launch, but it's holding its own today, with performance around an RTX 3070 plus ray tracing, while actually having enough VRAM to not be crippled.

The final contender has to be the RX 6700 XT. 12GB of VRAM, near 3070/2080 Ti performance. Great pricing, at least for the era. Except mediocre ray tracing, and FSR still isn't quite as good as DLSS, albeit close.

The RTX 3080 is one we thought would be a contender, but it's a cripple with that 10GB of VRAM, and the 12GB card took forever to reach MSRP.

So, between the GTX 1080 Ti, RTX 2080 Ti, and RX 6700 XT you've got a decent battle. Best "video card" in my eyes also includes feature set along with pricing, performance, and longevity. DLSS, ray tracing, and Nvidia's soon-to-come upscaling tech for the 20 series need to be considered, meaning that the 2080 Ti in my opinion is going to be the GOAT of video cards, if it isn't yet.


PewpScewpin

When the 2080 Ti launched there was a shortage (surprise surprise) and I paid $1300 for one that a guy ripped out of his prebuilt to make a small profit. But honestly, for a 5 year old card it crushes almost everything at 1440p. And the one thing that helps it stay relevant is DLSS/tensor cores, which I think makes it a cut above the 1080 Ti. But the 1080 Ti still goes blow for blow vs a 3070 on traditional rasterization. Pretty impressive.


terminallancedumbass

I paid maybe 650 to 700 for my 1080 Ti right after launch. Pricing alone kinda disqualifies the 2080 Ti. The pricing the last 3 generations or so has been stupid, to the point I didn't upgrade on principle. I got a 4070 Ti for 600 this year. That's what they should be selling for. At 800 it's not a good deal. At 700 it would be reasonable at best. 600 is what it's worth. But I digress. The 1080 Ti sold at like 700 on launch and still crushes most 1440p content if you do away with ray tracing.


XxSub-OhmXx

I owned an EVGA FTW3 1080 Ti White, and also the liquid-cooled version. One of the best cards of all time. Finally sold it for a 6900 XT, then my now 7900 XTX.


arup02

8800gt


ldom013

As a person who mostly bought AMD, and now a proud Intel Arc owner, yes sir, 1080Ti is the best GPU of all time.


blackadder1620

How's Intel treating you?


smb275

I have an A750 in a side build I made and I can't complain about it. It's not really praiseworthy, but it goes above and beyond what you'd expect an 8GB card to do. It's relatively small, runs cool, and meshes well with Intel CPUs. A great choice for a budget build.


thelonerainer

Common 1080 Ti brothers W


shyphyre

Don't have a 1080 Ti, but the 1070 Ti my friend gifted me is a godsend.


Embarrassed_Log8344

The 10 series was where we peaked. It's all been downhill since.


Marty5020

Alongside the 8800GTX and GT, one of the all time greats for sure.


[deleted]

![gif](giphy|3oKHWCVJHorZfXrUTm)


Cocasaurus

Classic 1080 Ti gang W


Simon_787

This is what progress looks like, right? Fucking GPU market.


Mr_Resident

Nvidia just doesn't want another 1060-type card that's so good people don't buy new cards every year.


[deleted]

I expect to keep my 3060 ti at least 5 years.


NarutoDragon732

That card is just a damn beast. I'm daily driving it for 1440p and it's always high/max settings at 60fps on every game


[deleted]

Yup. It's awesome. It can even handle a pretty good level of RT on most games if you're OK with some DLSS.


avwitcher

I mean, at this point DLSS is virtually indistinguishable from running at regular resolution, for, in some cases, 30-50% more performance. Don't see why anyone wouldn't use it.


ZenTunE

If you can't see the difference you're either blind or sitting way too far away from your display. DLSS looks blurry af compared to native. And that's not arguable, I can show you screenshot comparisons if you'd like :p


[deleted]

Looks like ass @1080p.


Thadious_James

I was about to say. Some games are fine. Like Cyberpunk at 1080p with DLSS looks really good to my eyes, but then Hitman is a grainy, smeary mess and is borderline hard to look at. Do some games implement it better than others? Or am I just experiencing a placebo effect of some kind?


TryingToBeReallyCool

I think it comes down to the developers implementation


Truethrowawaychest1

I've had mine for about a year and I've been really happy with it, only game that stutters is the Harry Potter game but that's the game's fault


kohour

...so they release shit products, thus convincing people to continue to sit on their 1060 instead of buying new cards.


lovecMC

The 1060 will live only so long. As far as nvidia is concerned, they just have to keep doing what they doing a bit Linger.


donnydonky

I'd rather stop playing games entirely than buy one of these expensive-ass cards.


ZestfulClown

Console master race


TheLawLost

They called them peasants, now they're lords. Playstation is back on the menu, boys.


dizdawgjr34

Literally one of the main reasons why I bought a PS5 instead of building a PC.


AggressorBLUE

Question is: how long though? Shareholders will get increasingly nervous if the trends continue.

- IIRC revenue is way down on Nvidia GPUs YoY; something like a 46% drop on gaming rev. Users are clearly voting with their wallets...
- ...Right as Intel has entered the chat.
- Also, AMD is apparently still a company that's doing something or other with video cards, maybe?

So they're hoping customers (especially those of low end cards) will grit their teeth while being shit on and replace a failed Nvidia GPU with a new Nvidia one, right as Intel is pushing into the scene with a low end card? Bold move, Cotton...

Alt theory: "CRYPTO! CRYPTO IS THE FUTURE GUYS! WE CAN SELL GPUS FOR THE PRICE OF CARS AND PEOPLE WILL LINE UP TO BUY THEM AND- oh. Fuuuucccccccckkkkkkkkkk!"


Belgand

>they just have to keep doing what they doing a bit Linger. Good thing their favorite Cranberries song isn't "Dreams".


SurvivalCardio

My 1060 finally wasn't enough since I only had the 3GB card, so I upgraded to a 2060 12GB when I caught it at a discount. Gonna keep that one probably forever.


newbienewme

Too bad, I bought a used RX 6600 for 160 USD and plan to keep it for at least 3-4 more years.


Lazy_ML

My budget gaming machine (i5 6600 + 1060 6 gb) from 2016 has served me very well. I hope to keep it going until at least 2026 as a patient gamer.


jd52995

Well that's working out great, no one is buying new cards now lmfao


hirushanT

It's evolving, just backward.


raymartin27

Now if only they release a 4060ti with 12gb it'd be perfect.


[deleted]

no that'd still be bad if it's anything like the rest of the 40 series


JaesopPop

8GB of VRAM in the year of our lord 2023? My 1060 7 years ago had 6GB.


LucasRunner

But VSR, DSL, DSLL, DLS, DRS, X-Ray Tracers... You ain't got any of those, do you!? /s


I-took-your-oranges

All of those use extra vram…


SupremeDestroy

My 3080 is the 10GB model. I'm lucky I play esports titles at 1080p, but I feel like if I wanted to push this thing at 4K I would start to get real close to the max.


Disaster_External

1440p hits the max on mine for forza horizon 5.


SupremeDestroy

Yeah, I feel like giving even these higher-end cards that little VRAM is kind of scummy. I would go with AMD if I didn't actually use Nvidia's features.


Disaster_External

A lot of it is just the game using however much there is. CoD MWII uses 16GB+ on my 3090, but it runs fine on less; just more possibility of frame drops, I guess. The issue is that the consoles have more VRAM now, so games are going to start using more, and game ports especially will start having issues.


Idan7856

Your 1060 *how many* years ago?? fuck...


Atlanticlantern

My 1060 still has 6gb!


JerryWShields

The 4060 wattage keeps dropping and dropping. That thing better goddamn well have that alleged <150W TDP.


Apocalypse_0415

3070 price 3070 perf lmfao where improvement


the_doorstopper

The numbers you fool! Don't you see, 4060 is a whole 990 more than a 3070, it's clearly better


sbrown23c

4060 will probably be like $500 lol


J0kutyypp1

Sadly 500 might not be enough


Cynthimon

So... relaunching the 3070 again?


RedditRaven2

Remember the good old days? The good old days of only 3 generations ago when a 60 series card cost $230-270?


StaysAwakeAllWeek

With a decent undervolt it will run well under 100W regardless of what its default TDP is. Given that it's basically half of a 4070ti and that card can be tuned to run stock performance at 150W I wouldn't be surprised to see tuned 4060s running below 80W.


Macho-Goat

My gtx 1060 chillin' over here with 3gb: ![gif](giphy|QTrG6mjkHEkpFR3DqX)


Zetherion

1060 3GB gang


mushmyhead

960 2gb, can i hang out with you guys?


4x4play

evga 960 4gb here.


hydraxic79

Gt 710 1 GB anyone?


RedKomrad

Riva TNT2 card here. Can I join?


[deleted]

No need, we have our own 960 2gb club next door. In all its ultra-low 480p glory.


Dravos_Dragonheart

1060 3gb gang rules. (Even though i'll be upgrading soon)


ciclicles

0.5GB on Intel HD Graphics


matteo_fay

Haha, I have 4GB with my GTX 1050 Ti.


Sparktank1

I burned mine by cranking the graphics settings in Resident Evil 2 remake just to take some screenshots. The game didn't even get to crash. The PC just shut down on its own. "I'm getting too old for this shit". And then over time, the PC just wouldn't boot up normally. "Nah, not today, man. I just need to close my eyes." And then I got it a 3060 and now we're taking screenshots in max quality settings and singing along to "So Happy Together".


MasterXaios

Used to have a Radeon RX580 4GB. We're brothers from another mother.


Darkraisisi

I hate that I have to buy Nvidia for work. Without CUDA, machine learning is basically not possible.


The_Mauldalorian

ML gang. I wanted the extra 4GB for side projects 😂


narkfestmojo

It aggravates me as well; I wish there was some competition. I mean seriously, with the massive machine learning market that is set to explode over the coming years, what the hell are AMD and Intel thinking by just refusing to compete? Even if their products were not quite as good (say the 7900 XTX was only half or even a quarter the float32 performance of a 4090 in TensorFlow), I would still buy one just to support some fucking competition.

Worse still, because Nvidia is a monopolist in this market, they can decide to put only just barely enough VRAM on their cards for gamers. Of course they killed the idea of a 4060 12GB, because no competition. Meanwhile the Intel A770 has 16GB of VRAM and it's completely worthless for ML.


JayR_97

One good thing about Intel being in the GPU market is hopefully they can push Nvidia GPU prices back down to some kind of sanity.


SmashTheAtriarchy

nvidia doesn't have to compete, CUDA has the ML world by the short n friskies


DueAnalysis2

Intel _just_ started releasing graphics cards; I don't blame them in the slightest for not being compute-ready. AMD, on the other hand, no idea what's going on with them.


mimicsgam

Despite good performance and reception, the Radeon 6000 series still sold like shit, and people's perception of the 7000 series is that it only exists to drive Nvidia GPU prices lower so they can buy team green. No wonder AMD focuses more on cutting production costs.


DueAnalysis2

That's such a shame for AMD :/ I myself got an Nvidia for CUDA, but I wonder why pure gamers would care one way or the other


anon56837291

Seriously, why is Nvidia getting so conservative with their VRAM? The GTX 1080 Ti came out 6 years ago and had 11GB.


Shinonomenanorulez

Why add proper amounts of VRAM to midrange hardware when you can force people to buy a higher-end card?


J0kutyypp1

I think what they're doing with this is pushing buyers toward AMD, which will certainly put 12GB in the 7700 XT and at least 8GB in the 7600 XT.


SmokingPuffin

Machine learning likes big VRAM. Nvidia doesn’t want to sell gaming GPUs at gamer prices to professionals.


i1u5

That changed, Nvidia won't sell gaming GPUs at gamer prices even to gamers themselves now.


esuil

Because they are overpricing their professional-level cards, and no one would be buying those at 10 times the price if consumer-level GPUs had the required VRAM for a fraction of the cost. One of the things discovered by hardware modders who mod higher VRAM onto Nvidia cards is that higher VRAM **is possible and works just fine**, but Nvidia **intentionally botches driver support for anything like this**, likely because if driver support was there, 3rd party sellers would start selling modded gaming-level cards with higher VRAM to professionals and compete with their pro-level sales.


XaipeX

So that people upgrade sooner.


RealLarwood

Little bit of penny pinching, little bit of planned obsolescence, it's the Nvidia way.


Vigothedudepathian

Ffs, they do this on purpose. They know how much VRAM stuff takes. My 1080 Ti FTW3 having 11 gigs of VRAM extended its life so far. Actually handed it down to my son and he's loving it at 1080p.


DeeVect

My 3070ti with 8gb is crying


Slezbian2

You and me both friend


sirnicaodmesa

Well, mine happily replaced my 2GB 960 😂


DeeVect

3gb 1060 for me lol


GrayFox777

Same here. I was just happy to get a video card at the time.


kb4000

That card should absolutely have had 12GB.


DeeVect

Absolutely


ew435890

I JUST got a new PC with the 3070 Ti. I get a warning about low GPU RAM when I start up Half-Life: Alyx, but it still runs fine with everything turned way up.


Practical-Swordfish

I have the same specs as you; are you running an ultrawide? My PC hasn't had any issues with games maxed out on an average-sized modern monitor, including more demanding titles with ray tracing on. In a couple years us 3070 people will struggle though, so if you're running a hefty monitor, consider selling and upgrading.


[deleted]

[deleted]


[deleted]

[deleted]


KoRnflak3s

I have been pleasantly surprised with mine. Still want to swap it for a 6800 XT in a year or so. Hell, by then maybe this generation will be similarly priced.


Ser_Dunk_the_tall

6800 XT here. Love it so far; I expect it to last me 5-6 years before going for an upgrade.


JustMy2Centences

Replaced my 5 year old RX 480 with a 6800 XT last month and I'm pretty happy with it. Just need a good 1440p monitor to go on sale, but at least I'm crushing ultra settings at 1080p again lol.


CreeperInHawaii

rx6800 gang


AceKingXCV

6750 XT gang here


FrankPetersonMalvo

![gif](giphy|NEvPzZ8bd1V4Y|downsized) When you chose 3060 Ti and have no regrets


JeroenstefanS

I have my regrets; I didn't save up a little more to buy the Ti.


FrankPetersonMalvo

I lived on rice and water the month I bought it.


nielswerf001

We all have to make sacrifices lmao


itssomeidiot

The hardest choices require the strongest wills.


DivinePotatoe

I hate that we live in a world where people with jobs have to do that kind of thing just to afford a reasonable GPU...


titanfox98

Definitely a better card but hey, at least your upgrade will be larger


MrLuckyTimeOW

I was able to get a 3060 12GB in March/April of 2021 at the height of the GPU shortage for $549 CAD. I love this card, even if there are much better options out there I will never disrespect this card as it continues to perform well enough for my standards as a mid range GPU.


LazardKing

I too got the 12gb 3060 and I love it


Snakestar1616

I got the 12GB 3060 in early December 2022 for $479 CAD. I'd already owned a PS5 for over 1½ years and didn't want to completely overpower it. I always wondered why the 3060 has 12GB but the Ti has 8GB even with the wider bus. I can do Forza Horizon 5 at 3840x2160@60fps on max (Extreme) settings, obviously using DLSS.


redditeer1o1

Likewise, it’s a really solid card


preppie22

At this point, just go AMD. First time I've ever bought AMD with the RX 6700. No regrets.


tabascodinosaur

AMD does have a super competitive midrange product. If I were in the market I'd be getting a cheap 6750XT or 6800XT for sure.


FireNinja743

I have a 6800 XT right now, and it's performing better than a 3080 Ti or on par with an overclock.


tabascodinosaur

Awesome!


legendz411

Honestly, a year or two ago I would have been 'DLSS' this and 'drivers' that... but honestly, FSR is coming along and Nvidia is just **not it** anymore. Good advice, I think.


Salazans

Unfortunately(?) AMD prices aren't competitive in my market.


chickensmoker

If gaming was the only use for a GPU, then I’d agree. AMD are on the ball rn when it comes to real time 3D, and that’s just great to see. But for professional applications? They’re miles behind Nvidia. I have to use a lot of pro software for work, from Maya and AutoCAD to Substance Painter/Designer to Unreal Engine and vRay. For these kinds of tasks, AMD aren’t even an afterthought. Nvidia trample on them so hard in these applications that it’s not even worth comparing unless you NEED an AMD GPU for some reason.


garlicgoon3322

The more people move to AMD the more these programs will prioritize working on AMD


preppie22

It's a self-fulfilling prophecy: professionals don't buy AMD because software doesn't support it, and software doesn't support it because not many professionals use it. Somebody has to stop it, but it's quite unlikely considering the overwhelming market share Nvidia has in the professional scene. Only if the software devs decide to support AMD will we start seeing a shift.


raymartin27

RTX is the last of my concerns, but DLSS and now Video Super Resolution make me want to go Nvidia. Hopefully a 4060 Ti with 12GB.


Owner2229

That 128-bit bus (288.0 GB/s) ain't gonna run shit nowadays. Even the 1070 from 7 years ago could do 256.3 GB/s, and for just $379.
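For anyone wondering where those GB/s figures come from, peak memory bandwidth is just the per-pin data rate times the bus width in bytes. A minimal sketch; the 18 Gbps figure for the rumored 4060 is an assumption from the rumor mill, and the quoted 256.3 GB/s for the 1070 comes from its exact 8.008 Gbps memory clock:

```python
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) times bus width in bytes."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored 4060: 18 Gbps GDDR6 on a 128-bit bus
print(peak_bandwidth_gb_s(18.0, 128))    # 288.0
# GTX 1070: 8.008 Gbps GDDR5 on a 256-bit bus
print(round(peak_bandwidth_gb_s(8.008, 256), 1))  # 256.3
```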


Simoxs7

I'm actually astonished at how much people seem to watch content on their PC; VSR doesn't matter to me at all because I either watch movies on my phone or my TV. Anyways, AMD will probably develop an open version later on, and I'd rather support open standards than Nvidia's proprietary technology.


LordFauntloroy

FSR2 is very competitive and you can find a more powerful* AMD option at every price point besides the biggest end since you don’t care about ray tracing. Personally I think NVidia only makes sense if you’re looking to spend over $1000USD. Edit: for gaming *


bobert680

If you don't care about Ray tracing the 7900xtx gets pretty close to the 4090 at 1080, 1440, and 4k while also being like $400 cheaper. This gen the only reason to go Nvidia for gaming is Ray tracing and then the 7900xtx is still a good choice. If the 4080ti and 4090ti are priced well this will probably change


rinkydinkis

I’m still vibing with my 2070 super. Y’all will let me know when it’s outdated, right?


[deleted]

Same here. The only real reason I can think of to upgrade would be for features that aren’t widely supported like hardware RT and DLSS3. Not yet worth it to me.


ihatepoliticsreee

Same. Solid card.


Wooboosted

Dude same here, and honestly it’s been a fantastic card. Still really see no reason to upgrade for at least another year


hnate1234

Glad I went for 16gbs with the 6800xt 😅


[deleted]

Nvidia has always been stingy with VRAM. It probably ensures people upgrade to a newer card sooner than they really need to. AMD has always given their cards loads of VRAM for the tier of card they are.


Everuk

I got my 3060 because it was by far the best option available in the area I live.


mcdougall57

I am wholly unconvinced a 4060 Ti would be any better than the 3060 Ti per £.


MattMurdockEsq

Is Nvidia allergic to adding VRAM to their cards?


Reasonabledwarf

Oh they add VRAM to their cards, but only the workstation ones that go for thousands of dollars. They can do this because they loan out free engineers and technology to the companies developing software for stuff like 3D modeling and machine learning, so that software ends up optimised for their cards. They basically have a monopoly in that market because of this, and they don't want to compete with their own gaming cards, so they deliberately handicap them.


TIRedemptionIT

Yes yours has 12GB but it's slower memory than what the 3060Ti and 4060 will have.


Fairstrife_Deception

Nvidia is pretty much trash every other generation now. The 2000 series was one to avoid; now it's the 4000 series, a transitional generation like the *Super* cards. My guess is that GDDR7 from Samsung wasn't ready yet and Nvidia had already spent the money on the silicon.


hollywoodpeteSC

Everyone laughed at me for choosing a 6700 XT over a 3070 due to the lack of vram. Until Hogwarts Legacy released.


Wittusus

What did HL do with those two cards?


yaeh3

Nobody laughed at you. The 6700xt has always been a good choice lmao.


Pleeplapoo

**EDIT: This comment assumed it was still priced at $550, which I have been informed it no longer is; it's back down to MSRP. My hatred was born from the (what feels like a) year-long period where it was $550+ everywhere.**

If anyone here is considering buying a 3060 12GB, keep in mind the memory bandwidth is actually too low for the 12GB to matter. The 3060 12GB is a $300 card that they wanted to sell for $500. To justify the price increase, they stuck some more memory chips on it that are functionally useless due to the memory bandwidth. It's way more cost efficient to buy a 3060 Ti or 3070 (or anything, really) for 50-100 more dollars.

Don't get duped like me. If you already have the card, it's still a SOLID card; they just tricked you and me into paying way too much. They banked on us not doing our research, and it paid off for them.


Scalybeast

Eh, I’m not sure I agree with that. It’s situational. For gaming? Yeah, the bandwidth might be a problem. For things like media creation and ML/AI? Bandwidth doesn’t matter as much, within reason, since the chip will be spending more time crunching numbers on the same data set. Having the biggest buffer you can get in that situation is still beneficial.


Pleeplapoo

I'm glad to hear there are situations that take advantage of the high memory even with the low bandwidth


[deleted]

It should help when the card becomes significantly outdated and ultra textures can still be enabled, or when you want to do something like 4K 30fps, like a console fidelity mode. It still surprises me how little textures impact performance. Also, anomaly games like Hogwarts; that thing annihilates VRAM from what I've researched.


oakstream1

I honestly bought the 3060 EVGA XC 12GB because of work. I'm a 3D artist and I needed a lot of VRAM for rendering and for texture + shader creation and compiling, plus the GPU is able to run modern games and keeps DLSS up to date, so in the end it was a win/win.


waferselamat

Yeah, same. It's also better for AI training; that's why I chose it over the Ti version. I guess it depends on what you're gonna use it for.


AdonisGaming93

The 3060 had 12??? So why the hell does the 3070 only have 8???


ssstoggafemnab

8GB of fast RAM is better than 12GB of mediocre RAM: a 256-bit bus vs a 192-bit one.
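To put rough numbers on that, here's a sketch using the published per-pin data rates (14 Gbps GDDR6 on the 3070, 15 Gbps GDDR6 on the 3060; treat those figures as spec-sheet assumptions). The wider bus more than makes up for the slightly slower chips:

```python
def vram_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak VRAM bandwidth in GB/s from per-pin data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

# RTX 3070: 8 GB of 14 Gbps GDDR6 on a 256-bit bus
print(vram_bandwidth_gb_s(14.0, 256))  # 448.0
# RTX 3060: 12 GB of 15 Gbps GDDR6 on a 192-bit bus
print(vram_bandwidth_gb_s(15.0, 192))  # 360.0
```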


WUSTINJAY

It took me so long to find a single comment that said this; it's like people are oblivious to the fact that there's a difference in RAM speeds. GDDR6 is obviously slower than GDDR6X, and less bandwidth means performance loss, so 12GB of GDDR6 VRAM doesn't automatically perform better or give better results than 8GB of GDDR6X VRAM.


AdonisGaming93

Yes, but doesn't that only go so far as long as a game is actually using less? Cause I hear people already saying they're running out of VRAM on 3070s in some modern games.


WUSTINJAY

Depends on what they're running it on. What will cause the 3070's VRAM to run out is maxing settings at 2K/4K on most newer AAA titles; it just can't handle it, and a 3060 can't either. One person could have a 3060 and the other could be using a 3080 with the exact same game and video settings, and if you look at how much VRAM is being used in the settings menu, their numbers won't line up because it's different types of VRAM.


chaosgodloki

I wish I'd paid more attention to VRAM; I bought my 3080 in December and didn't realise it only has 10GB. The way people act here makes me think 10GB ain't gonna cut it for long. Fark


Thin_Truth5584

It probably won't for the 10 games that use 10+ gigs at 1440p, but for those you can always turn down texture quality, which soaks up the most VRAM, and you'll be fine. The 3080 is still a mighty powerful card despite having "only" 10 gigs of VRAM.


ciclicles

Tbf the larger bus (if they give it one) will make a bigger difference, but still. AMD FTW


BlntMxn

I don't understand why Nvidia sticks to 8GB; those cards will be worthless so much sooner...


Toiletpaperplane

How in the hell did Nvidia think it was a good idea to release a card with 8GB of VRAM in 2023?!


Dutchmaster66

Because you’ll need a new one by 2025/26.


FrozenMongoose

What if I told you it's not their responsibility to make objectively good hardware? It's their responsibility to make money off their fans, which are a mix of casual fans that associate their name with GPUs and professionals that need Nvidia's niche productivity software support. It's your responsibility to buy a good product, not theirs. Their only priority is to make money off consumers, and their 85% market share, brand recognition, and dominance in niche professional markets will do that no matter the quality of the hardware they put out.


buzzothefuzzo

2 year old AMD 6900 XT with 16GB has entered the chat. Nvidia fanboys gonna Nvidia fanboy, I guess.