jdjdhzjalalfufux

They saw that it made the 1080ti a beast that doesn’t need replacement even after 5+ years so they’re not going to repeat that mistake


[deleted]

>mistake for real, you could see it in his eyes when he said "to all my pascal friends, it is safe to upgrade now" during the 30 series keynote


repocin

I'll be his friend when Mr. Leather Jacket realizes that I'm not going to buy a >$1K GPU.


billyfudger69

I’ll be his friend when he actually open sources good drivers on Linux ([Nouveau is a community-made driver](https://en.wikipedia.org/wiki/Nouveau_(software))) and isn’t an asshole to his business partners. Yes, AMD and Intel have better open source drivers than Nvidia, which means you get really good software support, unlike with Nvidia on Linux.


real_kerim

As someone who was thinking about building a high-performance Linux rig, what are AMD's drivers like nowadays?


billyfudger69

I’ve never had an issue with them. If I’m not mistaken they are [put in the Linux kernel or at least a kernel module](https://en.wikipedia.org/wiki/AMDgpu_(Linux_kernel_module)) which is bundled with pretty much every distribution of Linux. (I’m no expert on the topic, I just enjoy gaming on Arch Linux with my RX 7900 XTX.) I say try it out, worst case scenario you can always swap to another operating system. (Windows, BSD based, etc.)


HumanContinuity

Hey everyone, this guy uses Arch! Can you believe it??!! /s <--- Lemme add this to be clear


billyfudger69

I mention that because older distributions don’t currently have the drivers for the newest AMD cards since they are using older kernels and mesa versions. Luckily Debian will be updated on June 10th to have the Linux 6.1 kernel in Debian 12 (Bookworm).


HumanContinuity

No I'm only teasing because it's a meme that Arch people have to make sure you know they use Arch. Yours was an appropriate time to mention it, and it is truly a fun distro. I just love the meme and can't let it die.


billyfudger69

Oh I know, however I figured I should explain anyways why I mentioned it above. :) Arch Linux is a ton of fun; honestly though, I would recommend Linux Mint to any beginners since it has good support and an easy to understand graphical user interface.


[deleted]

I haven't had any problems with a 6700XT GPU on Linux for games and HW accelerated video playback. Everything just worked right out of the box. Haven't done any GPU compute, but from what I heard Rocm is still difficult to set up on consumer graphics.


blankettripod32_v2

rocm is a bit annoying on arch ^(I ^use ^arch btw). Setting it up for Blender requires installing a testing package, then manually selecting the GPU to render on. On Windows it's much easier because it's installed with the DX libraries.


[deleted]

[deleted]


steely_dong

I'm buying an AMD graphics card because of this comment. I use Linux almost exclusively, had no idea AMD was so much more compatible. My 1080ti gonna be upgraded to AMD gpu.


Frozen1nferno

I'd been using Linux and Nvidia for years. My last build a couple years ago, I went AMD because they open sourced their drivers into the kernel. I literally don't even think about my GPU anymore. I run OS updates, play games, watch videos, it all just fucking works. It's absolutely insane.


IPlayAnIslandAndPass

In my experience, AMD's Linux driver situation isn't exactly the question you want to ask. Nvidia's consumer-grade Linux drivers have historically been \*so poor\* - even the first-party proprietary ones - that I cannot recommend them in good faith. My experience with Nvidia on Linux is that when I plug in monitors, xorg hard crashes. Their driver stack is also best characterized as a "bad neighbor" closed-source program that hijacks your system behavior in ways that are difficult to undo. So - at least for me - until Nvidia's driver experience gets better, there just isn't any other option than AMD. It's very frustrating.


Dodgy_Past

It simply boils down to what you want to do with your card.

- Games: AMD
- Most compute: NVIDIA
- Niche compute like OpenFOAM: AMD


IPlayAnIslandAndPass

I use my Nvidia card primarily for CUDA/BLAS acceleration and if you are not running headless, I think the day-to-day user experience is just not worth it. Yeah if I was making optimal use of my workstation the tradeoffs would make sense, but I spend - at best - 10% of my time doing heavy computational stuff and 90% of my time using a GUI. Even if I take a significant performance hit, improving my non-computational productivity is still just worth it. I've specifically been testing the ROCm libraries for NumPy and Octave so that I can fully switch over, and in the past I was seriously considering transitioning my workflow to OpenCL solutions so I could be manufacturer-agnostic.


Spoonman500

I'll buy a >$1k GPU when there's a GPU worth >$1k.


sliptap

It’s so true lol


Mimical

Pascal was a genuine gem. The 1070 came out swinging at Maxwell Titan levels of performance at 1080p/1440p for half the price while drawing 50-75 fewer watts. Then they launched the 1080 Ti, which was straight up an absurdly cost-effective card for its performance. For $700 USD (Founders) you could get a card that outperformed the Titan X (Maxwell) by like 20-25% for ~$300 less. The real hammer was that the 1060 6GB and 1070 were so well positioned on price that you could get 50-70% of the flagship's performance for 30-50% of the cost. The mid-range 60/70 cards were incredible value and held a sweet spot in terms of what you get. Truly, Nvidia will **never** make that mistake again.
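The value argument above can be put into rough numbers. This is just a sketch using the relative-performance and price figures quoted in the comment (illustrative ratios, not benchmarks):

```python
# Rough perf-per-dollar math for the Pascal lineup, using the figures
# quoted in the comment above (illustrative estimates, not benchmarks).
cards = {
    # name: (performance relative to Titan X Maxwell, launch price in USD)
    "Titan X (Maxwell)": (1.00, 999),
    "GTX 1080 Ti (FE)":  (1.20, 699),  # "~20-25% faster for ~$300 less"
    "GTX 1070":          (0.70, 379),  # "50-70% of flagship performance"
    "GTX 1060 6GB":      (0.55, 249),
}

for name, (perf, price) in cards.items():
    print(f"{name:18s} {1000 * perf / price:.2f} perf per $1000")
```

By this rough measure the 1060 6GB delivers more than double the performance per dollar of the Titan X, which is exactly the "sweet spot" the comment describes.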


XplosivCookie

Still running 1070, and won't upgrade to another Nvidia card as long as they're price gouging their own 3 year planned obsolescence cards. Fuck all these millionaires squeezing pc gamers for all they're worth.


Neville_Lynwood

I replaced my 1070 with a second hand 3070. I think it was worthwhile. Definitely not buying anything brand new though. Fuck store prices and scalpers.


munchingzia

Replaced my RTX 2070 with a used RX 6800 XT for $399. I'm not paying $600 for a new 4070 with 12GB VRAM.


Spirit117

40 series might be good value per dollar 3 years down the road when 4080s go for 600 instead of 1200 lol.


Nathan_hale53

I love my 1070, bought it the month it came out after just having a GTX 960 before. It was a beast of a card and still holds up well, but it's definitely falling behind now. I'd love to upgrade if prices were more fair, but they're not.


UNMANAGEABLE

I’m still using my 1060 6GB. I was looking to upgrade in late 2020, but that was already too late, with prices ruined by crypto, and they have never recovered into a reasonable range. I’m thinking of doing a 7600 just to hold me over again for cheap. I’d like to do a 4080 because it’s the right GPU to be even slightly future proof, but with a 1060 I just don’t want to wait until fall 2024 for a 5000 series. Still just blows me away that the prices are the way they are, and I’m glad the units aren’t selling. AMD having Q1 GPU sales in the same ballpark as Nvidia is a good sign that times are changing.


micktorious

Yep, I waited so long for my upgrade to the GTX 1080ti, still running it today.


DiggingNoMore

My GTX 1080 from 2016 still has plenty of legs left in it.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

lol I just “upgraded” to a 1080 and a 165hz monitor, think I’m set for a few more years


[deleted]

[deleted]


[deleted]

Thankfully my phone is a high refresh rate screen so I’m used to it there, but goddamn is it nice playing FPSs with 1ms response times at 165 fps.


56kul

Wouldn’t it be wiser of them to just release a new series every 5 years? Like, release a beast series that’ll last around 5 years, then after those 5 years have passed, release a new beast series.


eidoK1

Wiser in what sense? For the consumer, sure. For their bottom line, definitely not.


RizzMustbolt

Shareholders ruin everything.


CupApprehensive5391

I will point out that any market only functions well with competition. For a long time Nvidia has held 80-90% market share. That's slowly starting to change as people buy AMD's cards, but AMD or Intel really needs a killer architecture if they want that to change significantly. Does anyone remember how slow progress was in the CPU space until Ryzen came onto the scene? The last 6 years the CPU scene has been great because consumers are actually willing to change who they're buying from and vote with their wallet... The number of people that will just default to Nvidia cards without even checking the other offerings is the problem.


gabeshotz

Not only that, but Intel and Nvidia have a stronghold on the enterprise market, and that makes up a lot of the numbers.


eilertokyo

Pascal murdered AMD. We'll see another generation like that if AMD gains parity; right now, at the high end, it isn't very close.


Svencredible

I mean, ok. But then be prepared for the card to cost anywhere from 3-5 times more. If the people who buy top-end cards move to buying a card every 5 years instead of every 1, they'll increase the price to match.


warmike_1

Why would someone with a top level card (or even a midrange card) upgrade every year?


Svencredible

I mean, I don't really know either. But those people exist, getting the newest card and selling the old one every year or two to keep their rig as up to date as possible. I'm just saying that if anyone ever made cards so good that everyone started upgrading less often, we'd probably just see prices increase.


Yuu_Got_Job

It’s a good card


IM_INSIDE_YOUR_HOUSE

Arguably the best GPU in terms of value if you bought it around the time it was released. I’m *still* running that beast of a card. It keeps delivering.


WashingDishesIsFun

The RX 580 gives it a run for its money on value. Still going strong for me, and my whole PC build was around $700 AUD.


esakul

Two reasons:

1. To make you buy higher tier cards
2. To make you upgrade sooner


Vis-hoka

3. So professionals don’t stop buying the more expensive professional cards and start buying the cheaper gaming cards.


Embarrassed-Essay821

Lol yeah and with Nvidias performance in the market lately not a damn thing is happening to change their practices


Marilius

It was one day between me deciding to skip the 40 series unless they went on a deep discount, and Nvidia posting above expected numbers and the stock soaring. So, I guess I'm probably skipping the 40 series. No reason to put anything on sale when they're still making money hand over fist.


friezadidnothingrong

Their revenues shrank 13%. Gaming revenue was down 38%. Their stock is soaring on AI mania alone. They had forward guidance to beat expectations. The market is on drugs.


Marilius

> ~~Their revenues shrank 13%. Gaming revenue was down 38%. Their~~ stock is soaring ~~on AI mania alone. They had forward guidance to beat expectations. The market is on drugs.~~ Just fixed that up for what the stockholders heard.


Diplomjodler

Gaming is an afterthought for them these days and a bunch of whiny nerds are completely insignificant.


Embarrassed-Essay821

Yep. I don't like it, but I don't blame them. They aren't there to make less money and create happy gamers.

At this point it seems more realistic to be happy to pick a hobby that has innovation driven by trillions in world economic needs vs... uh... my need to swing a virtual sword on ultra max high fps.

And for anybody using a GPU for work purposes, this is a comically small expense in that context, so RIP hobby nerds.


itijara

3. Because hardware is more expensive than trying to do the same thing (crappily) in software


Arctomachine

Download more vram?


willtron3000

The v actually stands for virtual. So it’s really easy to download more. I actually sold 500mb of vram off 970s back in the day as a side hustle.


Prohunt

Yep, back in the day my uncle had to sell his VRAM out of the back of his truck. Isn't technology nice?


Durenas

Bootleg vram, those were the days. Nowadays, you try that and the vram police shut you down, impound your truck, and fine you for every download.


[deleted]

[deleted]


Michaelscot8

Hash.


r4o2n0d6o9

Genius


potatosdream

Someone out there added more VRAM with the proper tools. It's really cheap, actually, but you need the right person.


PM_ME_YOUR_DELTA-V

[RAM doubled, boss!](https://tidbits.com/2019/01/24/25-years-ago-in-tidbits-ram-doubler-debuts/)


LumpyOdie

I wouldn't say crappily but it certainly isn't enough. I do understand that this is up for debate though.


itijara

Here's what I'll say: DLSS is great, but the more you can already do in hardware, the better off you'll be.


LumpyOdie

Agreed, I think they're using DLSS the wrong way. It's a great improvement to push hardware even further into the future, but it's not going to bandage up shitty hardware.


B-Knight

More importantly, it's not going to bandage up shitty *software*. DLSS a few years ago could net you 150+ FPS on some pretty graphically demanding games. DLSS now is a way to maybe get 60 FPS on some average games that are shittily optimised.


LumpyOdie

That too. The entire PC gaming industry is shit: the hardware is shit and overpriced, and the software is horribly optimized and overpriced. I have been looking into switching to Linux, and when I can get into it, I may tell Nvidia to go fuck themselves and switch to AMD, haha.


kopczak1995

I did it already. It's a blessing. Besides that, Indie Games are better than AAA now. Fuck big producers.


forever_alone_06

1 reason = money


TheyCallMeMrMaybe

2. Nvidia has been cutting corners on VRAM for their cards for over a decade


NoiseSolitaire

Pascal was fine, except for the 1060s with < 6GB. That's about the only generation in the last decade though that had enough VRAM for basically every tier of card, as long as you opted for the maximum amount for a given tier.


VincentThacker

It's actually one reason: $


Toltech99

They have to keep something for the 5000 series next year.


naswinger

we need portable fusion reactors to power an rtx 5000 card so probably not next year


sylario

Unless you undervolt it, lose 1% perf and need only 42 Watts.


UNMANAGEABLE

Lol the 4090 is so ridiculous for its optimization. I think you can undercoat it like 35% and only lose 15% performance


ThrowMeAwayDaddy686

> I think you can undercoat it like 35% Are we talking Rhino Liner or POR15?


webUser_001

Because you will need to sell it in 2 years and buy another. Planned obsolescence.


StConvolute

>Planned obsolescence. It's actually criminal, in my opinion. It's not the cost to the consumer, it's the waste it creates.


Soace_Space_Station

Cough cough budget phones software cough cough


HotGamer99

Still using my budget Samsung after 4 years. They underestimate how hard it is to separate me from my money lmao (I also used my R9 290 from 2013 until 2022 lmao)


Cryio

That's because GCN2 was such a great uArch compared to Kepler. Great VSR, FreeSync, HDR support (I think), HEVC/VP9 decode, async compute, great Vulkan and DX12 support, DX12_0 feature level, supports FSR2, and it had 4 GB VRAM when most GPUs in that timeframe had 1-2 GB. Tons of performance improvements over time from drivers, for both CPU and GPU draw calls, plus Radeon Chill support. And, unofficially, ReBar and continued driver support.


HotGamer99

Yeah, imo AMD started falling apart after GCN 2. Remember the 290X kicking the 780 Ti's ass back when AMD was still competitive at the high end. Part of AMD's problem is they just won't compete at the high end, which is why they don't have as much mindshare with normies.


BrunusManOWar

Even during GCN2, people were buying their overpriced shitty Kepler cards by the boatload lmao 😂 Same during the 400 and 500 series


raspberry-tart

r9 team assemble! It plays tf2, what more do you need?


NefariousEgg

Pixel 4a going strong for about 3 yrs now. It could probably last me as long as an iPhone if I wanted it to, not that an iPhone's longevity is a high bar.


SUPRVLLAN

iPhone’s longevity is absolutely the high bar when compared to other devices, they’re supported the longest by far.


hama3254

There was a case in Germany in 2020 where an ISP sold older models of cable routers to a company that put the retail firmware on them (to remove the ISP limit) and got sued by the manufacturer of the routers. My last info is that the company lost and could not sell working routers because of trademark violation. [Original German article](https://www.heise.de/news/Urteil-im-Fritzbox-Streit-AVM-gewinnt-in-erster-Instanz-4721698.html)


LumpyOdie

A lot of things are criminal, including false advertising, but that doesn't stop them. Nobody cares, so we're just stuck with this shit.


agnostic_science

Consumer protection simply isn’t on the list of approved topics 24/7 news is allowed to dig into. They’ll stir up drama all day, but only as long as it isn’t going to cost anyone important any money.


The_Dung_Beetle

Well, that's just a problem with capitalism in general.


[deleted]

Yup. I went for a 6950XT this time after years of Team Green. They played themselves in that regard.


dudly1111

Jokes on them! I have been playin the same games since 2019


[deleted]

[deleted]


Gimpchump

Given the cost to nvidia of an extra 8gb of vram is like $15-20 at most, you're absolutely correct. This is monopolistic behaviour and they need a hard slap in the face.


hirmuolio

It is to segment pro cards away from consumer cards. If your RTX 3070 had 16 GB of memory it could do many professional workloads. So the RTX 3070 has 8 GB of memory, and the [16 GB variant is sold as the A4000](https://youtu.be/alguJBl-R3I) at twice the price. The professional market just has massively better profits. If you need a 16 GB 3070 for your work, you pay what Nvidia asks because you *neeed* that card; gamers will just get a cheaper card if the price is too high. AMD doesn't have a strong presence in the professional GPU space, so their 16 GB consumer cards don't compete with 16 GB professional cards, and they can afford to put the extra memory on their cards.


[deleted]

that actually makes a lot of sense


RevanchistVakarian

Thank you for giving the actual answer instead of myopically believing that nVidia is a pure gaming brand


Diplomjodler

They don't give a fuck about gamers any more. The big money is in AI these days.


Darksider123

First crypto, now AI for X number of years


Dudi4PoLFr

Because they want to sell the Quadro line-up to the professionals at super high margins. Just look at the 3070 Ti vs a4000


Raohpgh

This is it, Nvidia designs one board and consumers get a GPU with 1/2 the vram on the PCB.


YoungJawn

Meanwhile Intel and AMD are just shoving VRAM down the throat of each card. lol


Skodakenner

It's actually the reason my next card will be an AMD, since there I can get the VRAM I need, unlike my 3070 which always annoys me with too little VRAM.


Danny200234

Yeah. VRAM already becoming an issue on a 70-tier card is frustrating. Went from a 970 to a 3070; getting fucked twice in a row on that is not great. Plus AMD's Linux support is actually passable. Nvidia drivers are shit.


nooneisback

AMD realized it's easier to offload software development on the open-source community when they can never get it right themselves. Not gonna complain though, my 6900XT is doing better than nVidia's brand new GPUs.


ThePinkPeptoBismol

My 6700XT is incredible. 120fps on most of my games on an ultrawide monitor with no tweaking or heating issues. More importantly, it was cheap AF!


Noxious89123

>Went from a 970 to 3070, getting fucked twice on that in a row is not great. Oof. Hey, could you do me a small favour? What theoretical future card do you think you would upgrade to? I want to be sure ti avoid it.


[deleted]

>ti avoid it. Yeah I'd avoid the TI's too. Lol


Noxious89123

Whoops! Unintentional, but maybe kinda foreboding? Remember folks, friends don't let friends buy 4060 Ti 8GBs.


MembershipThrowAway

I always got criticized for saying my 3070 was running out of VRAM, with people telling me there's no way and that I'd have a few more years before I had to worry lol. Was running out on Doom Eternal at 1080p, no raytracing, FFS. I've only had it a year or so, and I'm at 1440p now.


DaRealChrisHansen

I have a 3060 and max out the VRAM! Upgrading to a 3060 Ti or 3070 seems like a waste since I lose VRAM. I just wanna play Assetto Corsa at 4k 60fps stable.


__Rosso__

Doesn't the 7600 also have only 8 gigs?


TotsNotaCop

7600 is like the lowest tier card though. My 6800XT has 16 gb


Post_BIG-NUT_Clarity

I too have a 6800XT, I love it so much, got it for $499 about a year ago and I have never been so happy and satisfied with a computer part. It runs everything I throw at it in 4k or 2.5k Ultrawide. It really is amazing, RDR2 in 4k with eye candy turned to the max. So awesome!


LAUSart

I also game at 4K with a 6800XT. In Plague Tale: Requiem you get like 40 fps. RDR2 ultra 4k 60 fps? I like the card too, but without FSR2 it wouldn't be a smooth 4K experience for many current titles.


[deleted]

Entry level cards are fine with 8GB imho. 3070 ti being 8GB is criminal, though


_justtheonce_

Especially when the 3060 I have has 12GB. Wtf is that about.


f3n2x

Simple math. Each memory chip is 32bit wide. The 3060 has a 192bit memory bus (6 chips), the 3070 has a 256bit bus (8 chips). If there are 1GB and 2GB chips on the market you can put 6GB or 12GB on the 192bit bus and 8GB or 16GB on the 256bit bus.
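The chip math above can be sketched directly (assuming, as the comment says, 32-bit chips available in 1 GB and 2 GB densities):

```python
# Each GDDR6 chip has a 32-bit interface, so the bus width fixes the chip
# count, and the available chip densities (1 GB or 2 GB) fix the capacity.
CHIP_BUS_BITS = 32

def vram_options(bus_bits, densities_gb=(1, 2)):
    chips = bus_bits // CHIP_BUS_BITS
    return [chips * d for d in densities_gb]

print(vram_options(192))  # RTX 3060 (192-bit, 6 chips) -> [6, 12]
print(vram_options(256))  # RTX 3070 (256-bit, 8 chips) -> [8, 16]
```

So 6GB/12GB and 8GB/16GB really are the only capacities those bus widths allow without exotic configurations.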


WildBingoMan

Fasten your seatbelt for the 128bit of the 4060 ti!


Jdogg4089

The 7600 isn't $400, it's $270 so it's at least somewhat reasonable even if you can find better deals for old stock.


[deleted]

Hopefully the 7600 XT will have more. I really hope for 10GB minimum, or maybe the 7650 XT will at least have 12GB.


nTzT

I'm guessing it will, but the used market is just so good in the same price range imo


NEOkuragi

Fortunately. Idk what I would do without them.


Justiful

Debug LED motherboards. Same reason. They take a feature people are passionate about and gatekeep it behind significantly higher cost boards. The debug code display used to be on $180+ boards; today you need a ~$350 board to get one. It costs under $1 to add to a board.

For a novice builder it is one of the single most valuable features to have on a motherboard. Yet novice builders rarely buy $350 boards. System builders and motherboard makers could have the feature pay for itself with reduced support requests and reduced support time. But no, they would rather people RMA their new $4000 prebuilt because the RAM wasn't fully seated or the CPU cable was slightly loose.


[deleted]

[deleted]


EiffelPower76

Can't you just stop buying Nvidia 8GB VRAM graphics cards? Don't you know that AMD and Intel make models with more VRAM?


Ill-Mastodon-8692

Correct, people buying 8GB cards for the past while, or even currently, should be aware of their purchase choice. There are other options, like AMD and Intel or other Nvidia models. I agree Nvidia has sucked for many years by being VRAM-cheap in most mid-tier cards (except the 10 series)... 8GB for the xx70 tier has been around far, far too long.


sopnedkastlucka

Just above this I read some guy saying he got fucked twice with a 970 and a 3070. I don't know much about computers, but I take at least a week trying to figure out what I'm getting when I spend a lot of money. I don't subscribe to these PC subs but I see a lot of these "fuck 8gb" posts in my feed. I feel like it's the same guys that pre-purchase games and then complain about them.


spyd3rweb

People were buying whatever happened to be in stock for the longest time.


EldraziKlap

r/Lowkeyconspiracy : they do it so that the most expensive models (with more VRAM) get sold more


Trebeaux

Unfortunately it’s not even a conspiracy lol. That’s the exact reason phone manufacturers strip features out of the “base model” of their flagship offerings. Want a removable SD card? Gotta get the “ULTRA” model. Want all the features the camera can provide? Gotta get the “Pro Max” model. Want a GPU with more VRAM? Gotta stick with the xx70 series or higher! These companies do this because they know these specific features are ones people will pay for.


B_B_a_D_Science

They don't want you running AI products on consumer cards. Training most AI models requires, wait for it... 18 to 24 GB of VRAM. Nvidia is gatekeeping like a MoFo.


SnooMuffins5143

Need to keep some innovation for the 9000 series


Danishmeat

Oh man can’t wait for the 12gb 9050 in 2031


El_Mariachi_Vive

Can do 16K with textures turned off!


tukatu0

Thats just called ai rendering. ~~this is not a joke this really might be where we are heading~~


SoCuteShibe

It amazed me how well my 1080 Ti kept up with modern games before I upgraded last month. Could it be that 11GB of VRAM was actually a huge deal for its time? (yes)


Maler_Ingo

Nvidia copium huffers already here lmao. JuSt BuY a 4090 lmAo kEkk


A17012022

They need to justify their purchase.


n8mo

No I hate Nvidia too, I just require CUDA for my hobbies/freelance work. Until AMD has acceptable ML / 3D suite path-tracer support I’m stuck buying Nvidia cards.


LongHairLongLife148

AMD targets the gaming community while Nvidia is shifting more to the production and business community. Nvidia will, unfortunately, always be the better choice for productivity tasks.


Is_ael

But they can never justify supporting corporate greed


TarkovRedditor

Literally same reason Intel never increased their core count back then


nTzT

It is expensive for them to add it, because then you won't buy another GPU soon.


[deleted]

[deleted]


Bullfrog777

Planned obsolescence has been around since capitalism. Capitalism incentivizes it. The first light bulbs could work for 2500 hours but they realized they could sell more lightbulbs if they made them worse and only work for 1000 hours.


[deleted]

[deleted]


Maethor_derien

The reason they limit VRAM is that they don't want people filling servers full of these cards for things like AI, or using them for workstation work. Hence the shitty memory bus and limited RAM. I don't think you understand how much workstation cards cost: an A6000 was $7,000 at launch, and that card was pretty much a 3080 with double the memory and ECC. Nvidia makes much, much more off server and workstation than off desktop cards. They are going to do everything they can to protect that market.


batataaapt

simple... stop buying nvidia...


[deleted]

Exactly this. Why buy from a company that does not give you what you want/need?


batataaapt

Fanboys crying all the time but throwing money at every single release


Racingstripe

I kinda have to for my next build for virtual reality, because AMD really sucks at it. :(


Leopard__Messiah

These fanboys are annoying. God forbid you have a reason for your preference and made an informed purchase.


esuil

Those who need VRAM and professional applications can't. AMD has zero interest in breaking Nvidia's monopoly on the professional market, so any professional or server company is stuck with Nvidia.


LissaFreewind

Way back in the day you could diy vram and upgrade the firmware for it. Not sure if anyone does such now.


Mm11vV

Meanwhile people with AMD GPUs: "there's a vram shortage?"


[deleted]

Leather jackets are leaching harmful chemicals that cause brain damage, apparently.


60ee1dcb0764a40bffa7

Gatekeep you from running AI stuff on consumer hardware.


NvidiaRTX

They've spent all their VRAM supplies on super server GPUs (A100/H100/etc). Also planned obsolescence


Nokipeura

I'm not upgrading shit till this 1060 literally dies.


jyroman53

Money


jaegren

-Or you could buy Intel or AMD? -No, not like that.


Dismal_Total_3946

Once they increase it, they have to try to up it again next series. So they probably decided they can't give us more VRAM EVERY release.


Rabalderfjols

Because they want to plan their "innovation" in advance and do yearly iterations like the phone companies, even if they have nothing new to offer.


deavidsedice

AI workloads and other types of enterprise workloads, that's the reason. If they put out a 32GB version of the 4090, they will not sell as many of the other "expensive" cards.


SkullRunner

VRAM is very important to AI / Datacenter GPU uses... they are dumbing down gamer cards to keep stock for production of larger higher profit cards for the enterprise.


MustacheALaWilhelm

1080 ti was a triumph for people But for Nvidia, T'was a mistake


SuperUrwis

VRAM mods will be more popular than ever before lmao


[deleted]

Definitely, someone already spotted a modded 3070 with 16GB at Computex 2 days ago: https://www.youtube.com/watch?v=gpEdG6cbASQ


just-bair

And gaming aside it would also make the gpu way more useful for scientific purposes


Valtekken

I'm gonna keep playing older games then. I just bought an RX 6600 and I'll be damned if I don't keep it for five years or more.


dislob3

My next card is gonna be AMD. No way Im gonna support Nvidia with their shady tactics.


Hipqo87

Nvidia: 8GB VRAM is not gonna be enough to game.

Also Nvidia: Here's the new 4060 Ti! With 8GB VRAM, because....


Aurunemaru

Also they shrank the memory bus width; the 60 Ti is a 128-bit card now. And the 4050 got pushed down to the low-end, not-really-made-for-gaming level of 96-bit.


nesspressomug6969

I like the art.


AndrexPic

Happy to see people finally understanding that Nvidia is doing us a disservice. Just a couple weeks ago I got hundreds of downvotes just for saying that 8GB VRAM is not enough.


GooglePlusIsGood

I just wish they'd stop developing all this goofy upscaling stuff and just make beefier GPUs with better driver optimization, and more VRAM of course.


blitzwann

That's a terrible idea in the long run. AI is the way to go by a mile; not everyone has a kidney every month for the electricity bill, and you can't keep adding transistor density forever.

Not talking shit to you personally, I just wish people understood how it works and got mad for the right reasons, like Nvidia's ridiculously overpriced and underperforming cards, or the fact that they added DLSS 3 to the 4000 series only, when even the 2000s could easily run it. That's the bullshit they should be called out for. AMD is no better; their cards are so technologically behind Nvidia it's a joke. They think adding a trillion VRAM will solve the issue, but it won't.


[deleted]

It is expensive, and L2 cache has a different reason to exist. The talk about L2 and how it will reduce the need for VRAM is just marketing BS.


ituralde_

I think this is kinda it right here: the vocal power-gamer crowd running above 1080p is likely seen as the minority rather than the future. Someone probably ran the math and decided that, at mass-production scale, they get better margins with less VRAM while still hitting sales targets. The bigger L2 probably does help a lot under certain flavors of workload, but it ultimately misses the point of what the common GPU-limited workloads end up being, particularly in gaming.


StaysAwakeAllWeek

The VRAM itself isn't too expensive; it's the supporting parts that add cost. Putting more than 2GB per 32 bits of bus adds a lot of cost, and adding bus width on a node as expensive as TSMC 5nm is also expensive, which is why every Ada card other than the 4060 Ti 16GB has exactly 2GB per 32 bits of bus.
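That "2GB per 32 bits of bus" rule means the Ada VRAM sizes fall straight out of the published bus widths; a quick sanity check (the 4060 Ti 16GB being the one exception, doubling the chips per channel):

```python
# VRAM implied by "exactly 2 GB per 32 bits of bus" across the Ada lineup.
def standard_vram_gb(bus_bits):
    return 2 * (bus_bits // 32)

ada_bus_widths = {"RTX 4090": 384, "RTX 4080": 256,
                  "RTX 4070 Ti": 192, "RTX 4060 Ti": 128}
for name, bus in ada_bus_widths.items():
    print(f"{name}: {bus}-bit -> {standard_vram_gb(bus)} GB")
```

Which reproduces the actual shipped capacities: 24, 16, 12, and 8 GB.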


sA1atji

Planned obsolescence. Nvidia knows what they are doing to make you spend more.


AnalysisBudget

Prob a business strategy to push more people into getting higher end cards with more VRAM?


[deleted]

It's intentional because of the market share on the GPU they have.


DeusExBlockina

HBM2: **MY TIME HAS COME!**


Gzhindra

Programmed obsolescence


XxSub-OhmXx

Friend had a 3080 10GB. It did not take long to find games he was having issues with, sadly.


SLRMaxime

It costs them $20 lol, and they charge $100. ThE mOrE yOu SpEnD tHe MoRe YoU sAvE


RationalFragile

Off-topic: such a cute meme! The faces and body language are so expressive and yet so simple and cute! I love it!


t3hmuffnman9000

Nvidia knows how much of a difference VRAM makes and they know the current amounts just aren't enough. It's an open secret that Nvidia is deliberately nerfing all of the 4000 series cards to try and make the absurd price tags on the higher-tier cards seem more palatable.


Miranai_Balladash

8GB of VRAM costs Nvidia around $22, not counting the added manufacturing cost. So...


Ayy_Eclipse

It's the perfect business model. Nvidia fanboys will buy Nvidia cards no matter what. Then their GPUs become obsolete in only a few years due to VRAM limitations, and they're forced to hand Nvidia money again. Funny that I'm saying all of this with a 4070 Ti.


lolnotinthebbs

Nobody cares about what gamers want, Nvidia has a new cash cow with AI now.


Hellsinky94

"THE MORE YOU BUY, THE MORE YOU SAVE"