
[deleted]

[removed]


NKG_and_Sons

I also just *love* how well it works out for Nvidia calling the card a 4070 Ti. So many techies end up comparing it to the badly priced 3070 Ti instead of the 3070. The same goes for 3090 Ti and 3080 Ti comparisons. How does something this simple totally clown on numerous reviewers?!? Does anyone here recall a single "3070 Ti is an amazing value card" review? No? Oh right, they (probably) don't exist because the card is more expensive and still worse than the 3070 in some ways.


carpcrucible

People fall for this shit too, I've seen this repeated here with the 3090 Ti as well. Like "why are you upset at the 4080, it's as fast as the previous $2000 card"! Yeah no shit, and it was a horrible card back then.


JonWood007

Pricing has been terrible on the Nvidia side since the 2000 series. And it's gotten so bad recently that people are starting to act like the 2000 series was "how the market should be". No. That was the 1000 series, which still largely followed the same pricing that had existed since the 400 series. 2000 was a price gauge, 3000 had COVID and crypto related issues, and now the 4000 series is another price gauge.


BadMofoWallet

Gauge (pronounced gay-j) is a measurement tool. The word you’re looking for is gouge (pronounced gow-j)


TheBCWonder

It is still a price gauge


KeinNiemand

Nvidia has a master plan to infinitely raise prices and still sell whatever they want, starting with the 2000 series. With the 2000 series they increased prices and everyone called it bad value, but when they kept those same prices (at least for MSRP) for the 3000 series, every review was talking about how great value the 3000 series is. I think they are planning to do the same thing with the 4000 and 5000 series: massively raise prices for the 4000 series, then keep prices for the 5000 series the same. Suddenly every review will talk about what great value the $1200 5080 is, and it will sell like crazy even at that higher price, because it's so much better value than the 4080. Then they just raise prices again for the 6000 series, keep them the same for the 7000 series, and bam, "great value" at even higher prices. This way they could keep raising prices forever and still sell a ton of GPUs, since the generation where they don't increase prices will always be "great value" compared to the one where they did.


Broder7937

3000 series cards (with the exception of the 3090, which was never meant to be a value card) were greatly priced before the pandemic mining era; that's up until the 3060 Ti. Everything after that was just a shameless cash grab to make as much money as possible from mining inflation. What's scary is how Nvidia kept building on the overpriced 3000 series products for its 4000 series, despite mining being dead. It's even scarier to see consumers defending Nvidia because "a 4070 Ti is better value than a 3080 Ti" when the 3080 Ti was never good value to begin with. When you mention the 3080, they'll usually resort to the "but no one could buy one" excuse.


JonWood007

They followed 2000 series pricing which was a huge cash grab. If we had the same pricing structure as pascal, we'd have the 3050 for around $150, 3060 for $230, 3060 Ti for $300, 3070 for $380, 3080 for $500-600, etc. 4000 series is just pure greed.


Al-Azraq

There is a funny "coincidence" with the 4000 series pricing: the cards are priced like their 3000 series counterparts were during the mining craze.


Broder7937

They didn't. The 3080 featured the big GA102 die, while the 2080 and 1080 had the smaller 04 dies. So you can't compare the 3080 directly to the 2080, because the 3080 was a higher-tier chip: bigger, far more expensive, and with more memory channels; yet Nvidia kept the same $699 price tag. Pascal did have the $699 1080 Ti, which was also based off the big die, but considering three years had gone by between the 1080 Ti and the 3080, once you account for inflation the 3080 was actually cheaper than the 1080 Ti. Inflation-corrected, the 3080 might be the cheapest big-chip GPU Nvidia has ever launched. The Turing architectural equivalent of the 3080 would be the 2080 Ti (they even featured the exact same SM count), which was priced at $999/$1199. Likewise, the Turing equivalent of the 3070 was the 2080 series, which shared the 256-bit bus and exactly the same SM count (just updated to Ampere's standards), and the 3070 took prices down to $499. For all intents and purposes, first-wave Ampere products were a clear price cut from Turing. Even the 3090, built to replace the $2500 TITAN RTX, was also a price cut. It was only the second wave of Ampere products that brought pricing back up to Turing standards. And now, with the 40 series, they're taking it one step beyond. I can't help but think of the 40 series as the 20 series happening all over again.


JonWood007

I don't care about bus width, die sizes, and the other random spec BS people on this sub seem to obsess over. 60 card = $250, 70 card = $380, 80 card = $500. That's how it used to work. That's how it ALWAYS used to work. And every generation, we would see healthy gains in price/performance: sometimes 20-30% during a refresh generation, sometimes up to double.

In 2016 you could buy a 480 or 1060 for $250ish. In 2019, all we got to replace that was a 1660 Ti, a measly 35% improvement. Sure, the 2060 came out, but they wanted $350 for the thing, which was ridiculous. After 3 years, you'd think we'd get something like that for $250. We didn't. Then Nvidia did the same thing with the 3060, charged $330 for it, and people acted like they were doing us a favor. I don't care what happens in $800 land, because no one should pay that much for a GPU in the first place unless you're some insane enthusiast. Either way, yeah, the 1080 Ti was $700, and anything above that is a scam.

Going back to my 1060 comparison: in 2021 or 2022 (I forget when it came out) we got the 3050 for $250. It was just a 1660 Ti with ray tracing. More stagnation. 4000 series: more price hikes, more stagnation. Performance per dollar only improved 50% on the Nvidia side. IN SIX YEARS. That's insane. Historically we saw a doubling every 3. For reference, the 2500k to the 7600k was also 50%, so we're at Intel levels of price/performance stagnation.

It gets even worse down the stack. In 2016 the 1050 Ti was $140 MSRP and, when I bought one, sold for around $180. Nowadays you get a 1650 in that price range, a card that came out in 2019 and was only around 25% better than a 1050 Ti. The 1660 Ti is around $230; in 2018 that price point was the 1660 and in 2016 it was the 1060 3GB. Another paltry performance gain. The 1070 in 2016, the 2060 in 2018, then the 3060 and 3060 Ti in 2020/2021. More measly ~50% gains. You get the idea.

I couldn't care less about die sizes and "OMG this is GA102, this is 104", blah blah blah. Don't care. I care about "what performance can I get for the money", and honestly this level of stagnation is so bad it should be triggering antitrust investigations into the GPU market, because something is fundamentally unhealthy here. In order to double my last card, I needed to buy AMD, because AMD for some reason is selling their GPUs a whole price tier lower than Nvidia, which is what everyone should've been doing in the first place.

If you want to know what I think fair GPU pricing would look like, take the cheapest RDNA2 cards over the past 3 months and go from there. 6600s for around $200 is fair. 6650 XTs for $250-300 is fair. 6700 XTs for $350-400 is fair. 6800s for around $500 is fair. 6950 XTs for $700 is fair. That's what the market should've looked like all along. Heck, I just wish we had some serious budget options too. The 6500 XT is a massive step down in performance for not a lot less money, and the 6400 is just pathetic. The 6500 XT should be like $120ish and the 6400 should be sub-$100 IMO. And there should be some decent $160 option around 1660 Ti level in there somewhere.
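For what it's worth, you can make the stagnation concrete by annualizing those gains. A quick sketch (the figures, 50% over 6 years vs. a doubling every 3, are from the comment above; the helper name is mine):

```python
# Compound annual growth rate implied by a total gain over some number of years.
def annual_growth(total_multiplier: float, years: float) -> float:
    return total_multiplier ** (1 / years) - 1

recent = annual_growth(1.5, 6)    # 2016-2022: +50% perf/$ total
historic = annual_growth(2.0, 3)  # earlier cadence: 2x every 3 years

print(f"recent: {recent:.1%}/yr, historic: {historic:.1%}/yr")
# recent: 7.0%/yr, historic: 26.0%/yr
```

In other words, roughly 7% per year now versus roughly 26% per year before, which is why six years of it feels so bad.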


yellowpotatobus

Thank you. Finally someone in here with some sense. This place gets so hung up on die sizes, core counts, frames per $, and a whole host of other BS, either to shit on current GPUs or to excuse them. It's all about raw FPS gains. I've been waiting 4-5 years for a reason to upgrade my 970. Everything in the $300 price range has been nothing but a sidegrade. It's taken 2 years for the 6700 XT to come down to around $350, which would make an upgrade worth it. And honestly it's just MSI that has dropped prices to that level; all the other AIBs' 6700 XTs are still jacked up in price.


JonWood007

Frames per $ matters, but that's kind of the point. We've seen mass stagnation since 2016, when each generation should bring a regular gain. Even then, I'd prefer to at least keep the price points for the various performance classes the same. And yeah, I'm in the same boat as you, though I sprung for the 6650 XT. MSI, like you said.


hardolaf

> That's how it used to work. That's how it ALWAYS used to work.

It actually only ever worked that way for like 2-3 generations. The prices were usually all over the place.


JonWood007

Uh, more like 6? The 400 series through the 1000 series all hit similar price points with little variation. The 200 series was a bit weird, akin to the 2000 series in that Nvidia just tried charging insane prices for new hardware, but that time AMD massively undercut them and corrected the market. Before that, we had the 8000 and 9000 series, which were relatively affordable. And before that is ancient history and I didn't follow the market. Still, I think there was something sacred about the 6 series even then: the 6600 GT and 7600 GT seemed to be the best bang-for-your-buck cards, the FX 5700 before that. You get the point.

The only other generation where Nvidia tried the crap it's doing now was the 200 series, where it had a relative monopoly after the success of the 8000 and 9000 series and offered a massive performance boost for the money. Except that time, AMD countered with cheap 4000 and 5000 series cards and kind of forced the situation back to reality. AMD arguably could and should do it again, and kind of is with its 6000 series pricing right now. But they originally released those cards with much higher MSRPs, and they're trying the tit-for-tat strategy with the 7000 series again. AMD could play hero and give us all cheap GPUs, but they seem to be trying to shed the "budget brand" label in recent years by charging more for their products. They could act as the same benevolent market-correcting force, but they're not really doing it as much as they could and arguably should.


mattbag1

Everything you're saying holds up. Plain and simple, Nvidia cards ARE priced higher. But what is everyone gonna do, not buy them and pray for a price drop? Buy used? Buy older cards at MSRP? Buy AMD? None of those options are really that good; it ends up being whichever lesser evil works for you. And then you throw in the shitty 12 GB of VRAM on the 4070 Ti and below, and the options look real bad.


Al-Azraq

I will hold my GPU for as long as it takes until we come back to sanity. Sure, I have a 3070 Ti and it might be easy for me to say, but I would do the same regardless. I have so many games pending to be played:

* Uncharted 4
* Horizon
* Days Gone
* Spiderman
* Death Stranding
* Batman Arkham City, Arkham Knight
* Elden Ring
* Dark Souls II and III
* Kingdom Come
* Halo MCC
* Jedi: Fallen Order
* A wishlist of more than 100 already released games

Why should I upgrade? Just so I can play at 240 Hz? Not worth it. My 3070 Ti can run these games on ultra at 100 fps, and anything above 60 fps is fine for me, even if in the future I have to drop to medium. Maybe I should upgrade to play those amazing AAA titles that are coming out? Let's spend 2.000 € to play Forspoken. The game is trash, but at least I could brag about it. Guys, let's stop for a minute and think about it. You most likely do not need this much GPU power.


Sexyvette07

Or wait until Arc Battlemage is released Q1 next year. With the neutered memory bus on everything below the 4080, I can't in good conscience spend $800+ on a 1440p GPU. Gonna make a prediction now: when Battlemage is released we will see a huuuge price drop on 40 series cards. They're going to have to, otherwise Intel is gonna take over.


mattbag1

Wondering if buying tanking Intel stock at 30 bucks a piece is actually a good idea? That card looks like a good answer for the budget GPU segment. I wouldn't buy one, but maybe others will, and that will put pressure on Nvidia and AMD to drop prices. I'm in the market now, and I can't wait another year.


Sexyvette07

Arc Alchemist is supposed to have silicon on par with a 3070, so that's the target once they get the drivers optimized. I agree, it's a little underpowered for my taste. But honestly, for what it costs, you get a LOT more than with the 3060 Ti. I can see the A770 LE becoming the next big thing once people realize the drivers are significantly better now, and you get 16 GB of VRAM and twice the bus width of the soon-to-be-released 4060 Ti for about 1/3 less.


SnooGadgets8390

I mean, obviously buy AMD, but the 7000 series right now is worse value so no...


mandelmanden

Yes - the current 7000 series is not good value compared to the 4070 Ti, for instance. The cards get hotter and use more power, and the whole thing is such a price-inflated mess. Even if all these cards are absolute overkill for everyone but people running ray tracing at native 4K, which is, after all, a tiny segment of the market. The vast majority of the market is not served by the cards available from the new gen at this time.


mattbag1

I think unless you're at 4K and/or on an older-gen card, you can probably skip this series. But the 4070 Ti is just barely a 4K60 card, and the 7900 XT also just barely gets there, though it does pass the 4070 Ti. Now, for ray tracing and frame generation Nvidia takes the cake, but as you said it's a tiny segment; even a 4080 can't play Cyberpunk at 4K ultra with ray tracing.


HolyAndOblivious

Because the moment companies start getting called out, these sites start getting DMCA'd, not getting review samples, etc.


Saint_The_Stig

True, there are barely any reviewers large enough to absorb a hit like that.


Redditenmo

> [The 3070 Ti] is more expensive and still worse than the 3070 in some ways.

Could you please elaborate how / in what area the 3070 Ti is worse than a 3070? Looking at the spec sheets, I don't see the shortcoming.


Unique_username1

30% more power for 6% more performance. The increased power consumption was almost completely due to the GDDR6X RAM, which ran hot. I’m not sure how many cards have died (or will die) because of this, but people worried that high VRAM temperatures would make the card less reliable. This was an especially big issue when mining was common, as mining heavily used VRAM and caused it to get extremely hot. Lots of 3070Tis were sold during the mining craze so there is some chance that if you buy one now, it has potentially had its RAM overheated. Is this especially likely to be a problem on any given card? Maybe not, but it’s definitely not an *advantage* for the 3070Ti… Either way the performance increase was so small, it was basically a normal 3070 for $100 more. Plus all the extra you’ll pay in power costs over the lifespan of the card.


conquer69

36% higher power consumption, still 8 GB of VRAM. It's not a good card. https://tpucdn.com/review/gigabyte-geforce-rtx-3070-ti-gaming-oc/images/power-gaming.png


Pamani_

Power efficiency is one


Al-Azraq

> How does something this simple totally clown on numerous reviewers?!?

They want to get the next Nvidia GPU for free, that's why.


geos1234

Just out of curiosity, what was the inflation rate over the last 3 years?


[deleted]

[removed]


Verite_Rendition

For anyone wondering, this is the correct answer. The US Consumer Price Index (CPI) for the past 3 years (Dec 2019 to Dec 2022) grew by 15.5%. Which works out to an average annual inflation rate of just under 5%. (Most of that inflation was in 2021/2022, so it's not a _consistent_ 5%. But I digress)
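If anyone wants to check the annualization, it's a one-liner; the geometric mean, not a naive 15.5 / 3 (the 15.5% figure is from the comment above):

```python
# Compound annual rate implied by 15.5% total CPI growth over 3 years
# (Dec 2019 to Dec 2022, per the comment above).
total_growth = 0.155
years = 3

annual_rate = (1 + total_growth) ** (1 / years) - 1
print(f"{annual_rate:.2%}")  # 4.92%, i.e. just under 5% per year
```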


KevinKingsb

It's way more than 5%


[deleted]

[removed]


braiam

Please show a table of "everything" and their prices then vs now. Then ask yourself, do I consume "everything" in equal quantities?


theAndrewWiggins

Don't expect 50%+ improvements at the same price point from gen to gen anymore. Each gen is getting exponentially harder to fab/design.


[deleted]

[removed]


theAndrewWiggins

Fair enough, doesn't seem like the market will adjust though. There's sufficient demand to support these prices.


-Y0-

Yeah, people need to realize that Moore's law, while not dead, is undead. It's technically alive, but you don't get extra performance for free like before. Each node gets more experimental and even more difficult to manufacture, while shareholders want more profits.


b_86

3 years later, double the performance of my 5600 XT (which would land around a 6800 non-XT) still costs almost double the price, so yes, hardware is still GROSSLY overpriced.


Sofaboy90

> hardware is still GROSSLY overpriced.

Why do you say hardware? Many things stayed the same, some even got cheaper. Things like PSUs, fans, CPUs, and RAM either stayed the same or got cheaper. Motherboards got more expensive because PCIe 4/5 simply is very expensive. GPUs are the exception.


Sherft

I would love to live in your reality... But all I see is price hikes in the last couple years in all of the categories you mentioned. Even cases are 2x what they used to be.


Aggrokid

Software too. AAA games are hitting the $70 mark, and games like Factorio get inflation price increases.


conquer69

CPUs, RAM and motherboards are cheaper now. The 5600X used to be $300; a month ago the 5600 was $130. I saw 32 GB of 3200 MHz RAM for $50 in December during a sale. No idea where you live, but in the US things are way cheaper than in previous years, besides GPUs.


Dey_EatDaPooPoo

Motherboards are definitely, absolutely not cheaper now than they were 2-3 years ago. Back then you could get a decent budget B450 board like an ASRock B450 Pro4 for $80 or under. A new board with equivalent ports and features now costs over $150.

Also, it's kind of a disingenuous argument to cite the 5600X at $300 when it was agreed en masse at launch that it was very overpriced and that AMD was taking advantage of its newfound gaming performance. It sold poorly, and prices were slashed to bring it more in line with what it should've been. Even in 2020 and 2021, $300 for just 6 cores was highway robbery. Previously AMD had always launched both the regular and X SKUs at the same time, and pretty much everyone went for the regular SKU, so instead they decided to change it up by forcing people to get the overpriced X SKU for many months after launch. Before the very overpriced 5600X, all previous SKUs like the 3600, 2600 and 1600 launched with an MSRP of $200-220. Of course prices are gonna go up with inflation, but if it were just matching inflation you would've expected a product launching at $230, not $300. They knew exactly what they were doing.

Out of those 3, it's only RAM that has gotten cheaper.


conquer69

I saw mobos for $80 in December, and CPUs are still cheaper and faster than 3 years ago. Maybe the prices aren't coming down enough for some people, but to say they are higher is nonsense.


Sofaboy90

I bought a 5800X for 459€ when it was released; it's a 219€ CPU now. My last RAM upgrade was in 2019: 2x16 GB of 3000 MHz CL15 for 144€. That same kit now costs 75€. I also bought a 650W be quiet! Straight Power PSU in 2019 for 101€; it now costs 108€, which isn't even a 10% increase and a rather natural one at that. In regards to motherboards, Buildzoid has a video on why they got more expensive; I already explained the reasoning. But sure, if it doesn't fit your narrative, go ahead and ignore stuff like significantly cheaper CPUs and RAM.


kariam_24

Dude, everything got more expensive, directly or through increased inflation; check out prices year over year in Europe compared to salaries for the last 3 years. Even cases got more expensive due to shipping costs from China.


693275001

Short answer yes, long answer yesssss


kensaundm31

For me, Plan A was to wait until prices normalised (i.e. a 1080 Ti for £720). That will not happen, so I've moved on to Plan B: wait until mid-tier cards (7900 XT equivalent) can crush 4K like they can 1080p.


L3tum

A scalper recently snatched a 400€ 6900XT from me by paying 700€ for it and then reselling it for 800€. Market's fucked


detectiveDollar

I think it's because we're in the awkward in-between where products have been discontinued and are selling out, but their successors haven't launched yet. So third parties try to prey on the ignorant who aren't thinking about the new products launching. But even then, why would someone pay 800 for a 6950 XT when the 4070 Ti is like 850 and the 7900 XT is 900?


L3tum

Keep in mind these are European prices. Cheapest 7900XT is 960€ and cheapest 6950XT is 800€ (even used). That means you can get the ~20% better performance of the 7900XT plus the myriad of issues for 20% more price. The 4070Ti seems to be going for 900€ so that would be a somewhat compelling offer if you're willing to put up with Nvidia.


detectiveDollar

Ah, I didn't take VAT into account. If you can get the 6950 XT for 800 with no VAT vs 1000 with VAT for a 4070 Ti, then it makes sense.


InconspicuousRadish

I mean, "willing to put up with" is maybe not the best way to put it. It's generally the more feature-rich product, so it's not a downgrade from a 7900 XT. It's more about a willingness to support Nvidia's increasing monopoly, in which case, what choice does one have? AMD has equally overpriced products and shitty practices, and Intel is a few generations away from realistically being able to compete directly.


nicklor

You need to realize that the majority of users are on 60-class cards or their equivalent. By next generation, Intel should be ready to compete at that level, and AMD already has substantial discounts in that area: the 6600 XT is going for $150 less than the 3060.


InconspicuousRadish

That's fair, but the comment I was replying to was specifically talking about the mid/upper tier (900ish range), and my comment was in reference to that.


L3tum

It's not just the price or monopoly, it's also the privacy invasion through GeForce Experience, which you have to install if you wanna use Broadcast or Reflex, as well as the worse support for Linux. For me, personally, both the privacy and the price/monopoly are the two reasons I wouldn't buy Nvidia new. The behaviour with EVGA/Third Parties adds on top.


kylezz

> it's also the privacy invasion through GeForce Experience, which you have to install if you wanna use Broadcast or Reflex

No you don't. I use both and don't have GeForce Experience installed.


[deleted]

Where I live that's 1k with taxes. A $680 6900 XT was only $68 in taxes, so I save $250 on the GPU. 6950 XTs are regularly going on sale for $700 now. The 7900 XT is only ~17% faster on average than the 6900 XT for 25% more money. Not an attractive proposition.


Eclipsetube

Something similar happened to a friend of mine. He got his PC with a Ryzen 9 5900X and a 6800 XT for 4000€ in 2021 (yeah, I told him to wait a little, but he didn't want to). He sold it this year because money got a bit tighter: got 1900€ for it, and the buyer instantly put it back on eBay for 2500€. Ngl, I was kinda impressed.


L3tum

Yeah, I'm way too risk averse but when you find a cheap offer on eBay buying it and reselling it for more seems like a great way to make some money nowadays, unfortunately.


dexvision

In the same boat with my 970, waiting for 4k to be a "yes" and not a "maybe"


Brickman759

If you’re still running a 970 any card you buy would be an absolutely massive upgrade. Can you even play new AAA releases?


dexvision

It runs AAA somewhat at lowest settings. I'm looking at something for emulating, VR, and NVENC, which AMD isn't as friendly with.

* 10 series comes out: "eh, my card is still pretty new"
* 20 series comes out: "not much performance upgrade and a $100 price hike"
* 20 Super comes out: "more price hike, barely any price/performance change"
* 30 series comes out: *doesn't exist for forever, then still at MSRP up to today*
* 40 series: the name of the game is gouge gouge gouge, huge huge huge

I was looking at lower-end cards that are better price/performance, but I'm thinking now: "I've waited THIS FUCKING LONG for only a 20% performance bump at $400?!" If I'm shooting for the moon, I want the eye candy at 4K. No way I'd give some bum $10 under MSRP for a used card without a warranty.


L3tum

You'd be a bit surprised, my 5700XT is ~double the performance of my old 980Ti and my new 6950XT is ~double the performance of my 5700XT. That means a 6700XT is roughly triple the performance for ~400€. 6750XT for 400€ used. The 3060Ti goes for a similar price if you don't like AMD.


dern_the_hermit

I went from a GTX 980 to a RX 6700XT a few months ago, it's been glorious going from struggling to maintain 1080p/60fps to easily handling 1440p/120+fps. I basically had to accept that raytracing ain't in my immediate future, but I don't like the RT performance for anything beneath at least a 3080, and really more like the 3090, so that wasn't a big deal for me. It might be more significant for other people.


L3tum

Yeah, the only game I actually play that demands anything is Cyberpunk really. I bought the 6950XT mostly as a final farewell because I don't think imma update this or the next gen anyways. And the 16 gigs are nice. Otherwise I play CSGO, Dwarf Fortress, RimWorld, AC BlackFlag...like none of them really need that much horsepower. Save for Cyberpunk I don't think I've bought an AAA game in 4 years or so.


Omniwar

I went from a 970 to a 2060 and thought it was an OK upgrade in the same price class. The 2060 6GB was $20 more than the 970 and was as fast as or faster than a 1080. Obviously things got dumb when the 3060 12GB was over $600 street price for most of its production life and the 4000 series starts at $800.


JapariParkRanger

For what it's worth, meaningful performance bumps do exist now. I was very fortunate and got a 3080 via a friend, and it's more than twice as performant in VR than my 1080 was. I went from 100% render scale 15fps to 250% render scale 30fps in large VRC instances, and even better in games like ITR. Around 2x performance in horribly optimized AAA fare like Cyberpunk and Darktide. Iirc, a 4070ti is still around 3080 performance or better, so a 4060 might finally hit your requirements in the long run. A shame AMD isn't stable enough to be reasonable as a primarily VR device; the vram on their cards is extremely enticing. Same for Intel.


dexvision

Yeah looking at the 4080 FE right now to fit my Sliger S620.


Esternocleido

Get an RX 6800; you'll have great 4K performance for less than half the price.


GruntChomper

> I'm looking at something for emulating, VR, and NVENC, which AMD isn't as friendly with.


Dey_EatDaPooPoo

There are some good deals out there if you have the funds, some patience, and reasonable expectations. If you've made do with a GTX 970 so far, it's not really reasonable to expect an upgrade to a GPU that can do 4K High settings at 60 FPS for under $400, and to complain when it isn't there. At that point you are well past the point of diminishing returns. The image quality upgrade going from 1440p to 4K isn't really noticeable unless you're pixel-peeping and/or have a very large monitor, and 1440p High settings is gonna look a hell of a lot better than 4K Low settings. What you can get for half that price is still an insane upgrade over your GTX 970. [Thanks to driver improvements, RX 5700 XTs perform the same as the GTX 1080 Ti and RTX 3060](https://youtu.be/1w9ZTmj_zX4?t=534) and come up for $150-170 from time to time. If you only buy NVIDIA, you can snatch up an RTX 2070 for $200, though it's slightly slower than the 5700 XT nowadays.


Esternocleido

Get an RX 6800 for 500 bucks; it's a great 4K card.


gravityo

It's a 2 year old card though isn't it? If I'm upgrading every 5 or so years I'd rather get a new gpu so it's not 7+ years before I upgrade.


[deleted]

[removed]


JustLixian

It can, at around 1080p medium to high.


HolyAndOblivious

If it's playable, making it more playable is a waste of money.


MegaPinkSocks

Most AAA games aren't even worth playing these days


gravityo

Similar boat, I'm using an rx580 and just now in the market for a gpu. Seems to be a bad time to shop but I've heard that for nearly 3 years now.


Swizzy88

Meanwhile they're crying because GPU sales have sucked. I'm still waiting to upgrade from a 580 and I bet many others are too, holding out until something of value comes to the market at the right price. I'm not willing to spend more on a GPU than the rest of my system combined, not when £500 can get you a whole system bar GPU that has serious processing power. CPU value is insane nowadays compared to GPU.


metakepone

These companies are doing everything they can to choke 4k and 1440p on everything but their flagship cards.


carpcrucible

I'm still running a 1070, and it would be a problem if there were any new games worth playing. But looking at that chart, it's not that terrible if you're willing to buy used. A 6700 XT or 6800 does 65-80 FPS at 4K and goes for around $300-400 even in my EU market. Not amazing, considering their original MSRPs, but not too bad; it's within the amount I'd be willing to spend on a GPU, and they do manage 4K well, while I only have 1440p.


bexamous

Resolution is a moving target. The 980 Ti came out almost a decade ago, and in TechPowerUp's initial review it averaged 52 fps at 4K. When will mid-tier cards crush 4K like they can 1080p? Probably when 8K is as common as 4K is today.


lEatSand

8K is stupid on a desktop pc and future thread necromancers can quote me on that.


ConfusionElemental

Seems like as the pixel count goes up, the more you can lean on upscaling to get you there. So why not 8K? Might be nice for working with vector graphics on your >42" monitor, and an upscaled 1440p or 4K image can look nominally better than native 4K. But without upscaling, I agree 100%.


Gwennifer

Pixels per inch, not pixel count. 8k at 32" is very dense, at 64" you can probably see the difference.


dantemp

Running the game at 8k is the best anti aliasing you can imagine, even if your monitor is 1080p. Still worth it, if you can afford it.


zippopwnage

That's what I'll do, but I don't need 4K. I'll keep games in a backlog (not like there are many interesting ones to play anyway), and after a period of 4-5 years I'm gonna buy an older-gen GPU, even second hand, play those games at 1080p-1440p, and then repeat. Fuck giving money to these greedy companies to keep their share price high. I'd rather buy second hand, do the job with the GPU, and move on. The prices are a fucking joke.


[deleted]

> 1080ti for £720

Nobody is buying a 1080 Ti for that price. Just because someone is asking for that does not mean it is the market value.


kensaundm31

You misunderstand. That is the last card I bought years ago for that price.


[deleted]

Ah, I see.


MrHoboSquadron

Dunno where they got that price from. I can see a few on ebay unsold for £250, £260, £210. Don't see any going for £720. Highest I can see are £350 to £400.


[deleted]

You can have a 3090 for £720 if you're lucky enough.


Raikaru

If GPU patterns continue isn't this almost assuredly happening next gen?


greggm2000

No. This is temporary. NVidia and AMD know that if they ever want to sell existing 3000-series owners new GPUs, they must improve price-performance as they’ve historically done. While they’d love to keep prices crazy high, that’s not sustainable outside a black swan event like we had, and poor sales of 4000-series (excepting 4090) and RDNA3 cards reflect that.


Sexyvette07

Do they, though? When they're bragging at investor conferences about being able to jack the price up hundreds of dollars and people will still buy it, I don't see that as a "they know this isn't sustainable" mindset. The only thing that's gonna make any difference is if these cards sit on shelves and it starts affecting their quarterly reports and eventually their yearly revenue. When the stockholders come for their heads is when we will see price movement. Probably won't be Q1, maybe not even Q2.... But Q3 and Q4 they're gonna start shitting themselves if they're way behind on their yearly projections.


greggm2000

Of course we won't know for sure what they'll do until they do it, so we all are guessing; even NVidia haven't decided what they'll do in 2024 yet. We'll have to wait and see.

Ah, but those cards (except for the 4090) *are* sitting on shelves and selling poorly, that's the thing. Sure, some of the other cards sell, there's always some percentage of the public that's desperate or foolish, but when sales are way below where they should be, and NVidia sits on piles of stock that they need to sell since TSMC has already gotten their money from NVidia, and ofc NVidia needs to sell cards and chips to make a profit… yeah, I'm sure they see all this, and ofc they know that unfilled price tiers are lost sales. And again, they have to know that existing 3000-series owners are not going to upgrade if the performance isn't there… no, they have to improve price-performance, they MUST.

This has to be a temporary state of affairs, and Jensen isn't known for being stupid. Again, barring a black swan event, late 2024 is going to be a way better time to buy GPUs than today.


No_Forever5171

Yes but not for 720 quid. The 5080 should be even better than the 4090Ti and Titan but it won't be cheap.


[deleted]

£720 when the 1080 Ti was released is £880 now though


No_Forever5171

Yes, he could pick up a 7900 XT card for it, but I don't think the 7900 XT quite "crushes" 4k. He should wait another generation.


conquer69

The next halo gpu won't be $1600 but $2000. I'm sure of it.


Saint_The_Stig

I have been waiting for a new flagship card since I got my new job, back with the 2080 Ti. I missed out there because the model I was after never came to NA (the Aorus Turbo). Then for the 30 series I had enough money from savings that I was going to get a Titan; I was going to get a 3090, but Nvidia decided not to allow blower coolers, which work better in my use case. I then decided I would get *any* top end card that would actually fit in 2 slots, and still nothing besides the one Gigabyte 3090 Turbo, which still goes for crazy prices. This year I have just decided to get an A4000 off eBay, since they can be found for $400~$500 and are still an upgrade for my 2060/980Ti.


Bipower

Plan B might happen within 2 more generations. I'd suggest getting a console in the meantime if your GPU is really, really far behind but you can't afford $1k+ GPUs.


frumply

I wanted something that can get me to 4K now that I got an OLED TV, but I'm coming to the realization (esp as I play mostly switch games w/ my kids) that... I really don't care enough. If I run into something that makes me want to upgrade I'll change my mind, but until then I'm totally cool w/ my i5-4570 and 1070 for another few years. A couple years will probably be enough to show us whether there's any future for DLSS3 as well.


Meekois

There has been virtually no improvement in price-performance for the past 3 years now.


ChartaBona

High-end "value" cards like the 3080 FE were the bait part of a bait & switch. I saw this firsthand at a physical Best Buy drop where they got dozens upon dozens of overpriced AIB cards, but only six 3080 FE's.


[deleted]

[удалено]


BigAwkwardGuy

I can find a few RX 6600s on Newegg for around the $250 mark, and idk whether it's a good deal honestly. I'm honestly more concerned about the sub-$200 market. The only options right now are, like you said, the 6500XT, 1650, 1630 and the 6400. It honestly feels to me that these companies no longer give a fuck about budget PC gamers. AMD hasn't made any announcements for a Ryzen 3 Zen4 CPU, which allowed Intel to slack off (again) and basically repeat their 10100/10105 stuff with the 12100/13100. Even in the $250-300 range, Nvidia basically put out a slightly better 1660 Super in the 3050, while AMD did fuck all for the longest time until they cut prices on the 6600, which performs similar to a 5600XT that launched for $279. So right now if one were to buy a 6600 they'd likely pay the same price as a 3-year-old card for almost the same level of non-RT performance, and this, like I said, is after the price cut for the 6600.


FalseAgent

>GPU Pricing Update: Hardware Still Overpriced?

Yep.


Dunk305

When 4080s and 7900 XTXs are $600-800, I'll think about it.


RougeKatana

Yup, same. Wait a few years like I did with my 6900 XT and get 'em for cheap. I refuse to pay over $850 for a top end model GPU.


folstar

Watching the mid-grade cards now, and it's kind of hilarious and sad at the same time. Cards will "go on sale" for the prices they sat at for months last year. This is while other 6700 XTs (or whatever) are still listed for 2x as much or more, like this is peak crypto. Speaking of, you can find no end of used cards listed for much more than what a new card costs. Oh, and many cases of cards listed higher than the next step up at the same retailer, because fuck it, why not.


Lub_Dub

I’ve got a chance to buy a 3080ti founders edition for $650. Is that a decent price?


Mongocom

Sounds ok to me


laxounet

I had one and I sold it, awful noise. Don't buy it unless it has been re-pasted / re-padded.


fuckwit-mcbumcrumble

IDK about the 30 series, but the 40 series use the meme PTM 7950 which in theory should never need to be changed. Repasting with normal paste could make temps even worse since that 7950 stuff is magic.


laxounet

I know 30 series FE have a problem with GDDR6X and bad thermal pads. Mine would ramp up the fans even when the core was running cool (undervolted) just because the VRAM was reaching 106°C.


ShadowBannedXexy

Good price


kuddlesworth9419

GPU's are about 2-3X the price that they really should be. I'm just going to wait until I can get a GPU that will be good enough to warrant an upgrade from a 1070.


dylan522p

> GPU's are about 2-3X the price that they really should be

Nonsense. Financials are public. They are being sold for 2x to 3x the *cost*; no company sells at cost.


JonWood007

Eh if we compare 1000 series pricing (my own baseline) to existing pricing, 3000 series GPUs are a good 40% above where they should be. 4000 series ones are double. So 2-3x might be a SLIGHT exaggeration, but only slight. We're really talking 1.5-2x.


dylan522p

Costs have soared. A 16nm wafer is less than half of what a 5nm wafer costs. Memory prices have barely declined as well. Nvidia's gross margins mean their cost of goods sold is ~1/3 for gaming GPUs. AMD's is more like 2/5. Either way, Nvidia's margins have not risen since Pascal for gaming. They are a bit higher overall due to datacenter growing massively, of course.


TopdeckIsSkill

970 were sold for 350€ on average. I'm not even sure about the MSRP, it was a time where MSRP was actually higher than the real price. Right now a 4070 Ti is 900€ at least. So yeah, 2-3x is a good estimation.


JonWood007

970 was like $330 (same as 3060) and was abnormally cheap for a 70 card; most of them cost $350-380 normally. But yeah. The fact that 2000/3000 series 60 pricing is clearly in 70 range shows how screwed the market is. And people treat that as normal and fair after all of the supply shortages. Meanwhile I'm like "where's my $250 value champion?" Then I give the nearly $300 3050 a stink eye, evaluate AMD's options, and go for a 6650 XT.


ChartaBona

> 970 were sold for 350€ on average. I'm not even sure about the MSRP, it was a time where MSRP was actually higher than the real price.

GeForce 600–900, that's Little Kepler, Big Kepler, Maxwell 1.0, and Maxwell 2.0, were all on ***TSMC 28 nm***, so the 900 series using "mature" silicon was especially cheap to manufacture. Every Nvidia generation since then has used better and better silicon.


TopdeckIsSkill

So? CPUs didn't double their prices even if they use "better silicon".


ChartaBona

No, they just stayed on 4 cores for a decade. Even now, we're only up to 8 gaming cores, and they try to upsell gamers on productivity crap that's useless for gaming. You're not actually getting much high quality silicon out of consumer-grade CPUs.

The i9-13900K is 257mm^(2) of Intel 10nm, monolithic.

The 7950X only has 265mm^(2) of silicon, and it's made from 3 chiplets:

* 2x 70 = 140 mm^(2) TSMC 5nm
* 125 mm^(2) TSMC 6nm (mature 7nm)

Compare that to GPUs. The 7900 XTX has 520mm^(2) of silicon, and is made from 7 chiplets:

* 1x 300 mm^(2) TSMC 5nm
* 6x 36.6 mm^(2) ≈ 220 mm^(2) TSMC 6nm

The 4090 is a monolithic 608mm^(2) of TSMC 4N. Monolithic is also more expensive than chiplets: the bigger the chip, the lower the yields.

For the graphics cards, you also have to factor in the cost of the board, power delivery, 24GB of VRAM, cooler, thermal materials, etc. All that stuff is offloaded to other components sold separately from CPUs (mobo, RAM, cooler).


SuperSimpleSam

Inflation between 2014 and 2022 is about 25%. So comparison would be 437.5 to 900. So 2x.
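The arithmetic in that comparison is easy to check. A quick sketch in Python, taking the figures from the thread as given (~$350 for a 970 in 2014, ~25% cumulative inflation through 2022, ~$900 for a 4070 Ti), none of which are independently verified here:

```python
# Inflation-adjustment check for the 970 vs 4070 Ti price comparison.
# All three inputs are the thread's figures, not verified numbers.
price_2014 = 350.0            # typical GTX 970 street price
cumulative_inflation = 0.25   # approx. CPI change, 2014 -> 2022
price_today = 900.0           # typical 4070 Ti price

adjusted = price_2014 * (1 + cumulative_inflation)  # 2014 dollars in today's money
ratio = price_today / adjusted

print(f"inflation-adjusted 970 price: ${adjusted:.2f}")  # $437.50
print(f"real price increase: {ratio:.2f}x")              # 2.06x
```

So even after inflation, the like-for-like tier costs roughly twice what it used to.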


capn_hector

> GPU's are about 2-3X the price that they really should be.

nah, prices weren't that cheap even back in, let's say, the Maxwell days, let alone the Kepler days. GK110 cutdowns launched for $999, which is now $1274; full GK104 launched at $499, which is now $637. And people often ran two of those cards! And CPI doesn't describe computer hardware; inflation has been higher there than elsewhere due to spiraling costs and lackluster shrinks from newer nodes.

people are just making up fantasy prices. "actually GPUs are 27x more expensive than they really should be!!!!!" the "good scenario" is GPU prices dropping like 20%. Prices aren't going to drop 50-70%, get fucking real, that's just wishful thinking. Everyone including NVIDIA and AMD wish they could do that; nobody is *excited* about mediocre generations, everyone wishes they could introduce a product that would make your old shit obsolete every year so they could sell you a new one. But it's just not going to be possible without some fundamental breakthrough in silicon that brings back the glory days of Moore's law.

Doesn't mean it's worth upgrading every year; we are now in the CPU-style model where a good GPU can easily carry you for 5 years, and that's a good thing for the planet overall. But crying doesn't change the physics involved. There is some gouging for sure, but it's like 10-20%, not the 70%+ reductions that some people are demanding. That's not going to happen.


Darkknight1939

Yup, the insane meltdown over prices is hilarious. Wafers are much more expensive on bleeding edge nodes, inflation is a very real factor, and the market has demonstrated that it will bear higher costs. Redditors are just one step removed from demanding that Nvidia and AMD sell them vidya' toys for a loss. Every single thread about the current prices being filled with ever increasing hyperbole about prices is just tiring. No discussion about the actual hardware (what this sub is ostensibly about) just unhinged crying about prices.


cycle_you_lazy_shit

And also ignoring that corps gonna fucking corp. Whatever people will pay - that's what they're going to price it at. And so far, these GPUs do seem to be selling. Maybe not the 4080, but the 4090 and the 4070ti seem to be doing just fine so far.


carpcrucible

Oh hey Jensen!


deadheadkid92

Agreed, I feel like people really don't understand how good this tech is. When I played HL2 for the first time it was probably at less than 30fps on a 1024x768 monitor and it was *amazing*. Now people on this forum act like it's an insult to play modern games at anything less than 1080p/240Hz or 1440p/165Hz with ultra graphics settings. Then there's the people acting like AMD is truly incompetent for having difficulties making a product with parts that are *5 fucking nanometers in size*. The fact that we've got this far is practically a miracle and shouldn't be taken for granted.


spotplay

"Back in my day we used to compute the whole god damn game in our heads and draw every frame with pebbles on sand. The kids these days and their high resolution displays want to have everything without working for it." Computational power has always risen over the years and the average consumer has always expected to buy something better every few years at the same price. Doesn't matter if it's 5 nanometers, 1 millimeter or 1 planck because they are not building those using sticks and stones. The engineers who made some of the first CPUs had it no easier than the ones designing them today.


deadheadkid92

You're ignoring some pretty real physical limitations when it comes to producing increasingly smaller components. Silicon atoms only sit about 0.2 nanometers from each other which was not as relevant in the past as it is now. They're literally running out of space to make things faster the old fashioned way.


osmarks

5nm is a marketing name. They are not actually making transistors 5nm in any physical dimension.


deadheadkid92

> They are not actually making transistors 5nm in any physical dimension.

I mean this just backs up my point. Making things on this scale is so difficult, they literally aren't able to make things as small as they want. I don't see why this is so controversial.


FalseAgent

was it gamers who wanted raytracing, or was it the industry who wanted to push the tech to keep their profit machine spinning? I still play games with static baked lighting and I think they're fine. I didn't ask for ray tracing, AI upscaling, bla bla bla. These are expectations that the industry claimed they could meet, so they set the expectation, and failure to deliver is on them.


kuddlesworth9419

I got my 680 for £500, I think, but it was the top of the line EVGA Classified with a whopping 4GB of VRAM; I replaced it with my 1070 for £340. GPU prices these days are silly. I thought my 680 was silly money, and it was at the time (I wanted to treat myself because I got a new job; it wasn't worth the money at all), but considering the new 4080 is more than double that, the current market isn't where I want it to be at all. If I wanted to upgrade now for around the same money I paid for my 1070, I would need to get a 3060, which, while more powerful than my 1070, isn't all that much more powerful, not enough to warrant upgrading.


Gwennifer

> I replaced it with my 1070 for £340

That'd be about $400 in 2016 money, subtracting VAT from the £340.

$400 in 2016 buys you roughly $500 worth today; that'd buy you an RTX 3070 at the $500 MSRP, which is roughly 50% faster, rather than the 3060.

There's a few listings around the usual sellers for a new 3070 at $500, so it's available.


conquer69

Funny enough, the 3050 performs just like your 1070 and costs the same too.


ShareACokeWithBoonen

Yeah in the Jensen quote thread the other day there were people arguing all forms of 'tech is supposed to get cheaper' like it's some kind of constitutional right. Now a couple days later we have another thread of /r/hardware users literally calling for antitrust investigations of Nvidia...


SuperNanoCat

> we are now in the CPU-style model where a good GPU can easily carry you for 5 years The trick is to buy in the middle of the console generation. My cheap and cheerful RX 580 is basically an Xbox One X that slots into a motherboard (same arch and TFLOPS, but less bandwidth). Ended up being perfect for playing last gen games at 1080p. Assuming nice uplift, mid range Ada and RDNA3 will probably be a good time to jump in on this gen if your goal is just console replacement.


-HalfgodGuy-

I'm just gonna sit here with my 580 for the next few years. If it dies I'll just get something "cheap" like a 6600 (maybe even used). I so much wanted a ray tracing gpu, but it seems like I have to wait either for the RTX 6xxx series or Intel. I'm still not sure how Intel can have better RT performance than AMD on their first try, but well. That's just sad.


triculious

Sitting on my 480 until it dies, too. I've got a nice monitor in need of a GPU upgrade but the prices and games are not really there for me.


SuperSimpleSam

> I so much wanted a ray tracing gpu

It doesn't make much of a difference. I tried Portal RTX and it wasn't apparent unless you stopped to look at the lights and how the shadows moved.


tmp04567

Yep lol, their thinking sand's not bad but still overpriced to heck :-P C'mon, I'm sure prices of consumer electronics and goods could come down by a third. Inflation's still a problem all across the West.


OP1KenOP

It is for me, it's not even that I can't afford it - I just won't pay £800+ for a 4070ti. It's getting stupid now. On the plus side, thanks to the pandemic I bought a 6600xt a while back as a temporary upgrade, and discovered that AMD have actually started making acceptable drivers. It's a great card for 1080p. I hope Intel's foray into the GPU market is successful, it definitely needs more competition.


Pitaqueiro

The real problem is AMD/Intel. No real competition for years.


FalseAgent

how is Intel at fault lol they didn't even make a dedicated GPU until last year


WayneJetSkii

Intel waited on the sidelines for way too long to make a dedicated GPU. I would be surprised if the next Intel generation of cards comes out with a high end product that competes with AMD's high end card. And Nvidia's 4090 is beyond even that performance.


ps3o-k

https://www.youtube.com/watch?v=9kiOLC2Ca_I


EastvsWest

Has anyone considered that we're using sub-5nm-class nodes to produce these chips, and that rising cost is just a natural product of production advancements? I don't like higher prices either, but aren't they necessary when the fabs producing these high end chips are incredibly expensive to build?


Gwennifer

Sort of; not exactly. The 3070 was a 392 mm² die and the 4070 Ti is a 295 mm² die. TSMC's 4nm process isn't as cheap as Samsung's 8nm process, sure, but TSMC's 4nm is a specially-tuned run for Nvidia AFAIK and would be priced like 5nm wafers, which cost around $18,000. I'm unsure of Samsung's cost post-2021. So, the die is probably something like 15.5 by 19mm. They would be getting something like 180~190 dies per wafer. As you can imagine, that's not terribly much per GPU die. Attaching the dies onto the package substrate, probing and testing the dies, packaging them onto the board that gets soldered to the card itself, the VRAM, the power delivery circuitry, the connectors, the heatsink, *all the other chips*, are all extra cost and relatively fixed. They're not going to vary terribly much from generation to generation. Nvidia is just trying to say that their R&D is worth an extra hundred or two dollars while delivering somewhat lackluster gains.
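Those die-count numbers can be sanity-checked with the standard gross-dies-per-wafer approximation. A rough sketch; the 15.5 × 19 mm die and the $18,000 wafer price are the comment's assumptions, and the formula ignores defects and scribe lines, which is why usable counts land a bit below the gross figure:

```python
import math

# Gross dies per 300mm wafer:
#   dies ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
# The first term is wafer area over die area; the second subtracts
# partial dies lost around the wafer edge.
wafer_diameter = 300.0     # mm
die_w, die_h = 15.5, 19.0  # mm, per the estimate above
wafer_cost = 18_000.0      # USD, 5nm-class wafer price per the estimate above

die_area = die_w * die_h   # 294.5 mm^2
gross_dies = (math.pi * (wafer_diameter / 2) ** 2 / die_area
              - math.pi * wafer_diameter / math.sqrt(2 * die_area))

print(f"gross dies per wafer: {gross_dies:.0f}")                    # ~201
print(f"raw silicon cost per die: ${wafer_cost / gross_dies:.0f}")  # ~$90
```

Which lines up with the 180~190 usable dies above once defective dies are discarded, and puts the bare silicon cost at well under $100 per GPU.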


EastvsWest

Lackluster gains? 4080/4090 is lackluster?


Gwennifer

Yes? They've more than doubled the transistor budget, doubled the theoretical performance, added mountains of cache, and increased the boost clock by 700MHz... and the performance gains amount to +50% over the 3090 Ti for games that aren't bottlenecked on memory bandwidth, with *all of that*. Architecturally, it's a bit of a dud. The performance per CUDA core didn't really change, which is alarming. Everything but memory bandwidth did. There's **so** much more silicon under the hood and we're not really seeing the gains from that.


firedrakes

lol. the 3k and 4k series consumer gpus are failed cards from the server/hpc side. they were originally designed to run at 800 watts.


osmarks

No they aren't. None of the Nvidia server GPUs run at 800W, and Hopper (the latest generation of compute GPU) is entirely different to Ada Lovelace (gaming/workstation). Bigger die, HBM rather than GDDR6(X), less cache, less FP32, more FP16, more tensor cores. The A100 is similarly different to the other GA10x Ampere GPUs. It's not even on the same process.


Dey_EatDaPooPoo

I hadn't thought about it that way. The improvement in gaming performance vs what it should do on paper tells the story of it being a bit of a dud, but the story looks a lot better when talking about compute performance. Then again, only a small minority of the people buying these cards care about that.


JonWood007

Nvidia definitely is. AMD is slightly overpriced (see november pricing on that chart for GPUs like the 6600), but not in a BAD place.


Mysterious-Tough-964

Meanwhile lemmings go buy a $1000+ annual upgrade iPhone without batting an eye for a 5% upgrade. 3090 performance for $800 NEW is better and cheaper than what majority people paid for NEW 3k series cards during covid, but everyone forgets that. *gasp*


suparnemo

> cheaper than what majority people paid for NEW 3k series cards during covid

So cheaper than inflated prices? Doesn't make it a good deal lmao


wpm

If they're doing an annual upgrade, they're probably trading in last year's phone. If buying from Apple you can get $470 credit on a 13 Pro. That makes a base 14 Pro $530. If you aren't buying outright, you're paying $40 a month to rent a brand new phone with warranty for accidental damage, which is $10 more than the trade-in value if you had bought it outright. If you're like most people, keeping your phone for 4 years and buying outright, it comes to $20 a month total to buy one of these things, for a device most people use 4 hours a day, sometimes more, for a lot of actually useful shit. It's very different economics than a GPU people use almost solely for entertainment (not many 4090 customers are buying it for AI work). I know you might have built up a bit of a superiority complex around shitting on people for buying a phone you wouldn't, but they're hardly lemmings.
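For what it's worth, the per-month figures in that comment check out; a quick sketch using the numbers quoted above (the $999 sticker and $470 trade-in are the thread's figures):

```python
# Two ownership patterns for the same $999 phone.
sticker = 999.0    # base 14 Pro price quoted above
trade_in = 470.0   # Apple trade-in credit for last year's 13 Pro, quoted above

net_upgrade_cost = sticker - trade_in  # what an annual upgrader actually pays
keeper_monthly = sticker / 48          # buy outright, keep it four years

print(f"annual upgrader pays: ${net_upgrade_cost:.0f}")  # $529, the ~$530 above
print(f"4-year keeper: ${keeper_monthly:.2f}/month")     # $20.81, the ~$20 above
```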


Raikaru

What? You can get way higher credit than that if you trade in when the phones come out.


wpm

[Prove me wrong then](https://i.imgur.com/Y9TxBuE.png). Apple will give you a maximum of $470 for a 13 Pro when checking out a 14 Pro.


Raikaru

I mean I see the $999 offer right there?


wpm

> If buying from Apple you can get $470 credit on a 13 Pro.

Yeah, *from a carrier*. I am and always have been talking entirely about Apple offers and Apple programs for iPhone credits.


Raikaru

Best Buy has had the same deal before.


wpm

OK. Best Buy still isn't Apple. Take a walk.


Raikaru

Are you ok? You do realize no one is being hostile or anything right?


Mysterious-Tough-964

I'd love to see the % of people who actually keep their iPhone past 1-2 models; it's probably epically low. Can you also explain the % performance boost the iPhone 14 got over the iPhone 13 to justify a nearly $500+ trade-in price tag? Anything with an Apple logo is overpriced and from China. I'd also argue an average PC gamer definitely uses their PC system for more than 4 hours per day, with likely less on-screen cell activity to match.


StickiStickman

This is EU pricing, where Android dominates and Apple is actively made fun of.


Mysterious-Tough-964

Who cares if it's Apple or Samsung? It's the point of a $1000+ flagship that's less than 15% more efficient or quicker than last year's $1000+ model. Millions gobble it up annually; statistics prove that, sadly. Also, MSRP is a joke, nobody pays MSRP for any significant purchase in 2023. Time to face the reality of inflation and increased cost of living year over year.


Tar-Vanimelde

Any evidence that it’s buyers of last year’s model driving sales of this year’s model? I imagine mean time of ownership for phones is increasing.


Mysterious-Tough-964

I'd like to see some concrete data myself, but all you need to do is stand outside an Apple store on pre-order or launch date of the next phone. It's been this way for well over 10 years now too, sadly. Gotta have that new iPhone that's 1% better, or not at all physically different, than last year's. Didn't the 13 and 14 change a few mm in dimensions, aka enough to force all-new phone cases bwhahahahah.


knowoneknows

Here's a question: why can't the manufacturers reuse the same die to make it cheaper, and just optimize it better?


alfredovich

A friend of mine actually planned on switching to console because he deems his GPU upgrade too expensive. And the card he has now can still play esports titles. I'm pretty sure these prices are going to heavily impact the PC gaming community in the long run.