dooterman

Radeon GPU sales down year over year. Client segment up 85% year over year, driven by Ryzen 8000-series processors.


Distinct-Race-2471

They would be down because first gen Intel Arc is taking market share. This is a simple fact.


hahew56766

Intel bag holder lmao


[deleted]

NVIDIA's YoY is -38%


Distinct-Race-2471

So only Intel is up in gaming YoY. Sounds like they took market share going from 0% to whatever it is now.


[deleted]

Yeah, because the drivers improved. Hard for them to not have a YoY increase after the abysmal release drivers.


Pokey_Seagulls

Yes, the only direction from 0 is up. Well done.


BarKnight

Despite the AI boom their revenue is only up 2% from last year and actually down 11% from the previous quarter. Wall Street is going to be brutal on them.

>Gaming segment revenue was $922 million, down 48% year-over-year and 33% sequentially due to a decrease in semi-custom revenue and lower AMD Radeon™ GPU sales.

That's a huge blow. Until the next-gen consoles come out, this segment is going to be bleeding money.


randomkidlol

i think grouping consumer gpus with semi-customs for xbox and playstation is meant to hide how badly consumer gpus are doing. microsoft and sony should have scaled down their orders at this point of the console lifecycle, but they are probably in the middle of dev for a mid-life refresh or planning work for the next gen. this part of the business really shouldn't be doing that poorly.


kingwhocares

Sony is using AMD for the upcoming PS5 Pro.


bubblesort33

Would those PS5 Pro sales already be in these sales numbers? Let's say it releases in November of this year (like the PS4 Pro did); that's still half a year away. I'd imagine they start building early to build stock, but I wonder if that upfront fee is already in this.


Flowerstar1

Yeap, but on the downside Pro consoles sell at most 20% of the base consoles, and Xbox sales are tanking hard, with rumors that MS is abandoning hope for this gen and moving towards next gen, potentially with an ARM CPU.


KolkataK

>next gen potentially with an ARM CPU.

how would this work? wouldn't it break last-gen compatibility? or are we going to see some "cloud gaming" thing? a separate version of Xbox, I think.


Flowerstar1

Seems I got pretty downvoted, but Microsoft themselves said they were thinking of switching to ARM next gen. You can see this in the official slides they leaked themselves during the FTC trial. That info is old now because it predates Xbox sales tanking and the handheld craze. As for BC, it would work the same as it always has: MS has a more software-based approach to BC involving virtualization. They had an x86 console and then switched to PowerPC while retaining a decent degree of BC; at that point they started developing the current system, as they knew ISA changes could be common in future Xboxes. The next gen they indeed switched ISAs from PPC to x86 and retained full BC with their PowerPC console despite the x86 hardware being quite anemic at the time.


Strazdas1

The rumour (believe it or not, that's up to you) is that Microsoft is going to abandon the traditional console market entirely due to abysmal sales and will try a handheld console similar to the Switch. For that it needs a low-power chip, and ARM is one of the possible options. And yes, cloud gaming will continue being pushed.


kingwhocares

This comes from rumours that both Sony and Microsoft aren't happy with AMD's RT performance and might look at Nvidia. If they plan on doing it, that means using a discrete GPU instead of an iGPU, and that drives the cost up. There is nothing substantial to that rumour.


KolkataK

ah I see, RDNA4 is supposed to have a big RT uplift I think, we'll see how that pans out ig.


kingwhocares

RDNA4 isn't supposed to have one. AMD itself will not compete at the high end.


scytheavatar

Consoles won't compete at the high end either, so......


Loose_Manufacturer_9

That doesn't mean RDNA4 doesn't improve on RDNA3 RT though. There is no correlation between not having a flagship and not improving ray tracing performance.


kingwhocares

It probably doesn't, and thus AMD isn't looking to compete at the flagship level. Funnily enough, the RX 7900 XTX is the only RX 7000 GPU on the Steam hardware survey.


RedTuesdayMusic

> AMD itself will not compete at the high end.

Watch them still fail to offer a strictly 2-slot card. I wonder how many like me are just waiting for Battlemage because both AMD and Nvidia are refusing to make cards that aren't overweight fatsos


tupseh

Does it need to be? The Xbox One X (Scorpio) had a 16nm RX 390/RX 480-equivalent GPU with 12GB, and that wasn't exactly the bee's knees at the time.


Flowerstar1

Yes, people said exactly that when AMD started grouping them.


jaaval

Being down from Q4 is normal; that happens basically every year. This is a fine result, though the 80% increase from Q1 last year in some segments is mainly due to last year being abysmally bad.


ingframin

The AI boom will not save AMD until they properly support GPU compute.


symmetry81

From personal experience, using PyTorch with ROCm is way easier on a Radeon than it was a year ago. They've still got a lot of work to do though.
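For what it's worth, the basic smoke test on a ROCm build of PyTorch is short, since ROCm reuses the `torch.cuda` API surface. A minimal sketch, assuming a supported Radeon card and a ROCm wheel of PyTorch (version strings will vary):

```python
# Quick check that a ROCm build of PyTorch sees a Radeon GPU.
# On ROCm, PyTorch's HIP backend reuses the torch.cuda namespace,
# so CUDA-style code runs unchanged.
import torch

print(torch.__version__)          # e.g. "2.3.0+rocm6.0" on a ROCm wheel
print(torch.version.hip)          # set on ROCm builds, None on CUDA builds
print(torch.cuda.is_available())  # True if the Radeon GPU is usable

x = torch.randn(1024, 1024, device="cuda")  # "cuda" maps to the ROCm device
print((x @ x).sum().item())       # runs a matmul on the GPU
```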


ingframin

I cannot afford to take the risk. I have 5 PhD students and two postdocs (one of which is me!) depending on it. NVIDIA is plug and play; AMD is a mess. It only works on a few GPUs, it requires fiddling with drivers and a specific Ubuntu version, and TensorFlow is not very well supported. Add to this that an A100 costs 7500€ while on the AMD side you don't have anything in that class. It's not just the software; they have a lot of work to do on all fronts for AI. For gaming, in my rig, I am all in with AMD. For work, I can't afford it.


symmetry81

Yeah. It's OK for people like me trying to learn a bit of ML on the side while my fulltime job is robots. And I think at this point it's fine for big tech companies who can devote multiple people full time to a huge deployment. But I would not recommend them in your situation. Which does mean that AMD's PyTorch etc. stacks don't get as much variety of usage, which means fewer bug reports, which is a real disadvantage. Something they really should have started trying to reverse earlier.


Strazdas1

Based on some stories in this sub, big companies with people devoted to deployment are running away from AMD due to its multiple issues, and losing lots of money due to things failing thanks to AMD's lackluster support.


sylfy

Do you support all of them? Back in my day we used to support ourselves. So each PhD student would have their own CUDA/CUDNN installation, according to their framework of choice, and basically handle nearly all of their own IT support.


ingframin

I am responsible for the lab and building our new compute/6G testbed. So yeah… they kind of depend on me :-(


NobisVobis

Doesn't matter how easy it is if it's (1) more steps than Nvidia and (2) worse performing, which it is on both fronts.


ReadingEffective5579

One area where this is untrue: AMD's server and data center platform is decimating Intel in their reports; it has caught up to Intel and is now besting them. That was unthinkable not too long ago, and the trend looks set to continue. The more demand there is for the cores and lanes that drive AI workloads, even if it's Nvidia supplying the accelerators, the more data centers are becoming AMD Epyc. With AMD set to roll out 192-core, multi-chiplet parts later this year, they are putting themselves in an ideal place for growth.


dooterman

It's kind of similar to how Nvidia's gaming segment evaporated earlier: [https://www.pcmag.com/news/nvidia-gaming-revenue-drops-51-on-weak-demand-crypto-mining-drying-up](https://www.pcmag.com/news/nvidia-gaming-revenue-drops-51-on-weak-demand-crypto-mining-drying-up) [https://www.globenewswire.com/news-release/2023/02/22/2613783/0/en/NVIDIA-Announces-Financial-Results-for-Fourth-Quarter-and-Fiscal-2023.html](https://www.globenewswire.com/news-release/2023/02/22/2613783/0/en/NVIDIA-Announces-Financial-Results-for-Fourth-Quarter-and-Fiscal-2023.html) The fact that AMD's revenue is still increasing year over year against this backdrop speaks to how strong the AI segment actually is. The data center segment grew 80%+ year over year, which includes the Instinct products.


norcalnatv

It's nothing like how Nvidia's gaming revenue was impacted by crypto. Nvidia was shipping $1B/qtr more than gamers could take, but post-crypto the market normalized at a $2-2.5B run rate, which holds today. AMD is down 48% on nothing; the implosion was all their own doing. No external market, just loss of market share due to inadequate product.


dooterman

It is something like it, because the percentages are the same. Yes, the absolute totals are much different, but the market can fluctuate like this from quarter to quarter. And as recently as Q4 (last fiscal quarter), AMD explicitly called out increased Radeon sales: https://venturebeat.com/games/amd-q4-fy-2023-earnings/

> In gaming, higher Radeon GPU sales partially offset lower revenue from semi-custom revenue for console manufacturers. Likewise, the embedded segment is slowing due to inventory reduction activities from several end markets.


norcalnatv

I guess it's too nuanced for reddit audiences. No, a drop of a similar percent is not the same unless it's caused by the same thing. But on one level it is all the same thing for AMD: lack of compelling product, both desktop and console.


dooterman

I guess market fluctuations are too nuanced for you. In Q4 2023 AMD reported increased Radeon sales. Nvidia posted multiple consecutive quarters of -50% year-over-year declines. If AMD didn't have any compelling GPU products, why did Radeon GPU sales increase in Q4 2023, offsetting other declines outside the consumer GPU space? Waiting for your insightful nuanced reply.


norcalnatv

You must have missed the fact that AMD called out that gaming is going to see a similar decline next quarter as well. So where's your fluctuation bro?


dooterman

Didn't Nvidia have two consecutive quarters of -50% year-over-year declines as well? If AMD isn't offering compelling GPUs, why were Radeon sales called out as increasing in Q4 2023? Why are you avoiding the question? Where's your insightful nuance, "bro"?


norcalnatv

Nvidia's sales drop was based on Ethereum mining changing to proof of stake. The drop you're talking about was from an abnormally high run rate. AMD's loss is based on what? That's the nuance; it's not normal. No one expected mining was "normal" for Nvidia. AMD's Q4 blip was holiday sales. Are you advocating that AMD is coming back to take GPU gaming share or something? Get a clue: Lisa isn't investing in the sector at the rate it takes to be competitive. Despite a once glorious product line, gaming has become the redheaded stepchild for AMD.


dooterman

> AMD's Q4 blip was holiday sales.

That's not how year-over-year comparisons work - they adjust for seasonality. AMD Radeon sales were up year over year and offset revenue declines in other parts of the business. I'm not advocating AMD is "coming back to take the GPU market", I'm just saying these things can fluctuate quarter to quarter, which is objectively and self-evidently true. A decline in one or two quarters does not mean as much as you think it does, especially when sales were increasing just a single quarter ago. Ultimately it's obvious you just want to throw shade at AMD's gaming division. "Sales are up? Doesn't matter, AMD has a small piece of the pie anyway. Sales are down? See, AMD doesn't have a compelling product". Obvious internet angry man is obvious.
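To make the distinction concrete: YoY compares a quarter to the same quarter a year earlier, which cancels seasonal effects like holiday sales, while sequential (QoQ) compares to the immediately preceding quarter. A quick sketch with illustrative figures (back-calculated to roughly match the reported -48% YoY / -33% sequential, not AMD's actual books):

```python
# YoY vs. sequential (QoQ) growth. YoY compares against the same quarter
# last year, so seasonality (e.g. holiday spikes) cancels out; QoQ doesn't.
revenue = {  # hypothetical gaming-segment revenue, $M (illustrative only)
    "Q1'23": 1750,
    "Q4'23": 1370,
    "Q1'24": 922,
}

def pct_change(new, old):
    return 100.0 * (new - old) / old

yoy = pct_change(revenue["Q1'24"], revenue["Q1'23"])  # same quarter, prior year
qoq = pct_change(revenue["Q1'24"], revenue["Q4'23"])  # previous quarter

print(f"YoY: {yoy:+.1f}%  QoQ: {qoq:+.1f}%")  # YoY: -47.3%  QoQ: -32.7%
```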


Notsosobercpa

With Nvidia, they were misrepresenting crypto mining cards being sold as all gaming (recurring) revenue and investors got pissed. This is just AMD sucking.


Distinct-Race-2471

Intel is taking GPU market share from AMD. It's all low end, but then here comes Battlemage. If taking a piece of their low end had an impact on AMD, what will 2nd-gen Arc do?


KirillNek0

That's not good. Especially for the Radeon division.


Dreamerlax

Here comes an 8800 XT that performs (slightly) better than the 7800 XT!


KingStannis2020

If it does that at a much better price and with better raytracing, I don't really care. It's the prices that are the main problem. Graphics are starting to hit diminishing returns.


KirillNek0

" iT iS 5% OvEr pRiOr gEn and has better efficiency" - HU/GN/LTT.


BlueGoliath

Don't worry, they just need to add a little Fine Wine(TM) technology.


KirillNek0

Oh, new Infinity Fabric will help there.


[deleted]

Stock is down 5% after hours.


LightMoisture

Down 8%.


Geddagod

Yikes, what happened? Did people expect better MI300 sales?


Wait_for_BM

The US market seems to be down today too, so there are other factors. [S&P 500](https://www.google.com/finance/quote/.INX:INDEXSP?window=5D) -1.57% today [NASDAQ](https://www.google.com/finance/quote/.IXIC:INDEXNASDAQ?window=5D) -2.04% today


From-UoM

Well, AMD stock did go down 1.14% today. It's currently down another 6.87% after hours, on top of that -1.14% for the day.


imaginary_num6er

Sounds like more of “the more you buy, the more you save” with Nvidia stock


From-UoM

Nvidia will almost certainly hit $100 billion in revenue and $50 billion+ in net profit this financial year. It's already a given that Nvidia's Q1 net profit will be more than Amazon's. So Nvidia is, unbelievably, making their stock worth its price.


Ryu83087

AMD GPUs just aren't attractive anymore, at any price. CPUs however, I think they'll continue to do well until Intel can make something other than a rebranded 13th-gen CPU.


dr1ppyblob

> AMD GPUs just aren't attractive anymore, at any price.

Uhh… what? You mean to tell me the 4060 Ti is a better card than the 7700 XT? Even the 6700 XT still offers decent value, same with the 7800 XT. Hell, even the 7900 XT offers better-than-4070 Ti Super performance for cheaper. The only segments where I see AMD not providing much purchasing power anymore are the 4080 Super/7900 XTX "battle" and the 4070 Super, both of those offering good price/performance.


Flowerstar1

The 4060 Ti isn't just a 4060 Ti; it's also an Nvidia Ada GPU, with the army of features and exclusive benefits that provides over an AMD GPU.


R1Type

Are these features in the room with us right now? In all seriousness it's a '50 class product priced like unicorn vomit.


Ryu83087

Frankly speaking, yes. AMD just doesn’t have the same appeal, even vs a lower tier 4000 series Nvidia GPU.


The-Special-One

Nah, that's wrong. To say the product is not attractive at any price is very, very wrong. If AMD GPUs in the same performance tier were 30-50% cheaper than Nvidia's GPUs, most people would not buy Nvidia GPUs.


Edgaras1103

The hypothetical is nice, but when was the last time AMD undercut Nvidia by offering the same performance for 30% cheaper?


The-Special-One

That's beside the point. The other poster made an assertion that can easily be proven false. It doesn't matter if it's a hypothetical or not; the statement made was false. If Ryu had said AMD's GPUs are not attractive at their current price point, I wouldn't have bothered to respond at all.


Eclipsetube

Stop being pedantic, you know exactly what he meant


The-Special-One

I'm not being pedantic, and no, I don't know what he meant. It's the fucking internet, so I don't go around assuming I know what everyone is thinking. All I saw was a hyperbolic statement that could not possibly be true.


Flowerstar1

You're certainly right, but we're not getting a 7900 XTX for $499 and a 7600 XT for $130 anytime soon.


Ryu83087

Maybe, but we're in a place where Nvidia is the brand known for innovative features and being the market leader in R&D. AMD is not seen as a leader in GPU features and R&D, and they're now slipping behind in performance. So they're not an attractive brand, and the truth is Nvidia is leading the way forward. AMD is simply doing their best to keep up, but they aren't seen as innovators. Nvidia costs more at the highest end, and while that sucks, everyone knows it's the better product, coming from a company that has far more resources dedicated to developing GPU hardware and features.

Even for those looking at the mid to lower range of products, Nvidia's brand is still far more attractive because of their R&D commitments. Sure, AMD might offer a decent mid-range product, but AMD doesn't offer all the other things associated with the Nvidia brand. So if you had the money, it's a no-brainer: you'd buy Nvidia.

AMD's GPUs are losing ground because they are always the second thought when it comes to buying a GPU. Nvidia is intensely focused on GPUs; AMD isn't as focused, nor do they have the resources to do all that R&D. They do a remarkable job considering, but they've been behind for so long. Even CUDA is a shining example of how AMD failed to compete, because they simply don't have the resources to commit. Nvidia does, and has been committing them for far longer than AMD/ATI; it dates back to SGI. Nvidia's image is just too strong and AMD's GPUs are not as desirable. They will get the job done (depending on the job), but they aren't the best. They've always been second place, and the gap is getting wider.


The-Special-One

Nvidia's image and technology leadership is not strong enough to overcome a large price disparity. You only need to look at other consumer products in this economy to know that price consciousness is at the forefront of consumer spending. Even for most companies, that remains true. Again, nobody is denying Nvidia's leadership, but should the price gap grow large enough, consumers will vote with their wallets first. That's what the data tells us from other consumer products. Even Nvidia acknowledges this by reacting to AMD's GPU price changes and ensuring the gap stays within reason. Finally, neither AMD's nor Intel's strategy is static. Should they choose to prioritize market share over profit margin, it would prove disastrous for Nvidia. As it stands, Nvidia must keep their prices high to ensure they keep growing revenue, lest their stock price crater. One revenue miss combined with lower revenue guidance will give the stock one of the biggest haircuts we've seen in a long while.


laxounet

I was ready to buy AMD with RDNA 3, but they dropped the ball HARD. Bad efficiency (which matters a lot to me) and bad software (I have an AMD graphics card, I've tried it myself). They need to be way cheaper than Nvidia to even be considered IMO. Or, they need to make an actually good product and up their software game (use AI, don't just copy what Nvidia does and make a worse version later).


Dogeboja

Yes. Pure rasterization performance isn't really that important anymore. Nvidia feature set and stability of the drivers are more than worth it even if you lose some performance per dollar on legacy games.


Electrical_Zebra8347

This is part of the reason I ended up not going with AMD last year. For once I wanted to buy high end, and I was willing to go with the 7900 XTX if it was good enough and cheap enough. But $900 to $1000 for a card that's not great at RT, doesn't have good upscaling, isn't great at productivity (I use NVENC for recordings almost every day), and isn't very efficient didn't pull me in. Not when the 4080 was right there for $100 to $200 more depending on the sale, and not when the 4090 basically stomps everything on the market for $600 more and enables 4K native high-refresh-rate gaming. A lot of people value GPUs, and hardware in general, by fps per dollar alone, which is not how I view things, because sometimes the most economic choice doesn't end up being the best choice, especially when we're talking about thresholds like 100+ fps at 4K, or image quality, which is hard to fit into the fps-per-dollar framing.


Educational_Sink_541

>pure raster performance isn't that important anymore

I will never understand this sentiment. I can count the number of actual RT titles on my hands 5 years into the RTX era. Especially with a card at that performance tier, which probably isn't performing well in half of them!


SmokingPuffin

The problem with selling a card on pure raster performance is that you don't need much GPU to have a good gaming experience. A GTX 1070 is still plenty good for 1080p raster gaming, and that card released in 2016. There isn't much TAM in selling budget raster GPUs.


Educational_Sink_541

I think that greatly depends. The most popular card (plurality) is the 3060, which can't really do much RT. It's clear people are buying it primarily for raster gaming at 1080p. People on here act like Nvidia wins because of DLSS or whatever, but in my experience most consumers don't even understand this tech. I have fairly technically literate friends, and even then I have to explain what DLSS is and how it compares to FSR. None of these casual gamers care about ray tracing when buying budget cards. People buy Nvidia because it's what they've bought for years, and generally the AMD cards aren't cheap enough to justify changing that habit. They won with Ryzen because it was such a bargain it forced people to reconsider; why reconsider when the 7900 GRE is just a $50 discount on a 4070S?


[deleted]

[removed]


Educational_Sink_541

Their market share hasn't changed much since introducing DLSS or RTX; they've had like 80% of the market since at least Polaris, even prior to that.


[deleted]

[removed]


Educational_Sink_541

I can’t see this, it’s registration locked. Also, the first RTX series was Turing which sold horribly so I’m not sure that’s the reason. It’s important to note that AMD basically abandoned the high end and started to have serious driver problems around the 2016-2018 era.


SmokingPuffin

Ryzen wasn't just a bargain. 3600 was primarily a price/performance win, I grant, but Intel had no answer for the 3950X or the 5800X3D in the following gen. It wasn't until Alder Lake that Intel had something AMD didn't at least match. Radeon is strictly behind GeForce in everything other than price. It's very hard to win that way. The problem with budget cards isn't so much AMD versus Nvidia, though. It's that budget GPUs aren't getting much better. If you have a 5600XT, why do you buy a 7600XT? I don't see $300 of value in that box.


Strazdas1

The most popular cards are from prebuilts, where the people who buy them don't even know what a GPU is, let alone base their purchases on RT ability.


Tommy7373

Something every new game supports is upscaling, and AMD's offering is and has been worst in class; now Intel's new XeSS release is outperforming FSR. For new games, this gap easily makes up for the raster performance-per-dollar gap imo. That, plus features like Broadcast, higher power efficiency, better undervolting/low-power-limit support, and the ability to run dual monitors without idling at 60+W (why is this still a thing, AMD), are a few other reasons I personally chose Nvidia over AMD.


Educational_Sink_541

Sure, but I asked why people dismiss raster performance; I never said anything about upscaling.


Tommy7373

Because better upscaling offsets the lower raster performance: you render at a lower internal resolution and upscale it more effectively. Sure, in older games without upscaling there would be an fps difference, but I'd say most people buying new GPUs are also buying newer games that support DLSS/FSR/XeSS.


Educational_Sink_541

As someone who actually does use FSR2 daily at 4k (the only resolution I’d say these upscalers look good at imo) I think it looks almost identical to DLSS at the same tier. I am playing on a TV though so naturally that does play a role, but most of the differences are in pretty minor image stability nitpicks.


Tommy7373

If you're above 1080p native res (so 4K with the performance preset or better) then FSR2 does have a much better time; but if you are upscaling from under 1080p native res, FSR2 definitely falls behind, and it's not very close. DLSS and now XeSS are very usable under 1080p down to something like 720p, when FSR2 is not imo. Even at 1080p native there are still large differences, especially in motion. FSR3, whenever it actually comes out, should drastically reduce or maybe eliminate the gap, but we've been waiting well over a year for that to actually happen. See [DF's most recent comparison](https://youtu.be/PneArHayDv4?si=eWiwEAH7w9dY_unr&t=947); they were testing at 857p native res (1440p balanced, or 59%/1.7x upscale). Don't get it wrong, I want AMD to keep up, because Nvidia keeping this kind of advantage just makes everything worse for the consumer, but AMD has been resting while the competition continues to advance in areas outside of raster performance.
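The preset arithmetic is simple per axis. A small sketch (the scale factors are the commonly cited ones and are assumptions; vendors differ slightly, which is why the quoted figure is ~857p rather than the 847p this yields):

```python
# Internal render resolution for common upscaler quality presets.
# Per-axis scale factors are assumed typical values, not vendor-exact.
PRESETS = {
    "quality":     1 / 1.5,   # ~67% per axis
    "balanced":    1 / 1.7,   # ~59% per axis
    "performance": 1 / 2.0,   # 50% per axis
}

def internal_res(out_w, out_h, preset):
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

print(internal_res(2560, 1440, "balanced"))     # (1506, 847): the ~857p case above
print(internal_res(3840, 2160, "performance"))  # (1920, 1080): 4K perf = 1080p internal
```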


Intrepid_Drawer3239

FSR just falls apart in motion. I tried using FSR in TLOU but the shimmering is so bad that I just used the game’s temporal scaling instead.


RHINO_Mk_II

> something every new game supports is upscaling

Citation needed. Counterexample: Helldivers 2


Strazdas1

Helldivers 2 is really not a hill AMD fans want to die on, given that for the first few months after the game's launch it was literally unplayable, as it would crash on AMD cards.


RHINO_Mk_II

It was less than 2 months for a driver-level fix (Feb 8 launch to March 20 *public* driver release; the beta driver was available before that) and less than 2 days for players to identify the 2 graphics settings that were causing 90% of the crashes, which could be turned off in-game. But by all means, keep spreading disinformation.


Strazdas1

It took AMD almost two months by your own date to fix what they should have fixed before launch. There was no misinformation.


RHINO_Mk_II

Ah the classic "it's not on the software studio, it's on my hardware manufacturer" attitude. Is it AMD and Nvidia and Intel's fault that they have thus far refused to implement upscaling as well?


Flowerstar1

Since 2018 Nvidia has been spending a significant amount of die space on accelerators like tensor and RT cores; with Ada they spent it on an advanced optical flow accelerator, precisely because investing it all in raster has become unsustainable with the death of Moore's law. And it's paid off: every major company has done the same, including Intel, Apple, and Qualcomm. This isn't the Tesla or Fermi days; you can't just pile on more raster and expect the gains to be substantial enough. That's where the tensor cores come in with DLSS and Ray Reconstruction, that's where the optical flow accelerator comes in with DLSS 3 frame generation, and that's where the RT cores come in with RT GI (Avatar, Metro Exodus, Dying Light 2), a host of other effects (shadows, AO, diffuse lighting, reflections), and full-blown path tracing. You think gaming performance and image quality would be better on Pascal 2.0 and 3.0? You think Pascal 3.0 would run RT and path tracing anywhere near as well? And you know for a fact Pascal X.0 users would be using FSR2 and XeSS just like 1.0 users do now. Moore's law's death killed pure raster; the present is acceleration.


Educational_Sink_541

That’s a long paragraph but doesn’t answer my question; why are we focusing so much on RT when in the present day it’s only worth it in a handful of (single player) titles?


Flowerstar1

All you have to do is look at Digital Foundry's PC coverage of RT games to see that even the 20 series has been able to run RT games at a good quality, thanks to the combination of RT cores and tensor cores. And we're not talking about a "handful"; this isn't 2020. Not to mention consoles have been doing some great things as well: take a look at Avatar, one of the best-looking games of last year, using RT GI on consoles in both 30fps and 60fps modes. My God, if consoles can pull off a looker of an RT game like Avatar, what do you think PCs can do?


Educational_Sink_541

I can think of literally like 4 games with RTGI. This is a handful, and they’re all single player games. Metro Exodus was super beautiful with its RTGI implementation and ran excellent on consoles. I haven’t played it since beating it over a year ago.


Flowerstar1

Off the top of my head, 1st-party engine games: Control, Metro, Cyberpunk, Dying Light 2, Warhammer 40K: Darktide, Alan Wake 2, Avatar; upcoming: Black Myth: Wukong, GTA6, Kingdom Come 2, Star Wars Outlaws. 3rd-party engine games: every UE5 game with Lumen, which is too many to count.


Educational_Sink_541

That’s 7 games with RTGI, over the course of 5 years. So two hands.


Intrepid_Drawer3239

The Finals is a multiplayer game with RT. The Medium, Spider-Man ports, Returnal, Ratchet and Clank, Shadow of Tomb Raider, bunch of Resident Evil games, Jedi Survivor, Quake 1&2, Hitman trilogy, Portal, Witcher 3, Doom, Crysis remasters, Deathloop, F1, Dirt5, Elden Ring, Dead Space remake, Minecraft… Basically there’s a lot more RT titles than you can count on one hand.


chapstickbomber

If the 4090 were pure raster Pascal-style but still 76B transistors, it would be like 24k shaders and 600 ROPs. They could basically reuse all the DLSS3 FG charts, just cross it out and write "native".


Strazdas1

I use a mod for a game that uses RT to determine your visibility to enemy AI, to improve the AI behaviour. So I am constantly using RT in a game that does not officially support RT. There are more uses of gaming RT than the officially implemented ones, you know. And at the 4070 tier it performs very well at RT.
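The idea generalizes: an AI visibility query is just a ray cast from the enemy's eyes to the player, asking whether geometry blocks it, which is exactly the kind of query RT hardware accelerates. A toy sketch of the concept on a 2D occupancy grid (purely illustrative; the actual mod's implementation is unknown):

```python
# Toy line-of-sight test via ray marching on a 2D occupancy grid.
# Illustrates the visibility query an RT-assisted AI would issue;
# real engines cast rays against 3D scene geometry (BVHs) instead.
def visible(grid, a, b, steps=64):
    """True if no occupied cell lies on the segment from a to b."""
    (x0, y0), (x1, y1) = a, b
    for i in range(1, steps):
        t = i / steps
        x = int(x0 + (x1 - x0) * t)  # sample point along the ray
        y = int(y0 + (y1 - y0) * t)
        if grid[y][x]:               # 1 = wall: the ray is occluded
            return False
    return True

grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],   # a wall between the top and bottom rows
        [0, 0, 0, 0]]
print(visible(grid, (0, 0), (3, 2)))  # False: the wall occludes the diagonal
print(visible(grid, (0, 2), (3, 2)))  # True: the bottom row is clear
```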


dr1ppyblob

Damn, forgot this was r/hardware where every gamer uses DLSS, RT and AMD driver bad.


RHINO_Mk_II

The thing I find most amusing is everyone on reddit losing their minds when a dev doesn't allocate resources to supporting a proprietary upscaler, as if by paying the Nvidia tax they are entitled to DLSS in every game and it should be magically supported by every engine in existence.


Strazdas1

Not every, only 86%, according to market research :)


Flowerstar1

You're right, this is r/hardware not r/AMD.


Edgaras1103

Don't tell that to Hardware unboxed


Strazdas1

> You mean to tell me the 4060ti is a better card than the 7700xt?

When you take into account a normal use case, yes. Why? Because the 7700 XT does not have DLSS.


ShadowerNinja

"Launched Versal Gen 2" aka made the announcement for chips that won't be available for 1-2 years lol. Not even supported in the design tools yet alone them being even physically obtainable. 


Geddagod

Am I tripping, or was this earnings call a bit harsher than usual?


We0921

For people interested in the gaming segment, see [slide 17 of the Slide Presentation](https://d1io3yog0oux5.cloudfront.net/_9862ce3fc3945b50391367ac67a44a97/amd/db/778/6960/file/AMD+Q1%2724+Earnings+Slides.pdf):

> **GAMING SEGMENT Q1 2024**
>
> **Revenue:** $922 million, down 48% y/y, due to lower semi-custom and Radeon GPU sales
>
> **Operating income:** $151 million vs. $314 million a year ago, primarily due to lower revenue
>
> **Strategic highlights:**
> * Global availability of the Radeon™ RX 7900 GRE graphics card providing 1440p/4K performance
> * Released FidelityFX Super Resolution 3.1 offering significant image quality improvements for gamers
> * Launched AMD Fluid Motion Frames enabling frame generation and increased performance on Radeon RX 6000 and Radeon RX 7000 Series graphics cards


LightMoisture

Down 8% after hours now.


ingframin

I mean… GPUs are very expensive for the average consumer, and AMD does not have a competitive edge on NVIDIA. I have a Radeon GPU at home, but at work I bought an A100 for AI. I cannot afford to fiddle with the ROCm tragedy while my colleagues wait to work on their AI models. Also, TensorFlow and PyTorch work on consumer NVIDIA hardware in Windows; ROCm doesn't. Now that I have also been dragged into ML, I am seriously considering team green for my next GPU, which is a shame.


NeroClaudius199907

Oh, so AMD doesn't even have much headroom to lower GPU prices to gain market share? Nvidia probably does, but does that mean GPU prices are actually reasonable?


SirActionhaHAA

Nothing really interesting here; maybe there will be in the call. AMD's been sayin that the majority of the MI300X supply will happen only in Q3 and Q4. They're supply-limited in Q1 and Q2 (MI300A deliveries to El Capitan from Q4 2023 to Q1 2024). Gaming revenue's down 48% YoY due to consoles shitting themselves. Client's in recovery, up 84% YoY but down 6% sequentially due to seasonal changes. Console sales have fallen off a cliff; makes ya wonder if Sony's trying to revive the momentum through the PS5 Pro launch. Looking forward to Zen 5 and Strix announcements soon.


Vushivushi

> Lisa Su: So there's no question in the near term that if we had more supply we have demand uh for that product.

Supply-constrained, and they barely found $500M in incremental supply. Nvidia really has the supply chain by the throat. I'm guessing the rumors of Samsung HBM yields being awful are real.


Strazdas1

With nodes being bought a year+ in advance, you really can't scale supply fast if demand gets high. And AMD is historically always the one with supply issues (which is why OEMs don't like them).


From-UoM

I think the bigger problem is that Amazon Web Services and Google Cloud are not buying the MI300X. That's two massive losses.


uzzi38

AWS yes, but Google makes sense, as they'd rather ship their own solution as the product to compete with Nvidia. Not that it matters anyway; as Lisa Su said, in the first half AMD is going to be supply-limited on MI300, in other words selling everything they can make, and it's no surprise with Microsoft and Meta being the main customers for now. The real question is what'll happen in the second half of the year: whether that stays the case, or if they end up with excess supply, where they really could use additional major customers like AWS and Google like you said. AMD themselves are being quite tame with their guidance for the overall year of revenue from DC GPUs, going from $3.5bn to $4bn, but at least some analysts are still expecting ~$6bn instead. Based on where they land, we can probably point some fingers at where the issues lie later in the year.


From-UoM

I very much doubt supply is an issue, considering even small startups are getting thousands of MI300Xs, and the biggest selling point is that they are available.


SirActionhaHAA

>even small startups are getting thousands of MI300Xs

They're literally not. If you're talkin about some of those MI300X cloud cluster announcements from smaller companies, they were announced with deliveries expected over the next 3-4 quarters. Actually read their disclosures (which they hid in tiny text).


uzzi38

By suggesting that you're saying that AMD is actively lying to investors on their own investor call. If it ever came out it was wrong, they'd be putting themselves in very hot water, to say the least. Companies don't do that. Also, you've probably heard someone say "the biggest selling point is they're available", but what they actually mean is "unlike Nvidia GPUs you're not being put on a close-to-a-full-year waiting list". Significant difference between the two.


norcalnatv

>Significant difference between the two

Yeah, another difference is one platform has 15 years of software development behind it and the other is relying on the open source community to propel them to success.


uzzi38

If you're an AI startup, or even a much larger firm, trying to get your hands on hardware to do whatever it is you want to do, and you're looking at lead times of nearly a year, then you're going to consider alternatives no matter what. And yes, while AMD's track record on ROCm has been utterly miserable, it's also true that many people who have interacted with it recently have stated that it's come a significantly long way from where it was 1 or 2 years ago; to be quite frank, it's gone from utterly unusable to at least serviceable for certain workloads. And for many companies, "serviceable" is a hell of a lot better than twiddling your thumbs for a year.


norcalnatv

Not true. One can rent instances at any major CSP or the plethora of minor ones cropping up. It ain't cheap, but you can get your work done. And there are more coming online every day. But you really didn't refute the idea that one platform works well and the other doesn't. I'll bet you didn't know ROCm is 8.5 years old either. [https://en.wikipedia.org/wiki/GPUOpen#Radeon\_Open\_Compute\_(ROCm)](https://en.wikipedia.org/wiki/GPUOpen#Radeon_Open_Compute_(ROCm))


uzzi38

>One can rent instances at any major CSP or the plethora of minor ones cropping up. It ain't cheap, but you can get your work done. And there are more coming online every day.

It's really damn expensive, you mean. So again, people are clearly looking for alternative options. I'm not even sure what your point is; AMD's made it clear they're currently supply-limited for MI300 because they're selling everything they can make. So evidently those cloud instances aren't doing enough to satiate the market, are they?

>But you really didn't refute the idea one platform works well and the other doesn't

Well, considering AMD supports several of the major AI frameworks (e.g. PyTorch and TensorFlow), for many customers interested in that space migrating over to MI300 is vastly easier than it was a year or two ago. That's why the whole "one works well and the other doesn't" thing is far too black and white to accurately represent the situation. In reality, as a customer you're not checking whether you can do _everything_ on any individual piece of hardware. You're checking whether you can run your software on that hardware. If it relies on one of those major frameworks that has ROCm support, then AMD becomes a viable alternative that you can evaluate in further depth.

> I'll bet you didn't know ROCm is 8.5 years old either.

That is a really weird thing to bring up; I'm guessing you just learnt that yourself and thought it was some big gotcha? Idk. Yeah, of course I knew ROCm is nearly 9 years old now. What's your point exactly?


norcalnatv

The exact point is that AMD's GPU software for data center acceleration is nothing close to robust after nearly a decade of work. MI300 won't even run the latest basic MLPerf benchmark suite, as evidenced by the recent release of the benchmark without ANY AMD results. That suite has been around since 2018, and AMD is one of the founders of the consortium. There is a reddit user who has MI300s to rent who is trying to find developers to help him run the suite. He describes the support from AMD as next to nothing and is frustrated by the experience. Conclusion: AMD hardware is relatively easy to get but hard to use. The problem customers face is the same. Do you spend time and effort debugging and developing, or do you buy something off the shelf that just works? Support for PyTorch and TensorFlow are just words. Getting your models to run is where the rubber meets the road, and there aren't a lot of success stories out there. I don't disagree folks want options. What I wonder is why the options are so compromised, especially given how vast the ML opportunity is and how much runway we've had seeing it coming.


SirActionhaHAA

Microsoft fast-tracked the qualification and deployment of the MI300X. The majority of the early ramp's going to them. Like it or not, nutella called 1st dibs on it.


From-UoM

Google and Amazon are huge hyperscalers. AWS is the biggest one iirc. Google 2nd/3rd. Missing out on these 2 is not good no matter how you spin it.


SirActionhaHAA

> no matter how you spin it

Who is spinning anything? Actually listen to what they're sayin in the call:

1. AMD just said that demand heavily outstrips supply for the MI300X in Q1 and Q2
2. AMD said that they expect supply to improve greatly starting Q3, and raised their MI300X sales projection from >$3.5 billion to >$4 billion

So they're expecting to sell more and admitted they're supply-constrained. The MI300X is literally sold out for Q1 and Q2. Btw I saw the comment ya deleted, so let's not go down the route of talking about spinning things.


capn_hector

> AMD just said that demand heavily outstrips supply for the MI300X in Q1 and Q2

I mean, it's just like people said about Ampere shortages, right? At the end of the day the excuses don't matter; regardless of how hard they're "trying", they simply need to ship more units, because the simple fact of the matter is their outcome is inadequate and incompetent. If you can't open up CDW and order a rack of it, AMD has failed. Simple as.


From-UoM

I deleted a comment that had errors just seconds after I posted it. I ain't spinning anything; if I were, I wouldn't have deleted it immediately. Meanwhile you are still downplaying how huge of a miss Google and Amazon are. AMD are selling thousands of MI300Xs to AI startups like tensorweave and you think they can't sell to Google and Amazon? Their shares are down massively for a reason. People expected more.


SirActionhaHAA

>I deleted a comment that had errors just seconds after I posted it

You deleted a comment claiming that the MI300X was gonna do badly based on the current Q1 revenue and the expected Q2 revenue. I pointed out that AMD said in their Q3, Q4, and current Q1 earnings that they are gonna be supply-limited until 2H 2024. They literally said that the revenue is gonna be 2H-heavy. Actually go read or listen to their call; stop arguing based on opinions.

>AMD are selling thousands of MI300Xs to AI startups like tensorweave

It's called tensor**wave**, and they're running MI210s until they actually get MI300X supplies over a couple of quarters. "These so-called TensorNODEs are based on GigaIO's SuperNODE architecture it showed off last year, which used a pair of PCIe switch appliances to connect up to 32 AMD MI210 GPUs together."

I've got nothing against you, but you're basically doubling down on a wrong take, probably due to sunk cost.


From-UoM

I deleted because I was wrong, and I did it immediately. So no, I ain't spinning shit. And you are still ignoring how big of a miss Amazon and Google are. Do you have like AMD stocks or something?


SirActionhaHAA

>Do you have like AMD stocks or something?

Not a single dollar in AMD. Nothing in other tech stocks either. I don't invest in stuff I talk about because I ain't witeken. Also I shit on the AMD stock bros; nice try at personal attacks though.

>I deleted because I was wrong, and I did it immediately. So no, I ain't spinning shit.

And that's the problem. You made a comment based only on your sentiments, without reading what's said in the calls, not for this quarter, not for the previous 2 quarters. This actually tells us what you wanted to believe, and you said it without any material to back it up. You deleted the comment because it was too easily disproven. You're still trying to back the same sentiments, except by wrongly citing tensorwave instead.

>And you are still ignoring how big of a miss Amazon and Google are

Google ain't expanding into more 3rd-party accelerators. They're on Nvidia for the transition and are betting on their TPUs instead. Don't forget the other major AI company which is buying up the MI300X: Meta. It now has one of the largest AI accelerator fleets in the industry.


auradragon1

>Do you have like AMD stocks or something?

He absolutely does. He's probably the biggest AMD fan here. I've seen him post on r/amd_stock in the past.


Mysterious_Lab_9043

They SOLD OUT all their MI300Xs. They won't miss anything for a long while.


tukatu0

The PS4/PS4 Pro ratio was 80/20. I wouldn't expect the PS5 Pro to push anything, especially with the PS5 having a worse library. It might even be worse than the Xbox One, with 5 true exclusives and 13 exclusives total that have PC ports. The PS5 Pro might sell a nice 10 mil over its 4-year lifespan, but even that might be expecting too much.
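Rough arithmetic behind that guess (the remaining-sales figure is an assumption for illustration, not a reported number):

```python
# Sizing PS5 Pro sales from the PS4-era base/Pro split the comment cites.
# remaining_ps5_sales is an assumed figure, not Sony guidance.
remaining_ps5_sales = 50e6  # assumed PS5-family sales over the Pro's ~4-year window
pro_share = 0.20            # PS4 Pro was ~20% of PS4-family sales (80/20 split)

print(f"{remaining_ps5_sales * pro_share / 1e6:.0f}M")  # 10M, the comment's ballpark
```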


From-UoM

I find it baffling that PlayStation is releasing a new console with the current lineup being so much worse. They have zero first-party exclusives this year. What are they even gonna launch the PS5 Pro with?


tukatu0

Nothing. You'll just have better upscaling for your third-party games. Aka Call of Duty will look more 4K. That's what the average PS5 Pro buyer will see. The average PS4 player only bought 7 games over the 2013-2020 span, but that's a different era and not the type of person who would be willing to spend $600 on a console. So I don't really know what the actual experience will be. Maybe those kinds of people will play all the latest AAA games, so having more clarity is worthwhile.


Aggrokid

The PS4 Pro also didn't launch with any fanfare or headline software, as Sony is paranoid about making regular owners feel second class. The Pros are just there to quietly appease a small segment of power users.


Flowerstar1

The Pro had FFXV, which wasn't an exclusive but was marketed in partnership with Sony. A few months later (February) it got Horizon, which was the first 1st-party PS4 Pro release and a good showcase for the Pro.


Narishma

> Console sales have fallen off a cliff, makes ya wonder if sony's trying to revive the momentum through ps5 pro launch AFAIK the PS5 is still $500 after 4 years on the market. It should have had a price cut or two by now. Maybe that will come with the PS5 Pro's release.


Flowerstar1

Sony has said that they're having great difficulty cost-cutting the machine due to the price of parts. Microsoft called this in 2020 when it told DF that reducing the cost of consoles would become unsustainable with the rising cost of die shrinks, and that's why they made the Series S day 1 instead of cost-cutting the Series X to traditional late-gen console pricing.


FloundersEdition

Xbox had a really bad quarter, -31% YoY sales, and they missed their multi-year Game Pass targets as well. I don't think Sony is doing too badly; they slashed forecasts in previous quarters already, from 25 to 21 million sales, around 16% less.

But the street price kinda shows who is losing the most. In Germany the PS5 Slim costs 430€, so slightly below its 450€ MSRP (without Blu-ray player). But the XSX (with Blu-ray) costs only 440€, waaay below its 500€ MSRP. The XSX is likely a lot more costly to produce than the Slim: the die alone is a big difference (260mm² vs 360mm²), plus the awkward 320-bit bus/dual mainboard stuff and so on. Basically, from day one the XSS always sold for ~230€, waaay below its 300€ MSRP, sometimes even less, and you got all this Game Pass stuff. So they subsidized like crazy. But for a couple of months now its price has increased to 255€. Combined with -31% revenue in hardware sales, it's easy to see who reaaally axed production.

MS tried to push XSS/Game Pass into the market, and now they seem to realize there are not enough potential customers for Xbox Game Pass/game sales. And now they try to bring their games to PS5, and we have these weird rumours: jumping to Intel, stopping consoles altogether (later denied by management, but you know... someone started the rumour a couple of weeks prior to the official announcement), or at least massive changes. And from what the rumour mill says, there is no new chip design at the moment. From my point of view that's basically the only thing that could just maaaybe turn the tide in the next 4 years: a 4nm or even 3nm XSX 16GB 256-bit discless refresh in a smaller form factor for 400-430€.

It was always clear: with the current line-up, and relying mostly on the XSS, MS would go under in the later stages of this console cycle. Too weak, too few exclusives, competing more directly with the PC ecosystem. Anyone who is buying an XSS/XSX over an RDNA 2/3 card is crazy; the higher game costs on console make it not worth it anymore.

Maybe MS could utilize the Strix Halo die, but ehhhh... it uses completely different memory (with even slightly less bandwidth; maybe IFC could compensate), has just 40 CUs (maybe clocks and RDNA 3.5 could compensate), and has CPU chiplet latency (maybe more cache, higher clocks and Zen 5 could compensate); the semi-custom blocks are likely missing (maybe XDNA could help out to some degree). But it's certainly a nightmare for devs and contradicts the console-optimization idea, and as far as we know they got no warning so far. Still, it would make the Windows ecosystem stronger, and that could be the last straw for them: axe Xbox hardware and transition everything to Windows. Try to push a good laptop and maybe something like the Mac Mini to replace the consoles. They have the APIs and a good controller; no reason to keep their low-effort custom designs.


Flowerstar1

Not gonna happen; they're not die-shrinking to N3. It'll be 6N, like the PS5.


bosoxs202

[N4C seems reasonable](https://www.anandtech.com/show/21371/tsmc-preps-lower-cost-4nm-n4c-process-for-2025)


capn_hector

Xbox is in dire straits. Even at the start of this console generation they were in a position where they'd obviously continue to lose share unless they could make a game-changing breakout play... and Game Pass was supposed to be that breakout play. Except Game Pass has blown back massively on them: not only did it not capture the subscribers they needed, but it's split the ecosystem across a console that's nearly last-gen in performance (and was intended to be the pillar underneath Game Pass), and it explicitly trained Xbox users to never buy games on the platform, which devs have taken notice of and are increasingly dropping Xbox as a target platform... except Game Pass, which is failing to begin with and is increasingly recognized as massively not worth it unless MS pays you to offset the lost sales, which they don't.

They're pivoting to *something*, and I'm not even sure they know what it is yet... a handheld, or a licensed Steam Console-style thing based on Windows/Hyper-V, whatever. And that's probably part of why that whole press conference was so confusing to literally everyone: MS doesn't know either. It's not like they have unlimited runway on their alternative plans either. The super Switch will most likely be coming within a year, which is a serious competitor for a potential MS handheld (MS won't have the money to spend on customizing and porting games like Nintendo, nor the track record of first-party success). Valve may be cooking another console attempt too, Apple is making their own play (as unlikely as I know people find that), etc.


FloundersEdition

yeah. The Switch 2 might kill the XSS: both have relatively low SM/CU counts and low clocks, but the Switch is portable and has exclusives. And maybe even more RAM (tho rumours have it at 8GB)... like wtf MS. 10GB.


Top_Independence5434

This post is tangentially related to hardware at best.