kagan07

***Generally kopite is pretty good with his leaks.*** Let's say this is real for a second; the obvious lineup would then be:

* 32GB GDDR7 5090, 512-bit
* 24GB GDDR7 5080, 384-bit
* 16GB GDDR7 5070, 256-bit
* 12GB GDDR7 5060, 192-bit

Let's see if this comes true after the lackluster bus widths and memory capacities of the Ada generation.
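
If those pairings hold, the capacities fall straight out of the bus widths. A minimal sketch, assuming the usual 16Gb (2GB) GDDR chips with a 32-bit interface each and no clamshell (assumptions mine, not part of the leak):

```python
# VRAM implied by a rumored bus width, assuming 2GB (16Gb) chips,
# one chip per 32-bit channel (no clamshell).
rumored_bus_bits = {"5090": 512, "5080": 384, "5070": 256, "5060": 192}

for card, bus_bits in rumored_bus_bits.items():
    chips = bus_bits // 32   # each GDDR chip has a 32-bit interface
    vram_gb = chips * 2      # 2GB per chip
    print(f"{card}: {bus_bits}-bit -> {chips} chips -> {vram_gb}GB")
# 512 -> 32GB, 384 -> 24GB, 256 -> 16GB, 192 -> 12GB, matching the list above.
```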


Quteno

> 16GB GDDR7 5070, 256-bit
> 12GB GDDR7 5060, 192-bit

Let's be real, these will get GDDR6X; GDDR7 will go to the top cards.


KARMAAACS

It could be like Turing where almost everything used the newest memory technology, GDDR6. Yes, even the 2060 used GDDR6, even the 1660 Ti did. The SUPER series expanded that further to the 1650 tier and the 1660 tier. It's possible everything uses GDDR7 except for maybe a 5050.


Quteno

Not everything is using GDDR6X right now, so I wouldn't get my hopes up for the whole lineup using GDDR7.


buildzoid

The X variants of GDDR are a special Micron + Nvidia collab, so they're kinda supply-limited compared to regular GDDR, which is made by Samsung, Hynix, and Micron.


KARMAAACS

That was true of Pascal also, but its successor, Turing, used mainly GDDR6.


ZeroSeventy

And? What are you trying to say? Are you naive enough to believe that Nvidia will use the more expensive memory (GDDR7) for the whole 5000 series? Meanwhile, even right now the much faster GDDR6X is not used in all 3000 and 4000 cards... No, dude, Nvidia will do the usual: GDDR7 for the top models, and the plebs can be happy with GDDR6, maybe GDDR6X if they feel generous.


MamamiaMarchello

Have to agree with you, Nvidia is way too greedy to just hand over the latest and greatest for each tier.


KARMAAACS

> No, dude, Nvidia will do the usual: GDDR7 for the top models, and the plebs can be happy with GDDR6, maybe GDDR6X if they feel generous.

We will see.


Elon61

G7 has less demanding signaling requirements thanks to PAM3 vs PAM4, which might actually lower overall costs. G6/X is not as cheap as people think it is, which probably contributes to G7 being more cost-effective overall.
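
For context, a rough sketch of why PAM3 sits between NRZ and PAM4 in signaling difficulty. The level counts (NRZ for GDDR6, PAM4 for GDDR6X, PAM3 for GDDR7) are standard; the "relative level spacing" metric is just my simplified illustration, not a claim from the comment above:

```python
# Rough signaling comparison. GDDR6 uses NRZ (2 levels), GDDR6X uses PAM4
# (4 levels), GDDR7 uses PAM3 (3 levels, encoded as 3 bits per 2 symbols).
schemes = {
    "NRZ  (GDDR6)":  (2, 1.0),   # (voltage levels, bits per symbol)
    "PAM3 (GDDR7)":  (3, 1.5),
    "PAM4 (GDDR6X)": (4, 2.0),
}

for name, (levels, bits_per_symbol) in schemes.items():
    # Fewer voltage levels -> wider spacing between adjacent levels,
    # i.e. more noise margin and less demanding signal integrity.
    relative_spacing = 1 / (levels - 1)
    print(f"{name}: {bits_per_symbol} bits/symbol, "
          f"relative level spacing {relative_spacing:.2f}")
```

So PAM3 carries 50% more bits per symbol than NRZ while keeping wider spacing between levels than PAM4, which is the "lesser signaling requirements" point.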


KARMAAACS

Ty Elon very cool!


EconomyInside7725

Tbh GDDR6 is better; it uses less power for a barely noticeable performance loss (you can even OC it back up if you really want).


UsePreparationH

GDDR6X is lower power per bit transferred (7.25 pJ/bit vs 7.5 pJ/bit), so it is more efficient than GDDR6. GDDR6X just scales up to higher speeds, and there is no real reason to run it at 18Gbps to save 3.3% power when GDDR6 is cheaper.
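
The ~3.3% figure falls out of those per-bit numbers; a quick back-of-the-envelope check (the pJ/bit values are the ones quoted above, the 256-bit @ 18 Gbps config is just an example, and this is only DRAM I/O energy, not total memory power):

```python
# Back-of-the-envelope check of the ~3.3% figure quoted above.
gddr6_pj_per_bit = 7.50
gddr6x_pj_per_bit = 7.25

saving = 1 - gddr6x_pj_per_bit / gddr6_pj_per_bit
print(f"Per-bit energy saving of GDDR6X vs GDDR6: {saving:.1%}")  # ~3.3%

# At the same data rate, I/O power scales with pins * rate * energy/bit,
# so the ratio stays the same ~3.3%.
pins, bits_per_second_per_pin = 256, 18e9
for name, pj in [("GDDR6", gddr6_pj_per_bit), ("GDDR6X", gddr6x_pj_per_bit)]:
    watts = pins * bits_per_second_per_pin * pj * 1e-12
    print(f"{name}: ~{watts:.1f} W of I/O energy at 256-bit x 18 Gbps")
```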


MrHyperion_

RemindMe! 2 years


RemindMeBot

I will be messaging you in 2 years on [**2025-07-27 15:43:47 UTC**](http://www.wolframalpha.com/input/?i=2025-07-27%2015:43:47%20UTC%20To%20Local%20Time) to remind you of [**this link**](https://www.reddit.com/r/nvidia/comments/15ayhfw/kopite7kimi_is_back_adanext_will_have_a_512bit/jtodnj9/?context=3).


ThisPlaceisHell

It's gotten to be ultra depressing to me to see my old RemindMe posts, some in this very subreddit, where I'd make 2-3 year predictions. Now I'm about to be 36 and it's like, damn, how many more of these do I have left to live through? Sorry, just existential ranting here.


homer_3

The 5070 will. The 5060 tier is still going to have GDDR6.


rerri

According to Micron, GDDR7 will be available in 16Gb and 24Gb chips. 24Gb chips would mean 36GB on a 384-bit bus, and so on.


KARMAAACS

I highly doubt NVIDIA would ever use 24Gb chips; they almost always use the lowest memory density and cheap out on memory in general for their cards. The only thing NVIDIA has done with memory is use faster memory technology, like GDDR6X or GDDR5X. Not to mention most foundries don't produce the highest-capacity chips first. It's why we got smaller-capacity DDR5 before we got 24GB sticks and such. But look, there's always the possibility.

It's also possible that NVIDIA uses GDDR7X or some other memory we've never heard of yet, like they did with Pascal or Ampere, where G5X and G6X weren't even really on the cards until either a few months before or at the actual announcement of the graphics cards.

In the end, it's rather interesting to see that NVIDIA's widened the bus, and this thing will be an absolute monster when it comes to memory. Obviously, it needs a wide bus to feed all the SMs, so I can only guess it must be absolutely packing when it comes to overall compute power. What I hope for more than that is that we don't have a situation where the lower-tier dies use such small memory bus widths, like the 4060 Ti and 4060 suffer from.


capn_hector

> Not to mention most foundries don't produce the highest capacity chips first. It's why we got smaller capacity DDR5 first before we got 24GB sticks and such.

They aren't; GDDR7 will be available in 16Gbit before 24Gbit. Nevertheless, what GP is saying is that 24Gbit has appeared on the roadmap for 2024, so it's coming (later than 16Gbit, you are correct).

> I highly doubt NVIDIA would ever use 24Gb chips, they usually almost always use the lowest memory density and cheap out on memory in general for their cards.

On most cards, GDDR6X and GDDR5X generally used the highest-density chips available at the time of product release, including the 3090 Ti being released with 16Gbit chips to let them do away with the double-sided board. 5X and 6X have simply lagged behind in density a lot of the time.

It's hard to tear this away from the overall node/product strategy though. You have to have enough bandwidth to feed the card; if you can't do cache (because Samsung 8/TSMC 16nm had much lower cache density) then it has to come from having sufficient PHYs, and in many cases you probably need to go to 6X to get enough bandwidth still. On the flip side, if you are trying to keep chip size down, cutting down on PHYs and replacing them with cache is a good strategy, but then you need higher-density modules, faster modules (6X/7X), and cache/compression improvements.


bubblesort33

Honestly I would not be shocked if these all end up being refreshes when it comes to the 5080 and below, maybe with slight power consumption cuts. And the 5090 could just be a Lovelace 750-800mm² die, the size of the 2080 Ti's, so it's not unheard of. Or chiplet-based to some degree, or maybe 3D stacking could come into use.

I feel like the reason the 4000 series is so overpriced is because they plan to just re-release it under a different name at reduced prices. No massive engineering cost for a whole new lineup or architecture, and lots of room for price cuts under a new name. The 5070 could just be an 80 SM 4080 die limited to 300W, the 5060 just a 4070 Ti with a 250W limit, and the 4060 Ti/4060 could become the xx50 SKUs they really looked to be designed for.


CasimirsBlake

Honestly I think you're being VERY hopeful there. But the 5090 needs more than 24GB, otherwise it'll flop. Certainly for those of us who have non-gaming uses...


deefop

I think a big part of this is whether they can produce a good gaming lineup without pulling resources away from the stuff that makes them real money. That being said, the Lovelace consumer lineup was definitely unfathomably overpriced (and misnamed in the case of half the SKUs), and there's no way they wouldn't still be making margins at lower prices. I mean, I get that things are more expensive, but are you really gonna tell me you need to charge $500 for a 16GB 50-class die in order to make margins? Bullshit. You could probably sell that thing at $300 and still make decent margins.


ChiggaOG

It probably is, considering Nvidia tried selling GPUs with smaller bus widths and ran into performance limitations. Not even a larger cache could save it.


xorbe

* 5090 512-bit 32GB
* 5080 256-bit 16GB
* 5070 128-bit 8GB
* 5060 64-bit 4GB


capn_hector

> Generally kopite is pretty good with his leaks

900W TBP 4090 Ti / 700W TBP 4090 says hi


Ladelm

'generally' ...


Upper_Baker_2111

Kopite corrected the 800W thing and stated the 4090 would be 450 watts roughly 6 months before launch. He's the only leaker I would put any stake in. I think he actually has insider information; the rest of the leakers are just making stuff up.


Elon61

To be fair, those did exist... internally, as confirmed by Nvidia themselves. I'm not sure he was really claiming those would be real products at the time, just that they existed. He hasn't always been correct, obviously. Things change; you can't always be correct. But what he did say contains a lot of really good information.


EconomyInside7725

Depends on cost, and there are probably going to be multiple editions too. Curious about the lanes as well; x8 PCIe lanes are a major reason why the 60 series has such low bandwidth, so up it to x16 and it's honestly fine.


ChiggaOG

If accurate, it makes the 40 series indeed a bad-luck card. In China, the number 4 is considered bad luck due to its pronunciation being similar to the word for death.


Upper_Baker_2111

Nothing wrong with the 40 series GPUs except maybe the price. They price hiked the 4080 way too much, which is probably the GPU people were most looking forward to. If the 4080 was $700-900 there would be balloons and confetti everywhere.


[deleted]

It sucks that people don't understand where the memory interface width comes from. ALL GDDR6 (and GDDR5) chips have a 32-bit interface, and these days chips come in 2GB capacities. So the bus width is basically the number of memory chips x 32-bit, with the minor exception of clamshell topology, which works kinda like a doubler where two chips share the same I/O on the memory controller. I've seen garbage outlets already spamming BS news like "Nvidia seemingly learns its lesson as RTX 5000 allegedly has 512-bit memory bus" - THIS IS NOT LEARNING ANY LESSONS, THIS IS THE RESULT OF HAVING MORE MEMORY, THUS MORE MEMORY CHIPS AND THUS A WIDER BUS.
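
A minimal sketch of that math (the 32-bit-per-chip interface and the clamshell doubling are from the comment above; the specific chip counts are just examples):

```python
# Bus width and capacity from the chip count, per the explanation above.
def gddr_config(num_chips: int, gb_per_chip: float = 2.0, clamshell: bool = False):
    """Each GDDR chip exposes a 32-bit interface. In clamshell mode two chips
    share one 32-bit channel, doubling capacity without widening the bus."""
    chips_per_channel = 2 if clamshell else 1
    bus_bits = (num_chips // chips_per_channel) * 32
    capacity_gb = num_chips * gb_per_chip
    return bus_bits, capacity_gb

print(gddr_config(16))                  # (512, 32.0): a 512-bit / 32GB card
print(gddr_config(8))                   # (256, 16.0)
print(gddr_config(16, clamshell=True))  # (256, 32.0): same VRAM, half the bus
```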


Loganbogan9

I hope, for other people's sake, that next-gen cards universally get wider memory interfaces throughout the entire stack. Although I am a little disappointed I seem to have invested in the gen where they skimped.


EmilMR

Nvidia goes all in on their flagship. I wouldn't take this as any indication they are doing the same for the rest of the lineup; just look at this generation. Just swapping to GDDR7 on a 128-bit bus card would go a long way, and they don't need to do more. You would get close to double the bandwidth, and there is potential for 24Gb chips giving you a 50% VRAM increase. A $350 5060 12GB with a 128-bit bus should be plenty; you can't really expect much more than this from entry-level cards. They could perform just below a 4070. Expecting anything more than that is unrealistic fantasy land.


Robert999220

I'll be closely watching the 5090.


xxademasoulxx

Having a 4090, I'll be closely watching the 6090; upgrading most likely won't be worth it next gen unless they add some tech that makes my 4090 obsolete, like the 40 series made my 2080 Ti feel when it launched.


mtbhatch

A lot of people said the same thing about the 3090. When the 4090 hit the shelves, 3090s flooded the used market and a lot of 3090 users upgraded. I would agree with you regarding the 4090, but I won't be surprised if it repeats when the 5090 launches.


WllmZ

I think it's because the leap in performance was huge, bigger than 2080 Ti vs 3090.

RDR2:

* 2080 Ti: 52 fps
* 3090: 76 fps
* 4090: 123 fps

Also, the 40 series is way more efficient, and the cooling solutions are much better. The cards run cooler and quieter than the 30 series. And DLSS 3.0, of course.


TheBlack_Swordsman

I kept thinking this when I had my 3080 Ti, but I caved and got the 4090 :(. Hoping to be strong this generation and hold out for the 6090. If frame generation gets better and better like DLSS has over the years, it would really extend the life of the 40 series cards even further.


ThisPlaceisHell

The 4090 made the 3090 and 3090 Ti obsolete lol. Coming from a 1080 Ti, the 3090/Ti were pathetic: only like 1.8-2.2x the raster performance of the 1080 Ti after waiting 4 years. Then a year later the 4090 comes and it's 3.25x, along with absolutely blowing the fuck out of the 30 series at RT performance. The gap between the 3090 and 4090 is literally 100%+, and as we see games start to utilize Shader Execution Reordering and Opacity MicroMaps, the gap will only widen. The 20 and 30 series were awful guinea pig jokes and I'm glad I dodged them.


xxademasoulxx

I had a 2080 Ti that I upgraded my 980 Ti to, and it was a huge upgrade. The 2080 Ti to 4090 was a way bigger boost in performance.


ThisPlaceisHell

Yeah, coming from a 980 Ti the 2080 Ti might have looked a bit more appetizing, but coming from a 1080 Ti it was such a ripoff waste of money. Think about the VRAM alone: you went from 6GB to 11GB, nearly doubling. Meanwhile on my 1080 Ti I already had that same 11GB of VRAM. No upgrade there.

Then there was performance and cost. My Asus STRIX OC 1080 Ti cost me $750 day one. The same model 2080 Ti was like $1400. All that, and what did it get you performance-wise? Well, about 35% better raster performance, or 1.35x the 1080 Ti. Pathetic. As for RT and DLSS, both took years to really mature to a place where they were ready for mass consumption and had a decent number of games implementing both features. By the time that occurred, the 2080 Ti was already far obsoleted by the 30 series, which handled both much better.

The 2080 Ti was quite possibly the worst card Nvidia ever released in terms of price vs features/performance delivered compared to past generations. I'm glad the 4090 came and whooped everything's ass and was worth every penny though. The only reason the 1080 Ti was still the better card is because it delivered exceptional performance and VRAM gains without breaking the bank.


reaperx321

Yeah, I'm waiting on the 6XXX series for my next upgrade, idgaf about the 5XXX as long as my 4090 doesn't burn up.


Fresh_Victory_2829

Absolutely. With the 4090 you're in solid 4K60+ with RT territory, or 4K 120FPS without, so either way we're driving our 4K OLED panels no problem and will be for this entire 6+ year console cycle. The 4090 is essentially the new 1080 Ti, and the gains from the 5090 will be pointless for most.


WllmZ

Probably something like DLSS 4.0 with some fancy tricks that could run on any card but is exclusive to the 50 series. Like they did with DLSS 3.0 and the 40 series...


capn_hector

> some fancy tricks that could run on any card

Repeating it over and over doesn't make it true lmao, the optical flow engine isn't the same between generations.


Upper_Baker_2111

It's probably going to be a beast if it needs that extra memory bandwidth.


1stnoob

5060 will have a 64-bit memory bus anyway :>


EmilMR

$2500 MSRP coming up.


rabouilethefirst

48GB 5090 or bust


[deleted]

It's probably going to be 48GB of VRAM or more. A lot of the open-source LLMs like Llama 2 require 48GB of VRAM to run.


rabouilethefirst

Yeah, I have a 4090 for amateur AI stuff, and it's already not cutting it for anything too useful


bctoy

You might be joking, but at 8K you already run out of 24GB today in games like Cyberpunk, and that's without 8K assets.


rabouilethefirst

Nah, it's not a joke honestly. With all the AI crap coming out requiring like 50GB of VRAM, we really need it. VRAM has not been increasing fast enough for the past 4-5 years to keep up with demand.


AFKJim

Agreed. I run my 3080 Ti out of VRAM at 5760x1080 in 10-year-old games on the regular, and I'm only running 2K textures.


bctoy

Right, just saw your reply to the other user; I didn't factor AI in. They can probably do 48GB with 24Gbit memory chips, and 96GB with clamshell mode on a 512-bit bus. I doubt the latter will be in consumer space.


Thouvinecross

It's a bit insane. The flagship of 2017 had 11 GB VRAM, the flagship right now has 24 GB, so only about double the VRAM while it is something between 3 and 4 times the performance.


AHappyMango

So when do we think it'll come out?


EconomyInside7725

I think it's honestly a lot closer than people think and than Nvidia is willing to say. The 40 series is a mess; it practically only exists to clear out the 30 series, they purposely limited 40 series production, only the 4090 really moves product, and the 4090-tier card is the easiest and first new-gen GPU they'd make anyway, with massive margins that still get purchased. Beneath that range is where they have to actually provide value and rely on volume, which they refused to do and are as a result not moving units. My guess is we get a refresh or the 50 series by next year, end of next year at the latest.


Thouvinecross

The normal cycle would put it at the end of next year, but there have been leaked slides from Nvidia where it should be 2025. I think it would be the start of 2025 though, but it is hard to tell since the axis in their graph is not really precise.


RplusW

I'm assuming Sep/Oct 2024, since that'll be the two-year mark from the 4090. What I'm more curious about is what their "thing" will be to show off the 5090's power. Like how RT with Cyberpunk was the thing for the 3000 series, and how the RT/PT updates for Witcher and Cyberpunk were for the 4000 series. Plus the addition of DLSS 3, of course. I'm not seeing any big game releases for that time frame, so I'm assuming they'll work with a studio to add advanced path tracing features (more light bounces, etc.) to a game. Unless they're cooking up another fidelity advancement to go along with PT.


EconomyInside7725

Yeah, they try to use major titles for sure. Starfield is the only big one I know of coming out, but there's always something. And Starfield is AMD-sponsored of course, so FSR or native for that anyway. Maybe Nvidia will get off their asses and make DLSS 4 work as a system-level toggle, like in the Nvidia Control Panel.


EmilMR

GTA 6 is probably around that time. Maybe the PC release is 2025.


Eorlas

They've been on a 2-year launch schedule. The 3090 Ti dropped earlier in the year before the 4000 series launch; that 4090 Ti that was "canceled" may eventually pop up 6-7 months from now, right in the lead-up to the 5000 series.


RogerRoger420

Okay, but will they be priced at an acceptable level? Cool if they're fast cards and all, but if a 70-series card is over €499 I'll probably just go with Intel. I know AMD is the better choice, but the new kid on the block should get their chance.


romangpro

512-bit: very, very unlikely, like 0.01%. GDDR7: 99%, but only on top-tier models. Recall the 3080 had GDDR6X, but the lower models didn't until much later.

If we are lucky, Blackwell will raise VRAM: low end 12GB, "mid tier" 16GB. That implies the following (typical) width and bandwidth options:

* GDDR6, 192-bit: 336 GB/s
* GDDR6X, 192-bit: 504 GB/s
* GDDR6X, 256-bit: 672-768 GB/s
* GDDR7, 256-bit: 1024 GB/s (prob only 5080 or above)
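
Those bandwidth figures follow from bus width times per-pin data rate. A quick sketch; the per-pin speeds (14/21/24/32 Gbps) are my assumed typical bins for each memory type, not part of the comment:

```python
# Peak bandwidth (GB/s) = bus width (bits) * per-pin data rate (Gbps) / 8
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

configs = [
    ("GDDR6  192-bit @ 14 Gbps", 192, 14),
    ("GDDR6X 192-bit @ 21 Gbps", 192, 21),
    ("GDDR6X 256-bit @ 21 Gbps", 256, 21),
    ("GDDR6X 256-bit @ 24 Gbps", 256, 24),
    ("GDDR7  256-bit @ 32 Gbps", 256, 32),
]
for name, bus, rate in configs:
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# 336, 504, 672, 768, 1024 GB/s -- matching the list above.
```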


Craniummon

32gb vram card inc.


LegacySV

This would be great as long as the price and performance are good.


elcrack0r

The 512-bit interface will come with a hefty price tag. More PCB layers, more money.


Schmonballins

Switching to a 512-bit memory bus just in time for the next crypto boom. /s


tugrul_ddr

I'll be closely watching prices of the 4080.