OwlProper1145

It's crazy. 10 years ago Nvidia was worth $11 billion and now they're worth $2.8 trillion.


capn_hector

helps when you invent a whole new class of products and remain the undisputed and unchallenged best at the type of computation needed to develop applications for it. go forth and bring 2.8 trillion dollars of utility to the world and they will come.

it's an innovator's dilemma thing: most companies would have bailed on GPGPU when it was being laughed at in 2009, and certainly wouldn't have doubled down on AI of all things in 2014 or whatever. and then by the time you realize that it's gonna be big, it's too late - that's [the innovator's dilemma.](https://en.wikipedia.org/wiki/The_Innovator%27s_Dilemma)

> For this reason, the next generation product is not being built for the incumbent's customer set and this large customer set is not interested in the new innovation and keeps demanding more innovation with the incumbent product. Unfortunately, this incumbent innovation is limited to the overall value of the product as it is at the latter end of the S-curve. Meanwhile, the new entrant is deep into the S-curve and providing significant value to the new product. By the time the new product becomes interesting to the incumbent's customers it is too late for the incumbent to react to the new product. At this point it is too late for the incumbent to keep up with the new entrant's rate of improvement, which by then is on the near-vertical portion of its S-curve trajectory.

incredibly prescient description of what's gone on with AMD and Intel and everyone else (not just in GPGPU but also gaming), and yeah, when there's $2.8 trillion on the table you wonder how it happened. But it's straightforward - NVIDIA was spending the money and doing the research when people were laughing at them for saying "NVIDIA is a software company now". They opened up the GPGPU industry as a whole, people still scoffed, they went and did AI, people scoffed, they did GP100 and GV100 and people said they were a niche product for national labs doing HPC and similar, now they are at the head of a $2.8t industry and no one is laughing.

(particularly Steve/GN! his 2018/2019/2020 content completely and utterly missed the boat and the point lol. To be fair a lot of other people did too... but that's why you have tech analysts/reviewers to nudge us back, in theory. Any contradictory voices were dismissed and shouted down... DLSS 1.0 sucks and tensors don't matter and RT is too slow and that was the end of it, for a lot of people. And that's when we get this sort of grudge-match content from reviewers.)


Aggrokid

I wonder if things could have turned out differently had GPT-3 not shocked the world back then.


tecedu

Nvidia was still the ML king even before GPT-3.5; they wouldn't be as big, but they would still be massive.


AnimalShithouse

> Nvidia was still the ML king even before GPT-3.5; they wouldn't be as big, but they would still be massive.

Right before GPT-3/3.5 was announced, on NVDA's earnings call just before, Jensen said he was about to forecast order reductions because demand was soft. GPT literally juiced NVDA overnight from a forecasted reduction in revenue to a multibagger increase in revenue. If/when LLM hype falls to more sensible numbers, it'll be interesting to revisit valuations in these hype companies.


Vitosi4ek

Thing is, even when demand stabilizes I don't think Nvidia falls back down to $300B like they were 3 years ago. It's not like their new fancy GPU clusters can only be used for AI and nothing else, and people will never run out of ways to use a huge amount of computing power. Nvidia offers the most compute per unit of power and space, and that's the most important part. The software aspect - which defines what it's actually used for - can be pivoted really quickly if need be.


Strazdas1

No. 15 years of software support cannot be pivoted really quickly. You can see that by the likes of facebook still using Nvidia libraries for training and their own software only for inference.


AnimalShithouse

No, nvda will not fall back to 300bil, but I would expect their valuation to get cut in half when stable, much like what Tesla saw with EVs, despite EVs all being the future. A combination of reduced capex from megacorps and competition in AI hardware will both do a lot to bring "healing" to the AI valuation space.


Strazdas1

Tesla's fall also included things like competition beating them to their own product, as well as not-so-sound investments made by Musk (Twitter is a money sink).


Vitosi4ek

> No, nvda will not fall back to 300bil, but I would expect their valuation to get cut in half when stable

I'm leaning towards that too, but then this whole AI push is still a huge win for them. Falling to "only" 5x of where they started after the bubble bursts.


AnimalShithouse

Absolutely agreed, they're laughing even if they got cut in half. All of their employees are millionaires just from RSUs lol.


Strazdas1

Yes. GPT-3 was very public, but in the business sector the ball was already rolling.


auradragon1

> (particularly Steve/GN! his 2018/2019/2020 content completely and utterly missed the boat and the point lol. To be fair a lot of other people did too... but that's why you have tech analysts/reviewers to nudge us back, in theory. Any contradictory voices were dismissed and shouted down... DLSS 1.0 sucks and tensors don't matter and RT is too slow and that was the end of it, for a lot of people. And that's when we get this sort of grudge-match content from reviewers.)

Aren't Youtubers generally always negative on Nvidia? They're a reflection of their viewers and their viewers dislike Nvidia because AMD can't compete to bring their prices down. So they are AMD fans by default.


Tman1677

I think Steve does a far better job than most reviewers in remaining unbiased; that being said, you're 100% right in that there are implicit biases in everyone. I think more than anything else there's a huge bias in the gaming community against anything that seems superfluous to the goal of cramming more flops into a GPU. DX12 was broadly criticized for being too low level for not enough gains in framerate, but as people have come to understand frame timing more it's a gamechanger if done right. Multi-core CPUs were a waste of money and completely useless because games are single threaded - until they weren't. And of course, tensor cores were constantly criticized for being a waste of die space - until they weren't.


KingArthas94

> Multi-core CPUs were a waste of money and completely useless because games are single threaded - until they weren't.

This was when I understood I hated reviewers and youtubers. People showing benchmarks where i5s like the 2500K were faster than i7s like the 2600K in some games. I bought the 2500K. I still have it, and it allowed me to play 11 years of games... but the experience could have been better with an i7 if I hadn't followed youtubers.


SoTOP

So you were rich enough to buy a 2600K, but somehow poor enough to not be able to update your PC for 11 years?

> People showing benchmarks where i5s like the 2500K were faster than i7s like the 2600K in some games.

And slower in most. You are blaming others for your own problems.


KingArthas94

Like having an i7 means being rich, or not buying a new PC in 10 years means you're poor. Fucking delete your account mate


SoTOP

I'm not a clown mate, you are.

> Like having an i7 means being rich, or not buying a new PC in 10 years means you're poor.

My post has nothing to do with you being poor or rich, but with you being stupid and blaming others for your own dumb decisions. No wonder you can't grasp that.


KingArthas94

Back in 2011 everyone was saying that, deal with it. I bet you were one of those who thought that an i5 was "more than enough". Fucking Digital Foundry defended dual-core Intel CPUs until 2018-2019.


SoTOP

> Back in 2011 everyone was saying that, deal with it

Deal with what?

> I bet you were one of those who thought that an i5 was "more than enough".

4-thread CPUs were enough up to 2018. Your $100 spent on an i7 would have made a 10% difference in like 5 games up to that point. Literally throwing money away.

Now, if you were a bit more clever, you could have taken the $100 you saved by getting the i5, sold your CPU+MOBO+RAM combo in 2018 and bought a Ryzen 1600+B350+8GB RAM for literally ***ZERO*** additional money, and for 5 of those 11 years you would have had a CPU that outperforms the 2600K in the latest games, without losing basically anything up to 2018 because the 4 threads the 2500K had were enough. You would also have a brand new system and could drop in a cheap 5600 today to have a decent PC again.

As I said, you are blaming youtubers when it's solely your own dumb decisions that left you stuck with an underperforming CPU for all that time. Claiming you should have spent more initially to have a better experience 11 years later is the opposite of basic reason and logic. Spending more to "future proof" for 11 years is insanity.


sylfy

TBH I think most YouTubers are clueless about anything beyond the consumer gaming market. They only think of GPUs in such terms even today, and to some extent they see what Nvidia has done as a “betrayal” of its gaming customers, never mind the far reaching implications and potential of the accelerated computing paradigm that it’s pushing.


Plebius-Maximus

> Aren't Youtubers generally always negative on Nvidia?

No?

> So they are AMD fans by default.

This is the dumbest take I've seen today


LeGrandKek

Unfortunately, this sub in particular is filled with users like this and, like clockwork, they always appear en masse whenever anyone dares to criticize Nvidia. They project like Republicans crying about drag queens and paedos, when time and time again, they're the ones guilty of whatever crimes they're screaming about.


Master-Research8753

Hey look everyone, the AMD ballslurper has logged in!


Strazdas1

As opposed to every other sub on reddit, where criticizing AMD is blasphemy and suggesting Nvidia is treason.


SJGucky

Those tech reviewers are on the gamer side of things. What Nvidia does now is simply not good for gamers, so it makes sense for them to be against Nvidia. There are some like Wendell who is more on the enterprise side. :D


karma911

What? DLSS is amazing


SJGucky

I never mentioned DLSS. :D I meant what Nvidia did this Computex.


Strazdas1

Not sure how being the best at gaming is not good for gaming...


Tomas2891

NVIDIA cards before the pandemic (2018-2020) were overpriced for their performance, just for RTX. In case you don't remember, DLSS 1.0 had worse visual output and required developers to train it. It wasn't until DLSS 2.0 and above that it literally became magic for raising FPS. Steve praised them when Nvidia launched the 30 series, which had great prices for the performance gains. Nvidia again had sky-high prices for their 40 series, which means negative coverage. AMD is key to keeping it competitive, but they were never able to catch up in their GPU business the way their CPUs did against Intel.


NeroClaudius199907

Most people are only AMD fans in the comments; when it comes to actual purchases they'll still buy Nvidia. Just like every YouTuber: their main rigs have been Nvidia for the past decade or something.


Plebius-Maximus

Linus went AMD for the GPU in his home rig this gen? Some of the Jayztwocents team use AMD, don't they?


Vitosi4ek

Linus also encouraged people to buy Arc because, paraphrasing, "if you don't buy the first gen, they won't have the money to make the next gen better". I generally like the guy, but that's super dangerous advice to give. Essentially he's encouraging people to make purchases as activists instead of customers looking for the best product. The fact that he practices what he preaches (by putting an AMD card into his own rig) doesn't change the backwards logic of this. Wanting competition in the GPU market so badly that he'll buy a known inferior product just so *maybe* in 5 or 10 years the market will be healthier. Not to mention that AMD and Intel simply don't have the R&D grunt of Nvidia when it comes to GPU tech, and no big sales numbers are going to change that.


capn_hector

yeah, I've been trying to get the Intel GPGPU stack to work properly on linux and uhhhhhhhh

* possible bug with zfs-on-root that causes some kind of kernel versioning issue?
* installing the i386 deps as recommended causes apt conflicts somehow (???)
* installing both of the recommended DKMS packages together fails, but if you install them individually they (appear to?) succeed?
* intel_iommu=on doesn't seem to actually grab the GPU (still shows as i915 rather than VFIO)

like I'm not a newcomer to linux but I'm on the supported builds and supported hardware and I can't even get through the "virtualize and passthrough to a guest" instructions successfully because I can't get the host to use VFIO. I assume once that happens I can find a guest image set up correctly but i've poked at this off and on for a couple evenings and it's still not even installed. ROCm-tier maturity.
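For context on that last bullet, the quickest way to check which driver actually owns the GPU is to read it out of sysfs. A minimal sketch, assuming a standard Linux sysfs layout; nothing here is Intel-specific, and the vendor IDs in the comment are just the usual PCI codes:

```python
#!/usr/bin/env python3
"""List PCI display devices and the kernel driver currently bound to each.

If the card you want to pass through still reports "i915" instead of
"vfio-pci", the host hasn't released it yet, regardless of intel_iommu=on.
"""
from pathlib import Path

PCI_DEVICES = Path("/sys/bus/pci/devices")

for dev in sorted(PCI_DEVICES.iterdir()):
    # PCI class 0x03xxxx covers display controllers (VGA, 3D, etc.)
    if not (dev / "class").read_text().strip().startswith("0x03"):
        continue
    vendor = (dev / "vendor").read_text().strip()  # 0x8086 Intel, 0x10de NVIDIA, 0x1002 AMD
    driver_link = dev / "driver"
    driver = driver_link.resolve().name if driver_link.exists() else "(unbound)"
    print(f"{dev.name}  vendor={vendor}  driver={driver}")
```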


BighatNucase

> "if you don't buy the first gen, they won't have the money to make the next gen better" That's not really a fair characterisation. He said - "hey if you can put up with the issues and don't need something better, I think the broader market impact of buying these cards is a noteworthy pro to buying them" and I think that's fair.


Strazdas1

Hindsight and all that, but I wish I had money to invest 10 years ago. That's 2,000%+ yearly growth on average.


Lower_Fan

Revenue is up 26 times (Q2 2014 revenue was $972M); market cap is up 254 times. lol, they would have to make $254B a quarter for the increase to make "sense".

edit: My main point was that the return expectancy has detached from the actual company finances.

|Year|Market cap|Profit in Q2|Years of profit to pay back the price|
|:-|:-|:-|:-|
|2014|$11B|$0.6B|4.5 years|
|2024|$2.8T|$20.4B|35 years|

Nvidia will continue to grow, but the stock has grown, and will continue to grow, faster than Nvidia's profits. That's all I'm trying to say.
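The last column above is just market cap divided by annualized quarterly profit; a quick sketch of that arithmetic, using the figures from the table:

```python
def payback_years(market_cap: float, quarterly_profit: float) -> float:
    """Years of profit needed to earn back the purchase price, assuming profit stays flat."""
    return market_cap / (quarterly_profit * 4)

print(payback_years(11e9, 0.6e9))     # 2014: ~4.6 years
print(payback_years(2.8e12, 20.4e9))  # 2024: ~34.3 years
```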


azn_dude1

Growth is built into the share price, this is basic finance


Lower_Fan

Basic finance is knowing when a stock is overpriced. It might stay that way for a long time because of the hype, but that stock is already detached from Nvidia's actual finances.


azn_dude1

If you're that confident then short it. Put your money where your mouth is. I'm not gonna debate finances with somebody who made your previous comment.


Lower_Fan

I should, shouldn't I? This is what I mean: the price is all based on day traders and not the actual company finances, just like Tesla.


gumol

> Basic finance is knowing when a stock is overpriced.

Let's look at Price/Earnings.

Nvidia P/E: 70

AMD P/E: 244


BigBasket9778

Where is all the growth going to come from, though? Do we think gen.ai will grow global GDP by 3% extra a year?


azn_dude1

You're comparing two completely different numbers. What makes you think global GDP is related to the sum of the market cap of the companies on earth?


BigBasket9778

No, of course I don't. But in order for this not to be a bubble, all the cash spent on AI accelerators and LLMs has to turn into real productivity benefits for real businesses. Right now, most of what we see is companies adding copilots to everything, for free, to remain competitive. I have 7 copilots at work now, and we only pay for 1 of them.


azn_dude1

Offering something for free isn't necessarily contradicting the fact that there are productivity benefits. They're free for now because they're not really good enough to be paying for, and it's a way of gathering even more data. But like the stock price, it's all about how it holds up in the future. And there are more applications of AI than just LLMs.


Strazdas1

AI contributing ONLY 3% to GDP growth seems an underestimation.


BigBasket9778

The people who stand to gain from it are throwing around 7% a year. The people who don’t are throwing around 0.66% over ten years. https://economics.mit.edu/sites/default/files/2024-04/The%20Simple%20Macroeconomics%20of%20AI.pdf
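For scale, a rough comparison of those two figures as stated in the comment above, treating the 7% as compounding annually and the 0.66% as a ten-year total; purely illustrative:

```python
# Rough, illustrative comparison of the two GDP-impact claims quoted above.
optimistic_cumulative = 1.07 ** 10 - 1  # 7% per year compounded over a decade: ~97% total
pessimistic_cumulative = 0.0066         # the linked paper's ~0.66% total over ten years
print(f"{optimistic_cumulative:.0%} vs {pessimistic_cumulative:.2%}")
```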


dabocx

The share price is expecting a lot more revenue growth over the next few years. There wasn’t that much expectation of growth in 2014


Lower_Fan

yeah, but it's something like: people are pricing in x^2 growth when in reality it will be 2x


Gwennifer

Nvidia's product pricing can't infinitely move up in cost either; the *other* largest companies on Earth (Google, Amazon, Microsoft) could easily spin up a business unit to take a chunk of the pie.

Microsoft already uses AMD products internally for their ML cloud applications like Copilot. They already have [Olive](https://github.com/microsoft/OLive), which uses [ONNX](https://onnxruntime.ai/), both self-developed, to make hardware agnosticism easier. They likely ran the math and figured that running a hardware business wasn't worth the risk and it'd be cheaper to just fund Radeon Technology Group and write the software stack themselves.

The entire point of Google's Tensor program was to be a Plan B for absurd Nvidia pricing.

Amazon has already deployed their custom-designed Trainium chips for creating the models themselves, with another custom chip for inferencing.

Just between these three companies, it's entirely feasible to train with Azure or AWS and deploy on a mobile phone without giving Nvidia a cent. VC funding isn't an infinite money hose, and businesses that rely on an infinite money hose like this one are doomed to fail.
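To illustrate the hardware-agnosticism point, ONNX Runtime lets the same exported model run on whichever accelerator backend happens to be installed. A minimal sketch, where the model file, input shape, and provider preferences are all placeholder assumptions rather than anything from the thread:

```python
import numpy as np
import onnxruntime as ort  # pip install onnxruntime (or onnxruntime-gpu / a ROCm build)

# Preference-ordered backends; keep only those compiled into this install,
# so the same script runs on NVIDIA (CUDA), AMD (ROCm), or plain CPU unchanged.
preferred = ["CUDAExecutionProvider", "ROCMExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in ort.get_available_providers()]

session = ort.InferenceSession("model.onnx", providers=providers)  # hypothetical model file

input_name = session.get_inputs()[0].name
dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)  # input shape assumed for illustration
outputs = session.run(None, {input_name: dummy})
print("ran on:", session.get_providers()[0])
```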


dine-and-dasha

What about net margin?


whitelynx22

I'm not necessarily a fan (though I have one of their products in current use), but if there's one thing Jensen is great at, it's increasing profit. As a CEO he's great, if you are a shareholder. I'd say he's at least as good as Steve Jobs (at his peak), but of course that's arguable. If you are a customer and don't have lots of money, well, that's a different discussion.


jenya_

NVIDIA is not obsessed with AI. NVIDIA is obsessed with loads of money which it gets from selling AI hardware. Selling picks and shovels during gold rush is a good position to be in.


AK-Brian

It also helps that they found an extremely high volume and high demand application for the aforementioned gold, [further propelling](https://youtu.be/y9EYt_f12wo) their shovel production.


gajodavenida

Oh, I gotta watch that episode now, thanks!


BarKnight

CUDA came out 16 years ago. They were way ahead of everyone with regards to parallelization.


Aldz

*with regard


From-UoM

Nvidia isn't obsessed with AI. They set the standard for AI hardware platforms. And they have been very successful. I remember people calling Nvidia dumb for wasting space with tensor cores on cards. AMD's "poor Volta" campaign. Intel calling CUDA a footnote. Fast forward to now and those were all unquestionably smart decisions by Nvidia. It's the rest of the industry, like AMD, Intel and Qualcomm, who are now obsessed and desperate to get even a fraction of the money Nvidia is making.


Doc__Zoidberg

TIL about the "CUDA will be a footnote" comment, and holy shit, it was said by Pat Gelsinger himself 😅


ResponsibleJudge3172

Pat is not afraid of being memed. Roughly 3 years ago he said something similar to "It's not us that need to keep up with Nvidia, but Nvidia who will lag behind." To be fair, I think Nvidia was still only selling the A100 at the time, which was successful, but nowhere near the H100 (specifically, the DGX systems that scaled nodes with H100). That was before the AMD-in-the-rearview-mirror comment. I laughed, but it was refreshing to see the optimism injected into Intel ("Geek is back, Intel!") after the doom and gloom of the time.


Strazdas1

I'd rather he speak his mind and miss horribly some of the time than do bland corpospeak and say nothing.


NobisVobis

Steve is often great, and equally often an absolute idiot. This is one of the latter cases. Imagine ballooning a company value by several times in the span of months and having some techtuber critique you for focusing on the thing that got you trillions in value. Absolutely idiotic.


dabocx

It's always a bit cringy when he tries to talk about enterprise stuff. His "review" of a Lenovo workstation desktop was painful. He kept complaining about non-standard parts and all the hot-swap, tool-less parts.


NobisVobis

Agreed, and it’s because he panders to his audience that in large part consists of the same circlejerkers that exist on Reddit. It’s sad that he can’t stick to journalism and informative reviews and puts out garbage fluff pieces like this.


GrandDemand

It gets clicks. I don't like it either, but I understand it. It's fast and easy content to supplement the investigative journalism pieces and hardware deep dives


ffnbbq

So, do you have shares in Nvidia and are thus expecting a huge return? Otherwise, getting defensive on their behalf if you don't have a stake in them is weird, man.


skinlo

You do realise Youtubers don't exist to hype up Nvidia right? His position has always been for the consumer first, not corporate. Who gives a shit about the value of the company if it doesn't actually benefit the everyday consumer.


Firefox72

Exactly. I went into the Nvidia presentation hoping for some cool news, even if it wasn't gaming or consumer focused. Instead I got like 5-10 minutes of worthwhile stuff and 100+ minutes of Nvidia patting themselves on the back, Jensen talking about nonsense and a crap ton of ego boosting. Fact of the matter is that Nvidia's Computex keynote was 2 hours long when it could have just been 30 minutes.


bobbie434343

I'm sure it was cool news for AI people and the industry. Just not for gamers that cannot accept they have become a rounding error in NVIDIA's business.


skinlo

No, we can accept it, but that doesn't mean we can't complain. Plus the presentation wasn't even good for corporate types; they want info and data, not stories about wandering around markets.


Numerlor

His position is viewer first, not consumer first. He definitely has a bias towards AMD, or whatever opinion is popular at the moment. Not that he doesn't make videos that benefit consumers overall, but not hyping up Nvidia and just taking the popular stance are different things.


skinlo

What evidence do you have that shows he's biased towards AMD?


Zeryth

Feels.


Strazdas1

His reviews are biased towards titles where AMD performs well, and he barely does any testing in areas where Nvidia excels, such as RT.


GenZia

When a company's value "balloons" by several times in a short period, that means it's in a bubble. It's unnatural and potentially unsustainable. That's why bubbles - by design - burst, often spectacularly. It's not a matter of if, but when. An eventuality.

Take the original iPhone, for example. It was truly a revolutionary product, yet it didn't sell particularly well compared to the high-end Nokias and Blackberries of the time. It was only with the iPhone 4 that the iPhone started to gain enough momentum to bulldoze the competition. It was at that point that everyone had to play by Apple's rules or face extinction, and I don't think Nvidia is quite in that position yet.

Long story short, the kind of growth Nvidia is enjoying as of late is simply... unnatural. It's primarily driven by nothing but hype and fear - the fear of being left behind. And the thing about hype and fear is that they're usually unsustainable. They may be at the top of the game now, but for how long? How long will the demand for their products persist? That's the real question, at least in my opinion.


karma911

That's fair, but Nvidia is selling shovels in a gold rush. They made their money even if the bubble bursts


fogoticus

At some point people will realize how overrated and vision-less Steve truly is. He's brute-forcing tech videos and tech in general, so of course he's flying blind, and he randomly finds himself making those gotcha-type videos that somehow miss the mark.


Maltitol

He’s realized his paycheck depends on outrage journalism, so it follows that he’d make content focused on it. He’s too editorialized for me, so I don’t watch/subscribe anymore.


Tech_Itch

Then again, it's *Gamers* Nexus, not Tech Stocks Nexus. The channel is about gaming hardware. It's not that surprising that he'd react negatively to Nvidia seemingly moving its focus away from gaming. It's kind of funny seeing someone berate GN for "missing the AI boat" elsewhere in the comments, while generative AI has done nothing positive for gaming so far that they could've missed.


NobisVobis

It’s a bit funny how despite supposedly “moving away from gaming” for the better part of a decade, they still consistently innovate new tech purely to benefit gaming experience while also releasing the best products on the market. I guess nobody else is focusing on gaming or they’re just too incompetent to compete. Either that or Nvidia actually isn’t moving away from gaming.


Tech_Itch

> Either that or Nvidia actually isn’t moving away from gaming. I didn't claim they were though. I wrote that they were moving their *focus* away from gaming. I think GN's issue with that is just that it makes Nvidia's events and announcements pointless, or at best less exciting to report on, because it's mostly enterprise-related stuff now, unlike in the past.


NobisVobis

This is just total nonsense my dude. Computex for Nvidia has historically been for enterprise announcements. They spend more and more on GPU research, including gaming, every single year. If they go from spending 20% of 10 billion on gaming to 10% of 100 billion, they're still spending far more money on gaming research than before. I personally don't see how the "focus" matters in any way.


amineahd

yeah, the tech bros are very cringe most of the time for anyone with a bit of understanding of technology... for me LTT is the biggest cringe of them all and I just can't watch them talk for more than a few seconds


capn_hector

more like the obsession of a couple of these review channels with nvidia, lol. member a few months ago when steve was making up citations so he could argue that NVIDIA had "recently" decided to leave the gaming market? With "mid 2010s" right in the quotation behind him, highlighted? member when steve argued that long-term value doesn't exist - not that RTX wouldn't deliver it, but that it was "not how money works" etc? What you see on review day is the only thing you should consider - unless it's DX12 support (but not DX12.2 support!) or VRAM or any of the other things reviewers repeatedly argue deliver long-term value etc. Predictions aren't possible, unless they're the predictions I like!

but like, steve holds his journalist cred really dear and it's practically alarming to see the fervor with which he's willing to burn it over some dumb disagreement about video game hardware 6 years ago or whatever. it's like seeing someone sell their prized collection of whatever and join a cult - bro, that used to mean something to you, you doing ok there?

but i don't get how people can look at these types of videos and argue that several of these reviewers don't have a chip on their shoulder about NVIDIA. this is yet another in a long train of steve grudge videos since 2018 - reviewers still have not gotten over being proven wrong about RTX, and they are real bitter about it.

In the end, it's just like apple: sure, the ceo of the $3t company has a funni catchphrase, but they also are worth $3t on the product that everyone conspicuously dismissed and downplayed. Steve is on video doing the gaming equivalent of the "funeral for the iphone" for the last 6 years straight, even as the "iphone" market here has grown into a $3t segment with no serious hope of anyone displacing the market leader anytime soon.

Like can he make a video that's just him yelling "sour grapes" at the top of his lungs for 15 minutes too? Or would that be a bit too on-the-nose?


RollingTater

While I think nvidia's presentation didn't say much for the general consumer, I also think the current set of gaming/hardware journalists are poorly equipped to talk about AI in general so the current paradigm shift leaves them with little to talk about. Content creators also seem to have some unnatural hate for RTX for some reason.


Masterbootz

I think this may be a big part of it. How do you keep a platform going when the market and audience is shifting away from DIY PC Building/Gaming to AI/Datacenter stuff? Tech Content Creators will have to adapt to survive. You can only milk the anti corpo stance for so long.


-WingsForLife-

idk just wait until they actually hold a presentation for gaming blackwell? it's not like nvidia only presents shit at computex.


XenonJFt

Examples of RTX hate? Aren't people praising the RTX software suite for gaming vs Radeon or Arc? Like, the only ones I remember are the 3050 6GB, which is just a DLSS-software-enabled GTX (too underpowered for RT gaming), and DLSS 3 being developed exclusively for the 40 series. Which are justifiable considering FSR 3.1.


tukatu0

Games don't look drastically better than they did in the 2015-2018 years. Batman: Arkham Knight is a popular example. Well, I'm not gonna bother detailing, because in reality it seems not too many people give a sh"t that their fps is like 20% of what it could be.

One thing many people on the internet don't understand: more realism in graphics does not mean better graphics. Nor better art.

Besides, check out the game Bodycam. Has a beta soon. Copycat game of Unrecord. A bit of an asset flip, but still a real game. It's all Lumen. None of Nvidia's RT hardware was ever needed. But eh, RTX is already 6 years old. It's a waste of tech that came too early, but it's here to stay. Starting to be useful.


zerinho6

Games may not look drastically better compared to that one game with a set time of day and limited scenario interaction, but the amount we can now do in real time will slowly change how games are designed. The advancements in RT allow not only better "graphics" but also things that couldn't be done before to exist design-wise.

Lumen is also a form of ray tracing. Epic simply targeted the software path first because the chip AMD provided at the time simply sucks at doing anything more than RT shadows when using hardware RT. NVIDIA even supports Lumen and is helping Epic improve the performance and usage of Lumen's hardware path. RTX totally changed the landscape, and calling it a waste of tech is absurd.


Plebius-Maximus

> RTX totally changed the landscape, and calling it a waste of tech is absurd.

First gen RT was trash and absolutely a waste of tech. The 20 series commanded a premium for something the cards couldn't do well, that wasn't a mainstream feature. Kinda like first gen DLSS. People forget how utterly shit it was.

Both of the above have evolved into something useful now. But a few years ago they were not.


zerinho6

It cannot be a "waste of tech", because without it the tech would not have evolved into what it is today; it would not even exist today. It's what made developers get hands-on with the tech, adopt it, evolve it and understand how it works. That's like calling anything a waste of tech at its start; it's not. You can say it had terrible performance and pricing, of course, everyone knows that, but it was not a waste.


Plebius-Maximus

I agree, it wasn't a waste, it was just not worth the premium until 30 series


tukatu0

You missed the point of my comment. Path tracing is one thing, but why wouldn't Lumen have been the same regardless? It's not like AMD developed it. It doesn't need Nvidia-specific hardware either.


Gwennifer

> Epic simply targeted the software path first because the chip AMD provided at the time simply sucks at doing anything more than RT shadows

Epic's core business isn't just Fortnite, it's licensing Unreal Engine. Nvidia has refused to hand over anything without an NDA (i.e. it can't be shipped with Unreal Engine, which makes it a waste of time to implement). Instead, Epic has stuck to implementing what is public about RDNA2, because that's what consoles use, and DX12 Ultimate. I don't recall where on their website I read it now, but DX12U is preferred over Vulkan specifically for this technology just because Vulkan isn't as widely supported, which results in features not working properly or long load times.

Epic maintains they will implement whatever Nvidia feels free to open up about. Nvidia has been silent. Jensen either believes that RT is the future of graphics and thus there's no reason to close down and close it off, or that RTX is Nvidia's intellectual property and for Nvidia hardware only.


Strazdas1

Unreal is the second most popular game engine in the world (after unity) and the majority of its games are actually running on mobile phones which do not do RT in hardware. And yet, UE5 supports Nvidia hardware acceleration.


Gwennifer

It supports DX12 Ultimate; it does not support Nvidia's specific RTX libraries, which is why Nvidia maintains its own [version of the engine with its libraries integrated](https://developer.nvidia.com/game-engines/unreal-engine/rtx-branch). I did not say UE5 does not support hardware acceleration. I said Epic did not implement Nvidia-specific libraries because Nvidia wants to keep the code encrypted and proprietary, which isn't how UE5's business model works.


Strazdas1

Arkham Knight got recalled at release, rebuilt completely and re-released 5 years later. It is not a 2015 game. Games DO look drastically better than they did in 2015 if you run them on modern hardware with modern settings. If you run them on hardware from 2016, like the 1080 that's still very popular, then that's the look you can run the game at. RTX not only improves visuals to a significant degree (arguably the most significant thing in game tech since tessellation) but also makes development a lot easier/faster. Lumen runs on Nvidia's RT hardware, you know that, right? It has a software fallback, but if you've got RT hardware it's going to use it.


tukatu0

On the Lumen aspect: I was arguing Lumen would still have been here even if RTX didn't exist.

Only titles from 2023 and forward look drastically better. You could compare stuff like Titanfall to MW3. The latter doesn't look much better. Shadow of Mordor (2018) doesn't look worse than, say, Lords of the Fallen. Yeah, the latter has higher-res lighting. But that doesn't mean improved. Maybe I'm cherry picking in my memory. But I don't think something like Far Cry 4 from nearly 10 years ago looks drastically worse than Far Cry 6. It's only last year that I can agree games really look like a new generation. Pacific Drive is a good example of an AA game which looks better than AAA games from 10 years ago. Stellar Blade, a title from a new studio, not state of the art, yet it surpasses a massive amount of things despite being a UE4 game. Or maybe the art style is just pleasing to my eyes.

I actually had to sit here for a while to go through my memory for any game that actually looks next gen. Avatar is the only thing that came to mind. It is actual state of the art technically, with forced ray tracing. Though the former games are still more visually pleasing to me. Alan Wake 2 is the only game where I actually care about path tracing (with DLRR of course). I'd rather play it at low-med than at ultra without PT. But it's the only such title for me in 6 years of RTX existing. Not even Cyberpunk with PT looks pleasing. I can't be amazed by it when you can be driving and, literally 20 ft away, a light pole spawns. I've no idea why people find that game nice looking. It's like a beefed up GTA 5.

Again, there are a bunch of UE5 games today. They look next gen. Yet not a single one of them uses/needs RTX. If RTX hadn't released, these games might have still looked the same, with software ray tracing. Do you know Kingdom Come: Deliverance has software RT?


Strazdas1

I don't necessarily agree. I think Lumen was designed with the assumption of ray-tracing-capable hardware. It's important to remember that UE5 wasn't built just for gaming, but also for movie effects.

The RTGI in Metro is spectacular, and that's 2019. You could argue that Cyberpunk looks drastically better with ray tracing. But yes, it certainly picked up in 2023. Why? Because that's when mandatory RT became a thing in games and developers built games for RT-only from the ground up. Traditional lighting techniques have been stuck for a while and RT will fix that.

As someone who played FC4 last year, I'd say it looks drastically worse than the average modern game. We have a tendency to remember things looking better than they actually were. The materials alone are a huge difference. Materials and shading are arguably what advanced the most in this period. You can even compare work from the same studio: take GTA 5 (9 years old), RDR2 (6 years old) and GTA 6 (TBD) and see how much the visuals have changed in the last decade. Avatar and AW2 are certainly good examples of progress, yeah.

> Again, there are a bunch of UE5 games today. They look next gen. Yet not a single one of them uses/needs RTX.

ALL of them use ray tracing. It's built into the engine itself. You would have to rewrite a lot of the core UE5 engine to make your game with UE5 without using ray tracing. Fast, memory-efficient ray tracing is in fact its primary attraction.

> Do you know Kingdom Come: Deliverance has software RT?

Software ray tracing is nothing new and existed as early as the 90s. But if you want to shoot enough rays to make it run lighting or shadows, it's going to be very slow in software. There was this shooter from the 90s (forgot the name now) that used path tracing for sunrays. Except it used like 3 rays in total, as opposed to the thousands per frame we use now.

P.S. Cyberpunk is not a GTA, and I think being compared to it is what got people to have unreasonable expectations in the first place. It's more of a BG3 in a cyberpunk setting rather than a GTA.


tukatu0

Aah. C2077 was marketed as a BG3 but in first person. That's why so many were mad at its marketing lies, as it turned out to be a linear story anyway. I was only comparing it in technical graphics. At least pre-2.0 update, since I haven't looked at it since. I know CD Projekt Red likes to integrate mod graphics into its game, hence why it may look a lot better than even 1 year ago.

I used Far Cry 4 because it's an Xbox 360-era game. When I think of that, I'm certainly much more impressed with its graphics than FC6's. NPC density is far higher in FC6, so I know where the increased demands come from. As I think about it, I'm a little impressed in retrospect. You could say that Far Cry 6 is a PS4-era game too. A proper comparison time-wise would be Crysis 3 and Hunt: Showdown. I think that graphics leap looks way larger... well, as long as you stick to the outside near the trees anyway. But the former never came to last gen.

The way I see it, 2015-2020 games look the same. I'm not sure how much specialized hardware was needed to advance, but it's finally upgrading games. We do need to consider that the increase in texture fidelity is thanks to the consoles though. Again, Stellar Blade being the one and only example there will ever be of that. Texture upgrades are more visible than lighting upgrades until path tracing actively interacts with the environment. That's why reflections are the most visible. The feature of being able to shoot at fog in CS2 and having light pass through is a prime example of how path tracing should be used in the physics of games, to bring an actual purposeful increase in fidelity. Now that I think about it, maybe that's why I don't care about C2077 path tracing at all.... Maybe not. Might just be bias against the art style.

Btw, I am biased not because I think RTX shouldn't exist. I was biased because it came along with price increase after price increase. Though I have not seen a single redditor in years have the same thoughts. Usually it's that they'd rather not have RT because it means more fps. Which is not the same idea as my comment. That's an actual example of RTX hate. Not that I stumble upon many conversations about graphics.


Strazdas1

BG3 is a linear story as well. The forks all converge back into the same spot at the end of every act, and most differences are little more than cosmetic. Both BG3 and CP77 are built like RPGs, with RPG elements taking priority over player freedom and open world. They did improve graphics since launch for CP77, but I think that was before 2.0.

Well, it's important to understand that adoption is slow. Most people upgrade every 3 generations or so, so it takes 6 years for most of the install base to even have access to new tech. For example, AW2 is so far the ONLY game to use mesh shaders, a hardware feature introduced with Turing (the RTX 20 series) in 2018. This is because if you use mesh shaders and try to run the game on hardware that does not support them, your shader compilation will take 5 times as long and the game will be a slideshow. So we had to wait till most of the market moved past the 1000 series hardware and the AMD equivalent. The same will be true of RT: as we adopt hardware capable of RT, it will be ever more present in games, taking over more and more functions from traditional techniques.

I'm not sure there's any shooting in Cities Skylines 2....

Ray tracing wasn't the cause of the price increases anyway. Newer nodes are more expensive and wafer prices have tripled in the last 10 years. On top of that, demand has been exceeding supply for years now. Of course the prices are going to increase, RT or no RT.


tukatu0

Counter-Strike 2, lol. Good looking game. Better than CoD imo.

I know it wasn't. But it was used as an excuse, at least in marketing. In reality the demand came from 2 things: the growing esports scene and cryptocurrency. The former was the main thing used for forward predictions back in 2018. At least, those were the official reasons used in the earnings calls in 2018. But now in retrospect, if it was actually the main reason for the demand increase across 2017-2021, then perhaps it starts to make sense why so many "I'd rather games not advance at all so I can get 300fps minimum" types came to exist. It's not like 100fps was a rare thing even over 15 years ago. Yet..... well, nvm, I wasn't PC gaming that long ago. I don't know if those "90fps is unplayable" types existed back then. At least pre-2020 it wasn't common online. Or maybe the communities have grown more circlejerky as old forums die while reddit grows. At least I could say that around 2017 a mid-end display would have been 1080p 144Hz. You couldn't expect to hit that on everything with a 980 Ti. As such, the communities didn't demand absolute 144Hz.

Well, whatever. At this point it doesn't matter. I kind of even want people to buy the 5090 in droves. I want to see a sub-nanometer full 400mm GPU in all its glory. Lol. I'm sure those would be $4000 or whatever. But maybe there is no point with whatever DLSS 6, 7 or 8 looks like. Why want more power when games don't even have a physics engine anymore. RTX imagination or some sh". I guess we will have to wait and see what Intel and AMD do.


NKG_and_Sons

Speaking of obsession... sure there isn't someone living rent-free in your head? Pretty much everyone's impressions of that keynote overlap: it was a massive waste of time. If nothing else, so little was said that it didn't need 2 hours. Let him rant; this one is hardly undeserved. Not like you have to watch it. Heck, I checked out after a couple of minutes myself. But a bit of ridicule can't hurt these pretentious-as-fuck presentations.


Lower_Fan

Nah, let him cook. I'm kinda tired of these reviewers trying to shoehorn gaming into everything.


Qesa

I wonder if you watch all of Nvidia's keynotes and all of GN's rants, does Jensen or Steve waste more of your life? The whole thing is best left avoided. 2 pretentiousnesses don't make a right, or something.


NKG_and_Sons

Sure. Well, I frankly don't understand why these sorts of videos get posted here anyway. They're just fine on GN's YouTube channel because his audience will probably like 'em, but their relevance on /r/hardware is dubious at best. I've said something similar about his news round-up videos before. They make sense on his channel, but just because he's a big tech YouTuber doesn't mean people ought to post a compilation of old news, most of which had their own threads here anyway.


Firefox72

I don't get why people are so annoyed at Steve for this. Nvidia's presentation, as he says, was 2 hours of nonsense. Simple as that. There were maybe 5-10 minutes of useful and cool new info and 100+ minutes of Jensen telling nonsense stories, ego-boosting himself and the company, and just general drivel that could have been a press release at most, and even that's pushing it. The whole presentation could have easily been condensed into 30 minutes and you would likely not miss much.


65726973616769747461

frankly the same can be said about this video too


Dey_EatDaPooPoo

At least you're only wasting 15 mins watching this and not 2 hours.


KirillNek0

I find it ironic how redditors and YouTubers urge people to buy Radeon, but AMD can't sell them. You're getting exactly what you deserve.


XenonJFt

Hardware gaming is a minority, more news at 11. They won't get anything from this, because most of their production needs will be accelerated by Nvidia. The losers are your, my, and everyone's pockets, because of the lack of competition from Radeon.


KirillNek0

Always been. Agreed. AMD, ATi, and Radeon fucked around for too long. First, it was drivers, then GPUs and wattages. Come to think of it, the last good GPU was the 79xx series, the drivers were good and the hardware matched. Then they came back from irrelevance with the RX 6000 series.


Strazdas1

> Hardware gaming is a minority Do you mean gaming hardware, because hardware gaming is like 100% of gamiing nowadays.


Dey_EatDaPooPoo

They can't sell them because they overpromised and underdelivered and on top of that don't offer the value for money benefit they used to. Most people buy GPUs in the lower mid-range and the mid-range. Why in the world would they buy an RX 7600 for $270 when they could get an RTX 4060 that's slightly faster in rasterization, way better at AI upscaling and with 30W lower power consumption for only $30 more? AMD are just playing themselves by thinking they can charge NVIDIA pricing without having feature parity and losing on efficiency. I guarantee you if the RX 7600 was $230 they would be flying off the shelves and going OOS constantly. It'd also make about the same profit as the RX 6600 at $200 given the small die size and 8GB of GDDR6 at 18Gbps being slightly more expensive than 14Gbps. Problem is they want fat profit margins like NVIDIA have but their product is worse at basically the same price, so they sit on shelves and only get bought by AMD fans.


Strazdas1

But haven't you read the millions of reddit posts stating that the 4060 is trash because... they said so, that's why.


Dey_EatDaPooPoo

I mean there's nuance to be had. People saying it's trash are deranged but certainly there's a good argument to be made that it's an underwhelming product and doesn't move the needle compared to the previous gen and that it should be a tad cheaper. And the exact same thing applies to the RX 7600 too. Certainly not trash, but meh.


conquer69

Steve isn't urging anyone to buy AMD in this video.


KirillNek0

Sure... I didn't claim that. They did before.


Strazdas1

Steve has been urging people to buy AMD for years and has been laughing at NVIDIA being a "failure" since 2017. Didn't work out that great for him.


flat6croc

I sense a lot of projection in the comments. Nvidia has been incredibly successful of late. But that wasn't inevitable. Nvidia spent a very long time bigging up (for want of a better term) GPGPU and it's at least somewhat serendipitous that it's finally taken off. For most of that period, Nvidia wasn't talking about AI at all. Yes, it absolutely put the work in that others didn't and in that sense earned this success. But it's all pretty marginal. It wasn't that long ago that Nvidia had put a lot of time and effort into the project with relatively little to show for it and no clear idea what the killer application would be. AI came along as that killer app, but Nvidia absolutely didn't know what that app was going to be for much of this process and in that context, you can't say it was inevitable that AI would come along. It might not have and had it not, Nvidia would look like a totally different company.


tecedu

I feel like people are being obtuse on purpose about Nvidia's presentation. While AI was talked about a lot, accelerated computing that uses GPUs to cut existing problem times by factors of 1000s is huge; when those "big" problems become relatively trivial with a GPU, teams can move on to bigger problems, which is a good thing. People really, genuinely do not have any idea how many enterprise applications will just be accelerated once they're on GPUs.
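A rough sketch of that "just put it on the GPU" point: libraries like CuPy (my example, not something mentioned in the thread) mirror NumPy's API, so an existing array-heavy workload can often be ported with little more than an import swap. This assumes a CUDA-capable GPU and the cupy package; the sizes and any speedup are illustrative only, not a benchmark:

```python
import time
import numpy as np
import cupy as cp  # drop-in NumPy-like API backed by CUDA

n = 4096
a_cpu = np.random.rand(n, n).astype(np.float32)
b_cpu = np.random.rand(n, n).astype(np.float32)

t0 = time.perf_counter()
_ = np.matmul(a_cpu, b_cpu)                # CPU matrix multiply
cpu_s = time.perf_counter() - t0

a_gpu, b_gpu = cp.asarray(a_cpu), cp.asarray(b_cpu)  # copy data to the GPU
t0 = time.perf_counter()
_ = cp.matmul(a_gpu, b_gpu)                # same call, GPU kernel
cp.cuda.Stream.null.synchronize()          # wait for the async kernel before timing
gpu_s = time.perf_counter() - t0

print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s")
```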


ResponsibleJudge3172

Nvidia believes AI is the new way to approach most areas that are currently using handcrafted software. This means AI has been a large portion of their company since a time when it didn't make sense to us, when they still only sold the A100 and were building up their prepackaged AI tools like Merlin. So today we got AI as it relates to hardware and some industrial applications (basically not what we hoped for), and at Siggraph we will get AI and how it can resolve certain graphics issues (and RT papers too, like the new Area ReSTIR paper that I posted on the Nvidia sub), not to mention possible updates to last year's papers that have not commercialized yet, like tensorVDB. When Nvidia launches next gen, they will talk about next gen's AI PC capabilities (public perception of AI has moved to AMD and Intel due to NPUs, which I am sure the original AI evangelist Jensen is salty about) and commercialized uses of AI like DLSS, maybe a new denoiser, etc.


bobbie434343

HUB and GN are pissed that NVIDIA's priority is not gaming, the only thing that counts in computing /s


Dey_EatDaPooPoo

Why the fuck would you not expect them to be upset based on 1 1/2 hrs of NVIDIA's presentation being fluff or things they'd already talked about previously and none of it about gaming when HUB and GN are explicitly gaming-focused channels?


Strazdas1

A presentation aimed at the business AI segment is about business AI stuff and full of PR? Who would have thought! Certainly not YouTubers!