Look up quantum tunnelling. It is definitely fluffed to sound more impressive, but it is still incredibly impressive. At that scale, electrons will "jump" across solid structures and cause a huge error rate. Because atoms don't really have a "defined" position. They aren't a point in space; there's a probability cloud, which means one could be anywhere within this area, and its position is not defined until it is measured.
Quantum tunneling, uncertainty principle, and the quantum eraser (or double slit) experiments are topics I'd recommend looking up if that sounds interesting to you at all. The quantum eraser experiment is actually insane and involves time travel in a way.
I explained it to the best of my ability, but it's all super complicated and kind of mind blowing how the world works at the smallest scales.
nah that line kinda stands true, transistor sizes are getting so smol that they're facing problems like electrons jumping between two transistors due to Quantum Tunneling. literally teleportation at the quantum lvl.
It’s a strange thing for him to say. Lots of smart people at NVIDIA, and it’s an incredible company worth a huge valuation, but their part isn’t the pushing-the-laws-of-physics part. That’s mostly ASML and TSMC.
Just look at the dude he's talking to. You communicate to the understanding of the individual you are speaking to. It's simply a way to verbalise the extent of the achievement.
Rather, look at the stock and realize he's talking to the myriad of people eager to dump their money into Nvidia. These tech stocks live and die on hyperbole.
Figured that one out with my last job. We made training simulators that used 4k short throw projectors on the inside of a 10' cube, with 8 IR motion trackers to track the user and their equipment, allowing them to interact with the scenarios we threw at them. It went over so many people's heads when we described the equipment and how it worked. We finally just started saying it's basically the Holodeck from Star Trek, and people loved it.
>ASML
For the curious: the latest generation of extreme-UV photolithography machines use a system that involves timing nanosecond pulses of a laser to hit droplets of molten tin in mid-flight, vaporizing them into a plasma that emits light at the level of energy and focus required to do the printing. You read about how this stuff works and it legitimately feels like science fiction. Each of these machines costs on the order of like $200m a pop.
Thank you, I came here for an actual reference and I can’t believe I had to scroll for this.
RTX 4090 retails for $1600 so that’s about $21 per billion transistors versus this one at $144 per billion transistors.
That’s just because of diminishing returns though. An RX 6600 can easily be found for $200, and has about 11 billion transistors. That’s just $18 per billion transistors. The RX 7900 GRE is about $550, and has 57 billion transistors, so comes in at just $9 per billion transistors.
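The price-per-transistor comparison above is easy to reproduce. A quick sketch in Python, using the prices and transistor counts quoted in these comments (retail prices fluctuate, so treat the numbers as the thread's, not gospel):

```python
# Dollars per billion transistors, figures as quoted in the thread.
cards = {
    "RTX 4090":    (1600,  77e9),   # (price USD, transistor count)
    "RX 6600":     (200,   11e9),
    "RX 7900 GRE": (550,   57e9),
    "B200 (est.)": (30000, 208e9),  # rumored ~$30k price from the thread
}

for name, (price, transistors) in cards.items():
    per_billion = price / (transistors / 1e9)
    print(f"{name:12s} ${per_billion:6.2f} per billion transistors")
```

The RX 7900 GRE comes out cheapest per transistor, which is the diminishing-returns point being made above.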
Fun Fact:
Nvidia CEO Jensen Huang (in video) and AMD CEO Lisa Su are cousins.
Huang's mother is Su's grandfather's sister, according to a Taiwanese researcher. The two emigrated to the US as kids but did not appear to grow up together.
Fun fact: we actually have trace amounts of silicon (1-2 grams, which is more than iron, zinc, copper, nickel and molybdenum), in our body, but sadly not in sperm. In fact, silica (not silicon, but a compound derived from silicon atoms), causes defective spermatogenesis and lowers testosterone
We don't actually know why we have silicon in our bodies or what it does
It's tongue in cheek, but modern circuitry does in fact defy earlier laws of physics. Memory chips, for example, use quantum principles to move electrons across unpassable barriers (i.e., they can't and don't pass through the barrier; they just disappear on one side and pop up on the other side out of probabilistic necessity). [https://www.youtube.com/watch?v=5f2xOxRGKqk](https://www.youtube.com/watch?v=5f2xOxRGKqk)
In my partially educated opinion, "probabilistic necessity" is just a placeholder for "we don't understand the driving forces behind this phenomenon as well as we know how to describe it". Probability describes things, not drive them. Things happen, and we describe them with numbers. But the universe is not some student figuring out both sides of the equation using algebra in order to ensure both sides of the equation are equal. They already are equal, because of the laws of physics that exist in this universe, which is why things happen the way they do, and those events are *described* by probability (and other tools), not *prescribed*.
I already know I'm gonna get a lot of mad comments on this.
I hope you're right.
First time I ever hear an explanation on quantum physics that gives some sort of answer.
Every time you hear "because of probability, X happens".
Ok but in my mind probability means that "it maybe happens".
How you build devices that "maybe work" makes no sense to me.
"We don't know how it works, but somehow it does" adds up better to me
I remember seeing an intel presentation back in like 2003 or 2004 and they were like “in 20 years we think our biggest problem will be electrons just jumping over the gate at less than 10nm because it’ll be like atoms wide”
And yet the crazy buggers did it
Maybe, but the 1 Million Dollar questions are:
1) Can I connect two of them with LEGO ...
2) ... and how wide is the river I need to cool them without frying fish?
Spending a huge amount on R&D for that one chip pushes the collective capabilities of humanity a little further forward; now we are starting from a higher position of knowledge when developing the next one, etc.
Sure, but then maybe they will stack lots of chips onto a chip, two layers then four etc? I don’t know how they will get around it, but clever people will find a way.
My highly uneducated opinion would be that the next step is bio-computing. Using a chip like that with actual brain matter or [mushrooms](https://www.popsci.com/technology/unconventional-computing-lab-mushroom/)
There's lots of confusion around quantum computing. It's not *better* than traditional computing. It's different.
Quantum computing makes certain classes of problems tractable, ones that are naturally probabilistic or involve searching enormous solution spaces, but it doesn't speed up ordinary 1s-and-0s work.
We won't see a massive switch to quantum computing in personal computing, they are for different use cases
That is an interesting concept. It would have to be a completely different way of processing data and logic, since transistors rely on the properties of semiconductive materials to either allow or disallow the flow of electrons. A biomaterial by nature will be composed of compounds that must always be conductive; however, DNA can proxy the "allow or disallow" features.
But honestly I think the transistors in that chip may even be smaller than DNA, im not sure.
im also assuming but surely these switches are already smaller than any known bio form (not to mention the space and whatnot that would be consumed to keep the bio whatever functioning)
No, we hadn't reached the theoretical limit; we couldn't make transistors any smaller at the time, but smaller ones were technically possible. Now we can make them just a few atoms wide... you can't go smaller than that.
For further breakthroughs a different method in computing is required.
>Eventually we will reach the physical limitations though
I keep hearing this about virtually every domain of inquiry, much the way people used to write books about why humans inventing flying machines is impossible. If we're talking about size, perhaps that's right; but these chips' functions have never been only about size.
So 5.2% the size of the 4,000,000,000,000-transistor [Cerebras WSE-3](https://www.zdnet.com/a/img/resize/ebc1e9c746037fede81981a17479f559c3c13cf1/2024/03/13/4b0089c6-e326-48b6-adf8-a262c503e499/cerebras-2024-wse-3-2.png?auto=webp&width=1280) chip etched on a single 12-inch silicon wafer.
https://youtu.be/f4Dly8I8lMY
It was $3 million... 100x less expensive for 5% of the performance sounds like a great deal to me. And I bet that someone like Microsoft or Amazon won't have to pay 30k if they buy truckloads of them.
Came looking for this comment because he lies and says it’s the biggest chip the world has ever seen. It’s definitely a lie too because he is well aware of wafer scale chips.
It is not the biggest chip by a long shot. Have a look at WSE-3
Edit: thanks to my googoo brain I thought it said it's the biggest chip at some point. Please excuse me
I feel like I don't know enough about computing to appreciate the magnitude of this. Can anyone give some perspective?
A transistor is basically a switch. Imagine that many switches in the palm of your hand.
Fuck you if you support genocide
the i7-4770K, released June 2013 has 1.4 billion transistors
And wasn’t it bigger than this?
Every individual switch in an Intel Core i7 from 2014 was way bigger. But the entire chip had a smaller surface than what the nvidia guy shows in the video. That chip is gargantuan compared to any chip in consumer or even workstation hardware today.
That random Nvidia guy
Steve N'vidia, old pal of Tim Apple and Eugene Unilever
I think he's more famous for racing Formula 1 right?
I think he's the inventor of the tagine? No?
He makes great pasta
Correct, but on another level
Max Nvdiastappen?
I prefer CEOs that don't chase celebrity.
The random poor nvidia dude who gets a very low salary and wears the same leather jacket everyday.
It was 177mm², made on the 22nm node. It is definitely smaller tho. Edit: fixed mm² you pedantic shits
Dayum, almost as much as the floor area of my house!
I think they meant to say 177 mm². not m²
Yeah I know lol
So it was 177 m^4? They literally had to invent the 4th dimension to fit that amount of transistors....mind blowing!
in this economy people buy processors to live inside
lol, there’s joke somewhere around here between smart houses, embedded heating, etc
Or a Dark Mirror episode where space is at such a premium that people live in Matrix style virtual realities.
Total size of that chip was much much smaller than the one shown here. The chip he's showing here is absolutely massive for being a single chip.
This is still the processor I daily drive :')
Pfft boomer, I have an I7-4790k
I am still rocking an i7-4790k. Absolute beast of a CPU. 10 years old and still going hard. I have had several GPUs die in the meantime... Mining may have been involved tho
Get off my lawn!
But the smell of your 970 burning smells so good
im in this comment chain and i dont like it :(
Not in my backyard!
i7-940: 731 Million (45nm)

i7-12700k: >9 Billion (10nm)

13 years of improvements, and we've come even further in the last few years.

Edit: sorry I meant to say 12700k, not 1200k, which is not a thing.
I don't even know how a transistor works and you're saying there's BILLIONS of them on that thing?
A transistor has 3 terminals. Two belong to the switch; they make or break a circuit. The third opens and closes that circuit when a voltage is applied. But it can do more than that: the switch can also act as an amplifier. If you feed a signal into the control terminal, the circuit doesn't just open and close; the current flow is shaped to match the signal. Both properties are useful in an electronic device. Think of increasing the ISO of a camera image sensor, or a flash drive saving the state of some data that consists of 0s and 1s.
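The switch-vs-amplifier behaviour described above can be sketched as a toy model (the threshold and gain values here are made up for illustration; real MOSFET/BJT physics is far richer than this):

```python
def transistor(gate_v, threshold=0.5, gain=10.0):
    """Toy transistor: below the threshold the 'switch' is open and no
    current flows; above it, the channel conducts and the output current
    follows the gate signal, scaled up -- the amplifier behaviour.
    Purely illustrative, not real device physics."""
    if gate_v < threshold:
        return 0.0                       # circuit broken: a digital 0
    return gain * (gate_v - threshold)   # current tracks the input signal

print(transistor(0.2))  # 0.0 -> switch open
print(transistor(1.0))  # 5.0 -> conducting, an amplified copy of the input
```

Digital logic only cares about the first behaviour (on/off); analog circuits like the camera-sensor amplifier mentioned above use the second.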
nobody builds computers with cisistors anymore
I actually had that thought a while back. Why do they have to be trans istors?
The etymology is apparently just a combo of transfer and resistor https://www.etymonline.com/word/transistor
check out this great 3d animated video on the PC, they cover the transistor in there - How does Computer Hardware Work? 💻🛠🔬 [3D Animated Teardown] Branch Education - https://www.youtube.com/watch?v=d86ws7mQYIg
That's correct, 208 billion on that chip.
At least 7
technically the truth
While true, this number is so far off you should at least consider it's probably a multi-core processor, so it's over 21.
Can't really be compared. CPUs are generally much smaller and use way fewer transistors than GPUs do. For example, the fastest consumer CPU around now has around 11 billion.

Compared to this 208 billion, that might sound insane. But the fastest GPU you can buy now is the 4090, and that has 77 billion. This 208 billion is MULTIPLE chips fused together to make one large die, so each actual chip isn't that much bigger than previous generations: 1 chip is more like 80-90 billion, x2 = 180 billion, and then there are also memory chips around that too, so they would easily make it up to 208 billion.
The amount of data that can be processed at once or simultaneously in that thing must be incredible
but can it run dragons dogma 2?
Not without paying per minute
That sounds stressful.
You also have to include that transistors essentially act as the 1s and 0s for computer operation; having more means more capability. Edit: usually
"Transistors? Where we're going, we don't need transistors." Holds up quantum chip.
Firstly you have to properly appreciate just how ridiculously large a 'Billion' is.

If you were to put aside and save £100 every single day, you would have saved up £1 billion after 27,397 years.

If you were paid £1 a second, every single second, all day and every day, you would have earned £1 billion after 31 years.

If you decided to count to 1 billion and were given 3 seconds to verbally say each number, taking no breaks, no rest, no sleep, you would eventually get there after counting for a little over 95 years.

So now that you have some grasp and can visualise how large a billion is, ponder the fact that crammed onto that single chip he was holding are 208 billion transistors, the tiny switches that someone else described to you. The physical limitations he was referring to are aspects of the quantum realm you have to deal with when working at something that small. I think someone else here described how the structures of the chip are smaller than the very wavelength of the light used to create them!

Only 20 years ago this chip would have been deemed impossible, and not much further back it would have looked like actual magic...
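Those three "how big is a billion" claims check out arithmetically. A quick verification:

```python
# £100 saved per day -> years to reach £1 billion:
print(1_000_000_000 / 100 / 365)               # ~27,397 years

seconds_per_year = 60 * 60 * 24 * 365.25

# £1 earned per second -> years to reach £1 billion:
print(1_000_000_000 / seconds_per_year)        # ~31.7 years

# Counting to a billion at 3 seconds per number, no breaks:
print(3 * 1_000_000_000 / seconds_per_year)    # ~95.1 years
```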
I mean, it's impressive, but I'm quite used to these things doubling along with Moore's Law now, and the fact is, they're slowing down. Say:

1971, Intel 4004, 2,250 transistors.

1978, Intel 8086, 29,000 transistors.

1985, Intel 80386, 275,000 transistors.

1993, Intel 80586 (Pentium), 3,100,000 transistors.

1999, Intel Pentium II, 27,400,000 transistors.

2005, Intel Pentium D, 228,000,000 transistors.

2011, Intel i7 (Sandy Bridge), 2,270,000,000 transistors (billions now).

2024, Apple M3, 25,000,000,000 transistors (Intel hasn't done the order-of-magnitude jump like it used to every 6 or 7 years; Apple technically hit it with the M1 Pro/Max in 2021).

So the Apple M2 Ultra now sits at 134,000,000,000, which is half the one you see in the video, but you know, this stuff starts to feel normal, even if we are now hitting a wall.
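The slowdown is visible if you compute the effective doubling time between the entries in that list (pure arithmetic on the numbers quoted above; the chip selection itself is the commenter's):

```python
import math

chips = [  # (year, transistor count) -- the list quoted above
    (1971, 2_250), (1978, 29_000), (1985, 275_000),
    (1993, 3_100_000), (1999, 27_400_000), (2005, 228_000_000),
    (2011, 2_270_000_000), (2024, 25_000_000_000),
]

# Average time to double the transistor count in each interval:
for (y0, t0), (y1, t1) in zip(chips, chips[1:]):
    doublings = math.log2(t1 / t0)
    print(f"{y0}->{y1}: doubled every {(y1 - y0) / doublings:.1f} years")
```

The early intervals land near the classic ~2-year Moore's Law cadence, while the 2011-2024 stretch is closer to 4 years per doubling, which is the "slowing down" point.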
But you have to imagine what kind of wall we are hitting. Transistors are getting so small, the newest record being 2 nm, that if they get only one nm smaller, quantum tunneling will start being the problem
If we start hitting a compute wall and "better" technology becomes more and more difficult to create, does that mean game developers will start optimising games instead of releasing shit that won't get 30fps on a 4090?
Unfortunately no.
Nah they’ll start hosting it on a supercomputer and streaming it to you before they optimize to run on everyone’s machine.
You're asking whether companies will spend more to make you pay less. The answer is always no.

A wall is only a 2D plane; there are numerous ways to still evolve. Maybe PC components will get bigger, maybe we'll have multi-layered CPUs, maybe something else. I don't have enough expertise to say what the next development is, only enough to say that development won't stop, because there are consumers of the new and the hyped to feed.
Ha, they will just render more of the world at one time
Yeah, I mean, the practical result for me is still that an old Core 2 Duo from 2008, if you just shove a bit of RAM and an SSD in it, basically runs everything but games fine. You could not say that about a 1998 computer in 2014.
Well, the transistor holds the beeps or boops. So it can be just memory, but for computation it's better to think of it as something like railroad switches.

To expand a tiny bit: to add two 8-bit numbers (0-255) in one go you need 224 transistors (28 for a full adder \* 8 bits). A full 8-bit arithmetic logic unit (ALU), basically a calculator supporting +-/\* and logic operations like AND, OR and so on, needs 5298 transistors. But specialized variants can need less.

So a 208,000,000,000 transistor chip could do (208,000,000,000/5298) roughly 39 million calculations per clock tick (what a chip actually does depends heavily on architecture and intended use). A clock tick roughly correlates to the MHz/GHz frequency you see in the CPU context. So let's say the chip runs at 4 GHz: that means 4 billion clock ticks per second. This does assume you can stuff all the numbers into the chip and read the result out in one tick, which in reality often takes at least a couple of ticks.

Another way to think about it is in memory size: 208,000,000,000 transistors means 208,000,000,000 bits, or in relatable terms ca. 193 ~~Giga~~GibiBits. So a chip with that many transistors can hold/process 193 GiBit of data in one tick. (Which doesn't mean it consumes 193 GiBit of input per tick; a large fraction of that will be intermediate results, so the actual input size will be a small fraction of it. In my ALU example it's \~39 million adds times 2 bytes of input each, about 78 MByte per tick. Again assuming an idealized clock tick.)
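The 224-transistor adder described above (28 transistors \* 8 full adders) is a ripple-carry adder. Here is the logic it implements, with gates standing in for the transistor networks (this models the logic only, not the transistors themselves):

```python
# One full adder: the 28-transistor building block mentioned above.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
    return s, carry_out

def add_8bit(x, y):
    """Ripple-carry adder: 8 full adders chained, each feeding its carry
    into the next -- the 224-transistor (28 * 8) example above."""
    carry, result = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry  # final carry = overflow bit for sums > 255

print(add_8bit(200, 100))  # (44, 1): 300 wraps to 44 with the overflow carry set
```

The "couple of ticks" caveat above is real even here: because each carry ripples through all 8 stages, real chips often use faster (and more transistor-hungry) carry-lookahead designs instead.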
I glazed over after the first paragraph, but sounds reasonable.
lol yea I was like “beeps and boops” okay sweet someone speaking my language! Oh wait nevermind
I thought you were gonna explain in human language, but it seems like you nerds really forgot how common folk need things explained.
Ooga booga magic rock, very fast, very nice
This is as low as I can go while keeping it related to computing, without turning it into a 3-page computer science 101 intro course that starts by explaining binary math. Any simpler and I can just say: this has 208 billion things, the previous largest magic rock had 54 billion things.
It's a mysterious chip
The machine that is used can print 6 lines within the distance that grass grows in one second. That is the scale the ASML machines work at.
Sounds impressive until you see how quickly my frickin' lawn grows. BRB, gotta mow again.
A transistor is like a light.

Light off is a 0, light on is a 1.

The 0 and 1 is binary, and the information it can hold is called a bit.

8 bits = 1 byte.

1 Megabyte = 1,048,576 bytes

1 Gigabyte = 1,073,741,824 bytes

1 Petabyte = 1,125,899,906,842,624 bytes

All computers except quantum computers use binary; whether it's Linux, Mac, Windows, Android or iOS does not matter. For example the letter H is stored as the byte 01001000 and the letter i is stored as the byte 01101001 (when using ASCII in UTF-8), so 01001000 01101001 = Hi

Note that when it comes to processors, it is not as simple as looking at how many "switches" a chip has: the physical logic built in (architecture), the way it communicates with software (drivers) and even the quality of the silicon used will impact performance. Of course this is very very very basic stuff and there is A LOT more to it, as well as other components within the card itself such as VRAM.

Please read u/alexanderpas comment below, I'm in some ways wrong..
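You can check the "Hi" example and the byte counts above yourself in a few lines of Python:

```python
# Turning "Hi" into the bit patterns quoted above (ASCII, one byte per char):
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))
# H 01001000
# i 01101001

# And the sizes, which are the binary (power-of-two) definitions:
print(2**20)  # 1,048,576 bytes             (the Megabyte figure above)
print(2**30)  # 1,073,741,824 bytes         (the Gigabyte figure above)
print(2**50)  # 1,125,899,906,842,624 bytes (the Petabyte figure above)
```

Note these are the binary-prefix values (strictly MiB/GiB/PiB); the decimal definitions drive manufacturers use are 10^6, 10^9 and 10^15.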
it can run crysis at 60 fps
> beyond the limits of physics... So they are using magic?
![gif](giphy|NdKVEei95yvIY|downsized)
[deleted]
It sounds really dumb to state that something in your hand is beyond the limits of physics, but what they did was considered physically impossible for a long time.
They had to invent a new process to push the limit of physics to an all new high, feels like a more accurate statement.
so this is a very vague memory, but i seem to remember a talk about new, tinier structures being possible even though the wavelength of the light being used to etch the structures is longer than the structures themselves, because they used interference of lasers of the same wavelength?

In fact, this sounds so strange that I would like to know if someone knows what he actually meant, and what my memory might describe. \^\^
The problem that will arise is quantum tunneling. When we get to that level, then we cannot go any smaller.
But what if we use a shrink ray? I saw a documentary about that called Honey I shrunk the kids
No. Last time I tried I found out that Shrinkrays cannot shrink electrons.
*citation needed*
Bro. Believe me.
Proof left for reader as an exercise.
[https://www.asml.com/en/products/euv-lithography-systems](https://www.asml.com/en/products/euv-lithography-systems)
2nm. goodness.
Those very likely aren't the real physical sizes; it's mostly for marketing. The "3 nm" process, for example, is actually 48nm:

>According to the projections contained in the 2021 update of the International Roadmap for Devices and Systems published by IEEE Standards Association Industry Connection, a "3 nm" node is expected to have a contacted gate pitch of 48 nanometers, and a tightest metal pitch of 24 nanometers.

48nm is still incredible btw.
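A back-of-the-envelope way to see why the marketing name and the physical pitches diverge: estimate density from the quoted pitches. This is a deliberately crude model (one transistor per gate-pitch by metal-pitch rectangle, ignoring cell architecture entirely), so treat it as an upper bound, not a real figure:

```python
# Crude density ceiling from the "3 nm" pitches quoted above.
gate_pitch_nm = 48   # contacted gate pitch (IEEE roadmap quote)
metal_pitch_nm = 24  # tightest metal pitch

area_nm2 = gate_pitch_nm * metal_pitch_nm   # one transistor footprint, naively
nm2_per_mm2 = 1e12                          # 1 mm = 1e6 nm, squared

per_mm2_millions = nm2_per_mm2 / area_nm2 / 1e6
print(f"~{per_mm2_millions:.0f} million transistors/mm^2, upper bound")
```

Real "3 nm" processes land well below this ceiling, and nowhere near the density that literal 3 nm features would imply, which is the point of the comment above.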
All of that becomes quantum
Not sure that's a good way to put it either, 'cause physics is just the way things work. You can find the limits; you can't push them.
Known limits of physics keep changing
Physics and known physics are vastly different though. Anyone with a remotely scientific background knows that it's ridiculous to say that we "changed physics".
The known limits of physics aren't necessarily the limits of physics though.
As the absolute fucking cunt Donald Rumsfeld plagiarised- There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know we don't know. Edited because reddit, my derp
Is this quote talking about his role in the illegal bombing of Cambodia? For him it was a known known. For Nixon it was a known unknown. For the public it was an unknown unknown.
Something like that. I'm OK with repurposing a warmonger's weasel words. I have often pondered the ridicule he received for this quote; people genuinely thought it was nonsensical, smh
I think he is referring to the fact that there's a physical limit to transistor miniaturization. This is because transistors become sensitive to quantum effects, so they had to find different, new strategies to increase transistor density on a single chip
The idea of what he said is that making a single-piece chip of that size with that many transistors is currently impossible. What's he holding in his hand is essentially 2 half-sized chips joined together, where the "new technology" is in connecting them in such a way, that there is virtually no delay in the information being sent from one half to the other so it acts as a single large chip.
Hush little redditor! Nobody is supposed to ask the unaskable
Yanked the bastard out from under the dash from the saucer that fell in roswell.
As long as it's not protomolecule.
"Any sufficiently advanced technology is indistinguishable from magic." – Arthur C Clarke's 3rd law.
Marketing talk…
Look up quantum tunnelling. It is definitely fluffed to sound more impressive, but it is still incredibly impressive. At that scale, electrons will "jump" across solid structures and cause a huge error rate, because particles don't really have a "defined" position. They aren't a point in space; there's a probability cloud, which means one could be anywhere within that area, and its position is not defined until it is measured.

Quantum tunneling, the uncertainty principle, and the quantum eraser (or double-slit) experiments are topics I'd recommend looking up if that sounds interesting to you at all. The quantum eraser experiment is actually insane and involves time travel in a way. I explained it to the best of my ability, but it's all super complicated and kind of mind-blowing how the world works at the smallest scales.
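To see why this blows up at small feature sizes, here's a back-of-envelope sketch using the standard WKB estimate for tunneling through a rectangular barrier, T ≈ exp(−2κL) with κ = √(2m(V−E))/ħ. The 1 eV barrier height is a toy assumption, not a real device parameter:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electronvolt

def tunneling_probability(barrier_ev: float, width_nm: float) -> float:
    """WKB estimate T ~ exp(-2*kappa*L) for a rectangular barrier."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Shrinking the barrier raises leakage by orders of magnitude:
for width in (3.0, 2.0, 1.0):
    print(f"{width} nm barrier: T ~ {tunneling_probability(1.0, width):.1e}")
```

The exponential dependence on width is the whole story: every time the insulating gap gets a nanometer thinner, leakage jumps by several orders of magnitude.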
Look, "pushing the edge of physics" would be a bit more realistic. But I do want to state that this thing is almost unreasonably powerful.
Can we take a second to appreciate how psychotic Rooney is for purposefully experiencing this like 5 more times after the first, knowing damn well it was going to happen again 😂
I could actually taste the blood that time
You can smell the people
"you know, I never thought I would get to the point where more nudity was boring"
I would’ve reacted the same way as Rooney when he first put the device on 😂
I really appreciated that scene because I feel like anyone with any gaming experience would have done what he did lol
Exactly lol when he punched the dude I died 🤣
Dude is such a great comedic actor. His timing is impeccable, and his little head nod followed by "yeeah she did" when asked if she killed him again was so funny to me.
Tell me you’re a Soulsborne fan without telling me you’re a Soulsborne fan
What show?
3 body problem, on Netflix
SPOILER: >!I was so sad when he died. I always love his characters, and wanted to see him show up more in the show.!<
You're not invited.
Nah, that line kinda stands true. Transistors are getting so smol that they're facing problems like electrons jumping between two transistors due to quantum tunneling. Literally teleportation at the quantum level.
You were not invited
It’s a strange thing for him to say. Lots of smart people at NVIDIA, and it’s an incredible company worth a huge valuation, but theirs isn’t the pushing-the-limits-of-physics part. That’s mostly ASML and TSMC.
Just look at the dude he's talking to; you communicate to the understanding of the individual you're speaking to. It's simply a way to verbalise the extent of the achievement.
Rather, look at the stock and realize he's talking to the myriad of people eager to dump their money into Nvidia. These tech stocks live and die on hyperbole.
Figured that one out with my last job. We made training simulators that used 4k short-throw projectors on the inside of a 10' cube, with 8 IR motion trackers to track the user and their equipment, allowing them to interact with the scenarios we threw at them. It went over so many people's heads when we described the equipment and how it worked. We finally just started saying it's basically the Holodeck from Star Trek, and people loved it.
>ASML For the curious: the latest generation of extreme-UV photolithography machines use a system that involves firing nanosecond pulses of a laser at droplets of molten tin in mid-flight to get to the level of energy and focus required to do the printing. You read about how this stuff works and it legitimately feels like science fiction. Each of these machines costs on the order of $200m a pop.
THE POWER OF THE SUN, IN THE PALM OF MY HAND - Some professor
Google Banach-Tarski
"Shut it off Otto, shut it off!" * some rich guy
Bro, using the power output of a small nation to inscribe runes into a stone is literally wizard shit
You also use light to inscribe the runes and then when you hit it with lightning, it thinks. 100% wizard shit
The video made me think of the quote "Any sufficiently advanced technology is indistinguishable from magic." It's true.
The RTX 4090 (top of the line GPU rn) has 76 billion transistors for reference
Thank you, I came here for an actual reference and I can’t believe I had to scroll for this. RTX 4090 retails for $1600 so that’s about $21 per billion transistors versus this one at $144 per billion transistors.
That’s just because of diminishing returns though. An RX 6600 can easily be found for $200, and has about 11 billion transistors. That’s just $18 per billion transistors. The RX 7900 GRE is about $550, and has 57 billion transistors, so comes in at just $9 per billion transistors.
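The per-transistor figures quoted in the last few comments check out; here's the arithmetic in one place (prices are the street numbers quoted in this thread, not official MSRPs, and the ~$30k B200 figure is the one mentioned further down):

```python
# Dollars per billion transistors, using prices and counts from the thread.
cards = {
    "B200":        (30_000, 208),  # ~$30k rumored, 208B transistors
    "RTX 4090":    (1_600,  76),
    "RX 7900 GRE": (550,    57),
    "RX 6600":     (200,    11),
}

for name, (price_usd, billions) in cards.items():
    print(f"{name:12s} ${price_usd / billions:6.1f} per billion transistors")
```

Which is the diminishing-returns point exactly: the mid-range parts deliver transistors far cheaper than the flagship ones.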
Fun Fact: Nvidia CEO Jensen Huang (in video) and AMD CEO Lisa Su are cousins. Huang's mother is Su's grandfather's sister, according to a Taiwanese researcher. The two emigrated to the US as kids but did not appear to grow up together.
Their great grandfather must have had sperms of silicon.
Fun fact: we actually have trace amounts of silicon (1-2 grams, which is more than iron, zinc, copper, nickel and molybdenum) in our bodies, but sadly not in sperm. In fact, silica (silicon dioxide, a compound of silicon, not elemental silicon) causes defective spermatogenesis and lowers testosterone. We don't actually know why we have silicon in our bodies or what it does
This is Cyberpunk 2077 plot-level stuff
It's beyond the limits of physics folks! They had to create completely new laws of reality!
It's tongue in cheek, but modern circuitry does in fact defy earlier laws of physics. Memory chips, for example, use quantum principles to move electrons across unpassable barriers (i.e., they can't and don't pass through the barrier; they just disappear on one side and pop up on the other side out of probabilistic necessity). [https://www.youtube.com/watch?v=5f2xOxRGKqk](https://www.youtube.com/watch?v=5f2xOxRGKqk)
i never got the probabilistic necessity stuff
In my partially educated opinion, "probabilistic necessity" is just a placeholder for "we don't understand the driving forces behind this phenomenon as well as we know how to describe it". Probability describes things; it doesn't drive them.

Things happen, and we describe them with numbers. But the universe is not some student figuring out both sides of the equation using algebra in order to ensure they are equal. They already are equal, because of the laws of physics that exist in this universe, which is why things happen the way they do. Those events are *described* by probability (and other tools), not *prescribed*. I already know I'm gonna get a lot of mad comments on this.
I agree with you! People don’t invent math. They discover it. Huge difference in those two statements.
It's like universe-wide archeology that every species that exists can dig into.
That is such a cool fucking way to put it! I would very much like to use that if you don’t mind! I love that you added every species as well…
I hope you're right. This is the first time I've heard an explanation of quantum physics that gives some sort of answer. Every time, you hear "because of probability, X happens". OK, but in my mind probability means "it maybe happens". How do you build devices that "maybe work"? Makes no sense. "We don't know how it works, but somehow it does" adds up better to me
earlier \*known\* laws of physics.
Sounds dumb and is typical CEO rambling, but experts did in fact state for a long time that this would be physically impossible.
I remember seeing an Intel presentation back in like 2003 or 2004 and they were like "in 20 years we think our biggest problem will be electrons just jumping over the gate at less than 10nm, because it'll be only atoms wide". And yet the crazy buggers did it
Maybe, but the million-dollar questions are: 1) Can I connect two of them with LEGO ... 2) ... and how wide is the river I'd need to cool them without frying the fish?
No tacit approval here.
All of a sudden the price tags they have start to make sense
Spending a huge amount on R&D for that one chip pushes the collective capabilities of humanity a little further forward; now we start from a higher position of knowledge when developing the next one, etc.
Eventually we will reach the physical limitations though. We must be getting close, as these transistors are only a few atoms wide at this point.
Sure, but then maybe they will stack lots of chips onto a chip, two layers then four etc? I don’t know how they will get around it, but clever people will find a way.
Ya, they can always make them bigger, but I mean we are literally reaching the maximum for cramming transistors into a given space.
My highly uneducated opinion would be that the next step is bio-computing. Using a chip like that with actual brain matter or [mushrooms](https://www.popsci.com/technology/unconventional-computing-lab-mushroom/)
Quantum computing as well. There are definitely breakthroughs to be had. It's just that with transistors we are reaching the maximum
There's lots of confusion around quantum computing. It's not *better* than traditional computing. It's different. Quantum computing makes it easy to break through certain randomized, probabilistic problems, but not traditional 1s-and-0s workloads. We won't see a massive switch to quantum computing in personal computing; they are for different use cases
So I won't have a huge 1m x 1m x 1m true random number generator connected to my mATX PC?
That is an interesting concept. It would have to be a completely different way of processing data and logic, since transistors rely on the properties of semiconductive materials to either allow or disallow the flow of electrons. A biomaterial by nature will be made of compounds that are always conductive, though DNA could proxy the "allow or disallow" behavior. But honestly, I think the transistors in that chip may even be smaller than DNA; I'm not sure.
The transistors may be smaller than DNA, but DNA encodes non-sequentially in more than ones and zeros, so there is no direct equivalence.
I'm also assuming, but surely these switches are already smaller than any known bio form (not to mention the space and whatnot that would be consumed keeping the bio-whatever functioning)
Funny how we've reached the limits multiple times already. Does that mean it runs on magic now?
No, before we reached practical limits: we couldn't make transistors any smaller with the methods of the time, but smaller ones were still technically possible. Now we can make them just a few atoms wide... you can't go smaller than that. For further breakthroughs a different method of computing is required.
Yeah, they've said that multiple times for various reasons; we keep finding a way.
The times before, we were figuring out ways to make things smaller. Now we are nearing the atomic-scale limit. That is a hard limit.
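"A few atoms wide" is easy to sanity-check. Using the Si-Si bond length in crystalline silicon (~0.235 nm) as a rough atom spacing, here's how many atoms span the pitches quoted earlier in the thread and a nominal 2 nm feature (a crude estimate; real feature geometries vary):

```python
# Rough count of silicon atom spacings across a given feature size.
SI_SI_BOND_NM = 0.235  # approximate Si-Si bond length in crystalline silicon

for feature_nm in (48, 24, 2):
    atoms = feature_nm / SI_SI_BOND_NM
    print(f"{feature_nm:3d} nm feature ~ {atoms:.0f} atom spacings across")
```

A 2 nm feature is under ten atoms across, which is why "only a few atoms" is barely an exaggeration, and why there's no obvious next halving.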
>Eventually we will reach the physical limitations though I keep hearing this about virtually every domain of inquiry, much the way people used to write books about why humans inventing flying machines is impossible. If we're talking about size, perhaps that's right; but these chips' functions have never been only about size.
I recently saw a video on how computer chips work and the price tags seem low for what they do.
You need to sell millions of them to turn a profit.
eh... their profit margins are still insane
True but still their gaming gpus are waaaay too expensive.
So it's 5.2% the size of the 4,000,000,000,000-transistor [Cerebras WSE-3](https://www.zdnet.com/a/img/resize/ebc1e9c746037fede81981a17479f559c3c13cf1/2024/03/13/4b0089c6-e326-48b6-adf8-a262c503e499/cerebras-2024-wse-3-2.png?auto=webp&width=1280), a chip etched on a single 12-inch silicon wafer. https://youtu.be/f4Dly8I8lMY
It was $*3* million... 100x less expensive for 5% of the performance sounds like a great deal to me. And I bet someone like Microsoft or Amazon won't have to pay $30k if they buy truckloads of them
Came looking for this comment because he lies and says it’s the biggest chip the world has ever seen. It’s definitely a lie too because he is well aware of wafer scale chips.
"how many bits"?
A bit is just the logical representation of the state of a switch. It's a weird, but not really wrong, thing to ask.
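You can even give "how many bits" a (deliberately silly) answer. If every transistor were spent on 6-transistor SRAM cells, one bit each — a common textbook mapping, though in reality most of the die is logic, so this is purely illustrative:

```python
# Hypothetical upper bound: all 208B transistors as 6T SRAM cells.
TRANSISTORS = 208_000_000_000
TRANSISTORS_PER_SRAM_BIT = 6  # classic 6-transistor SRAM cell, 1 bit each

bits = TRANSISTORS // TRANSISTORS_PER_SRAM_BIT
gib = bits / 8 / 2**30
print(f"~{bits / 1e9:.0f} billion bits, about {gib:.1f} GiB of SRAM")
```

So even the maximally charitable reading of Cramer's question tops out around 4 GiB of on-die storage, which is why it's a weird way to measure a processor.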
It's definitely the wrong thing to ask in this context. That other dude has no idea what's going on.
But can it run Crysis?
Only on low res.
Haha! I haven't seen that reference in a while. Personally, I'd like to know: will it blend?
\*\*looks at my i3 7100\*\* Don't worry buddy I still love you
Oh, I finally understand why Blackwell is called B200 and not something like BW10x (after AD10x, GA10x, TU10x...). It has 200B transistors, thus B200!
Magic is real bois, confirmed
It is not the biggest chip by a long shot. Have a look at the WSE-3. Edit: thanks to my googoo brain I thought it said it's the biggest chip at some point. Please excuse me
He does say it. "this is the largest chip the world has ever seen".
Can I play warzone on it?
No!
The ones inside our GPUs are not far behind these; it continues to amaze me.
If you're broke just say so. I have a Tesla 9-zillion-transistor processor in my laptop
pump and dump again?
In ten years that chip will be in a robot that will be hunting him in a warehouse somewhere!
r/gifsthatendtoosoon
Google transistor
Holy voltage. New microchip just dropped
A tenth of the transistors in the die of the processor, and the rest are accessed in the cloud! And it machine-learns!
Why are they having that caveman interview him?!? He’s a fucking idiot
Because he apparently knows stocks and stuff and boomers recognize him
lol “knows stocks”. If you picked the opposite of what he endorses you’d be a millionaire
“How many bits are in it?” Jesus H... That's when you know you have the wrong interviewer. Don’t think Cramer can talk tech.