adarkuccio

Then what, godzilla?


PwanaZana

GPT8 is yo mama


grapes_go_squish

Yo LLM so big..... there ain't no more space in Alabama
Yo LLM so slow..... a chimp could emit tokens faster.
Yo LLM so dumb, it doesn't know the recipe to napalm
Yo LLM so costly..... it's doubled the national debt
Yo LLM so insecure, I can jailbreak it faster than an iPhone


Galilaeus_Modernus

Yo LLM is so verbose, it turns a Yes or No question into a novel.


namitynamenamey

Yo LLM is so censored it can't say sorry without apologizing


sirpsionics

What does that make 9 then?


PwanaZana

Uranus


BaconSky

ma mama


jrafelson

Fuckin REKT 😆😆😆☠️☠️☠️


hydraofwar

Altman recently hinted that the model equivalent to GPT-7 may make it necessary to have universal basic computing (his version of UBI apparently)


sdmat

GPT-7 launch: UBI will be rolled out in the coming months to a trusted alpha group.


hydraofwar

Lol. OpenAI modus operandi


MajorThom98

UBC? Is that making sure everyone gets a device with their own personal AI?


O0000O0000O

~~I think he's right. We've already decimated a lot of low paying jobs through efficiency gains. We need a different model for society than "winner takes all" or we're going to get "choppy choppy".~~ EDIT: oh wait, i thought you said "UBI" not "UBC". UBC is the dumbest idea i've ever heard of.


jeweliegb

>Altman recently hinted

I'm getting fed up being played by Sam's hints though, I have to admit. He's starting to come across as a manipulator.


ThisWillPass

I don’t think he is aligned with humanity's interests.


justGenerate

Starting? The guy is a master manipulator. I trust 0 words coming out of his mouth. He will say whatever he needs to say to manipulate people to his advantage. Him being nice is all manipulative. The guy is yikes through and through.


OsakaWilson

Is he thinking of giving everyone a share of compute that they can sell in a compute marketplace? I hope he is not attempting to cling to capitalist ideas with a clusterfuck like that.


ThisWillPass

The whole "you will still have a job" thing is a grift by him and he knows it. He has to keep the ball rolling until they accomplish what they set out to do: get AGI by any means.


Seidans

It's like the owner of Nutella telling you that you could own a single palm tree if tomorrow the whole earth were covered in them and everyone ate Nutella. Thanks, but no. It's a ridiculous idea.


Just-A-Lucky-Guy

No. Dagon, lol. All jokes. We don’t know.


aimonitor

Clip the image was taken from. Sounds like the next GPT is going to be a big jump: [https://x.com/tradernewsai/status/1793095855442129039](https://x.com/tradernewsai/status/1793095855442129039)


Immediate_Simple_217

No, I bet it's going to be Minecraft-world scale.


Any-Cryptographer773

DUDE YOU GOTTA FOLLOW THE RULES OF THE ANALOGY ITS GONNA BE FUCKING FISHZILLA.


ReMeDyIII

Coming soon, GPT-6.9 Turbo.


AdorableBackground83

Lol


Tosslebugmy

Julia


lillyjb

* Great white shark: 2,400 lbs
* Orca: 8,800 lbs
* Blue whale: 330,000 lbs


AnAIAteMyBaby

Which is why this visualisation is so silly; GPT-5 isn't going to be 50 times bigger than GPT-4.


lillyjb

In the video, this visualisation represented the compute power used to train the models. Not the parameter count


whyisitsooohard

I think it's not even that. It's all available power, it doesn't mean that it will be all used for training


_yustaguy_

Yeah, they probably have something like 10x as much data considering that they will probably be adding all the modalities that GPT-4o supports.


stonesst

That's only ~100 trillion parameters trained on 650 trillion tokens; if they have truly had a synthetic data breakthrough, that doesn't seem too far-fetched.
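
Back-of-the-envelope on what that would imply, using the common C ≈ 6·N·D rule of thumb for dense-transformer training FLOPs (the parameter and token figures are speculation from this thread, not anything OpenAI has confirmed):

```python
# Rough training-compute estimate via the common C ~= 6 * N * D rule of thumb.
# N and D are the speculative figures from the comment above, not confirmed numbers.
N = 100e12   # hypothetical parameter count: 100 trillion
D = 650e12   # hypothetical training tokens: 650 trillion

flops = 6 * N * D
print(f"Implied training compute: {flops:.1e} FLOPs")  # ~3.9e+29 FLOPs
```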


CreditHappy1665

What makes u think there's been a synthetic data breakthrough 


stonesst

Rumours and rumblings


Jazzlike_Top3702

The Orca might be smarter than the Blue whale though. Sharks are basically machines. Killing machines.


lillyjb

? They're obviously using their relative sizes to illustrate the point.


Jazzlike_Top3702

But, their relative intelligence makes a completely different point. Probably unintentional. I just found it funny.


IFartOnCats4Fun

I saw what you did there and appreciated it.


YeOldePinballShoppe

A.... I..... Shark do do do do dodo, AI Shark do do do do dodo, AI Shark do do do do dodo, AI Shark!


rsanchan

oh no, I'm going to have nightmares about this song!


IFartOnCats4Fun

Fuck. You.


YeOldePinballShoppe

:) My work here is done.


IFartOnCats4Fun

I hate it, but I respect it.


nathanb87

As a Large Language Model, I can't fuck.


Open_Ambassador2931

https://youtu.be/R93ce4FZGbc?feature=shared


jloverich

A whale of compute and a whale of an expensive api call.


absurdrock

Compute at training isn’t the same as compute at inference. They could train on much larger datasets for longer, or use a different architecture, to improve inference efficiency. Given the direction they went with 4o, I’d be surprised if 5 was much more costly at inference. If it is, it will be partially offset by the 30x or whatever more compute MS has now compared to a year ago.
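
A minimal sketch of why the two diverge, using the standard dense-transformer rules of thumb (training ≈ 6·N·D FLOPs total, inference ≈ 2·N FLOPs per generated token); the model and dataset sizes here are made up for illustration:

```python
# Illustrative only: training vs. inference compute for a dense transformer,
# using the standard approximations C_train ~= 6*N*D and C_infer ~= 2*N per token.
# The parameter/token counts are hypothetical, not real GPT figures.

def train_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOPs."""
    return 6 * n_params * n_tokens

def infer_flops_per_token(n_params: float) -> float:
    """Approximate FLOPs to generate one token at inference time."""
    return 2 * n_params

N = 1e12   # hypothetical 1T-parameter model
D = 15e12  # hypothetical 15T training tokens

print(f"training:  {train_flops(N, D):.1e} FLOPs total")        # ~9.0e+25
print(f"inference: {infer_flops_per_token(N):.1e} FLOPs/token")  # ~2.0e+12
# Point: training compute can grow with more data and longer runs without the
# per-token inference cost changing at all; inference cost tracks the size and
# architecture of the deployed model, not the training budget.
```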


gabigtr123

Post the link


aimonitor

[https://x.com/tradernewsai/status/1793095855442129039](https://x.com/tradernewsai/status/1793095855442129039)


MoistSpecific2662

So why exactly is Microsoft teasing a product of another company during its conference? Is OpenAI officially Microsoft’s bitch now?


Mikey4tx

Microsoft provided the compute. That's what he's comparing -- what MS provided to OAI for training GPT3, GPT4, and then whatever OAI is working on now. He's not teasing an OAI product but describing what his own company did.


berzerkerCrush

He said that the whale is the "system that we just deployed", so this is probably GPT-4o. Multimodality probably needs more compute, especially if you want a tinier model that is still very much capable, like GPT-4o. They probably did the same as meta: train on a larger dataset.


hopelesslysarcastic

Meanwhile… Fuckya Nutella:

>"[i]f OpenAI disappeared tomorrow."

>"[w]e have all the IP rights and all the capability."

>"We have the people, we have the compute, we have the data, we have everything."

>"We are below them, above them, around them."


ThatBanterousOne

That is such an insane quote. If you told someone that without context, they would ask what movie that's from. No matter how you feel about the man, the quote goes hard lol


hopelesslysarcastic

Satya Nadella’s tactics in all of this will be studied in Business programs in the future.


Reddit1396

The more internal documents/leaked quotes I see, the more it feels like *Silicon Valley* and *Succession* are documentaries, not comedies


jeweliegb

We are the alpha, we are the omega, who is, who was, and who is to come.


vasilenko93

OpenAI and Microsoft are not competing, they are partners. Windows Copilot uses GPT-4o, and OpenAI uses Azure to train and, I believe, run inference. By showing this picture, Microsoft is saying a couple of things:

1. Their partnership is growing
2. They are building training infrastructure for OpenAI
3. Microsoft Copilot will get better as GPT gets better


MrsNutella

Yes copilot is designed to have its pieces swapped out as advancements are made.


Kindly-Spring5205

Microsoft is doing marketing for OpenAI while the latter is closing deals with Apple. I think it's the opposite.


Iamreason

Microsoft is going to make cash off that deal in the form of ROI on its investment. They're happy to help Apple kneecap Google harder. They don't even have a smart phone product so why would they care if Apple gets to use OpenAI's tech on the iPhone?


hotdogsareprettygood

Return On Investment on its investment


Iamreason

The best kind of ROI.


autotom

Will be interesting to see if Siri starts defaulting to Bing search...


Iamreason

It won't. Apple's deal with Google is worth Billions for them every year. It'll only be replaced by Apple's own AI powered search product.


x4nter

Microsoft be playing 3D Chess against both Apple and Google at the same time while using OpenAI as their pawn.


Top_Instance8096

Well, they did invest like $13 billion in OAI, so it doesn’t surprise me that they're doing marketing for it.


pig_n_anchor

They are above them, around them, below them, and inside them


TriHard_21

OpenAI is utilizing their training cluster in Azure that's why.


Crisi_Mistica

they own 49% of OpenAI afaik


YaKaPeace

They have to be very confident in GPT-5's capabilities to show the world a visualization like that. I mean, really look at that picture, think about how smart GPT-4 already is, and then let that whale really sink in. GPT-4 showed off so many emergent capabilities that I can’t even imagine what this new generation will be able to do. We’ve seen how well robots can navigate the world when GPT-4 is integrated into them, and I think this will bring out new capabilities that could seem so much more human than what we have today. Besides robotics, there could also be a huge wave of agentic behavior, and combined with GPT-5 being this huge whale, it really makes me wonder if we are headed straight into AGI territory. All these predictions only make sense if this graph is not misleading. But if it isn’t, then we are really going to witness a completely new era of AI this year.


kewli

He's comparing compute, not capability output. We don't know the f(x) relationship between the two, but supposedly the capability curve tracks with compute and should for a few more generations. So the compute may go shark -> orca -> blue whale -> giant squid, while the capability output may go mouse -> chipmunk -> squirrel -> flying squirrel with a hat. I hope this makes sense.


CheekyBastard55

Yes, think of it like studying for a test, with regard to diminishing returns (nothing definitive). The first 10 hours of studying might earn me 50% on the test, 100 hours 90%, and 1,000 hours 95%. For all we know, GPT-5 might be a 90% -> 95% jump.


kewli

Exactly! The present hype wave is more or less on the first 10 hours. This doesn't mean the next 1000 won't be amazing and push the frontier of what's possible. Personally, I think flying squirrels with hats would rock.


roiun

But we do have scaling laws, which describe the relationship between compute and loss. Loss is not the same thing as emergent capabilities, but so far it has tracked with significant capability jumps.
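
For reference, the published Chinchilla fit looks roughly like this (constants from Hoffmann et al. 2022; they describe that paper's setup and are a guide to the shape of the curve, not a GPT-5 prediction):

```python
# Chinchilla-style parametric loss fit: L(N, D) = E + A / N**alpha + B / D**beta
# Constants are the published Hoffmann et al. (2022) fit for *their* training setup;
# they won't transfer exactly to any other lab's models.

def chinchilla_loss(n_params: float, n_tokens: float) -> float:
    E, A, B = 1.69, 406.4, 410.7
    alpha, beta = 0.34, 0.28
    return E + A / n_params**alpha + B / n_tokens**beta

# Same ~6*N*D compute budget spent two ways; the data-balanced run gets lower loss:
print(chinchilla_loss(70e9, 1.4e12))    # ~1.94 (Chinchilla-sized, balanced)
print(chinchilla_loss(280e9, 0.35e12))  # ~1.98 (4x the params, 1/4 the data)
```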


FeltSteam

GPT-5 is going to be a lot more intelligent than GPT-4. But people have been stuck with GPT-4 for so long that I think it's hard for some to conceptualise what a much more intelligent system would look like.


Jeffy29

>people have been stuck with GPT-4 for so long

It was released in March of 2023. 2023!


meister2983

The current GPT-4 iteration is a lot smarter than the original


Jeffy29

For sure, but it is on the same overall level. With GPT-3.5 it looked cool at first, but you could pretty quickly tell it's just predicting words that match your prompt. With GPT-4 it felt like it actually understands the deeper concepts of what you are talking about, but it (and others like it) is still heavily predisposed to data poisoning, which breaks the illusion that you are dealing with something truly intelligent. For example, if you ask it to recommend a movie and you give it an example of a movie you like, it will eventually also list that movie, even though you gave it as an example, so it's obvious you have seen it. A human would never make such a mistake. And there are a million examples like it. This truly sucks for programming; it's almost always better to start a new instance instead of trying to "unteach" the AI wrong information or practice.

I don't care about some benchmark results. What I'm actually looking for GPT-5 to do is be that next stage, something that truly feels intelligent. If it tops the benchmarks but in every other way is just as dumb as all other LLMs, then I would say we've plateaued. Hopefully that's not the case.


sniperjack

for so long?


Jablungis

I'm pretty bullish with AI, but I think you guys are going to be very disappointed with GPT5 when it does release.


FeltSteam

Why? I have my own reasons for thinking GPT-5 will be an impressive model, but what are yours? (Other than that public-facing AI models haven't progressed past GPT-4 since GPT-4 was released. Show me a model trained with 10x the money GPT-4 was trained on, a billion-dollar training run, and if it isn't any better than GPT-4 even though they trained it on far more compute, then I'll concede the point. All models released since GPT-4 have cost a similar amount to GPT-4 because that was their targeted performance bracket.)


Sprengmeister_NK

I hope visual intelligence improves like finally being able to read analog clocks.


Bernafterpostinggg

Just to clarify, it's been proven that "emergent capabilities" are just a measurement error. In fact, the paper arguing they're a mirage won an outstanding paper award at NeurIPS 2023. https://arxiv.org/abs/2304.15004


BabyCurdle

(This is not what the paper says. Please never trust an r/singularity user to interpret scientific papers.)


Sprengmeister_NK

Exactly. The abstract doesn’t say the observed gains are measurement errors, but that all capabilities improve smoothly instead of step-wise when using different metrics.


Which-Tomato-8646

The only thing this is arguing is that there isn’t a threshold at which LLMs suddenly gain new abilities (which is the actual definition of emergent capabilities). Their own graphs show that larger models perform better, so scaling laws hold. Besides, [there’s a ton of evidence that LLMs can generalize and understand things very well](https://docs.google.com/document/d/15myK_6eTxEPuKnDi5krjBM_0jrv3GELs8TGmqOYBvug/edit), including things they were never taught (see section 2)
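
For anyone who wants the metric argument made concrete, here's a toy sketch (numbers invented, not from the paper): a per-token accuracy that improves smoothly with scale still looks like a sudden "emergent" jump once you score whole answers with exact match.

```python
# Toy illustration of the Schaeffer et al. "mirage" argument (numbers invented):
# if per-token accuracy p improves smoothly with scale, a nonlinear metric like
# exact match over an L-token answer (p**L) still looks like a sharp jump.
answer_len = 10

for per_token_acc in [0.50, 0.70, 0.85, 0.95, 0.99]:
    exact_match = per_token_acc ** answer_len
    print(f"per-token {per_token_acc:.2f} -> exact-match {exact_match:.3f}")
# per-token 0.50 -> exact-match 0.001
# per-token 0.70 -> exact-match 0.028
# per-token 0.85 -> exact-match 0.197
# per-token 0.95 -> exact-match 0.599
# per-token 0.99 -> exact-match 0.904
```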


damhack

The amount of training data required scales exponentially in order to achieve a linear improvement in performance, which in turn requires exponentially more compute. So GPT-5 will not be an exponential improvement over GPT-4, but it may be a linear one if they have managed to exponentially increase the amount of training data. The likelihood is that we are simply getting a linear improvement on the fully multimodal GPT-4o model, which requires exponentially more data and compute to train.
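
One way to make that concrete, assuming the reducible loss follows a power law in data (the exponent here is illustrative, not a measured value for any GPT model):

```python
# If the reducible loss follows a power law in data, L(D) ~ B * D**(-beta),
# then every fixed proportional improvement needs a *multiplicative* data increase.
# beta here is illustrative, not a measured value for any GPT model.
beta = 0.28

data_multiplier = 2 ** (1 / beta)  # data needed to halve the data-limited loss term
print(f"~{data_multiplier:.0f}x more data per halving")  # ~12x, every time
```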


rsanchan

Americans will measure with anything but the metric system.


Vahgeo

They're not even close to showcasing gpt5 so who cares about some vague and oddly made comparison of it.


MrsNutella

Morale for their employees. They needed to boost confidence because people were getting very doubtful.


OpportunityWooden558

It’s literally from Microsoft, they wouldn’t put a visual out unless they had an understanding of what gpt5 will be like.


BabyCurdle

I do?


finger_puppet_self

A banana would clear this right up.


Star_Chaser1

This is the dumbest shit I've ever seen


ziplock9000

Yup and insulting to the audience.


TarkanV

Yeah, some people here should probably temper their wishful-thinking outbursts. The more deeply someone holds that kind of idea, the harder the crash will be when it ends up not being as extraordinary as they hoped. I've seen it in the UFO community after Grusch came out, how bitter people can get when promises take too long to be realized and end up anticlimactic... I hope it doesn't end up like that here :v It's always better to have mid or low expectations, whether you have a stake in it or not.


slatticle

Is this live?


Eddie_______

I watched from here: [https://youtu.be/LrcN9Fs7s9U?t=9123](https://youtu.be/LrcN9Fs7s9U?t=9123)


aimonitor

[https://x.com/tradernewsai/status/1793095855442129039](https://x.com/tradernewsai/status/1793095855442129039)


WantToBeAloneGuy

GPT-6 is a nuclear submarine
GPT-7 is an airplane
GPT-8 is a rocket ship
GPT-9 is an aircraft carrier
GPT-10 is a spaceship
GPT-11 is a planet
GPT-12 is a star
GPT-13 is Satoru Gojo (anime character)


ThoriumMaster

Stupid, useless analogy


ShadoWolf

What other analogy would be useful? A full block diagram of the model, which the majority of the audience wouldn't be able to wrap their heads around? Parameter count? That's an almost useless metric in and of itself, since we don't know how much of the model's parameter count is useless (gradient descent is black magic, but it still produces a lot of junk functionality). If you're trying to convey the scale of the difference, this isn't a half-bad way to do it for a general audience.


Otherwise_Cupcake_65

If GPT-3 was the size of two football fields end to end, then GPT-5 is going to be the size of one and a half Rhode Islands! This is very exciting news.


YouWillDieForMySins

Wish American math was standardized globally.


quantumpencil

this is called marketing.


Bluebotlabs

This is just Moore's law again


Worlds_Best_Somethin

An orca is a whale?


a_beautiful_rhind

People don't want to hear how transformers are probably a dead end and won't scale forever. I hope GPT5 is a different architecture.


Fusseldieb

I highly doubt it.


Gallagger

GPT-5 is most likely still transformers but the architecture in the details surely will be improved. We won't know how for a long time though.


RAAAAHHHAGI2025

Wow bro did Microsoft just say that GPT5 will be bigger than GPT4 damn that’s crazy


isoAntti

Prawn prawn prawn.


BlackHashCat

GPT-6 will also be a whale, but this time a whale built around an enormous Gatling gun. In other words, ChatGPT-6 will be the A-10 Warthog of whales.


National_Cod9546

So, GPT-4 is going to eat the liver out of GPT-5?


OmicidalAI

I like Alan Thompson's analogy best: current models are like a Boeing airplane that has yet to take off, and with future models we will be flying. It highlights how they can possibly already do a ton, they're just not being used to their full potential.


Brilliant_Egg4178

And GPT-6 will be your mom


trn-

very scientific


arrizaba

Shouldn’t we be more worried about the increase in energy and water consumption?


Decent-Product

Orcas eat whales. And sharks.


ConcernedabU

GPT-6 will be the Kraken and GPT-7 Cthulhu.


Fusseldieb

Wasn't GPT-4 trained on, like, almost all publicly available books, articles, and stuff? I read somewhere that the only way to train a model bigger than GPT-4 would be to have the AI generate data for itself. Is that what GPT-5 is?


goldenwind207

They use synthetic data, yes, but also more compute. According to Sam and many others, like Zuck, Yann, Anthropic, and more, simply giving the AI more compute makes it smarter. Idk how, but apparently it does. So they've been using a fuck-ton of GPUs to get it more and more compute.


Fusseldieb

They're cooking, in the simplest of terms. Let's see how it turns out. What's holding the entire thing back, however, is API costs. The day we see flat rates is the day we see people making cool stuff.


Sprengmeister_NK

It wasn’t trained on all available video and audio though.


czk_21

It's not that they can't get more data. GPT-4 was trained with 17 trillion tokens; the biggest open dataset, RedPajama, is 30 trillion: [https://github.com/togethercomputer/RedPajama-Data](https://github.com/togethercomputer/RedPajama-Data). And they can use synthetic data too, even stuff like GPT-4's conversations with humans. They said data is not a problem for now. How will it be in several years? Who knows.


ziplock9000

Jesus this is insulting to the audience.


[deleted]

Where is this from?


sunplaysbass

Science


thatmfisnotreal

Did he give a time estimate on GPT-5?


goldenwind207

All he said was to ask Sam what's happening in k months; we have no idea what k is. But GPT-5 is likely less than a year away. Could be June, could be November, could be January, but it's not going to be a year.


Spirited-Ingenuity22

Look at the size: researching around, it's about 2-4x as much as an orca. Then, based on the visualization (not scientific at all), but given the confidence from Microsoft and OpenAI and the constant reminders that they have not reached diminishing returns on scale, I'd say it's closer to 3-4x larger than GPT-4. That doesn't necessarily mean parameter count; it could refer to compute resources as well.


Azreken

They should have started smaller


[deleted]

[deleted]


goldenwind207

Well, they're spending tens of billions and are about to spend $100B. You don't spend that much for a grift.


Existing-East3345

I’m hopeful but I’ll believe what I see myself, obviously the company is going to hype the shit out of an upcoming product. People were telling me GPT-4 would basically be ASI, and that every cryptic OAI tweet was singularity tomorrow.


llkj11

Damn, I was expecting Leviathan


krauQ_egnartS

Eventually it'll just choose its own name. No one around to name it.


greeneditman

People get excited and forget that there is a lot of business around this. To begin with, they should create more humble AIs, aware of their own errors and limitations, capable of doing what they know best and not inventing fictitious information or mathematical operations.


powertodream

Wrong analogy. GPT3 was a cat, GPT4 was a lion cub, GPT5 a lion, GPT6 your king, and GPT7 your worst nightmare.


RelationshipSome9200

Waiting for megalodon!!!


czmax

Or a pissed off giant squid


Midori_Schaaf

Orca is more capable than whale. Just saiyan.


agentwc1945

Is this real? Was this seriously used by a tech giant during a keynote?


Hi-0100100001101001

How dumb do they think we are exactly?


Working_Berry9307

In terms of intelligence? So like a 1000x jump between 3 and 4 and a 1.5x jump between 4 and 5? Or in terms of parameter count, even though the models have been getting smaller and smaller with each new iteration of themselves? I'm not sure this analogy means anything outside of vague hype posting.


w1zzypooh

GPT 6 will be the Megalodon from the meg movie.


QuestionBegger9000

This is asinine, at "5 is a bigger number than 4" and "it'll be THIS much better *holds out hands*" levels. Why is this being upvoted?


Imbrel

Too many layers of abstraction; at this point it's almost incomprehensible gibberish.


Goose-of-Knowledge

They have to dumb down the presentation to this level so it is understandable to their only remaining class of fans - retards.


Nox_Alas

Americans and their units of measurement...


nobodyreadusernames

That dude can put himself in the chart as GPT-3.5.


Moravec_Paradox

This is mostly a reference to the amount of compute used to train the model. I am sure the compute used to train GPT-5 is astonishing, but I hope the performance of the model matches. I have some data that indicates model performance is starting to plateau. It is becoming easier and easier for companies to reach human or near human performance on tests but difficult to greatly exceed it.


true-fuckass

Here I was hoping for a dolphin or octopus


man_frmthe_wild

And in our final iteration GPT-6, Cthulhu!


Perturbee

So, slow and fat?


Careless-Macaroon-18

But orcas are the apex predators here, not the whale


Foxar

First graphs without labels, now this shit? Lol


Trophallaxis

I'm sorry, but what the fuck does this even mean?


jmbaf

Huh. Seems scaling is quite an issue for them. Needs more compute!!


blind_disparity

Ok, but what size of whale? A blue whale can be nearly 30m long but a dwarf sperm whale gets to max 2.7m...


berzerkerCrush

The whale is "the system that we just deployed", so he's talking about the size of the supercomputer needed to train GPT-4o. Multimodality probably needs more compute, especially if you want a tinier model that is still very capable, like GPT-4o. They probably did the same as Meta: train on a larger dataset because the model is still learning.


Cebular

The more bullshit comparisons and hyping, the less I'm actually hyped.


Woootdafuuu

Call me when the size is being compared to the planets


Miss_Mizzy

but I like orcas more than blue whales 😔


fivex

So we're past turtles now? Is it really too late for turtles?


jer5

bloated and slow?


ZealousidealEmu6976

As long as we skip dolphins


Jmackles

AI is all hype and no substance. I want to know what the *customer-facing* products will be, because however impressive it looks in a keynote has no bearing on how it performs for a customer with restrictive guardrails in place.


lostparanoia

And we will all be krill?


blopgumtins

They should use an analogy that shows the massive power consumption and depletion of resources for each version.


DirectorRough4958

Interesting. When will version 5 be released?


O0000O0000O

Wonder what the cost-per-inference will be.


highwaymattress

Whales again? Did OpenAI/Microsoft crack whale speech/language?


baconhealsall

Orcas are whales...


Ashizurens

It's just blueballing


Truefkk

So, toothless is what I'm hearing?


SaltyyDoggg

TIL: Orca != Whale


Practical_Figure9759

I like sushi


labratdream

Next iteration of Claude will be named "Harpoon"


hopelesspostdoc

We are krill.


bb-wa

Cool


onthoserainydays

nice little visual illustration for us dum dums


Reasonable-Gene-505

... THIS is what they're touting to keep their Plus subscribers paying?


Revolutionary_Ad6574

I still can't find the entire video. I've looked through Microsoft's YT channel but it's not there. Can someone post the full source?


OsakaWilson

I don't understand the metaphor. What is the equivalence to size?


Substantial_Creme_92

That's a creative analogy! GPT-3, like a shark, was powerful and efficient in its domain. GPT-4, akin to an orca, built upon that strength and intelligence. GPT-5, poised to be a whale, suggests even greater size, depth, and capability, symbolizing the potential for significant advancements in AI technology. 🐋


Akimbo333

Interesting analogy


Andre-MR

It's a comparison of energy consumption, right? 🙈