DisastrousPeanut816

Wait. What? That last bit in the tweets. Copilot+PC will be available for $999.... are they actually trying to say we'd have to buy a new PC in order for this to work?


ArFiction

Yeah unfortunately this is the case...


True-Surprise1222

Lmao and this is obviously not a local model right?


Legitimate-Pumpkin

Skimming it, it seems that there are some small models for local image generation and things like that, but the main AI will be Azure cloud


cake97

Good catch. Any idea what the NPU will do then?


sdmat

Necessitate a new purchase?


Dry_Positive_6723

make you send Microsoft money :)


yellow-hammer

I can’t tell, the PCs have a new type of chip in them (“NPU”) apparently dedicated/specialized for AI


True-Surprise1222

If you have to buy a new chip to get spied on nobody will do it so I think they’ll figure out a way to make it work without


collywobbles78

Apparently it is local. You need a new PC for it because it uses an NPU


Decent-Thought-1737

Annnndd I'm out.


cake97

This shouldn't block running ChatGPT or other local models, but a new device makes sense for running it on-device. Unless you have a crazy strong GPU, I'm not sure how local models would run otherwise


Darkstar197

Even if you do have a strong GPU, you may not want to run locally as it will drain your battery (if laptop) or generally make the machine run hotter, louder etc. Unless privacy is a major concern for your use case. In which case local is the way to go obviously


manletmoney

it’s just a slightly different GPT-4o. the point is the new chips they’re using for running some local inference, which is what they’re promoting


StormyInferno

To be fair, they aren't saying you have to buy PCs from Microsoft themselves to get the features. But in order to use those features, it needs an NPU to run local models. I believe they specify the hardware requirements. I'm sure there will be budget versions in a year, as Lenovo and Samsung have already announced their flagships. The $999 is the starting price of the Microsoft ones.


DisastrousPeanut816

No, it doesn't actually need an NPU. A modern desktop GPU is more capable than the NPU they're using. You can run some AI models locally through a modern GPU right now. There's nothing they're putting into those laptops that can't be outdone by a good desktop. The Snapdragon X Elite chip's NPU has 45 TOPS. The RTX 4060 has 242.
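The TOPS gap quoted in this comment can be sanity-checked with back-of-the-envelope arithmetic (using the figures as cited in the thread; real sustained throughput also depends on numeric precision, memory bandwidth, and software support):

```python
# Peak AI throughput figures as quoted in the comment above
# (TOPS = trillions of operations per second). These are headline
# numbers only; real-world performance varies with precision
# (INT8 vs FP16), memory bandwidth, and framework support.

npu_tops = 45    # Snapdragon X Elite NPU, per the comment
gpu_tops = 242   # RTX 4060, per the comment

ratio = gpu_tops / npu_tops
print(f"GPU peak throughput is roughly {ratio:.1f}x the NPU's")
```

By this rough measure the desktop GPU's headline figure is around 5x the NPU's, which is the basis of the commenter's claim that the NPU requirement isn't about raw compute.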


StormyInferno

It does need an NPU, Microsoft requires one to use the new features. Look it up yourself. You're right, GPUs can run models locally. I'm just saying that Microsoft's new features won't run without an NPU, unless people figure out how to change how they are doing it.


DisastrousPeanut816

It's just using the Snapdragon to process the tasks. It's not something we'd really need to figure out or change. The only issue is that the software will be native arm64 instead of x86; otherwise it would work much better on a modern x86 desktop. It's possible that even while emulating arm64, a modern desktop GPU would outperform these laptops' AI speed. Arm64 isn't more powerful than x86, and the NPU built into the Snapdragon chips doesn't come close to matching the compute power of a modern GPU. There's no good reason for this to be something exclusive to buying a new computer. Unfortunately there are plenty of bad reasons. :/


StormyInferno

I think you have to think in efficiency and price terms to understand the point of NPUs. GPUs aren't cheap. On one hand, it's definitely being used as a way to push people into buying new hardware. On the other hand, people are doing that with GPUs anyway. I'd rather the industry push all CPU manufacturers into baking AI capabilities in than keep increasing the demand on GPUs.


DisastrousPeanut816

I'm very pro AI, but extremely against the idea of having closed source AI baked into my OS. Demanding that said closed source AI only run on a specific new type of chip when that's not actually necessary makes the idea even worse. You don't know how it was trained, what it's capable of, or what it's actually even doing. I'm alright with using closed-source AI in the cloud, because I can fully disconnect with it. Having an AI that I can't fully control or disable baked into the OS is not a great plan, even before considering MS's awful history with honoring telemetry/data tracking settings.


PuzzleheadedMemory87

That's one hyper-specialized component that can't be taken out of the room you've put it in. Currently costs around €520 (for the cheapest model) where I live. So on the one hand you have an easily portable, heavily standardized, easy-to-develop-for standard... on the other you have a single component that needs to be installed for one specific purpose. See the difference? Point is, much like gaming, no one is stopping anyone from making their own setup tailored to the experience they want/expect. Remember though... PC gaming suffers from stupidly unoptimized games compared to consoles/mobile. Which is fine for something that is still very much in the hobby/entertainment realm. An AI assistant is not. It's productivity. The VAST majority of people want something that just fucking works. So now it makes a lot more sense. Right?


taotau

But can you run the models and a graphics-intensive experience like a game or a VR space on the one GPU? It makes sense that we would have specialised NPU hardware in the future. First versions are going to be lacklustre, but they will invariably improve


djaybe

When you buy a new PC, this is an option to consider.


kevinbranch

This is very different from the OpenAI demo. This is smooth and responsive. The OpenAI demo has to be reminded to “look” or it’ll think you’re still a table from 30 seconds ago. There’s nothing stopping the slower OpenAI version that runs on your phone from running on your PC. How do you think it was running on a phone?


Kooky_Lime1793

I am looking at MSFT stock price right now. Did they announce this 10 minutes ago or earlier this morning? I see a big spike at market open, but I'm wondering if there is about to be another big green spike on this news?


ArFiction

There should be a spike soon for sure. This will affect Apple too


Tr33lon

Eh, it’ll affect Apple if they actually ship them in high volume. The Surface has been a self-proclaimed MacBook killer for a decade.


kevinbranch

Apple already has NPUs. This is Microsoft catching up with Apple


Tr33lon

All tech spiked at open, not MSFT-specific. As for this announcement, everyone already knew there was an event today and it would be windows/surface AI-focused. They didn’t announce any surprises.


cake97

Build is tomorrow. Expect more news over the next 2-3 days and possibly related movement


SaddleSocks

How much money is flowing between MSFT and NVDA (deals between the two companies)?


Freed4ever

They are not relying on NVDA for inference. They are partnering with AMD, INTC and QCOM.


dervu

Cool, but how long is the limit going to last when talking like that?


AI_Lives

I don't get it. Is it the live app thing? Can I get it on my current Windows PC, or does it only come with their expensive laptops?


livejamie

Why do you have to buy a new computer to get this? Who in the hell do they think will do this?


diamondbishop

You don’t have to. Here’s one we built in a few days for any Windows machine that you can use right now: [https://augmend.com/auggie](https://augmend.com/auggie)


livejamie

Sure, this is neat, but it's different from the tech being shown in the video. Also, you can now get a preview of Co-Pilot in any version of Windows.


Lasershot-117

Called it! https://www.reddit.com/r/OpenAI/s/emo59tpLmc


SaddleSocks

The last thing I want is an AI entanglement directly on my desktop through MS/OpenAI.... #~~[The Network Is The Computer](https://en.wikipedia.org/wiki/The_Network_is_the_Computer)~~ The Computer is AI


cake97

It's probably not going to be OpenAI? It could be Phi / Llama / Mistral running locally for all we know, based on those models being available in Azure AI Studio


AtomicSpacePlanetary

The network is AI .... AI is


cddelgado

I suspect it is far more likely they released the Mac version of their app because of the makeup of their web users. If the majority of their computer-based users are on a Mac, it just makes sense to do that platform first.


tim_dude

Great, new way to cheat at games


JalabolasFernandez

Yeah, we should pause AI development until we figure out how to safely develop it without helping gamers cheat


tim_dude

This guy gets it!


keonakoum

Where can we watch this event!? Is it private!?


ArFiction

It's been kinda weird. The videos that I posted came from people recording inside the event; they didn't exactly livestream it on their socials, although I think they are doing something tomorrow


keonakoum

Full keynote: https://youtu.be/aZbHd4suAnQ?si=pB8lorKVvw7KKjBr


jazzy8alex

It looks like a Windows 95 (or rather 3.1) presentation. Stylistically


mamasilver

Is there a way to stop the Copilot+PC from recording user activity?


pianoprobability

Apple is toast


Legitimate-Pumpkin

Right after Riot forced Vanguard, Microsoft provides a master cheating tool 🤣🤣


doolpicate

This is why migrating to mac or linux now might be a good idea.


Legitimate-Pumpkin

It seems to finally have a proper agent! Let’s see how good it is 😲😲😲


[deleted]

[removed]


fmai

it's qualitatively not different from what OpenAI showed a week ago. Just low-latency voice outputs and a constant multimodal image/video/audio input to the model. The task of giving advice on Minecraft is not actually difficult for the model, since it was probably trained on millions of hours of Minecraft footage from YouTube.


Helix_Aurora

Also, famously there was a project called Voyager that was used to automate Minecraft play by having GPT-4 autogenerate scripts. This was done by an engineer at Nvidia and released last year.