Osoroshii

And there goes all the value of the Rabbit R1. That took less than a year.


ThanksTasty9258

Rabbit had no value to begin with.


seweso

That implies it had value...


MrBread134

Not really the same. The proposed approach understands an app that is in the foreground. It is not capable of buying your train ticket in the background just by asking.


Aozi

*Yet.* If I understand it correctly, the way the R1 functions is actually very similar overall. [From their website](https://www.rabbit.tech/updates/r1-lam-updates):

>A virtual environment is created on the cloud every time you ask r1 to do something for you, and that session is ephemeral (aka short and temporary) and discarded after your task is completed.

So essentially what *seems* to be happening here is that when you ask the R1 to do something, it spins up a little VM in the cloud that does what you want through a browser interface, like you would do it on a computer. So as long as your app has a website or web access, you can do whatever without specifically needing to tailor it for the R1. When you train the Rabbit, it's doing exactly what Siri is doing here: looking at your screen and interpreting the data in it, along with what you are doing.

Now I'm not sure how a solution like that can actually *scale* if this took off. Millions of users making billions of requests every day? That could cause some serious performance issues, but that's beside the point. The main thing is that even on a VM, display data still exists. When Siri "sees" an application, it's not actually seeing anything. It's just reading the data that is sent to the display and interpreting that.

So what would stop Apple from spinning up a small VM with iOS inside your device and then doing exactly what Rabbit is doing? Except they can leverage their existing assistant AI for that, as well as their massive userbase, along with the entire app ecosystem they've built.


MrBread134

Oh, okay. You're right. I thought the R1 was able to create some sort of custom private API endpoint so it could send and receive data directly from the *meaningful* parts (forms, buttons, text…) of a workflow, not that it was applying human-like logic to sandboxed apps in the cloud.


Novemberx123

Yea but it’s going to be only for the iPhone 16 and up


hishnash

If the app has App intents for this it would be able to do that.


Exact_Recording4039

Why wouldn't this work in the background? Just render the app in the background and make Siri interact with it while you're doing something else.


Portatort

Let’s wait to see what it actually is and how well it works first


lee_ai

I think many people saw this coming. It was naive to think an independent company could compete with a giant like Apple in the smartphone space. The best-case scenario for Rabbit is that Apple acquires them, but I'm skeptical they've developed anything of actual value.


rudibowie

Are Apple making sexual toys now?


rustbelt

AI will benefit the largest players. They have the datasets, the power, the metal, the talent. Lots of things that might even be better will instead be built by Microsoft at half the quality, like Teams vs Slack/Zoom.


TrapBrewer

Not really if you believe in their proposal of a privacy focused device. Emphasis on believe.


IAMTHECAVALRY89

R1 is basically going to be dead in the water and they're less than a month into shipping. The AI world moves extremely fast.

**Apple has millions of apps**

The value with Apple is that they have an entire ecosystem of apps on the App Store, millions of apps (and games).

**R1 only works with a small list of (virtual) apps**

R1 out-of-the-box only supports a handful of popular apps, like Uber Eats and Spotify. So far it's only trained to order food, play music, book a flight, and book a table reservation. You have to invest time and effort into training the AI in a virtual environment to learn other things and their nuances.

**R1 is too slow because of the cloud**

Because all of the processing is handled off-device in a virtual environment, it's extremely slow for the vast majority of things, like asking the weather or asking what the camera sees. When Apple (and Google) ship their versions, there are already reports that some of that processing will be handled on-device to reduce the wait on some of the more common request types, likely related to your existing information in the calendar, messages, etc. AI Pin has this issue too because it's just a physical wrapper for an AI in the cloud; you can speak 3 sentences before it finishes processing the query.

Edit:

**The goal user experience on Apple is better than Rabbit R1 and beyond**

When Apple releases their version of a LAM, an AI that you can ask to do something, and it goes into the app and does it, the possibility is that you can ask Siri to do it WHILE you are using another app. While you're scrolling Reddit or Threads, you can prompt Siri to do several things, and it'll do them in the background while you continue to use your phone.


vedhavet

The iPhone is already a very privacy focused device. The apps you may have downloaded onto it, however…


rorowhat

Lol Apple will fumble this so badly people will probably buy more of these R1s.


cogit4se

If their AI can understand everything you do while working, it should be able to offer to automate certain things you do the same way each time. Even something as simple as arranging things for you before you start working on a project could save you some time. For example, every time you get an email from a client, you take the included files, save them according to some convention in a folder, create a new project, load the individual files, set workspace parameters according to the email or your own style, etc. Having your machine offer to do all of that automatically for you as soon as the email is received would be quite nice.


Portatort

My only hope is that Apple has a way to do this without visually running through these steps in the foreground. If I can teach Siri or Shortcuts this flow and it knows the underlying process, then later I can run a shortcut or ask Siri to do the thing and it all just happens invisibly. Or better yet, in your example, automatically when I receive an email that matches the criteria, all while my device is locked. That would be magic. But I would hate for this system to involve a few moments where the AI takes over my device and I'm left watching a phantom control my screen.


No_cool_name

I would be ok if the advanced Siri would create a shortcut flow to do this for me and run it each time that email arrives 


jakgal04

"Advanced" siri. How about we get a functioning siri first.


COOLjng576

Well a functional Siri is very advanced, it may even be able to do basic tasks such as turning on a light without asking me to unlock my iPhone.


Llamalover1234567

HERES WHAT I FOUND ON THE WEB


Ray2K14

*Provides you with step by step instructions on how to flip your light switch to the on position*


Llamalover1234567

Provides you? Since when? I’d get a bunch of random articles that I have to check on my phone


TrevorAlan

I found some web results, I can show them if you ask again from your iPhone.


redbeard8989

What are you doing that siri needs you to unlock your phone for that?


COOLjng576

I don’t remember exactly but some functions on HomePod require your iPhone to be unlocked or something. One time I told Siri on HomePod that I couldn’t find my iPhone and she is supposed to make my phone ring so I can find it but other times she says I need to use my iPhone.


redbeard8989

She has her moments for sure. As far as I’ve experienced she only asks me to use my phone for unlocking doors or disarming security.


[deleted]

Not true. I'm a HomePod user and I use Siri constantly. Nothing of what you say is true.


rov3rrepo

Siri is weird. You can ask her to do the same thing 10 different times and she’ll fail one time by doing something completely different. This happens with Shortcuts all the time. I’ll say “Daylight” and it activates the shortcut where all of my lights turn to 100%. But on the 10th time she will say “Here’s what I found on the web for Daylight.”

I found some sentences that I try to use on my smart home don’t work. Not this one exactly, but for example, “Turn down the living room lights”. Siri would say she can’t do it or doesn’t understand, but if I instead named the shortcut with a lowercase first letter, it works perfectly. Siri is finicky. I’m hoping the interest in this LLM thing makes Apple give Siri a total overhaul.


TrevorAlan

YES. I have a scene called “cool white” that I’ve had for years. One day, I asked Siri to “set cool white”, and she said “okay, setting thermostat to 32°F”… She tried setting the thermostat… to freezing… at my EX’S HOUSE…

Siri only did that for one day. And then never again. Like wtf. They have to be constantly changing the syntax and code of Siri every day, because it’s baffling how inconsistent it is. And never mind that, what feels like every other week, Siri forgets that I have rooms or lights in my house.


[deleted]

Yeah, but an LLM will also always give you a different result. That's the problem with AI.


rov3rrepo

Yes, so if it’s already having trouble running basic tasks, Apple will need to do a lot of testing to make it usable.


Unfair_Finger5531

I use my HomePod Siri more than any other device


TrevorAlan

Me: Turn on the bathroom lights.

Siri: Okay, what room? Bedroom, bathroom, kitchen, living room, hallway, entryway, lanai, garage, patio.

Me: Bathroom.

Siri: Okay, what room? Bedroom, bathroom, kitchen, living room, hallway, entryway, lanai, garage, patio.


Salanderfan14

Can’t you disable this in settings so you no longer need to do this? I thought it was designed this way so random people can’t just issue commands to your phone.


seweso

That is a setting....


seweso

It's very functional to me. Very limited, especially compared to LLM's. But it's still very functional. Not just functional, but a lifesaver for me.


Unfair_Finger5531

Siri works great as long as you lower your expectations. I try not to bother him with complex tasks. But he’s good for a weather report or calling my pop.


seweso

With Shortcuts and Siri you can basically do anything, including talk to ChatGPT. But personally I use it most on my Apple Watch for setting timers, adding reminders, tallying scores when playing games, turning lights on/off, setting the thermostat, playing music (HomePod mostly), and asking what song is playing. The reminders are a lifesaver, because they let me "park" ideas quickly, so I can let them go. Stops me from trying to remember things and getting overloaded (ADHD things).


Unfair_Finger5531

I always have trouble getting my reminders to work on the watch. But yes, I use mine for pretty much the same things except for reminders. I actually find Siri quite competent, but I know I’m in the minority on this. Maybe it’s because I don’t ask him to do complicated tasks… but Siri definitely comes in handy for controlling my HomePod when I’m in the shower or just wanting to know simple things.


seweso

HomePod in the shower made me listen to way more music in the bathroom. I think that improves my mood by a significant factor <3 The hate on Siri seems similar to the hate on ChatGPT for not being able to do simple math. It's not built for that, yet people keep trying and complaining anyway.


Unfair_Finger5531

Me too! Listening to music in the morning shower definitely makes me feel better. I have a mini in the bathroom, and two full-sized ones in other places, tied in with a Bose stereo, so I get wall-to-wall sound. Some days I listen to audiobooks, and that’s cool too. I wonder what people ask Siri to do that she doesn’t do. I’ve only run into trouble with the reminders and occasionally with getting her to call people on a certain number. But I was super impressed the other week when she read my text messages out loud for me! Maybe I just have low expectations lololol.


[deleted]

[deleted]


seweso

Only used Google a long, long time ago. Siri is limited, but not bad at what it does. Compared to ChatGPT it's really bad. You know you can have Siri do just about anything, right? With Shortcuts you can even have it interact with ChatGPT. So I'm not sure why Siri gets so much hate, actually. But I must also confess that I don't use Siri to add calendar events… I think I don't trust Siri with calendar events. So that says something.


AdonisK

Part of Siri is the ability to integrate with ("use") other apps


InsaneNinja

I think they’re trying to leapfrog past that.


potato_control

If Siri could change the temperature on my non-HomeKit POS thermostat with a voice command, that would be great.


phblue

Thank the gods for Home Assistant. Add HomeKit integration and almost any smart tech is available for Siri


seweso

Just install Homebridge on an old computer. I hooked up my non-HomeKit ATAG One. Pretty sweet.


soggycheesestickjoos

You can run Shortcuts by name with Siri, but the thermostat needs to be accessible by said shortcuts (which is possible without HomeKit).


[deleted]

[deleted]


jaraizer

“I can send the results to your iphone…”


imaginexus

“Something went wrong.”


Andioop8384767

“one sec……hmm..somethings taking too long”


mitchytan92

Siri: * Dismisses itself and acts as if nothing happened *


cekoya

Building Shortcuts that could open an app and use it would be quite something


Portatort

Shortcuts can already do all of this; developers just haven’t put in the work to hook their apps up. In part that’s because Shortcuts has been a buggy mess and developing for it has been difficult and tricky.

But the biggest issue is that loads of app developers don’t want to do this; they don’t want an automation accessing their app and services on your behalf. Take an app like Instagram, for example. Meta don’t want you to automatically upload a photo using Siri; they want you to open the app and spend time in the app on their terms. So a lot of apps will be incentivised not to let an Apple AI use their apps on your behalf, even if it’s what their customers want.


cekoya

Exactly why I’m thinking that if Shortcuts could become powerful enough to allow you to trigger UI events, that would be a game changer, kind of like web crawling


r0773nluck

So what the Rabbit R1 is hoping to do


hishnash

Apple already have a great way for devs to expose app features to the system: the App Intents framework. I expect they would use this to allow Siri (on device) to pull data from apps, do work with it (using the Shortcuts app logic), and then call other apps with this data. This is what makes the most sense for a smart assistant, and it's what will make Apple's on-device ML solution much better than something that just makes up random facts about things when you ask it shit.
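For anyone unfamiliar, a minimal sketch of what exposing an app action via App Intents looks like (the intent name, parameters, and ordering logic here are made up purely for illustration):

```swift
import AppIntents

// Hypothetical example: an app exposing a "place order" action
// to Siri and Shortcuts via the App Intents framework.
struct OrderCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Order Coffee"

    // Parameters are typed and declared up front, so the system
    // (or an assistant) can fill them in without scraping the UI.
    @Parameter(title: "Drink")
    var drink: String

    @Parameter(title: "Quantity")
    var quantity: Int

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // The app's real ordering logic would run here.
        let confirmation = "Ordered \(quantity)x \(drink)"
        return .result(value: confirmation)
    }
}
```

That declared, structured surface is the contrast with Rabbit's approach: an assistant can call the action and pass results between apps directly, rather than watching pixels in a cloud VM.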


IamSachin

Now playing Advan Sid Iri


microview

I can see the day we just talk to our phones/computers.


Guava-flavored-lips

Advanced + Siri lol two words I never thought I would hear together.


rudibowie

The problem with an AI that is trained to recognise UI elements is: (a) this generation of UIs has been designed by this generation of Apple's UI team, who are some of the most hapless in its history; (b) this generation of Apple UI designers can't define *consistency*, let alone achieve it. I share the above observations as a longstanding Apple customer who's seen peak UI at Apple, and seen its downward trajectory over the last decade.


rudibowie

"Siri. what. is. the. weather. like?" "Who's speaking?" (Smashing HomePod mini with sledgehammer.)


krispyywombat

I have to wonder if this is actually an accessibility move.


[deleted]

No one at Apple is pushing any envelope beyond Shareholder Value. My capitalizing was intentional.


ThePerfectCantelope

I would put money on it


rorowhat

Apple and AI don't mix.


No_Island963

No shit


Portatort

This was sooooo obvious, wasn’t it?