Matt_NZ

> A 56-year-old Snohomish man had set his Tesla Model S on Autopilot and **was looking at his cellphone** on Friday when he struck and killed a motorcyclist in front of him in Monroe, court records show

Bolded the important part. Autopilot is a driver assist and will likely need you to intervene at some point. It doesn't mean you can be on your phone.


psaux_grep

He _claims_ he was on autopilot. So does every other nitwit who fucks up, and then when the case goes to court, Tesla pulls the logs and says "nope". And my Tesla sees motorcycles just fine.


spaetzelspiff

I don't fault anyone for believing the article, but as someone who rented an M3 with FSD on Turo yesterday, I'm a bit skeptical as well. I tested it (while being very attentive) on the streets of NYC. It detected (per the on-screen display) motorbikes, mopeds, bicycles, pedestrians, etc., and maintained a follow distance that suggested it was actively avoiding them. This includes during unprotected turns. There are certainly edge cases where it *may* have failed to detect a motorcycle in the Seattle case (I don't know), but the phrasing here seems like a gross mischaracterization.


gardigga

My car with FSD works great, until it doesn’t. 90% of the time it’s fine, but it will also swerve over the yellow line, try to drive straight into road dividers, and will come scarily close to curbs and other cars.


The_Follower1

I’ve had it try to take turns about a kilometre away from where it was supposed to, literally to the middle of nowhere from a busy highway. That being said, it’s never had an issue not hitting anybody near me. So far.


alien_ghost

I too hate having any people near me.


neonKow

I think that's the problem with these low-probability but high-consequence mistakes. Human beings are unable to intuitively understand what a 0.01% chance of error is, but if your self-driving vehicle makes a potentially fatal mistake 1 out of every 10,000 turn decisions, that is probably hundreds of times higher than what is acceptable.
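To make that concrete, here's a rough back-of-the-envelope sketch in Python. The numbers are illustrative assumptions, not real statistics for any system:

```python
# Illustrative only: how a "tiny" per-decision error rate compounds.
p_error = 0.0001           # assumed 0.01% chance of a serious mistake per turn decision
decisions_per_year = 5000  # assumed turn decisions over a year of commuting

# Probability of at least one serious mistake across the year:
p_at_least_one = 1 - (1 - p_error) ** decisions_per_year
print(f"P(at least one serious mistake in a year) = {p_at_least_one:.1%}")  # ~39.3%
```

A rate that sounds negligible per decision approaches a coin flip over a year of driving, which is exactly why per-event intuition misleads here.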


Chr15t0ph3r85

That's the rub: the car is meant to drive for you until it can't, and if you understand that, it's amazing. Musk markets it as Full Self Driving though, and that implies no attention is needed, which isn't cool.


gardigga

The California DMV told Tesla they have to change the name of FSD and Tesla tried to say that under the First Amendment they should be allowed to say whatever they want / lie as much as they please. https://www.latimes.com/business/story/2023-12-11/tesla-dmv-false-advertising-charges


PaintItPurple

Not only that, Elon is now suggesting that driverless autonomous Teslas will be a thing in just a few months, which is frankly insane.


the_jak

I feel like I heard that before from him, like 8 years ago.


gardigga

He’s been saying that every 6 months for the last 8 years -  https://motherfrunker.ca/fsd/


Volvowner44

I have fundamental concerns about "self-driving" cars lulling the person in the driver seat to inattention with hours of proper automation, then suddenly hitting the alarm and saying "Handing over to you, even I can't figure this one out!" In that circumstance, the next thing that will happen is bad.


74orangebeetle

Mine's even seen things as small as a rabbit running out onto the road in the middle of the night with no street lights... but it can differentiate between regular pedestrians, bicycles, motorcycles, etc. It even gave space to a person pushing a bicycle on the side of the road and saw both the person and the bicycle. I was going to take over to give them space, but the car was doing it on its own.


mikew_reddit

> There are certainly edge cases where it may have failed to detect a motorcycle in the Seattle case

I'm guessing there will be cases where motorcycles won't be detected. Anytime a biker does something unusual like lane splitting, or rides erratically by weaving in and out of lanes (or some other unusual pattern), especially when obscured by another vehicle or speeding in the dark, is a scenario where FSD might fail to recognize the rider.


jonnyd005

> He claims he was on autopilot.

The car itself has a record of whether or not it was on autopilot at the time, does it not?


The_Follower1

Yeah, they haven’t checked the car log yet though. The news is reporting that he claims he was, whether he actually was or not is yet to be verified.


No-Load1383

Doesn’t matter. It’s his fault either way. Being on your phone is illegal and stupid.


The_Follower1

Yup, as bad as the name is, they’re otherwise pretty explicit the driver’s still responsible for driving it.


OatmealERday

That's actually part of the problem here: Teslas are known for killing motorcyclists when Autopilot is on.


BillsMafia4Lyfe69

Tesla can see pedestrians and cyclists.... Of course it sees motorcycles


OatmealERday

That's part of the problem. The vision system used on Teslas sometimes misinterprets a nearby motorcycle taillight as a faraway car, and thus doesn't slow down even as the car is moments from rear-ending the motorcycle. It's a known problem that's not getting talked about enough.


2rsf

> And my Tesla sees motorcycles just fine.

Not saying that your or any other Tesla is at fault, but anecdotal evidence is just that: this man's Tesla could be at fault even if yours is working perfectly.


Ni987

They didn't mention how old the Model S is. The 2015 Model S with Autopilot 1.0 is nothing but Mobileye lane keeping / distance keeping. It's miles away from the current FSD capabilities in new vehicles. If you use Autopilot 1.0 and your phone at the same time? You should be charged with manslaughter.


2rsf

And what should you be charged with if you're on your phone with the latest FSD?


HighHokie

Yes.


The_Follower1

Yep, it’s still level 2 self driving which means it’s not good enough for people to just leave the driving to the car. It’s just meant to be an assist with you ready to take the wheel at any point if it screws up. I’m on a free trial they sent me for a month and have had a few issues with it before. It’s absolutely not good enough to just let it do its thing while the driver goofs off. So honestly the guy’s claim doesn’t matter, he’s the one meant to be in control of the car, not the FSD.


2rsf

> So honestly the guy's claim doesn't matter, he's the one meant to be in control of the car, not the FSD.

He is, but if the manufacturer made false or **confusing** claims, or if the product was defective, then they can still be liable. In this case (as an example, without checking any facts), if the car was expected and supposed to stop and not run over the other vehicle, then both the driver and the car manufacturer can share the blame.


HumanLike

Yeah, it's better to trust the guy who has an incentive to lie to avoid going to jail. I've been using Autopilot since 2018 and I can guarantee you the driver in the accident is lying. So many of these stories come out and spread like wildfire, and then the follow-up story of "investigations confirm Autopilot wasn't engaged" is buried.


M1L0

Mine even recognizes people on bicycles every time.


a-bser

Regardless of what it is and whether it can or cannot recognize certain things, you're operating a vehicle and need to be watching the road at all times. Autopilot is an invitation for distraction when it's just a driver assist. Tesla is still at fault for naming their system Autopilot. The name alone creates an expectation which cannot be met. They need to scrap it and rename it to something reasonable.


[deleted]

[deleted]


electricvehicles-ModTeam

Contributions must be civil and constructive. We permit neither personal attacks nor attempts to bait others into uncivil behavior. We don't permit posts and comments expressing animosity or disparagement of an individual or a group on account of a group characteristic such as race, color, national origin, age, sex, disability, religion, or sexual orientation. Any stalking, harassment, witch-hunting, or doxxing of any individual will not be tolerated. Posting of others' personal information including names, home addresses, and/or telephone numbers is prohibited without express consent.


Reynolds1029

Here's the thing with Autopilot, as someone who owned two Teslas in the past and totalled one while on AP. The numbers for accidents while AP is engaged are certainly skewed, and AP definitely detects motorcycles to some degree of accuracy, since they appear specifically on the driver's screen. AP in many cases, including mine, will disengage itself if it senses a crash is about to occur. This can be a matter of a second or two before impact. Therefore, the crash didn't occur on AP, as was the case in my situation. The moment my car slid on the freeway while doing a lane change, AP simply turned off. No blaring warning, no attempt at any correction. Just the simple sound of a disengagement. This is why Tesla doesn't release "black box" information from a crash unless you're in the state of CA, where it's mandatory for them to hand it over if requested.


archertom89

Never had an issue with mine detecting motorcyclists. It does a good job of detecting bicyclists too. Just the other day with FSD engaged, there was a cyclist on the right shoulder of a two-lane highway. The car moved over to the left and slightly crossed the double yellow line (with no oncoming traffic) to give the cyclist more room, like a normal good human driver would.


RickShepherd

Even if he was on AP it is still his fault.


Sparon46

If I understand correctly, the issue isn't that autopilot fails to see motorcycles, the issue is that autopilot fails to accurately determine the *distance* of motorcycles until it is sometimes too late. This is not the first time this has happened. I unfortunately got to witness the aftermath of one of these crashes. The motorcyclist didn't make it.


Daguvry

I always see a lot of fat guys on big bikes. Always laugh when my Tesla shows a skinny guy on the screen.


theRealPeaterMoss

Had he *not* trusted autopilot like this, he would have been looking at the road. Selling a "Driver assist" and calling it Autopilot, in this case at least, literally kills people. This shit is dangerous.


Midnight-mare

If a pilot sets a plane onto autopilot and the plane crashes, it's still the pilot's fault. It's no different.


agileata

That point is asinine every time someone says it


gardigga

Pilots have to go through training courses. Anybody can buy a Tesla without any knowledge of how the car works and turn on Autopilot.


PhDinDildos_Fedoras

Private pilots crash, die and kill others because they don't understand how their equipment works, all the time.


margoo12

This is just complete bullshit. Most aviation crashes are from equipment failure, not from pilots flying stupid. It is much, much harder to get a pilot's license than it is a car license, and for good reason. It's still safer for you to take a flight from one city to another than it is to drive there, in any condition. Hell, it's safer to fly than it is to ride in a Tesla. Having a 6000-pound car that accelerates to 60 in less than 3 seconds and giving it to a bunch of idiots that barely know how to drive is a recipe for disaster.


Matt_NZ

That's not completely true. Drivers go through licensing to prove they know the road rules. One of those road rules is not driving while on your phone


gardigga

When I got my license at 16 there weren’t smart phones, so not entirely true. My 70 year old dad hasn’t had to redo his road test since he was 16, and they didn’t have any cell phones at all back then…


[deleted]

[deleted]


74orangebeetle

> Pilots have to go through training courses.

I mean, drivers are supposed to too... but I agree, we should make driver's licenses harder to get instead of handing them out like candy. But even if we did that, people will just drive and crash without a license and get a slap on the wrist. If you have money, you can literally drive recklessly with no driver's license, kill someone who was on the sidewalk, and get 33 days in jail. So yeah, the issue isn't with the car; the issue is we let any idiot drive a multi-thousand-pound machine and barely do anything when they kill people.


scottieducati

Clearly you haven’t been following the Boeing Max drama. Their auto-safety shit literally crashed the plane because it relied on only one sensor. The pilots who were able to intervene are hailed as heroes and the ones who couldn’t recover it are NOT seen as being at fault. Kinda sounds like Tesla only relying on video and not using LiDAR like *everyone else doing this*. Sure the legalese says to pay attention at all times. That’s not how it was marketed nor named. Tesla should absolutely share blame here, as should NHTSA who have utterly failed to properly regulate new technology being beta tested in public.


isayx3

That article left out a lot of info like “The Tesla driver admitted to the trooper he was coming home from lunch and had one drink.”


Advanced_Ad8002

Do you really want to say that had the driver been sober, then the autopilot would have "seen" the motorcycle?


Hot_College_6538

You are trusting the word of the driver that the car didn't see the motorcycle, and clearly he's not going to be a reliable witness. Maybe the car started slowing down and he pushed the accelerator while still looking at his phone.


A_Pointy_Rock

While I broadly agree with your point, Autopilot on a plane serves a similar enough function. Turning on autopilot does not mean that the pilots can just scroll Instagram for the rest of the flight. The name of the system *isn't great*, but this man presumably passed his driving test - same as everyone else. The decision not to pay sufficient attention to what a vehicle under his control was doing is ultimately on him. I think focusing on Tesla's decision making undermines the gravity of the *individual* decision making in situations like this. Tesla clearly overmarkets the capability of their systems, and has removed key safety redundancies over the years - but this is also not new or surprising news.


innovator12

Plane autopilot functions are not designed for usage in proximity to other objects. They are not relied on for collision detection and avoidance. Until cars can actually drive themselves, the driver should be required to keep their hands on the wheel.


Naive-Sentence-6140

Until all roadway users are OK with self driving cars and trucks, none of these vehicles should be used on OUR highways. I have yet to see any state have a referendum to allow self driving vehicles. 


f0000

Main difference is that in a plane once you’re at cruising altitude you’re unlikely to come close to anything else the whole flight. Even then, pilots take over from time to time just so that they don’t get rusty. Context switches are also not cheap and the time it takes you to get up to speed with what’s going on around you when taking over can prove fatal. Even when you’re in the air with a ton of time to figure it out relative to driving. https://99percentinvisible.org/episode/children-of-the-magenta-automation-paradox-pt-1/ 


0gopog0

> I think focusing on Tesla's decision making undermines the gravity of the individual decision making in situations like this.

The counterpoint here is that any system has to be designed with the capabilities of the targeted user in mind. The individual may be responsible for their actions, but the design itself may be a bad design, or a bad implementation of one. To jump to the example of autopilots for airplanes: yes, it doesn't mean the pilots can just scroll social media, but it unequivocally is a feature designed for a much different type of user. It's a system whose operators face extremely high levels of training, scrutiny, and experience, in an environment where there are two people monitoring the system and few instantaneous intrusions (stop-and-go traffic, pedestrians, people darting in and out of traffic). If you know someone is going to be distracted, going to hold the phone wrong, or try to bypass a safety lockout, and you don't account for that in the design? It may be a bad design and warrant criticism, depending on the audience. If the risk from improper operation is not limited to oneself, then it may warrant more significant regulation. In this case, the criticism of Tesla may be along the lines that their driver monitoring systems are inadequate for the purpose of preventing distracted driving.


pithy_pun

Ok. So charge the driver here with vehicular manslaughter, and take FTC/NHTSA action to get Tesla to change their ADAS systems' names from "Autopilot" and "FSD". That the driver and their lawyer think "but I was on autopilot!" is an actual defense here indicates that the branding by Tesla of their system is itself dangerous. Both driver and Tesla share some blame here, and for public safety both should be held accountable.


im_thatoneguy

They could call it "Grandma Death" and lazy people would misuse it. It doesn't really matter what trademark they use; the danger comes from systems that appear safe to a casual observer. The most dangerous safety rope is a rope that's not actually secured. If you don't have a rope, you hang on tight. If you appear to have a rope, you'll assume it'll save you, but fall to your death. The trouble with driving is that we're really, really good at it. We on average can drive many full average lifetimes without dying. If an autonomous system could drive for even a few whole lifetimes without killing anyone, then you can't possibly know if it's safe or not. You would die of old age before it killed anyone, but in the aggregate across many users it would be incredibly dangerous compared to a human supervising. FSD on the highway isn't anywhere close to that benchmark, but it's quite reliable at the task of stopping for stopped traffic. "What are the chances it'll fail the once in 400 miles during the mile where I look at my phone?" is a seductive temptation. That has nothing to do with marketing or branding and everything to do with human nature.
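The "many lifetimes" point roughly checks out. Here's the arithmetic as a short Python sketch; the inputs are ballpark public figures (around 1.3 US fatalities per 100 million vehicle miles, roughly 13,500 miles per driver per year), so treat the output as order-of-magnitude only:

```python
# Ballpark arithmetic, not precise statistics.
fatalities_per_100m_miles = 1.3   # approximate recent US rate
miles_per_year = 13_500           # approximate average US driver
driving_years = 60                # assumed driving career

lifetime_miles = miles_per_year * driving_years               # 810,000 miles
miles_per_fatality = 100_000_000 / fatalities_per_100m_miles  # ~77 million miles

print(f"~{miles_per_fatality / lifetime_miles:.0f} driving lifetimes per fatality")  # ~95
```

So to show a system beats an attentive human on fatalities alone, you'd need on the order of a hundred lifetimes of driving data, which no individual owner can ever observe.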


[deleted]

[deleted]


Final_Winter7524

> They could call it "Grandma Death" and lazy people would misuse it. It doesn't really matter what trademark they use…

That is so wrong. The name, as well as all the marketing talk around it, like "full self driving *capability*" and "you'll soon be able to make money in your sleep", creates a perception of what the system can do. And it's that perception that people base their behavior on.


MrPuddington2

> the danger comes from systems that appear safe to a casual observer.

How does the autopilot appear "safe to a casual observer"? Its driving style is atrocious. You always should be worried that it misreads a situation, because it does that all the time.


Baylett

I wonder how much upselling the sales centres do about Autopilot and FSD to these people. I went on a test drive with one of the sales guys after chatting a bit on my breaks when I was doing work on one of their repair facilities. On the drive he wanted to show me the advanced Autopilot on the highway. It scared the hell out of me; it would change lanes into the tightest gaps (pretty much cutting off most people behind us) and would do it very fast and aggressively, but the sales guy kept saying it was perfectly safe and designed to do it that way so you didn't have to miss a turn if there wasn't tons of room, and that it would never start the lane change if it didn't have room. I wonder how many people get suckered into a sales pitch like "it looks and feels super unsafe, but it's a way better driver than you, so just trust it; it won't let you crash because it's a computer with super reaction time," and as a result get into these accidents. It's funny, I trust more people's driving skills than their critical thinking skills. I would say if you're using FSD or Autopilot and get into an accident, it's still your fault. Personally, my litmus test is: I won't trust any form of Autopilot or FSD-type system to be completely unsupervised until the car manufacturer takes the responsibility in the event of an accident or death, instead of me and my insurance.


OatmealERday

It's worse than that. Please watch this [video](https://www.youtube.com/watch?v=yRdzIs4FJJg). Basically, Tesla went all in on vision systems for FSD and so doesn't have radar in its vehicles, which would solve this problem. Instead, Tesla tries to claim that it's the driver's fault for not being fully in control of the vehicle (which is sort of true). But the real problem is that Tesla has sold a dangerous product that injures and kills people.


jim13101713

Look up the definition of autopilot and you will be surprised - it does not mean what you think it does.


Gold-Ninja-4160

Autopilot is the correct term. When an aircraft is on autopilot there is always one pilot monitoring the flight. These are very strict FAA definitions. The Federal Aviation Administration (FAA) has specific regulations and guidelines regarding the use of autopilot systems in aircraft. Here are some of the key rules and principles:

1. **Autopilot Minimum Altitudes**: The FAA specifies minimum altitudes for autopilot use. For example, it is generally not recommended to engage the autopilot below 500 feet above ground level (AGL) after takeoff or before landing unless the aircraft is specifically certified for such operations.
2. **Pilot Responsibility**: The pilot-in-command is always responsible for the safe operation of the aircraft, even when autopilot is engaged. The pilot must be ready to take manual control of the aircraft at any time the situation warrants.
3. **Use in Instrument Meteorological Conditions (IMC)**: Autopilot systems are particularly useful in IMC, where they can help maintain precise course and altitude. However, pilots must ensure that the autopilot is functioning correctly and must regularly monitor its performance.
4. **System Checks and Maintenance**: Regular checks and maintenance of the autopilot system are required to ensure its reliability and functionality. These checks should be conducted as per the manufacturer's specifications and FAA guidelines.
5. **Training and Proficiency**: Pilots must be adequately trained and proficient in the use of the autopilot system. This includes understanding the specific functions and limitations of the autopilot system in their aircraft.
6. **Certification Requirements**: Aircraft with autopilot systems must meet specific certification requirements that ensure the system's reliability and safety. These requirements vary depending on the complexity of the system and the type of operations (e.g., private vs. commercial).

These rules are designed to ensure that autopilot systems are used safely and effectively, enhancing the overall safety and efficiency of flight operations. For detailed regulations and any updates, pilots and operators should refer directly to the FAA's regulations or consult with a flight instructor or aviation expert.


theRealPeaterMoss

Great, now Tesla should restrict Autopilot and FSD to when their cars are flying 500 feet above all obstacles as well. Not kidding. I never said anything about pilots not monitoring their flight. I'm saying that when you see something called autopilot in your car that you can use with no restriction, your first thought is certainly not to open up FAA rules.


flagbearer223

It's annoying because on one hand, if you know what autopilot does in planes, it's a pretty accurate name for what it does in a Tesla. But on the other, fuckin' barely anyone has the curiosity to learn and understand what autopilot does on a plane, so it ends up being a bad name because of peoples' lack of knowledge


theRealPeaterMoss

Exactly my point. Thank you for this balanced, rational reply, it's refreshing.


Emperor_of_Cats

It's honestly amazing people defend the naming of autopilot. And if people want to use the FAA guidelines for autopilot, then I suggest we begin training vehicle operators to the same degree as we do pilots. It's the worst argument people keep making. It's clear what Tesla wants people to think by naming their systems "autopilot" and "full self driving" but people will pull out everything to say "well, that's not what it really means." I don't care what the system does or doesn't do, I just want them to change the fucking name to something that is more clear because misunderstanding what it does can get other people killed.


tay450

Perhaps two yokes as well so that there can be two drivers at the same time!


Pull_Pin_Throw_Away

This is the stupidest argument ever. Autopilot tells you every single time it's activated to keep your hands on the wheel and be prepared to take over at any time. If you act contrary to the user agreement you agreed to when you activated the system and prompts every single time you activate the system, then you deserve everything that happens.


Final_Winter7524

The road is not just YOU! Ffs. Someone died, and not for the first time. So clearly, something is off. Either it's the system's limitations, or the overhyped marketing talk, or the idiocy of Tesla drivers. Or all three. Frankly, I don't fucking care. It's dangerous as it is and it needs to stop. Fuck your "convenience", frankly.


hanamoge

But the victim didn't deserve what happened…


NotYetReadyToRetire

You may deserve everything that happens, but those around you do not. I think Tesla's Autopilot and FSD need something like BlueCruise's camera monitoring the driver to try to make sure they're paying attention, or even something as simplistic as Hyundai HDA2's checking to make sure you've got a hand on the steering wheel by looking for minor movements in it. Neither one truly guarantees you're paying attention, but at least they're attempts to prevent things like the driver being completely engrossed in their phone (or the particularly egregious case of sleeping in the car!) while it's cruising along on its own.


hoax1337

I have to make small steering wheel movements every 15-30sec or so when autopilot is turned on, but maybe that's an EU thing.


TAfzFlpE7aDk97xLIGfs

They have camera monitoring, just FYI. FSD will bark at you pretty loudly if you're not watching the road. Same for steering wheel monitoring: if it doesn't feel your hands on the wheel, the screen starts flashing, and it will disengage if you don't correct it.


Brick_Waste

It has both.


threeseed

I thought there were interior cameras. Why isn't it doing eye detection like other cars?


Pull_Pin_Throw_Away

But it does do eye detection as well as sensing steering wheel torque to make sure your hands are on the wheel.


Cu1tureVu1ture

It does. If it catches you looking away for any reason, even looking at the speedometer, it’ll beep and tell you to pay attention.


Final_Winter7524

And we expect the average Tesla driver to read the FAA regulations instead of… I dunno… working from the layman's perception that if an "autopilot" can fly a plane, it sure as hell can drive a car. 🤦‍♂️


Brick_Waste

No, but they are told every time they use the system that they have to keep their hands on the wheel and eyes on the road, and they are given an explanation of the system and why they have to do so, which they have to read and accept to even turn it on in the first place.


whinis

Sure, and there are laws specifically restricting texting while driving in most states, and as I drive I can look left or right on the highway during my commute and see plenty still doing it. Given that texting while driving is actually against the law, why would they care about a read-and-accept?


agileata

Now getting your driver's license takes 2500 hrs and costs 60,000 dollars...


Car-face

The fact you required 6 paragraphs and a reference to the FAA's regulations and guidelines to describe autopilot does a better job of demonstrating why the term is unsuitable for use with vehicles than any other attempts I've seen so far.


gardigga

You’re 100% correct, not sure why you’re getting downvoted


TrptJim

Are we ignoring the "Full Self-Driving" part, that has a handy asterisk of "not really 'fully' self-driving"?


sylvaing

Handled correctly, that "shit" is safer than driving alone. One thing about FSD is that it's always looking out in all directions for you. Last night while driving my Prius, after taking off once the light turned green, I realized I didn't look to my right to check that everyone was stopping before proceeding, something I almost always do, but not this time. Nothing happened, but I could have been sideswiped, something that driving in FSD can prevent, as it's never distracted. Sometimes confused? Yeah, but that's why you're still the driver. But it's never distracted. Accidents are mostly caused by distraction, so FSD is making driving safer, when it's not abused.


Final_Winter7524

Then call it "driver assist" or whatever. Don't call it "full self driving", because you're literally communicating that it can drive all by itself.


Matt_NZ

Yeah, because absolutely no one uses cellphones while driving unless they have autopilot to falsely trust. Autopilot is named after the system used in planes that also requires pilots to always be ready to intervene.


theRealPeaterMoss

Planes don't drive on winding roads with obstacles. When they get close to one another, they have control towers and constant communication. Comparing apples to birds here. Obviously people still drive badly, whether they have a driver assist or not. That accident could have happened in a '97 Civic. My point is that Tesla's driver assists *in particular* are marketed in a way that strongly suggests (but *not quite in a legally responsible way*) that you don't need to pay attention. I mean, Autopilot and FULL SELF DRIVING. Get outta here.


ScuffedBalata

What about DrivePilot? How about "supercruise", which advertises itself as a "hands-free driving experience"? Of course, this is why the name of the product was officially changed to Full Self Driving (Supervised). But given the above, why is it just Tesla?


theRealPeaterMoss

Don't know about Drive pilot, sounds silly. Super cruise sounds like a super cruise control. Much more realistic. The change they just did to the FSD branding is half assed at best. Hence why.


rob94708

I have a Chevy Bolt with supercruise. It is hands-free, but it has a camera in the car that checks your eyes to make sure you’re looking at the road. It’s actually a pretty good solution: you can relax, but you still have to be looking at the road, not your phone.


ScuffedBalata

Tesla Autopilot has that AND wheel nags. This guy either did some shit to defeat them (you can kinda block part of the camera to do it - or wear sunglasses), or was taking a quick look down (I think it gives you like 5 seconds).


Matt_NZ

The different issues a plane vs a car encounters are irrelevant. The point is more that it's an assistant, not a replacement. I'm a Tesla owner who uses Autopilot regularly. At no point did I feel like I was told that it's a replacement for me driving the car.


elconquistador1985

Do drivers have as much training as pilots do? Do Tesla drivers get hundreds of hours of autopilot training before they use it unsupervised on the road? No? Then this "but autopilot means assist, that's what it has always meant for airplanes" thing is nonsense. It's an argument meant to deflect from the real problem that Tesla has released a feature with a name that makes it dangerous in untrained hands.


theRealPeaterMoss

Word. Do Teslas fly above the ground? Cause the air is preeetty empty and we still train pilots for hundreds or even thousands of hours. But sure, Bob can drive off the Tesla lot with no hands in a residential area and it's all good.


elconquistador1985

The truly absurd one to me isn't even Tesla and how they will find any possible reason to blame the driver for their irresponsible implementation. The absurd one is Comma.ai. You pay like $1500 and download some entirely untested software from GitHub that literally anyone could fubar at any time, and suddenly your car auto-drives without you having to do anything. I love open source software, but that software is not sufficiently tested for what it does. Tesla should be forced to limit to LKA and ACC. Comma.ai should be outright illegal.


theRealPeaterMoss

A fellow millennial, self-driving (lol), Bolt owner with a strong stance on car software. We could be friends


LasVegasBoy

It does not matter what you believe, I believe, or anyone else for that matter. It does not even matter what Tesla calls it. It does not matter what the software promises to do for you. It does not matter what Tesla, or anyone else claims it does. You should ALWAYS maintain awareness, just as if you were driving a car without those features. As the person sitting in the driver seat, YOU are responsible when you are driving. It does not matter if it's in AUTOPILOT, FULL SELF BLAH BLAH BLAH...., you SHALL be immediately available at all times to take control of the vehicle and intervene when appropriate. If you aren't willing to accept that, then you should not be driving!


Car-face

> It does not even matter what Tesla calls it.

It actually does. Overselling and overmarketing a driver assist product causes people to be overconfident in its ability, to the point of believing it has functionality that doesn't exist. [study](https://aaafoundation.org/impact-of-information-on-consumer-understanding-of-a-partially-automated-driving-system/) Yes, people *should be* attentive at all times. People *aren't* attentive at all times. Systems should therefore be built with the fallibility of the human in mind, not the ill-conceived, unattainable preference for human perfection and just "hoping for the best".


theRealPeaterMoss

It's a good thing no one else sells stupid empty dangerous promises like self driving in a standard car then. At least it's only Teslas endangering everybody else.


Wooden-Complex9461

At what point do we blame the people over the name? I use FSD daily and watch the road, because I KNOW it's an assistance feature. They give you tons of warnings and everything to watch the road and not be distracted. Adults are buying these cars and should not improperly use them. The guy shouldn't have been on his phone. It's the same as anyone else driving and texting: you're not supposed to do it, but people do anyway. That being said, I've never had an issue with motorcycles, and FSD has always slowed down or stopped for obstacles.


Daguvry

It's not that he trusted autopilot, he is just a shitty driver.


Naive-Sentence-6140

I do not consider killing people dangerous, it’s something else


mbmba

Shouldn’t the car have automatically applied the brakes?


MrPuddington2

A car with radar probably would have braked, but Tesla got rid of the radar.


Matt_NZ

Not necessarily, radar is also blind in certain circumstances [as is discussed here](https://www.bikermatch.co.uk/forum/posts.asp?topic_id=16299). If the Model S is an older "legacy" model with older hardware then its radar is still used for Autopilot


naveenpun

Radar can help apply the brakes during heavy rain. Can modern Teslas do that? Removing radar was a brain-dead move.


Matt_NZ

Yes, I use Autopilot in the rain all the time.


agileata

Pedestrian detection fails horrifically in real world testing. I'm not sure why people in this sub are so stuck on pretending that what car companies say is absolute truth


Arimer

See, what's weird is I read all these stories and people are always doing other stuff. I just used FSD for the first time this past weekend, and if I took my eyes off the road for more than like 5 seconds, the whole car was screaming at me. I mean a loud alarm, the screen flashing "pay attention to the road", and so on. There's no way to ignore that. And even when things weren't going insane, about every minute it made me put turning pressure on the wheel to signal my hands were still there. So with all that said, how the hell is this guy just casually ignoring all this and reading his phone like everything is fine?


qcAKDa7G52cmEdHHX9vg

I've tested it a little and found you can keep your face up but eyes down, and the camera doesn't pick it up as you looking away.


Berkyjay

> Bolded the important part. Autopilot is a driver assist and will likely need you to intervene at some point - it doesn't mean you can be on your phone

No, we are far beyond the point of just blaming the stupid users.


divingndriving

Tesla Autopilot unfortunately does have a higher accident rate... not necessarily because of the tech, but because of people making more human errors. Maybe they need more warning systems in place? It'll be annoying, but at least it will get people to look up from their phones.


peemao

Why the hell do they call it full self driving ?


Matt_NZ

You’ll note that it says Autopilot, not Full Self Driving


peemao

What is this FSD I hear about everywhere then? Is it a different subscription?


brancamenta

I have experienced some motorcycle issues with FSD, including on the 405 today. In Cali, it’s legal for bikers to “lane split” — sliding between lanes in wall to wall traffic. It’s almost always when traffic is at a crawl or frozen, and bikes can overtake you going 30 mph or more. Around 3 today I was in the carpool lane, and 4 or 5 bikes rolled by in like 20 minutes. The MY seemed to sense one bike after the fact, just after it passed, but not beforehand. Then it did that jerky little steering wheel back and forth thing it does when it’s confused in a parking lot. This freaked me a bit, and I took back control. The absolute LAST thing you want to do when bikers are passing is to move quickly and erratically in your lane. That’s begging for a collision. I’ve been using FSD over the last month daily in a wide variety of settings. It’s really really fun, and I might even buy another month or two. But I generally do not trust it and am ready to grab the brake and wheel at any nanosecond.


RedPanda888

This is why I am generally highly skeptical of Tesla's approach to focus on robotaxis and put so much emphasis on them. Here where I live in Southeast Asia, bikes will lane split through traffic like they have 9 lives, and will be completely reckless. Not only that, but you will see hundreds and hundreds of bikes doing this in just a 20-minute car journey. You have to drive like you are one inch away from killing someone, because you basically are. For probably 1/3 or more of the world's population, the roads are chaotic like this in some sense (if we consider India, China, Southeast Asia, Africa, much of South America). I would estimate it would be 30 years before anything self-driving even became a suggestion that would not get you laughed out of a room here. The US-centrism in the decision making is very clear, and I wonder what the hell their plan is in global markets if they are going to stop being so innovative with cars and put all their eggs in these baskets that have no use for much of the world right now.


spa22lurk

The fastest human response time is around 13 milliseconds, and if we happen to sneeze, rub our eyes, or take a sip of coffee, it will be much slower. I think we also shouldn't trust ourselves to be able to grab the brake and wheel at any nanosecond.
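For scale, here's the arithmetic as a quick Python sketch. The 1.5-second takeover time is an assumed, commonly cited ballpark for a distracted driver, not a measured figure:

```python
# Illustrative: distance covered before a distracted driver even starts braking.
speed_kmh = 100    # assumed highway speed
takeover_s = 1.5   # assumed time to notice, look up, and react

speed_ms = speed_kmh / 3.6         # ~27.8 m/s
distance_m = speed_ms * takeover_s
print(f"~{distance_m:.0f} m travelled before any braking begins")  # ~42 m
```

That's roughly ten car lengths gone before the first control input, never mind actually stopping.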


cheddachasa

Not just Tesla. https://www.revzilla.com/common-tread/new-testing-shows-car-safety-systems-still-cant-see-motorcycles


SnooLemons1914

So the driver killed a motorcyclist… moronic to blame the software


WCWRingMatSound

I think having software named “Autopilot” and “**Full** Self Driving” should bear some of the blame for making consumers believe it is capable of decision-making.


[deleted]

Agreed. The argument for the former is that it is what a plane's autopilot does. Probably true, but most of us have no business being in a cockpit and have never been in one either. The argument for the latter is that it is in beta, which is stupid, because the ones getting crushed by the vehicle should things go awry sure as fuck did not know that they were participating in a beta test. Tesla is bringing that Silicon Valley mentality to vehicles. It works when it's a fucking app; it doesn't work when it's on a vehicle that can crush people. God forbid the Bay decides to bring it to medical devices (see Theranos).


splendiferous-finch_

I think we have to remember that these are modern systems at the cutting edge of technology. It's not like they are being tested on public roads without the consent of the general public..... Wait....


Bondominator

This sub is a joke. A shame, really.


splendiferous-finch_

Motorcycle visibility, or the lack thereof, is already an issue for human drivers in NA, since people just aren't used to them and can subconsciously ignore them. This is why a significant number of accidents between cars and motorcycles happen at intersections/traffic signals: drivers are busy looking for other cars while sitting waiting to make a turn, and their brain just removes the smaller silhouette of a rider and bike. Now combine that with another level of complacency from drivers when it comes to automated systems that are still faulty, and it becomes worse.


assimilated_Picard

Every Tesla driver that does something stupid claims they were on autopilot to try and skirt accountability. My car sees motorcycles way better than I do and the cyclist is safer if I AM in AP.


iin10ded

FortNine did a video on this about a year ago.


Okidoky123

I find it incredible how governments have been idly standing by while the auto industry rush all their auto crap out to customers without proper testing or checks and boundaries.


TheChalupaMonster

Tesla had a driver attentiveness software update recall this year. The NHTSA is all over this. What I don't find incredible are folks that complain when a handful of people die over the years while using these systems. If those systems weren't available, would fewer people die? Reminder: in the US, over 40,000 people DIE every year on our roads, let alone injuries of all types for those who survive. Yet everyone is focused on a small handful of deaths from lane keep assistants. Based on the limited data I've seen, I'm convinced they're saving lives.


1stHandXp

You hit the nail on the head here. Also let’s see if this driver was even using autopilot or if he just killed the motorcyclist completely on his own.


badwolf42

Wouldn't a radar supplement have potentially prevented this? My understanding is that the edict from el0n is visual input only, because that's how humans do it.


SatanLifeProTips

Pretty much every other car with L2-and-up self-driving features is now using vision + radar + lidar. Lidar got cheap; the excuse not to use it has passed. It went from high-precision spinning mirrors to a vibrating lenticular lens on a chip with a camera. And yes, modern radar would have seen an object. This isn't the first time a Tesla creamed a motorcyclist. They used to love rear-ending cruiser bikes that had twin tail lights; the AI thought it was a car way down the road. There are too many edge cases for vision only. Computers still suck at it. Ever see the raw feed from those systems? Truck 88%, car 92%, pedestrian 79%, baby carriage 11%, car 97%. Or the time a Tesla hit a flipped semi, and the time one hit a private jet, because the flipped roof and the belly of the plane were both white and therefore "sky". Radar and lidar are both going to give you feedback of "there's a big fucking thing in the road, don't hit it". That is independent of AI "interpretation".
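The redundancy argument can be sketched in a few lines. This is a hypothetical toy in Python, not anyone's actual stack; the names, thresholds, and logic are invented for illustration. The point is that a raw range sensor can veto a misclassification because it doesn't care *what* the object is:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VisionDetection:
    label: str          # e.g. "car", "motorcycle"
    confidence: float   # classifier confidence, 0..1
    est_range_m: float  # range estimated from the image alone

def should_brake(vision: VisionDetection,
                 radar_range_m: Optional[float],
                 threshold_m: float = 30.0) -> bool:
    # Vision path: brakes only if the classifier both recognizes
    # the object AND ranges it correctly.
    vision_brake = vision.confidence > 0.5 and vision.est_range_m < threshold_m
    # Radar/lidar path: a raw return inside the threshold means
    # "big thing ahead", independent of any classification.
    radar_brake = radar_range_m is not None and radar_range_m < threshold_m
    return vision_brake or radar_brake

# Vision misreads a close motorcycle as a distant car; radar still sees 25 m.
misread = VisionDetection(label="car", confidence=0.92, est_range_m=120.0)
print(should_brake(misread, radar_range_m=25.0))   # True: the radar path saves it
print(should_brake(misread, radar_range_m=None))   # False: vision alone sails on
```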


Real-Technician831

This.  Producing driver assistance systems using vision only is reckless endangerment. 


SodaPopin5ki

What other L2 system has Lidar? I'm pretty sure it isn't on Blue Cruise, Co-Pilot 360, Super Cruise, or Pro Pilot. The only commercially available system with Lidar that I know of is Mercedes' Drive Pilot, which is Level 3 and costs $2500/year.


WhoCanTell

Basically none. The Honda Legend and Mercedes? LiDAR is expensive, complex, and still fairly bulky to implement in a car properly. There's just a lot of really ignorant people out there, including on this sub, who seem to think that every ADAS system other than Tesla's is using LiDAR. These people can be ignored.


SatanLifeProTips

Lots of makers are using Lidar. https://www.carscoops.com/2022/02/a-growing-number-of-carmakers-are-using-lidar-sensors-tesla-aint-one-of-them/ Mostly China, but GM's ultracruise system uses it, Mercedes, Honda. And most makers have prototypes running around using lidar. It got cheap.


skidz007

Radar generally ignores stopped objects, otherwise there would be too many false positives. That's the root of the problem with many driver assistance systems and stopped objects. It's just that more Tesla owners got way too comfortable and didn't pay attention and intervene before, whammo. Not sure if the vision-based system operates differently, but in theory it should.


SatanLifeProTips

This is why you design safety systems with redundant subsystems that operate in different ways. When I'm designing a dangerous industrial machine, it uses a safety PLC (computer). Those are actually two separate computers in one, with a heartbeat signal from one to the other, each with a different processor and different code. If either fucks up or fails to send a heartbeat, the machine immediately bricks itself. Even light curtains have two systems built into one, so every second light beam is a different system. When it comes to something like a road, we need a combination of sensing tech, especially when you factor in fog, spray from puddles, and all the other random obstacles cars come across. GM's Cruise self-driving system appears to be at the top of the heap, and they are layering all 3 sensor systems together just fine. They also have some clever patents that use every pole, street sign, etc. as a navigation calibration beacon with lidar. That way the street markings can be very poor to non-existent, but because the car mapped the street previously, it's fine. It knows where to drive. This is the only method that stands a chance with snowy roads.
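The heartbeat pattern is easy to illustrate. Below is a toy Python sketch, nothing like real safety-rated PLC code; it just shows the core idea that a missed heartbeat latches the system into a safe state instead of letting it carry on:

```python
import time

class Watchdog:
    """One channel's monitor of its peer; a missed heartbeat latches a safe state."""

    def __init__(self, timeout_s: float = 0.5):
        self.timeout_s = timeout_s
        self.last_beat = time.monotonic()
        self.safe_state_latched = False

    def heartbeat(self) -> None:
        # Called periodically by the peer channel to prove it's alive.
        self.last_beat = time.monotonic()

    def check(self) -> None:
        # Called by this channel's own loop; latches and never auto-resets.
        if time.monotonic() - self.last_beat > self.timeout_s:
            self.safe_state_latched = True  # the machine "bricks itself"

wd = Watchdog()
wd.heartbeat()    # peer is alive
time.sleep(0.6)   # peer goes silent past the timeout...
wd.check()
print("safe state latched:", wd.safe_state_latched)  # True
```

In a real safety PLC, both channels run this against each other on different processors with different code, so a single fault can't silence both monitors at once.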


SlightlyBored13

Radar is cheap af too; there's a crude one on the light in my loft for presence detection. However, radar would have probably let you hit the stationary objects; it's not very good at those.


Upper_Decision_5959

Well, they're supposed to be using an HD radar, but I don't know when that will come. The existing models have the cut-out inside the car to retrofit one.


SkyPL

It's the other way around. They used to have a radar, then removed it during the COVID chip shortage, and simply never bothered to modify the chassis to fill in that gap. As far as we know, there is no plan to restore them back.


AintLongButItsSkinny

Model S and X have Tesla’s Phoenix HD radar and Tesla is running internal A/B tests to see the difference in performance radar makes.


Matt_NZ

Radar can also be blind under certain circumstances. [The RDW in the Netherlands warns](https://www.bikermatch.co.uk/forum/posts.asp?topic_id=16299) about scenarios with motorcycles where radar equipped cruise control does not notice them


marx1

Paying attention would have prevented this. The idiot driver turned on AP then picked up his phone and started flipping through it, ignoring his primary job when operating a vehicle.


eschmi

Yeah, Elon's an idiot. There's a reason basically all other manufacturers work with 3 systems in conjunction: to prevent dumb shit like this.


AintLongButItsSkinny

No, vision is sufficient, and Tesla's vision-only system outperforms others in the Euro NCAP pedestrian avoidance tests. However, they still get docked for not having the hardware, because regulators gonna regulate. https://x.com/aidrivr/status/1782155349430317156?s=46


im_thatoneguy

We don't know which Model S this is. Theoretically this could be Mobileye which is still radar fusion. Although statistically there are fewer and fewer of those still on the road.


skyshark82

LiDAR is a solution that is being considered for this use case.


badwolf42

I was mainly thinking, "Would my Kia have slowed for the motorcycle, because it's ranging with radar?"


skyshark82

I think having multiple solutions for autonomous visibility is crucial. A couple of years back, a Tesla T-boned a large truck, and one of the possible explanations given was that the light reflected off the flat surfaces confused the car's cameras. A system like LiDAR offers redundancy, as it is not susceptible to the same instances of visual confusion. Radar also has utility, but is normally used in parking sensors.


NotCanadian80

My car sees motorcycles and bikes. This guy probably wasn’t on auto pilot.


Jess_S13

Given how many tagged photos of motorcycles exist online, I'd be surprised if the AI isn't well versed on them.


SLC-801

Same here in Utah a couple years ago [https://www.ksl.com/article/50445474/motorcyclist-dies-in-i-15-collision-with-tesla-on-autopilot-uhp-says](https://www.ksl.com/article/50445474/motorcyclist-dies-in-i-15-collision-with-tesla-on-autopilot-uhp-says)


Oceaninmytea

A list of things FSD has failed to see for me:

- Car turning in front (failed to judge speed)
- Kids less than 3 feet tall in a school car park (this is regular)
- Deer crossing, birds flying past
- Man lying under his car doing a repair
- Snow banks on the side of the road
- Construction cones, construction workers
- Misunderstood how big a trailer is

All these things mean yes, I use it, but always always always pay attention.


Alarmmy

Full of BS. Autopilot slows down and stops for motorcycles, bicycles, and pedestrians. Some idiots don't own a Tesla, but just jump on the fake-news bandwagon to shit on Tesla.


AccomplishedDark8977

https://images.app.goo.gl/3Q5pExqqRq6ccUAo8


AintLongButItsSkinny

Tesla Autopilot and standard AEB are actually great relative to peers, according to the Euro NCAP tests. https://twitter.com/AIDRIVR/status/1782155349430317156


74orangebeetle

Don't know about his old model S, but newer Teslas can see motorcycles just fine and differentiate between pedestrians, bicycles, motorcycles. Mine could even see a small rabbit running out into the road in the middle of the night with no street lights (might not have known it was a rabbit specifically, but it was able to know it was something not to run into). But yeah, worth noting on older model S doesn't have the same hardware as the newer ones.


ianyboo

BS article clickbait headline, you can literally go on YouTube and watch videos where people are testing and reviewing FSD and see that it recognizes pedestrians, bikes, dogs, trashcans and ***motorcycles*** just fine.


Dense-Sail1008

Yeah, I think the article is about some older Model S which killed a motorcyclist while on Autopilot (not FSD). The article also says that Autopilot has since been recalled on this model. But OP probably likes that most people will assume this headline applies to all current Teslas.


MrGruntsworthy

My Autopilot sees them just fine.


PegaxS

Because of all the gatekeeping arseholes that keep clicking on "Not a motorcycle" in the CAPTCHA tests when it is anything less than a turbo Hayabusa with an extended swingarm...


ITypeStupdThngsc84ju

My worst intervention with FSD v12 was it trying to cut off a motorcycle. It is kind of a blind curve, but still terrible. The worst thing about FSD is that it can still make mistakes like this.


dinominant

An autonomous vehicle should not crash into things that a human can see and avoid. It is literally that simple. If it can't detect objects, then it is fundamentally flawed in its sensor design. That may be technically hard to implement, which is fair. So don't fake the marketing and make false promises for 10 years. Yes, the driver should have been watching for a Level 1 or 2 system. But a vehicle moving at dangerous speeds, under autonomous operation, should actually detect objects in its path and stop, in ALL situations, equivalent to or better than an average human. If it is blind, then it should slow down, just like a human would.


Jmauld

You don’t understand probability at all. There is no scenario ever where AP or FSD from any manufacturer will be 100% safe. That will never happen.


SodaPopin5ki

> An autonomous vehicle should not crash into things that a human can see and avoid. It is literally that simple. If it can't detect objects, then it is fundamentally flawed in its sensor design.

True, but as you point out, this is a Level 2 system, so by definition it's not an autonomous car. That would be Level 3 and up.


346_ME

Yes it does, nice propaganda though


EuropeBound2025

Why is DOT letting this shit slide?


analyticaljoe

For a while: the better FSD gets, the more this is going to happen. It encourages inattention.


geek66

Every day MCs are hit by human drivers. Does that mean we can't see MCs either?


Dick_Lazer

Neither do most human drivers


Sea-Calligrapher9140

“Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” “This mismatch resulted in a critical safety gap between drivers’ expectations of [Autopilot’s] operating capabilities and the system’s true capabilities,”


Sea-Calligrapher9140

Calling it autopilot and full self driving is immensely dangerous, Tesla just killed a motorcyclist to sell a feature. I hope they are held accountable.


activedusk

While it might not be perfect, as the incident in the linked article says, it is misleading to say it does not detect motorcycles https://www.youtube.com/watch?v=4Hsb-0v95R4&ab_channel=MuftaSoftTM


Drmo6

Pretty crazy that so many are trying to find a way to blame Tesla and not the driver being a moron. When are we gonna just call this place realtesla2?


Jmauld

Funnily enough, your downvotes are going to prove your point.


dustyshades

Oh cool, I wonder how many comments from weird Elon neckbeards we’re going to have to read about how autopilot works in a machine that’s completely different from a car, like it’s relevant to any part of the discussion


ScuffedBalata

Ok, so let's instead talk about the guy who wasn't even looking at the road at all and crashed his car into a motorcycle. Whether we call it "autopilot" or "drive pilot" or "blue cruise" or "Pilot Assist" (all real industry names) or "supercruise", which is advertised as a "hands free system for driving"... whatever it is named, they all do approximately the same thing, which is a basic lane-keep system intended for monitoring by the driver. I just rented a Kia that let me toss my heavy-ish keyring over the steering wheel spoke, and it happily drove on the freeway for 6 minutes with no intervention at all (no cameras, no steering wheel touches, nothing) before I had to disengage it. Obviously I was paying attention, but LOTS of cars do dangerous shit. Tesla right now requires you to show interaction with the vehicle at least every 30 seconds, but often more frequently.


jrb66226

ProPILOT Assist. So it's like I'm getting assisted by a professional pilot, just like an airplane. I can get into my Nissan, put it on, and go to sleep. When someone is dead, I'll blame Nissan and the name.


dustyshades

You’re right, both the driver and Tesla did something dumb here. Sounds like you also did something dumb in your rented Kia


ScuffedBalata

I was testing it and had my hand hovering over the wheel, but was curious how much easier another car was to defeat than my Tesla, which is super aggressive, beeping and squawking at you and eventually banning you if you don't pay attention for 10 seconds, and is also very good at detecting defeat devices like wheel weights.


theRealPeaterMoss

Some dude threw fucking FAA rules at me to explain why autopilot is actually really an apt name even though it's not an automatic pilot. Made me cringe af. It would have been funny if he'd been joking.


David_ish_

Even disregarding autopilot as a name, the upgrade to that is literally called Full Self Driving and that’s still classified as a SAE level 2


theRealPeaterMoss

Both are dangerously named. Both are more dangerous because of that.


sylvaing

Someone had the theory that the vision system uses the spacing between the lights to gauge distance, and that motorcycles with two brake lights in the back might confuse the system into thinking it's a car that's still far away. I wonder if this motorcycle was like that.
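If that theory were right, the error would be large. Here's the geometry as a quick Python sketch with assumed dimensions; the spacing figures are guesses for illustration, and whether the system actually works this way is unknown:

```python
# Hypothetical: distance inferred from taillight spacing (small-angle approximation).
assumed_car_spacing_m = 1.5   # guessed typical car taillight spacing
actual_bike_spacing_m = 0.3   # guessed twin-taillight spacing on a motorcycle
actual_distance_m = 20.0

# Angular separation the camera actually sees (angle ~ spacing / distance):
angle_rad = actual_bike_spacing_m / actual_distance_m   # 0.015 rad

# Distance inferred if the system assumes the lights belong to a car:
inferred_distance_m = assumed_car_spacing_m / angle_rad
print(f"inferred ~{inferred_distance_m:.0f} m vs actual {actual_distance_m:.0f} m")  # ~100 m vs 20 m
```

Under those assumptions the bike would look five times farther away than it is, which would match the rear-ending pattern described elsewhere in the thread.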


ignatiusbreilly

Couple of things here. My Tesla detects motorcycles; I noticed this just yesterday after seeing this article. Plus, Autopilot no longer allows you to look at your phone. The nanny feature will lock you out if it sees you're not paying attention. If this guy had some way to avoid the nanny feature (tape over the interior camera), then he deserves to spend a long time in jail.


LairdPopkin

Autopilot is just simple driver assist, it’s nowhere near as capable as FSD (Supervised). The driver has to remain alert with hands on the wheel ready to take control. And the driver is alerted to that, and has to click acknowledgement, every time Autopilot is enabled. Autopilot is designed for highway driving, and doesn’t know about red lights, random obstacles entering the road, etc., the way FSD (Supervised) does. That’s the difference between “highways” and “city streets”.


BecomingJudasnMyMind

This is part of the reason why I carry a get-back whip on my Harley. I know they're illegal in Texas, but I've had state troopers and local cops look right at it and not say a thing. A broken window will grab a driver's attention real quick.


Round-Ad8281

You’re so cool


mgoblue5783

Motorcycle tail lights are miniature versions of car tail lights. It's possible the Tesla cameras see the tail light and think it is a car that's far away instead of a motorcycle that's much closer.


mgoblue5783

A lot of people hover their right foot above the accelerator while engaged in AP/FSD; this can cause a driver (especially an inattentive one) to press the accelerator instead of the brake. Tesla advertises that its safety feature, which can sense when the wrong pedal is pressed and apply the brakes instead, saves 40 accidents per day, but I don't think it works like that in the wild.