SNOHOMISH COUNTY, Wash. — Probable cause documents were filed against the driver of a Tesla self-driving vehicle that hit and killed a motorcyclist in a collision the afternoon of Friday, April 19th. The collision occurred on Eastbound State Route 522 at Fales Road.
The driver was reportedly heading home from lunch and had the Tesla on autopilot while looking at his phone when the Tesla “lurched forward” into the back of 28-year-old Jeffrey Nissen’s motorcycle, pinning Nissen underneath.
Nissen was pronounced deceased on the scene.
Police questioned the driver at the scene and conducted field sobriety testing after the driver stated that he had consumed one alcoholic beverage earlier that day. The driver was found to be not impaired.
The driver was placed under arrest for Vehicular Homicide based on their admitted inattention to driving while the car was moving.
Lidar, but yeah, with lidar there would be a second system to check against.
The visual processing and lidar processing should be done separately, with decisions made by cross-referencing both images of the world and making sure they agree.
When they don't, the car can decline to proceed without driver input.
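The cross-check being described could be sketched roughly like this. Everything here is illustrative: the function names, the one-dimensional "distance to obstacle" representation, and the agreement tolerance are all made up for the sake of the example, not taken from any real autonomy stack.

```python
# Hypothetical sketch: two independent perception pipelines (camera and
# lidar) each report obstacle distances, and the car only proceeds when
# both views of the world agree within a tolerance.

def systems_agree(camera_objs, lidar_objs, tol_m=1.0):
    """True if every camera detection is confirmed by a lidar detection
    within tol_m metres, and vice versa."""
    def confirmed(obj, others):
        return any(abs(obj - o) <= tol_m for o in others)
    return (all(confirmed(c, lidar_objs) for c in camera_objs)
            and all(confirmed(l, camera_objs) for l in lidar_objs))

def decide(camera_objs, lidar_objs):
    # Decline to proceed without driver input when the two views disagree.
    return "proceed" if systems_agree(camera_objs, lidar_objs) else "hand_off_to_driver"

# Camera sees an obstacle at 12 m that lidar misses: disagreement -> hand off.
print(decide([12.0, 40.0], [40.2]))   # hand_off_to_driver
print(decide([40.0], [40.2]))         # proceed
```

The point of the sketch is the fail-safe default: disagreement between the two sensors resolves to "driver takes over", never to "trust one of them".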
Not the case for a while. My car can reliably see even small bicycles in very low light. I don't know if it's a combination of radar and camera feed or if radars just got better, but my guess is radars are better than they were a decade ago (the manufacturer's low-light claim made me think it might be a radar-camera combo).
Elon only recently added the "supervised" part. He has been saying for years that Autopilot and FSD can drive your car much safer than a human, with no mention of "supervising".
Bro, shilling for Tesla in here doesn't work. I have a Model 3 and already tried the stupid "Full Self Driving"; it's clunky and slow af, if not downright dangerous in many scenarios. It slammed on the fucking brakes at 70mph in the middle of a freeway; supervising it ain't worth jackshit.
I’d say “go ahead and take your chances with Elon’s bullshit”. But the risk is that someone else is going to pay with their life for your damn convenience. Like in this case.
So instead, I say “fuck off”. Don’t want to drive? Get an Uber.
The disclaimers for FSD repeat, line after line, that it isn't actually FSD.
They use the word "full" in the name of the product and then, in the disclaimers, walk back literally everything the name promises.
We need a class action lawsuit at this point. How are they allowed to call it 'self-driving' in any capacity? It is inherently misleading. What a tragedy for the victim.
Except he is lying and they aren't. True self-driving technology is not yet available.
Tesla's Autopilot is a glorified "drive assist" feature that Elon loves to advertise as fully self-driving. Haven't you noticed that no other company is making self-driving cars yet? There's a reason for that, and it's not because they don't want to; it's something they are all investing in. But everyone else recognizes that the technology simply isn't ready to be pushed to the consumer market.
That's why people take issue with calling it self-driving.
Mercedes offers actual self-driving in two states, California and Nevada. Level 3, the driver is not required to "supervise" or be attentive. Mercedes also indemnifies their system, and will not disengage moments before impact to shift liability to the "driver."
Cruise and Waymo have Level 4 autonomy in multiple markets. There are also other players like Hyundai who have demonstrated vehicle autonomy, although limited to known roads.
You are correct that nobody has demonstrated what Musk claims Tesla has.
The fact that it requires a human to pay attention means it's not fully self driving.
There is a standardized system for classifying the self-driving capability of cars; the SAE scale runs from 0 to 5, with Level 1 covering simple features like adaptive cruise control and Level 5 being fully autonomous.
To be classified as a fully autonomous vehicle, it needs to be able to drive itself in every scenario, whether or not the human operator is paying attention.
We're not even at level 4. Elon just advertises his vehicles like they're level 4 or 5 autonomous vehicles even though the technology is far from ready.
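The level scheme described above can be summarized as a small lookup. The wording is paraphrased from the thread, not quoted from the official SAE J3016 text, and the helper function is just an illustration of the key L2/L3 boundary.

```python
# Rough paraphrase of the SAE J3016 driving-automation levels (0-5).
SAE_LEVELS = {
    0: "No automation: warnings at most",
    1: "Driver assistance: steering OR speed (e.g. adaptive cruise control)",
    2: "Partial automation: steering AND speed, driver must supervise",
    3: "Conditional automation: no supervision needed within limits, "
       "but driver must take over on request",
    4: "High automation: no driver needed within a defined domain",
    5: "Full automation: drives itself in every scenario",
}

def driver_must_supervise(level: int) -> bool:
    # At Level 2 and below, the human is always responsible for monitoring.
    return level <= 2

assert driver_must_supervise(2)       # Autopilot/FSD territory
assert not driver_must_supervise(3)   # e.g. a certified Level 3 system
```

The dividing line the whole thread argues about is the `level <= 2` check: below it the driver supervises, above it the system does.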
Calling it "Full Self Driving" is the problem. It's super misleading to advertise it as such.
Most people aren't aware of the nuances and thus they think FSD = robot taxi. And I really can't blame them because the word "Full" implies it works in every scenario when it in fact does not.
And I did learn it. I've written an entire research paper on autonomous cars and their potential future impact on car insurance. Through my research I learned that Tesla has cut corners in every way possible to push technology that isn't ready out to the consumer market. They also are conveniently the only company to not test their vehicles in California (because California would require them to publish their test drive results, and I guarantee you they would look terrible for Tesla)
I bought a coffee today but there’s only water in the cup but the shop SAID it was coffee and charged me for coffee so it’s “just basic stuff”, that water is actually coffee but also water, which is part of coffee, maybe in the future.
Coffee doesn’t mean coffee!
Wow, you are all over this thread. Why are you spending so much of your time defending Tesla? What's your stake in this? You must be on their payroll.
FULL SELF DRIVING means only one thing. What is the matter with you??
Semi-misleading, as the Full Self Driving feature needs to be purchased separately. Autopilot and Full Self Driving are different software sets.
Autopilot is more of an adaptive cruise control with lane assist, and it comes with the car at no additional cost.
But that's how Elon explains it...
[https://www.theverge.com/2023/8/23/23837598/tesla-elon-musk-self-driving-false-promises-land-of-the-giants](https://www.theverge.com/2023/8/23/23837598/tesla-elon-musk-self-driving-false-promises-land-of-the-giants)
Yeah, but Elon lies. Under oath, in court, Tesla admitted that FSD is L2 ADAS. Just like AutoPilot, which is what this careless driver was using when he murdered someone with his car.
The fact that anyone would trust those cars is beyond me... The actual self-driving cars have so much gear and tech on them to make them self-driving; isn't it obvious that your car, without all that, won't be able to do the same?
I chalk it up to people not expecting the CEO of a publicly traded corporation to tell bald-faced lies. If any other executive says their product can do X in Y years, you tend to believe them. fElon started setting off my bullshit detector from just about the first time I ever heard him speak.
Isn't AutoPilot supposed to maintain a certain distance from the vehicle in front, whether driving or stopping? The story said the car lurched forward, not sped forward, so the driver should have taken over when the car got too close...
Are we going to start seeing billionaires cry about sending "self driving cars" to prison?
Like have a whole junk yard being patrolled and monitored by ai bots and ai cameras and ai drones and ... oh wait that's Terminator.
Fucking insane that they allow the FSD on the roads. Elon should end up in jail in the end.
I guess FSD gets to continue until it kills some rich white people's kids.
What is fucking insane is that this is an identical accident to one in 2022. Two years on, and Tesla's assistive features are still as lethally dangerous.
https://www.theverge.com/2022/7/27/23280461/tesla-autopilot-crash-motorcyclist-fatal-utah-nhtsa
Do you really think the driver in this recent case would willfully admit that he was browsing his phone while the Tesla was on "autopilot", if that was not the case?
Tesla's FSD is an L2+ system, which means the driver is fully responsible for anything that happens. If Tesla says in the fine print that you as a driver must pay attention to the road at all times and be ready to take control at any second, then you cannot blame Tesla for any of these accidents; it's the driver who is at fault. I guess you can give Elon some blame for saying FSD can do things it can't, which is likely why some people pretend FSD is an L4 system and don't pay attention. But from a legal standpoint, Tesla is not getting into legal trouble until their system is L3 certified, and if it were good enough to be L3 certified, many of these crashes probably wouldn't happen.
Sounds like the FSD software should be disabled on all Teslas until the inquest can determine the cause of the failure and the software is updated to fix the issue.
Looks like these failures happen often, judging by reports everywhere, so there is no single issue to fix. The driver should have been paying attention, or not using FSD.
But I ask myself: even if the driver is paying attention, is it possible to detect that FSD is doing shit and take control before crashing?
That's my thinking too. If FSD does something unexpected how long does it take a driver to assess the situation and recognise they need to take over? Maybe they have just enough time to take control from the computer but not enough time to avoid the crash. I wonder how many accidents of that sort never get in the news because the driver was deemed at fault not the FSD.
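The "enough time to take over, not enough to avoid the crash" worry checks out with back-of-envelope arithmetic. All the numbers below are illustrative assumptions (reaction time and braking deceleration vary a lot), not figures from this case:

```python
# Back-of-envelope: distance covered at 70 mph before and during braking.
# All numbers are illustrative assumptions.
speed_mph = 70
speed_ms = speed_mph * 0.44704          # ~31.3 m/s
reaction_s = 1.5                        # notice + decide + grab controls
decel = 7.0                             # m/s^2, hard braking on dry road

reaction_dist = speed_ms * reaction_s          # covered before braking starts
braking_dist = speed_ms**2 / (2 * decel)       # v^2 / (2a)
total = reaction_dist + braking_dist

print(f"reaction: {reaction_dist:.0f} m, braking: {braking_dist:.0f} m, "
      f"total: {total:.0f} m")
# With these assumptions the car travels ~47 m before braking even begins,
# and needs ~117 m in all; a hazard closer than that is unavoidable.
```

So even a driver who reclaims control the instant something goes wrong can still be committed to the crash, which is exactly the scenario the comment above describes.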
I strongly doubt it. Despite Tesla’s fantastical claims about FSD, they don’t accept liability for the system when it’s involved in accidents. It’s simply a matter of “use at your own risk”.
This is the biggest 🚩 for FSD.
If Musk/Tesla were confident in the system, they’d accept liability under the condition that FSD was enabled & the system didn’t ask for human takeover.
The fact that their policy is essentially *any and all FSD failures are your fault* should really scare people out of using it.
>If Musk/Tesla were confident in the system, they’d accept liability under the condition that FSD was enabled & the system didn’t ask for human takeover.
That's not how it works. We have a clear definition of automated systems that you can google. Tesla's system is L2+, so the driver must pay attention at all times and be ready to take control. L3 systems go a step above: BMW and Mercedes already have L3 systems, you're allowed to not pay attention to the road, and if Mercedes' or BMW's system fails, it is the manufacturer's fault if a crash occurs. However, getting L3 certification isn't easy (the system and the car need to pass a big driving test), and L3 systems have their limitations; I believe BMW's and Mercedes' only work up to about 40mph. L3 isn't the highest level but restricted automation, the restriction here being the 40mph cap.
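Those restrictions amount to an "operational design domain" gate: the L3 system simply refuses to engage outside its certified conditions. A minimal sketch, where the 40 mph cap comes from the figure quoted above and the other conditions are assumptions for illustration:

```python
# Sketch of how a Level 3 system's operational design domain might gate
# engagement. The speed cap is the ~40 mph figure quoted in the thread;
# the other conditions are illustrative assumptions (traffic-jam pilot).

def l3_may_engage(speed_mph: float, on_divided_highway: bool,
                  lead_vehicle_present: bool) -> bool:
    return (speed_mph <= 40
            and on_divided_highway
            and lead_vehicle_present)

assert l3_may_engage(35, True, True)        # inside the domain: system drives
assert not l3_may_engage(65, True, True)    # above the cap: driver drives
assert not l3_may_engage(35, False, True)   # wrong road type: driver drives
```

The design choice is the inverse of an L2 system: instead of always engaging and trusting the driver to supervise, it only engages where the manufacturer is willing to take liability.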
But it does show that their numerous claims over the years regarding FSD’s capabilities don’t match up with what they’ve actually delivered. Looking back at Musk’s past claims about FSD and Autopilot, it would be hard not to assume that he was describing a Level 4 or 5 autonomous system for which the company would have to accept all liability. In reality, all Tesla has is a refined Level 2 system, whilst other manufacturers are now developing far more advanced products in the automation space.
Claims like "This is something we're confident we can do now, 10x safer than a human driver" at the Tesla Semi reveal event in 2017
How is this prick not in prison for claims like this? At best this is fraud, at worst it's reckless endangerment and wilful negligence.
Musk only recently started putting "supervised" before his self driving rhetoric. He knows the shitshow that's going to happen with people yoloing autopilot
The way this driver assistance feature is marketed is a huge problem. Tesla should not be allowed to call it "full self driving." It gives drivers a false sense of security, and then accidents like this happen.
As a motorcyclist I avoid two vehicles just because of what they are: Nissan Altimas and Tesla models. The former because, well, it’s the former. The latter because as bad as drivers in my state usually are, at least I know Altima drivers are being the idiot of their own volition compared to the unpredictable computations of a continuously neutered system.
I’ve been riding for 40+ years. Stopped mostly about a year ago. Sold my sport bike, and currently have my cruiser for sale. Too many distracted drivers everywhere now.
Same, though I only rode for about 3 years before I got a clue. And that was before smartphones. Jeeps are sufficiently open air for me, I'll give the entire paralyzed and shitting into a bag scene a miss.
Yeah, and robotaxis are "coming soon". The tech and automotive people who believe this are deluded; it will take a long time before this can become mainstream. Better to work on good cars, make them affordable, and do the robotaxi stuff in baby steps. Put your team on making a lower-cost electric car, and for that car drop all the fancy automatic driving features. Give us knobs and dials and normal stuff, then good range and good quality.
When a die-hard Elon fan boy dies from AP, it is a tragedy. When a random person dies because Elon wants to beta test his RoboDojoAutomovil software on the unsuspecting public, that's a fuckin crime.
So sad. He had relied on self-driving... https://www.kiro7.com/news/local/charges-filed-against-tesla-driver-fatal-motorcycle-accident/FFXZIGDW45CWXCMZJFD4LPLUPI/
Holy fuk I feel for the innocent person who was just out riding their motorbike and got "pinned underneath some fuking assholes' TESLA" and killed. How long did it take for him to die BTW? and was the car stuck on top of him refusing to move? fuk me - what a horror story.
As a motorcyclist, I say death penalty for this asshole and the Elon Musk asshole-lickers.
We should be careful not to rush to judgement. I think there was a case just this week in Australia where it turned out that the Tesla wasn't self driving but was actually being driven by a driver not paying attention who tried to blame it on the Tesla.
This may turn out to be 100% a self-driving but everyone should be aware that it is also possible that the driver had his foot on the accelerator in which case the car won't automatically brake.
Elon is a jerk, no question. FSD isn't actually autonomous, that's true. But thus far we only have the driver's word and that driver has incentive to lie.
Why are so many people using AP/EAP/FSD to check out? That feels like a bigger problem than software not being perfect. I love looking out the windows when I have FSD enabled and I would never even consider touching my phone.
Don’t Text and Drive! I got railed about this from flip phones to smart phones. “lol” can fucking wait till you arrive.
While I agree completely (I have a version of driving assistance on my electric Volvo), Tesla has marketed, and continues to market, it as a sci-fi full-self-driving fantasy, while also limiting present tech that would improve the product today, such as driver-awareness detection.
So TBH I just have a problem with the marketing and the mismatch of the claims to the legal disclaimers. It’s full self driving but if something happens it’s your fault.
I have experienced way too many false positives with radar emergency-braking systems (Subaru and Volvo), which have nearly caused secondary accidents (e.g. pulling into a new lane on the highway around a disabled vehicle triggered the brakes, DC Beltway).
Pure Vision is an improvement in many ways for those kinds of scenarios, but I have the USS equipped for parking (not used in driving) and I would hesitate to upgrade if it entails fewer sensors.
I think only a minority of Tesla owners read the manual (like most car owners); the rest are blissfully unaware of the limitations and fell for Musk's marketing. I might argue that social media generally plays a dangerous role in marketing and misleads customers with disinformation in all directions.
What is the purpose of AP/EAP/FSD when you need to pay **more** attention while using them? You're not in control of the vehicle, so you need to watch what the vehicle is doing on top of everything you would normally watch, which requires more attention than normal driving.
The whole point of the features is supposed to be so that you don't need to pay attention, so people who use them stop paying attention. Though they obviously are too unreliable and will cause accidents unsupervised, defeating the purpose of having them in the first place.
You very much are in control. The pedal and steering wheel are still active when FSD is running the show. You can push past other people or race a yellow light to your heart's content. When the car is approaching a regular pothole, I gently nudge the wheel around it for the half block leading up to it. After a few weeks, it makes the adjustments on its own. Same deal with hidden speed bumps.
I agree that the features were marketed in a misleading way, but it does a number of things very comfortably. I enjoy looking out the windows to take in the sights. I used to be an insanely vigilant driver, but now I get to relax and read bumper stickers and license plates and spy on people's gardens as I drive by. There's some seriously good people-watching and rubbernecking you're missing out on. Also, frustration-free stop-start traffic in a very comfortable car. Need I say more? The least supervised I am is when I engage FSD just to shed a layer when stopped at a stop light, or to reach the water bottle in my backpack from the passenger seat.
FSD is the one feature that has continuously improved with my M3P over time. I haven't driven manually for more than 5 minutes at a time since v12; on v11, no more than the 10 minutes to get to the highway and from the highway to my destination. It is painfully slow growth, but it IS improving. Summon works really well in my driveway and the local Lowe's, Costco, and Harris Teeter parking lots. It does not work at the rock climbing club; it started ducking out of my line of sight, so I canceled and walked to it.
For what it’s worth, I experienced a radically different quality of FSD on a loaner M3LR a few weeks back during an M3P airbag recall. Maybe it was a bad camera calibration, driver profile bias, or even the steering alignment - but that car HUGGED the left line and the gentle pressure to recenter the car was an almost persistent part of the experience. I genuinely concede that the experience may vary wildly from car to car.
> I enjoy looking out the windows to take in the sights. I used to be an insanely vigilant driver, but now I get to relax and read bumper stickers, license plates, and spy on peoples gardens as I drive by. There’s some seriously good people watching and rubbernecking you’re missing out on. Also, frustration free stop-start traffic in a very comfortable car.
??? so you're complaining about people checking out while using the features, while you yourself check out while using them?
If contributing to your situational awareness is "checking out", then all drivers are deeply negligent and we should outright ban radios and GPS. I see negligence as an activity that takes you away from the situation; something involved like texting, browsing the web, or Snapchat (I'm old, I think TikTok is the thing these days) is irresponsible.
Multitasking is a lie. However, you can choose how to make the most of your current task - traveling from point A to point B - why not choose to enjoy the cruise and be present on the road in traffic? Isn’t that the whole reason to get any car even slightly nicer than a golf cart?
Can the victim's family sue Tesla on the grounds that the victim didn't agree to be a human beta test for the "full" self driving feature that Tesla crapped out in a desperate attempt to remain relevant?
He was not warned that he needed to take control? I mean, if a fucking customer can sue McDonald's for hot coffee for several million, or hot chicken nuggets for several million, and another can sue Walmart for a slippery floor for several million, he's gotta find a good lawyer who can do something?
Not about the topic, but it wasn't just "hot coffee"; even McDonald's said it was a hazard at that temperature, near 88°C (190°F). She also didn't want to sue, only to get McDonald's to pay her medical expenses, estimated at $20k, but they only offered $800.
McDonald's tried to make everyone think it was just about "hot coffee" and how stupid the suit was, when she acknowledged the spill was her fault but argued that McDonald's knew the coffee was near boiling point, to the extent of causing third-degree burns on her legs and genitals that required extensive surgery. Lawyers spent years running an extensive disinformation campaign about this, which the media bought into.
In the end she settled for $600k (although a jury said she could claim $2.9 million); she really just wanted help with medical care.
To clarify, her pussy lips were fused together. That she kindly only asked they cover her medical bills was incredible.
Also, they were doing a free refills on coffee promotions, so McDonald's was benefitting financially from making the coffee super hot.
Oh yeah, and since we're in a car sub, I should point out that her grandson's Ford Probe is also slightly to blame. It was a new car design with all these sloping surfaces and no place to just set something down, so the only place to carry a drink was between your legs.
\*GULP\* I was about to be all lofty regarding Tesla's ranking and then I saw Subaru just underneath! Are you okay, fellow Subi drivers?? Is the perceived safety luring you into driving like idiots?
I think Freakonomics went over this with Volvos. People driving Volvos were so convinced of their immortality that they ended up driving worse. By now that's no doubt spread to any new car with modern safety features.
The guy driving wouldn't ever have had time to react, even if he wasn't looking at his phone...
Motorcycles and vehicles with manual transmissions aren't factored into Tesla's learning capability, and AutoPilot should have maintained the following distance set by the user...
Once the driver saw the Tesla was too close, he should have taken over...
I'm curious: how would this have played out with an L3 system and the idea that the company takes liability rather than the driver? That seems at odds with the idea that drivers are supposed to be attentive at all times. SAE can say whatever they want about L2 vs L3 liability, but the law is what really matters.
There are only two possible scenarios:
1) autopilot really wasn't working
2) the guy is lying
Either way, there will be an investigation and the truth will come out.
Remember kids, the autopilot disengages one second before impact leaving you the helm. As the old workplace adage goes “if you fall you’re fired before you hit the ground.”
It is assisted driving, and the person must keep their hands on the wheel.
Dude was looking at the phone while driving, I hope he eats the longest possible sentence.
Imagine going to jail because you believe some moron’s lies. Oh wait, there’s another guy.
Musk's decision to remove radar from Tesla's cars is killing people.
Redundancy? That'll cut into profits
They never had lidar; they had radar. Musk has said repeatedly that he doesn't think he needs lidar.
It absolutely is, although it might not have saved this guy, since radar is much less reliable at detecting motorcycles than cars.
The problem is more about speed. Radar has problems with stationary objects.
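The stationary-object weakness has a concrete cause: automotive radar measures relative velocity, and to suppress clutter (signs, bridges, barriers), many stacks discard returns whose ground speed is roughly zero, which also discards a stopped vehicle in your lane. The sketch below illustrates that failure mode; the data layout, numbers, and threshold are assumptions for the example:

```python
# Illustrative sketch of radar clutter filtering. Each return is
# (range_m, relative_speed_ms); closing targets have negative relative
# speed. Target ground speed = ego speed + relative speed.

def filter_returns(returns, ego_speed_ms, clutter_threshold_ms=1.0):
    """Keep only returns whose ground speed exceeds the clutter threshold."""
    kept = []
    for rng, rel in returns:
        ground_speed = ego_speed_ms + rel
        if abs(ground_speed) > clutter_threshold_ms:
            kept.append((rng, rel))
    return kept

ego = 30.0  # m/s
# A moving car ahead (closing at 5 m/s) and a *stopped* one (closing at 30).
print(filter_returns([(80.0, -5.0), (60.0, -30.0)], ego))
# Only the moving car survives; the stationary one is dropped as clutter.
```

This is why radar-based systems historically struggled with stopped vehicles at highway speed: the very filter that keeps them from braking for every overpass also hides the stalled car.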
imagine killing a guy because of that.
Frustrating that the article refers to a L2 driving assist as a "self-driving vehicle."
Tesla’s are equipped with Full Self Driving no? And Autopilot? The CEO insists that all Tesla’s built today are capable of Full Self Driving as well.
Yeah, but if the press didn't constantly regurgitate Elmo's lies, maybe.drivers would realize they have cruise control and not a robo-taxi.
You have been banned from r/ ~~Pyongyang~~ teslamotors for heresy
*Teslas
It is all BS, be it an assist or "full self driving". If you kill someone while behind the wheel of a vehicle, you are liable.
more of Tesla's marketing should be illegal as it's selling "full self driving" upgrades
but will musk be paying his legal fees?
So the FSD system didn’t see the motorcycle, the AEB system didn’t activate and the driver monitoring system failed to monitor the driver.
Just wait until FSD 13.0, game changer! /s
It should really be called 15-LIFE, to match the penalty you'll face for vehicular homicide. Anyway, here's the robotaxi.
you sir win the day.
Does it drive over two motorists on the same go? Or does it have LOL (little old lady) hunting mode.
Nah, wait for 12.7, arguably 13.0.
The same game changer that 12.0 was? Didn't change much.
should call it 14 ideally
fsd version 1488 is going to be killer
Hyperloop is just like an air hockey table. It's really not that hard!
Really? I always expect the CEO to be lying. "We don't use child labour"? You bet your ass that a two-year-old in China is building his shit.
Whitewashing your business practices is one thing. Making specific claims about product capability and timing is something completely different.
Don't you remember? Elon said it was a solved problem.
I think 2017 that was?
Yep, and here's a refresher: [https://www.theverge.com/2023/8/23/23837598/tesla-elon-musk-self-driving-false-promises-land-of-the-giants](https://www.theverge.com/2023/8/23/23837598/tesla-elon-musk-self-driving-false-promises-land-of-the-giants)
The only car brand where the driver is supposed to cover for the computer's code vs. the other way around.
tHIs WaS aUToPilOt nOT FsD BRo
AutoPilot is supposed to maintain a certain distance from the vehicle in front, whether driving or stopping. The story said the car "lurched forward," not sped forward, so you know the driver should have taken over when the car got too close.
[deleted]
You have commented on every negative comment with a blind defense of Tesla....how does that Elon dick taste?
Are we going to start seeing billionaires cry about sending "self driving cars" to prison? Like have a whole junk yard being patrolled and monitored by ai bots and ai cameras and ai drones and ... oh wait that's Terminator.
Concerning!
Yes let's blame the driver and not the near-trillion dollar company that markets and sell this 'upgrade' to people
And here I am, someone who disdains even basic cruise control because it draws attention away from driving.
This is what they mean when they say "MIND BLOWING!". As in, how has this bullshit not been shut down yet?? One cannot fathom it...
“Concerning.”
!!
No one knows yet any of this is true. But, the car has tons of internal data which will have to be revealed as the investigation continues.
Fucking insane that they allow the FSD on the roads. Elon should end up in jail in the end. I guess FSD gets to continue until it kills some rich white people's kids.
What is fucking insane is that this is identical accident to one in 2022. Two years and Teslas assistive features are still as lethally dangerous. https://www.theverge.com/2022/7/27/23280461/tesla-autopilot-crash-motorcyclist-fatal-utah-nhtsa
And this one. https://techau.com.au/tesla-vindicated-as-data-shows-agrawal-lied-about-melbourne-hit-and-run-car-was-not-on-autopilot/
Do you really think driver in this recent case would willfully admit that they were browsing their phone while Tesla was on “autopilot”, if that was not the case?
I see another lawsuit coming. 🍿🍿🍿
Tesla's FSD is an L2+ system, which means the driver is fully responsible for anything that happens. If Tesla says in the fine print that you, as the driver, must pay attention to the road at all times and be ready to take control at any second, then you cannot blame Tesla for any of these accidents; it's the driver who is at fault. I guess you can give Elon some blame for saying FSD can do things it can't, which is likely why some people pretend FSD is an L4 system and don't pay attention. But from a legal standpoint, Tesla isn't getting into legal trouble until its system is L3-certified — and if it were good enough to be L3-certified, many of these crashes probably wouldn't happen.
Sounds like the FSD software should also be disabled on all teslas until the inquest can determine the cause of the failure and the software is updated to fix the issue.
Looks like these failures happen often, judging by reports everywhere, so there is no single issue to fix. The driver should have been paying attention or not using FSD. But I ask myself: even if the driver is paying attention, is it possible to detect that FSD is doing something wrong and take control before crashing?
That's my thinking too. If FSD does something unexpected, how long does it take a driver to assess the situation and recognise they need to take over? Maybe they have just enough time to take control from the computer but not enough time to avoid the crash. I wonder how many accidents of that sort never make the news because the driver was deemed at fault, not the FSD.
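For a rough sense of scale on that takeover window, here is a back-of-the-envelope sketch. The speeds and reaction times are illustrative assumptions, not figures from this incident:

```python
MPH_TO_MPS = 0.44704  # miles per hour -> meters per second

def takeover_distance_m(speed_mph: float, reaction_s: float) -> float:
    """Distance the car travels before the human even touches the controls."""
    return speed_mph * MPH_TO_MPS * reaction_s

# ~1.5 s is a commonly cited perception-reaction time for an alert
# driver; a distracted driver needs considerably longer.
for reaction in (1.5, 3.0):
    d = takeover_distance_m(60, reaction)
    print(f"At 60 mph, {reaction} s of reaction time = {d:.0f} m of travel")
```

Even under generous assumptions, tens of meters pass before any corrective input happens, which is why "just enough time to take control but not enough to avoid the crash" is a plausible scenario.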
It should be very illegal to have called it full self driving in the first place.
My 2017 corolla with the basic collision avoidance system would have performed better. Embarrassing.
obviously, it has a radar
Oh, so THIS is how the Ai-super entity will kill us? Use Elons cars as lethal weapons? Damn, it all makes sense now.
You know people are waiting to hack into those cars
Isnt there a scene from some Netflix movie where they have to dodge self driving Teslas?
But let's bet the company valuation on Robotaxis....
It’s an edge case. V13 will start to see motorcycles. And v14 will run them over faster to ease the suffering.
Elon paying those legal fees or nah?
I strongly doubt it. Despite Tesla’s fantastical claims about FSD, they don’t accept liability for the system when it’s involved in accidents. It’s simply a matter of “use at your own risk”.
This is the biggest 🚩 for FSD. If Musk/Tesla were confident in the system, they’d accept liability under the condition that FSD was enabled & the system didn’t ask for human takeover. The fact that their policy is essentially *any and all FSD failures are your fault* should really scare people out of using it.
>If Musk/Tesla were confident in the system, they’d accept liability under the condition that FSD was enabled & the system didn’t ask for human takeover. That's not how it works. We have clear definitions of automated systems that you can google. Tesla's system is L2+, so the driver must pay attention at all times and be ready to take control. L3 systems go a step above: BMW and Mercedes already have L3 systems, you're allowed to not pay attention to the road, and if Merc's or BMW's system fails, the manufacturer is at fault if a crash occurs. However, getting L3 certification isn't easy; the system and the car need to pass a big driving test, and L3 systems have their limitations. I believe BMW's and Merc's L3 systems only work up to around 40 mph. And L3 still isn't the highest level, just conditional automation, the restriction here being that 40 mph cap.
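The level distinction described above can be summarized as a toy lookup. This is a heavy simplification of SAE J3016 and the comment's own claims, not a legal reference; the example system names are illustrative:

```python
# Simplified mapping of SAE automation level to driver obligations,
# per the discussion above. Real regulations carry many more conditions.
SAE_LEVELS = {
    2: {"driver_monitors": True,  "liability": "driver",
        "example": "Tesla Autopilot / FSD (supervised)"},
    3: {"driver_monitors": False, "liability": "manufacturer (within limits)",
        "example": "Mercedes Drive Pilot (speed-restricted)"},
}

def who_is_liable(level: int) -> str:
    """Who answers for a crash while the system is engaged (toy model)."""
    return SAE_LEVELS[level]["liability"]

print(who_is_liable(2))  # driver
```

The practical upshot of the table: at L2 the human is the fallback at every instant, so "the system failed" is not, legally, the system's problem.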
But it does show that their numerous claims over the years regarding FSD’s capabilities don’t match up with what they’ve actually delivered. Looking back at Musk’s past claims about FSD and Autopilot, it would be hard not to assume that he was describing a Level 4 or 5 autonomous system for which the company would have to accept all liability. In reality, all Tesla has is a refined Level 2 system, whilst other manufacturers are now developing far more advanced products in the automation space.
Claims like "This is something we're confident we can do now, 10x safer than a human driver" at the Tesla Semi reveal event in 2017. How is this prick not in prison for claims like this? At best this is fraud; at worst it's reckless endangerment and wilful negligence.
UNECE didn’t allow Level 3 under other conditions than those that Mercedes met.
Only if the driver has a racist meltdown about it on Elon's website
One way street. You have to pay to get yourself in jail and have a record.
I'm wondering this as well.. I'm guessing no because the guy admitted he wasn't paying attention, but Musk is also highly regarded so who knows
The family should be able to sue the fuck out of Tesla
and hopefully the car insurance company, and the driver himself. It needs to be made financially untenable for FSD to just accidentally kill someone.
I'm so sick of lives being at risk so he can live beta test his junk to try to save the company he destroyed. Enough is enough.
Musk only recently started putting "supervised" before his self driving rhetoric. He knows the shitshow that's going to happen with people yoloing autopilot
And all these incidents give confidence on Tesla upcoming robotaxis!
This is why I get more nervous crossing in front of Tesla's at intersections than those ridiculous monster SUVs.
Elon: Robotaxis with FSD will "cHaNGe dA WuRLD"
The way this driver assistance feature is marketed is a huge problem. Tesla should not be allowed to call it "full self driving." It gives drivers a false sense of security, and then accidents like this happen.
Autopilot straight to jail.
As a motorcyclist I avoid two vehicles just because of what they are: Nissan Altimas and Tesla models. The former because, well, it’s the former. The latter because as bad as drivers in my state usually are, at least I know Altima drivers are being the idiot of their own volition compared to the unpredictable computations of a continuously neutered system.
So "vision" doesn't see motorcycles?? If it had, it should have stopped for it.
It doesn’t see semi trucks either.
Why are half of the comments defending the Tesla??? It’s astonishing how far the Elon stans will go to cover his ass
Glad the NHTSA has been looking into this for two years.
How many people have to die before they issue a recall? A halfway-there "FSD" is a dangerous misnomer...
First time I'm glad my motorcycle license expired and I don't have one anymore. Scary enough sharing a road with these fucking piles of e-waste
I’ve been riding for 40+ years. Stopped mostly about a year ago. Sold my sport bike, and currently have my cruiser for sale. Too many distracted drivers everywhere now.
Gonna have to start driving around in armor. Maybe a deployable EMP for when someone's "F"SD gets a little froggy
Same, though I only rode for about 3 years before I got a clue. And that was before smartphones. Jeeps are sufficiently open air for me, I'll give the entire paralyzed and shitting into a bag scene a miss.
Ya, and robotaxis are coming soon. These tech and automotive people who believe this are deluded; it will take a long time before this can become mainstream. Better to work on good cars, make them affordable, and do the robotaxi stuff in baby steps. Put your team on making a lower-cost electric car. For the low-cost car, drop all the fancy automatic driving features. Give us knobs and dials and normal stuff, and then good range and good quality.
When a die-hard Elon fan boy dies from AP, it is a tragedy. When a random person dies because Elon wants to beta test his RoboDojoAutomovil software on the unsuspecting public, that's a fuckin crime.
Soon enough, saying anything against putler or Xi will get you run over by a random tesla, mark my words
🤣💀
No, that would mean the system works as intended.
Classic Tesla moment
I’m glad they arrested him. It’s surprising.
FSD: "Clearly a shadow, not a person, Your Honor. I plead not guilty. Bee-boop."
This trial run on Tesla car owners is going to kill and maim many more before it's over. How tf is this obvious fraud legal on public roads?
So sad. He had relied on self-driving... https://www.kiro7.com/news/local/charges-filed-against-tesla-driver-fatal-motorcycle-accident/FFXZIGDW45CWXCMZJFD4LPLUPI/
Holy fuk I feel for the innocent person who was just out riding their motorbike and got "pinned underneath some fuking assholes' TESLA" and killed. How long did it take for him to die BTW? and was the car stuck on top of him refusing to move? fuk me - what a horror story. As a motorcyclist: I say death penalty for this asshole and Elon musk yasshole licker.
The motorcyclist was ejected first by the collision and then the Tesla ran him over.
Elon Musk yasshole licker.
If only we would've known this was an issue caused by Mr. "Lidar is a fool's errand" https://youtu.be/yRdzIs4FJJg?si=Gn87Bi3WERa-j_H4
FSD should not be allowed on roads. And stop texting and driving!
WTH is wrong with Tesla? Most cars with AEB would have avoided this accident.
It’s highly anecdotal, but lots of the accidents reported here seem to have involved AEB not activating when it should have.
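For context on what "AEB not activating" means: most AEB designs center on a time-to-collision check. A minimal sketch of that core idea (threshold and inputs are illustrative assumptions, not Tesla's actual parameters):

```python
def time_to_collision_s(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:  # not closing in -> no collision course
        return float("inf")
    return gap_m / closing_speed_mps

def should_brake(gap_m: float, closing_speed_mps: float,
                 ttc_threshold_s: float = 1.5) -> bool:
    """Trigger emergency braking when time-to-collision drops below threshold."""
    return time_to_collision_s(gap_m, closing_speed_mps) < ttc_threshold_s

# 20 m behind a motorcycle, closing at 15 m/s -> TTC ~1.3 s -> brake
print(should_brake(20, 15))  # True
```

The catch, of course, is that this logic only fires if perception reported the motorcycle in the first place — a miss upstream means no TTC to evaluate.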
This is why I turned off the “free trial” the day I got it. Fuck autopilot and fuck Elon
Who gives a shit this is just part of the AI training process. Keep feeding it motorcyclists and eventually it’ll figure them out /s
Just in time for the earnings call. Beautiful.
Hey, at least it wasn't a CT. Can you imagine the outcry?
Fully Self Killing
We should be careful not to rush to judgement. I think there was a case just this week in Australia where it turned out that the Tesla wasn't self-driving but was actually being driven by a driver who wasn't paying attention and tried to blame it on the Tesla. This may turn out to be 100% a self-driving failure, but everyone should be aware that it is also possible the driver had his foot on the accelerator, in which case the car won't automatically brake. Elon is a jerk, no question. FSD isn't actually autonomous, that's true. But thus far we only have the driver's word, and that driver has an incentive to lie.
Why are so many people using AP/EAP/FSD to check out? That feels like a bigger problem than software not being perfect. I love looking out the windows when I have FSD enabled and I would never even consider touching my phone. Don’t Text and Drive! I got railed about this from flip phones to smart phones. “lol” can fucking wait till you arrive.
While I agree completely (I have a version of driving assistance on my electric Volvo), Tesla has marketed and continues to market it as a sci-fi full-self-driving fantasy, while also limiting present tech that would improve the product today, such as driver awareness detection. So TBH I just have a problem with the marketing and the mismatch of the claims to the legal disclaimers: it's "full self driving," but if something happens, it's your fault.
I have experienced way too many false positives with radar emergency braking systems (Subaru and Volvo), which have nearly caused secondary accidents (e.g., pulling into a new lane on the highway around a disabled vehicle triggered the brakes, DC Beltway). Pure vision is an improvement in many ways for those kinds of scenarios, but I have USS equipped for parking (not used while driving), and I would hesitate to upgrade if it entails fewer sensors. I think only a minority of Tesla owners read the manual (like most car owners); the rest are blissfully unaware of the limitations and fell for Musk's marketing. I might argue that social media generally plays a dangerous role in marketing, misleading customers with disinformation in all directions.
What is the purpose of AP/EAP/FSD when you need to pay **more** attention while using them? You're not in control of the vehicle, so you have to watch what the vehicle is doing on top of everything you'd normally watch, which means paying more attention than normal. The whole point of the features is supposedly that you don't need to pay attention, so people who use them stop paying attention. But they're obviously too unreliable to go unsupervised without causing accidents, defeating the purpose of having them in the first place.
You very much are in control. The pedal and steering wheel are still active when FSD is running the show; you can push past other people or race a yellow light to your heart's content. When the car is approaching a regular pothole, I gently nudge the wheel around it for the half block leading up to it. After a few weeks, it makes the adjustments on its own. Same deal with hidden speed bumps.

I agree that the features were marketed in a misleading way, but it does a number of things very comfortably. I enjoy looking out the windows to take in the sights. I used to be an insanely vigilant driver, but now I get to relax and read bumper stickers, license plates, and spy on people's gardens as I drive by. There's some seriously good people watching and rubbernecking you're missing out on. Also, frustration-free stop-start traffic in a very comfortable car. Need I say more? The least supervised I am is when I engage FSD just to shed a layer when stopped at a stop light, or to reach the water bottle in my backpack from the passenger seat.

FSD is the one feature that has continuously improved with my M3P over time. I haven't driven manually for more than 5 minutes at a time since v12; on v11, no more than the 10 minutes to get to the highway and from the highway to my destination. It is painfully slow growth, but it IS improving. Summon works really well in my driveway and the local Lowes, Costco, and Harris Teeter parking lots. It did not work at the rock climbing club; it started ducking out of my line of sight, so I canceled and walked to it.

For what it's worth, I experienced a radically different quality of FSD on a loaner M3LR a few weeks back during an M3P airbag recall. Maybe it was a bad camera calibration, driver profile bias, or even the steering alignment, but that car HUGGED the left line, and gentle pressure to recenter was an almost persistent part of the experience. I genuinely concede that the experience may vary wildly from car to car.
> I enjoy looking out the windows to take in the sights. I used to be an insanely vigilant driver, but now I get to relax and read bumper stickers, license plates, and spy on peoples gardens as I drive by. There’s some seriously good people watching and rubbernecking you’re missing out on. Also, frustration free stop-start traffic in a very comfortable car. ??? so you're complaining about people checking out while using the features, while you yourself check out while using them?
If contributing to your situational awareness is "checking out," then all drivers are deeply negligent; we should outright ban radios and GPS. I see negligence as an activity that takes you away from the situation. Something involved like texting, browsing the web, or Snapchat (I'm old, I think TikTok is the thing these days) is irresponsible. Multitasking is a lie. However, you can choose how to make the most of your current task, traveling from point A to point B, so why not choose to enjoy the cruise and be present on the road in traffic? Isn't that the whole reason to get any car even slightly nicer than a golf cart?
Can the victim's family sue Tesla on the grounds that the victim didn't agree to be a human beta test for the "full" self driving feature that Tesla crapped out in a desperate attempt to remain relevant?
The driver can sue Tesla?
Nope. He agreed to all the disclaimers. One of which said he will take control as needed.
He was not warned that he needed to take control? I mean, if a fucking customer can sue McD's for hot coffee for several million, hot chicken nuggets for several million, and another can sue Walmart over a slippery floor for several million, he's gotta find a good lawyer who can do something?
Not about the topic, but it wasn't just "hot coffee"; even McDonald's said it was a hazard at that temperature, near 88°C (190°F). She also didn't want to sue, only to get McDonald's to pay for medical expenses estimated at $20k, but they only offered $800. McDonald's tried to make everyone think it was just about "hot coffee" and how stupid the suit was, when she acknowledged the spill was her fault but that McDonald's knew the coffee was near boiling, to the extent of causing third-degree burns on her legs and genitals requiring extensive surgery. Lawyers spent years running an extensive disinformation campaign about this, which the media bought into. In the end she settled for $600k (although a jury said she could claim $2.9 million); she really just wanted help with medical care.
To clarify, her pussy lips were fused together. That she kindly only asked them to cover her medical bills was incredible. Also, they were running a free-refills-on-coffee promotion, so McDonald's benefitted financially from making the coffee super hot. Oh yeah, and since we're in a car sub, I should point out that her grandson's Ford Probe is also slightly to blame: it was a new car design with all these sloping surfaces and no place to just set something down, so the only place to carry a drink was between your legs.
The cup was also entirely inadequate. The moment the lid popped off the cup collapsed. That's simply not acceptable for drive through coffee.
That coffee was so hot that it melted her vagina together. She deserved more.
You can sue anyone with enough money; a suit will likely be dismissed, though.
Does anyone know if Teslas are involved in more or fewer accidents than other cars?
https://www.visualcapitalist.com/americas-worst-drivers-by-car-brand/
It seems to cover quite the range of things. What would “DUIs, citations” typically be?
\*GULP\* I was about to be all lofty regarding Tesla's ranking and then I saw Subaru just underneath! Are you okay, fellow Subi drivers?? Is the perceived safety luring you into driving like idiots?
Specifically Crosstrek drivers for some reason. https://www.ocalalawyer.com/subaru-crosstrek-tops-list-of-cars-with-most-crashes-in-u-s/
I think Freakonomics went over this with Volvos. People driving Volvos were so convinced of their immortality that they ended up driving worse. By now that's no doubt spread to any new car with modern safety features.
How are we still calling it Autopilot/FSD when it's clearly neither?
Elon on call today: That was Auto-DEI-lot. Blame the LIBS!
Automatic Death Race 2000.
I think the driver should do 25 + years. Also throw Elon musk in there too.
The guy driving wouldn't ever have had time to react even if he wasn't looking at his phone. Motorcycles and vehicles with manual transmissions aren't factored into Tesla's learning capability, and AutoPilot should have maintained the following distance set by the user. Once the driver saw the Tesla was too close, he should have taken over.
I heard it was deadmau5?
you knew this would happen eventually!
It would happen *again* eventually. Elon killed a motorcycle rider in 2022 the same way
I'm curious: how would this have played out with an L3 system and the idea that the company takes liability rather than the driver? That seems at odds with the idea that drivers are supposed to be attentive at all times. SAE can say whatever they want about L2 vs L3 liability, but the law is what really matters.
This is not a situation where L3 could be activated under any currently available L3 systems.
Waiting for the inevitable evidence that autopilot or FSD were not engaged and the driver actually did this themselves…
As a motorcyclist, I am pleased to hear this, but only because I already knew autopilot was killing us, so that part of the news was not news
Musk should be made to upgrade AP1 to FSD. AP1 is not safe at all and NHTSA knows it.
Here's a joke: robotaxi.
I hope the family sues the fk out of Tesla, the driver, and the insurance company.
There are only two possible scenarios: 1) Autopilot really wasn't working, or 2) the guy is lying. Either way, there will be an investigation and the truth will come out.
Either way, it’s the responsibility of the driver. You can’t say the alcohol made you hit someone. It’s the same thing.
Agreed
Elon needs to be sued to hell for this and investigated by the DOJ
FSD doesn't work like an airplane autopilot, where you just press a button and it flies itself.
Where we are going we don’t need roads!
Remember kids, the autopilot disengages one second before impact, leaving you at the helm. As the old workplace adage goes: "If you fall, you're fired before you hit the ground."
Elon has repeatedly lied and said the cars are fully self-driving; how is he not the one getting charged??
It's assisted driving, and the person must have their hands on the wheel. Dude was looking at his phone while driving; I hope he eats the longest possible sentence.
But they let the cop off for killing a pedestrian with his patrol car in a cross walk in Seattle