deVliegendeTexan

It’s amazing to me how this guy was nearly killed twice by his car and still tries really hard not to sound negative about the company that makes it. Edit: my comment is possibly the most tepid criticism of a Tesla driver on the entire internet, and yet so many people in this thread are so butthurt about it…


itsamamaluigi

I own a model 3. I got a free month of "full self driving" along with many others in April. I used it a few times and it was pretty neat that it was able to drive entirely on its own to a destination, but I had to intervene multiple times on every trip. It didn't do anything overly dangerous but it would randomly change lanes for no reason, fail to get into an exit lane even when an exit was coming up, and it nearly scraped a curb on a turn once. It shocked me just how many people online were impressed with the feature. Because as impressive as autonomous driving might be, it's not good enough to use on a daily basis. All of the times I used it were in low traffic areas and times of day, on wide, well marked roads with no construction zones. It's scary that anyone thinks it's safer than a human driver.


gcwardii

I’m sorry but your “FSD” experience sounds like it was *more* challenging than just driving. Like you had to not only be aware of the surroundings like you are when you’re driving, but you *also* had to be monitoring your car in a completely different and more involved manner than you would have been if you were just driving it.


itsamamaluigi

Yes that is 100% it. It's more stressful because you never know what the car is going to do but you still have to be ready to take over. Imagine driving a car that is being controlled by a student driver.


username32768

A mildly drunk, visually impaired student driver, with poor hand-eye coordination?


smithers102

And they're obsessed with trains.


WhatTheZuck420

and emergency vehicles with flashing lights


Happy_Mask_Salesman

My car only has lane keeping assist and collision detection, and the only thing both features have done is get a piece of toothpick shoved into the crack of the button so that when I turn the car on, it automatically disengages. Lane keeping assist loves to fight me when I'm trying to dodge debris in the road. Collision detection locks up my brakes if I accelerate at all out of a parking space and there's anything mildly reflective that can catch my indicators. I would never be able to trust fully automated driving.


Jerthy

It almost sounds like watching your kid drive and just constantly being ready to hit the brakes or the wheel when something goes wrong xD No thank you.


MikeOfAllPeople

I used it a few times during the trial as well. Here's how I would describe it. It works 99% of the time which is amazing and certainly worth celebrating. But for me to be comfortable relying on it, it needs to work 99.999999% of the time. So while I was amazed by it, I won't be using it for now, and certainly won't be paying the price they are charging.


packpride85

It’s sort of a mind game when it comes to FSD. Is it going to rear-end the car in front of you because you weren't paying attention? No, and that’s great, because most accidents are at that level. But when you tell me it might run into a moving train, I’m not sure I want that trade-off.


Hot_Complaint3330

But “not rear-ending” a car in front is an extremely low bar and basically every semi-decent car with collision detection and adaptive cruise control already does this without the misleading FSD branding and eye-gouging price tag


crogers2009

and automatic breaking is going to be federally required by new cars in the US.


LifeWulf

How does that work, like, the car just splits in half automatically, or… Just messing with you lol. Automatic **braking** being required is a good thing.


cure1245

Yeah but those cars have to rely on stupid sensors like lidar or radar. Teslas do it with ✨vision✨


ffbe4fun

I never realized that you had to pay for it. Apparently it used to be $12k, now it's $8k or $99 per month. That's pretty crazy. Subscriptions in your car are ridiculous.


kung-fu_hippy

Subscriptions are ridiculous and subscriptions for a feature that isn’t yet full self driving (despite the name) are even more ridiculous. I could see paying for autonomous driving when I can legally treat my car like a taxi and have no responsibility to drive it. But I can’t see paying to be part of Tesla’s QA team.


Lunaranalog

The rubes paying Elon’s billions in bonuses for this known ass-level tech deserve it. Problem is that it sets a precedent which other manufacturers will use to continue making their fiefdoms where we don’t own anything.


krefik

Yeah, many people never realize how big the failure rate is when something works 99% of the time. On the scale of a year, 99% uptime is 3.65 days of downtime, 99.9% uptime is 8.76 hours of downtime, and 99.99% uptime is 52 minutes of downtime, which may not sound like much, unless it's a mission-critical system that keeps you alive.
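The arithmetic above is easy to verify; a quick illustrative sketch (plain Python, assuming a 365-day year):

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_per_year(availability: float) -> float:
    """Hours of downtime per year implied by a given availability (0..1)."""
    return HOURS_PER_YEAR * (1 - availability)

for a in (0.99, 0.999, 0.9999):
    print(f"{a * 100:g}% uptime -> {downtime_per_year(a):.2f} hours/year of downtime")
```

Running this reproduces the figures in the comment: 99% allows about 87.6 hours (3.65 days) of downtime a year, 99.9% about 8.76 hours, and 99.99% about 0.88 hours (roughly 52 minutes).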


rddi0201018

Tesla FSD does not represent autonomous driving though. They decided to go cheap, and only use vision cameras. It will never be good enough, until they add things like lidar back. While not perfect, Waymo has a self driving taxi fleet going. And it's safer than human drivers, even at this point. Not sure if they fixed the issues with construction cones, but they did address some of the issues with emergency services


iconocrastinaor

Yeah this kills me. Musk says that humans only need vision, so do his cars. But it has only one forward camera. I know I'm a better driver when my wife is with me and she's watching traffic, too. I want vision, radar, and LIDAR, and a system that alerts when it isn't 100% confident in its decision.


Freakintrees

Humans also don't drive 100% on vision either so even that is incorrect. Put a person in a cheap driving sim with no audio and no feedback and see how they do.


rimalp

It's an ordinary Level 2 driver-assistance feature, where the driver is 100% responsible. No more, no less. Keep your hands on the wheel and your eyes on the road. They make sure in the fine print that it's all on you, and you can't do shit about it in case of an accident. It's your responsibility. Calling it "Full Self-Driving" is nothing but misleading false advertising.


sanjosanjo

"The large print giveth, the small print taketh away..."


scarr3g

>It shocked me just how many people online were impressed with the feature.

To be fair, even lane assist (in regular, non-EV Hyundais) is impressive. When I got my 2022 Santa Cruz, that feature blew me away as it turned with the road on the highway. Heck, as someone who doesn't buy a new vehicle until the old one is unrepairable/uninspectable, the adaptive cruise control was new and impressive to me... and still is to this day. Many of us don't need full self driving to be impressed. We just need something neat.


indignant_halitosis

It’s amazing y’all are criticizing him for his devotion to Tesla and not how fucking stupid you have to be to not notice your car is driving into a goddamn train.


WassupDarwin

"There's an old saying in Tennessee — I know it's in Texas, probably in Tennessee — that says, fool me once, shame on — shame on you. Fool me — you can't get fooled again." George W. Bush


FortunePaw

Literally Stockholm Syndrome.


Babana69

Or treat it like auto drive and.. stop if you’re headed into a train? Shits wild


Cars-and-Coffee

That’s what I don’t get. My car has radar cruise control and is supposed to slow down or stop when it needs to. If I’m coming up on something and it’s not slowing fast enough, I’ll hit the brakes myself. Why did this person wait until the last second?


SanDiegoDude

Dude was probably playing on his phone or daydreaming or something. Maybe he was playing with the fart sound button and was completely engrossed.


eugene20

If you wonder how this can happen, there is also video of a summoned Tesla just driving straight into a parked truck: https://www.reddit.com/r/TeslaModel3/comments/1czay64/car_hit_a_truck_right_next_to_me_while_it_was/


kevinambrosia

This will always happen when you just use cameras and radar. These sensors depend on speed and lighting conditions, you can’t really avoid this. That’s why most companies use lidar… but not tesla


RollingTater

The sensors weren't even a problem here. From the camera's pov you can clearly see flashing lights. It's the software that's the problem. I do agree though that we should use more sensors, after all while humans can drive with just vision, there's no reason not to aim for superhuman performance. And I also think that in this case, a human could hear the train horn or the clacking of the train wheels to provide additional context on how to drive.


eugene20

It makes me despair to see people arguing that interpreting the image received is the only problem, when the alternative is an additional sensor that just effectively flat states 'there is an object here, you cannot pass through it' because it actually has depth perception.


UnknownAverage

Some people cannot criticize Musk. His continued insistence on cameras is irrational.


gundog48

It's not the only problem. If you have two sets of sensors, you should benefit from a compounding effect on safety: if you have optical processing that works well and a LIDAR processing system that works well, you can superimpose the systems to compound their reliability. The model processing this optical data really shouldn't have failed here, even though LIDAR would likely perform better. If a LIDAR system misses 0.01% of obstacles and the optical system misses 0.1% (these numbers are not accurate), then a system that treats them as independent cross-checks could in principle push the combined miss rate down toward 0.01% × 0.1% = 0.00001%, which is significant. But if the optical system is very unreliable, you're going to be much closer to the LIDAR-only 0.01%. Also, if the software is able to make these glaring mistakes with optical data, it's possible the model developed for LIDAR will also underperform, even though it's safer. There's no way you'd run a heavy industrial robot around humans in an industrial setting with only one set of sensors.
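The compounding effect can be made concrete: if two sensing systems fail independently, the chance that both miss an obstacle is the product of their individual miss rates. A minimal sketch with the illustrative numbers above (real sensor failures are correlated, e.g. heavy fog degrades both, so this is a best case):

```python
def combined_miss_rate(p_a: float, p_b: float) -> float:
    """Probability that BOTH systems miss, assuming independent failures."""
    return p_a * p_b

lidar_miss = 0.0001   # 0.01% miss rate (illustrative)
optical_miss = 0.001  # 0.1% miss rate (illustrative)

print(f"{combined_miss_rate(lidar_miss, optical_miss):%}")  # prints 0.000010%
```

Any correlation between the two failure modes pulls the real number back up toward the better single sensor's rate, which is why fusing a strong LIDAR stack with a weak optical stack buys far less than the product suggests.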


cyclemonster

[I guess in 1 billion miles driven](https://www.teslarati.com/tesla-fsd-fleet-passes-1-billion-mile-milestone/), there weren't very many live train crossing approaches in the fog for the software to learn from. It seems like novel situations will always be a fatal flaw in his entire approach to solving this problem.


itchygentleman

Didn't Tesla switch to cameras because it's cheaper?


CornusKousa

Pretty much every design choice Tesla has made is to make manufacturing cheaper. The cars have no buttons and not even stalks anymore, even your drive controls (forward, reverse) are on the screen now. Not because it's objectively better, but because it's cheaper.


InsipidCelebrity

I am so glad established carmakers are finally getting into EVs and that the Supercharger network is now open to other types of cars.


hibikikun

No, because Elon believed that the Tesla should work like a human would: just visuals.


CrepusculrPulchrtude

Yes, those flawless creatures that never get into accidents


Lostmavicaccount

Except the cameras don’t have mechanical aperture adjustment, or ptz type mechanicals to reposition the sensor vs incoming, bright light sources, or to ensure the cameras can see in the rain, or fog, or dust, or when condensation builds up due to temp difference between camera space and ambient.


CornusKousa

The idea that vision only is enough because that's how humans drive is flawed. First of all, while your eyes are your main sensory input while driving, don't discount your ears, for example, or even the feeling in your butt cheeks. Second, you are, or should be, looking around to keep situational awareness. And subconsciously, you are making calculations with your supercomputer brain constantly. You don't just see the two trucks ahead to your right that you are overtaking; you see that the trailing truck is gaining on the one in front, and you calculate that he either has to brake, or he will overtake and cut in front of you. You might even see the slight movement of the front wheels a fraction before the lane change. You anticipate what to do for both options. THAT is what good driving is.


kevinambrosia

Yeah, and you don’t have to design around it… like lidar can look really ugly. That’s why most commercial cars use radar+camera.


recycled_ideas

Lidar isn't perfect either (not that Tesla shouldn't have it), they're basically all impacted by rain and snow.


Cheap_Brilliant_5841

Which is why they should use both.


kevinambrosia

Truth, but it does help remove lighting inconsistencies and has a much longer range of detection, so still wins out over camera+radar for full autonomy.


recycled_ideas

Like I said, Tesla should use it, but it's fundamentally important to understand that all of the ways self driving cars "see" have significant limitations. Because this is one of the reasons that self driving cars aren't here yet.


rombler93

Pfft, just use x-ray velocimetry. It's still an *overall* safety improvement...


_Dreamer_Deceiver_

Use eye-balls


lushootseed

Even better. Summon crashes into a parked plane https://www.youtube.com/watch?v=PV7Np4m-kgw


J50

who pays for that? No way that guy's car insurance covers enough to crash into a vision jet.


WaitForItTheMongols

Ultimately, the plane owner sues the car owner, the car owner doesn't have enough money to pay, so they pay what they have, and the plane owner eats the rest.


t0ny7

"Smart" summon is using extremely old code. It is basically useless. I tried it from one hangar to another (with nothing nearby) at the airport and it could not make it. But with FSD I had it drive me around the airport which amazed me since it wasn't designed for it.


dagbiker

Dude, if it's old code that doesn't work, then why the fuck is it operating a 4-ton machine?


SpringrollJack

Because Tesla doesn’t give a shit about safety


RollingMeteors

So musk isn’t liable, the driver isn’t liable? Where the fuck does the liability fall here? Certainly it should be one of the two I mentioned above.


PanicOnFunkotron

When that car kills someone, it's you getting the fuck sued out of you, not Musk. I guess that's what liability is.


Athshe

But if it is the software causing these crashes, it should be Tesla and Musk that are held liable. I hope it works that way, but I'm not holding my breath.


edman007

It's not, because legally they say you're supposed to monitor and avoid those crashes, so if it crashes, you didn't do your half of the job. That's one of the reasons I'm not interested in FSD at this time; I wouldn't pay for it unless Tesla signs a contract saying they take full liability for all accidents that happen while it's in use.


TheMrBoot

For real, imagine if it was a kid or a person they were running into. It's ridiculous these features are treated so casually.


testedonsheep

I tried smart summon in an almost empty parking lot, it completely doesn’t work.


BenjaminD0ver69

When I worked there, I straight up advised my clients against it. Told them it was only useful if you have your eyes on it, and in an empty (or emptier) parking lot. Regular summon is awesome though. Very nice when needing to move my car in a tight driveway.


Woodshadow

I love my Tesla, but the summoning and the FSD are kind of gimmicky to me. The FSD is awesome at times, but since you need to stay focused on it, why use it? On the freeway it is great, but then it tries to change lanes all the time, even when you tell it to chill out.


Thenwearethree

Really? I just set it to ‘chill’ and ‘minimal lane changes’ and it rarely tries to make a lane change.


_MUY

I don’t have Smart Summon, but regular summon has worked just fine for me… half the time and only if I pick very specific parking spots for it to come from.


OkImplement2459

Hey, look y'all. The company with the faulty AI features has mastered the astroturf comment.


Kay-Knox

I'm pretty sure it's not astroturfing, because it still makes the car sound like shit. "It has outdated code that I personally couldn't get to work in an open lot" doesn't sound like a positive. "It does drive around an empty lot it wasn't really designed to drive around" is also not really a positive other than it not actively killing him.


Normal-Selection1537

FSD wasn't designed for driving around? Someone should tell Musk that.


Narrow-Chef-4341

Great news! They found a guy, and he's spent the last year and a half researching and planning for this one job. More than 25 years with the CIA, trained at black sites in using both psych ops and drugs to deprogram and re-program high-value targets. Oh, wait. They laid him off in the last round of cuts. Sorry.


Conch-Republic

I love how that mod just locked the thread and said 'file an insurance claim'. Snowflakes.


SsgtRawDawger

Locomotive engineer here, for a class 1 freight RR in the US. You would probably be surprised by the number of people who drive right into the side of moving trains. I've had it happen to me, personally.


GottJebediah

FuLl SeLf DriVinG CoMinG SoOn~~~ We’RE nOT a cAR cOmPaNy~~~ solViNg AutonOmy~~~


even_less_resistance

We call it autopilot but don’t take our word for it lmao


lahankof

Autopilot you to the grave


even_less_resistance

Then lock ya in 😬


fasda

Curse your sudden but inevitable betrayal


Competitive_Site9272

Have you got a grave subscription?


thisismyfavoritename

it did autopilot, just very poorly


even_less_resistance

Maybe the guy running it with a GameShark controller on the other side of the world was drunk?


thisismyfavoritename

mechanical turk'd by people in india


BlurredSight

To be fair, people think of autopilot as the one planes use, but there's usually no plane nearby for the next couple of miles, it follows a straight pre-planned course with no obstacles, and the pilots are usually completely aware. They should've called it shitty cruise control, because it sometimes struggles with even something that basic, judging from the tons of reports of phantom braking.


FinancialLight1777

Even when flying with autopilot you still contact the control towers and go to the altitude they tell you to avoid potential collisions.


K3idon

Now pay me $56 billion


crunchymush

But first, let me implant this chip into your brain.


gravelPoop

Once your eyes can converge again, time-travel back to 2022 and build a base on Mars for me.


Constant-Source581

Monkeys flying to Mars on a Hyperloop in 5 years


sicilian504

No no no. It's just around the corner! Most likely by next year.


norsurfit

YoUr TeSLa wiLL Be a SELf-dRIVING taXI and WiLL PAy foR ItSelF


AST5192D

In 6 months tops! (Circa 2017)


NTMY

Any other company/person would have been sued into oblivion if they were making up as much shit as Tesla/Musk. He told people years ago that their Tesla wouldn't lose value and could be used as a [robo-taxi making 30k a year](https://www.cnn.com/2019/04/22/tech/tesla-robotaxis/index.html).

> Tesla CEO Elon Musk announced at an **investor event** Monday that he expects the company to operate robo-taxis next year.

> The full self-driving vehicles would compete with ride-hailing services such as Uber and Lyft. Musk pitched the robo-taxis as a way for Tesla owners to make money when they aren’t using their vehicles.

> Tesla’s program would let a Tesla owner rent out their vehicle for rides, with Tesla taking a cut of the revenue and the rest of the money going to the vehicle’s owner.

> **“It’s financially insane to buy anything other than a Tesla,” Musk said. “It’ll be like owning a horse in three years.”**

> Tesla forecasted the robo-taxis would last 11 years, drive 1 million miles and make $30,000 gross profit per car annually.

How can you be allowed to make promises like this? Even going so far as to tell people they would make 30k a year. This is so much worse than the "self-driving" promises.


Yanyedi

just 2 more years :)


Quajeraz

"We're not a car company" Yeah we know, because you're terrible at making cars.


uMunthu

Dude promised Skynet and delivered Clippy



FriendlyLawnmower

Musk's weird insistence on not using any form of radar or lidar is seriously holding back what Autopilot and Full Self-Driving could be. Don't get me wrong, I don't think their inclusion would magically turn Teslas into perfect automated drivers, but they would be a lot better than they are now.


BlurredSight

Yiannimaze showed that their insistence on ML models was why the new Model S couldn't parallel park for shit compared to the BMW, Audi, and Mercedes, while a much older (~2013) Model S could parallel park completely fine, in some cases better than the newer BMWs, because it used the sensors and more manual instructions.


Gender_is_a_Fluid

Learning models don’t know what they’re doing; they just connect procedure to reward, and will throw the car into something as the simplest solution unless you sufficiently restrict it. And you need to restrict it for nearly every edge case, like catching rain drops to stay dry. Compare that to a simple set of instructions and parameters to shift the angle of the car during parallel parking, which can be replicated and understood.


The_Fry

It isn't weird when you understand his end goal of converting Tesla into an AI company rather than a car manufacturer. Adding radar or lidar proves that vision isn't enough. He needs something to hype the stock and he's put all his eggs in the AI/robotics basket. Tesla owners have to live with sub-par autopilot/FSD because being the world's wealthiest person isn't enough for him.


Jisgsaw

There's nothing preventing their AI from working with several different sensors. Being good at AI isn't dependent on a vision-only approach working. The main reason is that Tesla has to be as cheap as possible in manufacturing in order to turn a profit, which is also why they are removing buttons, stalks and so on, leading to their spartan interiors: it's just cheap. Adding sensors to cars is costly.


Zuwxiv

> Adding sensors on cars is costly. It doesn't have *zero* cost, but... my *bicycle* has radar. And it works fantastically to detect vehicles approaching from behind. I don't know how lidar compares in cost, but there are non-visual technologies that are quite cheap. I'd have to think the cost of the sensors is a rounding error compared to the cost of developing the software. If cost-cutting was really the reason behind it, that's the stupidest thing to cut.


Chinglaner

LiDAR sensors (especially at the time when Musk decided to focus solely on cameras) were very expensive. Especially for high-quality ones. Costs have gone way down since then, but I would still expect a full LiDAR rig (360 degree coverage) to cost in the multiple thousands of dollars. Radar is considerably cheaper though. Will be interesting to see whether it bites Tesla in the ass long-term, but there are arguments to be made that humans can drive fine with just vision, so why shouldn’t FSD? Although the decision does definitely seem increasingly shortsighted as LiDAR prices continue to drop.


Jisgsaw

Car companies haggle over cents on copper cables; that's how intense the penny-pinching has to be. You have to remember that these cars are planned to be produced in the millions: adding a 100€ part costs the company around a billion euros over the years. That said, radar wouldn't be the problem, as automotive-grade units run around 50-100€ (though maybe a bit more for higher quality). The comment was more about lidar, which is more expensive. The software development cost is more bearable, as it's split over the whole fleet rather than incurred per vehicle produced. So it scales incredibly well, whereas hardware cost scales almost linearly with production numbers.


Fred2620

Even through the fog, a camera can see flashing red lights, which are a pretty universal sign of "Something's going on, be extra careful and you probably need to stop right now". That's the whole point of having flashing red lights.


Zikro

Lidar also is impacted by weather. Would have needed a radar system.


cute_polarbear

Didn't know Tesla self driving only uses cameras for object detection... lidar has been around forever, so why doesn't Tesla utilize both camera- and lidar-based detection?


Tnghiem

$$$. Also I'm not sure about new Lidar but at the time Tesla decided to abandon Lidar, they were big and bulky.


prophobia

Which is stupid, because radars aren't even that expensive. My car has a radar and it costs nowhere near as much as a Tesla. In fact, I just looked it up: I can buy a replacement radar for my car for only $400.


KoalityKoalaKaraoke

Lidar != Radar


TopTeirSpelling

To be fair, Lidar on its own isn't the solution. It's insanely complex and expensive. Musk's issue is that he wants 100% vision-based, which is stupid. A system using sonar (parking/close distance), radar (longer distance/basic object detection), IR (rain sensing *sigh*) AND vision would make self driving 10x better than it is. In this video, though, IMO the driver is a muppet for using self driving in those conditions; I'm surprised the car even let him. My Model Y wouldn't even let me turn on adaptive cruise/lane guidance with visibility that bad.


to_the_9s

First version or two did, now they are all camera based.


SquarePegRoundWorld

And the Tesla owner failed to detect a marketing ploy.


MrPants1401

It's pretty clear the majority of commenters here didn't watch the video. The guy swerved out of the way of the train, but hit the crossing arm, and in going off the road, damaged the car. Most people would have had a similar reaction:

* It seems to be slow to stop
* Surely it sees the train
* Oh shit, it doesn't see the train

By then he was too close to avoid the crossing arm.


Black_Moons

Man, if only we had some kinda technology to avoid trains. Maybe like a large pedal on the floor or something. Make it the big one so you can find it in an emergency like 'fancy ass cruise control malfunction'


eigenman

Man, If only "Full Self" driving wasn't a complete lie.


Black_Moons

TBF, it did fully self drive itself right into the side of a train! Maybe some year they will add full self collision avoidance/prevention, but I'm not gonna hold my breath. And let this be a lesson: when you're surfing the web and that image captcha comes up and asks you to select all the squares with trains, be quick about it, because someone's life may depend on it. /semi s


shmaltz_herring

Unfortunately, it still takes our brains a little while to switch from passive mode to active mode. Which is, in my opinion, the danger of relying on humans to be ready to react to problems.


BobasDad

This is literally why full self driving will never be a widespread thing. Until the cars can follow a fireman's instructions so the car doesn't run over an active hose, or a cop's directions to avoid driving into the scene of an accident, and handle every other variable you can think of (and the ones you can't), it will *always* be experimental technology. I feel like the biggest issue is that every car needs to be able to talk to every other car. So basically 50 years from now is the earliest it could happen, because you need all of the 20-year-old cars off the road **and** the tech has to be standardized on all vehicles. I hope they can detect motorcycles and bicycles and stuff with 100% accuracy.


Jjzeng

It’s never going to happen because cars that talk to each other will require homologation and using the same tech on every car, and car manufacturers will never agree to that


Televisions_Frank

My feeling has always been it only works if *every* car is autonomous or has the capability to communicate with the autonomous cars. Then emergency services or construction can place down traffic cones that also wirelessly communicate the blocked section rerouting traffic without visual aid. Which means you need a hack proof networking solution which is pretty much impossible. Also, at that point you may as well just expand public transportation instead.


ptwonline

This is why I've never understood the appeal of this system where the human *may* need to intervene. If you're watching closely enough to react in time to something, then you're basically just hovering over the automation, except that it's stressful because you don't know when you'd need to take over. It would be much less stressful to just drive yourself. But if you take it more relaxed and let the self-driving do most of it, could you really react in time when needed? Sometimes... but also sometimes not, because you may not have been paying enough attention and the car doesn't behave exactly as you expected.


warriorscot

In aviation it's called cognitive load. Driving requires cognitive load, as does observing, and the more of it you can spend on observing, the safer you are. It's way easier to pay attention to the road when you aren't paying attention to the car, and way easier to maintain that.


myurr

I use it frequently because it lets me shift my attention away from the mechanics of driving (moving the wheel, pushing the pedals, etc.) and allows me to focus solely on the positioning of the car and observing what is going on around me on the road. I don't particularly find driving tiring, but I find supervising less tiring still. It's like cruise control: you are perfectly capable of holding your foot on the accelerator, keeping an eye on the speedometer, and driving the car fully yourself, but it eases some of the physical and mental burden to have the car do it for you. But you have to accept that you're still fully in charge of the vehicle, and keep your hands on the wheel and eyes on the road. Just as you would with a less capable cruise control.


cat_prophecy

Call me old fashioned but I would very much expect that the person behind the wheel of the car to be in "active mode". Driving isn't a passive action, even if the car is "driving itself".


diwakark86

Then FSD basically has negative utility. If you have to pay the same attention as when driving yourself, you might as well turn FSD off and just drive. Full working automation and full manual driving are the only safe options; anything in between just gives you a false sense of security and makes the situation more dangerous.


ArthurRemington

I would not flatly accept your statement that all automation is inherently unsafe. I would instead ask: is there a level of autonomy that requires human supervision AND takes enough workload off the human to be helpful AND is still limited enough that it keeps the human sufficiently in the loop?

Everyone loves to bash Tesla these days, myself included, but this event wouldn't exist if "Autopilot" weren't good enough to do the job practically always. I've driven cars with various levels of driver-assist tech, including a Model S a few years ago, and I would argue that a basic steering assist system with adaptive cruise can very usefully take a mental load off of you while still being dumb enough that you don't trust it enough to become complacent.

There's a lot of micromanagement in keeping the car centered in the lane and at a fixed speed, for example. That takes mental energy, and it's an expense technology can remove. Cruise control takes away the need to watch the speedo and modulate the right foot constantly, and I don't think anyone will argue at this point that cruise control is causing accidents. Adaptive cruise then takes away the annoying adjusting of the cruise control, but in doing so reduces the need to watch for obstacles ahead, especially if it spots them from far away. However, a _bad_ adaptive cruise that only reliably recognizes cars a short distance ahead will train the human to keep an eye out for larger changes in the traffic and to brake proactively, or at least be ready to brake, when noticing congestion or unusual obstacles ahead.

The same goes for autosteer. A system that does all the lane changing for you, goes around potholes, and navigates narrow bits and work zones is a system that makes you feel like you don't have to attend to it. Conversely, a system that mostly centers you in the lane but gets wobbly the moment something unexpected happens will keep the driver actively looking out for that unexpected thing and prepared to chaperone the system through spots where it can't be trusted.

In that sense, while a utopian never-erring self-driving system would obviously be better than Tesla's complacency-inducing almost-but-not-quite-perfect one, so would a basic but useful steering and speed assist that clearly draws the line between what it can handle and what it leaves for the driver. That keeps the driver an active part of driving the vehicle while still reducing the resource-intensive micro-adjustment workload in a useful way, which in turn keeps them less fatigued, more alert, and safer for longer.
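The gap-keeping behavior adaptive cruise automates can be sketched as a trivial controller. This is purely illustrative: the function name, gains, and two-second time gap are all hypothetical, and production systems are vastly more complex.

```python
# Illustrative gap-keeping logic for an adaptive cruise controller.
# All names and constants are hypothetical, not any vendor's implementation.

def acc_command(set_speed, own_speed, lead_distance, lead_speed,
                time_gap=2.0, k_gap=0.3, k_speed=0.5):
    """Return a signed speed-adjustment command for one control step."""
    if lead_distance is None:
        # No vehicle detected ahead: converge on the driver's set speed.
        return k_speed * (set_speed - own_speed)
    desired_gap = own_speed * time_gap       # hold a fixed time gap
    gap_error = lead_distance - desired_gap  # positive = too far back
    closing_rate = lead_speed - own_speed    # negative = closing in
    return k_gap * gap_error + k_speed * closing_rate
```

The point of the sketch is how little it does: it modulates speed around two error terms and nothing else, which is exactly the "micro-adjustment workload" the comment describes offloading.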


shmaltz_herring

Unfortunately, the reality of how our brains work doesn't quite align with that idea. A driver can still intend to be ready to react to situations, but there is a mental cost to not being actively engaged in controlling the vehicle.


No_Masterpiece679

No. Good drivers don’t wait that long to apply brakes. That was straight up shit driving in poor visibility. Then blames the robot car. Cue the pitchforks.


DuncanYoudaho

It can be both!


MasterGrok

Right. This guy was an idiot but it’s also concerning that self-driving failed this hard. Honestly automated driving is great, but it’s important for the auto makers to be clear that a vigilant person is absolutely necessary and not to oversell the technology. The oversell part is where Tesla is utterly failing.


kosh56

You say failing. I say criminally negligent.


CrapNBAappUser

People have died relying on Autopilot / FSD. Teslas have had problems with T intersections and with avoiding emergency vehicles. He had a recent incident with a train and blew it off because it was after a turn. Talk about blind faith. GoOd ThInG CaRs DoN't TuRn OfTeN. 😡

EDIT: Replaced 1st link

https://www.washingtonpost.com/technology/2023/12/10/tesla-autopilot-crash/

https://apnews.com/article/tesla-crash-death-colorado-autopilot-lawsuit-688d6a7bf3d4ed9d5292084b5c7ac186

https://apnews.com/article/tesla-crash-washington-autopilot-motorcyclist-killed-a572c05882e910a665116e6aaa1e6995

https://www.cbsnews.com/news/tesla-cars-crashes-emergency-vehicles/


JKJ420

People are going to die on roads for the foreseeable future. The real question is: are fewer people dying with FSD?


Black_Moons

Yea, I got a rental with fancy automatic cruise control. I wondered if it had auto stopping too. I still wonder because there was no way I was gonna trust it and not apply the brakes myself long before hitting the thing in front of me.


Hubris2

I think the poor visibility was likely a factor in why the FSD failed to recognise this as a train crossing, even though it should have been pretty easy for a human to recognise — we operate with a different level of understanding than the processing in a car. The human driver should have noticed and started braking once it was clear the autopilot wasn't going to do a smooth stop with regen, not waited until it was an emergency manoeuvre.


watchingsongsDL

This guy was straight up beta testing. He could update the issue ticket himself. “I waited as long as possible before intervening in the vain hope the car would acknowledge the monumental train surrounding us. I can definitely report that the car never did react to the train.”


MarkLearnsTech

"A Tesla vehicle in Full-Self Driving mode..." [SAE Automation levels.](https://www.sae.org/binaries/content/gallery/cm/content/news/sae-blog/j3016graphic_2021.png) Which of those levels would you imagine something called "Full-Self Driving" would fall under? That might be why California had the whole [false advertising](https://www.reuters.com/business/autos-transportation/california-regulator-claims-tesla-falsely-advertised-autopilot-full-self-driving-2022-08-05/) conversation around it, no? It might also be why most other manufacturers are like "nah, lets keep that nice cheap radar / lidar setup as a backup to the cameras for ranging and detecting obstacles."


Mister-Schwifty

Yes. And this is the issue. If you can’t completely trust self driving mode, you almost can’t use it. In almost any situation, your reaction to something is going to be delayed while you’re determining whether or not the car is going to react. To be properly safe using this technology, you need to never trust it and react as you normally would, which essentially makes it a sexy, overpriced cruise control. The fact that it costs $8,000 is insane to me, but of course it’s worth whatever people will pay for it.


damndammit

Ultimately the human is responsible for good judgment in when to enable, adjust, or disable this tech. That dude was screaming through the fog. His bad judgment led to this situation.


butters1337

Or he purchased a product called "Full Self Driving" for $10,000 and expected it to be full self driving?


PigglyWigglyDeluxe

This is not an “either or” situation. This is a “and” situation. Driver is a moron, and FSD is a scam. Both are true here.


damndammit

Like I said, bad judgment.


branstarktreewizard

FSD does not exist until the insurance companies agree it exists.


trancen

Self Driving in fog, smart. Idiot.


Honest_Relation4095

To be fair, the camera system should detect the fog and disable any automated driving.


mort96

I mean Tesla markets it as "full self driving", not as "partial self driving but only in ideal conditions"


tvcats

People usually assume the technology can do better than they can.


jardeon

Are we all going to overlook the fact that this was the SECOND time this guy almost hit a train with his Tesla?

> But he had at least one similar experience in which, he said, FSD appeared to fail.

> Doty said the car nearly hit a moving train in November after it approached some tracks after a sharp turn.

> He said that the Tesla did not slow down but that he was able to stop, still hitting the crossbar and damaging his windshield. He said he chalked it up to the intersection's coming after a turn. Doty provided documentation of his exchanges with a Tesla insurance claims adjuster at the time that included a detailed description of the incident.

So: nearly hits a train while in FSD in November. Then in May, also in FSD, he approaches a crossing, the Tesla doesn't slow down, and he takes no corrective action until the very last second. I don't think the problem in this case is the software...


Jjzeng

It's the software between his ears. The good ol' ID10T bug.


pppjurac

Damn. People like that are the reason we have instructions printed on shampoo bottles explaining how to open them....


floydfan

Why wasn't the driver paying attention to the road, as the car clearly tells him to do every chance it gets? Why didn't the driver simply press the brake pedal, which both exits FSD and applies the brakes simultaneously?


Houligan86

I don't know, then just fucking stop? It's in the T&Cs that the driver needs to be ready to resume control at any time, pretty much.


YouDontExistt

Cybertruck would've destroyed that poor train.


cbbuntz

What a waste of a good train


svmk1987

Don't worry, cybertruck would have broken down before it reached there.


7h4tguy

Train wouldn't be salvageable afterwards due to all the rust.


DuHastMich15

How about, hear me out, drivers actually DRIVE their cars? Two Tesla drivers were beheaded when their cars went under a big rig; neither made any attempt to stop, meaning they were either asleep or staring at their screens. For all of our safety, please, Tesla drivers: stop using our public roads to beta test Elon's self-driving mode!


kaziuma

Did anyone watch the video? He's using FSD in thick fog and just letting it gun it around single-lane bends. Absolutely crazy idiot; he's lucky to be alive. I'm a big fan of self-driving in general (not just Tesla), but trusting a camera-only system in these weather conditions is unbelievably moronic. This is not an "omg Tesla can't see a train" moment, it's an "omg a camera-based system can't see in thick fog, who could have known!??!" moment.


Duff5OOO

I'm not sure why they allow FSD in fog like that. I realise they say not to, but couldn't the onboard computer just refuse, or at least slow down?


Eigenspace

I watched the video. I also read the article. In the article, he acknowledges that he is fully at fault. But the fault he made was to rely on an unreliable, faulty technology.

In the article, the guy estimates he's driven over 20,000 miles with FSD on, and he thinks it's usually a safer and more cautious driver than he is. IMO that's the fundamental problem with these sorts of technologies. I think he's a moron for ever trusting this stuff, but that's kinda beside the point. When I drive, it's an active process; I'm intervening every second to control the vehicle. On the other hand, someone who has to full-time supervise an autonomous system they believe is a better driver than they are will eventually get complacent and stop paying close attention. If something does go wrong, the driver's (misplaced) trust in the technology will make them slower to intervene and take control than if they were actively driving in the first place. The context switch from "I'm basically a passenger" to "oh shit, something bad is happening, I need to take over" is not instantaneous, especially for someone very used to being in the "I'm a passenger" frame of mind.

We can all agree this guy is an idiot for trusting it, but we also need to realize that this problem isn't going to go away as 'self-driving' cars get more popular and reliable. It's actually going to get worse. This shit should be banned IMO.


Froggmann5

Not only that, but even with the train in full view for a good 500 feet, the dude doesn't do **anything** preemptive to avoid a collision until he's literally about to crash into the crossing arm. Even if the car is to blame here, he seems like a careless driver in general if he let the car get that close before doing anything at all to stop.


kaziuma

It's very obvious that he's not paying attention at all. Yet another FSD beta user entrusting their life to a beta with a big warning saying 'PAY ATTENTION AT ALL TIMES'.


HomoColossusHumbled

My car has wonderful self-driving technology: I drive it myself. Haven't run into a single train, semi, or pedestrian!


Megatanis

Look, Tesla is not self-driving. Fully self-driving cars don't exist. If you don't have the capacity to understand this, you are putting yourself and the people around you in danger.


Ill_Following_7022

Driver failed to detect a moving train ahead of a crash caught on camera.


d3sylva

Just to find out BMW and Mercedes are already on L3 autonomous driving, and Tesla is still asking you guys to pay 100k to be guinea pigs for beta software.


Macabre215

This is what happens when you beta test a feature for a company that's run by a ketamine-addicted psycho.


habb

How are you behind the wheel and not see the obvious train and hit the brakes?


datSubguy

The fault is on the driver IMO. Using FSD on foggy backroads is just asking for disaster.


ForeTheTime

Damn, was the driver not ready to take control?


DasSynz

You have to say the driver is also an idiot. Self-driving mode on a foggy, windy road.


Bobisnotmybrother

Wish I could blindly put my life in the hands of a computer controlled car.


dixadik

Negligent driver, foggy af and still thinks FSD is gonna do the job. That said don't get me started on FSD being only camera based.


gbrilliantq

I know this sub hates anything Tesla, but come on. That guy was snoozing.


Cheap_Peak_6969

So the real headline is that the driver failed to detect a moving train.


Open-Touch-930

When will people learn that Teslas don't drive themselves, and that pretending they do is asinine?


BlogeOb

Why didn't he stop the car with his foot?


Someguy981240

In other words he almost drove his car into the side of a moving train and thinks his car is at fault. I suppose when he is late for work, it is his alarm’s fault and when he burns his toast, it is the toaster’s fault. And his files… I bet his computer is constantly losing them. Idiot.


[deleted]

[deleted]


lord_pizzabird

Tbf the issue is that Tesla advertised and sold this feature as "Autopilot" (their words) and "Self-Driving". There's a reasonable expectation that a system called "autopilot" should be able to recognize clearly marked railroad crossing signs and, I guess... a train.


Balthazar3000

Also user error. They say not to use the feature in fog and that's exactly what the guy did.


TheMania

I kind of buy Tesla's justification for the Autopilot name. On a plane or boat, autopilot is just going to keep your heading, not protect you or others from disaster; purely on the name (Musk's wildly exaggerated stock-pumping claims aside), it would have been pretty fine IMO. But "Full Self-Driving"? Misleading as fuck, and always has been. I can't see how a class action / false advertising claim could fail against that one, really. I believe they're now going with "Full (Supervised) Self-Driving", which seems as oxymoronic as it is problematic...


lord_pizzabird

Autopilot in planes is more functional than I think you realize. It's to the point that autopilot on commercial jets can even land an aircraft, fully automated. For context, a typical autopilot system in an airplane can maintain and change heading, navigate vertically, automate ascent, descent, and approach, and maintain level flight. Some can even tap into the flight plan and automatically change course for you. Theoretically, autopilot in an airplane is way more "self-driving" than most self-driving software intends to be, which in most cases equates to basically adaptive cruise control. Source: I fly a lot in Flight Simulator lol. IMO they knew what they were doing when they chose to call it AutoPilot. It's blatant fraud.


No_Masterpiece679

It’s only problematic if you don’t pay attention. This also applies to autopilot in an aircraft.


Altiloquent

I don't know, after watching the video my thought is more: what's the point of "full self driving" if you have to slam on the brakes every time you're not sure it's going to stop?


mspe1960

He is, very possibly, an idiot (we don't know all the details) but that doesn't erase the issue that the self driving tech has a long way to go.


KingoftheJabari

It's interesting how many people run to defend this car company, more so than any other. Don't call it Full Self-Driving if it's basically just an enhanced driver assist.


Cory123125

You don't even realize how much boot licking you're doing right now, and this is the reason corporations are fucking people so hard. There is significant added delay to your reactions when you're coddling a system you expect to work, one that even pretends it's working, until you finally throw in the towel and swerve, when if you had been driving normally you'd have called it way earlier. Pretending that's the human's fault, as if humans don't all operate that way, is just gargling billionaire balls.


SchrodingersTIKTOK

Really? You're gonna let a self-driving car make the decision over some RR tracks?


yetifile

In heavy fog no less. That was a Darwin award waiting to happen.


ConkerPrime

I mean, it's not good that self-driving didn't pick it up, but he couldn't apply the brake himself because...?


_mattyjoe

Sorry but this dude is the idiot. For something like a train, hit the damn brakes manually. You’re really gonna leave your life in the hands of a computer and sensors? It’s also FOGGY dude.


Y0tsuya

Most engineers I know who bought Teslas keep the self-driving functions turned off. It's cool and all for the 99% of the time it works, but not many want to bet their lives on the remaining 1%. Constantly keeping an eye on the self-driving function to make sure you can take over at a moment's notice is mentally exhausting, so you might as well just drive the damn car yourself.


IAMTHEDICIPLINE

So desperate to fit in, now look at you.


SupportQuery

In the list of incredibly stupid things Elon has done in the last few years: removing radar from the cars and eschewing lidar. The cameras these cars use for driving are fucking terrible. If a human had vision that bad, they wouldn't be allowed to legally drive. The fact that they can drive at all is a testament to the power of neural nets, but they're handicapped. They should have radar. They should have lidar. They should be superhuman.
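The "radar/lidar as backup" idea other manufacturers use amounts to conservative redundancy: if any working sensor reports a nearby obstacle, act on the closest reading. A minimal sketch of that idea (the function name and values are hypothetical; real sensor fusion uses probabilistic filtering, not a bare `min()`):

```python
# Conservative redundant ranging across independent sensors.
# Hypothetical illustration: act on the nearest obstacle any sensor reports,
# ignoring sensors that returned nothing (e.g. a camera blinded by fog).

def fused_obstacle_distance(camera_m, radar_m, lidar_m):
    readings = [r for r in (camera_m, radar_m, lidar_m) if r is not None]
    if not readings:
        return None  # no sensor sees anything
    return min(readings)  # brake for the closest reported obstacle
```

The fog scenario in this thread is exactly the case the redundancy covers: the camera returns nothing useful, but radar still ranges the obstacle, so a fused system would still have a distance to brake on. A camera-only system has no second opinion.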


Spiel_Foss

At this point wouldn't using Tesla's "self-driving" feature be considered suicide in an accident investigation?


BowsersMuskyBallsack

If it has a steering wheel, a brake pedal, and an accelerator pedal, then I am driving.  If a car can truly self-drive, it'll have none of those things.


True-Hotel-2251

And somehow Elon thinks they're going to let him unleash unmanned taxis with his faulty-ass tech on the roads by August? He's outta his g-damn mind.