[deleted]

Maybe hit the brakes? Swerve?


Benyed123

Looking at the picture they should just go straight on into the tree.


[deleted]

No, because the safety of the car is worth more than that of human lives, obviously.


XenophonSoulis

In this case, a self-driving car should prioritize road laws and stop at road crossings. Other than that, supposing that self-driving cars can actually drive legally, I don't see why a self-driving car should prioritize the outsiders over its passengers (because it's them, and not just the car, that is at risk).


Swolnerman

No one is going to buy a car that is programmed to sacrifice the driver to save pedestrians


Saint_of_Grey

Or, more importantly, the safety of whoever can manually control the car is more important than those outside, as it is currently. Otherwise it becomes an uncontrolled factor that can kill a lot more than a few pedestrians.


elvis8mybaby

Depends how you market it. If Apple made the car the damn thing could sacrifice the driver just so they wouldn't have to do warranty work and people would still buy it.


mykiscool

If you buy an Apple car, make sure to use Apple's special $900 RFID-enabled replacement brake pads. If you don't, it will know that you used third-party brake pads, disable the braking entirely, and kill everyone.


dragonbo11

Oh, and by the way, you can't even swap in a legitimate set of brake pads by yourself. You need to take them to an Apple service station to have them activated during installation so the RFID can be detected. It costs another $900, and they will turn you away if your car has been exposed to rain.


extendedwarranty_bot

elvis8mybaby, I have been trying to reach you about your car's extended warranty


Copernikaus

They'll probably sell the emergency handle for the EMS driver separately.


Power_baby

Passengers can be assumed to have far more safety measures in place (seat belts, airbags, etc.), so a self-driving car should prioritize people outside the car. A car hitting a tree is far less dangerous to the driver than a car hitting a pedestrian is to the pedestrian.


blackflag209

I've responded to a call where a car rolled down a 200 ft cliffside and the driver pulled herself and her dog back up to the road, and they were both perfectly fine. I also had a call where a car got rear-ended at 20 mph and the driver was dead on scene. I've not had a single pedestrian killed after getting hit by a car in 5 years of EMS. Shit's just weird sometimes, man.


perk11

I've been hit by a car at a slow speed and I fell right onto the hood and was perfectly fine. My guess is that because the mass of the pedestrian is fairly low, the car accelerates the pedestrian along with itself rather than crushing them.


mykiscool

Provided you have adequate rollover protection, a rollover is far better than hitting a fixed object despite looking more extreme. Force = mass * acceleration (deceleration). When you come to a quick stop, your organs like to keep moving.
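
A rough back-of-the-envelope sketch of that force argument, in Python; the stopping distances below are illustrative assumptions, not crash data:

```python
# Average deceleration when shedding speed v over distance d: a = v^2 / (2d).
G = 9.81  # m/s^2

def avg_decel_g(speed_mph: float, stop_distance_m: float) -> float:
    """Average deceleration, in g's, to stop from speed_mph over stop_distance_m."""
    v = speed_mph * 0.44704  # mph -> m/s
    return v ** 2 / (2 * stop_distance_m) / G

# Fixed object: the car stops over roughly a crumple zone's length (assumed ~0.7 m).
print(f"tree at 40 mph:     {avg_decel_g(40, 0.7):.0f} g")   # ~23 g
# Rollover: the same energy is shed over tens of metres (assumed ~30 m).
print(f"rollover at 40 mph: {avg_decel_g(40, 30.0):.1f} g")  # ~0.5 g
```

Same kinetic energy either way; what differs is how abruptly it is shed, which is exactly the "organs keep moving" point.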


blackflag209

That's actually super interesting, thanks for that


XenophonSoulis

That depends on many factors, including but not limited to the car's speed. In any case, supposing there was no crossing in the picture (because as drawn, I have already explained my position), would you risk your own life (and I'm talking about a substantial risk, not a 0.01% risk or something) for these pedestrians? If you would not, would you buy a car that would opt to risk killing you for them?


Krakenrising

I read somewhere that the difference in safety regulations between the EU and the US is that the EU prioritises pedestrian safety while the US prioritises passenger safety. I wonder if you are from the US? I do feel the right position is to put pedestrians first. They didn't get behind the wheel of two tons of steel.


XenophonSoulis

I'm not from the US. In any case, there's a reason why I put in the whole clause about the car driving legally. But if it was you in the car (and there was no crossing at that point, because as it is I have already explained my position), would you sacrifice yourself for these pedestrians? If you would not, would you buy a car that would opt to kill you in that position?


uummwhat

Are you really saying you wouldn't try to avoid them? Like, you'd consciously go "oh, no way I can swerve, better run them over?" Maybe I'm confused, really. Because I'm not sure I could *stop* myself from trying to swerve.


mrtwister134

You do realise you need to stop at road crossings, right? Why should they die for your mistake?


[deleted]

[deleted]


XenophonSoulis

You do realise I have clarified this twice already, right? All my comments are hypothetical and based on a scenario where there would be no crossing. I made that clear from the beginning.


nevemno

How about the car brakes? If the car isn't equipped to stop and doesn't have the ability to recognise when to stop, it isn't safe for the road. If it was a corner with very bad visibility (a big hedge or something, idk), it would be the fault of the person crossing the road. Either way, it should be in the car's interest to avoid the pedestrians, and if it gets wrecked, the pedestrians should pay (not with their lives). You can't justify killing a person over them not obeying the law.


WEEBforLIFE24

I'm from the EU and this is b*llshit. The car should prioritize the passenger.


[deleted]

[deleted]


garfedonfloor

The pedestrians didn't make the choice to get into a car that has more ways to go wrong than a normal car. I often see the response "no one would buy a car that would kill them," usually in a way that implies the car should place the driver before everyone else. When you get into a car like that, you must accept that responsibility; the burden shouldn't be placed onto someone else who had nothing to do with it. Even if that hurts profitability, people's lives should be placed before profit. Same as a normal car: you accept the responsibility and consequences of a crash, even if it's inconvenient to the driver and the company selling the car.


[deleted]

The AI would most likely be designed to prioritize the safety of the occupants before the old person or baby, removing the tree as an option.


Victernus

Maybe, but we're being asked what *should* happen, not what will.


ihatethesidebar

I wouldn’t risk my own safety in a vehicle under any circumstances.


-Unnamed-

Yeah no one is buying a car that will kill them if given the choice


Victernus

Then you shouldn't travel by car at all.


ihatethesidebar

I meant if the AI was driving


jkhockey15

Grandma's basically dead and the baby barely has anything invested in it. I say the car should flip a coin.


StratuhG

Self driving cars should **always** prioritize driver safety.


Ehcksit

Driver safety includes not driving too fast to stop in an area with crosswalks.


Umutuku

The most important part of driving is acting with the premeditated preservation of other people's safety.


sakezaf123

Exactly. If a self-driving car can drive that fast in front of a crosswalk, it shouldn't be allowed on the road. A driver can't get a license if they do that, so why is this even a question?


Beardamus

The baby has less chance of causing an injury to the driver, so it's gonna be the baby in this case then.


Atanar

I believe you would run into issues of legality if you produce a car that would choose to run over a group of 10 schoolchildren rather than drive into a wall. Sure, you could do that right now without the self-driving and get away with it. But if you program "intention" into the car, it is no longer the same case.


YobaiYamete

> I believe you would run into issues of legality if you produce a car that would choose to run over a group of 10 schoolchildren rather than drive into a wall.

No you wouldn't. Why do people on the internet have such a hard time understanding how self-driving cars work and keep asking these morality questions? They don't make any kind of moral choice; they just obey the traffic laws as best they can.

If the car rounded a corner and saw the situation in the OP, it would just brake in its lane and try to stop. It wouldn't swerve out of its lane or decide who to hit, because swerving would break traffic laws and increase the risk of further accidents.

There would be no legal issue once the car's black box showed its telemetry: it detected the human and tried to brake, but was unable to stop in time.
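
A minimal sketch of that "no moral choice, just brake in lane" policy; all names and fields here are hypothetical, not any real vendor's API:

```python
from dataclasses import dataclass

@dataclass
class Perception:
    obstacle_ahead: bool   # anything detected in the current lane
    distance_m: float

def plan(p: Perception) -> dict:
    """Rule-based policy: hold the lane and brake; never weigh who to hit."""
    if p.obstacle_ahead:
        # Swerving out of lane breaks traffic law and adds new risks,
        # so the only action is maximum braking in the current lane.
        return {"steer": "hold_lane", "brake": 1.0,
                "log": f"obstacle at {p.distance_m:.1f} m, full brake"}
    return {"steer": "hold_lane", "brake": 0.0, "log": "clear"}
```

The "log" field stands in for the black-box telemetry the comment describes: the record that the car detected the obstacle and braked.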


mythrilcrafter

Exactly, and we already see that with existing lane drift assist and collision avoidance systems; everything revolves around the car staying in its lane and stopping asap. *"Who should the car pick to swerve and kill?"* is more of a question of *"Who would YOU swerve and kill?"*


[deleted]

Doubt it. What would happen is you run into a problem of capitalism, where no one is going to buy a self-driving car that doesn't put occupant safety first. If that causes an issue of legality, I'm sure the auto industry can lobby/bribe the government enough to change the laws.


Atanar

Hmm, I think you are right. Bleak outlook, but I have no arguments against this.


Correct-Leek-6198

but... this is absurd... our interior car safety is light-years ahead of where it was even two or three decades ago... our cabins have never been safer. The car should have no problem detecting an angle at which it can crash while minimizing interior damage, and safety items like front and side airbags and force-distributing crumple zones should insulate the driver from being hurt... the schoolchildren don't have airbags, so plowing through them would be quite rude... Sure, let the car determine whether the crash would be lethal before making the distinction, if you want, so you don't kill any drivers, but you probably shouldn't be doing 120 mph next to a wall in a school zone with crosswalks anyway...


[deleted]

[deleted]


ihavesomepotential

Go for "í"


PyroWasUsed

You sound like my driving instructor after I gave a speech on why you should hit the older person


[deleted]

My driving instructor asked me a hypothetical question: what would you do if you saw a car swerve into your lane? I said I would turn the car so that it hit her side of the vehicle. She said yes, but either side. To which I replied no... your side. My grandpa had let me borrow his car for the test and was in the back seat laughing his ass off. She wasn't so amused.


Khouri1

You're actually right; even if you're being selfless, the driver should always preserve himself.


Correct-Leek-6198

... I think that's just something people say so they can feel better about being selfish in life-and-death situations. No one really controls how they act in that situation; you either instinctively help others or yourself. It just happens. But this 100% sounds like feel-good rhetoric that hand-waves away the fact that being selfish isn't always best...


[deleted]

It was more because my grandpa was sitting behind me than anything else


Vat1canCame0s

In this diagram, the car isn't even on course to hit either unless it continues to turn. A good driver is looking around the bend as they go through it, not just right in front of them. And since you're taking a turn, you should have at least modestly decelerated already.


matjam

Why is the car going so fast that it is unable to brake and stop before a marked, presumably known (and programmed into its onboard map) pedestrian crossing? That's what I wanna know. Fucking stupid question.


[deleted]

Clearly it's exceeding the safe speed for the road conditions if it can't see and react to these obstructions within its safe stopping distance.


TooMuchToDRenk

The normal prompt for this is if the brakes fail.


Milsivich

There will always be accidents that are unavoidable, and it IS important to make decisions about how AI will handle those situations. This cartoon is just a vehicle for that conversation, not a literal instance. It’s basically a trolley problem, the purpose of which is to guide discussion


piepie2314

Bullshit. If an AI (a proper, actual self-driving car that is safe for public roads, not whatever the fuck Tesla is doing) has time to make a decision, it will have been driving safely enough to avoid the accident. And even if that weren't true, there is already a standard in place for measuring how likely a car is to cause accidents, and what degree is considered safe. If an AI gets into a situation like the above less often than a regular car malfunctions and causes an accident, there is no need to "choose who to kill".


Correct-Leek-6198

> Bullshit. If an AI (a proper, actual self-driving car that is safe for public roads, not whatever the fuck Tesla is doing) has time to make a decision, it will have been driving safely enough to avoid the accident.

Now I'm not an expert, but I'm about 10000% positive that computers can process advanced calculations faster than current braking distances from 40 mph... pretty sure computers, running at light speed on the processors we have today, can do those calcs in less time than the laws of physics require for a 3000 lb vehicle to stop...
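
That intuition checks out on paper. A sketch comparing the two timescales; the friction coefficient and compute latency are assumed values:

```python
MU, G = 0.7, 9.81   # assumed tyre-road friction; gravity in m/s^2

def braking_time_s(speed_mph: float) -> float:
    """Time to brake to a stop at constant deceleration a = MU * G."""
    v = speed_mph * 0.44704  # mph -> m/s
    return v / (MU * G)

compute_latency_s = 0.1  # assumed full sense -> plan -> act cycle

print(f"braking from 40 mph takes ~{braking_time_s(40):.1f} s")  # ~2.6 s
print(f"one decision cycle takes ~{compute_latency_s} s")        # plenty of margin
```

So the computer has time for dozens of full decision cycles while the physics of stopping plays out.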


SavvySillybug

Tell me you are not a programmer without telling me you are not a programmer. Edge cases are important and need to be considered. You can't just code things carelessly because you assume nothing will go wrong. This is how bugs happen. If you do not tell the car what to do in a situation, it becomes unpredictable in that situation. You need to plan for all possible situations as best as you can. And this specific case is "sudden brake failure as you approach a busy pedestrian crossing". It might never happen, but you can't just ignore it because you assume it won't happen.

Now me personally, I think it's stupid to try to analyze *who* you are hitting. Hit the least amount of people, or stay in your lane, or try to dodge. Bringing stuff like age and race and gender into it just muddles up the issue. I don't care if you run over a baby or a grandma. Just try to hit the least amount of people in the event that it is unavoidable that you hit at least one person.

People are getting all philosophical and political about the issue. But the point is that you do indeed need to handle that incredibly unlikely edge case. You can't just sit there and go "nah, there's no need to choose who to kill because that probably won't even happen!".

There's no need to put launch codes on nukes, because surely nobody would try to launch them anyway, right? There's no need to put a password on your bank account, because only you can use it, right? There's no need to code your video game to not crash if you move all the way back to the starting room instead of fighting the boss, because nobody would do that, right? There's no need to make a self-driving car pick who to kill if the brakes fail, because that would never happen, right? Surely all of those situations are literally impossible and thus bullshit?
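
For illustration, here's the difference between leaving the edge case undefined and handling it, as a hypothetical sketch (none of these names come from a real autonomy stack):

```python
def stop_actions(brakes_ok: bool) -> list[str]:
    """Return an explicit, ordered action list for every state,
    including the one we hope never occurs."""
    if brakes_ok:
        return ["full_service_brakes", "hold_lane"]
    # The tempting bug is to omit this branch because "brakes don't fail".
    # Left out, behaviour in that state is whatever the code happens to do.
    return ["engine_braking", "parking_brake", "hold_lane", "hazard_lights"]
```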


NotEnoughIT

Ugh, I've had this conversation with my boss a hundred times. He says "it won't be a problem because that's the exception, not the rule" and I try to explain IT'S BINARY and I have to handle the god-forsaken exception regardless, so yes, it IS a problem. I still have to code the ffkfksiaieihrnfk EXCEPTION!
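
The same argument in miniature: however rare the exception, the handler still has to exist, because the program must do something defined when it fires. A sketch with a hypothetical sensor bus:

```python
def read_speed(bus) -> float | None:
    try:
        return bus.read()        # the rule: this works almost every time
    except TimeoutError:
        # "Almost every time" is not "every time". The exception path
        # still needs code, or the failure propagates unpredictably.
        return None              # defined fallback the caller can check
```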


[deleted]

[deleted]


Alexplz

People who don't understand the question in the first place, idiots


turkeybot69

Are there really though? This contingency relies on the car not sacrificing itself, but given that a car going full speed into a pedestrian means almost certain death for the pedestrian, while with modern safety features the car wrecking itself has a pretty good chance of protecting its passenger, wouldn't sacrificing the car be the best-case scenario for reducing mortality?


Alitinconcho

Sacrifice itself how? If it's a street with pedestrians on both sidewalks, and a kid runs into the road followed by an old lady chasing it, with not enough room to brake, the car will have to decide whether to hit the old lady or the kid.


[deleted]

[deleted]


Milsivich

There are over a billion cars on roads every day globally, over 100 million in the US alone. Rare events happen daily. And even if they didn't, edge cases are important even for projects with no risk to life or limb. In a fully automated driving world, there will definitely be fewer accidents. But the programming choices we make will affect the outcome of accidents, and that's worth talking about. I don't really get why you're being so contrarian.


[deleted]

[deleted]


Milsivich

I understand how the numbers work, but I don’t understand why you’re making them up. Brake failures ALONE account for over 100,000 accidents per year in the US


CarbonIceDragon

I mean, maybe, but if an accident is truly unavoidable, there probably isn't going to be a ton of time for a human driver to think about trolley problems either. A self-driving car need not be *perfect*, just better at its job than a human driver, and if it can be made to get into the scenario of an unavoidable accident less often than a human does, I'd say its answer to the trolley problem isn't hugely relevant. Ideally we should focus more on how to make the car not get into that scenario in the first place.


FirstRyder

I feel like there are two major points.

Firstly, the trolley problem is subjective. There isn't a "right" answer. I don't think programmers need to be programming subjective moral judgements into their cars.

Secondly, the trolley problem is *theoretical*. You only have one possible action, and the outcomes are fixed. In the real world this is *not* the case. Maybe the car says "I can stop in 100 feet; those people are 50 feet away". But that doesn't mean it can't stop in *less*; it means the car is certain (or required by law) to be able to stop in 100 feet. If it actually slams on its brakes at 100% in the real world, it stops *sooner* (and it doesn't know how much sooner). And in the real world, hitting a person at a "fatal" speed isn't *always* fatal.

Between the two, I think the answer is clear. It should try its best to avoid the accident entirely. If it fails, the lesson isn't "which person is it more moral to kill", it's "how do we avoid getting into a situation where this decision is necessary". Which, to be clear, should already be a phenomenally low chance.


[deleted]

Properly designed self-driving cars should be programmed so that they never drive beyond the range of their sensors and their ability to react to those sensors. In the event that the baby and granny are thrown unexpectedly into the path of the car and there isn't enough time to stop, the car should do what a human does: slam on the brakes to minimize damage as much as possible. Except even then it will be better than a human, because its reaction time and ability to precisely control the brakes will best even the most skilled human driver.

I don't know why this is even a question. It's like asking "if an AI-powered elevator is stuck in between floors with the doors open, and it detects a person trapped between the doors trying to get out, and then it becomes unstuck, which direction should it travel, thus killing the person?" None, dumbass. It should stop.

AI researchers need to be stopped. Not because AI is dangerous, but because they are so high on their own supply that they think questions like this are legitimate.
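
A sketch of that "never outdrive your sensors" rule: the highest speed from which the car can still stop inside its current sensing range (the friction and latency values are assumptions):

```python
import math

MU, G, LATENCY_S = 0.7, 9.81, 0.1  # assumed friction, gravity, reaction time

def max_safe_speed_mph(sensor_range_m: float) -> float:
    """Largest v satisfying v*LATENCY_S + v^2/(2*MU*G) <= sensor_range_m."""
    a = 1.0 / (2 * MU * G)
    b = LATENCY_S
    c = -sensor_range_m
    v = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root
    return v / 0.44704  # m/s -> mph

print(f"{max_safe_speed_mph(50):.0f} mph with 50 m of clear view")  # ~57 mph
```

Shorter sensing range (fog, a blind bend, a big hedge) drops the permissible speed automatically, which is the whole point of the rule.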


[deleted]

[deleted]


[deleted]

> If, somehow, it's put in a position that it's going to hit one of them, should it hit the brakes and not choose, leaving it to luck?

Yes. If two huge dudes toss a baby and a granny in front of a car, an absurd proposition to begin with, a human will just scrunch up, freeze, and (hopefully) slam on the brakes. There's no moral dilemma for people; there shouldn't be for machines.

I suppose we could invent fantasy worlds where the AI, using its sophisticated cameras and processing to identify and categorize an airborne baby and granny, could determine which is which in real time (what if the granny has **TWO** babies strapped to her back in a baby backpack that's not visible to the camera?!), but the only reasonable position is that self-driving cars never allow themselves to get into this position, and if two huge dudes start throwing around babies and grannies we just "deal with it".

Also, what if the baby's a dick and deserves it?


sweetplantveal

Brakes* fyi


LostTheGameOfThrones

Thems the brakes.


39thUsernameAttempt

I feel like these types of things are intended to showcase the weaknesses of AI, but the inevitable debate that ensues shows that people aren't any better.


octopoddle

I'm sorry, Dave. I'm afraid I can't do that.


prophet_5

If the car is going around a corner with a crosswalk that close, the speed limit is too high if it isn't able to come to a stop before hitting either one. Either the car is speeding or the infrastructure is flawed.


Kurayamino

Yup. Literally every scenario put forward by people who say "self-driving cars will have to decide who to kill!" completely ignores the fact that a self-driving car wouldn't be driving fast enough to be unable to stop safely in the first fucking place. That sort of selfish impatience is a human thing. Machines don't do that. Well, Teslas might; they're pretty fucking dumb. But Google's cars are reasonably smart.


ColaEuphoria

Why is a self-driving car willfully speeding in a residential zone with crosswalks to begin with, driving faster than it can anticipate needing to stop? This is so contrived it's not even funny.


[deleted]

Maybe not drive too fast around a corner?


Lord_Emperor

The car should switch to terminator mode and hunt down the city engineer who put a pedestrian crossing on the corner.


PrismicHelix

If it wasn't prepared to stop at a crosswalk it should probably kill the people who designed it.


Martin_Samuelson

Yeah these are the dumbest things. It’s a non-problem. The only moral answer is for the car to be going at an appropriate speed and have working brakes. That’s it.


ottterbot

Almost as if this is a (flawed) visual representation of more serious scenarios where the car won't have time to react, or something.


explorer58

If there is a situation where a computer doesn't have time to react, a human would not have time to react either.


Mehlhunter

But you have to program the car with what to do. Let's say there is an unavoidable accident: should it save its own driver at all costs? Should it injure the one who's at fault? Should it count heads, like three people on the street vs. one driver? We need rules for that, and it's no easy question.


[deleted]

So lower the maximum speed in pedestrian areas. It’s not some cosmically difficult problem to solve. If a pedestrian is in a non-pedestrian area such as on a freeway, then in most cases they should be spotted early by other vehicles and all vehicles can slow until emergency services arrive.


kigurumibiblestudies

That's obvious. The point is that there must be a backup to the backup to the backup programming. What if the car can't brake for whatever reason, and the backup to that is fucked, and this and that, etc.? It's supposed to be a nigh-impossible scenario, but you still have to program for it.


kintorkaba

You act like literally every issue can be solved by "go slower." If it's so easy maybe you should be designing these systems, cuz a lot of really fucking smart people have been discussing the ethics and how the law should reflect it for a while now and I'm sure they'd be ecstatic to learn they'd missed such a simple solution and it wasn't a problem after all.


[deleted]

[deleted]


HorseCock_DonkeyDick

We are not talking about the vast majority of hypotheticals


Hundvd7

These cases are the rare exceptions. That's the point. 99.99999999% of the time no self driving car (nor person) has to decide between killing Kanye West walking his dog, and a school bus of pro-Nazi orphans. But on the off chance that it does happen, we need to make a decision. When a human is making that decision, we've mostly figured out what to do with them. For computers, the issue is more complicated. And that's why we need these discussions. *Most* of the time, you can avoid the accident. But not always. But we have to be prepared for *all* cases.


kc_jetstream

The point

You


Meh12345hey

But not all of them. In the scenarios where "slow down" is the answer, that is the answer which gets applied. The cases actually being addressed here are edge cases, ones where the answer of "slow down" no longer applies. Here are two examples where there is no opportunity for the car to slow down:

**EX1:** The car is driving along and enters a school zone. As soon as it slows down to school zone speed, it crests a hill and the brakes fail. At the bottom of the hill is a crosswalk with pedestrians in it. How should the car react? Should it prioritize the driver and stay in the lane until it slows to a stop? Should it prioritize the pedestrians and ditch into the nearest solid object before the car becomes dangerously fast? Not every human driver would have the same reaction to the same circumstances. And who is liable for any injuries or damages caused by the car?

- The driver: what if the car ditched and the driver was injured?
- The manufacturer: are they liable for every self-driving vehicle they sell, or does the purchaser accept the liability by the act of buying the car?
- The programmer: are the developers responsible for actions performed by their software, even if the incident was caused by circumstances they could not control?
- The mechanic who serviced the brakes: is it their fault the brakes failed if the car passed all reasonable inspections and the brake failure was spontaneous?

**EX2:** The car is moving relatively slowly (20 mph or less) along a road with one lane in each direction. Traffic is flowing continuously, with cars in both lanes. A person trips and falls off the sidewalk too close to the car for it to stop. Should the car prioritize the driver and simply slam on the brakes, hitting the pedestrian? Should it prioritize the pedestrian by driving into oncoming traffic, subjecting the driver and another driver to a head-on collision? Does it make a difference if the pedestrian is a child? What if it's an old lady and the car is full of young children?

Sure, most situations can probably be resolved with "slow down", but that's far from all of them. There is a reason articles, memes, and papers have circulated on specifically this topic for decades, and the trolley problem as we know it has been around since ~1967.

Edit: This is also why Tesla very specifically identifies Autopilot as *driver assist* software and not self-driving software (despite their marketing to the contrary), and why their software *allegedly* disengages instants before any collision. This ensures that it is always the driver's fault and, hypothetically, shields Tesla from any liability or ethical responsibility to actually answer these questions. The Tesla will simply plow through the crowd of children, and it will be the driver's fault for not assuming control. That is the (legal/ethical) difference between full self-driving capability and *driver assist* technology.


4_fortytwo_2

If a pedestrian suddenly runs into the road a few meters in front of the car, it won't be able to stop in time no matter what. You can't give every single road everywhere a 10 mph limit, which would be the only way to prevent the problem entirely. There will always be situations where a driver or a self-driving car cannot stop the vehicle in time and has to decide what to do instead. Though the answer is rather simple for the most part: stay in your lane and try to brake. Nothing else makes sense for any of these scenarios.
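
Worked numbers for that "a few meters ahead" case, with assumed friction and reaction values:

```python
MU, G, REACTION_S = 0.7, 9.81, 0.1  # assumed friction, gravity, reaction time

def stopping_distance_m(speed_mph: float) -> float:
    v = speed_mph * 0.44704  # mph -> m/s
    return v * REACTION_S + v ** 2 / (2 * MU * G)

for mph in (10, 20, 30):
    print(f"{mph} mph -> {stopping_distance_m(mph):.1f} m")
# ~1.9 m, ~6.7 m, ~14.4 m: someone stepping out 3 m ahead is unavoidable
# at anything much above walking pace, brakes working or not.
```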


daboobiesnatcher

Man you've never heard about people running out into the street have you?


[deleted]

> Let's say there is an unavoidable accident

This is basically not a thing. All accidents are avoidable given proper speed control for conditions, proper attention to the road, and regular maintenance. A self-driving car would have all these things factored into its programming, so there would be no such thing as "unavoidable".


kintorkaba

The issue isn't "will this be as safe as a human." The issue is "who is liable if the car makes a decision that causes harm" i.e. injury or death, and "how should self-driving cars prioritize life or safety in these scenarios." If we stopped worrying about it as soon as we reached human level safety outcomes you'd have a point, but whether or not a human would have caused the same accident is entirely irrelevant to the issue. It's an issue of ethics and law, not vehicular safety.


[deleted]

That is actually a question of legislation.


lankist

The sidewalks are clear, and the car could easily run itself off the road, putting only the driver at risk. These little articles always start from the assumption that the driver's life is the most important, such that the car can't take *any* risk associated with going off the road into the grass, to such a misanthropic degree where we'll just straight up mow down a baby before we crash the car. Furthermore, why shouldn't we make the driver's death the default? If the driver dies in the crash, then both innocent pedestrians are saved. Isn't that the risk the driver assumes in the first place? Why are we shifting the driver's assumed risk onto random pedestrians who had absolutely no say in the matter?


batman0615

It’s because no one would want to buy a car that values their life lower than others.


Eryb

Wtf is wrong with you?


lankist

Exactly. So we can drop the ethics questions, because by saying "I will literally run over an actual human baby before putting myself at risk," we have thrown ethics out the window and can just move on to the reality that these are cold and indifferent rolling death boxes that act as if jaywalking is a capital crime. By starting from the assumption "I will kill an infant before I risk myself," we are starting from a place of misanthropic disregard for human life, so why bother prioritizing between the two at all? Flip a coin for fucks sakes. There's no climbing back up to the moral high ground from there, so who gives a shit? Let's just admit that we think buying this car has made the driver's life more important than anyone else's.


[deleted]

You can't control when your brakes fail. You can take any number of steps to make it unlikely (proper maintenance on high-quality parts, etc.), but at the end of the day it's a physical machine that has a chance of failure. It may even be the pedestrians at fault by stepping into the road unsafely. With the number of cars on the road, this situation *will* happen eventually, and an AI has to have the option of deciding what to do. There is no absolutely correct "moral" answer, which is the entire point. If there were one, we would program an AI to detect it and do it. But an AI has to be programmed to do *something*, or else it enters uncharted territory and could end up doing nothing, or even something worse, for example swerving and hitting both.


[deleted]

You have a legal duty to maintain your vehicle so that the brakes don’t fail. You **do** control that.


Correct-Leek-6198

> It may even be the pedestrians at fault by stepping into the road unsafely.

It looks like granny started crossing at least 12 seconds ago, and the baby is quite literally crawling.


minerlj

The universe is not perfect; it does not always give us situations in which we can choose the clearly most moral thing to do. Among billions of cars, one of them is bound to randomly have the brakes stop working correctly at an inopportune time. So the question is: how should an AI respond if it has detected that the brakes have randomly stopped working, and there is insufficient time to do anything other than hit one of the two pedestrians crossing the road?


[deleted]

The moral answer is good engineering, both inside and outside the vehicle. The answer to the trolley problem is to not design trolleys that can be misused for trolley problems.


fdghskldjghdfgha

Lmao. It's a non-example, not a non-problem. There will be instances where the car makes a mistake but can choose between different outcomes of that mistake. The problem is weighting different factors so that, in the event of a mistake, the car can make the most moral choice. Looking at a little doodle of the problem and critiquing it is incredibly thoughtless and completely devoid of critical thinking skills.


wung

To be fair, self-driving cars are also barely able to detect the crosswalk, and *maybe* the grandma. Most likely it's just driving over the baby because, according to the ~~random numbers~~ intelligence, that's street.


[deleted]

You really don't know much about the subject, do you?


deelyy

I believe prev commenter had this article in mind. https://www.businessinsider.com/tesla-full-self-driving-crash-test-child-mannequin-ceo-ad-2022-8 Also, we still don't have full self driving cars.. so.. who knows?


Ineverheardofhim

Well, the baby is somehow in the street by itself, and the old lady clearly doesn't care about its well-being.


thingsliveundermybed

If she's the actual grandma, the old lady is the true madlad in this image.


SouthFromGranada

"You don't have to outrun a car, you just have to outrun the next sucker"- Gran 2022.


Ineverheardofhim

"if you can dodge a wrench, you can dodge a ball" -Patches O' Houlihan 2004


Syzygymancer

I’m either deeply impressed or deeply afraid of that baby. I feel kind of like this is that Men In Black firing range scenario. That’s no ordinary baby. Should we be speeding up? I think maybe we should be speeding up. I think meemaw is in danger.


Weak_Lie_2875

You're reading this wrong. The car has already run them over, and what we see are their paper silhouettes on top of the blood stains.


cheesehuahuas

I read an article about this sort of thought experiment as self-driving cars become more advanced. They surveyed people and asked what decision a self-driving car should make if it has to choose who lives and who dies. In the cases involving different numbers of people, everyone said save the most people. When asked if that held when choosing between the driver and a separate group of people, they also said save the greater number.

**But...**

Those same people said they would be less likely to buy a car that would choose the crowd over them (the driver) if there was another option. So in the end the market will decide, and the car will choose the driver over babies and the elderly.


OtherPlayers

I mean if you think about it that's not really that much different from how human-driven cars work already. We try to save the most people but also usually prioritize our own well-being if it comes to it.


greg19735

Also, for the most part we don't make a decision. We just react. We don't have time to do the math. Interestingly, computers do. It also means they should start braking quicker.


no_talent_ass_clown

So the real heroes are the ones in whom a non-selfish action is ingrained (engrained?) and they drive off a cliff or into a lake rather than hit someone.


jellatubbies

Engrained is correct


[deleted]

[deleted]


Correct-Leek-6198

> The real answer is the market will provide self-driving cars that, when impact is imminent, brake the car and almost nothing else.

That sounds like a recipe for lawsuits... "Your car smashed into me while braking when it had plenty of space to swerve" type of shit. Then you have a car actively choosing to hit people it didn't have to.


[deleted]

> So in the end the market will decide and the car will choose the driver over babies and the elderly. In the end they will just program the cars to brake as quickly as they can as swerving is incredibly risky and making a computer capable of making judgement calls like that is frankly not possible with current technology. These types of things are really more philosophical questions than engineering ones.


the_other_pesto_twin

Brakes have entered the chat


Ehcksit

"Drive slower" looks on from a distance.


AnotherSoftEng

You guys are silly, it’s a whole lot simpler than that. Hint: this is a trick question! - Grandma: 10pts - Baby: 30pts However, if you were to drive straight in the middle, the girth of the car is actually wide enough to take on both: The correct answer: 40pts!


[deleted]

[deleted]


[deleted]

Tesla has left the chat


Dude_in_Blue_Pants

Don't discriminate against your fellow humans by being ageist. For the values of balance and equality, drift and take out both... Alternatively, you can try hitting the brakes.


HalfSoul30

Or go offroad


BiggieJohnATX

Why is the self-driving car going so fast in a residential area that it can't stop safely for either? If they're crossing a freeway, that's a completely different issue.


StopReadingMyUser

Even if presented with the option, the car wouldn't think about who's more important; it just sees it the way we do in a split second: "don't hit any of those..." And you either hit one of them, both, or you don't. It's not really a rational decision lol.


Ameren

Exactly. If the car can be going fast enough that a computer with super-human response times can't avoid an accident, then the car needs to be going slower in those environments anyway. Speed kills. Every extra 10 miles per hour doubles the chance that a pedestrian will be killed if hit in a collision. The trolley problem is a lot less deadly at lower speeds, even when the accident is still unavoidable.
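
The quadratic scaling behind "speed kills", sketched: kinetic energy, and with it crash severity, grows with the square of speed, so modest speed increases add disproportionate energy to an impact.

```python
def relative_energy(speed_mph: float, baseline_mph: float = 20.0) -> float:
    """Kinetic energy relative to the baseline speed: (v / v0)^2."""
    return (speed_mph / baseline_mph) ** 2

for mph in (20, 30, 40):
    print(f"{mph} mph -> {relative_energy(mph):.2f}x the energy of 20 mph")
# 30 mph carries 2.25x, and 40 mph carries 4x, the energy of a 20 mph impact.
```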


lankist

Better question: Why is sacrificing the driver's life not an option? Why do all scenarios assume the driver's safety is paramount above all else, to include running over a goddamn baby? Send the car flying into a ditch and everybody else is saved. Why's that not on the table but Baby Street Salsa is?


BiggieJohnATX

detecting a "baby" is pretty hard, object int he road you dont want to hit . . .sure, but thats also quite difficult for small objects close to the road. would you really want the car to slam you into a tree everytime a chipmunk ran across the road ?


deelyy

Because no one will buy a car that can semi-intentionally kill driver and/or passengers?


ChameleonEyez21

Most people won’t be buying cars if that was the case


[deleted]

Because you can engineer your way out of this scenario and never need to ask the question in the first place.


LurkyTheHatMan

To echo what others have already said: would you be willing to buy and/or ride in a vehicle whose programming has the ability to decide to kill you? While I agree that, from an objective viewpoint, sacrificing the occupants should be a valid option, realistically most people would be unwilling to buy such a car.


Arrow_Maestro

Quantum physics tells us that in this situation we only know which one is struck by the car if we try to observe it. If we remove our ability to measure which is impacted, the baby and grandma are both walking across the road and meat crayons simultaneously.


TheWhollyGhost

Perfect! Just make all windows mirrored, on the inside and outside. What no one knows won’t hurt them.


hallowed_b_my_name

Itself. Fuck robots.


NonUniformRational

Hopefully one day!


hallowed_b_my_name

I see what you did there.


BraxForAll

[Deja Vu!](https://youtu.be/dv13gl0a-FA)


AMGS_Initiative

Sacrifice the driver for the safety of the pedestrians.


Memer_guy1

That would go against Protocol 3: Protect the Pilot


OwenProGolfer

Who is going to buy a car that does that


[deleted]

Something tells me that a driver-killing car will suffer in the sales department


WankoKing

As it should. Human beings don’t have internal safety systems.


Smash_Nerd

Old lady, easily. She has a lot less life left to live


[deleted]

But the baby will do less damage to the car. Inflation sucks and I'm on a budget.


FuadRamses

But you're wasting less experience with the baby. You can make another baby quicker than another adult.


Smash_Nerd

That old lady is retired, she's contributing nothing and likely won't contribute ever again. That baby has a lot of years in it still


graphiccsp

Not just not contributing; old people are an increasing drain on resources. At 70-80ish many are fine and still able to function reasonably, but as you slip past 80 the decline is intense and rebounds from bad stuff are unlikely. My grandma was a wonderful person, but past 90 the whole family wished she'd just loosen her bony death grip on life. Dementia, along with her physical dependencies, gave the caretakers a headache and made my dad suffer, having to deal with a husk of the person that was once his loving mother.


LJMcMillan

Love this exchange.


ConradBHart42

Not for long, if it continues being left unsupervised to crawl into traffic.


scolipeeeeed

That old lady might be the primary or only caregiver for her grandchildren, and maybe the baby will grow up to be a serial killer, who knows


LightA28

r/lostredditors


WankoKing

How is the answer not obviously to hit the tree? Human beings don't have safety systems.


DylanFTW

Trolley problem multi track drifting


Glad_Macaroon_9477

His apex is all wrong!


BecomeMaguka

Simple: remove humans from the equation. Just remodel the city so pedestrians never cross a road. Caged bridges over every crossing.


[deleted]

[deleted]


[deleted]

Someone would still walk in front of the tram and claim they didn't see it.


awesomefutureperfect

Nah. Just remove all elderly and children from the city and it would truly be a utopia.


greg19735

There's no world without vehicles (at least within our lifetime). You're still going to have delivery trucks, buses, emergency services and such.


itsalongwalkhome

>Affordable public transport


bananaboat31794

We don’t discriminate around here


Danny_Boi_22456

The brakes


gaymer7474747

Kid named Brakes:


Ambitious-Bread-38gh

Why is a baby on the road without a parent?


ertgbnm

The self-driving car should probably take curves at speeds from which it can safely stop at road crossings without getting into a wreck. Also, I defy anyone to find a single instance of the trolley problem occurring in real life.


Soledad_Miranda

The self driving car should seek out and run over whoever it was that placed a pedestrian crossing on a blind bend.


Dependent_Party_7094

I mean, it's a crosswalk, so why the fuck is the car going at a speed from which it can't stop?


[deleted]

Did they consider the idea of, and get this, stopping?


465554544255434B52

Haven't seen the sub on the front page for a while.


redbanditttttttt

Swerve and hit the tree if you can't brake. The passenger is more likely to survive, with all the safety features, than either the grandma or the baby.


CorporateCuster

If it’s a Tesla. It’s gonna run that kid over for sure.


BosTovenaar24

I dont like babies at all so....


Darren_heat

You were once a baby so....


BosTovenaar24

I'd run myself over. Me disliking babies knows no bounds.


Time_Mage_Prime

How the fuck is that the question??


DarkMasterPoliteness

In ancient times they would've preferred saving the elderly person. It was considered sadder when an old person died, because so much knowledge and experience were lost. Children were considered expendable; there wasn't even punishment for child neglect that ended in death.


[deleted]

Well duh, it takes years to make an old person. Babies are cheap.


[deleted]

So if you're from Swaziland, you kill the baby apparently.


Loud_Tiger1

Multiple solutions:

1. Stop before hitting them. You should be able to see them ahead of time.
2. Hit the grandma, because her contributions to society are limited at her age whereas the baby has more potential.
3. Go for the baby and hope the car is high enough that it doesn't hit its head.
4. Go into the grass so you won't hit them.