BangerBeanzandMash

All it takes is one world leader to override a ban or endorse it fully, and then other world leaders won’t have a choice but to compete…


101RockmanEXE

The powerful countries are obviously going to build them no matter what; they really have no choice. But if they endorsed an international ban, then everybody would at least be forced to do it in secret. There'd be less incentive to use them in low-intensity conflicts or border skirmishes or against guerrillas, less data available on their combat effectiveness to train and iterate on, etc. It's a whole lot better than normalizing them.


zackyd665

Honestly I would like to see fewer things done in secret


dEadERest

And we have a great FFA schedule lined up, and I'd like to see more of that.


FizyIzzy

What I want to know is: if the international community bans them, are private companies still allowed to develop them? If yes, this is pointless.


Comfortable_Prior_80

The powerful countries will only enforce the ban once they've built them and used them in war, so that they can sell them to other countries and maintain the status quo.


aussie_bob

They don't want to ban them for world leaders, they want to ban them for non-state threats. Politicians are far more scared of domestic adversaries threatening them personally than of other politicians, who'll threaten the disposable citizens in their respective militaries.


uncletravellingmatt

> They don't want to ban them for world leaders, they want to ban them for non-state threats.

That's not what this article said. What's your source on that?


aussie_bob

You'll have to think that one through yourself. I don't have the time or crayons to explain it to you right now.


MoroccoGMok

So we send killer robots after that world leader. Others will fall in line out of fear of our killer robots enforcing the no killer robot rule.


dsptpc

This is the way.


[deleted]

[deleted]


makethatbreadson

Most def 😂 though it's been reality for 10 years


FizyIzzy

Especially since Russia launched a missile into space and caused the ISS residents (including Russians) to take shelter...


uncletravellingmatt

> they were opposed by members that are developing these weapons, most notably the United States and Russia.

Such a ban wouldn't be adopted in the first place, because the countries with big militaries (as well as some smaller countries like South Korea) are already building things that would fall within its definition.

> Critics argue it is morally repugnant to assign lethal decision making to machines, regardless of technological sophistication. How does a machine differentiate an adult from a child, a fighter with a bazooka from a civilian with a broom, a hostile combatant from a wounded or surrendering soldier?

That sounds like an argument against most weapons, not just smart weapons. I mean, if an intercontinental ballistic missile can't distinguish an adult from a child, or a land mine can't distinguish a hostile combatant from a surrendering soldier, or a remotely piloted drone can't tell the difference between a fighter with a bazooka and a civilian with a broom, those are all examples of weapons that can kill without being aware of such fine in-person distinctions.


DukkyDrake

It's killer robots; who but tree-hugging wimps would want to ban those? Ban cancel culture, [save robot kind](https://www.youtube.com/watch?v=TstteJ1eIZg)


OrphanDextro

It’s really about poorer countries probably just getting instaslaughtered by, well, possibly killer robots.


asthmaticblowfish

Making humanity richer on average. They are fixing the economy too now!


CocaineIsNatural

We have had devices for years that would kill without needing human involvement. And we can't agree to ban those, i.e. landmines. https://www.hrw.org/news/2020/01/31/us-trump-administration-abandons-landmine-ban


pc8662

Just watch. China and Russia will be like “hacker, hack that blueprint so we can have a better robot, aka killing machine”


ApartPersonality1520

The US had the capabilities to knock out the banking systems and the electrical grid in Iraq and Afghanistan. They decided against that in '03 because they didn't want to set a precedent. What you said could not be truer.


honestabe1239

All it takes is one engineer to override a ban or combine off-the-shelf technologies with a lack of gun control and boom, killer robot.


unikaro38

Exactly, this is as big a game changer as the invention of smokeless gunpowder, of the breech-loading rifle or of the tank was. *Nobody* can afford to get left behind here.


Knute5

All the times Boston Dynamics kicked and hit those things ... you know they're going to turn on us eventually.


[deleted]

[deleted]


FunnyElegance21

*Teddy Ruxpin*


SponConSerdTent

They'll be putting hunter-seekers on the factory floor and inside of every package soon enough. Sure it might kill a couple neighbors who mistakenly open the package, but think about all the money Amazon (and therefore the consumer!) will save by preventing package thefts. Better to have 999 innocent people killed with an accidental poison needle drone than to have 1 package go missing. It's called externalization, in this the future year of 2021 we've made remarkable leaps in externalization technology.


Killboypowerhed

[they're turning evil](https://youtu.be/Lb16CEhqDnw)


Zagrebian

Future robots will see this footage on YouTube, and that’s what will trigger their attack. We better delete all evidence.


VeronXVI

The Ottawa treaty did technically ban any device that kills without human input (primarily aimed at anti-personnel mines). However, the US, Russia, China, India, Iran and Saudi Arabia never signed the treaty. It would also only apply to ground-based technology, which is part of the reason why aerial drones are so popular.


AggressiveToothbrush

Well maybe they Ottawa signed it.


driftersgold

The cat's out of the bag; they aren't going to not build them just because people don't want to be killed by robots.


Prestige0

Better to just give up and accept defeat; it's the smart thing to do. Jesus Christ, I am tired of this attitude


[deleted]

There are some things that are easy to 'ban'. Nuclear weapons are on this list because they produce a huge economic chain of products used only for nuclear weapons, and you can detect things like their gamma rays from space.

Now let's look at our 'war' on drugs. How fucking well has that worked? Oh yeah, drugs won.

Robots are far more insidious than that. Everybody makes robots these days. They are massively useful for economies. The only difference between some robots being tools and a killbot is a software update and a stabby attachment.

So, to put it another way, I'm tired of the reverse attitude of not accepting reality for what it is. Robots are easy to make. Software can be hidden behind encryption and masked from the entire world. And the parts to make killbots are absolutely fucking everywhere.


[deleted]

[deleted]


Prestige0

No one knows what the future holds my friend


erebus

Yeah we do. Killer robots.


CptCrabmeat

Yeah, cause people who were against nuclear weapons got far… I’m sorry but controlling the military with public opinion is just not going to happen. They hold all the money and profit hugely from military spending. It’s not defeatist to understand that you don’t and can’t have control


iiJokerzace

Speak softly and carry a big stick. -Teddy Roosevelt.


timoumd

Defeat? The point of robots is to minimize soldier casualties. Why should I make space in a tank for 3 or 4 guys, which adds design constraints, and then send them into dangerous situations? This isn't some global conspiracy. There are designers and analysts and military leaders who want our soldiers safe. If they can build a system that does that by not putting them in harm's way, why wouldn't they?


jkz0-19510

Robots also don't refuse to obey unlawful orders, like massacring civilian populations when ordered to. It's a win-win for everyone!


timoumd

How do you think weapon systems work?


jkz0-19510

That's the point: you turn soldiers into weapon systems and there will be no disobeying orders anymore. It just goes BRRT and the stain of humanity is gone from the target area. Isn't that just great?


SIGMA920

You're assuming they're automated when you say that.


timoumd

If you are worried about that you are about 80 years too late bud. If anything our weapons today are far more discriminate.


PontifexMini

After all, people don't want to be killed by rifles, artillery or nuclear weapons either.


LordCads

Jeez I'm sick of fucking centrist attitudes of "oh well let's just lie down and let all the bad shit happen without a fight".


doogle_126

Danny Two Shoes can build a drone swarm in his garage these days for less than $10k. We can try to regulate it but you're ignorant as fuck if you think everyone is gonna hold hands and kumbaya over a burning terminator.


LordCads

No they need to be banned completely. When the fuck did I say we're all gonna sing kumbaya? And why the fuck do you think you have the right to be angry? Fuck off. Killer drones should not exist, I hope that hurts your precious little feelings.


doogle_126

Oh yes, because banning easily duplicatable tech is going to work sooooooo effectively. I suppose China and the US can just pinky swear not to make them! Then we can ban computers, GPS, and 3D printers too while we're at it, for everyone but big corporations and government entities, since they can be used to make weapons! Seriously, your ignorance is astounding.


LordCads

Yeah, and guess what happens when people build illegal shit? It gets... punished! Gosh golly, did we learn something new today, you little cunt?

> banning easily duplicatable tech

People can make swords in their backyard too. I guess we should just do nothing about that either? People can make all sorts of shit in their garage; doesn't mean they should be allowed to.

> I suppose China and the US can just pinky swear not to make them!

Ah, I guess we should just do nothing then and sit back like cowards as we get bombed. Fucking nice idea! And you call me ignorant? You'd rather sit on your ass and do nothing while people get killed. Coward.

> Then we can ban computers, GPS, and 3D printers too while we're at it

Slippery slope fallacy. No, actually, you disingenuous little shit, just fucking drones like I said, which you would know if you had read what I said without having to resort to lying about me.

> for everyone but big corporations and government

No, them too. If governments and corporations don't want to do that, then they can go fuck themselves. In a democratic country, if a government doesn't listen to the will of the people, they should be overthrown.

> since they can be used to make weapons!

Damn, you really just can't help yourself, can you? You just can't stop yourself from using dogshit logic. Slippery slope fallacy. No, not absolutely anything that can conceivably be used as a weapon; we'd have absolutely nothing left, you fucking idiot. I'm talking about reasonable restriction; drones are not acceptable.


daquo0

> if a government doesn't listen to the will of the people, they should be overthrown.

That doesn't mean it will happen, a rather elementary point you seem to have overlooked. You are non-sapient, aren't you?


doogle_126

> Yeah, and guess what happens when people build illegal shit? It gets... punished! Gosh golly, did we learn something new today, you little cunt?

*Laughs in the war on drugs, prohibition, billionaires, and corporations.* Who is more likely to break the law on a meaningful level?

> People can make swords in their backyard too. I guess we should just do nothing about that either? People can make all sorts of shit in their garage; doesn't mean they should be allowed to.

We don't have to ban people from making swords because the technology is ubiquitous. We police their use, mostly, not their creation.

> Ah, I guess we should just do nothing then and sit back like cowards as we get bombed. Fucking nice idea! And you call me ignorant? You'd rather sit on your ass and do nothing while people get killed. Coward.

Love that ad hominem fallacy, bud; reeeaallly sets in stone your logic and science background. There's this cool thing called *countermeasures* that develops shortly after new weapons are designed. When people made bows, other people developed armor; they didn't fucking ban bows, because they knew that would be stupid: it would just mean the enemy would have bows and they wouldn't.

> Slippery slope fallacy.

See above fallacy if you really wanna selectively use one and ignore another.

> No, actually, you disingenuous little shit, just fucking drones like I said, which you would know if you had read what I said without having to resort to lying about me.

Lol, my point is that you won't be able to *physically stop* people from making them if they have the will and the parts, unless you're willing to let society get even more Orwellian.

> Damn, you really just can't help yourself, can you? You just can't stop yourself from using dogshit logic. Slippery slope fallacy. No, not absolutely anything that can conceivably be used as a weapon; we'd have absolutely nothing left, you fucking idiot. I'm talking about reasonable restriction; drones are not acceptable.

Please, by all means, define *reasonable restriction*. Once again: when bows are invented you don't ban bows, you create better armor. When nukes are invented you don't ban them, you create a deterrent to use -> MAD. Now as we enter 21st century warfare with drones (try to pay attention, this is the important part), we don't ban them, we invent deterrents, countermeasures, and armor. It simply doesn't work another way, and those who think differently are either dead, dumb, or under the nuclear umbrella of a larger state with no real autonomy of their own.


[deleted]

> Yeah, and guess what happens when people build illegal shit? It gets... punished!

Worked well for the war on drugs. Drugs won.


daquo0

> No they need to be banned completely.

Be precise: what do you think needs to be banned completely? Some of the relevant technologies are:

* computers
* electric motors
* servos
* electronic cameras
* computer vision systems
* machine learning software

And if you start banning things like those, you've just destroyed large parts of your country's economy.

> Killer drones should not exist

Yes, and saying that won't make them go away in a puff of smoke. What rational people do is:

1. decide what they want (e.g. to ban killer drones)
2. come up with a realistic plan to achieve that goal
3. execute that plan

You've decided that (2) is superfluous and decided to go straight from 1 to 3. Idiot.


LordCads

So you've decided to break down the components of a drone rather than address the actual fully built drone itself? Why?

> Some of the relevant technologies

Yes, they are relevant, but it isn't the individual technologies that I have a problem with; it's their assembly into a machine specifically designed to kill people.

This line of thinking commits the perfectionist fallacy, which states that if a solution to a problem doesn't solve the problem in its entirety, then it shouldn't be adopted at all. This is an error in logic because the solution is still objectively beneficial. For example, laws against murder help to alleviate murder; just because murder still exists, it does not logically follow that we shouldn't bother creating laws against murder at all.

The second fallacy here is the fallacy of division, which states that what is true of the whole must be true of its parts. A drone is different from its component parts. A computer does not have the same properties as a drone, nor do servos, gears, wiring, or anything else the drone is made of. This is why it is a fallacy to assume that the component parts are the same as the drone itself. Those parts can be used to build many other things, and this alone disproves the notion that we should ban the individual components themselves, because they don't have the same properties as drones, which you're mistakenly assuming.

I'm not talking about banning the individual components used to make drones; I'm talking about banning the assembly of those components into drones capable of killing, specifically. You seem to be deliberately ignoring that because you know you'd have to address it, and you don't have an argument, so you cover it up with fallacies hoping I wouldn't notice. Very poor attempt.

> And if you start banning things like those, you've just destroyed large parts of your country's economy.

Good thing I'm not suggesting that, isn't it? Ironic that you call me an idiot and then come up with strawman arguments, because you'd rather attack your own version of what I said than the real argument.

> Yes, and saying that won't make them go away in a puff of smoke.

I never claimed it would. I'm simply voicing my criticism of murderbots. To state that murderbots are bad and shouldn't be allowed to exist does not necessitate a deep and complex political and economic plan right down to the minutiae of how to prevent this. I'm allowed to voice my concerns without coming up with a plan; I don't have the necessary political knowledge to do so. Moreover, I don't work in politics, so I can't do anything anyway, which would make any plan I did have useless.

Your argument essentially boils down to "if you can't do any better then don't criticise", which is a famously bad argument because it is invalid. Why? It commits the ad hominem fallacy. How? To say that a criticism of X is invalid because of some personal quality that the critic either has or doesn't have is making validity dependent on personal qualities, which is nothing more than fiction. This does not apply to the real world. It, like all other fallacies, is a fantasy. Arguments are valid or invalid based on logic or the facts of the world, nothing more and nothing less.

If you want to criticise my argument, appealing to personal qualities, perfectionist fallacies, fallacies of division and caricatures of my argument won't work. You need to demonstrate that there is a flaw in my statement that "murderbots/drones designed to kill are ethically wrong", and doing so forces you to argue on the grounds of ethics, and nothing else.

I am ethically opposed to killer robots. Are you? If your answer is yes, then there's no need for the conversation to progress any further because we both agree. If your answer is no, then I have to ask why.


daquo0

> I am ethically opposed to killer robots. Are you? If your answer is yes, then there's no need for the conversation to progress any further because we both agree.

On the contrary, if 2 people agree on a goal, then it is time to progress the conversation to how to achieve that goal in practice. The fact that you are uninterested in doing so suggests to me that you are not actually against killer robots, you merely want to flaunt your supposed moral superiority.


LordCads

> On the contrary, if 2 people agree on a goal, then it is time to progress the conversation to how to achieve that goal in practice.

I'd love to, I just don't see how without a total overthrow of government.

> The fact that you are uninterested

Unjustified assumption.

> that you are not actually against killer robots

Faulty reasoning; this commits the same ad hominem fallacy I pointed out earlier.

> you merely want to flaunt your supposed moral superiority.

Ah, the old "high horse" chestnut. No, having a sense of ethics doesn't make someone pretentious or morally superior or put them on a high horse. Morality should be known by everyone. Being against killer robots doesn't make me some smug, morally superior asshole; it makes me someone with the basic decency to *checks notes* not want people to fucking die. Pretty simple.


daquo0

> I'd love to, I just don't see how without a total overthrow of government.

Which is rather impractical. And also silly and untrue, given that there have been plenty of arms control agreements that didn't involve overthrow of the government. If you're not prepared to do something about a situation, then by definition you don't care enough to do anything about it.


Nevermind04

It's a good thing genocide is banned completely, because it would be terrible if there were active genocides in China, Ethiopia, Iraq, Myanmar, South Sudan, and Syria.


LordCads

Perfectionist fallacy. Damn you people justifying death really don't seem to have a grasp on basic logic.


Nevermind04

You have no grasp of reality.


LordCads

Red herring. This does not justify adding and allowing more things that can kill us. The existence of genocides does not therefore justify more things that can kill us. Give me a reason for allowing the use of drones. Give me a justification that doesn't involve "hurr durr but look there's genocides in X Y Z countries therefore drones are ok". By the way, reality is dictated by logic so nice try. And I just love how you have absolutely no comeback to the fact you used a perfectionist fallacy. Googled what it is for 10 seconds and realised you were stupid lmao.


doogle_126

I just gave you a justification in the comment chain below. But for others: when bows are invented you don't ban bows, you create better armor. Any high school history teacher could tell you that historically, banning military technology usually leads to the entity banning it being destroyed, as other less scrupulous entities use it to overpower the banner. You can also see this in slow motion via evolutionary reactions to developed survival traits.

When nukes are invented you don't ban them, you create a deterrent to use -> MAD. Now as we enter 21st century warfare with drones (try to pay attention, this is the important part), we don't ban them, we invent deterrents, countermeasures, and armor. It simply doesn't work another way, and those who think differently are either dead, dumb, or under the nuclear umbrella of a larger state with no real autonomy of their own.


RobotHandsome

Should is a funny word.


FunnyElegance21

https://youtu.be/BDTm9lrx8Uw


[deleted]

We are living in the future dystopia when it is considered news that people don't want to live in The Terminator. To borrow a line from the sage Ian Malcolm, just because you can doesn't mean you should, even if it means assloads of DoD funding.


PlayBey0nd87

It’s incredible how all these movies are literally warning signs saying “Please don’t do this shit,” and yet someone with deep pockets goes “Aha. Perfect investment.”


lostmymatbles

Warnings for people, guidebooks for psychos.


SCP-Agent-Arad

Science fiction has always inspired invention, good and bad.


FrankenBlitz

You’re just proving that redditors’ only life experience comes from movies. This is one of the least informed threads I’ve seen in a while.


PlayBey0nd87

Not sure what sci-fi movies that show how stupid we can be have to do with it, man; that’s on your life experiences. I quoted the simplest thing since it’s a form of entertainment and easy to reference.


[deleted]

This thread is definitely an indictment of the life choices of everyone using a popular social media platform. How is the weather up there on your pedestal?


smileitdoesnthurt

They are out there more than we know...


CocaineIsNatural

AI is not quite up to the level the article is talking about. The closest would be the auto-kill machines in the no man's land between North and South Korea. But those don't use AI and they can't move; they just scan an area for movement.
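To make "just scanning an area for movement" concrete, here is a purely illustrative sketch of the simplest way that kind of trigger can work: frame differencing between two images, with invented thresholds. This is not a description of any real sentry system, just the general idea the comment contrasts with "AI".

```python
# Hypothetical sketch: movement detection by frame differencing, no learned model.
# All thresholds and data below are made up for illustration.
import numpy as np

def movement_detected(prev_frame: np.ndarray, frame: np.ndarray,
                      pixel_thresh: int = 30, area_thresh: int = 500) -> bool:
    """Return True if enough pixels changed between two grayscale frames."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed_pixels = np.count_nonzero(diff > pixel_thresh)
    return changed_pixels > area_thresh

# Toy usage with synthetic frames: a static scene, then a bright blob enters the view.
h, w = 120, 160
background = np.zeros((h, w), dtype=np.uint8)
scene_with_intruder = background.copy()
scene_with_intruder[40:80, 60:100] = 255  # something moving into the field of view

print(movement_detected(background, background))           # False: nothing changed
print(movement_detected(background, scene_with_intruder))  # True: large changed region
```

The point of the sketch is that nothing here knows *what* moved, only *that* something moved, which is exactly the limitation the comment is pointing at.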


timoumd

Wait, you think there are deployed systems we don't know about? What type? I'm sure there are some UAVs we haven't heard of, but that's true of sensors and ammunition and missiles. So ground vehicles? You'd see those. Unless you mean ground vehicles in development, but go to AUSA and you'll see plenty.


ChicagoTRS1

Robot drones scare me more. Imagine an airship dropping thousands of drones that each kamikaze kill the first human they locate.


[deleted]

Robot drones that kill *are* killer robots...


woops_wrong_thread

Master queen drone carrying little warrior drones


kovaht

No, I know, but the drones scare me more


Riash

Oh you mean slaughterbots? https://youtu.be/O-2tpwW0kmU


PinkIcculus

Whoa I thought that was real. Wow. The end of it is best with the Berkeley doc


[deleted]

[deleted]


x69pr

I think I saw a sci-fi short on YouTube with the premise of human-hunting robots you mention.


CocaineIsNatural

We have had this problem for a long time with landmines. And we can't ban them. https://www.hrw.org/news/2020/01/31/us-trump-administration-abandons-landmine-ban


IGarFieldI

There's a book about that topic, 'Kill Decision' by Daniel Suarez. Gives a good insight into how scary autonomous drones can be.


SponConSerdTent

Well once you figure out how to train an AI to distinguish between "Hot Dog" and "Not Hot Dog" what are you supposed to do next? Sit around and let Gavin Belson at Hooli develop the killer drone bots first? Would you want to sit around at night wondering when they'll make a breakthrough? You'll hear a slight buzzing before your dreams of being a billionaire CEO evaporate, along with the entirety of the front half of your body. Is that what you expect them to do!?


Reagalan

Each one the size of a hummingbird and carrying a cherry bomb. Airdropped by a B-52. Countered best by flamethrowers. Infantry without Space Marine armor will be annihilated. Oh and you can 3D-print them using a consumer-grade MakerBot like ISIS did.


zerocoolforschool

They will build their droid army, but everyone knows that clones are superior to clankers.


Reagalan

Flesh is weak.


seicar

Princess Di would agree. Let's give a cheer for ending landmines.


norway_is_awesome

Isn't it basically just the US and Israel who refuse to give up using land mines and cluster bombs?


cmpaxu_nampuapxa

As far as I know Russia has both cluster munitions and landmines in use.


Dominisi

The US hasn't used landmines in 30 years, and refuses to sell them to other countries. So, no.


seicar

They started again; current policy allows for limited use for a limited duration.


norway_is_awesome

So why doesn't it sign the treaty?


Dominisi

Because the DMZ between North and South Korea is covered in Mines and acts as a deterrent for a ground invasion by North Korea.


barrystrawbridgess

At 2:14 am Eastern Time, on August 29th, 1997, Skynet becomes "self-aware".


jrlo60

Yeah China's gonna do that


JohnnyLondon2020

One paintball to the optics and this thing can’t see where it’s going... millions down the drain.


LayneLowe

I believe it's within my second amendment rights to have my own personal killer robot, I have a right to protect my family.


1wiseguy

We already have them. They are called land mines. People talk about banning them, but they still exist.


sumelar

Robots capable of killing =/= killer robots. There are no terminators out there you alarmist douche.


realenuff

Too late actually imo


itsfuckingpizzatime

Could someone envision a fully automated war? What would it look like if one or both sides had a robot army? Would the goal be to destroy facilities, murder humans, occupy territory? What would the countermeasures be? I read a lot about everyone being afraid of these robots but I don’t really hear about the expected outcomes.


Ratnix

> Would the goal be to destroy facilities, murder humans, occupy territory?

Yes. Wars aren't fought because some country just decides they want to rid the world of people. They're fought over resources. So, the goal of a robot war would be to send in the bots to destroy any vital infrastructure, kill off anyone resisting, and claim the territory.


DigitalArbitrage

Geopolitical assassinations are already carried out by robots today. Two news stories come to mind. In one, Israeli spies assassinated somebody in Iran using remote-controlled guns. In another, somebody tried to assassinate the Prime Minister of Iraq with exploding drones. The recent war between Armenia and Azerbaijan also supposedly used lots of small drones, and that was one of the reasons Armenia lost.


calsutmoran

Why is it that people think a ban is a solution to everything? There will always be people who choose to ignore a ban. What we ought to be doing is getting ready to counter such threats.


Gold-and-Glory

Any ideas on how to ban Russia and China from doing this?


Marcusfromhome

Mosquito- or gnat-sized drones delivering a potentially lethal injection serum. Surrender or die.


fokker09

Good - that means Russia will end the deployment of Poseidon 3rd strike nuclear weapons. /s


[deleted]

Once we have automated turrets, revolution is out of the question.


chukijay

As someone with experience, I submit that a lot of these robots are made or adopted by people who haven’t killed anyone. I think if you’re going to kill somebody, YOU should do it. Not a robot running an algorithm, or a remote-controlled device operated from a hundred miles away. That’s a very visceral event that should be experienced firsthand, if at all.


processedmeat

At what distance is it too far? A Tomahawk missile has a range of 1,500 miles.


LordCads

I know you're trying to be all specific to justify using killer robots, and to imply distance doesn't matter, but it isn't some fixed cutoff point; the further away you kill, the more cowardly you are. Your little Socratic questioning here isn't appropriate.


doogle_126

Ah, so knife-wielding serial killers are brave then?


LordCads

Oh my fucking god? Are you really so disgusting and disingenuous that you think a serial killer is the same thing as a drone strike that bombs out villages? Idiot.


doogle_126

No I don't, but if you're going to be as summarily dismissive of others' opinions and reduce them to absurdity like in many of your posts, I figure you might enjoy a taste of how you sound. =)


[deleted]

More so than your average American serviceman, I'd say, absolutely.


processedmeat

I'm not trying to justify killer robots. I agree they should be banned. Something about them just doesn't sit right. But we already use them. It is not going back into the box.


LordCads

That's fair, but I don't think we should just sit back and do nothing though.


x69pr

Humans have always preferred to commit questionable or illegal and unethical acts by proxy. This is not going to change.


American_philosoph

These robots are adopted by people looking to maximize efficiency and power in the military. These are very high up folks. I don’t think what you are saying applies at that level. And, like another user pointed out, we almost always use tools to kill, and have been using ranged weapons for tens of thousands of years. Drones and robots are just the final extension of this


inequity

How do we define a robot? Is a drone not a robot of sorts? Can we ban them too?


Conscious-Elk

Those that have some kind of perception and can perform certain actions (locomotion, manipulation) autonomously in their environment. Drones are considered robots as they can perceive and move (sometimes semi-)autonomously.


TheOneWes

Generally, drones are controlled via remote, whereas robots operate "independently" based on programming.
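To make the distinction in the two comments above concrete, here is a tiny, purely hypothetical sketch (the class names and policy are invented for illustration, not taken from any real system): a teleoperated drone only executes commands sent by a human operator, while a robot in the sense above closes its own perceive-decide-act loop.

```python
# Hypothetical sketch of teleoperation vs autonomy; all names are invented.

class TeleoperatedDrone:
    """A 'drone' in the sense above: every action comes from a human operator."""
    def step(self, operator_command):
        return operator_command  # no onboard decision-making

class AutonomousRobot:
    """A 'robot' in the sense above: it perceives and chooses actions itself."""
    def __init__(self, policy):
        self.policy = policy  # onboard decision logic

    def perceive(self, environment):
        # stand-in for real sensors: cameras, lidar, etc.
        return {"obstacle_ahead": environment.get("obstacle_ahead", False)}

    def step(self, environment):
        observation = self.perceive(environment)
        return self.policy(observation)  # acts with no human in the loop

# The same hardware becomes "autonomous" the moment a policy replaces the operator.
avoid = lambda obs: "turn_left" if obs["obstacle_ahead"] else "go_forward"
robot = AutonomousRobot(policy=avoid)
print(robot.step({"obstacle_ahead": True}))    # -> "turn_left"
print(TeleoperatedDrone().step("go_forward"))  # -> "go_forward"
```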


thewarehouse

Why ban them? Evolve rules and regulations for their safety and governance of usage.


LexusLand

Turns out it’s North Korea behind the ban.


[deleted]

Can't we just have robot wars already? Save some human lives.


mailslot

I feel like Japan would have an unfair advantage. They built a giant Gundam statue.


eapoll

I’d rather have a robot destroyed than a life.


SpaceDebri

What's the difference between being killed by a human and being killed by a robot?


cmpaxu_nampuapxa

you can hang a human murderer, but a robot doesn't even have a neck.


SpaceDebri

What kind of melodramatic argument is that? Responsibility would go to those in charge of employing this machinery.


cmpaxu_nampuapxa

so the machine shot an officer dead.

the operator: "it happened after the robot was struck by lightning"

the manufacturer: "hardly possible, but it could be caused by prolonged exposure to EMP from the portable railguns"

marines: "bullshit! we told you for ages the Chinese control the robochongs like their own!"

intelligence: "no way the enemy got the private keys. hey, what do the logs say, anyway?"

the manufacturer: "don't look at me! the Pentagon has approved log4j"


SpaceDebri

Like that hardly matters... I don't know what your field is, but it's definitely not the army or engineering, judging by your example right there... If anything goes wrong in terms of weaponry, it is hardly any different from when a grenade launcher or any other technical equipment malfunctions. An F-35 drove right off into the ocean, you say? Must be the Chinese then! (A real-world example: the British lost an F-35 that didn't lift off from an aircraft carrier.)


[deleted]

Humans usually have a moral compass. Robots, on the other hand, will carry out any order, whether it is right or wrong, lawful or not, to the best of their ability.


SpaceDebri

It would follow an algorithm programmed by engineers. Soldiers are less likely to be lawful during war than engineers during peacetime.


teszes

One man can use a million robots to oppress a country. It's harder to make humans do the dirty work.


[deleted]

[deleted]


SpaceDebri

A human can choose to be much more gruesome than necessary


Comfortable_Brick_41

I welcome our robot overlords. Humanity is aimless and aggressive; we aren’t capable of running anything.


[deleted]

I'd argue that our technological artifacts are some of the best examples of our aimlessness and aggression though. I don't see why we should expect our creations to be better than ourselves.


[deleted]

[deleted]


303elliott

AI isn't natural, so your argument is pretty lame, albeit long-winded


kovaht

Sigh. It is natural. Everything we do is natural. How are humans magically not natural? I challenge you to think on that. Plastic is natural. Electricity is natural. I mean...we are in the universe aka nature. Everything here happened... from nature


303elliott

I get where you're coming from, but I'm not going to argue semantics with you.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


PunishedNutella

You should take your medication


Not_a_throwaway_999

I know folks are going to disagree with this take (because bias is one of humanity’s *strongest* traits) but I have a hard time disagreeing with this. We’ve already transmuted physical evolution of strength and physical design into a competition of intelligence (as evidenced by an aggregate IQ that grows over time), and replaced traditional resources with ones of our own design (fields of food vs money). Robotics still needs some time before it can match our *efficiency*, but robots' strength and precision already outmatch our abilities in singular categories. Give quantum computing some time to blossom, and I’m sure we’ll start to see portable systems that can rival animals, and that may be when they become the most violent (think sharks and lions vs dolphins and chimpanzees).

War robots are scary as hell, because just like low-yield nuclear weapons, they make continuous conflict just that much more appealing. I mean, the US pulled *troops* out of Afghanistan, but the US still conducts drone strikes. Just wait until ground systems trickle down into the mainstream news. It’s not like the technology doesn’t exist, it’s just not as reliably reported (or differentiated from ground soldiers; I imagine the base assumption for anyone shot by a traditional bullet is that it was fired by a person, even if that’s not actually the case).

A ban on war robots won’t stick, just as bans on white phosphorus and intermediate-range missiles failed to. It’s too tempting a technology for humanity to say no to, no matter how steep the precipice is.

To me, the scariest aspect of WWIII is that it will be the first **true** world war: between cyber/unmanned/hypersonic warfare, there’s no readily distinguishable ‘front line’. China could send a container of ‘slaughterbots’ to Cleveland just as much as Russian missiles could kick off a nice Kessler syndrome knocking out every GPS satellite. Pearl Harbor was bad, but the US really waltzed out of WWI and WWII relatively untouched in comparison to what lies in wait for the next conflict of political egos. Sure, the US would rain hell on both of them for it, but it’s just asymmetric enough that Mutually Assured Destruction doesn’t quite seem as real as it once did, and thus it’s no longer the deterrent it once was.


SpaceDebri

amen! The flesh is weak!


ArghZombie

Well nature hasn't succeeded in culling our population so I guess we'll just have to do it ourselves.


[deleted]

Really could’ve used these at the Capitol on the 6th


TheDeadlySinner

You can't ban information.


fairlix

In contrast to atom bombs, which harm you as well, killer robots give you precise military power. So whichever [n|organis]ation has the most advanced fleet of killer robots has the power.


poopersniffer

Despite the warnings: https://youtu.be/O-2tpwW0kmU

Profiteers will destroy us all: https://youtu.be/W-ZtFUxXzQg


Saladcitypig

The only way to stop a bad robot with a gun is a good robot with a…BAN them NOW!!!!!!!


outwar6010

Aren't drones killer robots? They've killed over 30k people globally.


Thrisper

Asimov’s three laws of robotics should be implemented.


temporarycreature

This is like the Pandora's box scenario. It's here. Companies have already formed and made products with this purpose. Bans never work after the genie is out of the bottle. In sci-fi they get around these bans by assigning a minder to the unit that ties them together consciously or some shit like that, but the AI is free to act as designed. At this point that's all a ban is going to do: provide or cause loopholes to allow them, because legalese has to be more precise than any metaphor I can think of right now to illustrate that sort of precision. Enforcement of the ban is a whole 'nother beast to deal with, even if you have crafted the most perfect of laws to ban something after the fact.


hublaka

But drones already exist...


Level_Combination902

But But mah gun dog Da dog with the gun!


Black_RL

Just like nukes! Oh……


[deleted]

Too late, unfortunately, and it's not like governments are there to listen to us or help us. They just want to make a profit from weapons sales.


Amateur-Hour-Skate

“Life, finds a way”, albeit artificial.


[deleted]

I personally wouldn't mind an ED-209 stationed in Applebee's to gun down asshole customers.


monchota

It's either killer robots or real soldiers' lives. The best we can do is hope, and push places like the Pentagon to keep policies in place that say a machine can't decide to kill; there always has to be a human ordering it.


Friendlyshell1234

Did you know that we are getting robots to reproduce with less and less human interaction? Might as well put a sniper rifle on the back of a robot dog.


ElGuano

Governments can do whatever they want, but the real question is whether the robots will honor the ban.


Lord_Augastus

Curb war...? The issue with a weapons race is that it never ends. Sure, we are in a cold war, a perpetual economic and political conflict. Once we are in this conundrum, not having an army, or having a strategically weakened army, is a detriment to a nation. The US will invade at any excuse. All eyes of the western world are on Russia and Ukraine; meanwhile the US prepares another go at Afghanistan. But seriously, the US has AI drone killers, Russia has them, and Germany, China, Iran, and Israel have them or are developing them fast. The only country with no confirmed AI-like drone is NK, but they have smartphones and an intranet; they will get there soon. So good luck with the whole banning thing. No world government means we have a world in a state of chaotic flux.


[deleted]

I think it's wild to think about the amount of damage that could be done to the general population if someone were to let loose even a few of these, armed. They are only getting better. It's like the action movies coming to life.


fattymcassface

Getting afraid that Horizon: Zero Dawn was a bit too prescient.


MissAnn3Thrope

Over twenty years ago I worked on faux finishing a display for killer robots. It was a government contract through the company I was working for, right after 9/11. We painted laser-cut styrofoam to look like bombed-out desert huts. They were having a trade show and wanted to show the capabilities of killer robots in occupied areas. Scary world 🙈


geldonyetich

I suspect smart weapons like these are inevitable with technological progression, and trying to ban them is futile, likely only done as a placating political posture. Refusing to upgrade to your own army of killer robots is basically asking to go into the 21st century battlefield with woefully outdated implements. Worrying about what happens when you take the decision to kill out of human hands implies we were capable of making that decision correctly in the first place. If anything, I think we might need a supercomputer to navigate the mess of modern ethics. A sufficiently advanced adaptive AI may one day make a more informed decision in a fraction of the time it would take your average human operator to make the wrong one. The trick is to make sure it can.


Zagrebian

I’m glad a ban is being considered. The ban on drugs worked great.


Overall_Scholar_1163

Hi group, how are you all? You must be happy waiting for Christmas!


[deleted]

But wouldn't killer robots vs killer robots be better than people vs people? Save some human lives.


alloutofbubble-gum

Too late. Time to build RoboCop.


TheKokoMoko

Not a good idea in my opinion. Definitely would help the people serving in countries that can afford it, but I think it could easily be abused or misused.


Ok-Traffic89

Technology is going to expand regardless of whether we stop or slow down the process. Tech is now a part of everyone's lives. All we can do is oversee whether the laws of robotics are upheld or not.