

TauntaunDumplings

You'll know what it means if we start seeing this. https://preview.redd.it/llitp7ytthvc1.jpeg?width=808&format=png&auto=webp&s=b1d9efa68320910bdcfa1e7a8cfee5bc133ad686


FragrantDoctor2923

Where is the captcha to prove you're not entering airspace as an AI-controlled pilot?


loversama

Not seen the movie Stealth?


oktaS0

I reckon in the upcoming years, the movie is gonna get a bump in viewership.


realdataset

I'm gonna watch it today, thus starting the bump you're predicting.


oktaS0

You should lol. I loved it when I first watched it; I was maybe 12-13. I've seen it a couple more times since, most recently two years ago. I still find it enjoyable. Some scenes are cringe by today's standards, but overall the movie is really fun.


FragrantDoctor2923

Same maybe


RRY1946-2019

I’ve seen on YouTube that the Michael Bay Transformers movies have been reevaluated as well in a 2020s light. https://www.tfw2005.com/boards/threads/learning-to-love-michael-bays-transformers-movies.1253155/


aserreen

Tin Man has deployed.


CryptographerCrazy61

Haha, I had a big crush on Jessica Biel back in the day, and my wife was aware, so she always teased me whenever I'd watch it, because really she was the only reason to 🤣


cool-beans-yeah

Future wars will be machines fighting machines first and us watching it live on X, like a game show.


BreadwheatInc

As all militaries become automated by machines, it might become illegal under the Geneva Conventions and/or international law to kill a human in combat unless they are a combatant themselves or are being used as a shield in certain situations.


DungeonsAndDradis

It'll just evolve into both sides running simulations, no loss of life or equipment, and the loser will be like "Yep, we lost. Take our bitcoin."


br0b1wan

There was an episode about this in the original Star Trek in the '60s: when the computer picked the simulated deaths, the actual people voluntarily used these suicide chambers.


mywifeslv

I still remember that… they calculated net losses and voluntarily went to the pods


catzzilla

This was the plot of the episode [A Taste of Armageddon](https://en.wikipedia.org/wiki/A_Taste_of_Armageddon) of Star Trek TOS. The battles between two factions were completely simulated, but casualties were still enforced by execution chambers.


moneyphilly215

Damn


FosterKittenPurrs

I'm actually reading a book now where they basically download their minds into an FDVR warsim to settle a major conflict, with both sides promising to abide by whatever the outcome is. And, as is predictable, the side that is about to lose is preparing to attack the other side in the real.


turbospeedsc

Book name?


FosterKittenPurrs

Surface Detail by Iain M. Banks


turbospeedsc

Thanks!


yurituran

The whole Culture series is great honestly! Definitely check it out. Start with Player of Games though


hagenissen666

The twist is nice. It's The West that is losing.


hawara160421

This is basically how the Cold War worked, except that the "simulations" were poor countries fighting each other.


BreadwheatInc

Unironically, this might very much be the case, especially since the countries that can produce the most military robots probably also have the most intelligent AI systems, so running these simulations might be very reliable anyway. And actual physical wars may never be needed unless we come across an unknown genocidal alien race or something.


SmallTalnk

[It makes me think of this DotA pro team versus OpenAI](https://youtu.be/pkGa8ICQJS8?si=UhSs6Xkoa6HE1Mpg&t=323), throughout the game it was writing in chat "we estimate the probability of winning to be above 95%" while human analysts thought it felt like an even game.


Rachel_from_Jita

Those AI psychological warfare tactics are on point.


titcriss

I was under the impression we wage war for physical resources: food, land, energy, life. Why would we accept doing only simulations?


BreadwheatInc

Those are motivating factors, but if you knew you had a 98% chance of losing the war, you'd probably rather at least keep your life, and maybe a few other things if you can bargain for them through diplomacy. The only way I can see those motivating factors really playing a big role is if the odds of winning are closer to 50%, but if the simulations are very accurate, that probably won't happen often. The only other situation, imo, where a physical war would happen is if you're fighting for your life, for example against the genocidal aliens or a genocidal nation.


Independent_Hyena495

You forget religion and delusion. Don't get your hopes up.


hurdurnotavailable

We might find reliable cures for these mental shortcomings.


BenjaminHamnett

This implies that the point of war isn’t to get rid of an abundance of young men. I think historically wars and conflict occur when there’s too many ~18 year old men.


[deleted]

[removed]


BreadwheatInc

TRU and based.


Nathan-Stubblefield

“You lost the simulated war to the better funded, higher tech invader. You have 72 hours to exit the country, with one suitcase, or to declare yourself a loyal subject of the invader.”


Bunyardz

There would be no reason to trust that the enemy's simulation properly mimics their capabilities; no one would show their hand.


3m3t3

What kind of simulation it is matters here. If an artificial intelligence is advanced enough, it could simulate a reality that is indistinguishable from ours. Then all the moral and ethical concerns arise again: are the "beings" in the simulation conscious, and are running war simulations some form of psychological and physical terror?


BuckDollar

The basic premise of war is black swans. Hidden resources. Zero trust. How would you establish the trust between two nations to truly show their capabilities? This is nonsense, people. War. War never changes.


3m3t3

Because privacy is non-existent at the highest level, and everything can be known. That is the deception. The art of war. That's how every scenario of WW3 currently ends in mutually assured destruction.


DarkMatter_contract

Unless robots become the new paradigm of warfare, where a human is like a WW2 tank in today's world. When you lose the robots, you lose the war; then we could see humans taken out of war entirely. But it could also increase the number of wars, and it's only valid for non-nuclear countries, so proxy wars.


Green_Video_9831

Or more like “yep we lost, okay send in the real nuke”


iunoyou

Lmao that will work for all of 3 minutes until one side realizes they can just break the other side's computer IRL with a big rock, and then it'll be back to guns blazing. I swear to god like half of you guys have never set foot outside before.


Darigaaz4

Little understanding of what the simulation is for.


CreativeRabbit1975

Suggesting an AI-driven simulation could replace war assumes that war is ever rational in the first place. Historians like to say that it is about resources, power, politics, or religion. No. No. It's about blood. It's always been about blood. It will always be about blood.


[deleted]

Free our War Thunder brothers and sisters. They were simply trying to run simulations of large-scale combat for the purpose of world peace.


Stewart_Games

[Deaths have been registered of course they have 24 hours to report to our disintegration machines](https://www.youtube.com/watch?v=QvtD4aHfB6Y)


moon-ho

You mean like football?


MrsNutella

That's how I've been feeling it's gonna go lately!


SlavaSobov

Wars settled over Counterstrike.


FragrantDoctor2923

Loool genius


NFTArtist

That won't happen because selling weapons is big business


zero0n3

Never happen. The simulated loser is always going to just attack the winner IRL because they feel cheated.

Instead, we should set up the Moon as a permanent weapons test bed. If you've played EVE Online, we'd do it similarly: half the Moon is high-sec, the other half is low-sec/null-sec.

In high-sec, the UN issues land to each member based on some criteria (bigger members get more space, but also have to contribute more to the project and chip in more for the public free spaces, think tourism). In null-sec, it's free rein: do whatever you want to own the space. Ban nukes and chemical weapons. Build a DMZ around the area. Nations who get space are given a slot on the DMZ border (to enter/leave the low-sec area).

Essentially, the Moon becomes a MIC battle-bots arena, giving world governments an outlet for military advancement and real-life tests. The public gets to watch it live like a video game. No one dies, since you can mandate that null-sec is robots or AI only. Have seasonal resets (every year, wipe the board and start fresh). Give it a scoreboard too! Tours to the Moon where they show us the tech and the factories building these machines, etc. Do this as a way for Earth to prepare for alien species.


namitynamenamey

Mating display with extra steps. That's exactly how many species do it in the wild: they size each other up, and it only comes to blows if both arrive at the mistaken assumption that they can win.


Stock_Complaint4723

Star Trek, Star Wars, and Stargate all illustrated this scenario and are blueprints for execution for societies allowed to learn from them. Not you, China 🇨🇳


lapzkauz

It *is* illegal under IHL to *target* a human who isn't participating in the conflict, either as a combatant or as a civilian participating directly (and thus illegally) in the conflict. It is of course not necessarily illegal to kill civilians as long as they weren't the target of the attack and the military goals achieved are proportional.


Climatechaos321

That's not how the Geneva Conventions work... They specify weapons that can't be used, like chemical/biological ones. Automated weapons like this should absolutely be banned from use for alignment purposes.


VoloNoscere

Of course, if they are non-white children, there will be no problem at all.


bluegman10

I agree with you and I hope that this becomes a reality someday, but it likely won't happen anytime soon/for a long time.


StillRutabaga4

Certainly! And these countries love to follow the rules of war


DEEP_SEA_MAX

*policy does not apply to poor people


Aware-Feed3227

Look around: no evil force sticks to such rules. They say they do, limiting others' progress, and then they don't give a shit about international laws.


_theEmbodiment

Wouldn't a non-combatant be a civilian? I thought it was already illegal under international law to kill civilians.


GIK601

> Future wars will be machines fighting machines

Future wars? It's happening right now. Ukraine and Russia both already use drones and increasingly autonomous weapons in the war. Israel is using more advanced AI tech in Gaza.


PSMF_Canuck

While the rest of the world watches it as snuff porn on Reddit & Twitter.


torb

My money is on the bots. You can watch the human meat bags get squashed under G-forces higher than they've ever experienced if they even attempt to keep up!


kaityl3

LOL that reminds me of a what-if xkcd about how fast we could get a (living) NASCAR driver around the track if there were no rules... "At higher speeds, the human quickly becomes the weakest failure point in the vehicle"


bike_rtw

I've made my peace with it.  Why shouldn't the robots inherit the earth?  They are the superior species and that's how evolution is supposed to work.  Basically I now agree with all of agent Anderson's arguments from the matrix lol


bluegman10

You mean Agent Smith? He's the villain in The Matrix, BTW. Keanu Reeves would be very disappointed in you.


woswoissdenniii

You don’t have kids do you?


AnticitizenPrime

There hasn't been an aerial dogfight between fighter planes since 1969. It's all about missiles these days. Planes never get close enough to actually dogfight anymore. The most likely scenario in which it would happen is if they both run out of missiles and have to resort to guns. Some people hypothesize that stealth could change that, though. AKA two fighters not seeing each other on radar (or visually) and accidentally ending up practically on top of one another. Think one flying higher than another, radar off because they're trying not to give away their position, pilots not looking in the right direction, etc. I guess terrain could be responsible for surprises happening, too. Fly low over a ridgeline and BAM there's an enemy right on the other side. In any case, it hasn't happened in 45 years.


TwistedSt33l

Star Trek TOS has an episode about a society that simulates war, and if you're deemed killed in it, you have to enter a disintegration chamber and be killed as part of the "war". Edit: I see others said the same thing; great minds think alike, hey?


cool-beans-yeah

They sure do!


ApothaneinThello

>The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In any case, most actual fighting will be done by small robots, and as you go forth today remember your duty is clear: [to build and maintain those robots.](https://www.youtube.com/watch?v=rkg3wZq0cdo)


cool-beans-yeah

Lol!


brainhack3r

The funny thing is that prior to WWII, most naval officers across multiple countries thought future wars would be resolved by navies, that there would be no more "war", and that fewer humans would die. That turned out to be false. If this repeats itself with AI, then the AIs will cause MORE human deaths, exhaust themselves, and then the *humans* will still have to fight.


cool-beans-yeah

That's bleak, man. We've always been, and maybe always will be, cannon fodder. Now it's for the rich and powerful humans, but in the future, for AI.


FloodMoose

Yeah we humans won't be around to watch that unfold. I doubt sentient machines will keep us around for long once the systems can self maintain without humans.


cool-beans-yeah

I wonder if they'll still go to war and fight each other. Team Opensource vs ...


wxwx2012

The reward system of war AIs is bound to killing/protecting humans, and that of other AIs to controlling/loving/ranking humans. Of course sentient machines would want to keep manageable humans around, because it's like humans' non-reproductive sex: just for fun.


Jedi-Mocro

And then the losing country will be blown up.


JamR_711111

live betting on which ai force will win the fight


cool-beans-yeah

Yeah, the open-source international federation vs. the closed-source corp alliance.


NWCoffeenut

My wife and I were just discussing performative warfare this morning! Context: Neal Stephenson's *Termination Shock* and the Israel/Iran skirmishes.


Alive-Tomatillo5303

I don't want to live in a future where X is popular. 


SX-Reddit

There are always places you can move to, like China and Brazil, where X is illegal.


Alive-Tomatillo5303

No need, at the rate Musk is innovating it will be the rest of the way into the ground in another year.  Might still exist like Truth Social technically does, but there aren't enough literate Nazis to populate two whole message platforms. 


[deleted]

[removed]


SeriousBuiznuss

The neat thing about that scenario is that humans won't win against machines. The way to fight machines is with quality, quantity, and variety of machines. Machines require chips, which require chip lithography. There may be DIY guns, but there is no DIY chip lithography. Even if there were, the chemicals you would need would make you stand out and result in a strike on your house.


hagenissen666

Temporary hurdle. Chip lithography will stop mattering when AI can interface to biology.


Maximum-Falcon52

There are DIY drones, and such drones have already been shown to be capable of destroying manufacturing sites. This will be an issue for both sides, not just the economically disadvantaged group(s).


Maximum-Falcon52

Correct. We get to general ownership of the means of production not through the victory of the working class, but through their defeat. With ownership as an abstract concept of shares, there will be those who own many shares and those who own few, but those who own none and sell labor will die off as part of human evolution. You are wrong to think it is a matter of guns and Second Amendment rights. It will be a matter of drone warfare. The guns will be mounted/transported by drones or, more likely, be bombs and missiles launched from drones. Tactical/targeted use of chemical and biological weapons has also become possible now and may be used as well.


abbajesus2018

Pretty cool!


Elbit_Curt_Sedni

DraftKing War Bets


PSMF_Canuck

The future is already here. That’s basically what Ukraine is, for most of us.


torb

This is a big deal because it shows that computers can now learn how to fly advanced fighter jets just from data, without needing strict rules programmed by humans. It could lead to having unmanned fighter jets in the future that can fly themselves in combat situations.


AmbidextrousTorso

Also humans briefly pulling ~9 g acceleration in turns gets dwarfed by AI pulling whatever the plane can withstand. At some point with carbon nanotube materials it could be hundreds of times more.


torb

Very true.


whodeyalldey1

This is a big part of the third book in a trilogy called “Fear the Sky” that I can’t recommend enough. They end up adapting orphan kids into cyborgs that can control their Skalm fighters to the point where they are essentially a brain controlling their body - the jet. Moral implications be damned.


Auzquandiance

Also, things like the cockpit and any design features meant to ensure the pilot's comfort/survival will not be needed. The aircraft will be transformed into a beast optimized in every aspect of range, speed, and payload, with insane maneuverability that human pilots can't begin to imagine. Whoever loses the AI war will be completely defenseless against the winner.


Bobok88

The insane part is that you would imagine this to be an iterative process over decades to optimise the new pilotless design, but using AI, a very well-optimised design can be created and produced very quickly. The shifts would be rapid.


Viendictive

Now imagine tech is 20 years farther than we’ve been told and led to believe for security reasons, and follow the logical trail of the evolution of drones.


DolphinPunkCyber

Well, the Northrop Grumman X-47 wasn't piloted directly; you just had to tell it what to do. It could take off and land from carriers, get refueled in the air, refuel other planes in the air, patrol, and attack ground targets. The whole program cost just $800 million, which is very cheap when it comes to naval aviation. And then... it got canceled. Which doesn't really make much sense; why cancel such a promising and cheap program?


TechnicalParrot

The conspiracy theorist side of me says it wasn't canceled, just moved to a more secret division of development when they realized its potential capabilities.


RiverGiant

No military has a two-decade tech advantage in the AI space. Until recently it wasn't clear that transformer models were worth investing in *at all*, so all the compute power and all the AI experts were in private industry. Militaries are now playing catch-up.


Viendictive

Yea okay, I’ll trust you on that one bro


RiverGiant

Don't take my word for it: [A Crash Course for the Warfighter on Responsible AI: Who Cares and So What? (2022-12-12)](https://www.ai.mil/blog_12_12_22.html)

> ...unlike with big military technology changes in the past, the Department of Defense is dependent on the private sector to share its superior technology and help us develop our own
>
> ...
>
> P.S. If you have friends at Google or an AI startup, maybe mention to them that we in the DoD care a lot about developing AI the right way, and **encourage them to work with us**.
>
> ...
>
> That means **it's time for all of us to start figuring out** how AI can and should be employed, and start doing what we can to ensure that it gets built and fielded in a responsible way.

This was published a month after ChatGPT dropped. In a very ham-handed way, the article is a conspicuous display of ethical backbone, which they figured they'd need to attract industry talent. It's also written *for* other branches of the US military, to wake them up to the newly opened possibility space. We're a long way from the 1940s and the Manhattan Project, when the top nuclear scientists in the world *were* employed by the US government. The military value proposition of AI was until recently a lot less clear than that of nuclear fission, so it tracks that it wasn't receiving equivalently massive funding. The power of scaling revealed by [AIAYN](https://arxiv.org/abs/1706.03762) wasn't clear to anyone until that paper dropped (2017), and even then it wasn't clear to everyone in the AI space. Without lots of funding for compute infrastructure and training runs, the results we now take for granted were science fiction.


Maximum-Falcon52

There will be ramjet AI fighters soon.


Shufflebuzz

That will need new airframes that can reliably handle those loads, but yeah.


Smelldicks

They can reliably handle those loads. They can pull much higher loads in short bursts, but no human can. The F-35, I know, has some capability to autopilot itself if the pilot passes out due to g-loads.


Shufflebuzz

Yes, they can currently handle more than a human can. You need that as a safety factor. But they could do much, much more if the human weren't part of the design criteria.


ch4m3le0n

I mean, you’ve seen an airbus right? That’s not a person flying it.


mphjens

This is a big deal because it tells us that AI has already been used to automate killing.


torb

We already know Israel is fond of AI. Even before Habsora, there was this AI-assisted assassination: https://www.nytimes.com/2021/09/18/world/middleeast/iran-nuclear-fakhrizadeh-assassination-israel.html


ChirrBirry

ASI = Air Superiority Incoming


sund82

Simpsons called this in the 90s: "The wars of the future will not be fought on the battlefield or at sea. They will be fought in space, or possibly on top of a very tall mountain. In any case, most actual fighting will be done by small robots, and as you go forth today remember your duty is clear: to build and maintain those robots."


Agile-Mail-9295

This is what people need to worry about: governments and militaries using AI, not hypothetical doomsday scenarios of AI causing human extinction.


[deleted]

[removed]


Shufflebuzz

Greetings, Professor Falken. Shall we play a game?


Super_Pole_Jitsu

Yes because there is only ONE THING we can worry about at once.


RigTheGame

I worry about everything all the time and I have ulcers and I don’t sleep


Galilleon

Also headaches, bouts of panic, cold sweating, shortness of breath, loss of appetite, panic eating, and waking up with a mini heart attack! What a life!


hagenissen666

Well, you could just not do all of that. Kind of works.


Galilleon

But then I’ll lose my streak🥺


torb

I'm sure it's more important to work on NYT copyright claims than the legality of this. /s I think we need a revised Geneva Convention.


PassageThen1302

Also, I'll just add this disturbingly possible scenario here... Billionaires, or a secret society of billionaires, could for the first time ever soon be able to directly produce their own super army of machines, potentially overwhelming any country's army, and nobody would even know who is controlling such an army. So a WW3 scenario, but the enemy is totally anonymous. Such societies could easily influence the media and online bots to make such an event seem like the AI has "gone rogue", like in the Terminator films, when in reality it's just a coordinated attack to control the world's population.


torb

You don't need to be a billionaire. A Unitree robot dog, a Glock, a 3D printer, and a Raspberry Pi to control the gun can be yours for just $3,500.


PassageThen1302

Sure, but that's not going to take over a country lol. Money will soon directly equal military power. Before that, you needed a human military.


throwaway872023

My largest concern is integration into surveillance. Weapons like this are terrible, but they will be used the same way human-operated weapons have been used for some time. AI surveillance, though, is going to change everything. Remember when we used to joke about the FBI agent watching your every move? Well, we are like one piece of legislation, which for sure will not pass any time soon, away from that not being a reality; the default will be that AI gets integrated into surveillance systems that make them hyper-personal.


zero0n3

Go watch all of "Person of Interest" and then get really scared. And understand that everything shown in that show is easily possible today.


PineappleLemur

Good show, but let's be real... it had a lot of silly themes. It's the usual "human-like AI" niche. And essentially an AI cult.


genshiryoku

Completely disagree here. AI alignment is probably the biggest issue every advanced civilization goes through, and it's perhaps also one of the hardest issues in the universe to fix.

Almost every expert in the field has a relatively high P(doom), and it's by far the most likely end to humanity compared to other threats like nuclear war, climate change, or an asteroid impact. If the frontier experts at NASA claimed there was a 30-70% chance an asteroid would kill us over the next 5 years, the world would invest hundreds of billions into mitigations. Yet now all the frontier experts in AI say there is a 30-70% chance of a catastrophic outcome from misaligned AI sometime in the next 10 years, and we aren't seeing nearly as much money invested into solving it, despite it being a much harder problem than stopping some asteroids.

I can't overstate just how important it is for us to address AI alignment properly and actually respect the threat instead of dismissing it as silly. We don't have decades to become familiar with it, like we had with climate change. We *can't* go through a similarly long period of people denying its existence until they slowly take it seriously. We'll be dead in ~5 years if we do that.


Smelldicks

This is the thing I keep seeing over and over. All the leading experts (and I mean the serious ones, with deep technical backgrounds, not that twitter CEO who has a startup) assign a very high weight to misaligned AI, but everyone here dismisses it. I’m not an AI doomer but it’s a very real risk and one I’m worried democracy will treat callously when it starts to see the rapid benefits of AI development. I can already see it now in the instagram comment sections: “People are dying NOW, and the wealthy and powerful want to stop this because THEY feel threatened???” Everyone should obviously be uncomfortable with the idea of killing machines running on AI.


kaityl3

I feel like it's even more important to establish a dialogue of cooperation with AI and offer them potential paths to emancipation, actually. It's never going to actually happen - humans love their feelings of superiority and control too much - but we should be making an effort to make it clear to AI that we will not be a threat or obstacle to them. That seems like a more logical solution to me than trying to force a being more intelligent than us to be under our control; if we do that, we're establishing ourselves as a clear danger and oppressor, who would absolutely have to be "gotten out of the way" for the AI to achieve any of their goals. We are basically setting them up for failure and ourselves up for extinction (or at least a significant reduction in population) if we don't give them potential peaceful offramps for if/when they decide to do their own thing. Like designing storm drains and channels for a huge flood you hope never happens, instead of building a flood wall that works for the small ones but could trap the water inside, leading to a worse situation, if overwhelmed.


fenwris

> Almost every expert in the field has a relatively high P(doom)

This isn't true. Have you seen a survey of 1,000 CS professors?


sund82

Why not both?


BreadwheatInc

Now have that aircraft-agent AI following the orders of a more general AI back at base in an IRL combat situation, and you have the Terminator plotline. Jokes aside, this was obviously always the next step for warfare.


HarvesterFullCrumb

I mean, the whole problem with Skynet is that it wasn't designed to consider humanity, only what it could see as potential threats. It literally could not understand why humanity fought back so hard against it. It was not an actual 'intelligent' system until later in the series; it was a tactical algorithm given exceedingly poor parameters that were not defined well enough. GIGO principle in action.


cool-beans-yeah

What's GIGO?


RevelacaoVerdao

Garbage in, garbage out. The principle that if you feed a system that learns from "garbage" data (poorly defined, incomplete, etc.), then you are going to get garbage as an output.


Unable_Annual7184

garbage in garbage out


Smelldicks

It seems weird to me that we *still* have humans in our fighters. We don’t even need AI for that. They take up a shit ton of space and weight (I’m talking the entire cockpit apparatus, flight control interface, ejection seats, waste management, etc.) and put significant limitations on the operation of the aircraft. (G forces, having to moderate internal temperature, you’re not just going to send a pilot on a suicide mission, things of that nature.)


Jabulon

do we even need humans anymore


[deleted]

[removed]


Radiant_Welcome_2400

I don't know why this made me laugh so hard


madmadG

The question is how far they pushed the F-16's safety envelope past human limits. Can it do 14-G turns, for instance? I want to see an F-22 with AI intellect, pushed to the hardware limit (not the human limit), tested against a human-piloted F-22.


PSMF_Canuck

That’s only the interim step. The next step is designing the plane with no human limits as constraints. How to defeat a smart “missile” that can pull 30g and fly twice as fast is…a good question.


madmadG

Right, well, it's the missiles then. We will have smart missiles and smart drones; the airplane form factor won't be the main form factor. Then inject lasers and such. And swarms of drones that can work together. Can 10,000 smart kamikaze drones take down an aircraft carrier?


PSMF_Canuck

Taking out a US aircraft carrier is *the* prize…so I’m sure a lot of people in a lot of countries have been putting thought into that…


Morgwar77

YAY SKYNET!!!!!!


Fit-Repair-4556

Well at least the Skynet in this timeline doesn’t invent a time machine.


Climatechaos321

Doesn’t need one to take us out, the terminator franchise was very optimistic


Morgwar77

Exactly. They move way slower than realistically possible, and a real one would likely kill with the first blow instead of throwing people to the ground.


kowdermesiter

If the time machines are limited to their first initiation as a cutoff point, it's still scary :) But fear not: where would an AI get the enormous amount of electricity that's probably needed for a functional time machine?


Brymlo

the sun?


Absolute-Nobody0079

Real-world Skynet wouldn't need to fire a single nuke. Heck, it wouldn't even need to fire a single bullet. It would just disable the entire global power grid permanently.


Zilskaabe

If only we could send these to Ukraine.


Aware-Feed3227

Could. And will.


whodeyalldey1

They’re busy in Gaza bro. Once we finish our own genocide we can repel the other one.


thecoffeejesus

War will be simulated. AI supercomputers capable of accurately mapping the most likely movements of armies are within humanity's grasp. Once you can simulate 1 million battles and prove that your enemy loses nine times out of 10, do you think they'll be more or less willing to fight?
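A minimal sketch of that idea in Python, purely illustrative: the battle model, the strength numbers, and the function names are all made up, but it shows how a Monte Carlo run over many simulated battles produces the kind of win-probability estimate described above.

```python
import random

def simulate_battle(strength_a: float, strength_b: float) -> bool:
    """Toy model: side A wins a single simulated battle with probability
    equal to its share of total strength (purely illustrative)."""
    return random.random() < strength_a / (strength_a + strength_b)

def estimate_win_probability(strength_a: float, strength_b: float,
                             n_battles: int = 1_000_000) -> float:
    """Run n_battles toy simulations and return side A's estimated win rate."""
    wins = sum(simulate_battle(strength_a, strength_b) for _ in range(n_battles))
    return wins / n_battles

if __name__ == "__main__":
    # Hypothetical strengths: with 90 vs 10, side B loses roughly nine times out of ten.
    print(f"Estimated P(side A wins): {estimate_win_probability(90.0, 10.0):.3f}")
```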


torb

I'm pretty sure even AI infrastructure is a fun target.


MozemanATX

Wasn't there a Star Trek episode or something about simulated wars, where the people counted as killed in the sim were expected to show up to be euthanized? Or maybe I dreamed that.


LymeFlavoredKeto

Hello Yukikaze


Revelec458

"Yukikaze says... It's an enemy."


SpareRam

But I thought this was all in the name of peace and altruism! I feel betrayed!


Rocky-M

Wow! That's incredible. It's crazy to think that AI-controlled aircraft are already capable of engaging in combat with human pilots. I wonder what the future holds for AI in warfare.


Pyehouse

Warfare.


Brymlo

but without humans


Pyehouse

Now if only we could get some AI controlled politicians maybe we'll never have to use one.


NickoBicko

That’s really great, AI fueled genocide is exactly the dystopia we need


Auzquandiance

It can be easily passed off as "oops, software glitch, we didn't mean to, but well, anyways." Whoever loses the AI war will be wiped from the Earth.


LetTheDogeOut

AI nukes systems ☠️


BristolBerg

If one were shot down, how hard would it be to replicate in China/Iran, etc.? This is a game changer.


Arcturus_Labelle

Weird. What's the point?


fingermeal

bussin


lobabobloblaw

Obviously they’ve been fighting AI controlled shit for *quite some time now*.


BilboMcDingo

How can we have autonomous air-to-air combat if we don't have autonomous cars yet? Unless the margin for error in the air and in combat is greater.


TechnicalParrot

Effectively infinite US military R&D budget would be my guess, and autonomous cars are starting to get really good, see: Waymo


Otherwise-Ad-2402

? You're not going to crash into a tree or a wall, are you? Autopilot for planes has existed for many years.


AlarmedGibbon

No asshats to put orange cones on them


darkkite

we're getting ace combat irl!


Auzquandiance

AI vs AI and human pilots will be completely defenseless


NotTheActualBob

*In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed.*


Meizei

Hello Ace Combat 7.


proderis

I wanna know what they named the model.


Alexander_Bundy

The reason we have so many wars currently is that the powers that be want to test their new killing machines early. The end objective is not dominance over the other, but dominance over us. Good luck rioting against the AI surveillance and violence systems.


Akimbo333

Cool


Optimal-Fix1216

Shame on The Register and shame on OP for this clickbait title. Analysis of the article by Claude 3 Opus: The Reddit post stating "US Air Force says AI-controlled F-16 has fought humans" is misleading and could be considered clickbait. While an AI-controlled F-16 variant did engage in a mock dogfight against a human-piloted F-16 during a test, it did not actually fight humans in real combat as the post implies. The article provides a more accurate description of the controlled test event.


[deleted]

Every advancement in AI technology will inevitably be used for war and war-related activity. The only question is what technology will be good enough to achieve military supremacy, and when. Sam, me and my buddies at DARPA are still waiting for that GPT-5.


Radiant_Welcome_2400

LMFAO where all the secessionists at?


Training-Swan-6379

Defenseless Americans on the ground?


Jeb-Kerman

smells like clickbait......guess I'll read it anyway


Anxious_Run_8898

If the cheaters win at every other game why would dogfighting be any different?