Ukraine is set to field its first officially codified, domestically built grenade-launcher-armed ground robot, after the Ministry of Defense approved the Droid NW 40 robotic combat system for service with the country’s Defense Forces, developer DevDroid said on December 23.
Our first Slaughterbot is born! Isn’t she adorable?
When you’re blitzing up the tech tree and the AI keeps building T1 infantry and cavalry.
wow…
Imagine being a Russian on a horse charging a fortification because that seems to be WTF is going on now, and this fucking robot of death comes out and kills you with a fucking grenade launcher.
the robot wars have begun
Be on the lookout for Russia’s response based on historical robot champions.

Begun the robot war, has
Damn, we have to advance our cloning technology quick, that just doesn’t roll off the tongue.
more of this, please. UA doesn’t have enough soldiers to fight? modern problems require modern solutions.

The launcher has a maximum effective range of up to 1.5 kilometers and can fire either single shots or bursts, with an onboard ammunition load of 48 rounds.
Nice, a rapidfire grenade launcher. What will the kids think of next?
Title
Yeah this is going to end well.
Invaders must die

- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
- A robot may not damage shareholder value or, through inaction, allow shareholder value to come to harm.
Must. Make. More. Paperclips.
I don’t think it will be programmed that way.
The writer of the laws himself made his career writing an entire book series of stories pointing out where the laws break down.
In this case I think they’re broken down immediately because they won’t be used.
- Spend months in a meeting room with everyone from ethicists and philosophers to machine learning specialists and developers, figuring out each nuance of the rules,
- or program it to kill everything until the batteries run down and send it out within the hour.
The answer in a war is always the second option.
Screamers. Check it out.
He did also do some stories about ‘fixing’ the laws, but in every case that resulted in either limiting the robots’ capabilities in a restrictive way or giving them more power over people in ways that could harm humanity’s self-autonomy in the long run.
From the first time I read this as a kid, I recognized that in order for a robot to adhere to these rules, they would have to be programmed with these rules. A bad-faith robot builder could simply not include that programming.
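To make the point concrete, here is a toy sketch (purely illustrative, not any real robotics API) of how a “law” is just an ordinary software check, which a bad-faith builder can simply disable or delete:

```python
# Toy illustration: an Asimov-style "law" is just a guard function in code.
# Nothing enforces its presence; a builder can omit it entirely.

def first_law_allows(action: dict) -> bool:
    """Reject any action flagged as harming a human."""
    return not action.get("harms_human", False)

def execute(action: dict, enforce_laws: bool = True) -> str:
    # A bad-faith builder just passes enforce_laws=False, or deletes the check.
    if enforce_laws and not first_law_allows(action):
        return "refused"
    return "executed"

print(execute({"harms_human": True}))                      # refused
print(execute({"harms_human": True}, enforce_laws=False))  # executed
```

The “laws” only bind robots whose makers chose to compile them in, which is exactly the commenter’s point.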
A bad-faith robot builder like literally every one that accepts US government contracts.
Asimov wasn’t any more of a legislator than Dante was a theologian
I believe this is only for fully autonomous robots. These are human controlled, more like an RC car with an upgrade.
But there are aerial drones currently in use that have AI targeting.
Actually autonomous? Or guided by computer vision? Technically CV is AI but not what concerns most people.
Ukraine has drones as autonomous as “there must be a tank around these parts, go destroy it!” and the drone executes.
Now they even recognize the type of tank and its armor, so that they target its known weak points, and if they can’t, they target the turret to make it at least unusable.
They probably had something similar in the submarine drone they used to sink the Russian submarine (it was most likely too far away and underwater for remote control, so the drone had to pick the right target, or had a precise map and couldn’t lose positioning).
So they’re probably close to making these slaughterbots autonomous to some degree as well.
I never imagined humanoid shapes to be the perfect Terminators except for infiltration anyway…
Right off the factory floor they are programmed to increase shareholder value. Some of them become accountants.
Which this article is not about.
There are loopholes. Killing a few Russian invaders will prevent many Ukrainian killings.
There’s at least one story I can remember about that.
I don’t think it’s a loophole. Surgeons hurt people in order to prevent a greater pain. ruZZia is just a cancer.
It’s seen as an unexpected loophole in the books. Similar to how a surgeon won’t kill one healthy person to save two with their organs.
At least in I, Robot.
Edit: also, in I, Robot they harmed very specific people with very calculated results. Not like going to war, even on the defensive side.

Where’s this from?
The Foundation adaptation on AppleTV. Where, famously, spoilers
Demerzel, the robot in this gif, kills lots of people by exploiting a loophole that lets robots commit genocide in supposed compliance with the 3 laws.
Holy shit. Random, I know but you just convinced me to check this out.
the show is solid
If you start on season 1, don’t let that dumpster fire dissuade you from the rest of the show. Once you get past that, the show is solid
They also explore this in I, Robot.
They are Isaac Asimov’s rules for robots. Google it, it’s a whole thing.
Now release sir killalot
Slaughterbot
Command & Conquer’s anti-tank drones, which conveniently serve as anti-infantry because you crush them.
How about the Abrams getting PERCH systems to launch switchblade drones? That one is straight out of C&C Generals.

Funny thing: the Russians had the V3 launcher in the game, then in Generals they had a “V4” drone version of this.
From the thumbnail I thought they used an image of Johnny 5.
MUCH DISASSEMBLE!!
Hell ya, fuck them up!
I’m sure there will be no negative repercussions whatsoever.
Just so nobody assumes the worst, why don’t you explain your reasoning here?
Thank you. I’m pro Ukraine, and terrified of robot armies. It’s great that they are being made to fight Russia, but is that the only use this technology will ever be put to? I mean, we are already looking down the barrel of weaponized drone warfare. Basically, war sucks and fuck Putin.
The primary use of unmanned ground vehicles is logistics, especially carrying explodey things that it is very dangerous to ask a human to haul through a battlefield. If a UGV carrying a bunch of mortar rounds is hit by an FPV drone, that sucks for the mortar crew that doesn’t get its ammunition, but it is unimaginably better than a human driving a truck full of mortar rounds being hit by one.
The next most useful application of unmanned ground vehicles is casualty evacuation under fire, in conditions too risky for humans to attempt.
The third most useful is as a weapon.
So yes, this is scary, but honestly a human with a gun is still far more terrifying. The broader future of these big derpy rolling ground robots is disaster response and lifesaving applications; they are useful for war too, but in very specific, human-coordinated contexts.
You just know that Trump is salivating to deploy something like this against ANTIFA. It’ll be allowed to go into fully autonomous mode once it’s modified to shoot “non-lethal” (yes, I know that’s not the right term, but that’s what they will say) rounds as deemed fit by their specially trained AI, basically shooting anything moving that isn’t ICE.
I guess it’s because combat-trained killer robots that are already better than an army (granted, the Russian army, so not a high bar, but still) don’t have amazing implications for the future of humanity.