

“No-one sensible thinks that is happening,” said Martha Lane-Fox in this newspaper last week. Perhaps she’s revised her opinion after watching Unknown: Killer Robots, in which people working in the sector told us that they are coming, and soon. “The robot revolution has arrived, it’s just that it doesn’t look like what anybody had imagined,” explained a professor. That is: they don’t look like Arnold Schwarzenegger in Terminator. You have probably seen those ‘uncanny valley’ robot dogs, which were shown here being trained via AI simulation to walk over different terrain (piles of bricks, a surface covered in oil) in what the developers said could prepare them for working in disaster response scenarios.

But the programme was mostly concerned with AI as a weapon of war. A drone has been developed that can be piloted by AI to map and search buildings in war zones, looking for enemy combatants. So far, so benign. The company is run by a former Navy Seal who knows from personal experience that sending soldiers into those buildings blind to face close-quarters combat or IEDs is “the most dangerous thing that anyone can do in a combat zone, bar none”. The same manufacturer is working on “swarm deployment”, where multiple drones move collectively and make their own decisions while following a target. So these autonomous robots have the artificial intelligence necessary to take on the enemy.

Could the decision to weaponise them be too far away? A former US deputy secretary of state said that currently “a machine can do the killing but only at the behest of a human operator. And I don’t see that changing.” Which is all well and good, but what happens if the enemy chooses to do things differently? States that choose to go down this route will have such an advantage in warfare that, as a proponent of the technology put it, “it will be the equivalent of horses going up against tanks, people with swords going up against the machine gun”.
