The U.S. military kindly asks you to trust its death robots
In the opening days of the 2003 Iraq War, automated Patriot missile batteries downed a British Tornado fighter, killing both crew members. A few days later, another Patriot shot down an American F/A-18 over southern Iraq, killing that pilot. The crews manning the automated missile launchers trusted that the system would work as advertised. Why didn't it?
Benjamin Lambeth wrote in his exhaustive Iraq memoir The Unseen War that “many allied pilots believed that the Patriot posed a greater threat to them than did any [surface-to-air missile] in Iraq’s inventory.”
“The Patriots scared the hell out of us,” one F-16 pilot remarked. In one case an Air Force pilot actually fired a HARM anti-radar missile at a Patriot battery, destroying its radar dish. No one in the Patriot crew was hurt, and the airman said, “they’re now down one radar. That’s one radar they can’t target us with any more.”
When asked if the error was human or mechanical, Lt. Gen. Ronald Kadish, then-director of the Pentagon's Missile Defense Agency, said, "I think it may be both."
A software malfunction in 2008 caused U.S. Army robots to aim at friendly targets. No one was killed because a human was at the trigger. Those robots were still in Iraq with troops as of 2009.
An analysis of the U.S. Navy’s own data on the development of automated weapons says “with hundreds of programmers working on millions of lines of code for a single war robot, no one has a clear understanding of what’s going on, at a small scale, across the entire code base.”
An Air Force unit called the Human Trust and Interaction Branch — that interaction being between humans and automated equipment — based out of Wright-Patterson Air Force Base, Ohio, is looking to study this interaction to develop new types of recon and intelligence tools via a proposed research contract called "Trust in Autonomy for Human Machine Teaming."
On FedBizOpps (the official government contracting website with a domain name as trustworthy as any payday lender) the Air Force listed a contract for research in understanding “the most significant components driving trust and performance within human-robotic interaction. We need research on how to harness the socio-emotional elements of interpersonal team/trust dynamics and inject them into human-robot teams.”
The $7.5 million contract listing continues with this: “These human-machine teaming dynamics can involve research studying the interplay of individual differences, machine characteristics, robot design, robot interaction patterns, human-machine interaction, and contextual facets of human-machine teaming.”
In plain language, the research is focused on how humans use machines — when they trust the machines and when they don't. In the cases where Patriot batteries shot down friendly pilots, the automated system notified the human crews via a popup warning that the machine would fire if no one stopped it. When no one did, the Patriot intercepted the allied aircraft, just as programmed.
The Air Force contract is another example of the military “not knowing what’s going on.” As the Air Force explores our trust issues with robots, the Navy is warning us that “early generations of such [automated] systems and robots will be making mistakes that may cost human lives.”
Humans do come to trust their machines. Previous studies found that humans will bond with machines. U.S. Army Explosive Ordnance Disposal (EOD) technicians have been found to form emotional attachments to their bomb-disposal robots. The robots are awarded homemade medals and given names, ranks, and sometimes funerals. This level of trust could be misplaced as the bots are armed and the stakes of a malfunction grow higher.
Other current automated units in the U.S. military arsenal include Air Force Predator and Reaper drones, the Navy’s Phalanx Close-In Weapon System, the Army’s tracked, unmanned ground vehicle called TALON (or SWORDS) and the Marines’ unmanned ground vehicle called the Gladiator.
Recent news about the increased ability of machines to deceive their human masters, and the warnings from the scientific and computing community about the overdevelopment of artificial intelligence (AI) as weapons, don't seem to be a concern — even though the Army is developing an automated RoboSniperCopter and is trying to remove humans from the battlefield altogether. A 2013 Gizmodo piece listed the robots debuted for the Army at a Fort Benning "Robotics Rodeo," featuring an entry from the company that brought you the Roomba.
Major commercial and scientific computing experts believe AI weaponry would be the third revolution in warfare, after gunpowder and nuclear arms. An open letter from this community expressed the concern that an unchecked arms race in automated death robots would result in drones becoming the new AK-47 (presumably meaning cheap, deadly, and ubiquitous).
The UN is urging the world's nations to ban the development of automated weapons, citing the "legal and ethical questions" the use of such a weapon would raise. The aforementioned U.S. Navy report recommended building a "warrior's code" into the weapons to keep them from killing us all once Skynet becomes self-aware.