November 26, 2022

Looking for a Thanksgiving table talk that isn’t politics or professional sports? Okay, let’s talk about killer robots. It’s a concept that long ago jumped from the pages of science fiction to reality, depending on how loose a definition you use for “robot.” Military drones abandoned Asimov’s First Law of Robotics (“A robot may not injure a human being or, through inaction, allow a human being to come to harm”) decades ago.

The topic has come to a boil again recently due to the growing possibility of killer robots in domestic law enforcement. Boston Dynamics, one of the best-known robot makers, raised some public policy red flags when it showed footage on our stage in 2019 of its Spot robot deployed as part of Massachusetts State Police training exercises.

The robots were not armed and were instead part of an exercise designed to determine how they could help keep officers safe during a hostage or terrorist situation. But the prospect of deploying robots in scenarios where people’s lives are at immediate risk was enough to prompt an inquiry from the ACLU, which told TechCrunch:

We urgently need more transparency from government agencies, which should be honest with the public about their plans to test and deploy new technologies. We also need state regulations to protect civil liberties, civil rights, and racial justice in the age of artificial intelligence.

Meanwhile, last year, the NYPD cut short a deal with Boston Dynamics following strong public backlash, after footage emerged of Spot being deployed in response to a home invasion in the Bronx.

For its part, Boston Dynamics has been vocal in its opposition to the weaponization of its robots. Last month, it signed an open letter, along with fellow industry leaders Agility, ANYbotics, Clearpath Robotics and Open Robotics, condemning the practice. The letter notes:

We believe that adding weapons to robots that operate remotely or autonomously, are widely available to the public, and are able to navigate to previously inaccessible places where people live and work, raises new risks of harm and serious ethical issues. Weaponized applications of these newly capable robots will also undermine public trust in the technology in ways that damage the enormous benefits they will bring to society.

The letter was believed to be, in part, a response to Ghost Robotics’ work with the US military. When images of one of its robotic dogs sporting a mounted rifle showed up on Twitter, the Philadelphia firm told TechCrunch it takes an agnostic stance on how its military partners use the systems:

We don’t do the payloads. Will we promote and announce any of these weapon systems? Probably not. It is difficult to answer. Since we are selling to the military, we don’t know what they do with them. We will not dictate to our government customers how they use bots.

Let’s draw the line where they are sold. We only sell to US and allied governments. We don’t even sell our bots to enterprise customers in adversarial markets. We get a lot of inquiries about our robots in Russia and China. We do not ship there, not even for our business customers.

Boston Dynamics and Ghost Robotics are currently involved in a lawsuit involving several patents.

This week, local news site Mission Local raised renewed concern about killer robots, this time in San Francisco. The site notes that a proposed policy being reviewed by the city’s Board of Supervisors next week includes language about killer robots. The “Law Enforcement Equipment Policy” begins with an inventory of robots currently held by the San Francisco Police Department.

There are 17 in total, 12 of which are in operation. They are designed largely for bomb detection and disposal, meaning none of them are specifically designed to kill.

“Robots listed in this section will not be used outside of training and simulations, criminal arrests, critical incidents, exigent circumstances, execution of a warrant, or during suspicious device evaluations,” the policy states. It then adds, more worryingly, “Robots will only be used as a deadly force option when the risk of loss of life to members of the public or officials is imminent and outweighs any other force option available to the SFPD”.

Indeed, under that language, robots can be used to kill, potentially to save the lives of officers or the public. It seems innocuous enough in this context, perhaps. At the very least, it appears to fall within the legal definition of “justified” deadly force. But concerns are emerging over what appears to be a profound policy shift.

For starters, using a bomb disposal robot to kill a suspect isn’t unprecedented. In July 2016, Dallas police officers did this for what was believed to be the first time in US history. “We saw no other option than to use our bomb robot and place a device on its extension to detonate where the suspect was,” Police Chief David Brown said at the time.

Second, it’s easy to see how a new precedent could be used in a CYA scenario, if a bot is intentionally or accidentally used that way. Third, and perhaps most alarmingly, one could imagine the language being applied to the acquisition of a future robotic system not designed exclusively for the detection and disposal of explosives.

Mission Local adds that SF Board of Supervisors Rules Committee Chair Aaron Peskin tried to insert the more Asimov-friendly line: “Robots shall not be used as a use of force against any person.” The SFPD apparently struck Peskin’s change and replaced it with the current language.

The renewed conversation about killer robots in California is happening, in part, because of Assembly Bill 481. Approved by Governor Gavin Newsom in September of last year, the law is designed to make police action more transparent. This includes an inventory of military equipment used by law enforcement.

The 17 robots included in the San Francisco document are part of a longer list that also includes the Lenco BearCat armored vehicle, flash-bangs and 15 submachine guns.

Last month, Oakland police said they would not seek approval for remote armed robots. The department said in a statement:

The Oakland Police Department (OPD) is not adding remote armed vehicles to the department. OPD participated in ad hoc committee discussions with the Oakland Police Commission and community members to explore all possible uses for the vehicle. However, following further discussions with the chief and executive team, the department decided it no longer wished to explore this particular option.

The statement followed public backlash.

Where Asimov’s First Law is concerned, the toothpaste is already out of the tube. The killer robots are here. As for the Second Law, “A robot must obey orders given to it by human beings,” that one is still within our grasp. It is up to society to determine how its robots behave.
