Yeah, I said it.
Oh, I’m sorry, did you think that Robocop was a piece of satire? No. As this New York Post piece points out, it got some things right. And now it’s time to add automated use of force in policing to that list.
This is where we’re at. We are in the time when the Dallas police force uses bomb drones to eliminate snipers firing on police officers and protesters. This may indeed be unprecedented, but that doesn’t make it unexpected, or impossible to predict. This moment in time, like all others, takes place at the intersection of several trends.
Think of trends as billiard balls on a pool table. Like all objects in motion, they proceed until they’re interrupted by another trend of equal or greater mass. The force driving those trends is more elemental. Sometimes the driver putting the trend in motion really is a hand — the invisible hand of capitalism. Sometimes the thing directing the trend’s course is the slant of the pool table — systemic inequality. But no matter the trend or the driver, they hit each other in ways that we can model and plan for. The game is never over. The game is constantly changing. Each time the game changes, you have to plan your next move.
And that’s what strategic foresight is. It is the process of discussing the game, modelling possible outcomes, and deciding on the next move that will lead you to your goal. Like all types of strategy, it means having a good picture of all the elements in play, a good understanding of how they interact, and all the angles involved. For this reason, some skeptics like to describe foresighters and futurists as hustlers.
Because “hustler” is another word for “person who won the game in a way you didn’t anticipate.” Are some futurists charlatans? Yes. There are charlatans in every field. And as in every field, the way to avoid charlatans is to find the people who tell you things you don’t want to hear.
Which is why I’m about to tell you that the moment to re-train American police officers has already passed.
Let’s look at some trends bouncing off each other in this moment:
- Militarization of police forces
- Increasing automation
- Mass shootings
- Racialized violence
- Widening inequality on a global scale
Now, when you look at those trends, a police force under fire reverse-engineering a bomb drone once used in a military capacity to deliver a bomb to a man who once served in that same military may seem like contrapasso worthy of Dante. But it’s not exactly unimaginable. And now that it’s happened once, it’ll happen again. In fact, the process is going to get even more refined and, dare I say it, procedural.
Take a look at Silicon Valley venture capitalist Shervin Pishevar’s idea for app-based policing:
> The mobile app would start a FaceTime-like call or, for someone who does not have a smartphone, a phone call between the police officer and the citizen. His theory is that exchanging information such as driver’s license and registration over the mobile app, rather than face to face, could keep tensions from escalating during a traffic stop.
Now, there are a lot of things wrong with this idea. Smarter people than I am have picked it apart. But there are plenty of successful bad ideas out there. Mr. Pishevar is also an investor in Uber, and Uber is a terrible idea. It’s terrible for drivers (who get paid pennies), it’s terrible for passengers (who pay more on rainy nights for the possibility of being sexually assaulted by an uninsured contractor), and it’s terrible for the environment (because it perpetuates the existence of fossil-driven cars at the expense of public transit). But you know what Uber is? It’s convenient.
And you know what drones are? They’re convenient. That’s why we use them abroad. Because they’re more convenient than sending human beings abroad. Because they’re more convenient than using HUMINT. Because they’re more convenient than teaching operatives to speak the four different languages and their associated dialects that might be spoken in a given region. Because they’re more convenient than sending SEAL teams with years of extremely expensive training to assassinate small fry before they become big fish. So what just happened in Dallas?
Well, it’s like Uber, but for killing. And it’s been happening in Afghanistan and elsewhere for fifteen years. And the same robots used there are sold to police departments in the US for a cut rate. Because how else do you justify the massive amounts of spending and procurement associated with these programs? By telling Congress that the machines will find a good home somewhere, after the war. A farm. Upstate. Where they can run and play and shoot the shit out of active shooters. They that sow the wind, shall reap the whirlwind.
This is not to say that I want a future where police officers don’t know how to speak to human beings. In fact, that’s the opposite of what I want. But I’ve also gone on the record as saying that automation can eliminate a great deal of fear — that dealing with a machine is often easier and less fraught than dealing with a person. (For example, I love dealing with the kiosk at Canadian customs. It’s fast, it’s easy, and I spend less time with people holding weaponry.) And I have been naive in my trust of those machines. Sure, getting a computer to be racist means programming in the racism. But that hasn’t stopped people, so far. Why? Because existing structures of power work to perpetuate themselves, and that means perpetuating the existing culture, including bias.
Is it possible to change a policing culture? Yes. In fact, Dallas was on its way to doing exactly this. Individual police departments can (and sometimes should) do a complete overhaul of staff and policy to change their cultures. There are police officers out there who want this, who want to be safe and want to do a good job and want to protect people, all people, but who are hamstrung by existing policies and procedures. Guidelines and rules alone can’t make that change. Designers have a saying: Culture eats strategy for breakfast. If you want to change a culture, you have to go beyond re-training. You have to disrupt. You have to take away jobs, and give them to robots. You have to say, “We are afraid you’re going to turn into a godless killing machine, so we’ve hired a godless killing machine to replace you, because at least it doesn’t expect a pension.”
Because somehow that disruption is easier than the cultural one. Somehow buying robots is easier than disarming police officers, as they have in Ireland and Britain — countries with long histories of terrorism and violence.
So if you’re interested in the future of policing, or if you want police officers to have jobs in the future, you have some options. You can train police officers to stop shooting unarmed people, even when they happen to be black (or homeless, or mentally ill, or trans, or sex workers). You can hire police officers to patrol their own neighbourhoods, to enact true community policing where officers are also friends and neighbours who understand the needs and tensions of their beat. You can train them how to be human beings for the benefit of other human beings. Or you can train them to be robots, for the benefit of other robots.
Which is cheaper? Which is more convenient? Which do you think the voters in your district will pass a levy for?
What time is it, where you are?