“Innovation is the key to modern policing, and we’re proud to be leveraging technology in a way that keeps our community safer,” then-Police Chief Kevin Vogel said.
The May 25 killing of George Floyd in Minneapolis sparked massive worldwide protests, along with an ongoing debate over whether money spent on police departments should be redirected to other government services that could better serve communities. In that climate, Santa Cruz made history again: In late June, it became the first city in the nation to ban predictive policing.
That wasn’t the only piece of law enforcement tech to be put in mothballs. This summer, Amazon and Microsoft announced moratoriums on their facial recognition services for law enforcement, while IBM announced it was getting out of the facial recognition business completely.
On the one hand, calls for police reform are prompting companies and institutions to reconsider a high-tech infrastructure that civil liberties groups and activists say perpetuates racial injustice and police brutality. Studies have found that Black Americans are disproportionately involved in use-of-force incidents and that police shoot and kill Black people at twice the rate of white people.
On the other hand, lawmakers are looking at how data and tech can improve accountability and identify police officers with a pattern of misconduct.
Adam Schwartz, a senior staff attorney with the Electronic Frontier Foundation in San Francisco, welcomes recent announcements by Amazon, IBM and Microsoft to end or suspend controversial facial recognition services. The police should also end their reliance on surveillance technologies such as predictive policing, the attorney adds.
“It’s reflective of the national conversation, where we are looking afresh at these technologies through the lens of the Black Lives Matter moment we’re in,” Schwartz says.
Predictive policing uses machine learning algorithms based on past crime data to attempt to forecast which areas should be more heavily policed and identify offenders and victims.
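Commercial predictive policing systems rely on proprietary, far more elaborate models, but the basic idea described above can be illustrated with a deliberately naive sketch: bin historical incident locations into map grid cells and rank the cells by count. Everything here (the coordinates, the grid resolution, the function names) is hypothetical, not any vendor’s actual algorithm.

```python
from collections import Counter

# Illustrative only -- not any vendor's actual model. A naive "hot spot"
# forecast that ranks map grid cells by the number of past incidents
# recorded in each cell. Note that a model like this simply mirrors its
# input: it inherits whatever bias the historical crime data contains.

def to_cell(lat, lon, precision=2):
    """Snap a coordinate to a grid cell by rounding to ~1 km squares."""
    return (round(lat, precision), round(lon, precision))

def hot_spots(past_incidents, top_k=3):
    """Rank grid cells by incident count; return the top_k cells."""
    counts = Counter(to_cell(lat, lon) for lat, lon in past_incidents)
    return [cell for cell, _ in counts.most_common(top_k)]

# Hypothetical incident coordinates (latitude, longitude)
incidents = [(36.9741, -122.0301), (36.9748, -122.0304),
             (36.9723, -122.0308), (36.9512, -122.0402)]

print(hot_spots(incidents, top_k=1))  # the densest cell
```

Because the forecast is driven entirely by where past incidents were recorded, heavier patrols in a flagged cell generate more records there, which in turn keeps the cell flagged; this feedback loop is central to the criticisms that follow.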
Critics warn, however, that predictive policing software echoes racial disparities and biases in police records and crime data. In June, hundreds of mathematicians wrote an open letter calling on their peers to stop developing algorithms and models for predictive policing programs.
“Given the structural racism and brutality in U.S. policing, we do not believe that mathematicians should be collaborating with police departments in this manner. It is simply too easy to create a ‘scientific’ veneer for racism,” the letter states.
The police department in Santa Cruz used a predictive policing model developed by the company PredPol, which is headquartered in the same city. In June, the city passed an ordinance banning predictive policing and facial recognition software.
Deputy Chief Bernie Escalante says the police department did not have enough information on how its predictive policing software worked to keep using it.
“It felt as if it was directing us a little bit down the road of, if not racially profiling, a biased-led type of policing,” Escalante says.
Police Chief Anthony Holloway of St. Petersburg, Florida, is a proponent of predictive policing, provided it is not used for racial profiling. Holloway, chair of the ABA Criminal Justice Section’s Law Enforcement Committee, says the software enables his department to improve public safety and better allocate resources.
“I think there’s a lot of misinformation out there about what law enforcement is doing with predictive policing, what law enforcement is doing with facial recognition,” Holloway says. “There should be some guidelines so those departments that are misusing this great technology don’t misuse it.”
Schwartz says there is scant evidence that predictive policing reduces crime. Instead, the attorney says it has led to “aggressive policing in minority communities” and could violate due process and privacy protections.
Besides predictive policing, law enforcement has adopted other forms of surveillance, including facial recognition. Researchers have shown that facial recognition software misidentifies people of color more frequently than white people. In 2018, Massachusetts Institute of Technology and Stanford University researchers found that three commercially available facial recognition programs misidentified darker-skinned women 34.5% of the time. Lighter-skinned men were misidentified 0.8% of the time.
Schwartz says facial recognition could also have a chilling effect on speech, if people believe they are being identified and surveilled.
“With face surveillance and predictive policing, we have arrived at the point that these techniques are so dangerous that we think the police should never use them,” Schwartz says.
In February 2018, the Stop LAPD Spying Coalition filed a lawsuit against the Los Angeles Police Department demanding information on its predictive policing program. The LAPD eventually released the records to the group in December 2019. Earlier this year, the police department suspended the program, citing budget constraints because of the coronavirus pandemic.
Hamid Khan, one of the principals of the coalition, says police follow a familiar pattern. Officials tout a technology’s public safety benefits and claim it fosters trust in the community. But over time, he says, these tools are exposed as another form of surveillance.
“Any tool can be changed into a tool of harm,” Khan says.
Activist Malkia Devich-Cyril is concerned that the commitment to suspend facial recognition could be short-lived.
“The companies that are deciding not to sell these controversial products as a powerful protest movement gains traction may be motivated more by careful calculation of financial and public relations risk than by concern for Black lives,” Devich-Cyril wrote in a July Atlantic article.
Jody Westby, CEO of technology services firm Global Cyber Risk and chair of the ABA’s Section of Science & Technology Privacy and Computer Crime Committee, warns that police departments should not lean too heavily on technology.
“It’s a very slippery slope when you start giving police technology to help them solve crimes—especially before a crime has been committed—instead of getting up and doing the legwork,” Westby says.
Dustin DeRollo is a spokesman for the Los Angeles Police Protective League, the union representing LAPD officers. He says if ending predictive policing helps departments get back to a place where officers are directly engaging with the public, then it’s a positive development.
“No algorithm is ever going to replace a human being,” DeRollo says. “There’s an instinctual part of police work that can go by the wayside if you’re overreliant on the technology.”
DeRollo says facial recognition merits further consideration as a tool for investigating crimes where real-time data is critical, such as child abductions or terrorist attacks.
“We completely understand that the technology needs to be further developed to be as accurate as possible,” DeRollo says. “By the same token, we’d like to have every tool we can to help solve crimes and keep people safe.”
Accountability through data
Through social media platforms, smartphone cameras and apps, tech companies have given people the tools they need to document police misconduct.
“What we would say to the government is, ‘Get out of the way,’” Schwartz says. “While there is a First Amendment right to record police, all too often police officers are grabbing cameras or retaliating with arrest against people.”
Lawmakers are seeking greater accountability and transparency through a national registry.
The FBI’s national use-of-force database has been in development for years, but data collection remains inconsistent: As of May, just 40% of police departments had sent information to the agency, the FBI told Time magazine.
In June, Democrats in the U.S. House of Representatives passed a criminal justice reform bill that would create a national database to log instances of police misconduct, use of force, disciplinary records, complaints and dismissals. Senate Republicans have also proposed a national registry of use-of-force incidents.
At this year’s annual meeting, the ABA responded to calls for police reform with Resolution 116A. It urges legislators to create a central database of excessive-force and deadly-force complaints, along with disciplinary records, to prevent problem officers from moving from one jurisdiction to another.
Body and dashboard cameras could also promote accountability, though Schwartz cautions that they can likewise be used as tools for surveillance. To ensure transparency, he says, officers must turn on their cameras whenever they come into contact with a civilian and keep them on.
“Otherwise, you just have this gamesmanship where the cops turn on the camera when they’re good and turn off the camera when they’re bad,” Schwartz says.
The public should also have timely access to footage of violent incidents, Schwartz says. Police departments should not delay their release, the attorney adds, as Chicago officials did in the case of Laquan McDonald, a teenager murdered by Chicago Police Officer Jason Van Dyke in October 2014; video of the shooting was not released until November 2015.
Harvey Rishikof is a law professor at Temple University in Philadelphia and co-chair of the ABA National Task Force on Cybersecurity and the Law. He says body cameras come with “a whole new understanding of what privacy is and is going to be.” He adds, “When the officer is interacting with the public, the public loses its privacy, too.”
Looking in the mirror
Some view efforts to end these surveillance technologies as just a small piece of the controversial push to defund or, in some cases, abolish the police. Hannah Sassaman is a policy director at the Movement Alliance Project, a nonprofit advocacy group that organizes programs with a focus on media, technology, and racial and economic justice. She says instead of investing in police departments, there should be a greater investment in communities.
“It’s not about whether or not police use tech; it’s whether or not we can make the footprint of police smaller year after year,” Sassaman says. “You don’t need an app for that.”
But some advocates of police reform are wary of defunding the police. At the ABA Annual Meeting earlier this month, Republican Sen. Tim Scott of South Carolina told then-ABA President Judy Perry Martinez that police officers should be supported in the field, not defunded.
“The concept of defunding the police is the scariest thought I’ve ever heard as it relates to communities of color and the vulnerable communities,” Scott says.
Escalante says recent outrage over racial injustice drove the decision to ban predictive policing, and his police department welcomes reforms.
“Systemically, I think we all can take the time to look at ourselves in the mirror and see how we can get better as individuals or institutions,” Escalante says.