How the LAPD and Palantir Use Data to Justify Racist Policing

In a new book, a sociologist who spent months embedded with the LAPD details how data-driven policing techwashes bias.


The killing of George Floyd last May sparked renewed scrutiny of data-driven policing. As protests raged around the world, 1,400 researchers signed an open letter calling on their colleagues to stop collaborating with police on algorithms, and cities like Santa Cruz, New Orleans, and Oakland variously banned predictive policing, facial recognition, and voice recognition. But elsewhere, police chiefs worked to deepen partnerships with tech companies, claiming that the answer to systemic bias and racism was simply more data.

In her new book, “Predict and Surveil: Data, Discretion, and the Future of Policing,” sociologist Sarah Brayne slays that assumption with granular detail. An assistant professor at the University of Texas at Austin, Brayne did months of fieldwork at the Los Angeles Police Department and other law enforcement agencies in the area, tagging along as cops used software from Palantir, PredPol, and other companies. She learned that software vendors routinely show up at the department to peddle their wares, like pharmaceutical representatives visiting doctors’ offices. She noted how police used an automated license plate reader mounted outside an emergency room to build out networks of victims’ associates. A sergeant explained that family or friends would often drop off an injured person and then speed away. With the automated license plate reader, he said, police could use plate numbers to determine who else was connected to the victim, even if there was no other evidence linking them to a crime.
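
The mechanics the sergeant described are simple enough to sketch. Below is a minimal, hypothetical Python illustration of that workflow: a plate captured outside the ER is matched against registration records, and the registered owner is folded into the victim's network. The field names, records, and lookup here are invented for illustration; real ALPR pipelines and their joins against DMV data are far more elaborate.

```python
# Hypothetical sketch of the ER license plate workflow described above.
# Plate reads, registration data, and field names are invented; real ALPR
# systems and DMV lookups are far more involved.
plate_reads = [
    {"plate": "4ABC123", "location": "ER entrance", "time": "02:14"},
]
registrations = {"4ABC123": "Registered Owner X"}  # assumed DMV-style lookup

for read in plate_reads:
    owner = registrations.get(read["plate"])
    if owner:
        # Nothing links this person to a crime -- the plate read alone
        # adds them to the victim's network of associates.
        print(f"{owner} linked to victim via {read['location']} at {read['time']}")
```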

Riding along in department helicopters, Brayne saw how data was used to justify extreme measures. In order to get a reduction in crime in one division, police concluded that they had to fly helicopters overhead 51 times per week. They often increased that to 80 to 90 flyovers for good measure, meaning that many residents’ days were regularly interrupted by the noise of buzzing choppers. The cops seemed to register the intrusion. They dubbed the copters “ghetto birds.”

The cover of "Predict and Surveil: Data, Discretion, and the Future of Policing."

Image: Courtesy Oxford University Press

For years, scholars and activists have critiqued the algorithms used in data-driven policing, arguing that they merely techwash bias by making sloppy investigative work seem objective. Leading the charge in Los Angeles is the Stop LAPD Spying Coalition. Through public records requests, the group’s activists have obtained documents on police use of data analytics, and in 2018, they successfully pushed the city’s Office of the Inspector General to audit the department’s use of technology. “Surveillance is basically the tip of the policing knife,” said Hamid Khan, a co-leader of the coalition. “When you look at policing and the history of policing, from our vantage point, it’s not about public safety when it comes to nonwhite folks. It’s about the intent to cause harm.” Big data, he added, simply gives police more ways to do that.

Brayne’s contribution is showing exactly how data is distorted in the hands of police. “Most sociological research on criminal justice has focused on those who are being policed,” she told The Intercept. “I very deliberately wanted to flip the lens to focus on those doing the surveilling — on the police themselves.”

Andrew Guthrie Ferguson, a law professor at American University and author of “The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement,” said that Brayne’s work is an unflinching look at what happens when people in power use emerging technologies. “Her book is a completely original inside look at the development of big data surveillance at the height of the first generation of its adoption,” he said. “Sarah has been given access to the reality of big data policing in a way that no one else has — and probably, because of her success, no one else ever will.”


LAPD Captain Elizabeth Morales shows a printed map of predicted crime hot spots in the Foothill Division of Los Angeles, Calif., on Monday, October 24, 2016.

Photo: Patrick T. Fallon for The Washington Post via Getty Images

Operation LASER

Brayne first embedded with the department in 2013 as a 27-year-old graduate student. It was a critical moment for policing tech, and the LAPD and other departments were ramping up their use of technology. The LAPD had signed on with Palantir Gotham, which merges data from crime and arrest reports, automated license plate readers, rap sheets, and other sources. The department also had a contract with PredPol, which generates “boxes,” or hot spots, where property crimes like burglaries and auto theft are predicted to occur. A third program, Operation LASER, which stands for Los Angeles’ Strategic Extraction and Restoration program, used a points-based system, called a LASER score, to evaluate the risk that individuals posed.
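
PredPol's underlying model was proprietary, but the shape of its output is easy to approximate. The toy Python sketch below flags the map cells with the most recent property-crime reports; the actual system used a more sophisticated self-exciting point process model, and the grid, reports, and cutoff here are invented for illustration.

```python
# Toy stand-in for hot spot "boxes": rank small map cells by recent crime
# counts. PredPol's actual model was a proprietary self-exciting point
# process; the grid cells and reports below are invented for illustration.
from collections import Counter

# Recent property-crime reports, binned into hypothetical grid cells
# (each cell might stand for a roughly 500-by-500-foot square).
reports = [(3, 7), (3, 7), (4, 7), (3, 7), (9, 2), (4, 7), (3, 8)]

counts = Counter(reports)
hot_spots = [cell for cell, _ in counts.most_common(2)]
print(hot_spots)  # [(3, 7), (4, 7)] -- the "boxes" pushed to patrol officers
```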

In Brayne’s first interview, a captain boasted that the LAPD was at the forefront of technology adoption, detailing how software that had been developed for counterterrorism work was helping the department ward off future crimes. Afterward, Brayne asked to go on a ride-along with an officer. That trip revealed a more complicated picture. Instead of pulling up PredPol’s software on the laptop mounted to his dashboard, the cop worked with a printout of PredPol’s hot spots. He explained that the department’s in-car laptops had trouble loading even standard internet browsers. So much for technological wizardry.

Brayne and the officer took a break to eat In-N-Out burgers under a highway overpass, where they watched his colleagues bust up a grow operation and drag marijuana plants onto a flatbed truck. When they drove on to a new location, Brayne noticed that the officer typed in the address manually rather than let the car’s automatic vehicle tracker register his location. The union opposed officers being tracked, he explained. While predictive policing systems had caught average citizens in an opaque dragnet, police grew squeamish when the technology was turned back on them.

The notion that better technology can fix policing dates to at least the early 20th century. Back then, police departments were closely tied to city government political machines. Arguing that a data-driven approach would make policing more professional, reformers introduced tactics drawn from military operations, including signal boxes, telephone kiosks, and pin mapping. The militarization of police accelerated during the 1960s and has continued to the present day, to the point where even departments in placid American suburbs now have armored vehicles, night vision viewers, and bayonets.

“That’s a very visible manifestation of the militarization of policing,” said Brayne. “But something that’s more invisible is this creep of surveillance software into the daily operations of policing.”


A Los Angeles County Sheriff’s Department’s patrol car, outfitted with a computer system developed for the battlefield by a military contractor, is seen in Los Angeles on Nov. 16, 2011.

Photo: Michael Robinson Chavez/Los Angeles Times via Getty Images

After 9/11, the Department of Homeland Security gave state and local police $35 billion in grants, a portion of which was spent on developing data infrastructure. The infusion of cash created needs where there had been none. Police chiefs realized that the data they collected for Homeland Security could also be used for regular policing. In Los Angeles, some officials came to believe that predictive analytics and big data might also solve the department’s many problems. The LAPD had been racked by a stream of high-profile scandals, most infamously the 1991 beating of Rodney King. In 2001, the Department of Justice imposed a consent decree, a court order mandating that the department conduct regular audits and capture more data on officers and crimes. (A spokesperson for the LAPD declined to comment for this article.)

A key figure behind the shift toward predictive policing was William Bratton, who brought the data management system COMPSTAT to New York City before becoming the chief of the LAPD in 2002. Bratton oversaw the LAPD’s effort to merge various streams of data.

One result was Operation LASER, which was funded with nearly a million dollars from the federal Bureau of Justice Assistance. When they came into contact with someone who seemed suspicious, officers filled out a card with the person’s name, address, physical characteristics, vehicle information, gang affiliation, and criminal history. Each new point of contact with police earned a person one additional point. “There are a lot of chickenshit violations you can stop someone for,” a sergeant explained to Brayne during a ride-along. “Yesterday, this individual might have got stopped because he jaywalked. Today, he might have got stopped because he didn’t use his turn signal or whatever the case might be. So that’s two points.” The sergeant went on to argue that even such minor violations could help police predict the next crime because, taken together, they show “who is out on the streets.”
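
The arithmetic the sergeant described is trivial to reproduce. The sketch below is a minimal, hypothetical rendering of a LASER-style score in Python; the record structure and field names are assumptions rather than the LAPD's actual schema, but the core rule, one point per contact regardless of severity, follows his account.

```python
# Minimal, hypothetical sketch of a LASER-style points score. The Person
# record and its fields are invented; only the one-point-per-contact rule
# comes from the sergeant's description.
from dataclasses import dataclass, field

@dataclass
class Person:
    name: str
    points: int = 0
    stops: list = field(default_factory=list)

def record_stop(person: Person, reason: str) -> None:
    """Every police contact adds one point, however minor the violation."""
    person.stops.append(reason)
    person.points += 1

suspect = Person("Hypothetical Resident")
record_stop(suspect, "jaywalking")      # yesterday: 1 point
record_stop(suspect, "no turn signal")  # today: 2 points
print(suspect.points)  # 2 -- minor violations accumulate into a "risk" score
```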

Such a system means, of course, that individuals in overpoliced neighborhoods can easily get caught up in a vicious cycle where they are, as Brayne writes, “more likely to be stopped, thus increasing their point value, justifying their increased surveillance, and making it more likely that they will be stopped again in the future.” But some of the administrators whom Brayne encountered were apparently less interested in accuracy than they were in amassing more records. The goal, one captain told her, was simply to get people “in the system”: to capture larger and larger amounts of data on seemingly harmless individuals in the hope that the data would help solve a crime later on. Once an officer had a person in the system, they could set an alert to automatically track changes in that person’s profile.
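
The circularity is easy to demonstrate with a toy model. In the hedged Python simulation below, a resident's daily chance of being stopped is a base rate plus a term proportional to their existing score. All of the rates, the feedback coefficient, and the time horizon are invented, and nothing here reflects the LAPD's actual deployment logic, but the compounding dynamic Brayne identifies falls out immediately.

```python
# Toy simulation of the feedback loop: more points -> more stops -> more
# points. Base rates, feedback coefficient, and horizon are all invented.
import random

def simulate(base_rate: float, days: int = 365, feedback: float = 0.01) -> int:
    points = 0
    for _ in range(days):
        if random.random() < min(base_rate + feedback * points, 1.0):
            points += 1
    return points

random.seed(1)
low = [simulate(0.005) for _ in range(1000)]   # lightly policed neighborhood
high = [simulate(0.02) for _ in range(1000)]   # heavily policed neighborhood
print(sum(low) / len(low), sum(high) / len(high))
# Without feedback the base rates alone would yield roughly 2 and 7 stops a
# year; with it, both averages climb into the dozens and the gap balloons.
```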

Brayne watched Los Angeles police fill out cards for Operation LASER and noted that they often listed the people who were with a person when they were stopped. She calls this a “secondary surveillance network.” You don’t actually need to have contact with police to be caught up in the system; you just need to have had contact with someone who did. Similarly, Brayne learned that images captured by automated license plate readers sometimes showed the faces of people who were stopped with a person of interest. Those people too became data. When Brayne raised the issue of possible legal constraints around the rampant collection and sharing of information, an employee in the Los Angeles County Chief Information Office declared bluntly, “Consent is anachronistic.”
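
In graph terms, the cards build an edge list in which mere proximity creates a node. The Python sketch below is a hypothetical rendering of such a secondary surveillance network; the card format and names are invented, not the LAPD's or Palantir's actual data model.

```python
# Hypothetical sketch of a secondary surveillance network built from
# field-interview cards. Card structure and names are invented.
from collections import defaultdict

# Each card records the person stopped plus whoever happened to be present.
cards = [
    {"stopped": "Person A", "present": ["Person B", "Person C"]},
    {"stopped": "Person B", "present": ["Person D"]},
]

network = defaultdict(set)
for card in cards:
    for bystander in card["present"]:
        # The bystander had no police contact of their own, yet becomes
        # a queryable node linked to the person who was stopped.
        network[card["stopped"]].add(bystander)
        network[bystander].add(card["stopped"])

print(dict(network))  # Persons C and D are in the system without ever being stopped
```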

In some cases, even being the subject of a query can boost someone’s suspiciousness. Brayne watched one detective search a national database for a juvenile. He remarked that he could see how many times other users had queried the same name. He claimed that the feature had helped police catch criminals because it told them other officers had suspected the same individual. When Brayne asked why that was useful, he replied, “Just because you haven’t been arrested doesn’t mean you haven’t been caught.”

Decades of research and years of cellphone videos have shown, of course, that police regularly target people who haven’t committed crimes. On ride-alongs, Brayne watched cops run the plates of law-abiding drivers stopped at traffic lights, just in case the numbers turned up a record. But the anecdote underscores how thoroughly big data reinforces existing police biases. One captain claimed, “It’s just math.”


A banner displays Palantir signage during the company’s initial public offering in front of the New York Stock Exchange on Sept. 30, 2020, in New York.

Photo: Michael Nagle/Bloomberg via Getty Images

The World According to Palantir

In Brayne’s fieldwork, Palantir’s technology came in for special praise. “They catch bad guys during every training class,” one sergeant told her. A captain rhapsodized about the platform’s constant addition of new data sets, marveling at the addition of foreclosure information: “I’m so happy with how big Palantir got.” Another said of Palantir, “We’ve dumped hundreds of thousands [of dollars] into that. They are so responsive and flexible about what we want. They’re great. They’re going to take over the world. I promise you, they’re going to take over the world.”

But up close, the software was only as good as the people maintaining and using it. To make sense of Palantir Gotham’s data, police often need input from engineers, some of whom are provided by Palantir. At one point in her research, Brayne watched a Palantir engineer search 140 million records for a hypothetical man of average build driving a black four-door sedan. The engineer narrowed those results to 2 million records, then to 160,000, and finally to 13 people, before checking which of those people had arrests on their records. At various points in the search, he made assumptions that could easily throw off the result — that the car was likely made between 2002 and 2005, that the man was heavy-set. Brayne asked what happened if the system served up a false positive. “I don’t know,” the engineer replied.
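
The funnel Brayne watched can be rendered schematically. The Python sketch below strings together filters like the ones the engineer applied; the field names, sample records, and thresholds are invented stand-ins for Palantir Gotham's actual interface, but they show how each assumption silently narrows who counts as a match.

```python
# Schematic re-creation of the record-narrowing funnel. Fields, sample
# drivers, and thresholds are invented; this is not Palantir's interface.
def narrow(records):
    step1 = [r for r in records if r["car_color"] == "black"]
    step2 = [r for r in step1 if r["car_doors"] == 4]
    step3 = [r for r in step2 if 2002 <= r["car_year"] <= 2005]  # assumed model years
    step4 = [r for r in step3 if r["build"] == "heavy-set"]      # assumed build
    return step4

records = [
    {"name": "Driver 1", "car_color": "black", "car_doors": 4, "car_year": 2003, "build": "heavy-set"},
    {"name": "Driver 2", "car_color": "black", "car_doors": 4, "car_year": 2008, "build": "heavy-set"},
    {"name": "Driver 3", "car_color": "blue", "car_doors": 2, "car_year": 2004, "build": "slim"},
]

print([r["name"] for r in narrow(records)])  # ['Driver 1'] -- a single "suspect,"
# with no built-in step asking what happens if the match is a false positive
```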

There were dissenters. Some of the people tasked with implementing data-driven policing at the LAPD complained to Brayne that the software didn’t work as advertised. One person working in information technology said, “Our command staff is easily distracted by the latest and greatest shiny object.” Data integration was uneven. Some divisions had certain software, while others didn’t. Another employee told Brayne that the detective case management system was “sort of like a pimple,” adding, “they just, like, stuck it on top.”

When management was out of the room, police were honest about their doubts. “Looks bitching, but it’s worthless,” one sergeant told Brayne of the LAPD’s data analysis infrastructure, which is housed at the department’s Real-Time Analysis and Critical Response Division. One group of officers bought their captain a Ouija board to mock his faith in algorithms. A captain told Brayne that person-based predictive policing was “a civil liberties nightmare” and that he would never adopt it. (His division adopted it after he left.)

Activists who have fought for years against department use of technology told The Intercept that Brayne’s work is useful — but only to a point. “It was very helpful to uncover and learn about the extensive amounts of stuff that LAPD was doing,” said Jamie Garcia, an organizer with Stop LAPD Spying. “But then what?” In general, she added, she is tired of academics treating surveillance like a problem to observe and evaluate. “The information goes back to the ivory tower, and the ivory tower has this conversation with themselves about what they think about it instead of that information being brought directly to the community.”

Brayne said that she hopes her work can be useful to a broad variety of people. “Transparency is the first step towards accountability,” she said. “It is impossible to hold an individual or organization accountable if you don’t know what they’re doing.”

Forrest Stuart, a sociologist at Stanford who studies policing, said that her book is essential at a moment when anti-racism protests have prompted some law enforcement officials to further embrace technology as a means for reducing bias. “Some of the most popular proposed solutions to the over-policing and over-incarceration of black and brown communities have involved new technologies,” he wrote in an email. “There is a sense that if we could just design a good enough computer program, we could deploy police more fairly and reduce, or perhaps even eliminate, unwarranted disparities in criminal justice. Brayne’s book makes us take real pause and recognize the faults in this techno-optimist dream.”

The LAPD ended its contract with PredPol last April, citing financial constraints brought on by the pandemic. Operation LASER ended in 2019. But a range of other companies, including heavyweights like Amazon and Microsoft, have moved into the space nationally, and a guidebook published by the LAPD last year makes clear that big data will continue to figure prominently in policing in Los Angeles. Palantir, meanwhile, has lately expanded its access to data by moving into coronavirus tracking and vaccine safety analysis. Last year, the company brokered lucrative contracts with the National Institutes of Health and the Food and Drug Administration. It went public in September. Since then, its stock price has more than tripled.
