The killing of George Floyd last May sparked renewed scrutiny of data-driven policing. As protests raged around the world, 1,400 researchers signed an open letter calling on their colleagues to stop collaborating with police on algorithms, and cities like Santa Cruz, New Orleans, and Oakland variously banned predictive policing, facial recognition, and voice recognition. But elsewhere, police chiefs worked to deepen partnerships with tech companies, claiming that the answer to systemic bias and racism was simply more data.

In her new book, “Predict and Surveil: Data, Discretion, and the Future of Policing,” sociologist Sarah Brayne slays that assumption with granular detail. An assistant professor at the University of Texas at Austin, Brayne did months of fieldwork at the Los Angeles Police Department and other law enforcement agencies in the area, tagging along as cops used software from Palantir, PredPol, and other companies. She learned that software vendors routinely show up at the department to peddle their wares, like pharmaceutical representatives visiting doctors’ offices. She noted how police used an automated license plate reader mounted outside an emergency room to build out networks of victims’ associates. A sergeant explained that family or friends would often drop off an injured person and then speed away. With the automated license plate reader, he said, police could use plate numbers to determine who else was connected to the victim, even if there was no other evidence linking them to a crime.

Image: Courtesy Oxford University Press

For years, scholars and activists have critiqued the algorithms used in data-driven policing, arguing that they merely techwash bias by making sloppy investigative work seem objective. Leading the charge in Los Angeles is the Stop LAPD Spying Coalition. Through public records requests, the group’s activists have obtained documents on police use of data analytics, and in 2018, they successfully pushed the city’s Office of the Inspector General to audit the department’s use of technology.

“Surveillance is basically the tip of the policing knife,” said Hamid Khan, a co-leader of the coalition. “When you look at policing and the history of policing, from our vantage point, it’s not about public safety when it comes to nonwhite folks. It’s about the intent to cause harm.” Big data, he added, simply gives police more ways to do that.

Brayne’s contribution is showing exactly how data is distorted in the hands of police. “Most sociological research on criminal justice has focused on those who are being policed,” she told The Intercept. “I very deliberately wanted to flip the lens to focus on those doing the surveilling - on the police themselves.”

Andrew Guthrie Ferguson, a law professor at American University and author of “The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement,” said that Brayne’s work is an unflinching look at what happens when people in power use emerging technologies. “Her book is a completely original inside look at the development of big data surveillance at the height of the first generation of its adoption,” he said. “Sarah has been given access to the reality of big data policing in a way that no one else has - and probably, because of her success, no one else ever will.”

LAPD Captain Elizabeth Morales shows a printed map of predicted crime hot spots in the Foothill Division of Los Angeles, Calif., on Monday, October 24, 2016. Photo: Fallon for The Washington Post via Getty Images

Operation LASER

Brayne first embedded with the department in 2013 as a 27-year-old graduate student. It was a critical moment for policing tech, and the LAPD and other departments were ramping up their use of technology.