AI Weekly: Restricting surveillance technologies is a strong first step, but it’s not enough

In a letter to U.S. House leaders on Wednesday, over 100 racial justice and civil liberties groups called on Congress to end federal funding for surveillance technologies police use to spy on activists and demonstrators. In New York, the City Council voted in favor of the Public Oversight of Surveillance Technology (POST) Act, a bill that requires the New York City Police Department (NYPD) to disclose its use of surveillance technologies. In Detroit, residents and activists urged the City Council to reject a contract that would extend local law enforcement’s use of facial recognition.

Ahead of such legislation, some vendors voluntarily pledged to end or pause their relationships with police departments. Amazon and Microsoft said they would suspend — at least temporarily — sales of facial recognition solutions to law enforcement, citing a lack of regulation. IBM announced it would exit the “general purpose” facial recognition business, and Foursquare, a location technology platform used by a number of mobile apps and services, decided not to provide analytics on data from recent protests.

While this feels like a step in the right direction, a healthy dose of skepticism about these companies’ intentions is warranted. Tech giants are quick to ascribe their actions (or reactions) to noble causes, but motivations of a fiscal or political nature often play a part. Deborah Raji, a technology fellow at the AI Now Institute, noted that IBM had quietly removed face detection capabilities from its APIs last fall, and the Associated Press reported the company’s decision “is unlikely to affect its bottom line.” And Amazon has faced mounting pressure from regulators, consumers, shareholders, and employees over the continued sale of its Rekognition cloud service to police.

Even companies that appear to have fewer horses in the race may not be acting out of the goodness of their hearts. Microsoft — whose president, Brad Smith, recently described the company’s stand on the use of facial recognition as “principled” — had tried to sell its face detection technology to the U.S. Drug Enforcement Administration as far back as 2017, according to emails obtained by the American Civil Liberties Union. And a report by the Intercept and The Nation Institute’s nonprofit Investigative Fund shows IBM collaborated with the NYPD to develop a system that allowed officials to search for people by skin color, hair color, gender, age, and various facial features.


OneZero’s Jathan Sadowski, a postdoctoral research fellow in smart cities at the University of Sydney, borrowed a phrase from Macomb Community College professor Chris Gilliard in an op-ed this week: “Black power washing.” While moratoriums on the sale of surveillance technologies carry more heft than, say, tweets against systemic racism, they are ultimately calculated decisions. When “doing good” happens to align with a company’s best interests, such gestures fall short of a genuine ethical commitment.

Sadowski advocates holding companies to account by dismantling the police surveillance infrastructure, from cellphone-tracking Stingray devices and real-time analytics to predictive algorithms and data collection systems. For one thing, he asserts, these systems are deeply flawed at a technical level. Taking facial recognition as an example, a National Institute of Standards and Technology (NIST) study last December found systems misidentify Black people more often than White people, consistent with earlier landmark work on facial recognition technologies by Raji, Joy Buolamwini, Timnit Gebru, and Helen Raynham.

The POST Act, along with the 13 similar laws adopted by cities around the country, is a positive step. So are the regulations on law enforcement use of drones in 44 states; the facial recognition bans in San Francisco, Oakland, Somerville, Brookline, and San Diego; and the reform bill proposed by Democrats in both houses of Congress that includes restrictions on the use of police body cameras. But all of these measures stop short of preventing companies like Palantir, Ring, and Clearview AI — and to a lesser extent Microsoft, Google, and Amazon — from pursuing contracts that infringe upon the rights of ordinary people.

“The existing order of power is untenable,” Sadowski writes. “The companies that have profited from [surveillance] technologies and the governments that have used them against the public have no moral authority to tell us what to keep and what to trash. Dismantling the machinery of policing is a necessary method of confronting the forms of power that are channeled through this infrastructure.”

Increasingly, that appears to be the best option.

For AI coverage, send news tips to Khari Johnson and Kyle Wiggers — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI Channel.

Thanks for reading,

Kyle Wiggers

AI Staff Writer

Source: http://feedproxy.google.com/~r/venturebeat/SZYF/~3/ptZ5xsK8mNo/
