
Sunday, June 14, 2020

Microsoft won’t sell facial recognition to police until Congress passes new privacy law

Microsoft has followed competitors Amazon and IBM in restricting how it provides facial recognition technology to third parties, in particular to law enforcement agencies. The company says it does not currently provide the technology to police, and it now says it will not do so until there are federal laws governing how it can be deployed safely and without infringing on human rights or civil liberties.

IBM said earlier this week it will outright cease all sales, development, and research of the controversial tech. Amazon on Wednesday said it would stop providing it to police for one year to give Congress time to put in place “stronger regulations to govern the ethical use of facial recognition technology.”

Microsoft president @BradSmi says the company does not sell facial recognition software to police depts. in the U.S. today and will not sell the tools to police until there is a national law in place “grounded in human rights.” #postlive pic.twitter.com/lwxBLjrtZL

— Washington Post Live (@postlive) June 11, 2020

Microsoft president Brad Smith most closely echoed Amazon’s stance on Thursday in outlining the company’s new approach to facial recognition: not ruling out that it will one day sell the tech to police but calling for regulation first.

“As a result of the principles that we’ve put in place, we do not sell facial recognition technology to police departments in the United States today,” Smith told The Washington Post. “But I do think this is a moment in time that really calls on us to listen more, to learn more, and most importantly, to do more. Given that, we’ve decided that we will not sell facial recognition to police departments in the United States until we have a national law in place grounded in human rights that will govern this technology.”

Smith said Microsoft would also “put in place some additional review factors so that we’re looking into other potential uses of the technology that goes even beyond what we already have.” That seems to indicate Microsoft may still provide facial recognition to human rights groups to combat trafficking and other abuses, as Amazon said it would continue doing with its Rekognition platform.

“We will not sell facial recognition to police departments until we have a national law in place grounded in human rights.”

Amid ongoing protests around the US and the world against racism and police brutality and a national conversation about racial injustice, the tech industry is reckoning with its own role in providing law enforcement agencies with unregulated, potentially racially biased technology.

Research has shown that facial recognition systems, trained on data sets composed mostly of white men, have significant trouble identifying darker-skinned people and even determining their gender. Artificial intelligence researchers, activists, and lawmakers have for years sounded the alarm about selling the technology to police, warning not just of racial bias, but also of the human rights and privacy violations inherent in a technology that could contribute to the rise of surveillance states.

While Microsoft has previously sold police departments access to such technology, the company has since taken a more principled approach. Last year, Microsoft denied California law enforcement access to its facial recognition tech out of concern for human rights violations. In March, it also announced it would no longer invest in third-party firms developing the tech, following accusations that an Israeli startup Microsoft had invested in provided the technology to the Israeli government for spying on Palestinians. (Microsoft later declared that its internal investigation found that the company, AnyVision, “has not previously and does not currently power a mass surveillance program in the West Bank,” but it divested from the company nonetheless.)

Microsoft has been a vocal supporter of federal regulation that would dictate how such systems can be used and what safeguards will be in place to protect privacy and guard against discrimination. Smith himself has been publicly expressing concerns over the dangers of unregulated facial recognition since at least 2018. But the company was also caught last year providing a facial recognition dataset of more than 10 million faces, including images of many people who were not aware of and did not consent to their inclusion in the dataset. The company pulled the dataset offline only after a Financial Times investigation.

According to the American Civil Liberties Union (ACLU), Microsoft, as recently as this year, supported legislation in California that would allow police departments and private companies to purchase and use such systems. That follows laws in San Francisco, Oakland, and other California cities that banned use of the technology by police and governments last year. The bill, AB 2261, failed last week, in a victory for the ACLU and a coalition of 65 organizations that came together to combat it.

Matt Cagle, the ACLU’s technology and civil liberties attorney with its Northern California branch, released this statement on Thursday regarding Microsoft’s decision:

When even the makers of face recognition refuse to sell this surveillance technology because it is so dangerous, lawmakers can no longer deny the threats to our rights and liberties. Congress and legislatures nationwide must swiftly stop law enforcement use of face recognition, and companies like Microsoft should work with the civil rights community — not against it — to make that happen. This includes halting its current efforts to advance legislation that would legitimize and expand the police use of facial recognition in multiple states nationwide.

It should not have taken the police killings of George Floyd, Breonna Taylor, and far too many other Black people, hundreds of thousands of people taking to the streets, brutal law enforcement attacks against protesters and journalists, and the deployment of military-grade surveillance equipment on protests led by Black activists for these companies to wake up to the everyday realities of police surveillance for Black and Brown communities. We welcome these companies finally taking action — as little and as late as it may be. We also urge these companies to work to forever shut the door on America’s sordid chapter of over-policing of Black and Brown communities, including the surveillance technologies that disproportionately harm them.

No company-backed bill should be taken seriously unless the communities most impacted say it is the right solution.
