Microsoft argues facial-recognition tech could violate your rights

Microsoft president Brad Smith says facial-recognition technology needs to be regulated.

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress against 25,000 arrest mugshots. The result: 28 members of Congress were incorrectly matched with people who had been arrested.
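For readers curious how such a comparison works in practice, here is a minimal sketch using Amazon Rekognition’s public API via the boto3 Python library. The folder names, collection ID, and the 80% similarity threshold are illustrative assumptions, not the ACLU’s actual code or settings.

```python
# A sketch of a mugshot-versus-portrait comparison with Amazon Rekognition.
# Paths, collection name, and threshold are hypothetical.
import os
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

COLLECTION_ID = "mugshot-collection"  # hypothetical collection name
rekognition.create_collection(CollectionId=COLLECTION_ID)

# Index each mugshot into the collection so it can be searched later.
for filename in os.listdir("mugshots"):
    with open(os.path.join("mugshots", filename), "rb") as f:
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            ExternalImageId=os.path.splitext(filename)[0],
        )

# Search the collection with each portrait. Any face above the similarity
# threshold comes back as a "match" -- which is where false positives arise.
for filename in os.listdir("portraits"):
    with open(os.path.join("portraits", filename), "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=80,  # Rekognition's default threshold
            MaxFaces=1,
        )
    for match in response["FaceMatches"]:
        print(filename, "matched", match["Face"]["ExternalImageId"],
              f"({match['Similarity']:.1f}% similarity)")
```

The key design point is the threshold: set it lower and the system flags more candidates, raising the odds of a false match like the ones the ACLU reported.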

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon. The nuances are complex, and Congress remains as reluctant as ever to regulate privacy. We may find ourselves still struggling to agree on norms well after the technology has redefined everything from policing to marketing.

My face is my passport

The underlying problem is simple: the proliferation of connected cameras, databases of facial images, and software linking the two has made facial recognition not just cheap but increasingly unavoidable.

Or as Nicol Turner-Lee, a fellow at the Brookings Institution in Washington, put it: “We live in an economy of images.”

Of course, this can be good. Encrypted, on-device facial-recognition systems such as Apple’s (AAPL) Face ID and Microsoft’s Windows Hello let you sign into a phone or laptop easily without putting your facial characteristics in a cloud-hosted database.

Or facial recognition may come as a choice, subject to your own calculus of convenience versus privacy. If you value streamlining air travel enough, you can opt into experiments in biometric identification such as the facial-recognition boarding test at LAX.

Or it can be beyond your control. Maybe your state shares its driver’s-license database with other government agencies—a 2016 study by Georgetown Law’s Center on Privacy and Technology found that at least 26 states had opened those databases to police searches, covering the faces of more than 117 million American adults. Or the passport-control checkpoint at an international border—where you already have minimal rights—may demand you look into an automated camera.

What rules should we have?

The problem is that you can’t count on knowing when a camera and its software have identified you, on their recognition algorithms being accurate, or on the underlying databases being secure.