Apple’s new state-of-the-art phone, the iPhone X, has received accolades for replacing the familiar home button with facial recognition technology, known as FRT. Users can unlock the phone simply by looking at it. The phone then uses a computer algorithm to compare the owner’s face with stored data, such as eye position and skin texture, and unlocks if there is a “match.” Cool, right?
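For the curious, the idea of a “match” can be sketched in a few lines of code. This is a deliberately simplified illustration, not Apple’s actual Face ID algorithm, whose details are proprietary; the sample numbers and the similarity threshold below are made-up assumptions. The point is simply that the phone reduces the enrolled face and each new scan to numeric data and unlocks only when the two are close enough.

```python
import numpy as np

# Purely illustrative sketch of threshold-based face matching -- not
# Apple's actual Face ID pipeline. The tiny "feature vectors" and the
# 0.6 threshold below are assumptions for demonstration only.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(enrolled: np.ndarray, scan: np.ndarray, threshold: float = 0.6) -> bool:
    """Unlock only if the fresh scan is close enough to the enrolled face data."""
    return cosine_similarity(enrolled, scan) >= threshold

enrolled_face = np.array([0.12, 0.85, 0.33, 0.47])  # stored when Face ID is set up
fresh_scan = np.array([0.10, 0.88, 0.30, 0.45])     # captured at unlock time
print("Unlock:", is_match(enrolled_face, fresh_scan))
```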
Well, not entirely.
The iPhone X moves us closer to a world in which, simply by walking outside or picking up a personal iPhone, we will be monitored with no idea of the extent.
FRT has been around for a while, so its privacy implications are not new. What is notable about Apple’s iPhone X is that it will catapult FRT into the mainstream of everyday life.
Think about it. Tens of millions of loyal Apple users will soon be handing over detailed scans of their faces to the tech giant, to do with as it pleases. Just as the disposable camera gave way to the cell phone, the police lineup photo and the fingerprint could swiftly become technologies of the past.
We have all seen police investigating crimes on TV. They show up to a murder scene, collect hair and blood samples and maybe DNA, and then run that evidence against their databases. Then they talk to lots of people who might have been there when the crime occurred or saw something suspicious beforehand. The cops flash their badges, identify themselves and ask potential witnesses about something that happened in the past. In the next scene, one of the witnesses tips off the perpetrator that the police are on to him.
FRT is different.
Identifying someone with FRT can be done secretly, from behind a computer. The “witnesses” are not live human beings but images from social media, closed-circuit TV cameras at ATMs and satellites — as well as other publicly available big data which, when run through a computer algorithm, can produce a virtual dossier of a person’s whereabouts, acquaintances, activities and beliefs.
With FRT, the “fingerprint” used to match someone to a crime scene is the high-tech selfie stored on an iPhone X. FRT could even be applied to large groups of innocent people, matching their facial images against loads of photos and other bits of cyber-data from days, weeks and months before — despite zero evidence that anyone committed a crime.
Remember Tom Cruise’s 2002 futuristic film, “Minority Report”? As FRT grows more sophisticated, the police could eventually have the technological capacity to identify wrongdoers before they commit crimes. Recall that Cruise’s character was innocent, accused of a future murder. According to a 2016 report by Georgetown Law’s Center on Privacy & Technology, the faces of one in two American adults are already in a law enforcement face recognition network. And the systems are rarely audited for abuse.
Of course, Apple is a private company. While the government is constrained by the Constitution, Apple can “just say no” to the government snoops and refuse to hand over its customers’ FRT data. But on the flip side, Apple can also perform whatever search of users’ FRT data it wants, or include whatever fine print is required to get users to “consent” to its selling their face scans to the highest bidders (including the government).
In other words, Apple can do stuff with your personal information that would mean big constitutional trouble if done by the government. But the Constitution’s constraints on government are hardly a reason not to worry about FRT’s implications for individual freedoms.
First, we know from the FBI-Apple encryption dispute involving the shooter in the December 2015 attack in San Bernardino, California, that the day will likely come when the government tries to get Apple to turn over FRT data from a suspect’s iPhone. The Constitution would not necessarily stop Apple from cooperating with the government in that circumstance. Although the Fourth Amendment protects against unreasonable searches and seizures, a face is exposed to public view whenever a person moves about in public.
Under Supreme Court precedent developed before the technology revolution, exposing one’s face arguably operates as a waiver of Fourth Amendment protections. If the guy on the street can see your face as you walk by, so can the government.
Second, as others have noted, the government might not even need Apple’s help in order to get inside an iPhone via FRT. As things stand, nobody knows for sure whether the police will be able to open an iPhone X — and access the troves of personal data it contains — just by flashing it before the owner’s face during an arrest, or by holding up a lineup photo to the iPhone’s screen once the person is in custody.
Third, even if the iPhone X’s contents remain secure, the simple fact that users’ face scans will be stored in perpetuity has profound constitutional implications. Imagine a spirited political protest on the National Mall in Washington, D.C., where all participants — as a matter of course — are scanned, matched, tracked and traced by the opposition political party, which happens to be in power.
Anonymity — with its First Amendment implications for freedom of speech and association — would soon be a thing of the past. Studies have shown that people of color, in particular, are more likely to be erroneously identified by FRT in its current iteration. Yet when it comes to new technologies like FRT, constitutional law resides in the dark ages, with scant time to catch up before individual privacy becomes extinct along with the mastodons.
Unless Congress steps in to legislate privacy protections that respond to the rapid-fire advancement of technologies like the iPhone X, Apple users’ ability to move about anonymously and without the threat of criminal liability could hinge on private-sector profit margins and (let’s hope) the integrity of corporate officers. For now at least, manufacturers like Apple may be our most vital defense against the possible emergence of a police state. Let’s just hope corporate America decides to “regulate” this area for us in ways that our democratic system would condone.
Kimberly Wehle is a professor of law at the University of Baltimore School of Law, a former assistant United States attorney and associate independent counsel in the Whitewater investigation, and author of the forthcoming book, “The Outsourced Constitution: How Public Power in Private Hands Erodes Democracy.”