I Opted Out of Facial Recognition at the Airport—It Wasn’t Easy
By Allie Funk | Tue, 02 Jul 2019 13:00:00 +0000
The announcement came as we began to board. Last month, I was at Detroit’s Metro Airport for a connecting flight to Southeast Asia. I listened as a Delta Air Lines staff member informed passengers that the boarding process would use facial recognition instead of passport scanners.
Allie Funk is a research analyst for Freedom on the Net, Freedom House's annual country-by-country assessment of internet freedom. She focuses on developments in the US and Asia.
As a privacy-conscious person, I was uncomfortable boarding this way. I also knew I could opt out. Presumably, most of my fellow fliers did not: I didn't hear a single announcement alerting passengers how to avoid the face scanners.
To figure out how to do so, I had to leave the boarding line, speak with a Delta representative at the information desk, get back in line, and then request a passport scan when it was my turn to board. Federal agencies and airlines claim that facial recognition is an opt-out system, but my recent experience suggests they are incentivizing travelers to have their faces scanned—and disincentivizing them from sidestepping the tech—by not clearly communicating alternative options. Last year, a Delta customer service representative reported that only 2 percent of customers opt out of facial recognition. It's easy to see why.
As I watched traveler after traveler stand in front of a facial scanner before boarding our flight, I had an eerie vision of a new privacy-invasive status quo. With our faces becoming yet another form of data to be collected, stored, and used, it seems we’re sleepwalking toward a hyper-surveilled environment, mollified by assurances that the process is undertaken in the name of security and convenience. I began to wonder: Will we only wake up once we no longer have the choice to opt out?
Until we have evidence that facial recognition is accurate and reliable—as opposed to simply convenient—travelers should avoid the technology where they can.
The facial recognition plan in US airports is built around the Customs and Border Protection Biometric Exit Program, which uses face-scanning technology to verify a traveler’s identity. CBP partners with airlines—including Delta, JetBlue, American Airlines, and others—to photograph each traveler while boarding. That image gets compared to one stored in a cloud-based photo-matching service populated with photos from visas, passports, or related immigration applications. The Biometric Exit Program is used in at least 17 airports, and a recently released Department of Homeland Security report states that CBP anticipates having the ability to scan the faces of 97 percent of commercial air passengers departing the United States by 2023.
This rapid deployment of facial recognition in airports follows a 2017 executive order in which President Trump expedited former President Obama’s efforts to use biometric technology. The Transportation Security Administration has since unveiled its own plan to improve partnership with CBP and to introduce the technology throughout the airport. The opportunity for this kind of biometric collection infrastructure to feed into a broader system of mass surveillance is staggering, as is its ability to erode privacy.
Proponents of these programs often argue that facial recognition in airports promotes security while providing convenience. But abandoning privacy should not be a prerequisite for achieving security. And in the case of technology like facial recognition, the “solution” can quickly become a deep and troubling problem of its own.
For starters, facial recognition technology appears incapable of treating all passengers equally at this stage. Research shows that it is particularly unreliable for gender and racial minorities: one study, for example, found a 99 percent accuracy rate for white men, while the error rate for women with darker skin reached up to 35 percent. This suggests that facial recognition could actually increase the likelihood that women and people of color will be unfairly targeted for additional screening measures.
Americans should be concerned about whether images of their faces collected by this program will be used by companies and shared across different government agencies. Other data collected for immigration purposes—like social media details—can be shared with federal, state, and local agencies. If one government agency has a database of facial scans, it would be simple to share that data with others. This technology is already seeping into everyday life, and the regularity with which Americans encounter facial recognition while traveling will only reinforce that familiarity. In this context, it is easy to imagine a government-operated facial recognition database being used in settings beyond airports—say, to monitor peaceful protests.
There are also serious concerns about CBP’s storage of this data. A database with millions of facial scans is extremely sensitive, and breaches seem inevitable. Indeed, CBP officials recently revealed that thousands of photos of people’s faces and license plates were compromised after a cyberattack on a federal subcontractor. Once this sort of data is exposed, there is no getting it back. You cannot change your face the way you can a phone number or email address.
Importantly, there have been some efforts to address facial recognition in airports. The government’s Privacy and Civil Liberties Oversight Board recently announced an aviation-security project to assess privacy and civil liberties implications with biometric technologies. Members of Congress have also shared similar concerns.
Nevertheless, the Biometric Exit Program needs to be stopped until it prioritizes travelers’ privacy and resolves its technical and legal shortcomings. At the state and local level, public opposition has driven cities and states to consider—and, in some cases, enact—restrictions on the use of facial recognition technology. The same healthy skepticism should be directed toward the technology’s deployment at our airports.
Congress needs to supplement pressure from travelers with strong data protection laws that provide greater transparency and oversight. This should include strict limits on how long companies and government agencies can retain such intimate data. Private companies should not be allowed to utilize data collected for business purposes, and federal agencies should not be able to freely share this data with other parts of government. Policymakers should also ensure that biometric programs undergo thorough and transparent civil rights assessments prior to implementation.
Until measures like these are in place, travelers should think critically before submitting to facial recognition technology in airports. Ask yourself: Is saving a few minutes worth handing over your most sensitive biometric information?
WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here. Submit an op-ed at opinion@wired.com.