Facial recognition technology is quickly spreading in the travel industry, but some observers worry that its rollout is outpacing concerns about accuracy and individual privacy.
Major airlines have teamed up with U.S. Customs and Border Protection to deploy facial recognition technology at airports, and now lodging giant Marriott International is testing widespread use of facial scans for guest check-ins.
Meanwhile, questions remain about how accurately facial scans are matched against stored data, and now the head of a major technology company is warning that use of facial recognition could be moving too fast. Microsoft president Brad Smith issued that warning in a blog post this week, noting that the spread of facial scans in public places could “potentially be misused and abused by private companies and public authorities alike.”
While the technology can have lots of positive uses, other potential applications are “sobering,” Smith wrote. “Imagine a government tracking everywhere you walked over the past month without your permission or knowledge. Imagine a database of everyone who attended a political rally that constitutes the very essence of free speech. Imagine the stores of a shopping mall using facial recognition to share information with each other about each shelf that you browse and product you buy, without asking you first.”
Smith also raised the question of accuracy in matching facial scans with existing photo databases, citing recent studies that have found biases in the technology. “The technologies worked more accurately for white men than for white women and were more accurate in identifying persons with lighter complexions than people of color,” he said.
A study by the Georgetown Law Center on Privacy and Technology said the Department of Homeland Security’s own data shows that its facial recognition systems – the ones being deployed at U.S. airports to validate passenger identities – “erroneously reject as many as one in 25 travelers using valid credentials.” CBP claims an accuracy rate of about 99 percent for its airport scans, but even that figure still implies one mistaken identity out of every 100 passengers.
The Georgetown Law report also said that while Congress has authorized DHS and CBP to gather biometric data from foreigners at the border, it “has never clearly authorized the border collection of biometrics from American citizens using face recognition technology. Without explicit authorization, DHS should not be scanning the faces of Americans as they depart on international flights—but DHS is doing it anyway.”
Airlines like the technology at boarding gates because it helps them load planes faster. Orlando International recently said it will be the first U.S. airport to conduct facial scans of all arriving and departing international passengers, and CBP is now testing facial scans at a dozen other U.S. airports. The most recent is Seattle-Tacoma, where Lufthansa is testing the tech with CBP. Other airports with CBP facial scans in place include Miami, Atlanta, New York JFK, San Diego, Houston (Intercontinental and Hobby), Washington Dulles, Las Vegas, Chicago O’Hare, and Preclearance locations in Aruba, Abu Dhabi, and Ireland (Shannon and Dublin). And a few weeks ago, CBP said it is beginning to test the incorporation of facial recognition into its Global Entry trusted traveler program.
Meanwhile, Marriott International announced last week that it is starting pilot tests in China of facial recognition for guests checking into the Hangzhou Marriott Hotel Qianjiang and the Sanya Marriott Hotel Dadonghai Bay, “with the goal of global rollout across Marriott International’s properties in the future.” The company said the technology, developed with its Chinese joint venture partner Alibaba, should reduce typical check-in time from three minutes or more to less than a minute.
Arriving guests “simply need to scan their IDs, take a photo and input contact details on a self-help machine. The intelligent device will then dispense room key cards after identities and booking information are verified,” Marriott said.
Microsoft’s Smith said the potential for abuse of facial recognition is even greater when individuals don’t realize their personal data is being collected, stored and shared as they move through public spaces.
Article by the San Francisco Chronicle