Risk of infection during the pandemic motivated brick-and-mortar businesses to explore alternatives to traditional point-of-sale (PoS) terminals, like QR codes pointing to online checkouts. Most rely on familiar, existing platforms and technologies, but a new entrant called PopID is pushing a potentially problematic solution that leverages facial recognition.
Earlier this month, The Los Angeles Times profiled the startup, which was founded by the chairman of CaliGroup, CaliBurger’s parent company. PopID recently launched a face-based payments network with 25 restaurants and retailers in Pasadena; it also offers a product called PopEntry that verifies people’s identities at places of business and universities. CEO John Miller claims that over 1,000 PopEntry units have been sold and that several thousand more are planned for installation before the end of 2020 in Colorado, Texas, Arizona, and Indiana.
To enroll in PopID, customers visit PopID.com and snap a picture. Next time they’re at a business or school with PopID, they can use their face for verification by standing in front of a camera. “PopID … aims to be the universal gateway for verifying an individual’s identity based on face for applications such as loyalty, payment, and entry,” reads an excerpt on PopID’s website. “Your face now becomes your singular, ultra-secure ‘digital token’ across all PopID transactions and devices … Your face replaces keys, fobs, key cards, etc. to allow you easy entry to secure areas.”
Miller often states that PopID complies with Illinois’ Biometric Information Privacy Act, which obligates businesses to protect and store biometric data at least as carefully as other sensitive information. But questions about PopID’s privacy, security, and algorithmic practices abound, and the company is an example of a larger problem.
Bias
It’s well established that facial recognition systems are susceptible to bias.
PopID uses “core algorithms” from Japanese electronics vendor NEC combined with “technologies that PopID has built internally,” Miller told VentureBeat via email. That’s somewhat concerning because while NEC professes its algorithms to be bias-free, it has declined to share proof, even when pressed in U.K. court proceedings. There’s also anecdotal evidence contradicting NEC’s claims: as recently as two years ago, an NEC algorithm deployed by U.K. police forces produced matches that were false positives 98% of the time.
“In general, there seems to be a lot of secrecy and confusion over how this NEC algorithm works and is trained, which is never good,” Mike Cook, an AI researcher at Queen Mary University of London, told VentureBeat via email. “Ultimately, we know very little or nothing about NEC, and obscurity is not a solution for something like payments.”
Miller offered as a counterpoint facial recognition benchmarks published in December 2019 by the U.S. National Institute of Standards and Technology (NIST). NIST determined that an NEC algorithm provided to the U.S. Department of Homeland Security (DHS) showed no statistically detectable race or gender bias. In other words, NIST couldn’t find evidence that the algorithm the DHS is using contains racial bias, though it made clear that it didn’t account for variables like camera quality and position. (PopID sources a range of hardware from third-party vendors including Panasonic, and configurations vary from merchant to merchant.)
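To make “statistically detectable” concrete, here is a minimal sketch, not NIST’s actual methodology, that compares false match rates measured for two demographic groups and asks whether the gap exceeds what sampling noise alone would explain. The trial and error counts are invented for illustration.

```python
# A minimal sketch of a demographic-differential check, not NIST's methodology.
# The error and trial counts below are made-up illustrative numbers.
from math import sqrt

def false_match_rate(false_matches: int, impostor_trials: int) -> float:
    """Fraction of impostor comparisons the matcher wrongly accepted."""
    return false_matches / impostor_trials

def two_proportion_z(err_a: int, n_a: int, err_b: int, n_b: int) -> float:
    """z-statistic for the difference between two groups' error rates."""
    p_a, p_b = err_a / n_a, err_b / n_b
    pooled = (err_a + err_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: 100,000 impostor trials per group.
fmr_a = false_match_rate(30, 100_000)
fmr_b = false_match_rate(55, 100_000)
z = two_proportion_z(30, 100_000, 55, 100_000)
print(f"group A FMR: {fmr_a:.5f}, group B FMR: {fmr_b:.5f}, z = {z:.2f}")
# |z| above roughly 1.96 indicates a gap detectable at the p < 0.05 level.
```

NIST’s actual reports slice results far more finely, across many algorithms, image sources, and demographic intersections, which is why “no detectable bias” for one submitted algorithm under one test condition is a narrow claim.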
Miller also asserted that PopID’s tools use “image processing techniques” to bolster accuracy, and that the company is in the process of building a tool that will allow users to update their images based on “fundamental changes” in features beyond things like hair growth (which the algorithm already accounts for). “The market will show that the best biometrics algorithms work extremely well across all different types of people and lighting conditions,” Miller said.
The problem is, as University of Washington AI researcher Os Keyes told VentureBeat in a previous interview, that people can present in different ways and have many different life histories, trajectories, and desired forms of treatment. “When you have a technology that is built on the idea that how people look determines, rigidly, how you should classify and treat them,” Keyes said, “there’s absolutely no space for [things like] queerness or non-binary.”
Miller disagrees. “Some algorithms are biased. However, our algorithms are not biased,” he said.
This statement is at odds with the consensus of experts, including Keyes, who say imbalances in data and systems make eliminating algorithmic bias nearly impossible. In 2015, a software engineer pointed out that the image recognition algorithms in Google Photos were labeling his Black friends as “gorillas.” A University of Washington study found women were significantly underrepresented in Google Image searches for professions like “CEO.” More recently, the nonprofit AlgorithmWatch showed that Google’s Cloud Vision API automatically labeled a thermometer held by a dark-skinned person as a “gun” even though it labeled a thermometer held by a light-skinned person as an “electronic device.”
Data ownership
Recognition accuracy aside, there’s the question of data ownership. Miller says the PopID platform is opt-in in the sense that users must choose to register and that the system performs biometric matching only in particular situations and places (i.e., when a customer is making a purchase and is standing in front of the camera-equipped terminal). Users aren’t forced to pay with PopID; if they opt to use a credit card or phone with NFC, the system won’t capture their photo.
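As a rough illustration of that opt-in flow, the sketch below consults the camera only when the customer explicitly selects PopID at the terminal. Every name here is hypothetical; none of it comes from PopID’s software.

```python
# Hypothetical sketch of the opt-in checkout flow described above;
# none of these names come from PopID's actual system.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Checkout:
    amount_cents: int
    payment_method: str  # "popid", "card", or "nfc"

def match_enrolled_face(camera_frame: bytes) -> Optional[str]:
    """Stand-in for a face matcher; returns an enrolled user ID or None."""
    return None  # placeholder: a real matcher compares against enrolled templates

def process_checkout(checkout: Checkout, capture_frame=None) -> str:
    if checkout.payment_method != "popid":
        # Customer chose a card or NFC: the camera is never triggered
        # and no photo is captured or matched.
        return "charged via card/NFC"
    # Only now, with the customer at the terminal and having chosen PopID,
    # is a single frame captured and matched against enrolled users.
    frame = capture_frame() if capture_frame else b""
    user_id = match_enrolled_face(frame)
    if user_id is None:
        return "no match: ask for another payment method"
    return f"charged PopID account {user_id}"

print(process_checkout(Checkout(amount_cents=850, payment_method="card")))
```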
PopID stores face images in a database that the company claims is encrypted both at rest and in transit. When asked whether NEC had access to this database, Miller said that PopID hosts NEC’s algorithm in a “secure cloud” apart from NEC’s infrastructure and that users can request that their data be deleted or deactivated at any time. “We only share a person’s identity at a particular moment in a particular place when a user chooses to use PopID to have their identity shared,” he further explained. “We never share a picture — we only authenticate that the person is there at that particular moment.”
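“Encrypted at rest and in transit” means the stored face data sits in the database only in ciphered form and travels over the network only through encrypted channels (typically TLS). The sketch below illustrates the at-rest half generically with a symmetric key; it says nothing about PopID’s actual scheme or key management.

```python
# Generic illustration of encrypting a stored face template at rest;
# this is not PopID's implementation. Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, held in a key management service
cipher = Fernet(key)

face_template = b"\x01\x02\x03"    # placeholder for a face embedding
stored_record = cipher.encrypt(face_template)   # what lands in the database

# Decryption happens only server-side, at match time, with access to the key.
assert cipher.decrypt(stored_record) == face_template
```

The practical questions raised in the rest of this section, who holds the keys, who can be compelled to decrypt, and how deletion requests propagate, are not answered by encryption alone.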
According to PopID’s user agreement and privacy policy, personal data isn’t shared until the individual decides to use PopID at a location to authenticate. But from that point forward, it’s uncharted waters — PopID reserves the right to share personal data with “businesses that the consumer elects to use PopID to authenticate identity.” And if users don’t explicitly request deletion or limited disclosure of their information (by emailing info@popid.com or using a web portal), PopID stores the data for three years from the last date they used any of its services.
Cook finds this problematic. “It seems to say that they instead send your information to the business, and the phrasing of [the privacy policy] also kind of implies that once you’ve done this, they can share your information with that business whenever they want,” he said. “For example, if they signed a contract to provide facial recognition for paying for driver’s license renewals at the DMV, would that as a result allow them to ‘share personal data,’ i.e. your facial scans, with the entire U.S. government, in perpetuity, because you opted-in by paying once at a single place?”
Security and privacy
While PopID says it takes pains to secure its database, the company in some cases does little to prevent point-of-sale fraud. In restaurants manned by cashiers, Miller says it’s incumbent on employees to flag customers holding up pictures of someone else to the camera. PopID applies an algorithm-based anti-spoofing solution only to “unmanned” payment terminals and PopEntry gateways.
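That division of labor can be pictured as a liveness check that runs only on some terminals. The sketch below is hypothetical and generic, not PopID’s code, but it shows why attended lanes end up relying entirely on the cashier’s attention.

```python
# Hypothetical illustration of liveness checking gated by terminal type;
# not PopID's code. A real presentation-attack detector would analyze depth,
# texture, or motion cues rather than return a constant.
from enum import Enum

class Terminal(Enum):
    ATTENDED = "attended"      # cashier present
    UNATTENDED = "unattended"  # kiosk or entry gateway

def passes_liveness_model(frame: bytes) -> bool:
    """Stand-in for an algorithmic anti-spoofing check."""
    return True  # placeholder

def accept_face_payment(frame: bytes, terminal: Terminal) -> bool:
    if terminal is Terminal.UNATTENDED:
        # Unattended lanes get the algorithmic anti-spoofing check.
        return passes_liveness_model(frame)
    # Attended lanes skip it: catching a printed photo is left to the cashier.
    return True

print(accept_face_payment(b"", Terminal.ATTENDED))    # accepted without a check
print(accept_face_payment(b"", Terminal.UNATTENDED))  # runs the detector
```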
And of course, just because the database is encrypted doesn’t mean it’s immune from compromise. Law enforcement could subpoena PopID to obtain a customer’s information without the person’s knowledge, or a malicious actor could hack into the database and steal sensitive information. “Ownership and use of the data is a big issue,” Cook said.
That’s not to suggest PopID has nefarious intentions; far from it. But even the best-engineered facial recognition systems are inherently fraught, which is perhaps why Amazon, IBM, and Microsoft have all paused or ended the sale of facial recognition products over the past several months. With a PoS system like PopID, facial recognition that performs poorly on certain demographics may return false negatives at checkout, an embarrassing outcome for someone simply trying to make a purchase. And it can make surveillance a part of everyday life, laying the groundwork for expansion to other uses.
Alibaba’s face-based payment system in China is a case in point. While purportedly operating on an opt-in basis, reports show that anyone who verifies their account using a photo can subsequently be recognized by their face during the payment process, even if they opted out of facial recognition payments. Privacy concerns aside, over 60% of 40,000 respondents to a 2019 poll by Sina Technology said that face-based payment systems made them feel “ugly.” (Alibaba later rolled out filters to its terminals that show “beautified” versions of users’ faces at payment time.)
If PopID maintains its current course, it would do well to provide greater transparency regarding how it collects, stores, and processes facial data. A failure to do so would not only engender mistrust among its users, but pose a danger to those users’ privacy and security.