A school in Poland has been fined €4,600 ($5,200) for breaching Europe’s General Data Protection Regulation (GDPR) after it was found to be processing students’ fingerprint data to verify whether they had paid for school lunch. The news comes as biometric data harnessing programs around the world spark significant privacy concerns.
The unidentified school in Gdansk, a city in northern Poland, processed the fingerprints of hundreds of children “without a legal basis,” according to a statement by Jan Nowak, president of Poland’s Personal Data Protection Office (UODO). Nowak added that there were adequate alternative options for managing school meals. According to the UODO, the primary school had been using a biometric reader at the cafeteria entrance since 2015 to verify whether pupils had paid for their meals. In the current academic year, the system was used on 680 children, with four using “an alternative identification system.”
Students not using biometric ID were forced to the end of the line.
“In the opinion of the president of the UODO, such rules introduce unequal treatment of students and their unjustified differentiation, as they clearly favour students with biometric identification,” the statement reads. “Moreover, in the authority’s view, the use of biometric data, considering the purpose for which they are processed, is significantly disproportionate.”
While parental consent was obtained for the biometric ID program, the UODO found that the system was “not essential for achieving the goal of identifying a child’s entitlement to receive lunch.”
The GDPR factor
The final decision cited numerous facets of GDPR, including recital 38, which refers to specific provisions made for data protection of children. “It should be emphasized that children require special protection of personal data, as they may be less aware of the risks, consequences, safeguards, and rights they have in connection with the processing of personal data,” the report found.
Biometric data is defined under GDPR as “personal data resulting from specific technical processing relating to the physical, physiological, or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person.” This includes fingerprints, iris scans, hand geometry, voice recognition, and facial scans. Indeed, the latest GDPR privacy fallout comes shortly after a Swedish school was fined €20,000 ($23,000) under GDPR for conducting a facial recognition pilot program that tracked students’ attendance.
Last year, the U.K.’s Information Commissioner’s Office (ICO) issued an enforcement notice against Her Majesty’s Revenue and Customs (HMRC), after a complaint was made over a system it had implemented that used callers’ voices to verify their identity. In the case of HMRC, no fine was imposed, but it was instructed to delete all biometric data it had collected through the voice authentication system without explicit consent.
This highlights the fact that GDPR isn’t only about imposing gargantuan fines, as it has in other high-profile cases. Last year, British Airways (BA) was hit with a record $230 million fine by the U.K.’s ICO over a 2018 security breach that compromised the personal data of 500,000 customers, while Google received a $57 million fine from the French data privacy body for a “lack of transparency, inadequate information, and lack of valid consent” regarding its ad personalization technology.
While the fine imposed on the Polish primary school at the center of this latest violation is relatively modest, the school has also been ordered to erase all personal data it had gathered through its program and cease collecting all such data.
As data privacy regulations take effect around the world, including the recently implemented California Consumer Privacy Act (CCPA), we will likely see more debate over how biometric data programs should be implemented — or whether they should be used at all.
Under GDPR, biometric data is regarded as a “special category,” separate from other personal data — such as email addresses and phone numbers — that may be gathered through digital platforms. Unlike email addresses or credit card credentials, biometric markers cannot be easily changed, which is why they are given special status under GDPR.
“The biometric system identifies characteristics which are not subject to change, as in the case of dactyloscopic [fingerprint] data,” the UODO noted in its statement. “Due to the unique and permanent character of biometric data, which means that they cannot change over time, the biometric data should be used with due care. Biometric data [is] unique in the light of fundamental rights and freedoms and therefore require[s] special protection. [Its] possible leakage may result in a high risk to the rights and freedoms of natural persons.”