Biometric data is information about body measurements and calculations related to human characteristics, collected through biometric technologies like facial recognition or fingerprint scanners. Much of this collection is done for identification and, increasingly, surveillance. As biometrics become more popular, countries like the USA and UK have even started using biometric tools for mass surveillance of their citizens. This creates very real privacy and data protection concerns for you as the data subject. The misuse of your facial features and fingerprints sounds a lot more ominous than the simple misuse of your cellphone number.

Considering how sensitive biometric data is, laws like the General Data Protection Regulation (GDPR) and the Protection of Personal Information Act (POPIA) provide guidance on handling data subjects’ biometric data safely and legally. This post highlights the frameworks established around the use of biometric data.

Biometrics under the GDPR

The GDPR classifies biometric data (when processed to uniquely identify a person) as a special category of personal data. This means that, by default, you may not process it. Even so, the GDPR allows you to process special categories of personal data if your processing falls within one of its listed exceptions. Examples include processing with the explicit consent of the data subject, or processing that is necessary for reasons of substantial public interest.
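To make the default rule concrete, here is a minimal sketch in Python of how an application might refuse to touch biometric data unless a documented exception applies. The names (LawfulBasis, BiometricRecord, process_biometric_data) are purely illustrative assumptions, and the snippet is not legal or compliance advice.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class LawfulBasis(Enum):
    """Illustrative subset of the GDPR Article 9(2) exceptions."""
    EXPLICIT_CONSENT = auto()
    SUBSTANTIAL_PUBLIC_INTEREST = auto()


@dataclass
class BiometricRecord:
    subject_id: str
    template: bytes                       # e.g. an encrypted fingerprint template
    lawful_basis: Optional[LawfulBasis]   # documented exception, or None if absent


def process_biometric_data(record: BiometricRecord) -> None:
    """Refuse to process biometric data unless a documented exception applies."""
    # Default position under the GDPR: processing special category data is
    # prohibited unless one of the listed exceptions (e.g. explicit consent) applies.
    if record.lawful_basis is None:
        raise PermissionError("No lawful basis recorded: biometric data may not be processed.")
    print(f"Processing subject {record.subject_id} under {record.lawful_basis.name}")


# Example: processing proceeds only because explicit consent has been documented.
record = BiometricRecord("subject-001", b"\x00\x01", LawfulBasis.EXPLICIT_CONSENT)
process_biometric_data(record)
```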

Examples of GDPR cases involving biometric data

The Swedish Data Protection Authority (DPA) fined a school for taking attendance using facial recognition technology, because the processing of biometric data did not fall within one of the exceptions allowed under the GDPR. The school had obtained parental consent to use facial recognition, but the DPA found the consent defective and ‘forced’ because of the imbalance of power between the school and the parents. The GDPR also requires that, where possible, you collect data using less intrusive means.

The Dutch DPA issued a 750 000 euro fine for the unlawful processing of employees’ biometric data. The company used the data for employee attendance and time registration. The processing was disproportionate and did not qualify under any of the exceptions in the GDPR.

In 2022, the French DPA fined Clearview AI 20 million euros and ordered it to stop collecting and using data on individuals in France without a legal basis. Additionally, Clearview AI was ordered to delete the data it had already collected. The company had scraped over 20 billion photographs online, including images from social media, to build ‘biometric templates’ of the people in them; in other words, it collected sensitive information about people’s physical characteristics. The vast majority of people whose images were fed into the search engine were unaware of this. The French DPA found that Clearview AI breached several articles of the GDPR and gave it two months to comply, imposing a penalty of 100 000 euros per day for any delay beyond that period.

Biometrics in the USA (Illinois)

The USA does not have a federal law dealing with the use of biometric data, but certain states, like Illinois, do: Illinois has the Biometric Information Privacy Act (BIPA). BIPA imposes obligations on organisations that collect and use biometric information, requiring them to get the written consent of the data subject before they process biometric data. Still, the penalties for violations are relatively low in comparison to the GDPR: approximately 1 000 dollars per violation, and 5 000 dollars for an intentional or reckless violation. A judgment handed down on 2 February 2023 made clear that BIPA itself does not specify a statute of limitations, which provides significant guidance for future BIPA cases.

Other states have biometric privacy laws similar to Illinois’. Washington, for example, passed a state data privacy bill that has changed the way companies use biometric data for facial recognition technology.

Biometrics under POPIA

POPIA first introduced the term ‘biometrics’ into South African law, defining it as a “technique of personal identification based on physical, physiological or behavioural characterisation…”. POPIA protects biometric information as special personal information, which means there is a general prohibition on processing biometric information in South Africa. However, POPIA provides a general authorisation for the processing of special personal information, as well as a specific authorisation for processing information about criminal behaviour or biometric information. Under the specific authorisation, only bodies charged by law with enforcing the law, and responsible parties (POPIA’s equivalent of controllers) acting in accordance with the law or labour legislation, may process biometrics.

Examples of biometrics in South Africa:

Many organisations use fingerprints as their primary form of biometric identification, and a growing number are also embracing voice and facial recognition.

Is blanket surveillance even legal?

In 2021, the Constitutional Court found that the Regulation of Interception of Communications and Provision of Communication-Related Information Act (RICA) is unconstitutional because it did not provide adequate safeguards to protect the right to privacy. A journalist had challenged the constitutionality of state surveillance under RICA. The Constitutional Court acknowledged the importance of the right to privacy, which is tied to dignity. It also accepted that state surveillance is important to investigate and combat serious crime, guarantee national security, and maintain public order. The question the court asked was whether RICA did enough to reduce the risk of unnecessary intrusions into people’s rights. The court found that limiting the right to privacy in those circumstances was unjustifiable and unreasonable.

Banks

Banks are starting to adopt biometric technology to grow their customer base and improve the security of their apps. For example, TymeBank uses your fingerprints to verify your identity against the Home Affairs National Identification System (HANIS) when you sign up as a customer.

FNB ran a promotion where customers could open a bank account through its banking app using a selfie; the app works in conjunction with HANIS to verify your identity. Standard Bank also recently launched a product called digi.me to protect your mobile banking app: it requires you to take a selfie and capture your fingerprints to authenticate your identity.
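The common pattern behind these products is template matching: capture a biometric sample, convert it into a numeric template, and compare it against a stored reference. The sketch below illustrates that general pattern only; the feature vectors, threshold and function names are assumptions, and it does not reflect how HANIS, TymeBank, FNB or Standard Bank actually implement verification.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two biometric feature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(candidate: np.ndarray, enrolled: np.ndarray, threshold: float = 0.9) -> bool:
    """1:1 verification: does the fresh sample match the enrolled template?"""
    return cosine_similarity(candidate, enrolled) >= threshold


# Toy example: in practice these templates would come from a fingerprint or face encoder.
enrolled_template = np.array([0.12, 0.88, 0.35, 0.51])
fresh_sample = np.array([0.11, 0.90, 0.33, 0.52])

print("Identity verified:", verify(fresh_sample, enrolled_template))
```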

Data Protection

The Information Regulator (our DPA) has not yet made any recommendations or rulings on biometrics, so South Africans will need to look to the European Union for guidance on this issue.

Biometrics and the AI Act

In June 2023 the European Parliament discussed rules for biometric artificial intelligence systems. These discussions have been going on since the European Commission first proposed the Artificial Intelligence Act. The European Parliament, the European Commission and the Council of the European Union have different opinions about how to regulate these systems.

The AI Act focuses on AI systems that use biometric data, or data related to our unique physical characteristics, and sets out more detailed rules for these systems. The General Data Protection Regulation (GDPR), on the other hand, has only a single definition of biometric data.

The European Commission’s proposal for the AI Act has six definitions for biometrics, and the European Parliament added three more. The Council of the EU also defines “general-purpose AI,” which covers systems that can recognise things like faces and voices.

The three institutions in the trilogue negotiations don’t treat all biometric systems the same way. For example, the European Commission treats biometric categorisation as a “high-risk” AI system, while the European Parliament believes it is too risky and wants to ban it, except for some medical uses. The Council of the EU only wants these systems to be more transparent.

The European Parliament is taking the strictest approach: it wants to ban more biometric AI systems and classify others as high-risk. It also regards biometric verification systems, which simply confirm that you are who you claim to be, as safer than identification systems.

One hot topic is real-time remote biometric identification, which all three institutions are discussing. They have different ideas about when this technology can be used in public spaces. The European Commission wants to ban most law enforcement use, except in specific cases. The Council of the EU is a bit more lenient towards law enforcement. The European Parliament wants to ban it completely in public places, by both public and private actors.

As these debates continue, the European Union is trying to figure out how to regulate biometric AI systems while dealing with complex and evolving definitions and rules.

Actions you can take: