
Facial recognition technology: a model law

Amidst growing concerns about privacy and mass surveillance, a new report published by the University of Technology Sydney (UTS) Human Technology Institute outlines a model law for facial recognition technology. 

The report, Facial Recognition Technology: towards a model law, recommends that facial recognition technology be formally assessed for potential harms before deployment and, in certain circumstances, prohibited. It also recommends a public register of deployed facial recognition technology.

A risk-based approach

Facial recognition technology (FRT) includes any computer system or device with embedded functionality that uses biometric data to verify someone’s identity, to identify an individual or to analyse characteristics about a person. Personal information involved in the use of this technology is regulated under the Privacy Act 1988 (Cth) (Privacy Act), where biometric templates and biometric information are considered ‘sensitive information’ and subject to additional protections.

However, the report suggests that the Privacy Act does not adequately address the unique challenges posed by FRT development in relation to human rights, including freedom of association, freedom of expression and freedom of assembly.

The model law proposes a novel risk-based framework that would require developers and organisations seeking to use FRT (called ‘deployers’) to complete an assessment of the potential harms, including the potential human rights risk. This ‘Facial Recognition Impact Assessment’ would be registered, publicly available and could be challenged by the regulator or interested parties.

A Facial Recognition Impact Assessment would consider various factors in the proposed use of FRT, including spatial context, functionality, performance and whether individuals can provide free and informed consent. An FRT developer or deployer would also consider whether the technology produces outputs that lead to a decision with legal or similarly significant effect, including whether that decision is wholly or partially automated. This factor draws heavily on concepts found in the European Union's General Data Protection Regulation (GDPR).

The report includes guidance about the potential level of risk associated with these factors – for example, the use of facial recognition in public spaces is considered a higher risk than in private spaces. The use of FRT in a workplace would also raise the risk rating as individuals have a reduced ability to enter, move and act freely in that environment.  

Risk-based legal requirements

Based on this risk assessment, the model law sets out a cumulative set of legal requirements, restrictions and prohibitions that apply according to the relevant risk rating.

There are three proposed categories of risk: base-level, elevated and high. As expected, stricter legal constraints are imposed as the level of risk increases. In addition, FRT applications used in Australia must continue to comply with the Privacy Act. Failure to comply with these legal requirements could attract civil penalties, and injunctions could be granted against unauthorised use of FRT.

  1. Base-level risk. The report considers that all FRT applications carry at least a base-level risk to human rights. An FRT application will likely be assessed at ‘base-level risk’ if there are only moderate human rights vulnerabilities. Among other requirements, developers and deployers of FRT applications assessed at base-level or elevated risk would be required to comply with an FRT technical standard, aligned with international standards and best practice.

  2. Elevated risk. Additional legal requirements apply to applications assessed at elevated risk. A risk rating is ‘elevated’ if there are one or more significant human rights vulnerabilities, including where FRT is used to make decisions that have a legal or similarly significant effect. For example, the use of FRT to identify individuals prohibited from entering a venue would be likely to have a legal or similarly significant effect on those individuals. In these circumstances, a developer or deployer must provide human review where the technology materially contributes to a decision. Deployers would also have a duty of care to use FRT lawfully on affected individuals. A breach of this duty would be actionable by affected individuals and could result in a complaint to the regulator.

  3. High risk. The model law prohibits the use of facial recognition that is assessed as high risk unless it is for law enforcement or national security, for academic research, or where the regulator provides authorisation. An FRT application would be considered ‘high risk’ if there are extreme human rights vulnerabilities. For example, the use of FRT in an open and publicly accessible space to track and monitor individuals would be assessed as high risk.

The report also contemplates specific legal rules, including a ‘face warrant scheme’ for police and other security agencies. Under the proposed regime, a judge or independent authority would consider applications by law enforcement for repeated or routine use of FRT involving members of the public who are not suspected of having committed a crime.

Next steps

These potential obligations would have wide-reaching implications for organisations developing FRT and any entity deploying this technology, including government and private sector organisations. The report calls for the Attorney-General to commit to national facial recognition reform based on the model law and proposes to assign regulatory responsibility to the Office of the Australian Information Commissioner. 

We expect that the Australian Government will engage in industry and public consultation before introducing a bill based on the model law. It is likely that some of these concerns will also be addressed by the ongoing Privacy Act review.


Authors

Claire Allen

Law Graduate


Tags

Technology, Media and Telecommunications | Intellectual Property | Responsible Business and ESG

This publication is introductory in nature. Its content is current at the date of publication. It does not constitute legal advice and should not be relied upon as such. You should always obtain legal advice based on your specific circumstances before taking any action relating to matters covered by this publication. Some information may have been obtained from external sources, and we cannot guarantee the accuracy or currency of any such information.