The Hon. T.A. FRANKS (17:24): I move:
That this council—
1. Acknowledges the growing public debate around the use of facial recognition technology.
2. Recognises that the use of facial recognition technology and its prevalence has outpaced consumer and privacy protection legislation.
3. Recognises that the public have a right to privacy and that where that privacy is limited by necessity, the public have a right to know how and why.
4. Commits to establishing a regulatory framework that:
(a) ensures facial recognition technology can only be used for specific, clearly defined reasons;
(b) requires authorised users of facial recognition technology to disclose their use of facial recognition technology and their purpose for using it;
(c) requires that authorised use of facial recognition technology be reasonable and proportionate;
(d) requires users of facial recognition technology to manage, store and delete data in a timely and safe manner;
(e) requires users to notify impacted people in the event of a data breach; and
(f) enables people to seek redress if adversely affected by the use of facial recognition technology.
I rise today to speak on this motion that recognises the need for our government to step up when it comes to consumer and privacy protections around the use of facial recognition technology. We are, of course, now very familiar with facial recognition technology. In fact, it is likely that many of us in this chamber use it on a daily basis just to unlock our phones, but we are less likely to be aware of the insidious and increasingly widespread use of facial recognition technology in monitoring our day-to-day lives.
This was most recently demonstrated by the public outcry when a CHOICE investigation revealed that large retailers such as Kmart, Bunnings and The Good Guys were using facial recognition technology in their stores to monitor unsuspecting customers. Thousands of people have spoken out against such invasive use of this technology, and their outcry, I am pleased to report, has led to those three retailers pausing that use of the technology while the privacy watchdog investigates.
There can undoubtedly be great benefits to using facial recognition technology in some situations, but are we prepared to be watched without knowing when, where, by whom or why, just when we go to the shops or walk down the street? Do we not deserve to know how that information is used, where it is stored, whether it is secure and whether it is being sold on? South Australians deserve privacy, and we as a community need to ask what level of mass surveillance we are prepared to accept and, of course, who will watch the watchers.
As former Australian human rights commissioner and UTS industry professor Edward Santow has stated:
Australian privacy law is a bit like Swiss cheese. So there are so many gaps in that law and it doesn't effectively protect people from harmful uses of facial recognition. The law was never crafted with widespread facial recognition use in mind and we need a specific law to address it.
No laws currently directly regulate the use of facial recognition technology in our nation, despite all Australian policing agencies reportedly using or trialling these technologies. Further, all states and territories signed an agreement with the commonwealth in late 2017 to cooperate on identity-matching services, which include sending driver's licence photos to a database to be used for facial recognition matches. Queensland, Victoria, South Australia and Tasmania have already fed driver's licence photos into that database, while the ACT government has said it will wait until legislation passes before doing so.
In 2019, the federal government put forward the Identity-matching Services Bill; however, it was rejected by the Parliamentary Joint Committee on Intelligence and Security for its lack of privacy protections and oversight. In a paper on the Australian Identity-matching Services Bill for the AI Now Institute, Jake Goldenfein and Monique Mann raise the following concern about the push by Australian jurisdictions for more biometric surveillance:
Although governments have always had the function of identifying their citizens, they have not always linked those identities to intelligence dossiers or made them available to law enforcement agencies. Indeed, the intermingling of civil and criminal identity systems have been the concern of human rights jurisprudence for some time. Biometrics are of particular concern to the linkage of criminal and civil systems, and surveillance more generally, because they act as a conduit between an individual's physical presence and digital databases, thus amplifying surveillance capacities. By advancing a centralised identity matching system, Australia is pushing beyond the limits of legitimate state function.
The Australian Human Rights Commission states in their report titled Human Rights and Technology Final Report that where biometric technologies are used in high-stakes decision-making, such as policing, errors can increase the risk of human rights infringement and have an impact on individual privacy. The AHRC then go on to make the following recommendations:
Recommendation 19: Australia's federal, state and territory governments should introduce legislation that regulates the use of facial recognition and other biometric technology. The legislation should:
(a) expressly protect human rights
(b) apply to the use of this technology in decision making that has a legal, or similarly significant, effect for individuals, or where there is a high risk to human rights, such as in policing and law enforcement
(c) be developed through in-depth consultation with the community, industry and expert bodies such as the Australian Human Rights Commission and the Office of the Australian Information Commissioner.
Recommendation 20: Until the legislation recommended in Recommendation 19 comes into effect, Australia's federal, state and territory governments should introduce a moratorium on the use of facial recognition and other biometric technology in decision making that has a legal, or similarly significant, effect for individuals, or where there is a high risk to human rights, such as in policing and law enforcement.
Not only is there no federal legislation regulating facial recognition technology, there is also no legislation in South Australia creating a general right of privacy, although, as I have mentioned before in this place, there is a cabinet administrative instruction, the Information Privacy Principles instruction, which has now been reissued a number of times, including in May 2020.
The instruction is not law but represents policy developed at the highest level of state government, and it is binding on the public sector. The instruction is similar to the commonwealth's Privacy Act 1988 in that it protects against the misuse of personal information. Unlike the commonwealth act, however, the instruction cannot be enforced in a court of law.
This lack of legislative protection around the use of facial recognition data considerably heightens the risk of data being misused and of people being unfairly targeted or monitored without a legitimate reason. An absence of strict legal boundaries also means that the application of the technology could be significantly expanded.
I will note that this is not just a conversation we should be having at a state level. There is great scope and urgency for reforms of our federal laws as well. CHOICE is currently petitioning the federal government for new legislation that would close these gaps in Australia's privacy laws to address the risks of facial recognition technology.
That proposed legislation is a model law that has been developed by the Human Technology Institute at UTS. These changes would classify the use of facial recognition technology in retail stores as high risk and prohibit its use unless a specific exemption was granted by the regulator. The work of the Human Technology Institute in this space really highlights the importance of considering people's privacy and human rights when implementing and regulating facial recognition technology, and I do urge our state government to consider that institute's work.
Three in four Australians agree that regulation is needed to protect consumers from the potential harms of opaque and unaccountable use of facial recognition technology. We cannot let big business decide our privacy rights, and unless governments step in, that is exactly what is happening and will continue to happen.
The public debate on the use of facial recognition technology has been ongoing, and legislators so far have failed to engage meaningfully in the conversation. We need to recognise that the prevalence of facial recognition technology and its capabilities have far outpaced consumer and privacy protection legislation. We also need to remember that the public have a right to privacy and, where that privacy is limited by necessity, we have a right to know how and why.
It is essential that this parliament commits to establishing a regulatory framework clearly identifying and defining when and how facial recognition technology can be used. The use of facial recognition technology, especially at scale, must be reasonable and proportionate, and people must have access to redress if they are adversely affected by the use of the technology, particularly in the case of data breaches. Given the recent data breaches that have been mentioned many times in this place over the last week, I hope that we will all turn our minds to this important issue, and I commend the motion.