Submission on Biometrics Code of Practice
Dated: 15 May 2024
To: Office of the Privacy Commissioner
From: The Asian Legal Network
Email: contact@asianlegalnetwork.org.nz
Introduction
The Asian Legal Network (“ALN”) is a research and advocacy group. Our membership is composed of Asian peoples with a background in law who, as Tangata Tiriti, are committed to navigating Asian identities and furthering justice for Asian communities in Aotearoa.
The ALN would like to thank the Office of the Privacy Commissioner (“OPC”) for the opportunity to comment on this draft code of practice (“Draft Code”). We appreciate the work undertaken by the OPC on this code. We acknowledge that while biometric technology has the potential to benefit many businesses, agencies, and institutions in Aotearoa, it also presents significant privacy risks to individuals and vulnerable groups.
The ALN supports the overall direction taken by the Draft Code, and its proposed measures to safeguard the privacy rights of individuals and vulnerable demographics. We voice particular support for the proportionality assessment, and for the restricted biometric categories, which limit the use of biometric information to categorise individuals by prohibited grounds of discrimination, including race and ethnicity.
However, the Draft Code could do more to address privacy risks that disproportionately affect vulnerable minority communities - particularly the risk that the technology could facilitate or perpetuate discrimination. Moreover, we believe the Draft Code does not ensure adequate protections under a Tiriti framework.
To summarise our submissions:
The Draft Code should acknowledge He Whakaputanga or Te Tiriti o Waitangi. It does not articulate the protection of Māori data under a Tiriti framework, nor does it clearly state the obligations on tauiwi (non-Māori) organisations that may engage with Māori biometric information. This risks undermining Māori data sovereignty.
Rules 1(2)(e) and 1(2)(f) should be rewritten to clearly signal the privacy risk of discrimination against Māori and other minority demographic groups in New Zealand.
Agencies should be required to consider, in the proportionality assessment, whether their use of biometrics risks infringing on protected rights and, if so, to implement certain mandatory privacy safeguards.
Organisations should be prohibited from sending biometric data overseas.
Background
The ALN is concerned with how biometric technologies could disproportionately harm or discriminate against Asian peoples, tangata whenua and other minority demographics in Aotearoa.
Experts around the world have voiced concerns about the purported neutrality of more advanced biometric systems, such as those using facial recognition technologies for security and crime prevention. Studies collated by Alex Najibi, a researcher at Harvard University’s School of Engineering and Applied Sciences, show that biometric facial recognition systems are not “colour-blind”. Facial recognition technologies are consistently less accurate on subjects who are not lighter-skinned males, with error rates up to 34% higher when identifying darker-skinned females than lighter-skinned males. [1]
This is the result of an inherent weakness in biometric identification technology - that systems are only as good as the data they are built on. As noted in the Gender Shades study published by Joy Buolamwini and Timnit Gebru at the Massachusetts Institute of Technology, these automated systems are not inherently neutral: “They reflect the priorities, preferences, and prejudices - the coded gaze - of those who have the power to mould them”. [2]
We are particularly concerned that this puts Māori, Asian peoples and Aotearoa’s diverse minority communities at risk of being disadvantaged or discriminated against by biometric systems. Māori AI and data ethicist Dr Karaitiana Taiuru notes that the systems currently being adopted in Aotearoa are built on datasets composed mostly of European-looking men, and will therefore be much less effective at identifying Māori, Pasifika, and other people of colour. [3]
This played out in practice within the consultation window itself. Last month, facial recognition technology misidentified a Māori woman from Rotorua as a shoplifter at her local supermarket, causing her significant harm and emotional distress. [4]
The Draft Code is an opportunity to prevent further such incidents of harm to individuals in Aotearoa, which is the focus of our subsequent submissions.
Initial Responses
We support the Draft Code’s acknowledgement that certain demographic groups are more vulnerable to privacy risks than others, and the measures being proposed to prevent harm from these risks.
We support the fair processing limit on using biometrics to place people in categories that are protected under the Human Rights Act 1993 (“HRA”). (Question 31) We concur with the OPC’s view that using biometrics to sort individuals into these categories - including colour, race and ethnicity - is likely to lead to adverse outcomes such as perpetuating bias and negative stereotypes.
We acknowledge that exceptions to the fair processing limits are justified, and that the proposed exceptions are appropriate. (Questions 32-35)
Te Tiriti o Waitangi
We are also a Tiriti-based organisation. While our members are tauiwi, we advocate for the honouring of Te Tiriti o Waitangi, especially by the kāwanatanga. This is the right thing to do. This is also important for tangata Āhia (peoples from Asia) who have made Aotearoa their home through the Crown and its (historic and contemporary) immigration processes. As the Crown exists by virtue of Te Tiriti o Waitangi, our belonging to Aotearoa is contingent on the kāwanatanga governing honourably (Article 1) and upholding its other responsibilities in accordance with Te Tiriti. This submission therefore discusses the responsibilities of the kāwanatanga under Te Tiriti o Waitangi. We are not making any submission on Te Ao Māori me ōna tikanga and its perspectives on biometrics.
Notwithstanding this, we defer to Māori organisations submitting on this issue, especially where their submissions differ from ours. We also wish to acknowledge the Māori staff and Māori organisations working with the OPC in developing this code to date, as well as the emphasis that the consultation process and code currently afford to Māori and to applying a Te Ao Māori lens. [5]
Despite the above, we believe this Draft Code falls short of upholding the Tiriti obligations incumbent upon the OPC, for two reasons:
We submit that the Draft Code must be framed around Te Tiriti as a starting point, which would strengthen understanding and compliance by agencies interacting with Māori biometric data, and serve a protective function.
We also submit that changing consent from a general requirement to a privacy safeguard fails to uphold Māori self-determination over their taonga, in potential breach of Article 2 and the United Nations Declaration on the Rights of Indigenous Peoples (“UNDRIP”).
First, the Draft Code makes no mention of He Whakaputanga or Te Tiriti and does not frame the obligations of handling Māori data as a matter of Te Tiriti. Rule 1(2)(e) simply requires the agency to “take into account […] the cultural impacts and effects of biometric processing on Māori”. This fails to state the constitutional importance and operation of Te Tiriti o Waitangi when any agency collects Māori biometric data. The Draft Code does not mention any obligation to consult, to seek explicit consent from, or to partner with the relevant Māori organisations.
The OPC has stated that it “has obligations under Te Tiriti o Waitangi to partner with Māori, whānau, hapū, and iwi to bring Te Ao Māori perspectives to privacy”. [6] We submit that these obligations extend to all of the kāwanatanga as well as tauiwi organisations, operating under the laws of the kāwanatanga, that may collect Māori data.
We acknowledge the logic in the OPC’s decision to “strengthen the protections around biometric information overall” as the best way to protect Māori biometric information. While we support the specific safeguards, we believe the best protection comes from meaningfully upholding Te Tiriti o Waitangi, which allocates a specific article to the self-determination of tangata whenua.
The current wording fails to impress upon tauiwi organisations that the protection of Māori biometric information is not merely a cultural nicety, but a constitutional requirement. As rule 1(2)(e) is worded, the cultural impacts and effects on Māori are stated simply as a category distinct from “any other New Zealand demographic group” in rule 1(2)(f). These categories are also the last two considerations that agencies must take into account. There is no context or meaningful articulation of why they are included - a justification found in Te Tiriti. Such an articulation would strengthen compliance and build understanding, which in turn serves a protective function.
Second, there are potential shortcomings in upholding Article 2 of Te Tiriti o Waitangi. The Draft Code does not afford Māori substantive decision-making, control and autonomy over their taonga, especially their biometric information. That decision-making resides largely with the agency.
We are particularly concerned that, despite Māori stakeholders stating “that organisations should be required to get consent from Māori before collecting their biometric information”, consent is no longer a general requirement. The OPC cites the practical difficulty of “distinguish[ing] between Māori and non-Māori biometric information”, and because ethnicity information is not collected, no specific provision is included. [7]
In our view, Article 2 operates as a starting point, deferring to Māori stakeholders - who have stated the need for a general consent requirement - on how best to protect their taonga. This embodies several of Te Mana Raraunga’s Principles of Māori Data Sovereignty. [8] Relevant principles include that “Māori have an inherent right to exercise control over Māori data”, and the ability to exercise kaitiakitanga, for “Māori shall decide which Māori data shall be controlled (tapu) or open (noa) access”. Notably, another principle is “Data disaggregation: The ability to disaggregate Māori data increases its relevance for Māori communities and iwi. Māori data shall be collected and coded using categories that prioritise Māori needs and aspirations”. [9]
Instead, the justification of practical difficulty is used to demote consent to a privacy safeguard, which is subject to a reasonableness test: specifically, that safeguards be “relevant and reasonably practicable in the circumstances to reduce privacy risk” or “reasonable in the circumstances”. [10] This shifts decision-making power away from tangata whenua and into the agency, which makes a first-instance determination of what is reasonable, with OPC oversight arising only on complaint. By failing to frame the Draft Code in a Tiriti context and to treat Article 2 as the starting point, the practical issue of disaggregation - itself the subject of a stated principle - is used to justify eroding the promises made in Te Tiriti o Waitangi.
The Draft Code may also breach UNDRIP. The failure to create a consent requirement, despite the practical difficulties, may permit the taking of Māori “cultural, intellectual, religious and spiritual property” without the free, prior and informed consent of Māori, as protected under Article 11(2) of UNDRIP. This must be taken into account by the Privacy Commissioner under s 21(b) of the Privacy Act, as Aotearoa, having endorsed UNDRIP, has accepted it as part of its international obligations.
Discrimination in the Proportionality Test
The ALN supports the additional requirement that organisations should have to ensure that biometric processing is proportionate. (Question 15) We believe that the additional considerations required by the proportionality test are a positive step, as they more adequately address the risks brought about by biometric technology than IPP 1 standing alone.
While we support the six factors of the proportionality test listed in rule 1(2), we suggest amendments to rules 1(2)(e) and 1(2)(f). (Question 16) We recommend that these sub-rules explicitly designate discrimination as a central consideration in the proportionality test.
The consultation document states that rules 1(2)(e) and 1(2)(f) were intended to capture “any different impacts the use of biometrics has on demographic groups”, including “discrimination and heightened surveillance”, an intention we acknowledge and support. [11] However, we believe the proposed language (“any cultural impacts or effects of the biometric processing”) is too broad.
An ordinary reading of the proposed rules by a reasonable person would not call to mind concerns of discrimination and heightened surveillance. The term “cultural impacts and effects” is more likely to call to mind cultural sensitivity and how different groups perceive biometrics than how biometrics could affect individuals from different backgrounds, whether or not they are aware of it.
The Draft Code is also intended to guide agencies with a range of privacy acumen. We are concerned that the current wording of these rules in the Draft Code would not adequately signal to all agencies working to implement biometric systems that they should be considering, as part of the proportionality assessment, the risk that the systems could discriminate or lead to discriminatory outcomes. This is inconsistent with OPC’s intention to protect vulnerable demographics from discriminatory use of biometrics, and our view of what the code should achieve.
We propose the following amendments to rule 1(2)(e) and 1(2)(f):
Rule 1(2)(e): whether biometric processing would have any effects or cultural impacts (including discriminatory effects or impacts) on Māori; and
Rule 1(2)(f): whether biometric processing would have any effects or cultural impacts (including discriminatory effects or impacts) on any other New Zealand demographic group.
We believe these amended rules would make clear to all agencies and individuals, regardless of their understanding of the risks of biometrics, that they should consider the potential of their biometric systems to discriminate and develop appropriate privacy safeguards to prevent this from happening.
Privacy Safeguards
We support the requirement for organisations to adopt reasonable and relevant privacy safeguards to mitigate privacy risks. (Question 19) We also agree with the definition of privacy safeguards and the list of suggested privacy safeguards, including measures that could meaningfully mitigate the risks of discrimination. (Question 20)
However, this requirement’s objectives could be better achieved by making certain privacy safeguards mandatory for use cases that pose greater risks.
None of the privacy safeguards are compulsory; it is up to agencies to determine which safeguards are reasonable and relevant to adopt. [12] We appreciate the benefits of minimising red tape by providing agencies with this flexibility, and agree that it would be appropriate for the significant majority of biometric use cases. However, we are concerned that this relaxed approach may not be appropriate for use cases that pose greater privacy risks.
Some biometric use cases, such as the facial recognition software being trialled in supermarkets across Aotearoa New Zealand, involve identifying individuals alleged to have committed crimes. We are particularly concerned with these use cases because such systems have been observed to be markedly less accurate at identifying women and ethnic minorities, and misidentification has significant ramifications for the individuals concerned.
The misidentification of individuals as alleged criminals treads close to infringing the right against unreasonable search and seizure, which is protected in the New Zealand Bill of Rights Act. [13] Using technologies that are more likely to misidentify individuals with certain demographic profiles as alleged criminals, without adequate safeguards or counter-measures, risks that right being disproportionately infringed for those demographics - which would in turn infringe the right to freedom from discrimination.
In such cases, the privacy risks - particularly those posed to minority demographics - are much greater, and the proposed standard is not proportionate to them.
We would like privacy safeguards to be mandatory for those use cases that risk infringing protected rights. While supporting the list of privacy safeguards drafted in rule 3(3), we propose the insertion of the following rule under rule 1(1)(c):
Rule 1(1)(c)(a): if the collection of biometric information risks infringing on rights and freedoms affirmed in the New Zealand Bill of Rights Act 1990, the agency must implement the privacy safeguards in rules 3(3)(c)-(h).
The insertion of this sub-rule, requiring mandatory privacy safeguards only for use cases that risk infringing on affirmed rights and freedoms, would avoid burdening agencies implementing biometric systems for uses that do not risk infringing on protected rights. We have excluded rules 3(3)(a) and (b) in this proposed rule, noting the OPC’s findings following previous consultation - that a general consent requirement would not be practicable.
We believe that requiring agencies to assess whether their use of biometrics would infringe on affirmed rights, and to implement certain privacy safeguards if it would, adds greater clarity to the relevance and reasonableness tests required by the proportionality assessment, and would help protect the rights of minority demographics.
Restricting Overseas Data Transfers
Rule 12 of the Draft Code proposes that, before organisations may send New Zealand-collected biometric data to an overseas jurisdiction, the legislation of that jurisdiction must have safeguards comparable to the Draft Code, rather than just the Privacy Act. Question 41 of the consultation document asks whether submitters agree with this rule.
We support greater protection for biometric data sent overseas, but believe rule 12 does not go far enough to protect members of Asian communities who may be more exposed to the activities of overseas governments.
Rule 12 requires sending organisations to ensure that the overseas jurisdiction has safeguards for biometric data comparable to those of Aotearoa New Zealand. However, the legislation of an overseas jurisdiction may not reflect actual practice there, particularly where the rule of law does not hold.
Rule 12 therefore falls seriously short, because it seeks comparability of legislation rather than of practice. Biometric data could be sent overseas on the strength of a dead letter - a law neither observed nor enforced because of different political or legal cultures.
This is particularly serious because once biometric data is overseas, the sending organisation and Aotearoa New Zealand authorities will be unable to recall it. Biometric data is intrinsically vulnerable because people cannot change it; they can forever be identified by it.
For example, it is conceivable that an overseas government could obtain biometric data and link it to the online activities of New Zealanders, so that when they arrive at a foreign airport, customs officials could use that data to connect them to those activities.
Because of these security concerns (and the issues relating to Māori data sovereignty discussed above), we would prefer that organisations be prohibited from sending biometric data beyond New Zealand shores.
Conclusion
Thank you for the opportunity to make submissions in respect of the biometrics processing code. We are available to discuss the feedback if required.
Please reach out to us at contact@asianlegalnetwork.org.nz if you have any questions.
[1] Alex Najibi “Racial Discrimination in Face Recognition Technology” (24 October 2020) Science in the News - Harvard Kenneth C. Griffin Graduate School of Arts and Sciences <https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology>.
[2] Gender Shades <http://gendershades.org/overview.html>. See also their academic paper: Joy Buolamwini and Timnit Gebru “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification” (2018) 81 Proceedings of Machine Learning Research 1.
[3] Pokere Paewai “Māori woman mistaken as thief by supermarket AI not surprising, experts say” (17 April 2024) RNZ <https://www.rnz.co.nz/news/national/514523/maori-woman-mistaken-as-thief-by-supermarket-ai-not-surprising-experts-say>.
[4] Sandra Conchie “Supermarket facial recognition trial: Rotorua mother’s ‘discrimination’ ordeal” (13 April 2024) RNZ <https://www.rnz.co.nz/news/national/514155/supermarket-facial-recognition-trial-rotorua-mother-s-discrimination-ordeal>.
[5] We also note the OPC’s stated intention to “[apply] a Te Ao Māori lens”. Statement of Intent: 1 July 2020 to 30 June 2024 (Te Mana Mātāpono Matatapu | Privacy Commissioner, July 2020) at 8.
[6] Office of the Privacy Commissioner position on the regulation of biometrics (Office of the Privacy Commissioner, October 2021) at 1.1.
[7] Exposure draft of a biometric processing code of practice: consultation paper (Te Mana Mātāpono Matatapu | Privacy Commissioner, April 2024) at 11-12.
[8] Principles of Māori Data Sovereignty (Te Mana Raraunga, Brief 1, October 2018) at 1.1 and 6.3.
[9] At 2.2.
[10] Biometric Processing Privacy Code (exposure draft, for comment only) at rr 3(2) and 1(1)(c).
[11] Exposure draft of a biometric processing code of practice: consultation paper (Te Mana Mātāpono Matatapu | Privacy Commissioner, April 2024) at 32-33.
[12] Exposure draft of a biometric processing code of practice: consultation paper (Te Mana Mātāpono Matatapu | Privacy Commissioner, April 2024) at 35.
[13] New Zealand Bill of Rights Act 1990, s 21.