
Melbourne University Law Review

Melbourne Law School


Hildenbrand, Serena Syme --- "Public Sector Data Sharing: Applying State-based Human Rights Laws to Minimise Privacy Harms" [2024] MelbULawRw 8; (2024) 47(2) Melbourne University Law Review 386


PUBLIC SECTOR DATA SHARING:
APPLYING STATE-BASED HUMAN RIGHTS LAWS TO MINIMISE PRIVACY HARMS

Serena Syme Hildenbrand*

Public sector data sharing offers public benefit but raises risks of privacy harm. This article considers the Victorian legal framework for such data sharing, highlighting risks to privacy (individual and social) and describing additional protection offered by the Charter of Human Rights and Responsibilities Act 2006 (Vic) (‘Victorian Charter’), including by reference to the varied origins of privacy and data protection legislation. It argues that Victorian courts have taken a wrong turn in applying the Victorian Charter privacy right, diverging from overseas approaches and the principle of legality and weakening its protection. Drawing parallels with Queensland and the Australian Capital Territory, this article identifies remedies, explaining how appropriately applied human rights-based privacy rights could uphold social value, maximise social licence and minimise privacy harms.


I Introduction

Imagine a future world of ideal government-promoted convenience. Each government department you approach has your personal data to hand and can quickly respond based on your other dealings with government. Your location is known due to travel data, so advice is tailored. Private companies allocate you to a demographic segment, understand each segment deeply using de-identified government data, and microtarget their pricing and marketing to you. Government predicts your future activities, nudging your behaviour and assessing your regulatory risk profile accordingly. Law enforcement efficiently tailors its resourcing and intervention to each person’s likely future actions. In reality, most people are likely to consider this ‘ideal’ world a significant and alarming intrusion into their personal privacy and autonomy. But new data technologies and a drive for government efficiency are leading Australians in this direction, with current information privacy legislation unlikely to provide much impediment to these developments.

Data sharing by public sector agencies offers benefits but can also generate risks and potential harms. The 21st century has been described as the ‘era of big data’,[1] with demand for government data increasing.[2] Like many governments, the Victorian government is conscious of the potential social and efficiency benefits offered by data, and imposes a ‘clear responsibility on Victorian government agencies to share data’.[3] Data analytics offer significant promise to public and private sectors alike, with some estimates placing the value of public sector data to Australia at as much as $64 billion per year.[4] Big data techniques such as machine learning and artificial intelligence have the potential to expand the use of public sector data even further.[5]

But data analytics are not benign from the point of view of individuals whose data are being collected, used, disclosed and shared. Those individuals may experience harm if their data are misused, such as through financial fraud, identity theft and family violence, and such individuals may be at risk of other physical harm.[6] Public sector agencies may also experience harm if they are involved in or contribute to such misuse, principally through reputational damage and loss of trust.[7] Such harm can spread across agencies: a significant mistake by one agency is likely to affect trust across the government.

The harms flowing from interferences with privacy are commonly viewed through an individualistic lens.[8] This is necessary but insufficient; a focus on individual harms and self-protection made sense in the 20th century context of the misuse of individual records, but this lens is too narrow for big data. Collective harms caused by data practices arguably include undermining the social value of privacy and violating social licence.

Existing Australian legal protections are largely founded on information privacy frameworks designed for an earlier era of information collection and use. This legislation generally protects ‘personal information’ only[9] (information which may allow identification of a data subject).[10] Federal and state information privacy statutes such as the Privacy Act 1988 (Cth) (‘Privacy Act’) and the Privacy and Data Protection Act 2014 (Vic) (‘PDPA’) lack some of the protections found in contemporary European data protection laws,[11] and continue to rely on data subjects making their own privacy decisions via the processes of notice and consent.[12] For this and other reasons, statutory information privacy frameworks appear inadequate in the face of 21st century data challenges.[13]

Victoria has been selected as a jurisdictional case study because its human rights framework may provide additional protections for data subjects. In Victoria, information privacy regulation is complemented and enhanced by broader human rights legislation, namely the Charter of Human Rights and Responsibilities Act 2006 (Vic) (‘Victorian Charter’). Because specific human rights legislation operates at a higher ‘principles’ level, it may have a valuable role in futureproofing more granular information privacy frameworks. Both Queensland and the Australian Capital Territory (‘ACT’) have similar legislation — the Human Rights Act 2019 (Qld) (‘Qld HRA’) and Human Rights Act 2004 (ACT) (‘ACT HRA’) respectively — so an approach developed by reference to Victoria’s legal framework could be applied in those jurisdictions too. The prospect of a federal bill of rights is regularly raised,[14] and these arguments would provide strong support for the adoption of such a model in the federal context.[15]

This article argues that in a big data context it is appropriate to strike a fine balance between data use and privacy. Consistent with the common law principle of legality, this requires an appropriately limited approach to exceptions from the protection of privacy rights. This article will point out how Victorian courts have taken a wrong turn in this regard, disregarding foreign jurisprudence and broadening the scope of exceptions applicable to the Victorian Charter privacy right.

Part II of this article describes some of the risks and personal, institutional and social harms of public sector data sharing and briefly summarises existing gaps in the legal framework. Part III outlines the Victorian legal framework for data sharing, including the Victorian Charter. Part IV develops a normative framework for legal protection, describing ‘opaque’ and ‘transparent’ modes of protection and the social value of privacy and social licence. Part V explores the practical application of the relevant human rights legislation and identifies how Victorian courts have taken a wrong turn. Part VI concludes with recommendations to protect data subjects.

II Public Sector Data Sharing: The Need for Additional Protection

A The Importance of Data to the Public Sector

In the 20th century, information privacy legislation focused on the collection and disclosure of written information, the interception of private communications and the computerised exchange of information between government databases.[16] But the 21st century’s artificial intelligence-driven big data analytics present a new challenge to privacy law, enabling predictions and inferences to be drawn based on vast volumes of digital data, exposing new aspects of data subjects’ lives, attitudes, interactions and movements.[17]

In this context, the general term ‘data analytics’ refers to ‘processes or activities which are designed to obtain and evaluate data to extract useful information’,[18] usually by providing insights or making predictions.[19] The more specific term ‘big data analytics’ refers to the use of artificial intelligence and machine learning techniques to process ‘big data’ (characterised by high volumes, velocity and variety of data)[20] to identify correlations and patterns.[21] As the name suggests, big data analytics require vast volumes of data to train an algorithm and allow it scope to locate patterns of interest. I use the term ‘data subjects’, following the terminology used in the European Union’s General Data Protection Regulation (‘GDPR’),[22] to refer to individuals whose data are being shared and used, while acknowledging that this term is imperfect given its implication that individuals are merely passive subjects. In this article, ‘data sharing’ refers to the sharing of personal information with a specified recipient under legal protection (such as a data sharing agreement) and does not cover public data release or open data.

A 2015 Organisation for Economic Co-operation and Development (‘OECD’) report titled Data-Driven Innovation: Big Data for Growth and Well-Being assessed the economic potential of data-driven innovation, concluding that it had huge potential to address economic and resource distribution challenges.[23] The Australian Productivity Commission’s 2017 Data Availability and Use Inquiry Report (‘Productivity Report’) also identified a range of benefits to individuals and governments of increased public sector data sharing, including: improved service delivery; assisted decision-making; better utilisation of resources; and more effective risk management.[24]

B Potential Harms to Data Subjects

The collection and use of data may also cause harm to data subjects. While not specifically focused on human rights, the OECD has highlighted the potential infringements of privacy and human rights arising from every step in the data value chain, including: over-collection of personal information; threats to stored data; intrusive inferences from personal information; information asymmetry; and real-world discrimination due to data-driven decisions.[25]

There is some evidence that Australians expect public sector data sharing to occur but are uncomfortable with aspects of it. The Australian Information Commissioner’s Australian Community Attitudes to Privacy Survey 2020 revealed that only 36% of Australians are comfortable with intra-government data sharing (up from 30% in 2017), while 40% are uncomfortable.[26] Australians are even less comfortable with government sharing their data with private businesses (15% comfortable, 70% uncomfortable).[27] Further, 59% of Australians experienced ‘problems’ in the handling of their personal information over the previous year.[28] The problems are not exhaustively defined and range from transparency-related issues (such as missing privacy policies), through processing of personal information without explicit consent, to a variety of cyber harms.[29]

Consumer harms arising from data sharing and data use were also explored by the Australian Competition and Consumer Commission (‘ACCC’) in its 2019 Digital Platforms Inquiry, which identified harms including the collection of location data, online tracking and the non-transparent sharing of personal data with third parties, leading to decreased consumer welfare through reduced trust and privacy, profiling, discrimination, reduced competition and impacts on vulnerable consumers.[30] While not all these harms will attach to public sector data sharing, the act of sharing personal information heightens the risk of such damage.

Importantly, privacy harm is not only triggered by unauthorised disclosure; the use of personal data can cause harm even if data are used as intended. Daniel J Solove’s taxonomy of privacy problems illustrates how common uses of personal information can cause dignitary harms (impacting reputation or causing angst) and architectural harms (increasing risks of harm or undesirably shifting power), even without any clear misuse or theft of data.[31] Solove outlines how architectural harms in particular can harm society as well as individuals: by ‘chilling’ behaviour, creating power imbalances and impeding activities that contribute to social goods such as political participation and free expression.[32] Current uses of big data technologies can create such architectural harms. Data integration and data matching involve augmenting personal data with data from other sources to create a more complete picture of an individual, and data mining can highlight patterns in the data which expose sensitive personal details.[33] While such practices offer considerable opportunity for businesses and public sector agencies, they may exceed the data subject’s intention in sharing their data. There are examples in the literature of retailers using correlations in purchase data to draw accurate yet intrusive conclusions about individuals, leading to the disclosure of highly personal information which shoppers had no intention or awareness of revealing.[34] There is no suggestion that similar practices are currently in use in the Australian public sector, but the drive within government to improve customer experience and streamline customer service resourcing may promote such approaches.[35]

The significant 2022 Optus data breach in Australia highlighted the public anxiety triggered by the exposure of government identifiers and the harm of stress and inconvenience to individuals who were attempting to understand the exposure and block or replace compromised identity documents.[36] Organisations implicated in such significant data incidents can suffer lasting reputational and financial damage.[37]

Public sector data activities may increase privacy risks and harms. In any given jurisdiction, multiple public sector agencies each hold some personal data, but such siloed data may offer relatively low utility to agencies using big data analytics.[38] Public sector data sharing allows the incorporation of limited and discrete sources of personal information into more comprehensive public and private databases by means of data integration and data matching.[39] This generates the volumes of rich data necessary for big data analytics and may allow levels of surveillance and intrusion that would not otherwise be possible.[40] In this way, public sector data sharing may increase the value of the data for analytics but generate new privacy risks. Public sector agencies are also bound by public records legislation,[41] which can require personal customer data to be retained for long periods,[42] meaning that rich and extensive public sector data reserves also cover decades of information, increasing the risk of misuse and harm.

Public sector data matching is not new. Relying on the Data-Matching Program (Assistance and Tax) Act 1990 (Cth),[43] federal government agencies such as the Australian Taxation Office (‘ATO’) and Services Australia commonly match customer data among themselves and with other federal agencies to ensure compliance and prevent fraud.[44] The risks of such an approach were vividly illustrated by the Australian Government’s ‘Robodebt’ scheme, which incorrectly matched Centrelink data against ATO records to recoup overpayments and led to a Royal Commission, a class action and reparations exceeding $1 billion.[45] It has parallels to the Dutch System Risk Indication (‘SyRI’) program, developed to detect welfare fraud using data linkage and automated decision-making.[46] The Hague District Court overturned the law establishing SyRI in 2020 on the basis that it contravened human rights obligations.[47]

Public sector data collection developed further during the COVID-19 pandemic.[48] All Australian states and territories required citizens to provide personal contact tracing data via jurisdiction-specific contact tracing apps, with assurances that it would only be used for public health purposes.[49] Yet freedom of information requests to the Victorian Department of Health revealed that it denied multiple requests for contact tracing data, including two requests by Victoria Police.[50] Victoria has since amended its legislation to limit these secondary uses.[51]

C Gaps in Legal Protection

Given such seismic developments in data technology and big data analytics, pressure on the legal framework is to be expected. As the Productivity Report noted: ‘[t]he legal and policy frameworks under which public and private sector data is collected, stored and used (or traded) in Australia are ad hoc and not contemporary’.[52] This dated legal framework gives rise to certain gaps in legal protection of data subjects.

In assessing the effectiveness of Australia’s information privacy framework in light of developments in data and technology, the Australian Law Reform Commission (‘ALRC’) stated in its 2014 report Serious Invasions of Privacy in the Digital Era that ‘[a]lthough the existing law provides protection against some invasions of privacy, there are significant gaps or uncertainties’.[53] The gaps identified by the ALRC included piecemeal state-based protections, the restriction of the Privacy Act to information privacy and inadequate remedies.[54] These findings highlight the inconsistent coverage of the Australian information privacy framework, which consists of a patchwork of federal, state and territory laws that do not necessarily cohere well or cover all relevant privacy harms or data users.

In its 2021 Human Rights and Technology report, the Australian Human Rights Commission (‘AHRC’) described the potential for gaps in information privacy law in the light of data-rich new technologies such as artificial intelligence.[55] Additional gaps identified by the AHRC included the pressure that big data technologies place on the Privacy Act’s reliance on consent, de-identification of personal information[56] and narrow interpretations of ‘personal information’.[57] The ACCC also expressed concern that the legal framework in Australian jurisdictions may be ill-equipped to protect Australians, noting in its Digital Platforms Inquiry that ‘the existing regulatory frameworks for the collection and use of data have not held up well to the challenges of digitalisation’.[58]

In her 2020 article on the use of artificial intelligence in government, Moira Paterson also identified numerous deficiencies in the Privacy Act in the context of big data analytics, including: an overly narrow definition of ‘personal information’; a failure to protect groups and not just individuals;[59] an outdated focus on the ‘purpose’ of data use; and the need to limit legislative overrides of privacy legislation.[60] The Attorney-General’s Department’s recently completed review of the Privacy Act appears to address only one of these identified deficiencies,[61] and while it is appropriately regarded as a ‘landmark blueprint for privacy law reform in Australia’,[62] the improvements it offers will not be immediate — the federal government released its response to the review in September 2023, with plans for additional consultation on proposed legislative amendments to take place in 2024.[63] So the gaps identified above in Commonwealth legislation will endure, and as that framework is mirrored by state and territory legislation, one can expect similar gaps to be present in the Victorian, Queensland and ACT information privacy frameworks.

III The Victorian Legal Framework for Data Sharing

In Victoria, data sharing relies principally on a statutory framework of information privacy laws. This reflects the lack of any common law right to privacy, as made clear by the High Court in the 1937 case of Victoria Park Racing & Recreation Grounds Co Ltd v Taylor.[64] That position endured until 2001, when Callinan J suggested in Australian Broadcasting Corporation v Lenah Game Meats Pty Ltd that ‘the time is ripe for consideration whether a tort of invasion of privacy should be recognised in this country’.[65] There has been some willingness in inferior courts to follow this approach and identify a common law right of privacy,[66] but such a ‘right’ remains underdeveloped.[67] Accordingly, information privacy protection in Australia has tended to be a creature of statute.

A Information Privacy Legislation

Information privacy protection was introduced at the Commonwealth level by the Privacy Act, implementing elements of the 1980 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (‘Guidelines on the Protection of Privacy’).[68] The Privacy Act now applies to any Australian private sector entity with a turnover of more than $3 million per financial year, including state government contractors.[69] In turn, Victoria enacted the Information Privacy Act 2000 (Vic) (‘Information Privacy Act’), which was replaced in 2014 by the PDPA.[70] Following the Commonwealth’s specific request,[71] the Information Privacy Act only covered, and the PDPA continues to only cover, the Victorian public sector to limit regulatory burden on the private sector.[72]

Both Victorian statutes contained Information Privacy Principles (‘IPP’) designed to protect personal information, similar to the Australian Privacy Principles under the Privacy Act.[73] To summarise, the IPP allow public sector agencies to collect the data they need by lawful and fair means, and with transparent privacy policies and collection notices.[74] Data collected should not be used for a secondary purpose distinct from the primary purpose of collection unless one of the specific criteria listed in cl 2.1 is met.[75] The IPP also require the data to be accurate and complete, safely stored, transparently managed, for access to be available for correction and for sensitive information to be treated with special care.[76] The PDPA includes provisions regulating information security, including by introducing the Victorian protective data security standards.[77]

Section 20 of the PDPA requires an organisation to comply with the IPP in relation to any personal information it collects or uses and s 13 sets out the organisations to which the PDPA applies — essentially Victorian public sector agencies. Many such agencies are emanations of the Crown with no separate legal identity, but the combined effect of ss 13 and 20 is that each Victorian public sector agency is viewed as a separate unit for the purposes of compliance with the IPP and therefore for data sharing.[78]

Privacy protection under the PDPA only applies to ‘personal information’, defined as ‘information ... about an individual whose identity is apparent, or can reasonably be ascertained, from the information’.[79] The Victorian information privacy framework also includes other legislation such as the Health Records Act 2001 (Vic), the Victorian Data Sharing Act 2017 (Vic), the Surveillance Devices Act 1999 (Vic), sector-specific data sharing provisions[80] and various secrecy provisions.[81]

In practice, this gives rise to a somewhat piecemeal approach to information privacy protection, with the legislative basis for data sharing provided by some combination of sector-specific legislation and the IPP. In this context, overarching human rights legislation like the Victorian Charter and equivalent ACT and Queensland legislation[82] potentially fills a valuable role by adding a layer of ‘principles-based’ protection to address gaps in the more granular information privacy framework. Paterson notes that the Victorian Charter privacy right ‘gives explicit recognition to the status of privacy as [a] human right, which arguably fills a gap’.[83]

B The Victorian Charter

In 2006, Victoria was the first Australian state to introduce specific human rights legislation,[84] comprising 20 human rights which ‘primarily derive from’ the 1966 United Nations International Covenant on Civil and Political Rights (‘ICCPR’).[85] Importantly, the Victorian Charter was intended to have its greatest impact within the public service, enshrining a culture of human rights.[86] While the Victorian Charter is intended to protect the rights of individuals, it also has a broader social purpose, arising from this cultural emphasis together with a key principle in its preamble: ‘human rights are essential in a democratic and inclusive society’.[87] Section 13 of the Victorian Charter contains the privacy right, the most relevant right for our purposes.

Table 1 compares the relevant privacy rights in Australia and internationally, demonstrating that the s 13 Victorian Charter privacy right closely mirrors art 17(1) of the ICCPR; the two provisions have tended to be treated as identical.[88] Section 13 is less similar to the equivalent European right, art 8 of the Convention for the Protection of Human Rights and Fundamental Freedoms (‘ECHR’),[89] which is nevertheless important due to its extensive body of jurisprudence. Table 1 also shows strong similarities between s 13 and relevant provisions in Queensland and the ACT. Note that the Australian provisions are considered to have ‘internal limitations’, in that they only proscribe interferences with privacy which are arbitrary or unlawful.[90]

Table 1: Privacy Protections of the Victorian Charter, Comparable State and Territory Legislation, ICCPR and ECHR

Victorian Charter s 13
A person has the right—
(a) not to have that person’s privacy, family, home or correspondence unlawfully or arbitrarily interfered with; and
(b) not to have that person’s reputation unlawfully attacked.

ICCPR art 17(1)
No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation.

ECHR art 8
(1) Everyone has the right to respect for his private and family life, his home and his correspondence.
(2) There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society ...

Qld HRA s 25
A person has the right—
(a) not to have the person’s privacy, family, home or correspondence unlawfully or arbitrarily interfered with; and
(b) not to have the person’s reputation unlawfully attacked.

ACT HRA s 12
Everyone has the right—
(a) not to have his or her privacy, family, home or correspondence interfered with unlawfully or arbitrarily; and
(b) not to have his or her reputation unlawfully attacked.

In addition to s 13, there are several key operative provisions of the Victorian Charter relevant to the application of the privacy right. This article will now briefly explore the general limitations clause (s 7(2)), the legislative override (s 31), the interpretive obligation (s 32), and the conduct obligation (s 38).

The s 7(2) general limitations clause indicates that a human right may be subject to reasonable limitations and sets out the approach to assessing whether a limitation is reasonable and demonstrably justified. The ICCPR has no such explicit clause, but similar clauses can be found in s 13 of the Qld HRA and s 28 of the ACT HRA. The interaction between the general limitations clause and the Victorian Charter s 13 internal limitations has made it hard for applicants to establish an unjustified limitation of privacy.[91] Commentators have noted that ‘[i]f an individual who claims that their rights under s 13(a) have been breached must establish that the interference was unlawful or arbitrary, there would seem to be no work for s 7(2) to do’.[92] If the applicant establishes that the interference was unlawful or arbitrary, the onus will rest on the public authority to establish that the limitation is justified within the meaning of s 7(2).[93]

I will briefly touch on the s 31 legislative override, which allows the Victorian Parliament to specifically override Victorian Charter rights in ‘exceptional circumstances’,[94] with each such override sunsetting after five years.[95] Parliament has only used this mechanism three times: twice to prevent named prisoners and ‘police-killers’ from accessing parole,[96] and once to address issues in uniform legal practice across Australia.[97]

Section 32 contains the interpretive obligation, which applies to anyone interpreting Victorian statutory provisions, including public servants and courts:

(1) So far as it is possible to do so consistently with their purpose, all statutory provisions must be interpreted in a way that is compatible with human rights.

(2) International law and the judgments of domestic, foreign and international courts and tribunals relevant to a human right may be considered in interpreting a statutory provision.

Similar interpretive obligations are contained in s 48 of the Qld HRA and ss 30–1 of the ACT HRA. Note that s 32(1) of the Victorian Charter does not allow a provision to be interpreted so that it has no effect; any interpretation of provisions must not ‘displace Parliament’s intended purpose ... in a manner which avoids achieving the object of the legislation’.[98]

The leading authority on s 32(1) is the High Court decision in Momcilovic v The Queen (‘Momcilovic’),[99] which resulted in six disparate judgments offering varied guidance. Chief Justice French compared s 32(1) to the common law principle of legality, stating that ‘[s] 32(1) applies to the interpretation of statutes in the same way as the principle of legality but with a wider field of application’.[100]

The common law principle of legality is the presumption that ‘Parliament does not intend to interfere with common law rights and freedoms except by clear and unequivocal language for which Parliament may be accountable to the electorate’.[101] Where the principle of legality applies, a court may potentially ‘read down’ a statutory provision to prevent it from interfering with common law rights.[102] In the High Court case of Coco v The Queen, a majority held that to interfere with fundamental common law rights, the principle of legality requires ‘a clear expression of an unmistakable and unambiguous intention’.[103] Their Honours referred with approval to this concise summary of the principle of legality given by Brennan J in Re Bolton; Ex parte Beane: ‘Unless the Parliament makes unmistakably clear its intention to abrogate or suspend a fundamental freedom, the courts will not construe a statute as having that operation.’[104]

In Australia, the rationale for the principle of legality has tended to be based on legislative intention.[105] But in a subtly different approach, the ‘quintessential statement of the principle’[106] was delivered in the United Kingdom (‘UK’) by Lord Hoffmann in R v Secretary of State for the Home Department; Ex parte Simms: ‘But the principle of legality means that Parliament must squarely confront what it is doing and accept the political cost. Fundamental rights cannot be overridden by general or ambiguous words.’[107]

Following Momcilovic, Victorian courts have tended to view s 32(1) as a codification of the principle of legality.[108] Bruce Chen notes that this approach may not be accurate or useful, as there are good reasons to give greater normative force and scope to s 32(1) than to the principle of legality,[109] yet courts are currently construing it to provide ‘unjustifiably weak’ protection.[110]

Momcilovic also offered some commentary on s 32(2) of the Victorian Charter, with French CJ expressing caution about the application of foreign jurisprudence:

Section 32(2) does not authorise a court to do anything which it cannot already do. ... If such a judgment concerns a term identical to or substantially the same as that in the statutory provision being interpreted, then its potential logical or analogical relevance is apparent. ... Nevertheless, international and foreign domestic judgments should be consulted with discrimination and care.[111]

Justice Gummow expressed similar cautions, noting the importance of situating statutory human rights within their specific constitutional framework.[112] In the Victorian Court of Appeal case of Bare v Independent Broad-Based Anti-Corruption Commission, Warren CJ referenced this reasoning in relation to the application of s 32(2), noting she would ‘consider some of the foreign and

international jurisprudence, but ... with the caution urged by French CJ and Gummow J in Momcilovic’.[113] In relation to the s 13 privacy right, it will be most defensible to apply overseas jurisprudence if the wording of the comparative provision is similar (such as art 17 of the ICCPR).[114]

Section 38(1) of the Victorian Charter imposes conduct obligations requiring public authorities (including public servants) to act compatibly with human rights and to give proper consideration to human rights in making any decision.[115] Similar conduct obligations can be found in s 58 of the Qld HRA and s 40B of the ACT HRA.[116] In Castles v Secretary, Department of Justice (‘Castles’), Emerton J indicated that this is a practical task and not a legal one, noting that

proper consideration of human rights should not be a sophisticated legal exercise. Proper consideration need not involve formally identifying the ‘correct’ rights or explaining their content by reference to legal principles or jurisprudence.[117]

It follows that public servants are not required to undertake a legal exercise or apply current jurisprudence in order to fulfil their obligations to give proper consideration.

As a matter of policy and procedure, public sector agencies usually prepare an assessment to demonstrate giving proper consideration to relevant human rights in respect of a decision and acting in accordance with such rights.[118] For a data sharing proposal, such an analysis would usually involve identifying the engagement of the privacy right, including by considering whether the conduct is unlawful or arbitrary and (if arbitrary) assessing under s 7(2) whether any resulting limitation is reasonable and justified.[119] In Victoria, such an assessment would ideally be prepared alongside a privacy impact assessment with each informing the other.[120]

For a richer analysis of the s 13 privacy right, it is helpful to turn to the normative basis and origins of privacy rights more broadly.

IV A Normative Framework: The Underlying Values

This Part examines the normative and social values of privacy and data protection. Although the recognition of both privacy and data protection rights is not uniform across jurisdictions, the values underpinning those rights have relevance to all liberal democracies. This Part explores and expands on the following values, which I consider critical to effectively applying privacy and data sharing protection in the public sector:

1 Privacy and data sharing protection have distinct but complementary foundations, offering versions of ‘opacity’ and ‘transparency’ protection which are arguably both needed for adequate protection of private life.

2 Democracy depends upon such protection — citizens need private ‘breathing room’,[121] free from both state interference and surveillance, to discover their own moral autonomy or personhood.[122]

3 A liberal state and its public authorities have a strong interest in supporting and upholding these elements of democracy, which are essential to the ongoing health of the state.[123] Where it is necessary to limit privacy to enable the use of personal data by public authorities, democratic transparency and accountability are required — in reliance both on the transparency principles of data protection and also on theories of social licence.

4 In adopting new privacy-impacting technologies, liberal states carry a special obligation to carefully consider and provide for the application of privacy rights, including by appropriately balancing opacity and transparency elements.

A Privacy: Theoretical and Legal Origins

Privacy is a relatively recent legal concept, which emerged during the 19th century and generally developed in parallel with (or in response to) technological advancement.[124] Early formulations of privacy centred on the value of ‘private life’,[125] the prevention of ‘unreasonable searches and seizures’ reflected in the Fourth Amendment to the United States Constitution, and the ‘general right of the individual to be let alone’ highlighted by Warren and Brandeis in their seminal 1890 article ‘The Right to Privacy’.[126] Such approaches reflected a growing sense of the importance of the liberty of citizens from the power of the state.[127]

There has been considerable debate on the theoretical underpinnings of privacy, but without clear consensus.[128] It has been described as a ‘concept in disarray’,[129] and as ‘a multidimensional, multi-faceted concept, the complexity of which is therefore hard to grasp within a single conceptual setting’.[130]

Theorists have focused on privacy as an essential component of individuals’ ability to develop social relationships,[131] engage in intimacy,[132] exercise self-determination over personal information[133] or sustain a ‘personal space’.[134] There are also critiques of privacy, ranging from the reductionist view that there is nothing morally or legally distinctive about privacy,[135] through to feminist critiques that privacy can be used as a shield to cover bad or criminal behaviour, particularly in the domestic sphere.[136] Privacy arguments have been made across a variety of fields, including bodily, spatial, communicational, proprietary, intellectual, informational, decisional, associational and behavioural privacy.[137]

The traditional concept of privacy is concerned with protection of an intimate private life,[138] reflective of the origins of privacy in the opacity, protection and secrecy of private information. We see this thread running through Prince Albert’s mid-19th century English legal action to protect his and Queen Victoria’s private etchings,[139] Warren and Brandeis’ article in the United States (‘US’)[140] and recent UK decisions penalising a newspaper publisher for disclosure of the Duchess of Sussex’s private correspondence.[141]

In the 20th century, privacy developed in a legal sense as a human right under international instruments such as the 1948 Universal Declaration of Human Rights[142] and the ICCPR. The formulation of these human rights focuses on privacy as traditionally constituted: the protection of home, correspondence, private life and reputation.[143]

B Privacy and Data Protection: Opacity and Transparency

While privacy and data protection share some common ground, they emerge from distinct normative foundations. Data protection laws have tended to be based not on opacity but on transparency; namely an understanding that personal information can be legitimately processed and shared in certain contexts.[144]

The non-binding 1980 OECD Guidelines on the Protection of Privacy provide one of the earliest examples of such an approach, seeking to establish a firm basis for the commercial use of data across member states.[145]

Justice Michael Kirby, chair of the OECD expert group that developed these guidelines, later noted:

The concern which propelled the OECD into the issues of privacy was the fear that its member states would introduce incompatible and conflicting laws for the defence of privacy in the newly established databases of the interlinked information technologies.[146]

In that sense, these OECD guidelines were intended to pre-empt and limit privacy rights over personal data.

Drawing a contrast between the two approaches, De Hert and Gutwirth identify privacy as a ‘tool of opacity’ and data protection as a ‘tool of transparency’,[147] stating ‘[o]pacity tools embody normative choices about the limits of power; transparency tools come into play after these normative choices have been made in order still to channel the normatively accepted exercise of power’.[148] Transparency tools are procedural, specifying how information may be handled, while opacity tools challenge whether it should be used or disclosed at all.

The relationship between opacity and transparency tools can be visualised much like the protection of a house. A house offers opacity protection (walls, a roof, window coverings) essential to keep out the elements and prying eyes, but it also offers transparency protection (glass windows and open doors) to allow the inhabitants to experience the outdoors and venture outside. Just as both a transparent greenhouse and a highly protective but opaque bomb shelter are inadequate for comfortable housing, so the balance between opacity and transparency protection is arguably key to the design of an effective information privacy framework in the age of big data.

The Australian Privacy Act, strongly influenced by the OECD Guidelines on the Protection of Privacy,[149] is reflective of the transparent normative foundation of data protection; subsequent state-based legislation follows a similar approach. Consequently, so-called ‘privacy legislation’ in Australia may be better characterised as data protection legislation and a tool of transparency. The legislation does contain limited opacity elements like the higher degree of protection it applies to narrowly defined ‘sensitive information’, such as personal information about racial or ethnic origin, sexual preference, religious beliefs, political opinions and criminal records.[150] But it is instructive to apply the house analogy to this framework and consider whether one of its weaknesses is its excessive transparency — additional opacity protection seems warranted.

Privacy is recognised as a fundamental right under international human rights instruments, but it can be challenging to apply because it is not absolute.[151] It has been described as ‘a relatively weak fundamental right’ on the basis that ‘not a single aspect of privacy takes absolute precedence over other rights and interests’.[152] Data protection was recognised by the European Union as a fundamental right in 2000,[153] and the ECHR art 8 privacy right has also been taken to encompass data protection,[154] at least for ‘essentially private’ data.[155]

But even in the commonly more rights-protective European environment, data protection may be treated in practice as a procedural right rather than as an inviolable right. Article 8 of the ECHR arguably includes both a transparency tool (‘in accordance with the law’) focusing on whether an interference with rights is appropriately authorised, and an opacity tool (whether the use of the data is ‘necessary in a democratic society’). In their 2006 article, De Hert and Gutwirth argued that, in practice, the European Court of Human Rights was overly focused on the procedural requirement that the intervention be ‘in accordance with the law’, turning art 8 of the ECHR into a ‘transparency-promoting vehicle’.[156] De Hert and Gutwirth also consider it important that any law restricting privacy rights be authorised by a Parliament, noting that where this is not the case

[d]emocratically there is a loss. Insisting on the content of law, rather than on the formal basis of law, has allowed the Court to declare certain regulations ‘in accordance with the law’ that have not even been debated and approved by a parliament.[157]

The existence of a right to data protection is less clear outside Europe, including under the Victorian Charter. It is broadly assumed that the Victorian Charter privacy right will cover the protection of personal data, but to date there is little judicial precedent on this point.[158]

C Social and Democratic Contribution

Much of the academic discussion of privacy concerns its role as an individual benefit, but with an emerging parallel focus on privacy’s social importance.[159] For example, Cohen argues that information privacy is a ‘constitutive element of a civil society’.[160] Gellert and Gutwirth characterise it as ‘the ultimate defence line of liberty’, due to its role in filling the gaps left by more classical human rights in defending the boundary between citizen and state.[161] Galloway describes this boundary as follows: ‘Privacy in this context represents “breathing room” that allows us to be ourselves, guided by our social interactions and value systems, and free to become engaged citizens away from the gaze of the State.’[162] An understanding of privacy’s social value is helpful because it counters the narrowing of information privacy to the status of individual choices and strengthens the argument for state protection against harms.

Privacy theorists emphasise the important role of privacy in furthering the autonomy of citizens,[163] which is central to democracy. Mokrosinska considers that the autonomy afforded by privacy is both social and political: ‘privacy preserves the conditions of individual autonomy by preserving the integrity of political practices’.[164] Big data technologies and surveillance (fuelled by public sector data sharing) arguably threaten autonomy due to the vast reserves of personal data collected, amalgamated and analysed, and the lack of any form of genuine informed consent from data subjects to most of these activities.[165]

Social arguments have also been made for the importance of data protection. Rodotà comments that ‘the strong protection of personal data continues to be a “necessary utopia” if one wishes to safeguard the democratic nature of our political systems’.[166] Rouvroy and Poullet argue that both privacy and data protection rights ‘are necessary, in a democratic society, to sustain a vivid democracy’.[167] Gellert and Gutwirth go further, positing that a reliance only on transparency tools (data protection rights) will be inadequate ‘to keep intact the political private sphere of liberty’, and that it is essential to include opaque privacy rights, especially as a threshold test regarding the acceptance of new data practices.[168]

Another strong argument for a social framing of privacy and data protection is that increasingly, the harms posed by privacy infringements and data incidents are not individual harms, but collective harms.[169] This follows from the scope and impact of big data analytics practices which carry societal impacts exceeding the harm to any individual. As Graef and van der Sloot note, ‘no individual can fully protect herself against collective harms considering the interdependent nature of privacy protection and the competitive harm data technologies cause’.[170]

It remains important to address individual harms, but a broader normative framing assists in making the case for legislative protection that extends to collective harms.

D The Role of ‘Social Licence’

‘Social licence’ is a term used to represent the basis for an activity’s legitimacy as a form of social contract theory.[171] The concept of social licence first arose in relation to extractive industries, such as forestry and mining: industries which carried benefits and risks and where there was a perceived need for community cooperation and acceptance.[172] It is now common to see it raised in the context of data sharing — another activity with benefits and risks — to underpin or challenge the legitimacy of the activity.[173]

A key concern of the Productivity Report was ‘putting at the front and centre of any data use the requirement to maintain a social licence’.[174] This was considered of particular relevance to public sector data, given that public sector agencies operate under social licence — essentially a compact of trust and credibility with the public.[175] This social licence can easily be broken if governments act in a way that is not transparent and trustworthy and fail to communicate the value of their activities to the public. Despite the fact that data sharing may be legal, ‘legal authority does not necessarily command social legitimacy’.[176]

The Productivity Report identified the following factors as important to the maintenance of social licence for data sharing: shared public value; citizen control of participation; trust, through the use of data safeguards; and genuine accountability.[177]

In a recent review of the sharing of health data in the Australian public sector, Richards and Schneiber also identified the importance of social licence in data sharing, commenting that ‘[t]he key to appropriate governance in this area has consistently been identified as transparency, creation of trust and identification of the parameters of the social licence’.[178] They add: ‘If an activity (such as data-sharing) occurs outside that licence, then the legitimacy of the activity is open to challenge even if it takes place within an established legal framework.’[179]

E Digital Surveillance and Other Novel Technologies

I have already noted that with the adoption by the public sector of big data techniques and other new data technologies, the impacts on privacy are dynamic and changing.[180]

An activity that receives special attention in the privacy literature is surveillance, which can be defined as a systematic, focused and routine pursuit of a person or their data in order to manage, protect, direct or influence them.[181] Surveillance strongly engages the social value of privacy because its harms may extend beyond the individual to have a chilling effect on society.[182] Surveillance is generally presented as a security measure but depending on the mechanism, it may interfere with the intellectual and informational privacy of individuals and the power of free thought.[183]

Given that the focus of this article is public sector data sharing, references to surveillance include but are not limited to the common use of the term to describe tracking or monitoring people’s movements or communications. As described above, public sector data sharing can facilitate the development of a full digital picture of an individual.[184] Such data amalgamation ‘collapses the divides that all people create between different areas of their lives ... [and] represents a loss of control of our carefully managed identities’.[185] The bringing together of multiple public sector datasets may allow insight into a data subject’s conduct and preferences and may facilitate predictive analytics around future behaviour. It does not seem far-fetched to describe this as a form of digital surveillance.

In adopting such new privacy-impacting technologies, states arguably have an obligation to carefully consider and provide for the application of privacy rights including by appropriately balancing opacity and transparency elements. The European Court of Human Rights considered that a particular duty applied to European jurisdictions in such cases: ‘any state claiming a pioneer role in the development of new technologies bears special responsibility for striking the right balance in this regard’.[186]

With this normative framework in mind, I move now to consider the application of the legal framework in Victoria.

V Applying the Victorian Charter Privacy Right

Assessing Victorian information privacy law through the lens of transparency and opacity tools, almost all of the legislative protection appears closely aligned to transparent data protection, rather than to opaque privacy protection. Like the Privacy Act, the PDPA is best characterised as a transparency tool, allowing the sharing of personal information if certain requirements are met.[187] The Health Records Act 2001 (Vic), Victorian Data Sharing Act 2017 (Vic) and sector-specific data sharing provisions have a similar focus.[188] Both the Privacy Act and the PDPA offer some opacity protection to ‘sensitive information’, as defined.[189] Opacity protection can also be found in: the Surveillance Devices Act 1999 (Vic), which prevents the use of surveillance devices in private locations;[190] various secrecy provisions;[191] and potentially in s 13 of the Victorian Charter.

To return to the house analogy, Victorian information privacy legislation is largely a transparent greenhouse, with some isolated opaque sections provided by the legislative protection of sensitive information, restrictions on the use of surveillance devices in private spaces without consent,[192] and secrecy provisions.[193] This means that relevant legal arguments generally concern the transparency question of how relevant information should be used or disclosed, rather than the opacity question of whether it should be used or disclosed.

In this context, s 13 of the Victorian Charter has potential to offer considerable utility as a principles-level overlay in relation to privacy interferences, especially those arising from new data technologies. In essence, it could provide an opaque roof on the greenhouse, prohibiting or significantly limiting certain novel uses of personal and private information, including the burgeoning use of artificial intelligence tools and applications.

However, in the 15 years since it became fully operational, s 13 of the Victorian Charter has had a minimal impact on protecting privacy rights in Victoria. Paterson comments that the s 13 right is one of the most frequently asserted Victorian Charter rights, but ‘it has not had a substantial impact because of the very small number of cases where it has played a determinative role in relation to the outcome of litigation’.[194] She explains that this is frequently because the relevant conduct is not considered unlawful or arbitrary.[195] The s 13 internal limitations (‘unlawfully’ and ‘arbitrarily’) are effectively carve-outs in protection, weakening the opacity protection of the privacy right — if an interference is considered to be both non-arbitrary and lawful, s 13 has no real operation. In addition, the applicant bears the onus of proving that the relevant conduct was arbitrary or unlawful — this is often a difficult task, especially given that pertinent evidence may be in the sole possession of the public authority.[196] The application and practical impact of these limitations is an important element of the balance between opacity and transparency.

A ‘Arbitrarily’ Internal Limitation

In relation to the ‘arbitrarily’ limitation, the Victorian Charter Explanatory Memorandum expressed an intention that s 13 be interpreted consistently with ‘the existing information privacy and health records framework in Victoria to the extent that it protects against arbitrary interferences’.[197] The intention seems to be that an interference permitted by the existing transparency-focused data protection framework would not be considered arbitrary within the meaning of s 13.

The recent Victorian Court of Appeal case of Thompson v Minogue (‘Thompson’) concerned an application brought against prison authorities by a Victorian prisoner for urine testing and strip searching.[198] The Court confirmed that ‘arbitrarily’ has a broad meaning, taken from international human rights law, such that an arbitrary interference is ‘one which is capricious, or has resulted from conduct which is unpredictable, unjust or unreasonable in the sense of not being proportionate to the legitimate aim sought’.[199] In Thompson the strip searches were found to be arbitrary because even though they were not capricious, unpredictable or unjust, they were not proportionate to a legitimate aim and were ultimately incompatible with the applicant’s privacy rights.[200]

B ‘Unlawfully’ Internal Limitation

In relation to the construction of ‘unlawfully’ in s 13, the United Nations Human Rights Committee noted in its commentary on the very similar art 17 of the ICCPR that a law limiting the privacy right should ideally meet certain minimum requirements of precision and clarity: ‘relevant legislation must specify in detail the precise circumstances in which such interferences may be permitted’.[201] However, such an approach has not been clearly adopted by the Victorian judiciary.

Re Kracke and Mental Health Review Board (‘Kracke’) was an early Victorian Civil and Administrative Tribunal case in which Bell J sought to apply relevant international jurisprudence to construe ‘unlawfully’ in the Victorian Charter. While Kracke focused most heavily on the ‘under law’ requirement of the general limitations clause in s 7(2) of the Victorian Charter, Bell J drew a parallel between that requirement and the references to unlawfulness elsewhere under the Victorian Charter including s 13, indicating that the same considerations applied in each case.[202] His Honour summarised the applicable European cases on art 8 of the ECHR:[203] Malone v United Kingdom (‘Malone’)[204] and Olsson v Sweden (‘Olsson’).[205] Having regard to Momcilovic, it should be noted that these cases relate to a textually different provision and a different jurisprudential framework, potentially giving them lower weight in a Victorian context.[206]

Malone concerned the application of art 8 of the ECHR to wiretapping.[207] In considering whether the wiretapping was ‘in accordance with the law’, the European Court of Human Rights held that ‘the law must indicate the scope of any such discretion conferred on the competent authorities and the manner of its exercise with sufficient clarity ... to give the individual adequate protection against arbitrary interference’.[208] Olsson also considered art 8 of the ECHR, citing Malone with approval and summarising the requirements of ‘in accordance with the law’ as requiring precision, quality of law and clarity of scope of any discretion.[209] These European authorities cited in Kracke are now relatively dated, but their approach to lawfulness remains current. The 2021 case of Varga v Slovakia concerned a complaint about state surveillance.[210] On the question of lawfulness in art 8 of the ECHR, the European Court of Human Rights reiterated its ‘settled’ case law, applying the same requirements outlined in Malone.[211]

Turning to New Zealand, Bell J cited the approach to lawfulness taken by McGrath J in applying the New Zealand Bill of Rights Act 1990 (NZ) (‘NZBORA’) in the Supreme Court of New Zealand case of R v Hansen (‘Hansen’):

To be prescribed by law, limits must be identifiable and expressed with sufficient precision ... The limits must be neither ad hoc nor arbitrary and their nature and consequences must be clear, although the consequences need not be foreseeable with absolute certainty.[212]

This approach also remains settled law in New Zealand. The High Court of New Zealand case of Borrowdale v Director-General of Health concerned the legality of the exercise of public health powers during the COVID-19 pandemic.[213] In a joint judgment, the Court accepted that the restrictive measures imposed by government interfered with human rights under NZBORA but inquired whether they were ‘prescribed by law’; applying the above passage from Hansen, the Court held that the measures were not ‘prescribed by law’.[214]

In Kracke, Bell J summarised these principles as follows: ‘the law concerned must be adequately accessible, not uncertain or vague and sufficiently precise to enable people to predict its application and foresee its consequences to a reasonable degree’.[215] For several years, this approach was considered by Victorian Charter commentators to form part of the applicable Victorian jurisprudence around s 13 unlawfulness.[216] But broader judicial support for this approach is rather thin, and recent decisions appear to have diverged from it.[217]

In the 2021 case of HJ v Independent Broad-Based Anti-Corruption Commission (‘HJ’), the Victorian Court of Appeal addressed the construction of ‘unlawfully’ by simply stating that it means infringing an applicable law.[218] The case concerned privacy interferences arising from the search and seizure of documents by the Independent Broad-Based Anti-Corruption Commission (‘IBAC’) under specific legislative provisions and search warrants.[219] The applicants conceded that IBAC had not acted unlawfully and the s 13 submissions turned instead on whether the action was arbitrary.[220] Accordingly, the Court’s discussion of lawfulness was brief.[221]

This narrow approach to unlawfulness was followed in Thompson. In relation to the ‘unlawfully’ limitation in s 13, the Thompson Court cited HJ and indicated that the test for whether a restriction was unlawful was simply this: ‘An “unlawful” interference with a person’s privacy for the purposes of s 13(a) of the [Victorian Charter] is one which infringes an applicable law.’[222] There was no discussion of the Kracke approach or the international and comparative jurisprudence around the need for a positive law of adequate quality. The prisoner respondent conceded that the urine testing was lawful, but argued that the strip searching was not.[223] The Court rejected this argument, finding that the respondent had not discharged his onus of proof in demonstrating that the strip searching was unlawful.[224]

The effect of this failure to engage with the international and comparative jurisprudence around the adequacy of laws is to significantly restrict the scope of protection offered by s 13. The implication of Thompson is that there is no need for Parliament to have considered the issue and supplied a legal basis of adequate quality; the mere absence of an ‘applicable law’ prohibiting the interference is sufficient. This approach greatly expands the scope of the s 13 ‘unlawfully’ limitation, without any of the usual transparency requirements of procedural protection or parliamentary accountability. It also increases the difficulty of an applicant’s task in discharging the onus of proof — to prove that the conduct is unlawful, they must identify a specific law prohibiting it, rather than simply establishing that the respondent’s legislative basis is of inadequate quality to limit the right. I have already indicated that there are relatively few legislative provisions prohibiting the use of personal information;[225] the effect of Thompson is that an applicant would most likely need to prove a contravention of the PDPA or an opacity law before the conduct would be considered unlawful for the purposes of s 13. This is such a challenging task that most applicants seeking to apply s 13 would be forced to prove instead that the conduct is arbitrary.

It is worth mentioning that this is not about statutory vagueness. The High Court clearly established in Brown v Tasmania that, unlike in the US, there is no doctrine of vagueness in Australian law that would invalidate a law for lack of clarity alone.[226] But the argument under s 32(1) and Kracke is not that a law permitting interference with privacy might be invalidated for lack of clarity, but that it might not be adequately clear and transparent to restrict a statutory right. In any event, Thompson allows a statutory right to be restricted by the mere absence of a law prohibiting the interference, which accords the right very little value. This approach is difficult to square with the principle of legality’s operation at common law (which requires a clear expression of intention to interfere with the right) and as codified by s 32(1) of the Victorian Charter. The Thompson approach also seems inconsistent with the well-accepted view that human rights drafting is entitled to more generous interpretation than other legislation, in order to give full meaning to the intended rights.[227]

C Applying the Privacy Right

Given the risks posed by new data uses and the duty of the public sector and the courts to maintain a careful balance between opacity and transparency approaches, Victorian courts have taken a wrong turn, greatly undermining the opacity protection of s 13. Without serious consideration of the consequences of such an approach, the Thompson Court departed from the settled ‘human rights’ approach to lawfulness taken in comparable jurisdictions. It is critical that Victorian courts grapple with the practical implications of this departure, especially given the accepted approach of construing human rights in the ‘broadest possible way’.[228] This is also true for Queensland and ACT courts, where there is sparse jurisprudence on the meaning of ‘unlawfully’ and Victorian authorities are likely to be highly persuasive.[229]

I have already noted that Castles encourages public servants, in discharging their s 38 obligation, to take a pragmatic approach which need not be closely tied to the latest jurisprudence.[230] So it remains open in their assessment of lawfulness to follow Kracke rather than Thompson and require a clear legislative basis of adequate clarity and parliamentary accountability.[231] This would more clearly demonstrate a proper consideration of the privacy right than following Thompson and would assist in upholding the government’s social licence for data sharing. Such an approach would be particularly appropriate when considering new data technologies associated with data sharing, such as public sector digital surveillance of various kinds and big data technologies including artificial intelligence initiatives. Arguably, this higher level of care is required to discharge the normative obligations connected with the introduction of novel data technologies.

VI Concluding Recommendations

Gellert and Gutwirth argue persuasively that one cannot assume that privacy and its associated social goods will be adequately protected solely by data protection legislation focused on transparency.[232] This has particular relevance for Australian information privacy laws with their strong focus on transparency tools. For Australians to be protected against some of the harms associated with new data technologies, a level of opacity protection will be required. In Victoria and other jurisdictions with statutory human rights protection, including at the federal level under a future Bill of Rights, this could be supplied by an appropriately construed and applied statutory privacy right.

The meaning of ‘unlawfully’ in s 13 of the Victorian Charter appears not to have been fully argued in HJ and Thompson by reference to Kracke and international jurisprudence. That meant the Court in HJ and Thompson did not clearly weigh the consequences of its approach, which has the practical effect of reducing s 13 privacy protection by greatly expanding the impact of its internal limitations. I have already noted Paterson’s comment that the s 13 privacy right is often raised in litigation but has not been determinative in practice, partly due to the courts’ approaches to its internal limitations.[233] In this age of big data technologies, it would be unfortunate for these internal limitations to be construed so broadly that the privacy right cannot provide a necessary overlay of opacity protection.

The analysis of this article confirms Paterson’s observation that ‘it is time to impose limitations on the extent to which other laws can override’ privacy laws.[234] Her useful suggestion is to emulate art 22(2)(b) of the GDPR, which only allows such an override where the overriding law provides suitable protections.[235] Following that model would require legislative amendment to the Victorian Charter, for which there seems to be little appetite.[236] But even without such amendment, courts have considerable latitude to apply existing principles of interpretation and avoid expanding the internal limitations of s 13 such that they swallow up the right itself. Courts should also be alert to giving effect to s 32(1) of the Victorian Charter by applying a codified principle of legality and not allowing statutory rights to be overridden by the broader legislative framework, except where Parliament evinces clear intent to do so, and accordingly takes democratic accountability for the outcome.

I have already noted that s 31 of the Victorian Charter contains a specific mechanism (with in-built protections) to override Victorian Charter rights.[237] In the absence of an art 22(2)(b) GDPR-style amendment, a s 31 override would be a more transparent and democratic method to enable a new public sector data technology or pattern of data sharing than the much less transparent use of expanded ‘lawfulness’ — provided ‘exceptional circumstances’ exist within the meaning of s 31. But wherever possible, clear articulation of an adequate legislative basis for data use (with appropriate protections) is preferable to any s 31 override. While the override power may ensure data use does not contravene the Victorian Charter, the articulation of a coherent legislative foundation ensures any data use complies with the Victorian Charter in a substantive sense, which is surely preferable. It is also likely to be more palatable in a political sense than an override of human rights protections.

Public servants also have a duty to uphold democratic accountability and social licence in their assessment of data sharing proposals. They can most readily achieve this outcome by seeking (wherever possible) to advocate for or identify a clear legislative basis for public sector data practices, especially for data sharing arising from new data technologies such as artificial intelligence.

Given the risk of harms to data subjects, courts and public servants in Victoria, Queensland and the ACT would be well advised to be mindful of the balance between opacity and transparency as they construe privacy rights. A more constrained approach to the s 13 internal limitations will better protect the residents of those jurisdictions in an age of big data and intrusive data practices, require more democratic accountability from Parliament, and assist in supporting the social value of privacy and the social licence around data sharing. This would allow human rights legislation to truly act as an extra layer of protection, potentially filling existing gaps in the information privacy legislative framework and futureproofing it against other deficits.


* PhD candidate at Deakin University and Head of Data Sharing at the Department of Transport and Planning, Victoria, overseeing driver and vehicle data. The views expressed in this article are my own and not necessarily those of the Department. I wish to thank my supervisory team of Shiri Krebs, Matthew Groves and Bruce Chen for their ongoing support and valuable input, and Dan Meagher, Ben Saunders, Jake Goldenfein and the anonymous reviewers for their helpful comments.

[1] See generally Brad Brown, Michael Chui and James Manyika, ‘Are You Ready for the Era of “Big Data”?’ [2011] (4) McKinsey Quarterly 24.

[2] Department of the Prime Minister and Cabinet, Australian Data Strategy: The Australian Government’s Whole-of-Economy Vision for Data (Report, 2021) 17 <https://www.finance.gov.au/sites/default/files/2022-10/australian-data-strategy.pdf>.

[3] Victorian Government, ‘Victorian Public Sector Data Sharing Policy’, DataVic (Web Page, 14 September 2021) <https://www.data.vic.gov.au/victorian-public-sector-data-sharing-policy>, archived at <https://perma.cc/AJS6-L3D8>.

[4] Productivity Commission (Cth), Data Availability and Use (Inquiry Report No 82, 31 March 2017) 117–18 <https://www.pc.gov.au/inquiries/completed/data-access/report/data-access.pdf>, archived at <https://perma.cc/XC3J-ZX8P>.

[5] See Moira Paterson and Maeve McDonagh, ‘Data Protection in an Era of Big Data: The Challenges Posed by Big Personal Data’ [2018] MonashULawRw 1; (2018) 44(1) Monash University Law Review 1, 1, 5–6.

[6] Office of the Australian Information Commissioner, Data Breach Preparation and Response: A Guide to Managing Data Breaches in Accordance with the Privacy Act 1988 (Cth) (Guide, July 2019) 8 <https://www.oaic.gov.au/__data/assets/pdf_file/0017/1691/data-breach-preparation-and-response.pdf>, archived at <https://perma.cc/QQ3A-E6DR>.

[7] Ibid.

[8] See, eg, Kate Galloway, ‘Big Data, Government, Privacy and Human Rights’ in Paula Gerber and Melissa Castan (eds), Critical Perspectives on Human Rights Law in Australia (Thomson Reuters, 2021) vol 2, 357, 359–61.

[9] See, eg, Privacy Act 1988 (Cth) sch 1 (‘Australian Privacy Principles’); Privacy and Data Protection Act 2014 (Vic) sch 1 (‘Information Privacy Principles’).

[10] See Privacy Act 1988 (Cth) s 6 (definition of ‘personal information’) (‘Privacy Act’); Privacy and Data Protection Act 2014 (Vic) s 3 (definition of ‘personal information’) (‘PDPA’).

[11] See Moira Paterson, ‘The Uses of AI in Government Decision-Making: Identifying the Legal Gaps in Australia’ (2020) 89(4) Mississippi Law Journal 647, 660–1 (‘The Uses of AI’).

[12] See Inge Graef and Bart van der Sloot, ‘Collective Data Harms at the Crossroads of Data Protection and Competition Law: Moving beyond Individual Empowerment’ (2022) 33(4) European Business Law Review 513, 514.

[13] Paterson, ‘The Uses of AI’ (n 11) 658–61, 664–5.

[14] See generally Australian Human Rights Commission, Free & Equal: A Human Rights Act for Australia (Position Paper, December 2022). In March 2023, the Commonwealth Attorney-General initiated a parliamentary inquiry into the adequacy of the Australian human rights framework, to be reported by March 2024: ‘Inquiry into Australia’s Human Rights Framework’, Parliament of Australia (Web Page) <https://www.aph.gov.au/Parliamentary_Business/Committees/Joint/Human_Rights/HumanRightsFramework>, archived at <https://perma.cc/8KX9-V9KG>.

[15] Galloway (n 8) argues that a legal right to privacy is essential to protect Australians against harms from government use of big data, suggesting that such legislation could provide a foundational privacy standard which would be a ‘technology-neutral approach to upholding individual rights to data protection, and therefore protection of individuals themselves’: at 378. See also at 357–8.

[16] See Priscilla M Regan, Legislating Privacy: Technology, Social Values, and Public Policy (University of North Carolina Press, 1995) 5–7 (‘Legislating Privacy’).

[17] Paterson and McDonagh (n 5) 3–4.

[18] Office of the Australian Information Commissioner, Guide to Data Analytics and the Australian Privacy Principles (Guide, March 2018) 6 <https://www.oaic.gov.au/privacy/guidance-and-advice?a=3086>, archived at <https://perma.cc/JG4T-ALFE> (‘Guide to Data Analytics’).

[19] Ibid 3, 8.

[20] David Gewirtz, ‘Volume, Velocity, and Variety: Understanding the Three V’s of Big Data’, ZDNet (Web Page, 21 March 2018) <https://www.zdnet.com/article/volume-velocity-and-variety-understanding-the-three-vs-of-big-data/>, archived at <https://perma.cc/S2MQ-XLYC>.

[21] Paterson and McDonagh (n 5) 2–4.

[22] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119/1 art 4(1) (‘GDPR’). The GDPR (n 22) refers to a ‘data subject’ as ‘an identified or identifiable natural person’ to whom any information relates.

[23] Organization for Economic Co-Operation and Development, Data-Driven Innovation: Big Data for Growth and Well-Being (Report, 2015) 17 <https://read.oecd-ilibrary.org/science-and-technology/data-driven-innovation_9789264229358-en#page1>, archived at <https://perma.cc/Y35X-5ZSS> (‘Data-Driven Innovation’).

[24] Productivity Commission (Cth) (n 4) 108–14. It is worth noting that the Australian Productivity Commission’s methodology depended heavily on stakeholder submissions and interviews, not original inquiry: at 393.

[25] Data-Driven Innovation (n 23) 41–2.

[26] Office of the Australian Information Commissioner, Australian Community Attitudes to Privacy Survey 2020 (Report, September 2020) 27–8 <https://www.oaic.gov.au/engage-with-us/research/australian-community-attitudes-to-privacy-survey-2020-landing-page>, archived at <https://perma.cc/N8PU-8DMH>.

[27] Ibid 27.

[28] Ibid 20.

[29] Ibid 21.

[30] Australian Competition and Consumer Commission, Digital Platforms Inquiry (Final Report, June 2019) 384–93, 442–8 <https://www.accc.gov.au/system/files/Digital%20platforms%20inquiry%20-%20final%20report.pdf>, archived at <https://perma.cc/SM3T-AUB7>.

[31] Daniel J Solove, ‘A Taxonomy of Privacy’ (2006) 154(3) University of Pennsylvania Law Review 477, 487–9.

[32] Ibid 488–9.

[33] Paterson and McDonagh (n 5) 3–4. See also Privacy Act (n 10) s 6 (definition of ‘sensitive information’ paras (a)–(b)). For definitions of ‘data integration’, ‘data matching’ and ‘data mining’, see Guide to Data Analytics (n 18) 7.

[34] One notorious example concerns the chain store Target, which accurately determined that a teenage girl was pregnant based on her purchase of certain products and then proceeded to send her coupons for baby products. Her father was incensed and complained, but later discovered that Target was aware of her condition before he was: Charles Duhigg, ‘How Companies Learn Your Secrets’ (16 February 2012) The New York Times Magazine, cited in Guide to Data Analytics (n 18) 34.

[35] See, eg, Department of the Prime Minister and Cabinet (n 2) 8; Services Australia, Modernising Our Services (Report, 23 November 2022) 4 <https://www.servicesaustralia.gov.au/sites/default/files/2022-09/modernising-our-services.pdf>. In relation to the United States public sector, it is said that ‘[c]itizens are accustomed to the experiences offered by companies from Amazon to Zillow and now want the same from governments’: Tony D’Emidio et al, ‘The Public Sector Gets Serious about Customer Experience: 8 Things You Should Know about Customer Experience in the Public Sector’ [2019] (3) McKinsey Quarterly 16, 17.

[36] See generally Bruce Baer Arnold, ‘I’ve Given Out My Medicare Number: How Worried Should I Be about the Latest Optus Data Breach?’, The Conversation (online, 29 September 2022) <https://theconversation.com/ive-given-out-my-medicare-number-how-worried-should-i-be-about-the-latest-optus-data-breach-191575>, archived at <https://perma.cc/X337-GENF>.

[37] Clancy Yeates, ‘“A Gift to Telstra”: Cyberattack To Hit Optus’ Reputation’, The Age (online, 30 September 2022) <https://www.theage.com.au/business/companies/a-gift-to-telstra-cyberattack-to-hit-optus-reputation-20220928-p5blmq.html>, archived at <https://perma.cc/N3HS-YY7W>.

[38] See Galloway (n 8) 372.

[39] Ibid 361.

[40] Galloway (n 8) discusses these risks in the context of aggregation of government data and elimination of silos: at 369–70, 372.

[41] See, eg, Public Records Act 1973 (Vic) ss 2(1) (definition of ‘public office’), 12–13.

[42] For example, Victorian driver licence records are required to be retained for 85 years: Keeper of Public Records (Vic), Retention and Disposal Authority for Records of Vehicle Registration and Driver Licensing (PROS 09/08 VAR 3, 23 December 2009) 14.

[43] Data-Matching Program (Assistance and Tax) Act 1990 (Cth) ss 6, 10.

[44] ‘Centrelink Data Matching Activities’, Services Australia (Web Page, 23 October 2023) <https://www.servicesaustralia.gov.au/centrelink-data-matching-activities>, archived at <https://perma.cc/ST3P-674Z>.

[45] Royal Commission into the Robodebt Scheme (Report, 7 July 2023) vol 1, v, xxiv, xxix. See also David Braue, ‘Robodebt a “Massive Failure” of Government Automation: Federal Court Slams Shameful Chapter in Welfare Legacy’, ACS Information Age (online, 15 June 2021) <https://ia.acs.org.au/article/2021/robodebt-a--massive-failure--of-government-automation.html>. The letters patent establishing the Royal Commission into the Robodebt Scheme identifies the establishment, design and implementation of the scheme as an appropriate subject of the inquiry: Letters Patent for the Royal Commission into the Robodebt Scheme from Mark Dreyfus, Attorney-General to Catherine Ena Holmes, 18 August 2022, 2 <https://robodebt.royalcommission.gov.au/publications/letters-patent>, archived at <https://perma.cc/R67L-JCYL>.

[46] Adamantia Rachovitsa and Niclas Johann, ‘The Human Rights Implications of the Use of AI in the Digital Welfare State: Lessons Learned from the Dutch SyRI Case’ (2022) 22(2) Human Rights Law Review 1, 1–3.

[47] Netherlands Juristen Comité voor de Mensenrechten v Netherlands (Hague District Court, No C/09/550982 / HA ZA 18-388, 5 February 2020) [6.7] <https://uitspraken.rechtspraak.nl/#!/details?id=ECLI:NL:RBDHA:2020:1878>, archived at <https://perma.cc/W2ST-H7SZ>, citing Wet structuur uitvoeringsorganisatie werk en inkomen [Work and Income Implementation Organisation Structure Act] (Netherlands) 29 November 2001, art 65, as at 9 October 2013. See generally ibid for an explanation of the case and its consequences.

[48] Michael K McCall, Margaret M Skutsch and Jordi Honey-Roses, ‘Surveillance in the COVID-19 Normal: Tracking, Tracing, and Snooping’ (2021) 10(2) International Journal of E-Planning Research 27, 28–35.

[49] See, eg, Serena Seyfort, ‘Victorian Government Assures Public COVID-19 Contact Tracing Data Can Only Be Shared in “Life or Death Matters”’, 9News (online, 28 December 2021) <https://www.9news.com.au/national/coronavirus-victoria-update-state-government-assures-public-covid-19-contact-tracing-data-can-only-be-shared-in-life-or-death-matters/ae18c5a5-25ec-42ec-abcf-ef9b21ebd027>, archived at <https://perma.cc/BGW4-YTX2>. See also ‘COVID-19 Check-In Apps and Privacy’, Office of the Australian Information Commissioner (Web Page) <https://www.oaic.gov.au/privacy/your-privacy-rights/more-privacy-rights/covid-19/covid-19-check-in-apps-and-privacy>, archived at <https://perma.cc/8MZJ-F3N8>.

[50] Jeremy Nadel, ‘AFP, Vic Police and Illion Requested Victorian QR Code Data’, iT News (online, 13 December 2021) <https://www.itnews.com.au/news/afp-vic-police-and-illion-request-victorian-qr-code-data-573905>, archived at <https://perma.cc/V49Y-7EX8>.

[51] Public Health and Wellbeing Act 2008 (Vic) pt 8A div 8, as inserted by Public Health and Wellbeing Amendment (Pandemic Management) Act 2021 (Vic) s 12. This amendment was made as part of the Victorian government’s commitment to parliamentary cross-benchers in securing an extension of the pandemic state of emergency: Daniel Andrews, Premier of Victoria, ‘Laws To Enshrine Safe and Clear Pandemic Responses’ (Media Release, 26 October 2021) <https://www.premier.vic.gov.au/laws-enshrine-safe-and-clear-pandemic-responses>, archived at <https://perma.cc/3DZH-2TUM>.

[52] Productivity Commission (Cth) (n 4) 12.

[53] Australian Law Reform Commission, Serious Invasions of Privacy in the Digital Era (Final Report No 123, June 2014) 51 [3.50] <https://www.alrc.gov.au/wp-content/uploads/2019/08/final_report_123_whole_report.pdf>, archived at <https://perma.cc/7MB8-XE2V> (‘Serious Invasions of Privacy’).

[54] Ibid 51–3 [3.50].

[55] Australian Human Rights Commission, Human Rights and Technology (Final Report, 1 March 2021) 119–20 <https://humanrights.gov.au/our-work/rights-and-freedoms/publications/human-rights-and-technology-final-report-2021>, archived at <https://perma.cc/2PQN-HJW8>.

[56] Ibid 121–3.

[57] Ibid 122 n 673, quoting Privacy Commissioner v Telstra Corporation Ltd [2017] FCAFC 4; (2017) 249 FCR 24, 36 [63] (Kenny and Edelman JJ, Dowsett J agreeing at 25 [3]).

[58] Australian Competition and Consumer Commission (n 30) 3.

[59] Taylor argues that extending privacy protection to groups is essential to effectively protect people from harm within a modern information society in which data relating to multiple persons (group data) will be the data ‘that fundamentally reshapes the conditions under which future generations live and work’: Mark J Taylor, ‘“Personal Information” and Group Data under the Privacy Act 1988 (Cth)’ (2020) 94(10) Australian Law Journal 730, 740.

[60] Paterson, ‘The Uses of AI’ (n 11) 658–60, 665.

[61] Attorney-General’s Department (Cth), Privacy Act Review (Report, 2022) <https://www.ag.gov.au/sites/default/files/2023-02/privacy-act-review-report_0.pdf> (‘Privacy Act Review’). Recommendations include a proposed broader definition of ‘personal information’ in s 6 of the Privacy Act (n 10), replacing the restrictive ‘about an identified individual’ with the broader ‘relates to an identified individual’: Privacy Act Review (n 61) 23–7. However, the Privacy Act Review (n 61) appears to contain no proposals to extend privacy protection to groups rather than individuals, move away from the focus on ‘purpose’ of use or limit legislative overrides of privacy protection.

[62] Maria O’Sullivan, ‘The Privacy Act Review Report 2022: A Radical Review or Just a Re-Imagining?’ (2023) 51(1) Australian Business Law Review 52, 55.

[63] Australian Government, Government Response: Privacy Act Review Report (Report, 28 September 2023) 2, 4 <https://www.ag.gov.au/sites/default/files/2023-09/government-response-privacy-act-review-report.PDF>. See also ibid 52.

[64] [1937] HCA 45; (1937) 58 CLR 479, 495–6 (Latham CJ), 508 (Dixon J), 526–7 (McTiernan J).

[65] (2001) 208 CLR 199, 328 [335]. See also at 248–9 [106]–[107] (Gummow and Hayne JJ, Gaudron J agreeing at 231 [58]).

[66] In the County Court of Victoria, Judge Hampel found that the publication of the complainant’s personal information by the Australian Broadcasting Corporation constituted a tortious action of invasion of privacy: Doe v Australian Broadcasting Corporation [2007] VCC 281, [157]. A common law right of privacy was also identified in the Queensland District Court case Grosse v Purvis (2003) Aust Torts Reports 81–706, 64187 [446] (Skoien J).

[67] Farm Transparency International Ltd v New South Wales [2022] HCA 23; (2022) 403 ALR 1, 11 [39] (Kiefel CJ and Keane J), 22–3 [90] (Gageler J), 37–8 [159] (Gordon J), 55–6 [233]–[238] (Edelman J). See also Warren CJ’s comment that ‘the question of whether such a [privacy] right exists at common law, and if so, its scope, is yet to be settled by the High Court or a superior court of record’: WBM v Chief Commissioner of Police [2012] VSCA 159; (2012) 43 VR 446, 465 [81] (Hansen JA agreeing at 475 [133]). For a brief discussion of the common law right, see Victorian Law Reform Commission, Surveillance in Public Places (Final Report No 18, May 2010) 128–30 [7.7]–[7.15]; Australian Law Reform Commission, For Your Information: Australian Privacy Law and Practice (Final Report No 108, May 2008) vol 3, 2250–2 [74.61]–[74.69] <https://www.alrc.gov.au/wp-content/uploads/2019/08/108_vol3.pdf>, archived at <https://perma.cc/9D77-EQHE>.

[68] See, eg, Privacy Act Review (n 61) 37 [4.4.1] n 207; Organization for Economic Co-Operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (Guidelines, 23 September 1980) pt 2 <https://www.oecd-ilibrary.org/oecd-guidelines-on-the-protection-of-privacy-and-transborder-flows-of-personal-data_5lmqcr2k94s8.pdf?>, archived at <https://perma.cc/YK2V-DAWY> (‘Guidelines on the Protection of Privacy’).

[69] Privacy Act (n 10) ss 6 (definitions of ‘APP entity’ and ‘contracted service provider’), 6A(2), 6C(1), (3), 6D(1), (4)(e), 13(1), (3), 15. If a state government contractor is operating under an outsourcing arrangement, it will usually be covered by the PDPA (n 10) to the extent of that arrangement, and by the Privacy Act (n 10) for its internal business activities: see PDPA (n 10) ss 3 (definitions of ‘contracted service provider’ and ‘State contract’), 17(2).

[70] Information Privacy Act 2000 (Vic) (‘Information Privacy Act’), as repealed by PDPA (n 10) s 126. See also Victoria, Parliamentary Debates, Legislative Assembly, 12 June 2014, 2106–9 (Robert Clark, Attorney-General). The data security protections were added in part to address issues identified by the Victorian Auditor-General in a highly critical 2009 report which found that public sector information in Victoria was poorly protected with inadequate data sharing practices: Victorian Auditor-General’s Office, Maintaining the Integrity and Confidentiality of Personal Information (Parliamentary Paper No 258 of 2006–09, November 2009) vii–viii, 22–4 <https://www.audit.vic.gov.au/sites/default/files/20091125-Data-Integrity-Full-Report.pdf?>, archived at <https://perma.cc/U4QS-CPH8>.

[71] See John Howard, Prime Minister, ‘Privacy Legislation’ (Media Release, 21 March 1997) <https://pmtranscripts.pmc.gov.au/sites/default/files/original/00010278.pdf>.

[72] See Information Privacy Act (n 70) ss 5, 9(1); PDPA (n 10) ss 1(a), (d), 13.

[73] Information Privacy Act (n 70) sch 1; Information Privacy Principles (n 9); Australian Privacy Principles (n 9).

[74] Information Privacy Principles (n 9) cl 1; Information Privacy Act (n 70) sch 1 cl 1.

[75] Information Privacy Principles (n 9) cl 2; Information Privacy Act (n 70) sch 1 cl 2.

[76] Information Privacy Principles (n 9) cls 2.1(a)(i), 3, 4, 6, 10; Information Privacy Act (n 70) sch 1 cls 2.1(a)(i), 3, 4, 6, 10.

[77] PDPA (n 10) ss 84(1), 86.

[78] See Office of the Victorian Information Commissioner, IPP 2: Use and Disclosure (Guidelines, 4th ed, 14 November 2019) 4–5 [2.11]–[2.14] <https://ovic.vic.gov.au/wp-content/uploads/2019/11/IPP-2-2019.B.pdf>, archived at <https://perma.cc/7W6R-N3TM>.

[79] PDPA (n 10) s 3 (definition of ‘personal information’).

[80] See, eg, Mental Health and Wellbeing Act 2022 (Vic) pt 9.8; Road Safety Act 1986 (Vic) pt 7B; Marine Safety Act 2010 (Vic) pt 8.8A.

[81] See, eg, Judicial Proceedings Reports Act 1958 (Vic) s 4(1A); Gambling Regulation Act 2003 (Vic) s 10.1.34; Health Records Act 2001 (Vic) s 90; Independent Broad-Based Anti-Corruption Commission Act 2011 (Vic) s 44.

[82] Human Rights Act 2004 (ACT) (‘ACT HRA’); Human Rights Act 2019 (Qld) (‘Qld HRA’).

[83] Moira Paterson, ‘Privacy Rights and Charter Rights’ in Matthew Groves and Colin Campbell (eds), Australian Charters of Rights a Decade on (Federation Press, 2017) 203, 206.

[84] Charter of Human Rights and Responsibilities Act 2006 (Vic) (‘Victorian Charter’). The ACT HRA (n 82), which commenced in 2004, provided a legislative model for human rights legislation in Australian jurisdictions: see endnote 3.

[85] Explanatory Memorandum, Charter of Human Rights and Responsibilities Bill 2006 (Vic) 1, 7 (‘Charter Explanatory Memorandum’), discussing International Covenant on Civil and Political Rights, opened for signature 16 December 1966, 999 UNTS 171 (entered into force 23 March 1976) (‘ICCPR’). See Victorian Charter (n 84) pt 2.

[86] Michael Brett Young, From Commitment to Culture: The 2015 Review of the Victorian Charter of Human Rights and Responsibilities Act 2006 (Parliamentary Paper No 77 of 2014–15, September 2015) 22–3 (‘2015 Review of the Victorian Charter’). The second reading speech stated that ‘[t]his bill seeks to achieve a rights respecting culture across government and the community’: Victoria, Parliamentary Debates, Legislative Assembly, 4 May 2006, 1295 (Rob Hulls, Attorney-General).

[87] Victorian Charter (n 84) Preamble.

[88] See, eg, Re Kracke and Mental Health Review Board (2009) 29 VAR 1, 125 [590]–[591] (Bell J) (‘Kracke’).

[89] Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953) art 8 (‘ECHR’).

[90] See Thompson v Minogue [2021] VSCA 358; (2021) 67 VR 301, 316 [44] (Kyrou, McLeish and Niall JJA) (‘Thompson’). The unlawfulness must be independent of the Victorian Charter (n 84) and not merely because of s 38: Burgess v Director of Housing [2014] VSC 648, [219] (Macaulay J).

[91] Paterson, ‘Privacy Rights and Charter Rights’ (n 83) 225–6.

[92] Alistair Pound and Kylie Evans, Annotated Victorian Charter of Rights (Lawbook, 2nd ed, 2019) 115 [13.40].

[93] Thompson (n 90) 321 [74] (Kyrou, McLeish and Niall JJA).

[94] Victorian Charter (n 84) ss 31(3)–(4).

[95] Ibid s 31(7). See also ss 435 of the Qld HRA (n 82), but note that no such override provision exists in the ACT HRA (n 82).

[96] Julie Debeljak, ‘Of Parole and Public Emergencies: Why the Victorian Charter Override Provision Should Be Repealed’ [2022] UNSWLawJl 19; (2022) 45(2) University of New South Wales Law Journal 570, 571.

[97] Ibid 571 n 3. It is worth noting that the two prisoner overrides arguably did not meet the ‘exceptional circumstances’ threshold and the sunset provision was disapplied in both cases: at 593–4, 606, 610–11. The 2015 Review of the Victorian Charter (n 86) recommended the repeal of s 31: at 196–200.

[98] Charter Explanatory Memorandum (n 85) 23.

[99] (2011) 245 CLR 1 (‘Momcilovic’).

[100] Ibid 50 [51].

[101] Ibid 46 [43].

[102] Bruce Chen, ‘Revisiting Section 32(1) of the Victorian Charter: Strained Constructions and Legislative Intention’ [2020] MonashULawRw 6; (2020) 46(1) Monash University Law Review 174, 211–12.

[103] [1994] HCA 15; (1994) 179 CLR 427, 437 (Mason CJ, Brennan, Gaudron and McHugh JJ, Deane and Dawson JJ agreeing at 446).

[104] [1987] HCA 12; (1987) 162 CLR 514, 523, quoted in ibid.

[105] Bruce Chen, ‘The Principle of Legality and s 32(1) of the Victorian Charter: Is the Latter a Codification of the Former?’ (2020) 31(4) Public Law Review 444, 448 (‘The Principle of Legality’); Robert French, ‘The Principle of Legality and Legislative Intention’ (2019) 40(1) Statute Law Review 40, 40.

[106] Dan Meagher, ‘The Common Law Principle of Legality in the Age of Rights’ (2011) 35(2) Melbourne University Law Review 449, 455.

[107] [1999] UKHL 33; [2000] 2 AC 115, 131.

[108] Chen, ‘The Principle of Legality’ (n 105) 445. See, eg, Nigro v Secretary, Department of Justice [2013] VSCA 213; (2013) 41 VR 359, 383 [85] (Redlich, Osborn and Priest JJA).

[109] Chen, ‘The Principle of Legality’ (n 105) 450, 453.

[110] Ibid 462.

[111] Momcilovic (n 99) 36–7 [18]–[19] (emphasis added).

[112] Ibid 89–90 [155]–[159]. Justice Gummow raised particular reservations about the application in Australia of House of Lords decisions under the Human Rights Act 1998 (UK), given the UK’s distinctly different legal framework: at 84 [146], 88 [151], 90 [160].

[113] [2015] VSCA 197; (2015) 48 VR 129, 189 [186] (‘Bare’), discussing Momcilovic (n 99) 37–8 [19]–[20] (French CJ), 83–9 [146], 90 [159] (Gummow J).

[114] See Momcilovic (n 99) 36–7 [18] (French CJ).

[115] ‘Public authority’ is defined in the Victorian Charter (n 84) to include any Victorian public servant and a range of other bodies encompassing the Victorian public sector: at s 4(1).

[116] Note that under s 58(1) of the Qld HRA (n 82), the obligation to proceed compatibly with human rights applies to both acts and decisions. Under the Victorian Charter (n 84), it applies only to actions; the conduct obligation for decisions is to ‘give proper consideration to a relevant human right’: at s 38(1).

[117] [2010] VSC 310; (2010) 28 VR 141, 184 [185].

[118] See Paul Barnes et al, ‘The Victorian Charter of Human Rights and Public Policy: An Exploration of Decision-Making Processes’ (ANZSOG Work-Based Project Reports No 2, November 2023) 12–16 <https://anzsog.edu.au/app/uploads/2023/11/WPB2-Barnes-et-al-2022-FINAL.pdf>, archived at <https://perma.cc/5974-JLJU>. No particular format is required for a Victorian Charter (n 84) assessment; agencies may develop it as a memo, a briefing note, or a legal advice: see PJB v Melbourne Health [2011] VSC 327; (2011) 39 VR 373, 442 [311] (Bell J) (‘PJB’).

[119] If the conduct is unlawful, it cannot be excused on the basis that it is reasonable; the balancing under s 7(2) only applies if the privacy-limiting conduct is considered arbitrary but not unlawful: Thompson (n 90) 318–19 [58] (Kyrou, McLeish and Niall JJA).

[120] There is a guide and template for privacy impact assessments: Office of the Victorian Information Commissioner, Privacy Impact Assessment Guide: Guide for Completing OVIC’s Template (Guide, April 2021) <https://ovic.vic.gov.au/privacy/privacy-impact-assessment/>, archived at <https://perma.cc/92L2-WKS3>.

[121] Galloway (n 8) 365, citing Julie E Cohen, ‘What Privacy Is for’ (2013) 126(7) Harvard Law Review 1904, 1906.

[122] Neil M Richards, ‘The Dangers of Surveillance’ (2013) 126(7) Harvard Law Review 1934, 1945–6; Volker Boehme-Neßler, ‘Privacy: A Matter of Democracy’ (2016) 6(3) International Data Privacy Law 222, 227–8; Jeffrey H Reiman, ‘Privacy, Intimacy, and Personhood’ (1976) 6(1) Philosophy and Public Affairs 26, 39–40.

[123] Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford Law Books, 2009) 86–7, citing Regan, Legislating Privacy (n 16). An example of state support for privacy protection is the secret ballot, which was initially controversial but is now accepted as a key element of citizen participation in democracy: Annabelle Lever, On Privacy (Routledge, 2012) 23–4.

[124] Urs Gasser, ‘Recoding Privacy Law: Reflections on the Future Relationship among Law, Technology, and Privacy’ (2016) 130(2) Harvard Law Review Forum 61, 61.

[125] Oliver Diggelmann and Maria Nicole Cleis, ‘How the Right to Privacy Became a Human Right’ (2014) 14(3) Human Rights Law Review 441, 445.

[126] Samuel D Warren and Louis D Brandeis, ‘The Right to Privacy’ (1890) 4(5) Harvard Law Review 193, 205.

[127] As an example, the Fourth Amendment to the United States Constitution emerged in response to the impacts of ‘general warrants’ in England and the new American colonies, which allowed the state to search private property without probable cause or specificity: Priscilla H Machado Zotti, Injustice for All: Mapp vs Ohio and the Fourth Amendment, ed David A Shultz (Peter Lang, 2005) 50–60.

[128] Megan Richardson, Advanced Introduction to Privacy Law (Edward Elgar Publishing, 2020) 12.

[129] Solove, ‘A Taxonomy of Privacy’ (n 31) 477.

[130] Raphaël Gellert and Serge Gutwirth, ‘Beyond Accountability, the Return to Privacy?’ in Daniel Guagnin et al (eds), Managing Privacy through Accountability (Palgrave Macmillan, 2012) 261, 261.

[131] James Rachels, ‘Why Privacy Is Important’ (1975) 4(4) Philosophy and Public Affairs 323, 326.

[132] Robert S Gerstein, ‘Intimacy and Privacy’ (1978) 89(1) Ethics 76, 76.

[133] Alan F Westin, Privacy and Freedom (Atheneum, 1967) 7.

[134] Roger Clarke, ‘Privacy and Social Media: An Analytical Framework’ (2014) 23(1) Journal of Law, Information and Science 169, 174.

[135] See, eg, Judith Jarvis Thomson, ‘The Right to Privacy’ (1975) 4(4) Philosophy and Public Affairs 295, 313.

[136] Catharine A MacKinnon, Toward a Feminist Theory of the State (Harvard University Press, 1989) 193–4.

[137] Bert-Jaap Koops et al, ‘A Typology of Privacy’ (2017) 38(2) University of Pennsylvania Journal of International Law 483, 484.

[138] Megan Richardson, ‘A Common Law of Privacy?’ [2021] (March) Singapore Journal of Legal Studies 6, 6.

[139] Prince Albert v Strange [1849] EngR 669; (1849) 1 Mac & G 25; 41 ER 1171, 1178 (Cottenham LC).

[140] Warren and Brandeis (n 126) 215.

[141] Duchess of Sussex v Associated Newspapers Ltd [2021] EWHC 273; [2021] 4 WLR 35, [170] (Warby J), affd Duchess of Sussex v Associated Newspapers Ltd [2022] 4 WLR 81, [104]–[106] (Vos MR, Sharp P agreeing at [107], Bean LJ agreeing at [108]).

[142] Universal Declaration of Human Rights, GA Res 217A (III), UN GAOR, UN Doc A/810 (10 December 1948).

[143] Ibid art 12; ICCPR (n 85) art 17.

[144] Paul De Hert and Serge Gutwirth, ‘Privacy, Data Protection and Law Enforcement: Opacity of the Individual and Transparency of Power’ in Erik Claes, Antony Duff and Serge Gutwirth (eds), Privacy and the Criminal Law (Intersentia, 2006) 61, 77–8.

[145] Guidelines on the Protection of Privacy (n 68) paras 15–18.

[146] Justice Michael Kirby, ‘Privacy Protection, a New Beginning: OECD Principles 20 Years on’ (1999) 6(3) Privacy Law and Policy Reporter 25, 25.

[147] De Hert and Gutwirth (n 144) 62, 78.

[148] Ibid 70.

[149] See above n 68 and accompanying text.

[150] Privacy Act (n 10) s 6 (definition of ‘sensitive information’); Australian Privacy Principles (n 9) cl 3.3; Information Privacy Principles (n 9) cl 10.1. The definition of ‘sensitive information’ is set out at the beginning of the Information Privacy Principles (n 9).

[151] Serious Invasions of Privacy (n 53) 30 [2.7]; Jurecek v Director, Transport Safety Victoria [2016] VSC 285; (2016) 260 IR 327, 345 [67] (Bell J) (‘Jurecek’).

[152] De Hert and Gutwirth (n 144) 74.

[153] Charter of Fundamental Rights of the European Union [2000] OJ C 364/1, art 8. Within the European Union, data protection was legislated in the Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data [1995] OJ L 281/31. This was replaced by the GDPR (n 22) for all European Union member states effective on 25 May 2018: at art 94.

[154] European Court of Human Rights, Guide to the Case-Law of the European Court of Human Rights: Data Protection (Report, 31 August 2022) 31 [123] <https://echr.coe.int/Documents/Guide_Data_protection_ENG.pdf>, archived at <https://perma.cc/Y4T2-4X6K>.

[155] Gellert and Gutwirth (n 130) 268. In the UK, the courts consider it settled law that art 8 of the ECHR (n 89) only has application where ‘the person in question had a reasonable expectation of privacy’: Catt v Commissioner of Police of the Metropolis [2012] HRLR 23, 605 [24] (Gross LJ, Irwin J agreeing at 615 [68]), quoting Campbell v MGN Ltd [2004] UKHL 22; [2004] 2 AC 457, 466 [21] (Lord Nicholls). See also R v Association of Chief Police Officers of England, Wales and Northern Ireland [2015] UKSC 9; [2015] AC 1065, 1078 [4] (Lord Sumption JSC, Lord Neuberger PSC agreeing).

[156] De Hert and Gutwirth (n 144) 86–9. It is not clear that this critique continues to hold as strongly, given more recent cases in which the Court has recognised a contravention of art 8 ECHR (n 89) because a given use was not necessary in a democratic society. For example, Catt v United Kingdom (2019) 69 EHRR 7 (‘Catt’) concerned the retention of footage of the nonagenarian applicant at various peace protests: at 181–2 [5]–[6]. The Strasbourg Court recognised the police had a legitimate aim for the collection of the footage (though it expressed concern about the ambiguity of the legal basis), but went on to hold that the retention of the footage was not ‘necessary in a democratic society’ and violated art 8 of the ECHR (n 89): Catt (n 156) 200 [107], 200–4 [109]–[128].

[157] De Hert and Gutwirth (n 144) 87.

[158] Jurecek (n 151) may be considered to give some support for the application of the privacy right to personal data, but this is complicated by the fact that neither party filed to bring the matter under the Victorian Charter (n 84) — the case was ultimately decided by reference to the Information Privacy Act (n 70): Jurecek (n 151) 344 [62], 345 [65], 364 [174] (Bell J). See also Victorian Equal Opportunity and Human Rights Commission, The Charter of Human Rights and Responsibilities: A Guide for Victorian Public Sector Workers (Report, 2nd ed, June 2019) 30 <https://www.humanrights.vic.gov.au/static/e5f7d882b4ac4082403ad0f57606369c/Resource-Responsibilities-VPS_Charter_Guide.pdf>, archived at <https://perma.cc/A97H-N8B2>. This guide for public sector workers indicates a range of matters that would be covered by the Victorian Charter (n 84) s 13 privacy right, including data collection and use.

[159] See, eg, Ferdinand David Schoeman, Privacy and Social Freedom, ed Douglas MacLean (Cambridge University Press, 1992) 1–2; Regan, Legislating Privacy (n 16) 1–5. Schoeman and Regan were among the first to focus specifically on the social elements of privacy: Priscilla M Regan, ‘Privacy and the Common Good: Revisited’ in Beate Roessler and Dorota Mokrosinska (eds), Social Dimensions of Privacy: Interdisciplinary Perspectives (Cambridge University Press, 2015) 50, 53–5.

[160] Julie E Cohen, ‘Examined Lives: Informational Privacy and the Subject as Object’ (2000) 52(5) Stanford Law Review 1373, 1427–8 (‘Examined Lives’).

[161] Gellert and Gutwirth (n 130) 270.

[162] Galloway (n 8) 365, citing Cohen, ‘What Privacy Is for’ (n 121) 1906. See also Galloway’s comment that ‘at the heart of concerns about government use of big data is the intrusion of government into the lives of the individual beyond boundaries traditionally accepted in a liberal democracy’: Galloway (n 8) 364.

[163] See Daniel J Solove, Understanding Privacy (Harvard University Press, 2009) 99; Nissenbaum (n 123) 81–2, 176–7.

[164] Dorota Mokrosinska, ‘Privacy and Autonomy: On Some Misconceptions concerning the Political Dimensions of Privacy’ (2018) 37(2) Law and Philosophy 117, 127.

[165] Paterson and McDonagh (n 5) 7, 14–15.

[166] Stefano Rodotà, ‘Data Protection as a Fundamental Right’ in Serge Gutwirth et al (eds), Reinventing Data Protection? (Springer, 2009) 77, 78 (citations omitted).

[167] Antoinette Rouvroy and Yves Poullet, ‘The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy’ in Serge Gutwirth et al (eds), Reinventing Data Protection? (Springer, 2009) 45, 57.

[168] Gellert and Gutwirth (n 130) 274.

[169] Graef and van der Sloot (n 12) 513.

[170] Ibid 535.

[171] Social contract theory has a long history, focusing on the hypothetical agreement between citizens in a well-ordered society around the kind of society they wish to live in: Fred D’Agostino, Gerald Gaus and John Thrasher, ‘Contemporary Approaches to the Social Contract’, Stanford Encyclopedia of Philosophy (Web Page, 27 September 2021) <https://plato.stanford.edu/entries/contractarianism-contemporary/>, archived at <https://perma.cc/K8B4-J7PB>.

[172] Melanie Davies, ‘Understanding the Social Licence To Operate’, Griffith University Law Futures Centre Blog (Blog Post, 3 August 2021) <https://blogs.griffith.edu.au/law-futures-centre/2021/08/03/understanding-the-social-licence-to-operate/>, archived at <https://perma.cc/97RB-45VV>.

[173] See, eg, Productivity Commission (Cth) (n 4) for a discussion of social licence in connection with public sector data sharing: at 177–80.

[174] Ibid 176.

[175] Ibid 177–80.

[176] Pam Carter, Graeme T Laurie and Mary Dixon-Woods, ‘The Social Licence for Research: Why care.data Ran into Trouble’ (2015) 41(5) Journal of Medical Ethics 404, 408.

[177] Productivity Commission (Cth) (n 4) 178.

[178] Bernadette Richards and James Scheibner, ‘Health Technology and Big Data: Social Licence, Trust and the Law’ (2022) 29(2) Journal of Law and Medicine 388, 399.

[179] Ibid 389–90.

[180] See above Part II(B).

[181] Melissa de Zwart, Sal Humphreys and Beatrix van Dissel, ‘Surveillance, Big Data and Democracy: Lessons for Australia from the US and UK’ [2014] UNSWLawJl 27; (2014) 37(2) University of New South Wales Law Journal 713, 714, quoting David Lyon, Surveillance Studies: An Overview (Polity Press, 2007) 14.

[182] Solove, ‘A Taxonomy of Privacy’ (n 31) 488. See generally Jonathon W Penney, ‘Chilling Effects: Online Surveillance and Wikipedia Use’ (2016) 31(1) Berkeley Technology Law Journal 117.

[183] Richards (n 122) 1944–6; Cohen, ‘Examined Lives’ (n 160) 1425–6.

[184] See above Part II(A).

[185] de Zwart, Humphreys and van Dissel (n 181) 716.

[186] S v United Kingdom (2008) 5 Eur Court HR 167, 205 [112]. The Court made these comments in relation to the novel use of DNA technologies by UK law enforcement agencies.

[187] See, eg, Information Privacy Principles (n 9) cl 2.1(g); Australian Privacy Principles (n 9) cl 6.2(e).

[188] See above n 80 and accompanying text.

[189] See above n 150.

[190] Surveillance Devices Act 1999 (Vic) pt 2 (‘Surveillance Devices Act’).

[191] See above n 81.

[192] Surveillance Devices Act (n 190) pt 2.

[193] Secrecy provisions tend to have a very narrow scope and apply criminal penalties to the disclosure of specific information: see, eg, Judicial Proceedings Reports Act 1958 (Vic) s 4(1A); Gambling Regulation Act 2003 (Vic) s 10.1.34; Health Records Act 2001 (Vic) s 90; Independent Broad-Based Anti-Corruption Commission Act 2011 (Vic) s 44.

[194] Paterson, ‘Privacy Rights and Charter Rights’ (n 83) 225. See also at 213, 222, discussing PJB (n 118) 395 [85] (Bell J), and DPP (Vic) v Kaba [2014] VSC 52; (2014) 44 VR 526, 564 [132] (Bell J) (‘Kaba’).

[195] Paterson, ‘Privacy Rights and Charter Rights’ (n 83) 225.

[196] Thompson (n 90) 318 [57] (Kyrou, McLeish and Niall JJA). The Victorian Court of Appeal noted that the evidence necessary to establish that an interference is arbitrary or unlawful may be in the sole knowledge of the public authority, in which case the applicant may discharge the onus by pointing to objective circumstances which give rise to such an inference: at 316 [47].

[197] Charter Explanatory Memorandum (n 85) 13.

[198] Thompson (n 90) 306 [1]–[5] (Kyrou, McLeish and Niall JJA).

[199] Ibid 318 [55]. The Court added that an assessment of whether the conduct was arbitrary does not require consideration of the s 7(2) elements, just a ‘broad and general assessment’ of whether the interference goes beyond what is reasonably necessary to achieve the legitimate purpose: at 318 [56].

[200] Ibid 387–8 [317]–[318], 399 [360].

[201] Human Rights Committee, ‘General Comment Number 16: Article 17 (Right to Privacy)’ in Compilation of General Comments and General Recommendations Adopted by Human Rights Treaty Bodies, UN Doc HRI/GEN/1/Rev.9 (27 May 2008) vol 1, 192 [8].

[202] Kracke (n 88) 154 [746].

[203] Ibid 47–8 [178]–[181].

[204] (1984) 82 Eur Court HR (ser A) (‘Malone’).

[205] (1988) 130 Eur Court HR (ser A) (‘Olsson’).

[206] See Momcilovic (n 99) 36–7 [18]–[19] (French CJ), 89 [155] (Gummow J).

[207] Malone (n 204) 10–11 [12]–[18], 30 [62].

[208] Ibid 33 [68].

[209] Olsson (n 205) 30 [61], citing ibid 32 [67].

[210] (European Court of Human Rights, Chamber, Application No 58361/12, 20 July 2021) 2–4 [5]–[17].

[211] Ibid 30 [151]. This approach was affirmed in Haščák v Slovakia (European Court of Human Rights, Chamber, Application No 58359/12, 23 June 2022) 17 [87], [89].

[212] R v Hansen [2007] 3 NZLR 1, 62 [180], cited in Kracke (n 88) 51 [195].

[213] [2020] NZHC 2090; [2020] 2 NZLR 864, 871–2 [5]–[7] (Thomas, Venning and Ellis JJ) (‘Borrowdale’), discussing Health Act 1956 (NZ) s 70.

[214] Borrowdale (n 213) 908 [200], 912 [225]. Mr Borrowdale was unsuccessful with several causes of action and appealed the decision, but the Crown did not cross-appeal on this point: Borrowdale v Director-General of Health [2021] NZCA 520; [2022] 2 NZLR 356, 363 [11] (Collins J).

[215] Kracke (n 88) 154 [744].

[216] Paterson observes that ‘[i]n order to qualify as law, however, a rule must be both accessible and sufficiently precise “to enable an individual to regulate his or her conduct accordingly”’: Paterson, ‘Privacy Rights and Charter Rights’ (n 83) 211, quoting Human Rights Committee, General Comment No 34: Article 19 (Freedoms of Opinion and Expression), 102nd sess, UN Doc CCPR/C/GC/34 (12 September 2011) 6–7 [25]. See also Pound and Evans (n 92) 113 [13.40]. It is relevant to note the consistent language used in relation to the s 13 privacy right in many statements of compatibility submitted to Victorian Parliament that an interference ‘will be lawful if it is permitted by a law which is precise and appropriately circumscribed’: see, eg, Victoria, Parliamentary Debates, Legislative Assembly, 6 March 2024, 716 (Lily D’Ambrosio); Victoria, Parliamentary Debates, Legislative Council, 22 February 2024, 524–5 (Ingrid Stitt); Victoria, Parliamentary Debates, Legislative Assembly, 8 February 2024, 224 (Anthony Carbines); Victoria, Parliamentary Debates, Legislative Assembly, 7 February 2024, 114 (Ben Carroll).

[217] Most citations of Bell J’s comments on lawfulness in Kracke (n 88) are by Bell J himself: see, eg, Re Director of Housing and Sudi [2010] VCAT 328; (2010) 33 VAR 139, 153–4 [63]–[68] (Bell J); PJB (n 118) 396 [91] (Bell J); Kaba (n 194) 570 [152] (Bell J). But see Certain Children v Minister for Families and Children [No 2] [2017] VSC 251; (2017) 52 VR 441, 504 [201] (John Dixon J).

[218] [2021] VSCA 200; (2021) 64 VR 270, 305 [152] (Beach, Kyrou and Kaye JJA).

[219] Ibid 272 [2].

[220] Ibid 302 [139], 311 [177].

[221] Ibid 308 [165].

[222] Thompson (n 90) 317 [49] (Kyrou, McLeish and Niall JJA), citing ibid 305 [152].

[223] Thompson (n 90) 360 [214] (Kyrou, McLeish and Niall JJA).

[224] Ibid 392 [332]–[333].

[225] See above Part III(A).

[226] (2017) 261 CLR 328, 373 [149] (Kiefel CJ, Bell and Keane JJ).

[227] Certain Children v Minister for Families and Children [2016] VSC 796; (2016) 51 VR 473, 496 [143] (Garde J), quoting Re Application under the Major Crimes (Investigative Powers) Act 2004 [2009] VSC 381; (2009) 24 VR 415, 434 [80] (Warren CJ), and De Bruyn v Victorian Institute of Forensic Mental Health [2016] VSC 111; (2016) 48 VR 647, 691 [126] (Riordan J).

[228] See above n 227.

[229] See, eg, the recent Queensland Industrial Relations Commission case where O’Connor V-P cited only Victorian authorities on lawfulness and concluded there was no unlawful interference: Hunt v Department of Agriculture and Fisheries (Qld) [2022] QIRC 162, [168]–[171].

[230] See above n 117 and accompanying text.

[231] It appears that some Victorian public servants are already following this approach when drafting statements of compatibility under s 28 of the Victorian Charter (n 84), because it is common for the wording in relation to the privacy right to reference the adequacy of the legal basis: see above n 216.

[232] Gellert and Gutwirth (n 130) 274.

[233] See above n 194 and accompanying text.

[234] Paterson, ‘The Uses of AI’ (n 11) 665.

[235] Ibid. Under the GDPR (n 22), the overriding law must be one ‘to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests’: at art 22(2)(b).

[236] Apart from some minor amendments to definitions, notes, pronouns and process, the Victorian Charter (n 84) has not yet had any significant amendment — not even to implement the recommendations of the mandated 2015 Review of the Victorian Charter (n 86): Victorian Charter (n 84) endnote 2.

[237] See above nn 94–5 and accompanying text.

URL: http://www.austlii.edu.au/au/journals/MelbULawRw/2024/8.html