UNSW Law Society Court of Conscience
WHEN ‘PRIVATE THOUGHTS’ ARE NO LONGER PRIVATE
By Lorraine Finlay,* Patrick J Hooton** & Andrea Olivares Jones‡
I Introduction
The human mind is what makes us who we are. It stores our memories, controls our emotions and allows us to solve problems. Throughout human history, the boundary between private thought and the external world has been impassable. The rise of neurotechnologies changes this – previously unquestionable statements such as ‘that is how I feel’ can now potentially be analysed and challenged.[1] Neurotechnology undoubtedly raises new questions about how legislation can protect the human right to privacy, and heightens the urgency of this conversation.
While neurotechnology may sound more science-fiction than fact, it is not new — deep brain stimulators, for example, have been used since 1997 to deliver electrical pulses to the basal ganglia to reduce the tremors often associated with Parkinson’s disease.[2] Yet neurotechnologies are advancing at a significant pace, with Elon Musk boldly claiming that they will lead to an era of transhumanism.[3] While such claims should be treated with scepticism, it is true that neurotechnologies are increasingly being developed and deployed across the globe. And while the technology undoubtedly has beneficial applications, it may also pose serious human rights risks.
It is for this reason that several groups have emerged seeking to ensure regulatory and ethical guardrails are in place. The Organisation for Economic Co-operation and Development (‘OECD’), the United Nations Educational, Scientific and Cultural Organization (‘UNESCO’) and the United Nations Human Rights Council (‘UNHRC’) are all pursuing policy agendas for the safe development of neurotechnology.[4] Domestically, the Australian Human Rights Commission (‘AHRC’) has also taken a proactive approach to neurotechnology. In the AHRC’s ‘Protecting Cognition: Background Paper on Neurotechnology and Human Rights’,[5] significant consideration is given to the ethical and legislative issues surrounding neurotechnology. This is an ongoing issue for the AHRC, which, in partnership with the University of Melbourne, also held a symposium in June 2024 attended by notable academics, law firms and regulators, including the Age Discrimination Commissioner, Human Rights Commissioner, Privacy Commissioner and eSafety Commissioner.
II What is Neurotechnology?
Before delving into the risks associated with neurotechnology, it is important to have a preliminary understanding of the technology, and how it operates – albeit at a high level. Neurotechnology can be defined as those ‘...devices and procedures used to access, monitor, investigate, assess, manipulate and/or emulate the structure and function of the neural systems of animals or human beings ... They are meant to either record signals from the brain and “translate” them into technical control commands, or to manipulate brain activity by applying electrical or optical stimuli’.[6]
A core component of many neurotechnologies is the brain-computer interface (‘BCI’) – the device responsible for connecting the brain to a computer or other device.[7] The BCI plays a facilitative role in bi-directional communication between the external device and the brain itself – monitoring brain activity, intervening in brain activity, or some combination of the two.[8] BCIs are usually connected to the brain either through implantation into the human skull (via open brain surgery or endovascular means) or through non-implantable means in the form of a wearable device.[9] It is worth noting that BCIs are just one core medium for interacting with and analysing neural data – other modes exist and will continue to develop.
Where a BCI is implanted via surgical means, the device will usually be placed directly onto the brain.[10] The BCI, generally utilising electrodes, will then send brain data to a computer or other device for analysis, decoding and operational functions.
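To make this signal path concrete, it can be sketched in a few lines of code. The following Python fragment is purely illustrative – the simulated electrode signal, the alpha-band feature and the threshold are our own assumptions for the example, not a description of any actual device:

```python
import numpy as np

# Simulated one-second window of signal from a single electrode,
# sampled at 256 Hz (a common EEG sampling rate).
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(fs)  # 10 Hz rhythm plus noise

# 'Translate' the raw signal into a feature: power in the alpha band
# (8-12 Hz), estimated from the Fourier transform of the window.
freqs = np.fft.rfftfreq(fs, d=1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2
alpha_power = power[(freqs >= 8) & (freqs <= 12)].sum()

# Map the feature onto a control command. The threshold is arbitrary
# here; a real system would be calibrated per user.
command = "SELECT" if alpha_power > 1000 else "IDLE"
print(f"alpha power: {alpha_power:.0f} -> command: {command}")
```

Simple as it is, the sketch captures the structure common to many BCIs: a recorded signal is reduced to features, and the features are mapped onto commands for an external device.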
Non-implantable devices often take the form of existing products people are familiar with: smart watches, glasses and helmets. Apple has even recently registered a patent for AirPods capable of monitoring brainwaves.[11] It is these less invasive forms of neurotechnology that are most prominent in consumer-oriented devices.[12]
The rise of neurotechnological capabilities in recent years has been staggering, providing opportunities to collect and utilise brain data to better understand, and intervene in, the human mind.[13] This can drastically improve quality of life for individuals – for example, by allowing people to walk again[14] or by deepening our understanding of chronic pain.[15]
Even more futuristic applications are possible, especially when artificial intelligence (‘AI’) and neurotechnologies are combined. One example is the use of AI to translate brain activity into language. In one experiment, AI analysed functional magnetic resonance imaging (‘fMRI’) scans (which measure the flow of blood into different areas of the brain) and translated them into readable language.[16] The experiment worked by researchers playing a recorded story to participants while they underwent fMRI scans. The original transcript of the recording stated: ‘I got up from the air mattress and pressed my face against the glass of the bedroom window expecting to see eyes staring back at me but instead only finding darkness’.[17]
The researchers were particularly interested in how accurately the AI translation would replicate the original transcript. The AI decoded the brain activity and produced the following transcript: ‘I just continued to walk up to the window and open the glass i stood on my toes and peered out i didn’t see anything and looked up again i saw nothing’.[18] Although the words and meaning are somewhat skewed, the basic premise of the transcript was preserved, giving rise to claims that AI can now ‘read minds’.[19] While ‘mind reading’ is an inaccurate representation of the study, it does highlight how private thoughts could be decoded and accessed by external parties, and gives a clear sense of the direction in which this could develop in the future.
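At a conceptual level, decoders of this kind work by generating candidate word sequences and keeping those whose predicted brain responses best match the observed scan. The sketch below is a heavily simplified toy of that idea, not the study’s actual pipeline: the ten-word vocabulary, the random per-word ‘responses’ standing in for a fitted encoding model, and the averaging of word vectors are all assumptions made to keep the example self-contained and runnable:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["i", "walked", "to", "the", "window", "and", "saw", "only", "darkness", "glass"]
DIM = 16  # size of the toy 'brain response' vector

# Toy encoding model: a fixed random vector per word stands in for the
# fMRI response that word is predicted to evoke. The real study fitted
# a regression model to each participant's scans instead.
word_response = {w: rng.standard_normal(DIM) for w in VOCAB}

def predicted_response(words):
    """Predicted scan for a word sequence: mean of per-word responses."""
    return np.mean([word_response[w] for w in words], axis=0)

# Pretend this is the observed scan recorded while the participant
# heard 'i walked to the window'.
observed = predicted_response(["i", "walked", "to", "the", "window"])
observed = observed + 0.1 * rng.standard_normal(DIM)

def score(seq):
    """Correlation between a candidate's predicted response and the scan."""
    return np.corrcoef(predicted_response(seq), observed)[0, 1]

# Beam search: grow candidate transcripts word by word, keeping the
# five candidates whose predicted responses best match the scan.
beam = [[w] for w in VOCAB]
for _ in range(4):
    candidates = [seq + [w] for seq in beam for w in VOCAB]
    beam = sorted(candidates, key=score, reverse=True)[:5]

print("best decoded sequence:", " ".join(beam[0]))
```

Because this toy encoding model ignores word order, the highest-scoring candidate tends to share words with the target rather than reproduce it verbatim – loosely mirroring why the real decoder recovered the gist of the story rather than its exact wording.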
III Human Right to Privacy
Although there are several key human rights risks examined throughout neurotechnological literature, one of the most prominent is the right to privacy under article 17 of the International Covenant on Civil and Political Rights and article 12 of the Universal Declaration of Human Rights (although the right is also protected under several other international instruments).[20]
Neurotechnology notably provides access to an unprecedented level of sensitive information – primarily brain data.[21] Access to brain data could allow users to be analysed, and their behaviours and attitudes predicted, with a high degree of accuracy.[22] This could provide detailed insights into everything from a person’s sexual orientation to their health and political views.[23]
The collection, maintenance and usage of online personal information by governments, third parties and social media has long threatened the right to privacy. The gathering of isolated pieces of data (location, browser history, age etc) may seem innocuous. However, these small pieces of data can, when placed together, paint a detailed profile of an individual – a notion termed the ‘mosaic effect’.[24] With the rise of neurotechnology, it isn’t hard to imagine these mosaic profiles becoming digital mirrors of users, as brain data provides a new level of insight into people’s thoughts, feelings and emotions. This is especially concerning given a recent report found that, of 30 non-medical neurotechnology companies analysed, 96% became the proprietor of collected neural data, while 66% had the right to share or sell that data to third parties.[25]
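The mechanics of the mosaic effect can be shown with a small worked example. The population and attributes below are invented for the illustration; the point is only that facts which are individually innocuous can jointly single out one person:

```python
# A toy population: each attribute on its own is shared by several
# people. All values are invented for the illustration.
population = [
    {"id": 1, "postcode": "2000", "age_band": "30-39", "browser": "Firefox"},
    {"id": 2, "postcode": "2000", "age_band": "30-39", "browser": "Chrome"},
    {"id": 3, "postcode": "2000", "age_band": "40-49", "browser": "Firefox"},
    {"id": 4, "postcode": "2010", "age_band": "30-39", "browser": "Firefox"},
]

# Three individually innocuous observations about one user.
observed = {"postcode": "2000", "age_band": "30-39", "browser": "Firefox"}

# Intersect the attributes one at a time: the candidate pool shrinks
# from 3 to 2 to exactly 1 - the 'mosaic' identifies person 1.
matches = population
for key, value in observed.items():
    matches = [p for p in matches if p[key] == value]
    print(f"after matching {key}={value}: {len(matches)} candidate(s)")
```

Adding brain data to such a mosaic does not merely add one more tile; it adds tiles that reveal internal states no other data source can reach.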
Neural data may also be used by companies to advertise their products at times when the consumer is most open to making a purchase – a practice known as ‘nudging’. This could see companies take advantage of individuals’ neural information to, for example, push advertisements for Red Bull when a person is drowsy, or yoga classes when they are stressed – often before the individual even realises that they are tired or stressed.
More concerning still are the broader societal impacts. If brain data can reveal political leanings, governments and advertisers could target their political messaging to manipulate voters with unseen levels of specificity. For example, access to neural data may allow parties to determine the exact popularity of candidates or policies with specific voters. It isn’t hard to imagine brain data and large language models, like ChatGPT, being used in tandem to deliver tailored and persuasive political content to voters.
A Privacy Legislation
Australia’s central piece of privacy legislation is the Privacy Act 1988 (Cth) (‘Privacy Act’). The Privacy Act is principles-based, emphasising a technology-neutral approach to regulating how entities collect, use and disclose information.[26] This ensures the Act remains flexible enough to address new and emerging technologies that may threaten privacy in novel ways. First drafted in the 1980s, the Privacy Act has been reformed numerous times; most recently, the Attorney-General’s Department embarked upon a consultative process designed to bring the Privacy Act into the digital age. The federal government released its response paper in September 2023 and, at the time of writing, is expected to introduce reforms in the second half of 2024.[27] Although there was a focus on how the Privacy Act could be updated to address AI and other new and emerging technologies (such as facial recognition),[28] there was no specific reform agenda in respect of neurotechnology. However, given the Privacy Act’s flexible approach, specific legislative reform on neurotechnology may not be necessary.
Proposed reforms to the Privacy Act include changes to the definition of ‘personal information’.[29] Currently, personal information includes ‘information or an opinion about an identified individual, or an individual who is reasonably identifiable: (a) whether the information or opinion is true or not; and (b) whether the information or opinion is recorded in a material form or not’.[30] Information deemed to be personal information under the Privacy Act carries with it obligations surrounding the collection, use or disclosure of such information.[31] The federal government has agreed in principle to amend the definition of personal information to replace ‘about’ with ‘relates to’.[32] This would clarify that personal information is an expansive notion which includes both technical and inferred information.[33] The change would be supported by a non-exhaustive list of information that could be considered personal information under the Privacy Act, to assist entities in identifying relevant types of personal information.[34] As noted in the Privacy Act Review Final Report, such a list could include ‘one or more features specific to the physical, physiological, genetic, mental, behavioural, economic, cultural or social identity or characteristics of a person’.[35] As the AHRC has noted, these changes may result in neural data being captured as personal information under the Privacy Act.[36]
Yet the Privacy Act also places stricter obligations on regulated entities in respect of ‘sensitive information’. Sensitive information, for the purposes of the Privacy Act, encapsulates several types of information, including health information.[37] The AHRC notes that neural data collected by organisations may also meet the definition of sensitive information.[38] Given the immense importance of neural data and the implications it can have for a person’s right to privacy, this additional level of protection would be welcome.
Until reforms are enacted, it remains unclear what level of protection neural data would attain under the existing Privacy Act. Even once reforms have been passed, the answer may be no clearer. Privacy legislation needs to be clear on how neural data is treated. There has been significant action in this space outside Australia: the US state of Colorado has enacted House Bill 24-1058 to define and protect neural data, extending privacy legislation to cover data collected by non-medical consumer neurotechnologies.[39] Equally, the California Senate Judiciary Committee approved Bill SB1223, which seeks to extend existing consumer protections to an individual’s neural data.[40] Although Australia’s privacy regime is technology-neutral, and such targeted reform may not transfer directly from the US approach, clarification can still be achieved in Australia. The Office of the Australian Information Commissioner (‘OAIC’) regularly produces guidance materials on the interaction between the Privacy Act (including the Australian Privacy Principles) and entities’ information obligations and best practices. Guidance on how neural data from both medical and non-medical devices is treated under the Privacy Act would provide much-needed clarity.
Such action on neurotechnology would not be unprecedented: the United Kingdom’s Information Commissioner’s Office has published substantive work on the topic in its ‘ICO Tech Futures: Neurotechnology’ report.[41] The report makes bold predictions about how neurotechnology will affect both people’s lives and the privacy of users.[42]
IV Conclusion
The human right to privacy is increasingly being challenged by new and emerging technologies. In the digital age, it appears that many people have relinquished their privacy rights in exchange for being able to participate in modern life. There are already several policy and legislative initiatives around the globe seeking to claw back the privacy of people online, but very little of this relates to neural information. Without safeguards in place, neurotechnology may pose an elevated threat, as neural data can be decoded and even sold to third parties. Despite the especially sensitive nature of neural information, the risks associated with it are not yet on the ‘radar’ of many countries.
To ensure that our private thoughts and feelings remain confidential, Australia needs to consider deeply how neural information will be treated under the Privacy Act and whether any gaps exist. While upcoming privacy law reforms will go some way towards addressing serious shortcomings in Australian privacy law, more is needed to specifically tackle neural data. Until this is done, there is a risk that neurotechnologies could gain access to incredibly personal data, which may be misused in ways that result in serious harm to individuals.
* Lorraine Finlay is Australia’s Human Rights Commissioner.
** Patrick J Hooton is a Human Rights Advisor at the Australian Human Rights Commission. He holds a Juris Doctor from Monash University and a Master of Laws from the University of Melbourne.
‡ Andrea Olivares Jones is a Project Officer at the Australian Human Rights Commission. She holds a Masters in European and International Human Rights Law from Leiden University in the Netherlands, and a Bachelor of Laws (Honours)/Bachelor of Arts from Monash University.
[1] Sjors Ligthart et al, ‘Minding Rights: Mapping Ethical and Legal Foundations of “Neurorights”’ (2023) 32(4) Cambridge Quarterly of Healthcare Ethics 461, 466.
[2] The Neurorights Foundation, Market Analysis Neurotechnology (Report, March 2023) 14; UNESCO et al, The Risks and Challenges of Neurotechnologies for Human Rights (UNESCO, 2023) 10 (‘Risks and Challenges of Neurotechnologies’).
[3] Neil Hughes, ‘Transhumanism and Neuralink: The Dawn of Digitally Enhanced Humans’, Cybernews (online, 10 June 2023) <https://cybernews.com/editorial/transhumanism-and-neuralink/>.
[4] See, eg, Organisation for Economic Co-operation and Development, Neurotechnology Toolkit (Recommendation Paper, April 2024); International Bioethics Committee, Report of the International Bioethics Committee of UNESCO (IBC) on the Ethical Issues of Neurotechnology, UN Doc SHS/BIO/IBC-28/2021/3 Rev (15 December 2021) (‘Report of the International Bioethics Committee’). The UNHRC has requested the Advisory Committee to prepare a study on the opportunities and challenges of neurotechnology with regard to the promotion and protection of all human rights: see ‘Neurotechnology and Human Rights’, United Nations Human Rights Council (Web Page, 2024) <https://www.ohchr.org/en/hr-bodies/hrc/advisory-committee/neurotechnologies-and-human-rights>.
[5] See generally Australian Human Rights Commission, Protecting Cognition: Background Paper on Neurotechnology and Human Rights (Background Paper, March 2024).
[6] Report of the International Bioethics Committee (n 4) 5.
[7] The Neurorights Foundation (n 2) 3.
[8] Ibid; Ligthart et al (n 1) 463.
[9] The Law Society, Neurotechnology, Law and the Legal Profession (Report, August 2022) 4; The Neurorights Foundation (n 2) 3.
[10] The Neurorights Foundation (n 2) 14.
[11] Yacine Achiakh and Lauren Sarda Dutilh, ‘Industry News - Apple Patents a Next-generation AirPods Sensor System’, Wisear (online, 27 July 2023) <https://www.wisear.io/posts/industry-news-apple-patents-a-next-generation-airpods-sensor-system>.
[12] The Neurorights Foundation (n 2) 5.
[13] Marcello Ienca and Roberto Andorno, ‘Towards New Human Rights in the Age of Neuroscience and Neurotechnology’ (2017) 13(5) Life Sciences, Society and Policy 1, 1.
[14] Oliver Whang, ‘Brain Implants Allow Paralyzed Man to Walk Using His Thoughts’, The New York Times (online, 24 May 2023) <https://www.nytimes.com/2023/05/24/science/paralysis-brain-implants-ai.html>.
[15] Nidhi Subbaraman, ‘In the Brain, Scientists Find New Clues to Treating Chronic Pain’, The Wall Street Journal (online, 22 May 2023) <https://www.wsj.com/articles/brain-study-finds-clues-to-treating-chronic-pain-a19be9fb>.
[16] See generally Jerry Tang et al, ‘Semantic Reconstruction of Continuous Language from Non-Invasive Brain Recordings’ (2023) 26(5) Nature Neuroscience 858.
[17] Ibid 859.
[18] Ibid.
[19] Led Kim, ‘AI-Powered ‘Thought Decoders’ Won’t Just Read Your Mind – They’ll Change it’, Wired (online, 12 September 2023) <https://www.wired.com/story/ai-thought-decoder-mind-philosophy/>.
[20] See Convention on the Rights of the Child, opened for signature 20 November 1989, 1577 UNTS 3 (entered into force 2 September 1990) art 16; International Convention on the Protection of the Rights of All Migrant Workers and Members of Their Families, opened for signature 18 December 1990, 2220 UNTS 3 (entered into force 1 July 2003) art 14; Convention on the Rights of Persons with Disabilities, opened for signature 30 March 2007, UN Doc A/61/611 (entered into force 3 May 2008) art 22; African Charter on the Rights and Welfare of the Child, opened for signature 1 July 1990, CAB/LEG/153/Rev 2 (entered into force 29 November 1999) art 10; American Convention on Human Rights, opened for signature 22 November 1969, 1144 UNTS 123 (entered into force 18 July 1978) art 11; Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953) art 8.
[21] The Risks and Challenges of Neurotechnologies (n 2) 30.
[22] Ligthart et al (n 1) 7.
[23] Ibid.
[24] Australian Human Rights Commission, Human Rights and Technology (Final Report, 2021) 115. See also David Pozen, ‘The Mosaic Theory, National Security, and the Freedom of Information Act’ (2005) 115(3) Yale Law Journal 628.
[25] Jared Genser, Stephen Damianos and Rafael Yuste, Safeguarding Brain Data: Assessing the Privacy Practices of Consumer Neurotechnology Companies (Report, NeuroRights Foundation, April 2024) 2–3.
[26] Australian Human Rights Commission, Protecting Cognition: Background Paper on Neurotechnology and Human Rights (Background Paper, March 2024) 28.
[27] See, eg, Attorney-General’s Department (Cth), Government Response Privacy Act Review Report (Government Response, September 2023) (‘Government Response Privacy Act Review Report’).
[28] Ibid 6, 10–12, 23, 28.
[29] Government Response Privacy Act Review Report (n 27) 21.
[30] Privacy Act 1988 (Cth) s 6(1) (‘Privacy Act’).
[31] Australian Human Rights Commission (n 26) 28.
[32] Government Response Privacy Act Review Report (n 27) 21.
[33] Ibid.
[34] Ibid 5.
[35] Ibid 28–9.
[36] Australian Human Rights Commission (n 26) 28.
[37] Privacy Act (n 30) s 6(1).
[38] Australian Human Rights Commission (n 26) 28.
[39] See generally House Bill 24-1058. See also Sigal Samuel, ‘Your Brain’s Privacy Is at Risk. The US Just Took Its First Big Step Toward Protecting It’, Vox (online, 19 April 2024) <https://www.vox.com/future-perfect/24078512/brain-tech-privacy-rights-neurorights-colorado-yuste>.
[40] See generally Bill SB1223.
[41] See generally Information Commissioner’s Office (UK), ICO Tech Futures: Neurotechnology (Report, 1 June 2023).
[42] Ibid.