Precedent (Australian Lawyers Alliance)
SAFE, BUT FOR WHOM?
COMMONWEALTH E-SAFETY CHANGES
By Dr Bruce Baer Arnold
The Online Safety Act 2021 (Cth) and Online Safety (Transitional Provisions and Consequential Amendments) Act 2021 (Cth) (the Acts) received Royal Assent on 23 July 2021 and are due to commence in early 2022. They provide the basis for an expansive and potentially problematic digital content regulation regime centred on decision-making by a single government agency with extensive powers.
This article argues that the regime is overly broad and weakly accountable: its objectives are laudable, but its administration is at odds with contemporary community values and the expectation that morality cannot be determined by a single individual. The Acts disregard the Content Classification Review and Privacy Act Review, both underway at the time of writing. They embody a culture in which problems are to be ‘solved’ by an executive’s decision-making that is not necessarily transparent, might be idiosyncratic and contrary to respect for sexual diversity, and can be seen to have unduly limited scope for correction through conventional processes for reviewing administrative error.
This article seeks to contextualise and critique the Acts, referring to parallel developments overseas and questioning the regime that will be established by the new legislation. It suggests that we should look beyond the term ‘online safety’ and seek to understand whether this new regime can be just and proportionate rather than fulfilling a vote-winning promise made during the 2019 national election campaign.
ORIGINS
Australia’s content regulation regime, in particular restrictions on the publication and redistribution of images and other material that is at odds with generally accepted values regarding harm, has historically been conflicted. Regulation has typically been justified through reference to a compelling need to protect the vulnerable.
Vulnerability is a culturally and hence temporally contingent notion, with censorship over the past two centuries categorising, for example, women, minors, servants or other members of the working class, the poor, and First Nations people as variously needing protection under colonial, state or territory and Commonwealth law against obscene publications and communication deemed likely to deprave, corrupt or otherwise injure. Vulnerability has historically been politicised, with sporadic moral panics about the promotion of fertility services,[1] the distribution of comics, novels by authors such as D H Lawrence, Robert Close or Philip Roth,[2] films by auteurs such as Pier Paolo Pasolini, speeches in favour of the Soviet Union or the decolonisation of Ireland,[3] and work by artists ranging from Michelangelo and Bill Henson to Robert Mapplethorpe and Andres Serrano.[4] The notion of vulnerability has provided opportunities for corruption by regulatory bureaucracies such as state police vice squads and also at the Commonwealth level, for example within the postal and customs agencies.[5]
Regulation of content has involved activity at the state, territory and Commonwealth levels that has seen inconsistencies in classification codes (for content permissible in a particular jurisdiction) and enforcement. Historically most regulation involved the states, with the Commonwealth relying on interdiction of distribution through the post and at ports, alongside supervision of private and public sector broadcasters.[6] Seizure of prohibited content continues to occur, but public understanding about the Commonwealth’s role in content regulation has come to centre on the telecommunications power – in other words, the national government’s authority under the Constitution to make law on digital communication systems.[7] That is a reflection of both the ubiquity of digital content (including the shift of ‘old media’ to digital platforms) and the persuasiveness of successive governments in presenting the online world as one where dangers necessitate strong remedies.
The history of law, politicised administration and agitation about what people can or cannot do or view is useful for understanding the current Commonwealth regime and the two new Acts. In essence, the Acts are heirs of Federation-era legislation regarding the postal and telegram (later telephone) systems. They reflect both past anxieties and achievements, alongside the absence of a constitutionally-enshrined Bill of Rights recognising a broad freedom of expression.[8] They also reflect overseas legislation, for example the UK’s Online Safety Bill,[9] marketed as globally ‘groundbreaking’ and establishing ‘a new age of accountability for tech and bring[ing] fairness and accountability to the online world’.[10] They elide substantive concerns about administration which are evident in the ongoing debate about Australia’s content classification schemes, as highlighted in the current Classification Review.[11]
A coherent Australian framework for content regulation is necessary and respect for human rights requires some restriction of content making and dissemination. Such restriction, however, should be transparent, proportionate and accountable.
ABOUT THE ACTS
The Acts are being promoted as a smart and progressive step forward as part of the Morrison Government’s digital agenda.[12] The Acts exist alongside initiatives such as the large-scale vetting of people to protect critical infrastructure,[13] the RoboDebt scheme,[14] authorisation to delete or otherwise alter online content,[15] the facilitation of inadequately regulated intergovernmental data sharing,[16] and privacy-erosive projects such as the Interoperability Hub and National Driver Licence Facial Recognition Solution.[17] The poor outcomes and administrative over-reach of those initiatives justify scepticism about official rhetoric such as the following about the new online safety regime:
‘For the first time anywhere in the world, eSafety will formally begin operating a new adult cyber abuse scheme to finally give Australian adults who are the victims of seriously harmful online abuse, somewhere to turn when the platforms fail to act.
And there will be significant financial penalties for perpetrators, so trolls will no longer feel safe to perpetuate abuse and online hate with impunity.’[18]
The legislation originated ahead of the 2019 Commonwealth election, with the then Minister for Communications announcing an independent review of Australia's ‘online safety legislation’, specifically the Broadcasting Services Act 1992 (Cth)[19] and Enhancing Online Safety Act 2015 (Cth). The latter Act deals with the function, governance and power of the Office of the eSafety Commissioner, originally concerned with the safety of children using the internet.
The review reported that, although the current regulatory arrangements were effective, ‘major reform’ was needed to meet community expectations, in particular through a single Online Safety Act.[20] The Liberal Party accordingly announced a ‘Keeping Australians Safe Online’ policy, with a commitment to introducing a new Online Safety Act ‘to ensure our laws keep pace with tech change and to make sure safety is embedded in the online world’.[21] The Hon Christian Porter MP indicated that penalties for online abuse would be strengthened to reflect community expectations that online crime should be treated as seriously as offline crime, a claim somewhat at odds with the reality of existing state, territory and Commonwealth law regarding online offences.[22]
The resultant Acts seek to expand the authority of the eSafety Commissioner to being concerned with online safety generally rather than focusing specifically on children’s safety online. The new regime is multifaceted. It addresses the non-consensual sharing of intimate images; broadens the cyber-bullying scheme to capture harms occurring on services other than social media; establishes an online content scheme for the removal of material; creates a complaints-based removal notice scheme for cyber abuse perpetrated against an Australian adult; and reduces the timeframe for service providers to respond to a removal notice from the Commissioner. It brings providers of app distribution services and internet search engine services into the remit of the new online content scheme. It also establishes a power for the Commissioner to request or require internet service providers to disable access to material depicting, promoting, inciting or instructing in ‘abhorrent violent conduct’ for time-limited periods in crisis situations, building on the blocking enactment fast-tracked after the 2019 lone wolf terrorist incident in Christchurch.[23]
The regime features five schemes to deal with different categories of harmful online material, including a new adult cyber abuse scheme. The Cyberbullying Scheme provides for the removal of online material harmful to Australian children, with the take-down time for such material being reduced from 48 hours to 24 hours. The Abhorrent Violent Material Blocking Scheme mirrors existing provisions in the Criminal Code Act 1995 (Cth), providing for the blocking of abhorrent violent material such as video or still images of terrorist incidents.[24] The Adult Cyber Abuse Scheme extends the Cyberbullying Scheme to adults, covering the removal of material that seriously harms – it has a higher threshold of ‘harm’ to reflect the greater resilience of adults. The Image-based Abuse Scheme provides for the removal of intimate images shared without the depicted person’s consent, with the take-down time under the Enhancing Online Safety Act 2015 being reduced to 24 hours. The Online Content Scheme simplifies the current regime in Schedules 5 and 7 of the Broadcasting Services Act, providing for the removal of harmful material in certain circumstances and extending the eSafety Commissioner’s take-down powers.
The Commissioner remains an official of the Australian Communications and Media Authority for the purposes of the Public Governance, Performance and Accountability Act 2013 (Cth) but is otherwise independent of that agency. The Commissioner will have strong investigative and disclosure authority, including the power to summon a person by written notice to produce documents or answer questions, alongside the power to obtain information about the contact details or identity of an end-user from a social media service, relevant electronic service or designated internet service. Part 15 of the Online Safety Act enables the Commissioner to disclose information to the Minister, public service employees for the purpose of advising the Minister, Royal Commissions, other entities, teachers or school principals, and parents or guardians. Such disclosure is not inherently repugnant but should be construed in relation to the flaws of the Privacy Act 1988 (Cth); the ongoing difficulties of the Office of the Australian Information Commissioner resulting from under-resourcing; the absence of a cause of action for serious disregard of privacy; and the protection of the eSafety Commissioner from civil liability.
DISPROPORTIONATE POWER – A CAUSE FOR CONCERN
Looking beyond the desirability of reducing harms, and consistent with the Australian Competition and Consumer Commission’s concern in the 2019 Digital Platforms report to foster responsibility on the part of service providers, the details of the new regime warrant close critique.[25] This is evident in submissions by the Australian Lawyers Alliance (ALA) and other bodies to the Senate Environment and Communications Legislation Committee’s 2021 inquiry into the Bills.[26] That inquiry was fast-tracked, an example of the Government’s management of contentious legislation that deserves a more informed and hence longer consultation, as did the law enforcement enactments considered in the article by Mann and Murray in this edition of Precedent (see pp. xx to xx).
Preventing harm is laudable, but the proposed regime takes liberties with expectations regarding administration. It enshrines the Commissioner as a position with strong and wide-ranging powers but little accountability and, given the politicised nature of content classification, gives scope for subjective decision-making about expression that is not violent and not intended to harm. The Commissioner commented that:
‘We remain a very small but nimble agency that receives a comparatively high number of complaints, so we also need to ensure that reporting requirements are not so onerous that they undermine our ability to provide compassionate citizen service. If the proposed legislation is enacted, we will release detailed regulatory guidance and publicly release our regulatory priorities.’[27]
It is disquieting that at the time of writing those priorities have not been publicly released, given that they were presumably considered during the drafting of the Acts and in the small-scale stakeholder consultation prior to release of the Bills. If the resourcing of the agency is inadequate to enable reporting, this should be addressed by the Government as a matter of urgency.
The regime makes provision for an independent review within three years of commencement. Scarlet Alliance, in a submission to the Senate Committee, condemned granting:
‘... disproportionate power to an unelected public bureaucrat to decide what types of content are viewable to Australian end-users ... This power, combined with both a lack of oversight or rigorous public reporting requirements, leaves room for the individual holding this position to approach their task with a great deal of subjectivity and little accountability.’[28]
That is especially pertinent given that the Class 1 and Class 2 Material provisions to establish the proposed Online Content Scheme can be read as capturing consensual adult content that should not be considered harmful, akin to the history of police seizures of figless reproductions of Michelangelo’s David. One submission to the same Committee highlighted a discrimination concern, commenting:
‘Other businesses use social media and online platforms to advertise their products and services. Why, then, should the services of sex workers be afforded fewer rights than other businesses? The sex work is largely lawful (but regulated) in Australia and thus should not be subject to discriminatory regulation.’[29]
The ALA argued persuasively that there is a significant risk that the proposed Australian regime, like that in the UK, will result in excessive proactive monitoring and removal of content.[30] Such erasure of diversity is contrary to human rights and a recipe for confusion, particularly as the Acts do not provide for an effective appeal mechanism and many users of online services have little agency because of their service providers’ terms and conditions.
The ALA further expressed concern about the danger of vesting very broad discretion in a single individual empowered to determine community expectations.[31] As noted above, such determination is not guided or reviewed by a community body. The politicisation of decision-making on harms and morality means that ostensible periodic review by a parliamentary committee is inadequate. The widespread unresponsiveness of government departments and agencies to disclosure under the Freedom of Information Act 1982 (Cth) justifies scepticism about whether the Commissioner, with substantial new responsibilities, will embrace disclosure.
The Acts do not provide for an internal review of a decision by the Commissioner, a weakness obfuscated by the Commissioner’s reference to scope for an appeal to the Administrative Appeals Tribunal under s220 of the Online Safety Act – something that places an undue burden upon many applicants. The Commissioner has stated that ‘Any government decision can always be referred to the Commonwealth Ombudsman’,[32] a mechanism of little comfort when that agency is under-resourced. Scarlet Alliance criticised the lack of procedural fairness in the proposed Online Content Scheme, where:
‘[T]here is no process outlined for users to be notified that their material may be removed, no notice period offered to them, and no opportunity for them to be given a hearing to speak or write back in relation to the complaint. The expedited 24 hour time frame removes any opportunity for the Commissioner to develop or communicate considered reasons. That the office of the Commissioner is indemnified from liability for civil damages to those suffering the consequences of inappropriate or even illegal notices issued by the Commissioner is cause for further concern.’[33]
CONCLUSION
Robert Menzies reportedly stated:
‘Expediency matters in this world, and if we are confronted by a state of affairs in which we find things challenging the peace of the world and the security and future of our people, it is no use stating airy-fairy legalistic ideals’.[34]
The Acts are a matter of expediency, to be assessed against legal principles regarding the accountability of the executive and the inappropriateness of statutory drafting that embodies executive over-reach. In considering the proposed regime we should remember that laudable objectives do not excuse inadequate processes and that the history of content regulation in Australia highlights the need for caution. The Acts seek to enshrine broad new powers. Importantly, however, existing powers have been used infrequently and the use of power might be obviated through greater responsibility on the part of service providers.
Harvard Law School academic Evelyn Douek has commented:
‘[W]ithout adapting speech governance to the very different nature of the task being undertaken – systemic balancing instead of individual categorization – platform decisionmaking processes and the rules that govern online speech will continue to be viewed as illegitimate.’[35]
Legitimacy matters in a world where there is wariness about political self-interest and there are disquieting signs of community disengagement.[36] Content regulation in Australia has often been a matter of appeasing stakeholders and enshrining bureaucracies rather than addressing underlying harms and empowering the disadvantaged. In practice, the new regime is likely to emphasise bureaucratic activity counts rather than positive outcomes and it lacks remedies such as a readily accessible statutory cause of action for serious invasions of privacy. In the absence of wider reform, the Acts will attract votes but inadequately address harms and could disempower many of the people most in need of support. Both the Acts and the history of Commonwealth e-safety initiatives provide civil society and the legal profession with few grounds for optimism.
Dr Bruce Baer Arnold teaches technology law at the University of Canberra. His interests include privacy and health data, reflected in membership of OECD data protection working parties. He has contributed to numerous parliamentary and law reform commission inquiries on networks, content regulation, cybersecurity, privacy and human rights.
[1] N Moore, ‘Treasonous sex: Birth control obscenity censorship and white Australia’, Australian Feminist Studies, Vol. 20, No. 48, 2005, 319–42.
[2] P Mullins, The Trials of Portnoy: How Penguin Brought Down Australia’s Censorship System, Scribe, Brunswick, Vic, 2019; D Bryans, ‘The trials of Robert Close’, Script & Print, Vol. 35, No. 4, 197–218; and N Moore, The Censor’s Library, University of Queensland Press, St Lucia, 2012.
[3] K Windle, Undesirable: Captain Zuzenko and the Workers of Australia and the World, Australian Scholarly Publishing, North Melbourne, Vic, 2012; and H Zogbaum, Kisch in Australia: The Untold Story, Scribe Publications, Melbourne, 2004.
[4] B Harris, ‘Pell v Council of Trustees of the National Gallery of Victoria: Should blasphemy be a crime? The Piss Christ case and freedom of expression’, Melbourne University Law Review, Vol. 22, No. 1, 217–29; and D Marr, The Henson Case, Text Publishing, Melbourne, Vic, 2008.
[5] Most recently Australian Postal Corporation Act 1989 (Cth), pt 7B.
[6] Broadcasting Services Act 1992 (Cth), sch 7.
[7] Commonwealth of Australia Constitution Act 1900, s51(v).
[8] B Harris, A New Constitution for Australia, Cavendish Publishing, London, Sydney, 2002.
[9] UK Minister of State for Digital and Culture, Draft Online Safety Bill (12 May 2021) <https://www.gov.uk/government/publications/draft-online-safety-bill>.
[10] UK Department for Digital Culture, Media and Sport, ‘Landmark laws to keep children safe, stop racial hate and protect democracy online published’ (12 May 2021) <https://www.gov.uk/government/news/landmark-laws-to-keep-children-safe-stop-racial-hate-and-protect-democracy-online-published>.
[11] Australian Government Department of Infrastructure, Transport, Regional Development and Communications, Review of Australian classification regulation, <https://www.communications.gov.au/have-your-say/review-australian-classification-regulation>.
[12] Examples of the promotion: Australian Government Department of Infrastructure, Transport, Regional Development and Communications, Online Safety Bill – Reading Guide, 2020; and eSafety Commissioner, ‘Online Safety Act 2021 receives Royal Assent’ (Media release, 23 July 2021) (eSafety Commissioner media release) <https://www.esafety.gov.au/about-us/newsroom/online-safety-act-2021-receives-royal-assent>. See also Liberal Party of Australia, 'Keeping Australians safe online' (Media release, 5 May 2019) (Liberal Party media release) <https://www.liberal.org.au/latest-news/2019/05/05/keeping-australians-safe-online>.
[13] J Hall, ‘More than two million Australians to have internet history scoured by employers’, The Australian (14 July 2021) <https://www.theaustralian.com.au/breaking-news/more-than-two-million-australians-to-have-internet-history-scoured-by-employers/news-story/9826ece29d5c116b8739bba74d85a2d8>; and Security Legislation Amendment (Critical Infrastructure) Bill 2020 (Cth).
[14] T Carney, ‘Robo-debt illegality: The seven veils of failed guarantees of the rule of law?’, Alternative Law Journal, Vol. 44, No. 1, 2019, 4–10; and Commonwealth Ombudsman, Services Australia’s Income Compliance Program: A Report about Services Australia’s Implementation of Changes to the Program in 2019 and 2020 (Report, 1 January 2021).
[15] Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 (Cth).
[16] See the discussion of the Telecommunications Legislation Amendment (International Production Orders) Bill 2020 (Cth) in M Mann and A Murray, ‘Striking a balance: Legislative expansions for electronic communications surveillance in Australia’ in this edition, xx-xx.
[17] See for example Parliamentary Joint Committee on Intelligence and Security, Review of the Identity-matching Services Bill 2018 and the Australian Passports Amendment (Identity-matching Services) Bill 2018 (Report, 2019).
[18] eSafety Commissioner media release, above note 12.
[19] Broadcasting Services Act 1992 (Cth), schs 5 and 7 provide for an Online Content Scheme.
[20] Australian Government, Department of Infrastructure, Transport, Regional Development and Communications, Report of the Statutory Review of the Enhancing Online Safety Act 2015 and the Review of Schedules 5 and 7 to the Broadcasting Services Act 1992 (Online Content Scheme) (Report, 15 February 2019) <https://www.communications.gov.au/publications/report-statutory-review-enhancing-online-safety-act-2015-and-review-schedules-5-and-7-broadcasting>.
[21] Liberal Party media release, above note 12.
[22] Ibid.
[23] E Douek, ‘Australia’s “abhorrent violent material” law: Shouting “nerd harder” and drowning out speech’, Australian Law Journal, Vol. 94, No. 1, 2020, 41–67.
[24] Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Cth).
[25] Australian Competition & Consumer Commission, Digital Platforms Inquiry Final Report (Report, 26 July 2019) <https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report>.
[26] See the submissions and report by the Senate Environment and Communications Legislation Committee, <https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Environment_and_Communications/OnlineSafety/Submissions>.
[28] Ibid, 8.
[29] Ibid, 18.
[30] Ibid, 11.
[31] Ibid, 11.
[32] Ibid, 11.
[33] Ibid, 10.
[34] R Menzies, quoted in G Haigh, The Brilliant Boy: Doc Evatt and the Great Australian Dissent, Simon & Schuster Australia, Cammeray NSW, 2021, 337.
[35] E Douek, ‘Governing online speech: From “posts-as-trumps” to proportionality and probability’, Columbia Law Review, Vol. 121, No. 3, 2021, 759–834 at 766.
[36] The ANU 2019 Australian Election Study reported that satisfaction with democracy is at its lowest level since the constitutional crisis of the 1970s and trust in government has reached its lowest level on record. Just 25 per cent of Australians believe people in government can be trusted, 56 per cent believe government is run for ‘a few big interests’, and 12 per cent believe it is run for ‘all the people’. See <https://australianelectionstudy.org/wp-content/uploads/The-2019-Australian-Federal-Election-Results-from-the-Australian-Election-Study.pdf>. See also B Harris, Constitutional Reform as a Remedy for Political Disenchantment in Australia: The Discussion We Need, Springer, Singapore, 2020.
URL: http://www.austlii.edu.au/au/journals/PrecedentAULA/2021/57.html