What’s not to ‘like’? Social media pages readied for reactivation
Market Insight 18 February 2022
After de-activating or removing their social media pages last year following the Australian High Court’s decision in Fairfax Media Publications v Voller, many organisations are preparing to reactivate their social media pages now that anti-trolling reform is in the pipeline.
- Currently, and as a result of the Voller decision, administrators of Facebook and other social media pages on which third parties can post comments can be liable for the ‘publication’ of defamatory material contained in those comments.
- On 10 February 2022, a new bill was introduced into Australia’s Federal Parliament which, if passed, will shift liability away from social media page administrators for third-party comments posted on their social media pages.
- Organisations that reacted to last year’s Australian High Court decision by taking down their social media pages can now look to reactivate them without the risk of liability for third-party comments.
Publication in brief
Australian defamation law gives a person the right to sue in relation to the publication of a defamatory matter. In bringing a defamation action, a plaintiff must meet the essential elements of a cause of action, one of which is ‘publication’.
Once the essential elements of defamation are made out against a defendant, the defendant will be liable for defamation subject to the application of any defences. A number of defences are available both at common law and under statute, and for organisations with social media websites or webpages which invite or attract comments, the defence of ‘innocent dissemination’ is an important one. The defence of innocent dissemination will be made out, and liability avoided, where:
- the defendant published the matter merely in the capacity, or as an employee or agent, of a ‘subordinate distributor’; and
- the defendant neither knew nor ought reasonably to have known that the matter was defamatory; and
- the lack of knowledge was not due to any negligence on the part of the defendant.
Why some organisations took down their social media pages last year
In Voller, the High Court of Australia upheld the findings of the Supreme Court of New South Wales and the New South Wales Court of Appeal on the question of whether the ‘publication’ element of defamation was met in respect of Facebook comments posted by third-party users on the Facebook pages of the three media organisations involved in the proceedings.
Dylan Voller (whose experience while detained at a youth detention centre had been the subject of significant media attention) commenced defamation proceedings against three large Australian media organisations that had posted links on their Facebook pages to various news articles concerning him. These posts attracted ‘trolling’ comments from third-party Facebook users (i.e. the general public) that Mr Voller alleged gave rise to defamatory imputations of and concerning him.
At first instance and on appeal by the media organisations, the Supreme Court of New South Wales, the New South Wales Court of Appeal and, ultimately, the High Court of Australia all held that the media organisations that administered the Facebook pages had ‘published’ the trolling comments (by encouraging and assisting their publication). Those organisations could therefore potentially be liable for defamation if the imputations pleaded by Mr Voller were found to be capable of arising and to be defamatory of him.
Justices Gageler and Gordon of the High Court referred to the long-standing principle from Webb v Bloch that, generally, every person who contributes to a publication can be held liable (e.g. in the case of a defamatory newspaper article, the journalist, editor, publisher, printer, distributor, etc). Their Honours acknowledged, however, that the advent of the internet has resulted in:
- a disaggregation of the process of publication (i.e. there are many actors each playing separate roles in the process of content being published as opposed to the traditional model of publication where many of the roles were performed by a single media organisation responsible for writing, editing, printing, distributing, etc); and
- a shift from ‘one-to-many’ publication (i.e. one media company selling a newspaper for the consumption of the people who purchase that newspaper) to ‘many-to-many’ publication (e.g. many people writing comments on Facebook for the consumption of all the other people on Facebook).
The High Court also confirmed that a person can participate as a publisher even without intending to or knowing that they are publishing defamatory material. In other words, a person can accidentally be a publisher of defamatory material.
One consequence of the Voller decision was that administrators of Facebook (or other social media) pages on which third parties can post comments could now be liable for the publication of defamatory matter contained in those comments. In response, some Australian organisations (particularly media organisations) disabled or re-configured certain functionality on their social media pages to prevent the accidental publication of defamatory material.
Defence against the dark parts (of the internet)
Following a process of defamation law reform intended to apply uniformly across the defamation legislation of each Australian State and Territory, new legislative amendments came into effect in New South Wales, Victoria, Queensland, South Australia and the Australian Capital Territory on 1 July 2021. Tasmania followed on 12 November 2021, while Western Australia and the Northern Territory have not yet enacted the amendments.
One outcome of this law reform is that, at least in the States and Territories where the amendments are now in force, defamation proceedings cannot be commenced unless a concerns notice has been given to the prospective defendant. In the social media context, a plaintiff who takes issue with an allegedly defamatory comment on a social media page must first give the administrator of that page a concerns notice communicating their concern before commencing legal proceedings. This gives the administrator an opportunity to remedy the concern first (e.g. by apologising and perhaps removing the offending comment). On a practical level, this reduces the administrator’s potential exposure to legal proceedings.
Further, additional law reform is potentially on the horizon with the Social Media (Basic Expectations and Defamation) Bill 2021 (Basic Expectations & Defamation Bill), introduced into Federal Parliament on 25 October 2021 and, as foreshadowed by Prime Minister Scott Morrison in November 2021, the Social Media (Anti-Trolling) Bill 2022 (Anti-Trolling Bill) introduced on 10 February 2022.
If passed into law, the Basic Expectations & Defamation Bill will:
- enable the Minister to set basic expectations of social media service providers (e.g. Facebook) regarding the hosting of defamatory material on social media platforms;
- ensure that service providers are liable for defamatory material hosted on their platforms which is not removed within a reasonable timeframe; and
- address the lack of accountability of service providers when defamatory material is published on their platforms.
The Anti-Trolling Bill, if passed, will:
- establish that the administrator of a social media page will not be a publisher of third-party material posted on that social media page (whereas the provider of the social media platform will be a publisher);
- where a provider of the social media platform is a publisher, give that provider a defence to an action in defamation where certain criteria are met, in particular where the provider has a complaints scheme in place; and
- require social media organisations to reveal the identities and contact details of the third parties who post defamatory comments.
These changes aim not only to protect social media page administrators from standing in the firing line for comments posted on their social media pages by trolls; they also make it easier for those defamed by such comments to identify the trolls responsible and institute defamation proceedings against them.
What you should do
If enacted, the Anti-Trolling Bill will give organisations with a social media presence at least some preliminary protection against liability for defamatory third-party comments posted on their social media pages. However, to help foster safe online spaces for customers and community members, administrators should continue to observe certain practices with respect to page upkeep. This includes employing moderation functionality (e.g. content filters), avoiding the posting of provocative content (particularly about a specific person) that might invite unsavoury comments in reply, and actively monitoring social media pages for poor online behaviour.
How we can help
Clyde & Co’s Cyber & Digital Law team has unparalleled and specialised expertise across the privacy, cyber and broader technology and media practice areas, and houses the largest dedicated, market-leading privacy and cyber incident response practice across Australia and New Zealand. Our team is also highly regarded for its expertise and experience in managing all forms of disputes across sectors and international borders, including advising on some of the most high-profile disputes and class actions commenced in Australia.
The firm's privacy, cyber, tech and media practice provides an end-to-end risk solution for clients. From advice, strategy, transactions, innovations, cyber and privacy pre-incident readiness, and incident response and post-incident remediation through to regulatory investigations, dispute resolution, litigated proceedings (plaintiff and defendant), recoveries and third party claims (including class action litigation), the team assists clients across the full spectrum of legal services within this core practice area.