Proposed defamation defences for digital intermediaries, including forum administrators

Lyndal Sivell, Alice Brennan
01 Sep 2022
Time to read: 3 minutes

Written submissions on two alternative options for a new defamation defence for digital intermediaries are due by 9 September.

Reform of the Model Defamation Provisions continues and is now keenly focused on the extent to which digital intermediaries should be liable under defamation law for the publication of third-party content online. On 12 August 2022, the Meeting of Attorneys-General agreed to release the NSW Government's draft Part A Model Defamation Amendment Provisions (Part A MDAPs) and the accompanying Background Paper for public consultation.

Who is a "digital intermediary"?

The Part A MDAPs broadly define "digital intermediary" as a person, other than an author, originator or poster of the matter, who provides an online service in connection with the publication of the matter. This is consistent with the approach taken in the Background Paper, which identifies "internet intermediaries" as covering a broad range of service providers, including internet service providers, content hosts, search engines and social media platforms, and extends to forum administrators (individuals and organisations that use online platforms to host forums that allow or invite third-party comments).

Proposed defences for digital intermediaries

The Part A MDAPs address a range of recommendations, as set out in the Background Paper. These include more straightforward exemptions for digital intermediaries that provide caching, conduit or storage services, provided those intermediaries did not take certain steps such as editing or promoting the relevant content.

Recommendations 3A and 3B stand out from the pack as they present two alternative options for a new defence for digital intermediaries:

  • Model A – safe harbour defence for digital intermediaries, subject to a simple complaints notice process; or
  • Model B – innocent dissemination defence for digital intermediaries, subject to a simple complaints notice process.

There are key differences between the two models. Model A is intended to focus the dispute between the complainant and the originator, and operates as an automatic defence where it is possible for the complainant to identify the poster. To establish the defence, the digital intermediary would have to prove that it was a digital intermediary in relation to the publication and that it had an easily accessible complaints mechanism at the time of the publication. In addition, if the complainant gave the digital intermediary a complaints notice, the intermediary must, within 14 days of being given that notice, either have provided the complainant with enough information (with the poster's consent) to enable a concerns notice to be issued or proceedings to be commenced, or have taken such steps to prevent access to the publication as were reasonable in the circumstances (if any).

Model B does not provide an automatic defence or safe harbour where the complainant could identify the poster. Instead, it recognises that digital intermediaries should not be liable for the publication of third-party defamatory content where they are merely subordinate distributors and are not aware of that content. To establish this defence, the digital intermediary needs to prove that it was a digital intermediary in relation to the publication and that it had an easily accessible complaints mechanism at the time of the publication. Further, if the complainant provided a complaints notice, the digital intermediary must prove that it took reasonable steps within 14 days to prevent access to the publication.

While there are clear differences, both models would provide for:

  • basic prescribed content for the complaints notice;
  • a specific period of time within which the intermediary must act;
  • a digital intermediary not being ineligible for the defence simply because it has a practice of monitoring for or taking down unlawful content (i.e. good behaviour is not penalised); and
  • the defence being unavailable to a digital intermediary actuated by malice.

Importantly, these defences would apply only to a person who is not the author, originator or poster of the matter. This is because they are intended to apply only where the person providing the online service is acting as an intermediary and is a secondary publisher.

Why are these defences on the table?

This space is evolving both in Australia and overseas. At home, the High Court found last year in Fairfax Media Publications Pty Ltd v Voller [2021] HCA 27 that the appellants – media companies that maintained public social media pages – were legally responsible as "publishers" of third parties' comments on those pages; however, the Court did not consider the availability of defences. The Online Safety Act 2021 (Cth) (OSA) also came into effect on 23 January 2022. Among other things, the OSA introduced an adult cyber abuse scheme, established the Basic Online Safety Expectations for online service providers, transposed the "BSA immunity" from Schedule 5 of the Broadcasting Services Act 1992 (Cth) into the OSA, and granted the eSafety Commissioner the power to obtain end-user identity information and contact details from online service providers.

Further afield, the EU's draft Digital Services Act (DSA), which will replace the intermediary liability rules of the E-Commerce Directive, seeks to update and harmonise safety and liability rules for intermediary services operating in the EU. The DSA retains the established liability exemptions for intermediary services and clarifies that voluntary own-initiative investigations by digital intermediaries will not disqualify them from those exemptions. In the UK, the draft Online Safety Bill 2021 creates duties of care requiring social media platforms and search engines to mitigate risks arising from the dissemination of illegal or harmful content.

We expect that consideration will also need to be given to the High Court's recent decision in Google LLC v Defteros [2022] HCA 27, in which the Court held that Google was not a "publisher" of defamatory matter merely because its search results included hyperlinks to the webpage containing that matter.

Disclaimer
Clayton Utz communications are intended to provide commentary and general information. They should not be relied upon as legal advice. Formal legal advice should be sought in particular transactions or on matters of interest arising from this communication. Persons listed may not be admitted in all States and Territories.