An Italian court, the Tribunale Ordinario di Milano, has found Google liable for defamation because its autocomplete suggestions paired the plaintiff's name with defamatory key words. This raises the question of whether such a case could succeed here in Australia.
What was the basis for the defamation case against Google?
Google's autocomplete function suggests completions of the word being typed, along with related key words to narrow the search.
For example, if you begin to type in "fair wo", Google will suggest "Fair Work Act 2009" or "Fair Work Australia".
The plaintiff (whose name has been suppressed) is an Italian entrepreneur who also provides education in personal finance, with much of his business conducted online.
When his name was entered, Google helpfully suggested "truffa" (fraud) and "truffatore" (swindler) as appropriate key words. Needless to say, he was unimpressed and approached Google to have this remedied. Finding no satisfaction, he sued.
The European Union's safe harbour for service providers
In the European Union, there's a safe harbour (of sorts) for service providers.
Under the EU's Directive on electronic commerce 2000/31/EC, the service provider isn't liable
if it's a mere conduit for information (Article 12);
if it's merely caching material (Article 13); or
for information stored at the request of a recipient of the service if it doesn't have actual knowledge of the unlawful nature of the information, or, if it's made aware, it acts expeditiously to remove or to disable access to the information (Article 14).
There's also no general responsibility to monitor (Article 15).
Why Google couldn't take advantage of the safe harbour
Google pointed out that the autocomplete suggestions are terms entered by Google users, gathered by software, and then suggested in order of popularity by the automatic working of an algorithm. In short, don't blame us for what people are using as search terms.
Not quite, said the Court. Search engines are essentially databases plus software: Google's search engine is an enormous database of web pages gathered by spiders and stored on its servers. It, like other search engines, organises information and then offers it to the user. Google makes choices about how that software is set up, and those choices determine the key words presented to the user. Google could have altered the autocomplete suggestions when the problem was brought to its attention, but did not.
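The Court's point is easier to see with a sketch of how such a system might work. The following is a minimal, hypothetical model (not Google's actual implementation): suggestions are drawn from past user queries and ranked purely by popularity, so a frequently searched phrase pairing a name with a disparaging term will surface automatically unless the operator intervenes.

```python
from collections import Counter

class AutocompleteIndex:
    """Hypothetical sketch: suggestions are past user queries,
    ranked by how often they have been entered."""

    def __init__(self):
        self.queries = Counter()

    def record(self, query):
        # Each search a user performs is logged and counted.
        self.queries[query.lower()] += 1

    def suggest(self, prefix, limit=3):
        # Return the most popular logged queries starting with the prefix.
        prefix = prefix.lower()
        matches = [(q, n) for q, n in self.queries.items()
                   if q.startswith(prefix)]
        matches.sort(key=lambda qn: -qn[1])
        return [q for q, _ in matches[:limit]]

idx = AutocompleteIndex()
for q in ["fair work act 2009"] * 5 + ["fair work australia"] * 3:
    idx.record(q)
print(idx.suggest("fair wo"))  # ['fair work act 2009', 'fair work australia']
```

On this model, nothing in the software assesses whether a popular pairing is defamatory; that is precisely why the Court treated Google's design choices, and its refusal to adjust the output on complaint, as the basis of liability.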
But was this defamatory? Yes – "the user who sees this linkage will be immediately induced to doubt the moral integrity of the person whose name appears in association with these words, and to suspect unlawful conduct on the part of the same." And it is irrelevant that the words complained of are present in part in other content – Google is responsible for the linkage its software makes between the plaintiff and the disparaging terms.
What would happen in Australia?
Australia doesn't have a general safe harbour shielding ISPs from all types of legal liability. There is, however, a defence of "innocent dissemination" in the uniform Defamation Acts for subordinate distributors, a subordinate distributor being someone who
was not the first or primary distributor of the matter, and
was not the author or originator of the matter, and
did not have any capacity to exercise editorial control over the content of the matter (or over the publication of the matter) before it was first published.
These provisions have not yet been properly tested in the context of the internet. Given the facts in this case, however, it's possible that a similar claim could succeed in Australia, as Google could be argued to have had the capacity to exercise editorial control over the content, particularly if, as here, a complaint was made to it and Google declined to take down the offending material.