Complex contracts usually contain clauses specifying who will own the Intellectual Property (IP) that arises from, or goes into, a project. But what happens when a non-human entity creates original work under the contract?
Artificial intelligence today
While the fully automated Artificial Intelligence (AI) envisioned in the science-fiction world of killer robots is yet to be realised, smaller-scale AI is commonplace in day-to-day life. AI can be simply defined as technology which performs tasks that would normally require a human brain to complete, such as autonomous decision-making. AI programs can have the capacity to learn independently and improve their own processes.
Recently, forms of AI have been used in diverse endeavours, from self-driving cars to the composition of Shakespearean sonnets. While the quality of the poetry produced by such programs thus far is debatable, and we are some way off from seeing AI that has self-awareness that would truly rival that of a human, it is clear that AI is improving and expanding in use. Some form of AI is currently used in many facets of daily life, such as smartphones, computer gaming, media streaming services and modern car technology. In the business world, AI programs are used in aspects of banking transactions. Increasingly, AI is relied on in ICT and other service delivery projects.
But while AI continues to grow in sophistication, this is not mirrored by the legal systems which regulate it. One area where AI and the law intersect awkwardly is copyright.
The copyright conundrum
Copyright protects original written work, including software. Usually the creator of content enjoys copyright over it, although copyright can be assigned or licensed to others.
In Australia, generally only work created by humans can attract copyright. This is partly because copyright only protects original work, and courts have interpreted originality as requiring a degree of human ingenuity. Where there is no agreement to assign ownership, copyright in computer programs and software will generally vest in whoever created the source code of that software. Likewise, content generated by "AI-like" software which performs functions based on programmed rules without exhibiting true intelligence or originality (for example, a "smart-home" device that can dim lights or check the weather forecast on command) would likely remain the copyright of the author of the program's code or of the person making the input. The same would likely apply to programs used as part of an artistic or technical process but which are ultimately controlled by human choices.
A problem arises where AI software creates work with little or no input from humans. While the creator of the AI program would generally retain copyright over the original source code, they may have no rights to original work created by the software that they did not envision or program. For example, the source code of an AI program designed to compose original music or generate business recommendations would attract copyright, but the actual decisions and work generated by that AI may not, if there is not a sufficient level of human input. The issue only becomes more apparent as an AI program continues to evolve and the distance from human intervention increases.
Courts in Australia have been reluctant to attribute copyright where the work was largely created by a computerised process. For example, in Acohs Pty Ltd v Ucorp Pty Ltd (2012) 201 FCR 173, the Full Federal Court found that data sheets created by a computer program were not subject to copyright because there was not a sufficiently involved human author.
The program used in Acohs was not AI software: rather it was a relatively simple data-collecting mechanism. Consequently, it seems that a court in Australia following the current approach would be hesitant to attribute copyright to a work created entirely by an AI program.
Time to update copyright laws?
Unless the Copyright Act is amended, for the time being it seems that work generated by AI will not attract copyright in Australia, unless the AI's involvement is hidden and a human purports to have created the new work. In contrast, in the UK, the Copyright, Designs and Patents Act 1988 specifically provides that copyright in an original work generated by a computer will reside with whoever "made the arrangements necessary" for the computer to create the work. Meanwhile, other jurisdictions such as the US and the EU face similar issues to Australia.
If the Australian Copyright Act were amended with a provision similar to the UK one, perhaps the law would better protect the investments of companies which develop AI. However, this would require careful consideration of the term of copyright in AI-created work: unlike human authors, AI programs are not mortal, so the usual survival of copyright for 70 years after the author's death would need to be revisited.
AI and IP: the contractual context
It is important that a contract clearly defines which party owns the Intellectual Property associated with the project. This is normally done by specifying ownership in the existing material, modifications to existing material, and in new material created in the performance of the agreement.
However, the uncertainty outlined above can lead to complicated negotiations, and indeed to risks, where the new software is more than a modification of existing software. Generally, agreements between sophisticated entities will include clauses devoted to IP issues arising from the contractual project. These clauses will often specify who owns which IP, and who licenses the IP, for how long and for what purpose.
AI, therefore, presents a threat to the ordinary regulation of IP under contract. There is a significant risk that, regardless of the terms of an IP clause, work created by AI under a contract would be uncopyrightable, and could therefore be used outside the contractual framework and indeed by people who are not parties to the contract.
How can you protect your IP?
In the meantime, what can parties do to protect their investments in AI?
Some ways of mitigating the IP risks associated with AI are:
- choosing an appropriate jurisdiction to develop AI;
- clearly stating who retains ownership of material that attracts copyright under the contract;
- stating as clearly as possible who owns AI-generated, IP-like material;
- distributing risks potentially created by AI via insurance clauses or other mechanisms;
- ensuring the use and nature of any AI technology remains confidential; and
- including clauses specifying which parties to the contract may use AI-generated content under a contracted ownership mechanism. This would see an expansion of "IP-like" clauses to deal with things that are akin to IP but are not technically IP.
Another option could be to attempt to patent AI algorithms or the content they generate. But given the expense of the patent process, and the question of whether the AI-created items are always patentable, this is unlikely to be a panacea.
Whether Australia's copyright laws will be changed remains to be seen. But one thing is clear: as AI technology becomes more prevalent, so will the issue of how AI-created work can be protected.