UK AI and copyright voluntary code of practice dropped – what next?
Published on 9th Feb 2024
Balancing IP rightsholders' and AI developers' interests proves difficult as agreement on a code cannot be reached and Lords committee calls for certainty
In March 2023, following a review of pro-innovation regulation conducted by Sir Patrick Vallance, then the government's chief scientific adviser, the government agreed to produce, by that summer, a code of practice clarifying the relationship between intellectual property (IP) law and artificial intelligence (AI), to be developed in collaboration with the AI and creative sectors. However, summer came and went and no code materialised.
The government has now confirmed in its response to its AI regulation consultation that no effective voluntary code can be agreed. It has promised that it will soon set out further proposals on the way forward and flagged that it intends to explore mechanisms for providing greater transparency so that rightsholders can better understand whether content they produce is used as an input into AI models.
IP v AI?
Data or content is an essential raw material for the development of AI.
Some of the most powerful AI models, including those that underpin generative AI systems for creating text or images, are trained on publicly available content scraped from the web, much of which will be protected by IP rights.
The creative sector generally takes the position that using IP-protected content to train AI models, which can then be used to create potentially competing content, infringes the relevant IP rights unless licences have been agreed with rightsholders.
AI developers typically maintain that this use is fair and transformative and that to require them to obtain licences from all rightsholders would have a stymying effect on technological advancement.
There is intense lobbying on both sides of the debate and the issues are currently being litigated before the UK courts in Getty Images v Stability AI – a case concerning the use of Getty's image library to train the Stable Diffusion text-to-image model.
Due to the lack of clarity on the UK government's position, the debate has continued to intensify. This is all against the backdrop of the government's stated ambition for the UK to be at the forefront of AI development and its pro-innovation approach to AI, and in the wake of a recent High Court decision holding that an AI invention was not excluded from patentability in the UK, positioning the UK as an AI developer-friendly jurisdiction.
TDM U-turn
There are currently various infringement exceptions that permit the use of a copyright work or database in certain scenarios, but in the UK these are narrow in scope. For example, the UK's text and data mining (TDM) copyright exception only covers "computational analysis" of lawfully accessed works for the "sole purpose of research for a non-commercial purpose", and is therefore generally understood to exclude any TDM carried out for commercial purposes.
Following a consultation in 2022, the government announced that it would introduce a new, more expansive TDM exception. This would have allowed TDM for any purpose (provided that the works were lawfully accessed) and would have covered both copyright and database rights. Crucially, rightsholders would not have been able to contract out of or opt out of the exception, nor to charge for TDM licences.
After strong complaints, particularly from the creative sector, and criticism from the Lords Communications and Digital Committee, the government U-turned on this decision and announced that it would not be proceeding with the proposed expanded exception.
LLM inquiry report
A couple of days before the government confirmed that the voluntary code of practice would be dropped, the Lords Communications and Digital Committee released its report following an inquiry into large language models (LLMs) in which it called for the government to provide the clarity necessary to "resolve disputes definitively".
It acknowledged the potential importance of LLMs for society but equally endorsed upholding a robust copyright regime. It recognised that the application of the law to LLM processes is complex but maintained that the purpose of copyright is to "reward creators for their efforts, prevent others from using works without permission, and incentivise innovation". It concluded that the current legal framework is "failing to ensure those outcomes occur" and stated that the government has a duty to act.
The committee was particularly critical that the government's approach was to "wait for the courts' interpretation of these necessarily complex matters", which could, it said, take up to a decade.
The government's next steps
In delivering its response to the AI regulation consultation, the government confirmed that it is committed to supporting both the AI technology sector and creative sectors but that it had become clear that an effective voluntary code of practice could not be agreed.
Although light on the specifics of what the approach might be, it stated that the government will commence a period of engagement with the relevant sectors to ensure the "workability and effectiveness of an approach that allows the AI and creative sectors to grow together in partnership".
It said that its approach will be underpinned by trust and transparency between all parties, including exploring ways in which there can be greater transparency for rightsholders to understand whether their data has been used to train AI models and around the attribution of outputs. It said that further proposals will be put forward soon.
A global perspective
In its response, the government emphasised the importance of "close engagement" with its international counterparts who are also considering these issues. The AI and IP minister had previously acknowledged that the disagreements between AI developers and creative sectors came down to "legal interpretations across multiple jurisdictions".
Different jurisdictions are taking differing approaches to the dispute. For example, in the EU, commercial TDM on a copyright work is permitted provided that the rightsholder has not opted the work out of the exception in an appropriate manner. In the USA, by contrast, there is a broad fair use doctrine, which many AI developers claim covers AI training; however, the limits of the doctrine are currently being tested in various litigation proceedings.
George Freeman, the then UK IP minister, had suggested that an opt-out model might be preferable in the UK, although it is not clear whether the present minister is considering this option.
Osborne Clarke comment
The fairly recent creation of a joint AI and IP ministerial brief recognises that the two fields are necessarily intertwined. But, with the government seeking to simultaneously position the UK as an AI leader and a friendly jurisdiction for AI development, and promote the growth of the UK's creative industries, clarity on that interrelationship is needed.
The fact that English law takes a less permissive approach to commercial TDM than some other jurisdictions makes for a contentious starting point.
Although agreement has not been reached on the voluntary code of practice, the government's approach still seems to be one of seeking collaboration and agreement between the different sides of the debate. Whether a workable and effective approach can be brokered remains to be seen.
The inability to reach an acceptable consensus might push the government to its fallback position of introducing legislation, although the indications are that any such intervention may be designed to improve transparency and cooperation between the AI and creative sectors rather than to impose a rigid legislative framework.
In the meantime, many rightsholders are considering whether to bring IP infringement claims in the UK against the developers of AI models that have been trained on IP-protected content.