UK fails to agree AI/copyright code of practice

The UKIPO has failed to broker a deal between creative industry stakeholders and AI companies on the use of copyright-protected content to train AI models. Work began last summer on a voluntary code of practice for this purpose, but in its recent response to the AI White Paper consultation the UK Government has confirmed that, while the working group has provided a valuable forum for stakeholders to share views, it is now clear that the group will not be able to agree an effective voluntary code.

Background

Back in March 2023, Sir Patrick Vallance published his review on pro-innovation regulation for digital technologies. In it, he identified an urgent need to find practical solutions to the barriers faced by AI firms in accessing copyright and database right materials, and recommended drawing up a code of practice for this purpose.

In the UK, the current legal framework does not allow unauthorised copying of copyright-protected content for training AI models, save where it is done for purely non-commercial purposes. That means that if a commercial AI company wishes to copy third-party materials that are publicly available on the internet to train its models, it may need permission from each relevant rightsholder. In practice this can be difficult and cumbersome: rightsholders can be hard to identify, there is no streamlined licensing framework for this purpose, and machine learning systems need vast quantities of data to learn.

Other jurisdictions offer more flexibility. In the EU, for example, a text and data mining exception permits copying for this purpose unless the content owner has opted out. In the US, broader exceptions may apply in some circumstances, but this remains uncertain and is the subject of significant ongoing litigation between content creators and tech companies.

To address these known issues, the UK Government accepted Sir Patrick's recommendation for an AI/copyright code of practice. By June 2023, it had convened a working group of AI companies and creative industry stakeholders, with the aim of agreeing a voluntary code of practice on copyright and AI. The code was intended to make licences for text and data mining more readily available, in order to overcome some of the barriers that AI companies currently face in the UK, while respecting the UK copyright framework and ensuring that it continues to promote and reward investment in content creation. The Government also said that if the code was not adopted or agreement was not reached, legislation could be considered.

Failure to agree

On 6 February, the Government published its response to the consultation on the UK AI White Paper ("A pro-innovation approach to AI regulation"). In it, the Government confirmed that, while the working group had provided a valuable forum for stakeholders to share their views, "unfortunately, it is now clear that the working group will not be able to agree an effective voluntary code".

This is not a hugely surprising outcome: it is difficult to envisage a voluntary code that would effectively smooth the licensing environment while satisfying stakeholders on both sides of the debate. AI companies need quick and ideally continuous access to vast quantities of data, in a manner that does not build significant inefficiency or cost into their operating models, while content creators want meaningful compensation for the use of their works, plus sufficient transparency to enable clear monitoring of that use. A non-legislative initiative that resolves these competing considerations was always going to be a tall order.

What’s next?

The Government’s response paper is light on detail as to what happens next. It says that DSIT and DCMS ministers will now lead a period of engagement with the AI and rightsholder sectors, seeking to ensure the workability and effectiveness of an approach that allows the AI and creative sectors to grow together in partnership. It notes that this work will include exploring mechanisms for providing greater transparency, so that rightsholders can better understand whether content they produce is used as an input into AI models, and that it will soon set out further proposals for the way forward. Legislation in this area may therefore be back on the agenda.

Comment

While not hugely surprising, this development is a blow to AI developers and content creators alike. Under the current legal framework, AI developers are not legally able to copy third-party copyright works in the UK for the purpose of training their AI models, save in very limited circumstances. At the same time, content creators have limited ability to monitor and police the use of their works by AI developers. Some action is therefore required to strike a better balance of interests in the UK.

It is also a setback to the Government’s ambitions to be “a global AI superpower” and to take a leading role in developing a harmonised framework for the use of AI globally. Calls for action have also come from the House of Lords Communications and Digital Committee’s recent report on “Large language models (LLMs) and generative AI”, which identifies that the Government has a short window to steer the UK towards a positive outcome and emphasises that it has a “duty to act”. We therefore expect further Government action in this area sooner rather than later.