Posted: 05/11/2025
The debate over AI training and copyright has reached a critical juncture. Getty Images v Stability AI clarified one technical point on the interpretation of 'article' for the purposes of secondary infringement, but otherwise a gaping hole in copyright law remains.
The case confirmed that, under the Copyright, Designs and Patents Act 1988 (CDPA), an 'article' can include an AI model. This interpretation matters because it means AI systems can, in principle, fall within the scope of the secondary infringement provisions, despite an AI model being a statistical matrix of probabilities stored in intangible form. Mrs Justice Joanna Smith found that the definition of 'article' did not need to be tethered to a specific physical embodiment, given the ever-changing nature of technology.
However, the judgment also made something crystal clear: model weights - the billions of parameters that define an AI system - do not contain or store infringing copies of copyright works. They are statistical representations, not reproductions. The court found that while training involves exposure to copyrighted material, the resulting model does not embed those works in any fixed form.
In short, this distinction is pivotal. It means that, under current law, importing or deploying a trained AI model in the UK cannot be treated as importing an infringing copy - because the model does not store any copyright works.
The CDPA was drafted for a world of tangible copies - books, CDs, DVDs - not for seemingly intangible statistical matrices powering generative AI.
Today, AI systems learn from creative works without ever 'fixing' them in a legally recognisable form. Under the CDPA, fixation is required for a work to qualify for copyright protection, but infringement can occur even through transient copies. In the AI context, while trained model weights are not fixed reproductions of works, the act of making temporary copies during training may still constitute copying. This creates a legal grey area: current law does not clearly address whether such ephemeral uses fall within existing exceptions or require licensing. That leaves creators without clear remedies and developers operating in a compliance vacuum.
History shows that copyright law adapts to technological disruption. When peer-to-peer file sharing exploded in the early 2000s, the law evolved to address digital copying and distribution. AI represents a similar inflection point. Just as Napster forced lawmakers to rethink reproduction and communication rights, generative AI demands a recalibration of what 'copying' means in a world of probabilistic models.
The UK cannot afford legislative inertia. As Mrs Justice Joanna Smith said:
'Both sides emphasise the significance of this case to the different industries they represent: the creative industry on one side and the AI industry and innovators on the other. Where the balance should be struck between the interests of these opposing factions is of very real societal importance … the case remains that if creative industries are exploited by innovators such as Stability AI without regard to the efforts and intellectual property rights of creators, then such exploitation will pose an existential threat to those creative industries for generations to come.'
To protect creators while enabling innovation, the CDPA should be updated to reflect the realities of AI development.
The Getty judgment should not be seen as closing the debate - it should be the starting gun for reform. The CDPA has evolved before; it must evolve again. Without legislative clarity, the UK risks undermining both its creative industries and its ambition to lead in AI innovation. The question is not whether to act, but how quickly.