Posted: 25/04/2025
In March 2025, Penningtons Manches Cooper hosted a panel discussion at the firm's London office, addressing the challenges posed by deepfake evidence in court proceedings. The event featured insights from an expert panel including Mr Justice Waksman, Tristan Jenkinson, James Evans, Ben Fearnley and Laura Wright.
The event was introduced and hosted by Penningtons Manches Cooper partner Charlotte Hill and began with a keynote speech from Mr Justice Waksman, who highlighted how deepfake technology threatens the authenticity of court documents. Mr Justice Waksman emphasised that, in an adversarial legal system such as that of England and Wales, it is generally the responsibility of the parties, rather than judges, to challenge document authenticity, since judges will not always have the detailed context needed to detect false evidence. He referenced CPR 32.19, noting that a party must formally challenge a document's authenticity in accordance with the Civil Procedure Rules or risk being deemed to have admitted it.
Mr Justice Waksman explained that this frequently requires parties to call forensic experts to support their arguments. Tristan Jenkinson and James Evans elaborated on the importance of instructing specialists with the appropriate forensic experience, as deepfake techniques differ significantly depending on the type of document involved.
During the event, a compelling demonstration illustrated how challenging it is to spot deepfake evidence. A video appeared to show Charlotte Hill presenting a montage of deepfake videos, audio clips and photos, before the video itself was revealed to be a deepfake.
Attendees were asked to participate in a poll to identify authentic versus deepfake examples, with the results suggesting that deepfakes are perhaps not (currently) too hard to spot.
Although the majority correctly identified the deepfakes, a large number did not. The panel stressed that such evidence can sometimes narrowly evade detection. James Evans described deepfake technology as being in its 'embryonic stages', but evolving rapidly and at a compounding rate.
The panel shared personal experiences of stumbling across deepfake evidence. Ben Fearnley recounted spotting a false bank statement date ('31st September') just before trial, while Tristan identified a suspicious bank email with mismatched language settings. Both examples highlighted that, despite sophisticated technology, simple checks on document properties can effectively expose fakes.
Laura Wright discussed the notable COPA v Wright trial, in which Dr Craig Wright claimed to be Satoshi Nakamoto, the creator of bitcoin, and was found to have forged at least 47 documents in the course of the trial.
The ease of creating deepfake evidence was also discussed. Tristan noted the low cost and accessibility of deepfake creation tools. James added that simply searching online reveals numerous easily accessible platforms for deepfake generation.
To guard against the risk of deepfake evidence, the panel advised proactive vigilance. James emphasised prevention, recommending thorough document reviews well ahead of trial. Mr Justice Waksman suggested examining document provenance closely, noting that documents introduced into evidence unilaterally are inherently riskier.
James highlighted technology solutions such as secure platforms, referencing his own experience in the family law sphere with OurFamilyWizard, which prevents document alterations by providing a secure platform for communication. Meanwhile, Ben reassured attendees that traditional legal skills, including attention to detail, familiarity with the case and instinctive judgement, remain critical defences against the risk of deepfakes. Lawyers should always scrutinise documents closely, questioning their coherence and authenticity and taking into account the wider circumstances of the case. In short, if it doesn't feel right, further investigation is advised.
Finally, James optimistically proposed that advancements in AI itself might in the future offer safeguards against deepfakes by enhancing detection capabilities.
This article was co-written by Puja Patel, a trainee solicitor in the commercial dispute resolution team.