News and Publications

Seeing isn’t always believing - navigating the challenges of falsified evidence

Posted: 25/04/2025


In March 2025, Penningtons Manches Cooper hosted a panel discussion at the firm's London office, addressing the challenges posed by deepfake evidence in court proceedings. The event featured insights from an expert panel including:

  • Mr Justice David Waksman (High Court judge and Judge in Charge of the Technology and Construction Court)
  • Laura Wright (barrister at 4 Pump Court)
  • Ben Fearnley (barrister at 29 Bedford Row)
  • James Evans (head of UK professional education, OurFamilyWizard)
  • Tristan Jenkinson (owner and author, eDiscovery Channel)

Summary of the event

The event was introduced and hosted by Penningtons Manches Cooper partner Charlotte Hill, beginning with a keynote speech from Mr Justice Waksman who highlighted how deepfake technology threatens the authenticity of court documents. Mr Justice Waksman emphasised that, in an adversarial legal system such as in England and Wales, it is generally the responsibility of the parties to challenge document authenticity, rather than judges who might not always have the detailed context to allow them to detect false evidence. He referenced CPR 32.19, noting the necessity for parties to formally challenge document authenticity in accordance with the Court Rules, or risk implicit acceptance.

Mr Justice Waksman explained that this situation frequently requires parties to call forensic experts to support their arguments. Tristan Jenkinson and James Evans elaborated on the importance of hiring specialists with the appropriate forensic experience as deepfake techniques differ significantly depending on the type of document involved.

During the event, a compelling demonstration illustrated how challenging it is to spot deepfake evidence. A video showed Charlotte Hill seemingly presenting a montage of deepfake videos, audio clips, and photos, which was then revealed to be a deepfake.


Attendees were asked to participate in a poll to identify authentic versus deepfake examples, with the results demonstrating that perhaps deepfakes are not (currently) too hard to spot:

  • text message: 58% correctly identified the deepfake;
  • photo of Charlotte Hill: 62% correctly identified the deepfake;
  • voice note audio file: 61% correctly identified the deepfake.

Although the majority correctly identified the deepfakes, a significant minority did not. The panel stressed that such evidence can sometimes evade detection altogether. James Evans described deepfake technology as being in its 'embryonic stages', but evolving rapidly and at a compounding rate.

The panel shared personal experiences of stumbling across falsified evidence. Ben Fearnley recounted spotting an impossible date ('31st September') on a bank statement just before trial, while Tristan identified a suspicious bank email with mismatched language settings. Both examples highlighted that, despite sophisticated technology, simple checks on document properties can effectively expose fakes.

Laura Wright discussed the notable COPA v Wright trial, in which Dr Craig Wright claimed to be Satoshi Nakamoto, the creator of bitcoin, and was found to have forged at least 47 documents relied on at trial.

The ease of creating deepfake evidence was also discussed. Tristan noted the low cost and accessibility of deepfake creation tools. James added that simply searching online reveals numerous easily accessible platforms for deepfake generation.

To guard against the risk of deepfake evidence, the panel advised proactive vigilance. James emphasised prevention, recommending thorough document reviews well ahead of trial. Mr Justice Waksman suggested examining document provenance closely, noting that documents introduced into evidence unilaterally are inherently riskier.

James highlighted technology solutions like secure platforms, referencing his own experience in the family law sphere with OurFamilyWizard, which prevents document alterations by providing a secure platform for communication. Meanwhile, Ben reassured attendees that traditional legal skills, including attention to detail, familiarity with the case, and instinctual judgement, remain critical defences to guard against the risk of deepfakes. Lawyers should always scrutinise documents closely, questioning their coherence and authenticity, and taking into account the wider circumstances of the case. In short, if it doesn’t feel right, further investigation is advised.

Finally, James optimistically proposed that advancements in AI itself might in the future offer safeguards against deepfakes by enhancing detection capabilities.

Key takeaways:

  • stay vigilant against the possibility of deepfake evidence, particularly in proceedings where emotions, and therefore the temptation to fabricate evidence, run high (eg acrimonious divorce proceedings/child arrangement battles);
  • deploy the professional skills that lawyers have always used when reviewing documents and ascertaining potentially suspicious evidence, questioning the source, journey and content of documents;
  • consider deploying technological assistance to guard against deepfake evidence, including the use of secure platforms;
  • if it appears that a document could be a deepfake, appropriate specialists should be consulted to validate the document well in advance of trial and in accordance with Court Rules.

This article was co-written by Puja Patel, trainee solicitor in the commercial dispute resolution team.



Penningtons Manches Cooper LLP

Penningtons Manches Cooper LLP is a limited liability partnership registered in England and Wales with registered number OC311575 and is authorised and regulated by the Solicitors Regulation Authority under number 419867.
