Deepfakes, falsified evidence and family law – what lawyers need to know

Seeing is not always believing. Advances in technology mean that audio, video, and documents can be fabricated with alarming ease. For family lawyers, this raises critical questions: how do we spot falsified evidence, and what are our duties when it appears in a case?

A recent high-profile example underscores the scale of the problem: computer scientist Craig Wright, who repeatedly and falsely claimed to be Bitcoin’s creator, was found to have falsified hundreds of documents in litigation to support his assertions (Crypto Open Patent Alliance v Wright; Wright and other companies v BTC Core and others [2024] EWHC 1198 (Ch)). The judgment illustrates the scale of dishonesty possible with digital evidence and the importance of rigorous scrutiny.

Why this matters

Forgery is as old as the law, but AI has changed the game. Tools that create ‘deepfakes’ – realistic but fake media – are now widely accessible and often free. We are moving into an era in which falsified evidence can potentially be created by anyone, on any device.

In family disputes, where emotions run high, we as lawyers deploy evidence to persuade judges to make decisions that have profound implications for our clients and their families. We need to do so with confidence, with vigilance, and with our duties as officers of the court in mind.

It is essential to recognise how readily available these tools have become. Falsified material can appear in many forms within a case, and we must be able to identify the key indicators of falsification.

With increased accessibility, fabricated material is becoming more common. It may be provided by our own clients to strengthen their position or presented by opposing parties to support or deny allegations. In Mr Wright’s case, the motivation for forgery was financial; in family law, motivations are often even stronger and emotionally driven, particularly when child access is at stake.

Spotting red flags

Each item of evidence, whether from clients or from others, must be assessed on its individual merits and examined within the broader context of the case to determine its credibility.

Questions to consider when evaluating digital evidence include:

  • Was it filed at the last minute? Exercise extra caution with last-minute filings.
  • Does it dramatically alter the case? Look out for the ‘smoking gun’ piece of evidence.
  • Was pivotal evidence first referenced late in proceedings? Be wary of material that surfaces only once the issues have crystallised.
  • Are formats non-native (eg PDFs, scans or printed copies instead of source files)?
  • Why is the original not available? Is there a convoluted story as to why, and is it believable?

Some key definitions

Deepfake – Thought to derive from a combination of ‘deep learning’ (a form of AI) and ‘fake’, the term typically describes falsified media that has been generated or altered using some form of AI (machine learning, generative AI etc). It most commonly relates to audio and video content, where technology is used to ‘map’ someone’s voice or image onto the media. Although the term is now used mainly for audio and video, it originated with images and the use of face-swapping technology.

Shallowfake – Sometimes used to refer to evidence that has been falsified or manipulated without the use of AI, such as edited PDFs or emails.

Cheapfake – A term sometimes used to describe particularly poor fakes, often generated using free online falsification tools, that stand up to very little scrutiny.

Falsified evidence – Any evidence that has been falsified in some way, regardless of how it was produced.

When to call an expert

When receiving data from another party, always keep an untouched original copy. If appropriate, consider having a forensic expert preserve it. Never open the original file – only work from a duplicate – because even opening a file can alter its metadata, which may be critical for detecting falsification.
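By way of illustration, the integrity of a working copy can be checked against the preserved original using a cryptographic hash: if both files produce the same hash, the copy is byte-for-byte identical. The Python sketch below is illustrative only (the file paths are hypothetical); in practice a forensic expert would hash from a write-blocked image, since even reading a file directly can update its access timestamp.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 65536) -> str:
    """Return the SHA-256 hash of a file, read in chunks to handle large media."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: the preserved original and the working duplicate.
original = Path("evidence/preserved/message.mp4")
duplicate = Path("evidence/working/message.mp4")

if sha256_of(original) == sha256_of(duplicate):
    print("Hashes match: the working copy is byte-for-byte identical.")
else:
    print("Hashes differ: the working copy has been altered.")
```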

When reviewing evidence, look for signs of inconsistency or irregularity. Key checks include:

  • Content – Does the document align with other evidence? Are details like headers, logos, and formatting appropriate for its date?
  • Dates – Watch for impossible or unusual dates (eg 31 September) or actions on weekends/bank holidays.
  • Comparison – Does it match legitimate files from the same period?
  • Metadata – Review properties (eg created/modified dates) for anomalies; a simple illustrative check is sketched after this list.
  • Media files – Ask how and where the file was created, and confirm location and device details (format, aspect ratio, resolution) for consistency.
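To make the ‘Dates’ and ‘Metadata’ checks concrete, here is a minimal, illustrative Python sketch (not a forensic tool; the file path is hypothetical). It reports the filesystem modification timestamp, notes whether it falls on a weekend, and flags impossible calendar dates such as 31 September:

```python
import datetime
import re
from pathlib import Path

MONTHS = {m: i for i, m in enumerate(
    ["January", "February", "March", "April", "May", "June", "July",
     "August", "September", "October", "November", "December"], start=1)}

def find_impossible_dates(text: str) -> list[str]:
    """Flag written dates like '31 September 2023' that cannot exist."""
    pattern = re.compile(r"\b(\d{1,2})\s+(" + "|".join(MONTHS) + r")\s+(\d{4})\b")
    flagged = []
    for day, month, year in pattern.findall(text):
        try:
            datetime.date(int(year), MONTHS[month], int(day))
        except ValueError:
            flagged.append(f"{day} {month} {year}")
    return flagged

# Hypothetical working copy (never inspect the preserved original directly).
path = Path("evidence/working/statement.txt")
mtime = datetime.datetime.fromtimestamp(path.stat().st_mtime)
print("Last modified:", mtime, "(weekend)" if mtime.weekday() >= 5 else "")
print("Impossible dates found:", find_impossible_dates(path.read_text()))
```

A clean result from checks like these proves nothing on its own; their value is in surfacing anomalies that justify instructing a forensic expert.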

Take care when checking for inconsistencies, especially before making any allegations. If initial checks raise concerns, consider involving a digital forensic expert early. They can uncover hidden timestamps, structural anomalies and device inconsistencies. While cost and proportionality matter, expert input may be decisive where falsification could change the outcome.

The bottom line

AI-driven falsification is here to stay. Family lawyers must stay alert, understand the technology, and know when to seek expert help.

Our authors recently explored these issues in depth in an article for the LexisNexis Family Law Journal, which considers the legal framework, practical steps and emerging regulation. Read the full article here. Please do not hesitate to contact us if we can help further with these issues in your family law matter.

This article was co-written by Dr Tristan Jenkinson, Sky Discovery.
