
Smart Glasses in the Witness Box: A New Threat to Evidence Integrity
15th April 2026
The Civil Justice Council
The Civil Justice Council (CJC) is the body responsible for overseeing the civil justice system in England and Wales. In February 2026, it published an interim report and consultation paper examining whether new procedural rules are needed to govern the use of AI by legal representatives and expert witnesses in the preparation of court documents.
The consultation document covers a wide range of document types. These include statements of case, skeleton arguments, witness statements and, critically for those working in medico-legal practice, expert reports.
The paper draws heavily on the judgment of Dame Victoria Sharp, President of the King’s Bench Division, in Ayinde v The London Borough of Haringey [2025] EWHC 1383 (Admin). That case is a salutary reminder of the courts’ appetite for robust accountability when AI is used without appropriate professional oversight.
What the Consultation Proposes for Expert Reports
The CJC’s working group recognises that expert witnesses occupy a distinct position. They are subject to their own professional regulatory frameworks. They are also subject to specific duties under the Civil Procedure Rules, principally through Practice Direction 35.
The consultation notes that a 2025 Bond Solon survey of 525 expert witnesses found that 20% had already used AI in their role as an expert witness. That is a significant proportion. It points to rapid adoption, and to the likelihood that courts will increasingly encounter expert evidence that has been shaped, in part, by AI tools.
The CJC references the American case of Kohls v Ellison, decided in January 2025. In that case, an expert witness used AI to draft his declaration and inadvertently submitted fabricated material as a result. The CJC regards this as an example of a real and live risk.
The paper’s proposal for expert reports is targeted and considered. It does not seek to prohibit the use of AI. Instead, it proposes that Practice Direction 35, which governs the form of experts’ statements of truth, should be amended. The amendment would require an expert to explain what use of AI has been made in preparing the report, other than administrative functions such as transcription. The expert would also be required to identify the specific AI tools used.
The proposal draws a deliberate distinction between administrative uses of AI, which are regarded as unobjectionable, and generative uses, which require disclosure. Spell checking, grammar correction and transcription would not need to be declared. Using a large language model to draft sections of the report, analyse clinical literature or structure conclusions would require disclosure.
Why This Matters for Medico-Legal Practice
The implications for those preparing medico-legal reports are significant. The expert’s duty is to the court, not to the instructing party. That duty requires independence, accuracy and intellectual honesty. AI tools, however sophisticated, remain prone to what the CJC calls hallucination. This is the generation of plausible but fictitious information. In a clinical or legal context, a hallucinated reference to a research paper or clinical guideline could undermine an entire expert opinion.
There is also a subtler concern. The CJC notes that large language models reflect the errors and biases embedded in their training data. Those biases can enter court documents without the author being aware of them. For expert witnesses dealing with conditions where clinical evidence is contested or evolving, that risk is not theoretical.
A further consideration raised in the consultation is the risk to the level playing field between experts. If one party’s expert uses AI to synthesise literature and structure argument, while the opposing expert does not, and that use goes undisclosed, the court cannot properly assess the foundations of the evidence it is receiving. Transparency serves the interests of justice.
The Current Position and What to Do Now
The proposals remain consultative. No new rules have been introduced. Expert witnesses are not currently required by court rule to declare their use of AI. However, existing professional and ethical obligations already apply. Those obligations require honesty, accuracy and that the report represents the expert’s own independent views.
Expert witnesses who use AI tools in preparing their reports would be well advised to keep a clear record of how and where those tools were used. They should verify all clinical references independently. They should satisfy themselves that every conclusion in the report is their own. They should not present AI-generated material as their own analysis without proper review and ownership.
The consultation period represents an opportunity for medico-legal practitioners and their professional bodies to contribute to shaping the rules that will govern this area. The CJC has asked specifically whether the proposed amendment to Practice Direction 35 is appropriate. Responses from those with direct experience of the pressures and practicalities of expert witness work will be informative.