In Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal, the Pietermaritzburg High Court dealt with the consequences of legal practitioners submitting false case citations, most likely generated by artificial intelligence (AI), in their court filings.

Who should care about this judgment and why?

  • Legal practitioners using AI for legal research should take note, as the case highlights the dangers of relying on AI-generated content without proper checks.
  • Judges, magistrates and court officials need to be aware that AI-generated material can introduce errors into pleadings, judgments or arguments.
  • Individuals whose rights may be affected by AI-based decisions should pay attention, especially where AI tools are used in legal or administrative decision-making.

What could you do about it?

  • Always verify AI-generated legal research before using it in court or administrative processes.
  • Put measures in place to identify AI hallucinations in legal arguments, pleadings or official documents.
  • Train legal and government professionals on the risks, limitations and ethical issues related to the use of AI in their work.

Our insights on the judgment

This judgment is a warning to legal professionals about the uncritical use of AI tools in legal research and drafting. It shows why thorough fact-checking remains essential to maintaining professional standards and upholding justice, whatever the source of the information.

The case demonstrates that AI tools can “hallucinate” – producing case law, statutes or arguments that are entirely fictional. As the legal and administrative sectors increasingly adopt AI, this decision illustrates the importance of closely reviewing AI-generated content before allowing it to influence official decisions.

It also reminds legal practitioners of their ethical obligations to the court and their clients, and the need to apply sound professional judgment when performing their duties.

Digest

Facts and background

Mavundla challenged a decision made by the Department of Co-Operative Government and Traditional Affairs (COGTA) in KwaZulu-Natal relating to a traditional leadership dispute.

His legal representatives filed a supplementary notice of appeal, citing several authorities. Upon examination, the court found that many of the cited cases did not exist in any recognised legal database.

The judge conducted an independent check using ChatGPT, an AI chatbot, to verify one of the citations. The chatbot incorrectly confirmed the case’s existence and details, illustrating how unreliable such tools can be for legal research. The court found that the legal team had likely relied on AI-generated material without verifying its accuracy, resulting in fictitious case references being submitted.

These fictitious references raised serious concerns about the reliability of legal work produced with the help of AI.

Reasoning

The court stressed that legal practitioners have an ethical duty not to mislead the court, whether intentionally or through negligence. It held that ignorance of AI’s risks does not excuse a breach of this duty.

Submitting false or non-existent legal authorities misrepresents the law and breaches professional standards. The court also pointed out that legal practitioners must properly supervise their staff and ensure that all information – including content sourced from AI – is accurate and reliable.

Order

  • The court dismissed Mr Mavundla’s application for leave to appeal, finding no reasonable prospects of success. It criticised the submissions as flawed and unprofessional.
  • The court ordered Surendra Singh and Associates, the law firm representing Mr Mavundla, to pay the costs related to additional court appearances on specified dates. This order signalled the court’s disapproval of the firm’s conduct in submitting unverified and fictitious legal citations.
  • The registrar was instructed to forward a copy of the judgment to the Legal Practice Council (KwaZulu-Natal Provincial Office) for investigation and possible disciplinary action against the legal practitioners involved. This step reflects the court’s serious concern over the misuse of AI-generated content and its impact on professional integrity in legal proceedings.

Details of Mavundla v MEC: COGTA

  • Universal citation: [2025] ZAKZPHC 2
  • Case number: 7940/2024P
  • Full name: Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others