Attorney Pleads for Mercy After Using AI in Court, Where It Made Up Fake Lawsuits

Few settings would seem worse suited for submitting AI-generated text than a court of law, where everything you say, write, and do is subjected to maximum scrutiny. And yet lawyers keep getting caught relying on crappy, hallucination-prone AI models anyway, usually to the chagrin of judge and client alike. After all the public shaming, you’d think they’d know better by now.

The latest high-profile instance, The Register reports, comes from a 2023 lawsuit filed against Walmart and Jetson Electric Bikes, in which the plaintiff alleged that a hoverboard sold by the two companies was responsible for a fire that burned down their home. These are serious claims. But the legal minds involved apparently took the easy route, to disastrous effect.

On Thursday, a federal judge in Wyoming asked the plaintiff’s lawyers to give him a good reason not to impose sanctions on them for citing nine totally made-up legal cases in the suit. And you guessed it: they were conjured up by a shoddy AI model.

The lawyers, from the firms Morgan & Morgan and Goody Law Group, withdrew the filing that contained the botched case law and ate humble pie in a follow-up one.

“Our internal artificial intelligence platform ‘hallucinated’ the cases in question while assisting our attorney in drafting the motion in limine,” they wrote, per The Register. “This matter comes with great embarrassment and has prompted discussion and action regarding the training, implementation, and future use of artificial intelligence within our firm.”

This is classic AI bullshittery…
