Things Are Getting Even Worse for Lawyer Who Used ChatGPT in Court

Crime and Punishment

A lawyer named Steven Schwartz is in even deeper trouble after using ChatGPT to research a brief. As should be expected at this point, the tool spat out a bunch of nonsense, leading Schwartz to cite several completely made-up court cases in the brief for his client, a man suing an airline over an alleged injury.

Schwartz appeared before a Manhattan judge today alongside his associate Peter LoDuca, the New York Times reports — and it’s not going well.

“I simply had no idea that ChatGPT was capable of fabricating entire case citations or judicial opinions, especially in a manner that appeared authentic,” Schwartz wrote in a declaration earlier this week, as quoted by the NYT.

“This has been deeply embarrassing on both a personal and professional level as these articles will be available for years to come,” he added.

Colossal Fumble

The details emerging from today’s court hearing are almost painful to read. The courtroom was packed with spectators, with some even “being sent to an overflow courtroom,” according to courthouse reporter Josh Russell.

During the hearing, Federal District Court Judge Kevin Castel brought up the AI-fabricated case “Varghese vs. China Southern Airlines Co. Ltd.” in his questioning.

“What did you think when you read this?” he asked LoDuca, as quoted by Russell. “It’s gibberish. Does that make any sense to you?”

Throughout the tongue-lashing, the two lawyers clearly didn’t have much to say, except to point out just how convincing all of…