
Image by Beyond My Ken, from Wikimedia Commons
Self-Represented Litigant Uses AI Avatar, Sparks Courtroom Backlash
A man who appeared on a video screen to defend his case in a New York court turned out to be an AI-generated avatar instead of a real person.
In a rush? Here are the quick facts:
- A man used an AI avatar to argue his case in court.
- Judges quickly stopped the video after realizing it wasn’t a real person.
- The court was upset he hadn’t disclosed the use of an avatar.
As first reported by the AP, Jerome Dewald appeared before the New York State Supreme Court Appellate Division’s First Judicial Department to argue his employment dispute, in which he was representing himself, through a video submission.
“The appellant has submitted a video for his argument,” said Justice Sallie Manzanet-Daniels, as reported by the AP. “Ok. We will hear that video now,” she added.
The AP reported that on screen appeared a well-dressed, youthful figure who greeted the judges by saying, “May it please the court. I come here today a humble pro se before a panel of five distinguished justices.”
But the judge quickly paused. “Ok, hold on. Is that counsel for the case?” she asked. Dewald admitted: “I generated that. That’s not a real person.”
According to the AP, the video was immediately stopped. “It would have been nice to know that when you made your application. You did not tell me that sir,” Manzanet-Daniels said. “I don’t appreciate being misled,” she added.
Dewald later apologized, saying he meant no harm. Because he has no lawyer and struggles with public speaking, he explained, he used an AI avatar to present his case more clearly. He created the avatar with a tool from a San Francisco tech company but could not produce a version that resembled him in time for the hearing.
“The court was really upset about it,” Dewald said, as reported by the AP. “They chewed me up pretty good,” he added.
The AP notes that Dewald is not the first to run into trouble over AI in the courtroom. For example, the AP cites the case of two lawyers who were fined in 2023 after submitting a brief containing fictitious case citations generated by an AI chatbot.
The AP notes that even high-profile lawyers, like those for Michael Cohen, faced similar issues when they unknowingly cited AI-created rulings.
Moreover, recent reports show that AI’s hallucinations—errors and made-up information created by generative AI models—are causing legal problems in courts across the United States.
Legal technology expert Daniel Shin noted to the AP that while it’s rare for lawyers to use AI in court, it’s not surprising that a non-lawyer like Dewald would try such an approach.
Dewald’s case is still under review by the court.