Lawyers used ChatGPT in court, and it backfired spectacularly. Now they must pay a fine and live with the embarrassment

Steven Schwartz and Peter LoDuca must pay a fine for relying on ChatGPT, which fabricated the rulings they later presented in court. They managed to avoid disciplinary action, which, according to an expert, is thanks to the media publicity the case attracted.

The court imposed a $5,000 fine on the lawyers representing Roberto Mata, a man who sued Avianca after a snack cart injured him on board a plane. Steven Schwartz, a lawyer with 30 years of experience, used ChatGPT while preparing for the case. The application, however, fabricated the judges’ rulings (it “hallucinated,” as AI specialists say) that the lawyer later cited during the trial.

In addition to the fine, the judge ordered Schwartz and Peter LoDuca (the other attorney representing Mata) to send letters to the judges whose rulings ChatGPT had allegedly forged and which were used in the trial.

ChatGPT made things up for the lawyers

The judge wrote in his opinion that the lawyers had violated the fundamental principles of the US legal system. He stressed that citing false judgments causes serious harm: the opposing side must waste time and money exposing the ruse, and the courts are distracted from other important matters.

Moreover, according to the judge, this reinforces a cynical attitude towards the legal profession and the system as a whole. This, in turn, may create a temptation to question judicial decisions.

The judge chose not to force the lawyers to apologize to the judges whose opinions had been falsified. He decided that an insincere apology would mean nothing and left the matter to Schwartz and LoDuca.

The AI was supposed to help with the case; instead, it hallucinated

How did Steven Schwartz get ‘scammed’ by ChatGPT? While preparing the case concerning his client’s accident, the lawyer decided to make his life easier and asked the AI to find similar cases of passengers who had sued airlines.

ChatGPT appeared to do a great job. It found many other cases in which carriers from around the world had been sued, including China Southern Airlines, KLM, and United Airlines. Schwartz trusted the AI and decided to use these cases in court to support his arguments. The problem is that every one of them was completely made up by ChatGPT.

The lawsuit was quickly dismissed because neither Avianca’s defense lawyers nor the trial judge himself could find any of the cases that Schwartz had cited. The lawyer was summoned for questioning, and during his sworn testimony he admitted that he had not searched for the cases himself but had asked ChatGPT to do so.

He explained that he had no idea the popular chatbot could give him false information. What’s more, as proof, he presented screenshots showing the course of his conversation with the program. He added that in one instance he had asked the AI whether the case was real. The bot confirmed that it was.

The case was later examined by a three-judge panel of the Court of Appeal. In their opinion, the judgments presented by Schwartz, which, as it later turned out, had been fabricated by ChatGPT, contained errors in reasoning and style that do not usually appear in legal documents.

“His legal analysis is gibberish. A summary of the procedural history of the cases is hard to trace and borders on nonsense,” the judges concluded.

The cases of Schwartz and LoDuca were not referred to disciplinary proceedings, although the responsible authorities may still decide to open them on their own. According to Stephen Giller, a professor of legal ethics at New York University School of Law, the attorneys avoided a worse fate thanks to the publicity the case gained. In his opinion, both lawyers will forever be known as the ones who were deceived by ChatGPT.

Source: Gazeta
