Lawyer relies on AI app ChatGPT to cite cases in federal court, but they turn out to be fake

by | May 31, 2023 | Firm News, Minnesota News |

Artificial intelligence may be the new thing, but as one lawyer found out, you need to verify what it says or writes; otherwise, proceed at your own peril. In this case the lawyer did not verify, and he found out the hard way that the AI app ChatGPT had provided him with fake cases for a brief he submitted to a federal court.

What happened

A lawyer in federal court used the AI app ChatGPT to generate cases for a motion in a personal injury suit. When the judge reviewed the motion, he called into question six of the cases generated by ChatGPT, deeming them bogus and of no precedential value. Rather than verifying the cases at that point, the lawyer allegedly again failed to check them and instead responded by filing an affidavit with the court attesting that he had asked ChatGPT whether one of the cases the judge flagged was real or bogus, and that ChatGPT responded the case was real. He also attested that he asked ChatGPT whether the other five cases were real, and the app indicated they were all real and could be found in the Westlaw and LexisNexis databases. When the lawyer finally did his own verification of the cases, he was forced to file another affidavit with the court acknowledging that his source for the cited cases, ChatGPT, was unreliable.

What is Next

The judge has scheduled a hearing in the next couple of weeks to consider possible sanctions against the attorney and possible dismissal of the lawsuit. While the case may survive dismissal, the attorney in question will likely face sanctions for his conduct. The situation is a cautionary tale for lawyers about the need to always verify cases cited in court filings. While AI tools may be of assistance in the legal arena and should not automatically be ignored, lawyers should not rely on them completely; they must do their own work to verify any cases the app generates.
