Lawyer relied on ChatGPT for legal research, and the chatbot created fake cases

In a recent court case, a lawyer relied on ChatGPT for legal research, resulting in the submission of false information to the court. The incident sheds light on the potential dangers of artificial intelligence in the legal field.


The case revolved around a man suing an airline for a personal injury. The plaintiff's legal team filed a brief citing several previous court cases as precedents to support its arguments. However, the airline's lawyers discovered that some of the cited cases did not exist and immediately notified the presiding judge.

Judge Kevin Castel expressed his surprise at the situation, calling it an "unprecedented circumstance". In his order, the judge sought an explanation from the plaintiff's legal team.

Steven Schwartz, a colleague of the lead attorney, admitted to using ChatGPT to search for similar legal precedents. In a written statement, Schwartz expressed deep regret, saying he had "never previously used artificial intelligence for legal research and was not aware that its content could be false."

When questioned, ChatGPT had confirmed the authenticity of one of the cases, claiming it could be found in legal databases such as LexisNexis and Westlaw. However, subsequent investigation revealed that the case did not exist, casting doubt on the other cases ChatGPT had provided.

In light of this incident, both attorneys involved in the case, Peter LoDuca and Steven Schwartz of the law firm Levidow, Levidow & Oberman, have been summoned to a disciplinary hearing on June 8 to explain their actions.

This event has sparked much debate within the legal community about the appropriateness of using artificial intelligence tools in legal research and the need for comprehensive guidelines to prevent similar incidents.

Source: NYT


Written by giorgos
