New York Lawyer Faces Discipline for Citing Non-Existent AI-Generated Case

A New York lawyer is facing potential disciplinary action after referencing a non-existent case generated by artificial intelligence (AI). This incident highlights the challenges attorneys and courts face in navigating emerging technologies like AI.

( Image credit: Reuters )

The 2nd U.S. Circuit Court of Appeals has referred attorney Jae Lee to its grievance panel over her use of OpenAI’s ChatGPT in a medical malpractice appeal. Lee failed to verify that the case she cited actually existed, leading the court to conclude that her conduct “falls well below the basic obligations of counsel.”

Lee, a lawyer at JSL Law Offices, P.C., a small New York firm, expressed surprise at the disciplinary referral and said she is committed to upholding the highest professional standards. In her appeal to revive her client’s lawsuit against a Queens doctor, she cited a non-existent state court decision concerning an allegedly botched abortion.

When the court requested a copy of the cited decision, Lee admitted that she was unable to provide it. She acknowledged using a case “suggested” by ChatGPT but denied any bad faith, willfulness, or prejudice towards the opposing party or the judicial system.

The order from the 2nd Circuit is the latest example of lawyers inadvertently including false case citations generated by AI tools in court filings. Generative AI programs can produce text that is convincing but incorrect, a phenomenon known as “hallucination.”

Similar incidents have occurred in other high-profile cases, including one involving Michael Cohen, the former lawyer and fixer for Donald Trump. Two New York lawyers were also sanctioned last year for submitting a brief with six fictitious citations, and in another case a Colorado lawyer was temporarily suspended from practicing law.

Recognizing the growing prevalence of AI tools in legal practice, an increasing number of judges and courts are issuing orders or considering new rules to govern their use by attorneys. The 2nd Circuit, however, concluded that no specific rule is needed to require attorneys to verify the accuracy of their submissions, since licensed attorneys should already understand that obligation.

Although the 2nd Circuit’s rules committee has discussed AI-related issues, no dedicated panel has been established to examine the matter. Other appeals courts, meanwhile, are forming committees to address the implications of AI for the legal profession.

The 2nd Circuit has referred Jae Lee to its grievance panel for further investigation and has ordered her to provide a copy of the ruling to her client. The underlying case, Park v. Kim, remains dismissed.

As AI technology continues to evolve and impact various industries, including the legal sector, it is crucial for lawyers and courts to adapt and develop guidelines for the responsible and accurate use of AI tools. This case serves as a reminder of the importance of diligence and verification when relying on AI-generated information in legal proceedings.
