• ImplyingImplications@lemmy.ca
    13 hours ago

    It could hallucinate a citation that never even existed as a fictional case

    That’s what happened in this case reviewed by Legal Eagle.

    The lawyer submitted a brief citing cases that the judge could not find. The judge requested paper copies of the cases, and that's when the lawyer handed over some dubious documents. The judge then called the lawyer into court to explain why he had submitted fraudulent cases and why he shouldn't have his law licence revoked. The lawyer fessed up that he had asked ChatGPT to write the brief and hadn't checked the citations. When the judge asked for the cases, the lawyer went back to ChatGPT for them, and it generated the cases…but they were clearly not real. So much so that the defendants' names changed throughout each case, the judges who supposedly ruled on them were from the wrong districts, and they were all about a page long, when real case rulings tend to run dozens of pages.