That thing where lawyers (and others) use generative AI in court filings, and the AI makes stuff up? Now there’s a list of such situations: the AI Hallucination Cases database.
“This database tracks legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of arguments.”
“While seeking to be exhaustive (201 cases identified so far), it is a work in progress and will expand as new examples emerge.”