Generative AI Lies

Examples of generative AI making stuff up

Category: Citations

  • More hallucitations

    AI Is Inventing Academic Papers That Don’t Exist — and They’re Being Cited in Real Journals

    Rolling Stone says:

    [Academic] articles which include references to nonexistent research material […] are themselves being cited in other papers, which effectively launders their erroneous citations. This leads to students and academics (and any large language models they may ask for help) identifying those “sources” as reliable without ever confirming their veracity. The more these false citations are unquestioningly repeated from one article to the next, the more the illusion of their authenticity is reinforced.


  • Made-up journals

    Scientific American says:

    OpenAI’s ChatGPT, Google’s Gemini, Microsoft’s Copilot and other models are befuddling students, researchers and archivists by generating “incorrect or fabricated archival references,”

    Which is a problem for librarians:

    who end up wasting their time looking for requested nonexistent records, says Library of Virginia chief of researcher engagement Sarah Falls. Her library estimates that 15 percent of emailed reference questions it receives are now ChatGPT-generated, and some include hallucinated citations for both published works and unique primary source documents. “For our staff, it is much harder to prove that a unique record doesn’t exist,” she says.

    I kinda want to call these things “hallucitations.”


  • Fake Guardian articles

    “ChatGPT is making up fake Guardian articles.”

    The Guardian says:

    “In response to being asked about articles on this subject, the AI had simply made some up. Its fluency, and the vast training data it is built on, meant that the existence of the invented piece even seemed believable to the person who [it was attributed to but who] absolutely hadn’t written it.”

    (Original Facebook post.)