Generative AI Lies

Examples of generative AI making stuff up

Made-up journals

Scientific American says:

OpenAI’s ChatGPT, Google’s Gemini, Microsoft’s Copilot and other models are befuddling students, researchers and archivists by generating “incorrect or fabricated archival references,”

Which is a problem for librarians:

who end up wasting their time looking for requested nonexistent records, says Library of Virginia chief of researcher engagement Sarah Falls. Her library estimates that 15 percent of emailed reference questions it receives are now ChatGPT-generated, and some include hallucinated citations for both published works and unique primary source documents. “For our staff, it is much harder to prove that a unique record doesn’t exist,” she says.

I kinda want to call these things “hallucitations.”
