Generative AI Lies

Examples of generative AI making stuff up

Category: Microsoft

  • European elections
    Researchers say Bing made up facts about European elections

    “Human rights organization AlgorithmWatch said in a report that it asked Bing Chat—recently rebranded as Copilot—questions about recent elections held in Switzerland and the German states of Bavaria and Hesse. It found that one-third of its answers to election-related questions had factual errors”

    (Article from Dec. 15.)

    (Original Facebook post.)


  • Wrong phone prices
    That thing we’ve been talking about lately, where an AI chat system gets incorporated into a search engine and then gives made-up answers to questions?

    Here’s a real example. Microsoft is now including ChatGPT (or some variation on it) as part of Bing, so Twitter user @GaelBreton tried doing some searches with it. They posted a brief thread that’s mostly about other aspects of the experience, but the part that interested me most is the final tweet, which shows a screenshot of Bing/GPT answering a question about phones. It gives significantly wrong prices or specs for all three of the phones it mentions.

    So I ask again, as I’m sure I’ll ask many times in the future: what good is a conversational AI interface for search results if it provides false answers?

    (Original Facebook post.)