Generative AI Lies

Examples of generative AI making stuff up

Retrieval Augmented Generation

An article (from May 4) about feeding relevant information to a generative AI to reduce the likelihood of it making stuff up, an approach known as "Retrieval Augmented Generation," or "RAG."

“RAG can help reduce a model’s hallucinations — but it’s not the answer to all of AI’s hallucinatory problems. Beware of any vendor that tries to claim otherwise.”
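The idea is simple enough to sketch in a few lines. Below is a toy illustration, not anything from the article: the retriever is naive keyword overlap (real systems use embeddings and a vector store), and the final LLM call is left as a stand-in. The point is only to show the shape of RAG: fetch relevant text first, then ground the prompt in it.

```python
def retrieve(query, documents, k=1):
    """Rank documents by word overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Prepend retrieved context so the model answers from sources,
    reducing (not eliminating) the chance it makes something up."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "The warranty covers battery replacement for two years.",
    "Store hours are 9am to 5pm on weekdays.",
]
prompt = build_prompt("How long is the battery warranty?", docs)
# `prompt` would then be sent to the model; that call is omitted here.
```

Even in this toy version the limitation the article warns about is visible: if the retriever surfaces the wrong document, the model is grounded in irrelevant text and can still make things up.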

(Original Facebook post.)
