Hallucination
When an AI model generates plausible but factually incorrect output. A key challenge of LLMs, addressed through retrieval-augmented generation (RAG), ground truth data, and verification.
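One of the verification approaches mentioned above can be illustrated with a deliberately simple sketch: checking how much of a model's answer is supported by retrieved source text. The function name `is_grounded` and the word-overlap heuristic are hypothetical illustrations, not a production method; real systems use far more robust fact-checking.

```python
def is_grounded(answer: str, sources: list[str], threshold: float = 0.9) -> bool:
    """Naive verification: fraction of answer words that appear in the
    retrieved sources. The threshold is an arbitrary illustrative value."""
    answer_words = set(answer.lower().split())
    source_words = set(" ".join(sources).lower().split())
    if not answer_words:
        return False
    overlap = len(answer_words & source_words) / len(answer_words)
    return overlap >= threshold

# Retrieved ground-truth passage (RAG context)
sources = ["The Eiffel Tower is in Paris and was completed in 1889."]

print(is_grounded("The Eiffel Tower is in Paris", sources))   # supported claim
print(is_grounded("The Eiffel Tower is in Berlin", sources))  # unsupported claim
```

A word-overlap check like this misses paraphrases and can be fooled by negation; it only conveys the idea that grounding model output against retrieved evidence gives a signal for flagging possible hallucinations.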