What are hallucinations in LLMs?
LLM Engineer Interview Questions: LLM Evaluation, Hallucinations, Guardrails, Production Monitoring
Hallucinations are outputs where the model states something false with apparent confidence. They occur because LLMs are trained to generate statistically plausible text, not verified facts, so nothing grounds their output in truth. Hallucinations are especially common for facts the model never learned, niche topics, events after the training cutoff, and precise numerical details. They are the central reliability problem for LLM applications.
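The "grounding" idea can be illustrated with a deliberately naive check: flag an answer as likely hallucinated when too few of its content words appear in a retrieved source passage. This is only a sketch with made-up example strings; production systems use NLI models or claim-level fact checkers rather than word overlap.

```python
# Naive hallucination check: score an answer by how many of its
# content words are supported by the retrieved source text.
# Illustrative only; real guardrails use far stronger methods.

def grounding_score(answer: str, source: str) -> float:
    """Fraction of the answer's content words found in the source."""
    stop = {"the", "a", "an", "is", "are", "was", "were",
            "of", "in", "to", "and"}
    answer_words = {w.lower().strip(".,") for w in answer.split()} - stop
    source_words = {w.lower().strip(".,") for w in source.split()}
    if not answer_words:
        return 0.0
    return len(answer_words & source_words) / len(answer_words)

# Hypothetical retrieved context and two candidate answers.
source = "Marie Curie won Nobel Prizes in physics and chemistry."
grounded = "Curie won Nobel Prizes in physics and chemistry."
hallucinated = "Curie won a Nobel Prize in literature in 1950."

print(grounding_score(grounded, source))      # high overlap: grounded
print(grounding_score(hallucinated, source))  # low overlap: suspect
```

A fixed threshold on this score (say, 0.8) turns it into a crude guardrail, but word overlap misses paraphrases and cannot judge negations, which is why evaluation pipelines favor entailment-based checks.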