AI Hallucination
Definition: An AI hallucination occurs when an artificial intelligence system confidently presents fabricated or incorrect information as fact. The output looks believable, but there is no real source behind it.
Example
A lawyer asks an AI tool to find case law for a brief, and it lists court decisions that do not actually exist. The AI is not lying on purpose; it is generating information that sounds real because it was trained to predict text patterns, not verify facts.
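For readers curious about why this happens, here is a minimal toy sketch in Python. It is not how any real language model works internally; it only illustrates the core point that text assembled from plausible patterns can look authoritative without any step that checks reality. The names and data below are invented for illustration (though "Varghese v. China Southern Airlines" echoes one of the nonexistent cases actually cited in Mata v. Avianca).

```python
import random

# Toy illustration only: fragments shaped like real citations,
# assembled purely by pattern. Nothing here consults any court record.
PLAINTIFFS = ["Martinez", "Holloway", "Varghese"]            # invented examples
DEFENDANTS = ["China Southern Airlines", "Acme Corp.", "Delta"]
REPORTERS = ["F.3d", "F. Supp. 2d", "N.E.2d"]

def generate_citation() -> str:
    """Produce text that *looks* like a case citation.

    Every component is sampled from familiar patterns, loosely
    analogous to a model predicting plausible next tokens. No step
    verifies that the resulting case exists.
    """
    return (f"{random.choice(PLAINTIFFS)} v. {random.choice(DEFENDANTS)}, "
            f"{random.randint(1, 999)} {random.choice(REPORTERS)} "
            f"{random.randint(1, 1500)} ({random.randint(1990, 2023)})")

print(generate_citation())  # plausible-looking, but almost certainly not a real case
```

The takeaway: fluent, correctly formatted output is no evidence of accuracy, because plausibility and truth are produced by entirely separate processes.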
Why It Matters
AI hallucinations can cause serious problems wherever accuracy is critical, and law is a prime example. False case citations, invented facts, or flawed legal reasoning can harm clients, waste time, and even lead to court sanctions. Lawyers using AI must verify everything it produces against primary sources to confirm it is real and reliable.
Learn more: When AI Hallucinations Hit the Courtroom: How Mata v. Avianca Changed Legal Practice
