Hallucination in AI refers to when a model generates confident, plausible-sounding information that is factually incorrect or entirely fabricated. In coding, this means AI might invent non-existent APIs, suggest deprecated methods, or create function signatures that don't match the actual library.
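To make that concrete, here is a minimal sketch in Python using the real `requests` library. The commented-out call is the kind of plausible-looking method an AI might confidently invent (`requests.get_json` does not exist); the lines below it show the actual API:

```python
import requests

url = "https://httpbin.org/json"  # any JSON endpoint works for this illustration

# Hallucinated version: looks plausible, but requests has no get_json() function.
# data = requests.get_json(url)  # AttributeError: module 'requests' has no attribute 'get_json'

# Real API: requests.get() returns a Response object; .json() parses the body.
response = requests.get(url, timeout=10)
data = response.json()
print(type(data))  # dict
```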
Hallucination is perhaps the most important concept for vibe coders to understand. AI doesn't "know" things the way humans do — it predicts plausible outputs, which sometimes means inventing convincing fiction.
AI models are pattern matchers, not fact databases: they generate the most statistically likely continuation of your prompt, and a confident-sounding answer is often more likely than an honest "I don't know." A method name that merely looks right can be produced just as fluently as one that actually exists.
You can reduce the risk with a few habits:

- Verify critical code: run it, write a quick test, or check the official documentation before trusting an API the AI suggests, especially anything touching security, payments, or data handling (see the sketch after this list).
- Prompt for uncertainty: ask the AI to say when it is unsure, to state which library version it is assuming, or to flag methods it cannot confirm exist.
- Recognize warning signs: suspiciously convenient functions that do exactly what you asked, unfamiliar parameter names, or imports you have never seen in that library's docs all deserve a second look.
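As one way to make "verify critical code" concrete, here is a minimal sketch using only the Python standard library. It checks whether an AI-suggested attribute actually exists in a module and prints its real signature before you rely on it; the module and attribute names in the example calls are just illustrations:

```python
import importlib
import inspect

def check_suggestion(module_name: str, attr_name: str) -> None:
    """Check that an AI-suggested attribute exists and show its real signature."""
    module = importlib.import_module(module_name)
    attr = getattr(module, attr_name, None)
    if attr is None:
        print(f"{module_name}.{attr_name} does not exist -- likely a hallucination")
        return
    if callable(attr):
        try:
            print(f"{module_name}.{attr_name}{inspect.signature(attr)}")
        except (TypeError, ValueError):
            print(f"{module_name}.{attr_name} exists, but its signature is not introspectable")
    else:
        print(f"{module_name}.{attr_name} exists (not callable)")

# The first call names a real function; the second is the kind of name an AI might invent.
check_suggestion("json", "dumps")      # prints the real signature of json.dumps
check_suggestion("json", "dump_json")  # reports that it does not exist
```

A check like this only confirms that a name exists and what parameters it takes; it says nothing about whether the code is logically correct, so it complements rather than replaces tests and documentation.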
Not every error is hallucination. AI also makes genuine mistakes in logic, just like human developers. Hallucination specifically refers to fabricated information presented as fact.