Nice one!
But sometimes, even with a good prompt, the AI starts hallucinating. Why?
Even with a solid prompt, AI can still hallucinate because it doesn't actually "know" things: it predicts which words are likely to come next based on patterns in its training data. The model is optimized for plausibility, not truth, so if a topic is unfamiliar, missing from its training data, or too vaguely specified, it fills in the blanks with made-up details that sound convincing. It's like a super-confident storyteller who sometimes forgets what's real and what's imagined.
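To make that concrete, here's a toy sketch (not a real LLM, just an illustration of the idea) of a "model" that only knows word-following statistics from a tiny training text. It always answers confidently, even for input it has never seen, which is roughly where hallucination comes from:

```python
import random
from collections import defaultdict, Counter

# Tiny "training corpus": the model learns nothing but which word follows which.
training_text = "the capital of france is paris . the capital of italy is rome ."
words = training_text.split()

# Count next-word frequencies, pair by pair.
follows = defaultdict(Counter)
for a, b in zip(words, words[1:]):
    follows[a][b] += 1

def predict_next(word):
    """Return the most frequent continuation seen in training.

    Note it never says "I don't know": for an unfamiliar word it just
    picks something fluent-sounding, i.e. it confidently makes it up.
    """
    if word not in follows:
        return random.choice(words)  # ungrounded but confident guess
    return follows[word].most_common(1)[0][0]

# Familiar pattern: the statistics happen to match reality.
print(predict_next("france"))    # → "is"
# Unfamiliar word: still answers, just with no grounding in fact.
print(predict_next("atlantis"))
```

The point is that both calls return with equal "confidence"; nothing in the mechanism distinguishes a statistically supported answer from an invented one.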