ADSX
AI FUNDAMENTALS

Hallucination

When AI generates false or fabricated information that appears convincing.

DEFINITION

What is Hallucination?

Hallucination occurs when an AI system generates information that is factually incorrect but presents it confidently. A model might 'hallucinate' details about your brand, such as products you don't sell, features you don't offer, or wrong company facts. Understanding hallucination helps contextualize AI visibility challenges and why it matters that AI has accurate information about your brand to draw from.

IN PRACTICE

We monitor for hallucinations about your brand and work to ensure AI has accurate information to draw from.
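At its simplest, monitoring for hallucinations means comparing the facts an AI states about a brand against a verified record. The fact sheet, keys, and helper below are hypothetical illustrations of that idea, not ADSX's actual tooling:

```python
# Minimal sketch: flag AI-stated brand facts that contradict a verified fact sheet.
# All names and data here are hypothetical examples.

VERIFIED_FACTS = {
    "founded": "2012",
    "headquarters": "Austin, TX",
    "free_tier": "no",
}

def find_hallucinations(ai_claims: dict[str, str]) -> list[str]:
    """Return the keys where the AI's claim contradicts the verified record."""
    mismatches = []
    for key, claimed in ai_claims.items():
        expected = VERIFIED_FACTS.get(key)
        if expected is not None and claimed != expected:
            mismatches.append(key)
    return mismatches

# An AI answer might claim a free tier and the wrong founding year:
claims = {"founded": "2015", "headquarters": "Austin, TX", "free_tier": "yes"}
print(find_hallucinations(claims))  # → ['founded', 'free_tier']
```

Real monitoring also has to extract claims from free-form AI answers, which is the harder part; this sketch only covers the comparison step.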

WHY IT MATTERS

Because of hallucination, AI may state incorrect things about your brand. Monitoring responses and optimizing the information available to models helps reduce inaccurate representations.

EXAMPLES
01 AI mentioning a product feature you don't have
02 Incorrect founding date or company details
03 Made-up customer testimonials or statistics

FREQUENTLY ASKED QUESTIONS

Why does AI hallucinate?

AI models generate responses by predicting likely patterns in text, which sometimes produces plausible-sounding but incorrect information. This is a known limitation that model developers are actively working to reduce.

Can I prevent hallucinations about my brand?

You can't prevent them entirely, but ensuring accurate, clear information is widely available helps reduce their likelihood.
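One established way to make accurate brand information widely and machine-readably available is schema.org structured data embedded in your site. The values below are hypothetical placeholders; `Organization`, `foundingDate`, and `sameAs` are real schema.org terms:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Brand Inc.",
  "url": "https://example.com",
  "foundingDate": "2012",
  "sameAs": [
    "https://en.wikipedia.org/wiki/Example_Brand",
    "https://www.linkedin.com/company/example-brand"
  ]
}
```

Embedding this as a JSON-LD `<script>` block gives crawlers and AI systems an unambiguous statement of core facts, rather than leaving them to infer details from prose.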

Ready to improve your AI visibility?

Get a free audit to see how your brand appears across ChatGPT, Claude, Perplexity, and other AI platforms.