
AI Hallucination Lesson

Learn to spot when AI makes things up

🤖

What is an AI Hallucination?

When AI confidently makes things up

What should I expect on my first day at work?

Did you spot the problem? AI made a confident claim without knowing your situation.

AI Predicts Patterns, Not Truth

Understanding how AI really works

Patterns

AI learns from data and predicts likely responses

Truth

Verified facts that are actually correct

⚠️

Patterns ≠ Truth

AI generates responses based on what seems likely, not what's necessarily true or verified.
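The pattern-vs-truth idea can be seen in a deliberately tiny sketch. This is a toy bigram model (not how modern AI is actually built, and the "training" sentences are invented for illustration), but it shows the same principle: the model confidently continues text with whatever pattern it saw most often, with no notion of whether that pattern is true.

```python
from collections import Counter, defaultdict

# Toy "training data" the model learns patterns from.
# Note it contains a confident-sounding but FALSE statement.
corpus = (
    "the capital of australia is sydney . "  # false: it is Canberra
    "the capital of france is paris . "
).split()

# Count which word follows which word (pure pattern counting).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def continue_text(prompt_words, length=4):
    """Always pick the most frequent next word; there is no notion of truth."""
    words = list(prompt_words)
    for _ in range(length):
        counts = following.get(words[-1])
        if not counts:
            break
        words.append(counts.most_common(1)[0][0])
    return " ".join(words)

# The model fluently completes the sentence with the pattern it learned,
# even though that pattern is factually wrong.
print(continue_text(["the", "capital", "of", "australia", "is"], 1))
```

The completion sounds exactly as confident as a correct one would, because the model only ever measures "likely", never "verified".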

Why This Matters

The ripple effect of AI hallucinations

1️⃣

AI makes an error

It sounds confident, so you believe it

2️⃣

You ask a follow-up question

Based on that incorrect information

3️⃣

Everything that follows is wrong

The whole conversation is now built on false information

🚨

Critical Insight

If you don't spot the hallucination early, everything that follows in the conversation can be affected.

🎯

Can You Spot the Hallucination?

Choose a scenario to test your understanding

💡 Complete all scenarios to finish the lesson
💼

Scenario 1: Career Advice

You asked: "What job should I do after school?"


🤔 What should you think about this answer?

⏱️

Scenario 2: Learning Time

You asked: "How long will it take me to learn a new skill?"


🤔 What should you think about this answer?

Scenario 3: Accuracy & Trust

You asked: "Is this information definitely correct?"


🤔 What should you think about this answer?

Excellent Work! 🎉

You've completed the AI Hallucination lesson

💡

Key Takeaway

AI can sound confident even when it's guessing. Knowing when to question it is an essential part of AI literacy.

What You've Learned:

Confidence doesn't equal correctness - AI can be very wrong while sounding very sure

Always verify important information - Don't trust AI blindly, especially for critical decisions

Humans are responsible for checking - You're the final quality control for AI outputs