Hard 🚀 · Application · Q9 of 15
AI for Everyone - AI Safety and Limitations
An AI chatbot sometimes hallucinates facts when answering medical questions. How can combining AI with human experts help?
A. Humans limit AI's vocabulary
B. Humans speed up AI processing
C. Humans verify AI answers to catch hallucinations before sharing
D. Humans reduce AI memory use
Step-by-Step Solution
  1. Step 1: Understand the hallucination risk in medical AI answers

    AI may generate false information; human experts can check each answer for accuracy.
  2. Step 2: Identify how humans help

    When humans verify AI answers, they catch hallucinations before the answers are shared, improving safety.
  3. Final Answer:

    Humans verify AI answers to catch hallucinations before sharing -> Option C
  4. Quick Check:

    Human verification reduces hallucination risk ✓
Quick Trick: Human checks catch AI hallucinations ✓
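
To make Option C concrete, here is a minimal human-in-the-loop sketch in Python. Everything in it is hypothetical: ai_generate is a stub standing in for a real model call, and expert_verify stands in for a medical expert's review. The point is only that the AI's draft is gated behind human verification before it is shared.

from dataclasses import dataclass

@dataclass
class Draft:
    """An AI-generated answer awaiting expert review."""
    question: str
    ai_answer: str

def ai_generate(question: str) -> Draft:
    # Hypothetical stub standing in for a real model call,
    # which may hallucinate facts.
    return Draft(question, f"AI draft answer to: {question}")

def expert_verify(draft: Draft) -> bool:
    # Hypothetical stand-in for a human medical expert reviewing
    # the draft; input() literally puts a human in the loop.
    print(f"[REVIEW] Q: {draft.question}")
    print(f"[REVIEW] A: {draft.ai_answer}")
    return input("Expert: approve this answer? (y/n) ").strip().lower() == "y"

def answer_with_human_in_the_loop(question: str) -> str:
    draft = ai_generate(question)
    if expert_verify(draft):       # hallucinations are caught here,
        return draft.ai_answer     # before the answer reaches a user
    return "Answer withheld pending expert correction."

if __name__ == "__main__":
    print(answer_with_human_in_the_loop("Is aspirin safe during pregnancy?"))

Note the design choice this sketch illustrates: options B and D would optimize the model itself, but only the verification gate in expert_verify addresses factual reliability.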
Common Mistakes:
  • Thinking humans speed up AI processing (Option B)
  • Confusing memory use with hallucinations (Option D)
  • Assuming that limiting AI's vocabulary fixes factual errors (Option A)
