Hard · Application · Q15 of 15
SEO Fundamentals - Technical SEO Basics
You want to allow Googlebot to crawl everything except the /private/ folder, but block all other bots from the entire site. Which robots.txt configuration achieves this?
A. User-agent: Googlebot
   Allow: /
   User-agent: *
   Disallow: /private/
B. User-agent: *
   Disallow: /
   User-agent: Googlebot
   Allow: /private/
C. User-agent: Googlebot
   Disallow: /private/
   User-agent: *
   Disallow: /
D. User-agent: *
   Disallow: /private/
   User-agent: Googlebot
   Disallow: /
Step-by-Step Solution
  1. Step 1: Understand Googlebot's rule

    Googlebot should be allowed everywhere except /private/, so Disallow: /private/ applies to Googlebot.
  2. Step 2: Understand other bots' rule

    All other bots (*) should be blocked from the entire site, so Disallow: / applies to them.
  3. Step 3: Check options

    Option C (User-agent: Googlebot / Disallow: /private/, then User-agent: * / Disallow: /) matches both rules: Googlebot follows its own, more specific group and ignores the * group, while every other bot falls under the * group's site-wide Disallow. Option A would let all other bots crawl everything except /private/, option B would grant Googlebot access only to /private/, and option D would block Googlebot entirely.
  4. Final Answer:

    Googlebot gets Disallow: /private/; all other bots (*) get Disallow: / -> Option C
  5. Quick Check:

    Googlebot: partial block (/private/ only); all other bots: full block -> C [OK]
Quick Trick: List the specific user-agent group (Googlebot) before the general * group for readability; crawlers obey the most specific matching User-agent group regardless of order [OK]
Common Mistakes:
  • Reversing Allow and Disallow for Googlebot
  • Blocking Googlebot fully by mistake
  • Trying to use Allow as a blocking directive (Allow only grants access; it cannot block)
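The answer can be sanity-checked with Python's standard-library robots.txt parser. Note one caveat: real crawlers pick the most specific matching User-agent group, while `urllib.robotparser` uses the first matching group in file order, so listing the Googlebot group first (as in option C) gives the same result. The example.com URLs below are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Option C's robots.txt: Googlebot is blocked only from /private/,
# every other bot is blocked from the whole site.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot: allowed everywhere except /private/
print(rp.can_fetch("Googlebot", "https://example.com/"))            # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))   # False

# Any other bot falls under the * group and is blocked site-wide
print(rp.can_fetch("SomeOtherBot", "https://example.com/"))         # False
```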
