Application · Hard · Question 9 of 15
SEO Fundamentals - Technical SEO Basics
How should you write a robots.txt file to allow Googlebot full access except blocking the /admin/ directory, while denying all other crawlers access to the entire site?
Option A:
  User-agent: Googlebot
  Allow: /admin/
  User-agent: *
  Disallow: /admin/

Option B:
  User-agent: *
  Disallow: /admin/
  User-agent: Googlebot
  Allow: /

Option C:
  User-agent: Googlebot
  Disallow: /admin/
  User-agent: *
  Disallow: /

Option D:
  User-agent: *
  Disallow: /
  User-agent: Googlebot
  Disallow: /admin/
Step-by-Step Solution:
  1. Step 1: Allow Googlebot except /admin/

    Specify Googlebot with Disallow: /admin/ to block only that folder.
  2. Step 2: Block all other crawlers

    Use the wildcard group User-agent: * with Disallow: / to block the entire site for every other crawler.
  3. Step 3: Group order

    By convention, list specific user-agent groups before the wildcard group for readability. Note that spec-compliant crawlers (per RFC 9309) obey the most specific matching group regardless of where it appears in the file.
  4. Final Answer:

    User-agent: Googlebot
    Disallow: /admin/
    User-agent: *
    Disallow: /
    -> Option C
  5. Quick Check:

    A crawler follows the most specific user-agent group that matches it, so Googlebot obeys its own group instead of the wildcard ✓
Quick Trick: List specific user-agent groups before the wildcard group so the file reads in order of precedence ✓
Common Mistakes:
  • Reversing order of user-agent blocks
  • Using Allow incorrectly to override Disallow
  • Blocking Googlebot entirely by mistake
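The rules from the final answer can be checked programmatically. The sketch below uses Python's standard-library robots.txt parser to confirm that Googlebot is blocked only from /admin/ while another crawler (Bingbot is used here purely as an illustration) is blocked everywhere; the paths are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# robots.txt content from Option C (the correct answer)
robots_txt = """\
User-agent: Googlebot
Disallow: /admin/

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot: full access except the /admin/ directory
print(rp.can_fetch("Googlebot", "/index.html"))   # True
print(rp.can_fetch("Googlebot", "/admin/panel"))  # False

# Any other crawler falls through to the wildcard group
print(rp.can_fetch("Bingbot", "/index.html"))     # False
```

Running the same checks after swapping the two groups (as in Option D) shows why relying on group specificity rather than file order is the safer mental model: the standard-library parser matches a crawler to the most specific applicable group either way.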
