SEO Fundamentals - Technical SEO Basics
Medium · Analysis · Q5 of 15
What will happen if the robots.txt file contains:
User-agent: *
Disallow:

Which URLs are blocked?
A. No URLs are blocked; all are allowed
B. All URLs are blocked
C. Only the homepage is blocked
D. Only URLs with query parameters are blocked
Step-by-Step Solution
Solution:
  1. Step 1: Understand the empty Disallow directive

    An empty Disallow: line means no path is blocked for the specified User-agent.
  2. Step 2: Apply to all crawlers

    Since User-agent is *, this applies to all crawlers, allowing them full access.
  3. Final Answer:

    No URLs are blocked; all are allowed -> Option A
  4. Quick Check:

    An empty Disallow means all URLs are allowed. ✓
Quick Trick: An empty Disallow: allows every URL; only Disallow: / (with a slash) blocks the whole site. ✓
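The rule above can be verified with Python's standard-library urllib.robotparser; the example.com URLs below are placeholders:

```python
from urllib import robotparser

# robots.txt content from the question: empty Disallow directive
robots_txt = """User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# An empty Disallow blocks nothing, so every URL is crawlable
print(rp.can_fetch("*", "https://example.com/"))          # True
print(rp.can_fetch("*", "https://example.com/page?x=1"))  # True
```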
Common Mistakes:
  • Thinking empty Disallow blocks everything
  • Confusing Disallow with Allow
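To see why the first mistake matters, contrast the empty directive with Disallow: /, which really does block everything (URL again a placeholder):

```python
from urllib import robotparser

# "Disallow: /" (with a slash) blocks the entire site,
# unlike the empty "Disallow:" in the question, which blocks nothing.
blocking = robotparser.RobotFileParser()
blocking.parse(["User-agent: *", "Disallow: /"])

print(blocking.can_fetch("*", "https://example.com/any/page"))  # False
```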
