SEO Fundamentals - Technical SEO Basics
Medium · Analysis · Question 6 of 15
A website's robots.txt file disallows all pages but the sitemap.xml is submitted to search engines. What is the issue?
A. Pages will be crawled but not indexed
B. Sitemap overrides robots.txt, so no issue
C. Search engines cannot crawl pages despite sitemap submission
D. Robots.txt only affects images, so pages are fine
Step-by-Step Solution
  1. Step 1: Understand the effect of robots.txt
    Robots.txt Disallow rules prevent search engines from crawling the specified pages; a site-wide "Disallow: /" blocks every page.
  2. Step 2: Sitemap role vs. robots.txt
    A sitemap lists URLs for search engines to discover, but it does not override robots.txt restrictions (see the example configuration after these steps).
  3. Final Answer:
    Search engines cannot crawl the pages despite the sitemap submission -> Option C
  4. Quick Check:
    Robots.txt blocks crawling despite the sitemap [OK]
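For reference, here is a minimal robots.txt that reproduces this scenario (a hypothetical sketch; example.com and the sitemap URL are placeholders):

    # robots.txt - blocks all crawlers from every page
    User-agent: *
    Disallow: /

    # A Sitemap directive (or a separate sitemap submission) does not lift
    # the Disallow rule above; the listed URLs still cannot be crawled.
    Sitemap: https://example.com/sitemap.xml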
Quick Trick: robots.txt Disallow rules take precedence over sitemap URLs [OK]
Common Mistakes:
  • Assuming sitemap overrides robots.txt
  • Thinking robots.txt only blocks images
  • Confusing crawling with indexing (a URL blocked by robots.txt can still be indexed, without its content, if other sites link to it; see the sketch below)
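You can check crawl permission programmatically with Python's standard-library urllib.robotparser, which evaluates robots.txt rules against a URL (a minimal sketch; the example.com URLs are placeholders):

    import urllib.robotparser

    # Parse the blocking robots.txt from the scenario above.
    rp = urllib.robotparser.RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /",
        "Sitemap: https://example.com/sitemap.xml",
    ])

    # Every page is blocked for every crawler, sitemap or not.
    print(rp.can_fetch("Googlebot", "https://example.com/some-page"))  # False
    print(rp.site_maps())  # ['https://example.com/sitemap.xml'] (Python 3.8+)

Note that can_fetch only reflects crawl permission; whether a URL gets indexed is a separate decision made by the search engine.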