SEO Fundamentals - Technical SEO Basics
Which directive in robots.txt is used to prevent crawlers from accessing specific pages?

The Disallow directive tells crawlers which pages or folders they cannot access. Allow permits access, Crawl-delay sets the wait time between requests, and Sitemap points to sitemap files.
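A minimal robots.txt sketch showing all four directives together; the paths and sitemap URL are hypothetical examples, not from the original question:

# Rules for all crawlers
User-agent: *
# Block an entire folder and one specific page (hypothetical paths)
Disallow: /admin/
Disallow: /private-page.html
# Explicitly permit one subpath inside the blocked folder
Allow: /admin/public/
# Ask crawlers to wait 10 seconds between requests (not all bots honor this)
Crawl-delay: 10

# Point crawlers to the sitemap file (hypothetical URL)
Sitemap: https://example.com/sitemap.xml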