SEO Fundamentals - Technical SEO Basics
Q: Which robots.txt directives correctly block all web crawlers from accessing any part of the website?

A: User-agent: * combined with Disallow: /. The User-agent: * line applies the rule to every crawler, and Disallow: / blocks access to the entire site.
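As a concrete illustration, a minimal robots.txt expressing this rule would be placed at the site root (e.g. https://example.com/robots.txt) and contain:

```
# Apply the following rule to all crawlers
User-agent: *
# Block every path on the site
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers such as Googlebot honor it, but it is not an access-control mechanism.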