User-agent: *
Disallow: /admin
Allow: /admin/public
Sitemap: https://example.com/sitemap.xml

The Disallow: /admin line blocks crawling of the /admin directory, but Allow: /admin/public explicitly permits crawling of the /admin/public subdirectory. The User-agent: * line applies these rules to all search engines.
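The way crawlers resolve a conflict between Disallow and Allow can be sketched in a few lines: per RFC 9309, the rule with the longest matching path prefix wins, which is why the more specific Allow: /admin/public overrides Disallow: /admin. The `isAllowed` helper below is a hypothetical simplification, not a full robots.txt parser.

```javascript
// Rules from the robots.txt example above.
const rules = [
  { type: 'disallow', path: '/admin' },
  { type: 'allow', path: '/admin/public' },
];

// Hypothetical helper: the longest matching path prefix decides the outcome.
function isAllowed(urlPath) {
  let best = { type: 'allow', path: '' }; // default: crawling is allowed
  for (const rule of rules) {
    if (urlPath.startsWith(rule.path) && rule.path.length > best.path.length) {
      best = rule;
    }
  }
  return best.type === 'allow';
}

console.log(isAllowed('/admin/settings'));   // false — blocked by Disallow: /admin
console.log(isAllowed('/admin/public/faq')); // true — Allow: /admin/public is more specific
console.log(isAllowed('/blog'));             // true — no rule matches
```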
Option C uses correct syntax, with colons and standard directives. Of the incorrect options, one omits the colons, one includes an invalid 'Extra' directive, and one has an invalid 'User-agent' value.
User-agent: *
Disallow: secret
Robots.txt paths must start with a slash. Without it, 'secret' does not match '/secret' URLs, so crawling is not blocked.
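This can be verified with the simple prefix match a crawler performs (a minimal sketch, assuming plain prefix matching as in RFC 9309): the value 'secret' without a leading slash never matches a URL path such as '/secret/page', so the rule blocks nothing.

```javascript
// A robots.txt rule matches when the URL path starts with the rule's value.
function matchesRule(rulePath, urlPath) {
  return urlPath.startsWith(rulePath);
}

console.log(matchesRule('secret', '/secret/page'));  // false — rule is ineffective
console.log(matchesRule('/secret', '/secret/page')); // true — URL would be blocked
```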
The Sitemap directive tells search engines where to find the sitemap XML file, which lists all important URLs. This improves crawling efficiency and indexing.
export default function handler(req, res) {
  res.setHeader('Content-Type', 'text/plain');
  res.write('User-agent: *\nDisallow: /private');
  res.end();
}
The handler sets the Content-Type to 'text/plain' and writes valid robots.txt content, so search engines will correctly interpret it and block /private.
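The handler's behavior can be checked without a running server by passing it a minimal stub for the response object (a sketch assuming a Node.js environment; the stub below is hypothetical, not part of any framework API):

```javascript
// Same logic as the handler above, redefined here without the export
// so the snippet is self-contained.
function handler(req, res) {
  res.setHeader('Content-Type', 'text/plain');
  res.write('User-agent: *\nDisallow: /private');
  res.end();
}

// Minimal stub response object that records headers and body.
const headers = {};
let body = '';
const res = {
  setHeader: (name, value) => { headers[name] = value; },
  write: (chunk) => { body += chunk; },
  end: () => {},
};

handler({}, res);
console.log(headers['Content-Type']);             // 'text/plain'
console.log(body.includes('Disallow: /private')); // true
```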