Complete the code to import the package for robots.txt generation in Next.js.
import [1] from 'nextjs-robots';
The package exports the RobotsTxt component, which generates robots.txt in Next.js.
Complete the code to define a basic robots.txt rule allowing all user agents.
export const robotsTxtOptions = {
policies: [
{ userAgent: '*', [1]: true }
]
};
The allow property set to true means all user agents can crawl the site.
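A filled-in version of this answer can help check your work. This is a sketch only: the `policies` option shape (and the `nextjs-robots` package itself) is taken from the exercise, not from independently verified package documentation.

```javascript
// Hypothetical completed answer: the blank [1] is the `allow` property.
// The option shape follows the exercise, not a verified package API.
const robotsTxtOptions = {
  policies: [
    { userAgent: '*', allow: true } // every crawler may index the site
  ]
};

console.log(robotsTxtOptions.policies[0].allow); // → true
```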
Fix the error in the robots.txt options to block a specific path '/admin'.
export const robotsTxtOptions = {
policies: [
{ userAgent: '*', disallow: [1] }
]
};
The path must be a string starting with a slash, so '/admin' is correct.
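The completed answer, as a sketch under the same assumption that the `policies` shape is the one the exercise uses:

```javascript
// Hypothetical completed answer: the blank [1] is the string '/admin'.
const robotsTxtOptions = {
  policies: [
    { userAgent: '*', disallow: '/admin' } // block every crawler from /admin
  ]
};

console.log(robotsTxtOptions.policies[0].disallow); // → /admin
```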
Fill both blanks to add sitemap URL and host in robots.txt options.
export const robotsTxtOptions = {
sitemap: '[1]',
host: '[2]'
};
The sitemap should point to the sitemap XML URL, and host is the main site URL.
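A completed sketch of this answer. The `https://example.com` URLs are placeholder values for illustration, not part of the exercise:

```javascript
// Hypothetical completed answer; example.com is a placeholder domain.
const robotsTxtOptions = {
  sitemap: 'https://example.com/sitemap.xml', // absolute URL of the sitemap XML
  host: 'https://example.com'                 // canonical host of the site
};

console.log(robotsTxtOptions.sitemap.endsWith('.xml')); // → true
```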
Fill all three blanks to create a robots.txt component with options and export it as default.
import [1] from 'nextjs-robots';

const robots = () => {
  return <[2] options={robotsTxtOptions} />;
};

export default [3];
Import RobotsTxt, use it as a component, and export the function robots as default.
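Putting the three answers together gives a sketch of the completed file. This assumes the nextjs-robots API exactly as the exercise presents it; the package and its RobotsTxt export are taken from the exercise, not independently verified, and robotsTxtOptions is the options object defined in the earlier items.

```tsx
// Sketch only: assumes the 'nextjs-robots' package and its RobotsTxt
// component exist as the exercise describes.
import RobotsTxt from 'nextjs-robots';

const robots = () => {
  return <RobotsTxt options={robotsTxtOptions} />;
};

export default robots;
```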