Recall & Review
beginner
Q: What is the purpose of a robots.txt file in a Next.js project?
A: The robots.txt file tells search engine crawlers which pages or sections of your website they may or may not crawl, which in turn influences what gets indexed and appears in search results.
beginner
Q: How do you serve a robots.txt file in a Next.js app?
A: Place a robots.txt file inside the public/ folder. Next.js serves it automatically at /robots.txt.
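For example, a minimal public/robots.txt (the specific rules here are illustrative, not a recommendation):

```
# public/robots.txt — served as-is at /robots.txt
User-agent: *
Disallow: /admin
Allow: /
```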
beginner
Q: What does this robots.txt content mean?

User-agent: *
Disallow: /admin
Allow: /

A: All search engine crawlers (*) are told not to crawl any pages under /admin, but they may crawl all other pages.
intermediate
Q: How can you dynamically generate a robots.txt file in Next.js?
A: Create an API route (e.g., /api/robots.txt) that returns the robots.txt content with a text/plain Content-Type header, then rewrite /robots.txt to that API route in next.config.js.
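A minimal sketch of such a route, assuming the file path, the NODE_ENV check, and the rules themselves (adapt all three to your project):

```javascript
// pages/api/robots.js — hypothetical path for a dynamic robots.txt route.
// Example policy: block everything outside production, only /admin in
// production. NODE_ENV is an assumption; use whatever signal your host provides.
export default function handler(req, res) {
  const isProd = process.env.NODE_ENV === 'production';
  const lines = isProd
    ? ['User-agent: *', 'Disallow: /admin', 'Allow: /']
    : ['User-agent: *', 'Disallow: /'];
  res.setHeader('Content-Type', 'text/plain'); // robots.txt must be plain text
  res.status(200).send(lines.join('\n'));
}
```

To expose it at the canonical path, add a rewrite in next.config.js, e.g. `{ source: '/robots.txt', destination: '/api/robots' }` returned from `rewrites()`. (On newer Next.js versions with the App Router, the built-in `app/robots.ts` metadata file is an alternative to a hand-rolled route.)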
beginner
Q: Why is it important to test your robots.txt file after deployment?
A: Testing ensures search engines see the rules you intended. Mistakes can block important pages from being crawled or leave private pages crawlable, affecting your site's visibility.
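One quick way to sanity-check rules before relying on a live crawler is a small checker. The sketch below implements simplified longest-prefix-match semantics for a single `User-agent: *` group; real crawlers follow RFC 9309, which also covers wildcards in paths and per-bot groups, so treat this as a rough approximation:

```javascript
// Simplified robots.txt checker: returns whether `path` may be crawled
// under the rules in the `User-agent: *` group. An assumption-laden sketch,
// not a full RFC 9309 implementation.
function isAllowed(robotsTxt, path) {
  const rules = [];
  let applies = false;
  for (const raw of robotsTxt.split('\n')) {
    const line = raw.split('#')[0].trim(); // strip comments
    const m = line.match(/^(user-agent|allow|disallow)\s*:\s*(.*)$/i);
    if (!m) continue;
    const field = m[1].toLowerCase();
    const value = m[2].trim();
    if (field === 'user-agent') {
      applies = value === '*'; // only track the wildcard group
    } else if (applies && value) {
      rules.push({ allow: field === 'allow', prefix: value });
    }
  }
  // Longest matching prefix wins; no matching rule means allowed.
  let best = null;
  for (const r of rules) {
    if (path.startsWith(r.prefix) && (!best || r.prefix.length > best.prefix.length)) {
      best = r;
    }
  }
  return best ? best.allow : true;
}
```

With the rules from the card above, `isAllowed(txt, '/admin/settings')` is false while `isAllowed(txt, '/about')` is true.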
Q: Where should you place a static robots.txt file in a Next.js project?
A: In the public/ folder. Next.js serves static files from public/ at the root URL.

Q: What does User-agent: * mean in a robots.txt file?
A: * is a wildcard meaning the rules apply to all user agents (crawlers).

Q: How do you tell search engines not to crawl the /private folder?
A: Add Disallow: /private. The Disallow directive tells crawlers not to visit that path.

Q: Which HTTP header should be set when serving robots.txt dynamically?
A: Content-Type: text/plain. robots.txt is a plain text file, so text/plain is required.

Q: What happens if you block all pages in robots.txt?
A: Blocking all pages stops search engines from crawling, and eventually indexing, your entire site.
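For reference, this is the robots.txt that blocks every crawler from the whole site, a rule that is easy to ship by accident from a staging configuration:

```
User-agent: *
Disallow: /
```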
Exercise: Explain how to add a robots.txt file to a Next.js project and what it controls.
Hint: Think about where static files go and what search engines read.
Exercise: Describe how to create a dynamic robots.txt in Next.js and why you might want to do that.
Hint: Consider when your rules might change based on environment or conditions.