SEO Fundamentals (knowledge check, ~10 mins)

Robots.txt configuration in SEO Fundamentals - Interactive Code Practice

Practice: 5 Tasks
Answer the questions below.
Task 1: Fill in the blank (easy)

Complete the code to allow all web crawlers to access the entire website.

User-agent: [1]
Disallow:
A. *
B. Googlebot
C. /
D. None
Common mistakes:
- Using 'None' instead of '*' for all user agents.
- Putting '/' in Disallow, which blocks the whole site.
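To see why this rule allows everything, here is a minimal sketch using Python's standard urllib.robotparser, with the rules string reflecting the intended answer ('*' matches every crawler, and an empty Disallow blocks nothing). The URLs and bot names are illustrative.

```python
from urllib import robotparser

# Intended Task 1 result: '*' applies to all crawlers,
# and an empty Disallow value disallows nothing.
rules = """\
User-agent: *
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Any bot may fetch any path.
print(rp.can_fetch("Googlebot", "https://example.com/"))         # -> True
print(rp.can_fetch("SomeBot", "https://example.com/any/page"))   # -> True
```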
Task 2: Fill in the blank (medium)

Complete the code to block all web crawlers from accessing the /private directory.

User-agent: *
Disallow: [1]
A. /public
B. /images
C. /private
D. /
Common mistakes:
- Using '/' which blocks the entire website.
- Using a directory name without a leading slash.
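The effect of blocking a single directory can be checked the same way. A minimal sketch, assuming the intended answer (all crawlers blocked from /private only); bot name and URLs are illustrative.

```python
from urllib import robotparser

# Intended Task 2 result: every crawler is blocked from /private,
# while all other paths remain crawlable.
rules = """\
User-agent: *
Disallow: /private
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("AnyBot", "https://example.com/private/data"))  # -> False
print(rp.can_fetch("AnyBot", "https://example.com/public/page"))   # -> True
```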
Task 3: Fill in the blank (hard)

Fix the error in the robots.txt snippet to correctly block Googlebot from /secret.

User-agent: Googlebot
Disallow[1] /secret
A. =
B. :
C. -
D. /
Common mistakes:
- Using '=' instead of ':'.
- Omitting the colon after Disallow.
Task 4: Fill in the blank (hard)

Fill both blanks to block Bingbot from /admin and allow all others full access.

User-agent: Bingbot
Disallow: [1]

User-agent: [2]
Disallow:
A. /admin
B. *
C. /private
D. /
Common mistakes:
- Blocking all bots by mistake.
- Using the wrong directory paths.
Task 5: Fill in the blank (hard)

Fill all three blanks to block all bots from /tmp, allow Googlebot full access, and block all others from /logs.

User-agent: [2]
Disallow:

User-agent: *
Disallow: [1]
Disallow: [3]
A. /tmp
B. Googlebot
C. /logs
D. /private
Common mistakes:
- Using the same User-agent twice without proper grouping.
- Blocking Googlebot unintentionally.
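Multi-group files like this one can also be verified programmatically: a crawler uses the most specific User-agent group that matches it, not every group. A minimal sketch, assuming the intended Task 5 answer; bot names and URLs are illustrative.

```python
from urllib import robotparser

# Intended Task 5 result: Googlebot gets full access via its own group;
# every other bot falls through to the '*' group and is blocked
# from /tmp and /logs.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /tmp
Disallow: /logs
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/tmp/cache"))  # -> True
print(rp.can_fetch("OtherBot", "https://example.com/tmp/cache"))   # -> False
print(rp.can_fetch("OtherBot", "https://example.com/logs/app"))    # -> False
print(rp.can_fetch("OtherBot", "https://example.com/home"))        # -> True
```

Note that Googlebot ignores the '*' group entirely once its own group matches, which is why listing it first with an empty Disallow grants it full access.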