Concept Flow - Robots.txt configuration
Start: Browser or bot requests a URL
Check for a robots.txt file at the site root
  If no robots.txt exists: allow full access to the site
  If it exists: read the robots.txt rules
Match the bot's User-agent against the file's User-agent groups
Check the Disallow/Allow rules for the matched group
Decide whether the URL is allowed
  If disallowed: block the URL from crawling
  If allowed: crawl the URL
When a bot visits a website, it first requests robots.txt to learn which pages it may or may not crawl; if the file is missing, the bot assumes everything is crawlable.
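The flow above can be sketched with Python's standard-library `urllib.robotparser`. The robots.txt contents and URLs below are hypothetical examples; `parse()` is fed the rules directly instead of fetching them, so the sketch runs offline. Note that this parser applies the first matching rule in file order, so the more specific Allow line is listed before the broader Disallow.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents (example rules, not from a real site).
robots_txt = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
# Parse the rules directly rather than fetching https://example.com/robots.txt.
parser.parse(robots_txt.splitlines())

# Decide whether each URL is allowed for a generic crawler ("*").
print(parser.can_fetch("*", "https://example.com/private/secret.html"))       # blocked
print(parser.can_fetch("*", "https://example.com/private/public-page.html"))  # allowed
print(parser.can_fetch("*", "https://example.com/index.html"))                # allowed
```

If no robots.txt can be retrieved at all, a crawler typically treats the site as fully crawlable, which corresponds to the "allow full access" branch of the flow.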