Every website has a hidden "doorman" that greets search engine crawlers. This doorman operates 24/7, holding a simple set of instructions that tell bots like Googlebot which parts of the site they may and may not crawl. That instruction file is robots.txt, and its most powerful and most misunderstood directive is Disallow.
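To make the doorman metaphor concrete, here is a minimal, illustrative robots.txt. The file always lives at the site root (e.g. https://example.com/robots.txt); the paths below are hypothetical examples, not recommendations for any particular site:

```txt
# Rules for all crawlers
User-agent: *
Disallow: /admin/      # don't crawl anything under /admin/

# Extra rules applied only to Googlebot
User-agent: Googlebot
Disallow: /drafts/     # keep unfinished pages out of Google's crawl
```

Each `User-agent` line names a crawler (or `*` for all of them), and each `Disallow` line beneath it blocks crawling of URLs that start with the given path. An empty value (`Disallow:`) blocks nothing at all, which is part of why the directive is so often misread.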


