Before starting, let’s first get an insight into what a robots.txt file is -
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. These instructions may hide parts of the site from bots or let them access others, depending entirely on your requirements.
Following are a few examples of what you can type in a robots.txt file.
Ban all robots
User-agent: *
Disallow: /
Allow all robots
To allow any robot to access your entire site, you can simply leave the robots.txt file blank, or you could use this:
User-agent: *
Disallow:
Ban specific robots
To ban specific robots, use the robot’s name. Look at the list of robot names to find the correct one. For example, Google’s crawler is Googlebot and Microsoft’s is MSNBot. To ban only Google:
User-agent: Googlebot
Disallow: /
Allow specific robots
As in the previous example, use the robot’s correct name. To allow only Google, use all four lines:
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
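A quick way to check how those four lines are interpreted is Python’s standard-library urllib.robotparser; this is just a sketch, and the example.com URL is a placeholder:

```python
from urllib import robotparser

# The four-line rule set from above:
# Googlebot may crawl everything, every other robot is banned.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/page"))  # True
print(rp.can_fetch("OtherBot", "http://example.com/page"))   # False
```

An empty Disallow line means “nothing is disallowed,” which is why Googlebot is allowed everywhere while the catch-all entry blocks everyone else.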
Ban robots from part of your site
To ban all robots from the page “Archives” and its subpages, located at http://yourblog.example.com/archives/, use:
User-agent: *
Disallow: /archives/
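The Disallow value is a path prefix, so this rule blocks /archives/ and everything beneath it while leaving the rest of the site crawlable. A minimal check with Python’s urllib.robotparser (the URLs below are placeholders):

```python
from urllib import robotparser

# The rule from above: all robots banned from /archives/ and its subpages.
rules = ["User-agent: *", "Disallow: /archives/"]

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://yourblog.example.com/archives/post"))  # False
print(rp.can_fetch("*", "http://yourblog.example.com/about"))          # True
```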