Robots.txt is a plain-text file (with a .txt extension) used to control which files search engine crawlers may access when they crawl your site. It consists of rules that block or allow specific crawlers access to predefined file paths on your site. The robots.txt file sits in the root folder of your site (public_html), for example public_html/robots.txt, so it is reachable at www.your-domain.com/robots.txt.
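To give a picture of what these rules look like, here is a minimal robots.txt sketch; the /private/ path and the crawler choices are hypothetical examples only:

```
# Block every crawler from a hypothetical /private/ folder
User-agent: *
Disallow: /private/

# Let Googlebot crawl the whole site
User-agent: Googlebot
Allow: /
```

Each User-agent line starts a group of rules for a particular crawler, and the Disallow and Allow lines under it list the path prefixes that crawler may not, or may, fetch.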
To use this feature, your website property must include the scheme, for example http://yourdomain.com; without http:// or https:// in front of the domain name, the property will not appear in the property select box.
In this tutorial I will share how to create and test robots.txt in Google Search Console. I assume you have already registered your website on Google Search Console.
1. Log in to Google Search Console.
2. Select the website property for which you will create robots.txt.
3. Select “Legacy tools and reports”, then click “Learn more” in the right-hand menu, as shown below.
4. A dialog box will appear on the right, as shown below; click “robots.txt Tester”.
5. On the page that appears, select the website property for which you will create robots.txt, as shown below.
If you did not use http:// or https:// in front of your domain name, your website will not appear in the “Please select a property” dropdown box; if your website’s domain name is visible, select it.
6. After you have chosen your website’s domain name above, write the contents of your robots.txt file in the box provided; you can write rules like the example sketched after this list.
7. Once you have finished creating the robots.txt file, press the Submit button.
8. Finally, use the test box at the bottom to check whether your website URLs are blocked from search engine crawlers or not, as shown below (see the sample rules and test results sketched after this list).
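As a sketch covering steps 6 and 8 together, the rules below are the kind of content you might paste into the box in step 6; the /admin/ and /tmp/ paths and the sample URLs are hypothetical. The comments note how such URLs would be reported when you test them in step 8:

```
User-agent: *
Disallow: /admin/   # e.g. www.your-domain.com/admin/login.html would be reported as blocked
Disallow: /tmp/     # e.g. www.your-domain.com/tmp/report.pdf would be reported as blocked
Allow: /            # any other URL, e.g. www.your-domain.com/blog/post.html, stays allowed
```

Typing one of these paths into the test box at the bottom and pressing Test should then show whether that URL is allowed or blocked by your rules.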
That’s the tutorial for creating and testing robots.txt in Google Search Console; hopefully it is useful for all of us.