In this post I will discuss what a robots.txt file is and how to add a custom robots.txt in Blogger. It is very important for SEO, and for a newbie blogger it may be difficult to enable. So in this tutorial you will learn how to add a custom robots.txt file in Blogger.
Enable Custom Robots.txt File in Blogger
To enable a custom robots.txt file in Blogger, follow these steps:
- Log in to Blogger and go to your Dashboard.
- Go to Settings, then select Search Preferences.
- Scroll down to the heading Crawling and indexing.
- Custom robots.txt is disabled by default; click Edit next to it and select Yes.
Custom Robots.txt File Code
Paste the following code inside the box and click Save changes.
```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Allow: /
Disallow: /search?q=*
Disallow: /*?updated-max=*
```
If you have more than 500 posts, you can add another sitemap entry with start-index=501&max-results=1000.
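As an illustration, assuming your blog lives at yourblog.blogspot.com (a placeholder address) and uses Blogger's feed-based sitemap, the sitemap entries for a blog with more than 500 posts might look like this:

```
Sitemap: https://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://yourblog.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=1000
```

Each line covers one batch of up to 500 posts, so add further lines (start-index=1001, and so on) as your blog grows.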
That is how you can add a custom robots.txt file with a sitemap in Blogger. If you want to know how to submit your Blogger sitemap to Google Webmaster, follow this post: How to Submit Blogger Sitemap to Google Webmaster.
What is a Robots.txt File
A robots.txt file exists to communicate with robots, spiders, and crawlers such as Google's. It gives them strict rules about which parts of your website may be crawled. Crawlers are constantly searching the web for fresh content; when they find new or updated pages, they index them according to these rules. If you have directories or files that are private and should not circulate in search engines, you can block them here.
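As a quick sanity check on what such rules mean in practice, Python's standard `urllib.robotparser` module can evaluate a robots.txt against a given user agent. Note that it follows the original 1994 robots.txt convention and does not expand Google-style `*` wildcards, so this sketch uses a simplified version of the rules above:

```python
from urllib import robotparser

# Simplified rules: the AdSense crawler (Mediapartners-Google) may fetch
# everything (an empty Disallow means "allow all"), while every other
# crawler is blocked from the /search pages.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The AdSense crawler may fetch search pages
print(rp.can_fetch("Mediapartners-Google", "/search?q=seo"))   # True
# Other crawlers are blocked from search result pages
print(rp.can_fetch("*", "/search?q=seo"))                      # False
# Normal post URLs stay crawlable for everyone
print(rp.can_fetch("*", "/2024/01/my-post.html"))              # True
```

This mirrors what the Blogger rules are doing: search-result pages are thin, duplicate content, so keeping general crawlers out of them while leaving posts open is good for SEO.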
Final Words: Robots.txt files are a useful way to tell search-engine robots which parts of your site you do not want to share.