SK blogging tutorial: learn how to optimize your website and your content. If you want to learn more about SEO, follow along here: https://sk-learn.blogspot.com/

Set Up a Custom Robots.txt the Easy Way

In the last tutorial, you made your business or personal website look professional and learned how to customize blog widgets in a user-friendly way. In this tutorial, you will learn how to set up a custom robots.txt file efficiently. By following this blogging tutorial, your site can rank higher in search engines like Google, Bing, and Yahoo. In the next tutorial, you will learn how to set up custom robots header tags.

What is a custom robots.txt file?

Robots.txt is a plain text file stored at the root of your blog's server that you can edit for search engine bots (Google Search, Bing Search, Yahoo Search). It tells search engine bots which directories, pages, or links should or should not be indexed in search results. If your website URL is www.tutorial.com, then your robots.txt appears at www.tutorial.com/robots.txt.

This means you can restrict search engine bots from crawling certain directories, pages, or links on your site or blog. Custom robots.txt is now available on Blogger (Blogspot) as well.

Now that you know the basics of robots.txt, here is an example:

User-agent: Googlebot
Disallow: /nogooglebot/

This tells Googlebot not to crawl the /nogooglebot/ folder or any of its subdirectories.
Here is another example:

User-agent: *
Allow: /

This tells all user agents that they can access the entire site, including every directory and subdirectory.

The robots.txt file should be a standard UTF-8 text file. Don't use another format: word processors often save files in a proprietary format and can add unexpected characters, such as curly quotes, which cause problems for crawlers.
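For example, here is a hypothetical pair of lines showing the problem; the first was saved from a word processor, and its curly quotes would confuse crawlers:

Disallow: “/search”  # wrong: curly quotes added by a word processor
Disallow: /search    # correct: plain UTF-8 text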

Robots.txt File Syntax

  • A robots.txt file contains one or more groups.
  • Each group contains multiple rules or directives, one directive per line. Each group starts with a line that defines which user agent the group applies to.
  • A group provides the following information:
    • Who the group applies to (the user agent).
    • Which directories or files the user agent can access.
    • Which directories or files the user agent cannot access.
  • Crawlers process groups from top to bottom. A user agent can match only one rule set: the first, most specific group that matches that user agent.
  • By default, a user agent can crawl any page or directory that is not blocked by a disallow rule.
  • Rules are case-sensitive. For instance, disallow: /file.asp applies to https://www.example.com/file.asp, but not to https://www.example.com/FILE.asp.
  • The # character starts a comment, as in the annotated example after this list.
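To see these rules working together, here is a short annotated example (the bot name and path are only illustrative):

# Group 1: applies only to Googlebot, the most specific match for it
User-agent: Googlebot
Disallow: /private/

# Group 2: applies to every other bot
User-agent: *
Allow: /

Googlebot matches the first group, so it skips /private/ but can crawl everything else; all other bots match the second group and can crawl the whole site.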

Create robots.txt 

Step 1: Log in to your Blogger account. Go to Settings and scroll down to the "Crawlers and indexing" section, where you will find "Enable custom robots.txt". Turn on the toggle next to it so that you can add a custom robots.txt file to your blog. See the image below for more information.

Step 2: Next, you will see the "Custom robots.txt" option; go ahead and click on it. See the image below for more information.

Step 3: After you click "Custom robots.txt", a small popup window appears. Copy the custom robots.txt file below that fits your blog and paste it inside the box. Don't forget to replace the placeholder URL with your blog address or custom domain.



Type 1 robots.txt

Use this custom robots.txt file if your sitemap is an XML sitemap. The sitemap you submitted in Google Search Console should be the same one that is included in the custom robots.txt file.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.yourblogurl.com/sitemap.xml
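A quick note on what each line does: Mediapartners-Google is Google's AdSense crawler, and the empty Disallow permits it to crawl every page (useful if you show AdSense ads). For all other bots, Disallow: /search blocks Blogger's label and search result pages, which would otherwise create duplicate content, Allow: / permits everything else, and the Sitemap line tells crawlers where to find your sitemap.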

Type 2 robots.txt

Use this custom robots.txt file if your sitemap is an atom sitemap. The sitemap you submitted in Google Search Console should be the same one that is included in the custom robots.txt file. Use this version only if your blog has no more than 500 posts.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.yoururl.com/atom.xml?redirect=false&start-index=1&max-results=500

Type 3 robots.txt

This is the same as the Type 2 robots.txt, but Type 2 covers only 500 posts. Use this type of robots.txt if your blog has more than 500 posts but fewer than about a thousand. If your blog has more than 1,000 posts, you can use the same file and add another sitemap line to extend beyond 1,000 posts, as shown after the example.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.yoururl.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.yoururl.com/atom.xml?redirect=false&start-index=501&max-results=500
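For example, if your blog passes 1,000 posts, you could append a third line following the same pattern to cover posts 1,001 to 1,500:

Sitemap: https://www.yoururl.com/atom.xml?redirect=false&start-index=1001&max-results=500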


DO NOT forget to replace yoururl.com in the sitemap lines with your own blog address.

How to Check Custom Robots.txt File on Your Blog

Open https://www.yoururl.com/robots.txt in your browser, replacing yoururl.com with your own blog address.
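If everything is set up correctly, the browser displays exactly the file you pasted in Step 3, with your own domain in the Sitemap line. If you see Blogger's default robots.txt instead, go back to Step 1 and confirm that the "Enable custom robots.txt" toggle is turned on.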

Final Thoughts:

Adding a custom robots.txt file to your Blogger site helps increase your website's organic traffic. If you liked this article, share it with your friends who want to add a custom robots.txt file on Blogger and improve their site traffic. Leave your questions in the comments section.


Thanks!