How To Add Custom Robots.txt File in Blogger

Every blogger wishes to rank high and attract a large audience, and to get there we have to make our blog fully optimized for the search engines. There are many ways to optimize a blog, but if the proper settings are not applied, all your hard work can go to waste. As we know, Google's recent update to Blogger brought lots of stunning features to make blogs more SEO friendly, and one of the most important of them is the custom Robots.txt option. This file tells the search engines which parts or content of your blog should be indexed and which should be denied, so it is very important for every blogger and helps a lot in optimizing a blog for search. Its main job is to keep raw or useless content out of the index, and from my personal experience and observation I believe this is a feature every blogger must apply. So let's get started: follow the instructions below and enjoy a higher rank.

Follow all the steps below to add a Robots.txt file in Blogger for healthy (SEO-friendly) results.

[Image: robots.txt file in Blogger]


  • Step 1: Adding Robots.txt File


To add the file to your Blogger blog, simply follow the steps below.

Go to Blogger >> Settings >> Search Preferences.

Then, in the Crawlers and Indexing section, look for Custom robots.txt.

Click Edit, then select Yes. A box will appear as shown in the image below.

Read :- How To Add SEO Meta Tags in Blogger
[Image: Crawlers and indexing settings]

Once you have enabled the option, copy the code below and paste it into the box.

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://www.hackerspedia.com/feeds/posts/default?orderby=UPDATED

Now replace the URL in the Sitemap line (www.hackerspedia.com) with your own blog's URL.

And now you are done; just save it.
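
If you want to double-check that the file is live, Blogger serves it at yourblog.com/robots.txt, so you can simply open that address in your browser. Below is a small optional sketch (it assumes Python, and the blog address is only a placeholder, so use your own) that fetches and prints the file:

from urllib.request import urlopen

# Placeholder address: replace with your own blog URL
blog_url = "http://www.hackerspedia.com"

# Fetch and print the robots.txt rules that Blogger is now serving
with urlopen(blog_url + "/robots.txt") as response:
    print(response.read().decode("utf-8"))

If the rules you pasted appear in the output, everything is working.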

Read :- Basic SEO Tips For Beginners

Learn More About This Function (Optional)


The code below is addressed to all the search engines, not only Google but also Bing, Yahoo and the rest: that is what the User-agent: * line is for, so whatever you allow to be indexed, such as your home page, is crawled by all of them. The Disallow line that comes after it is discussed below.

User-agent: *
Disallow: /

The Disallow: / line means you do not want the search engines to index your content. In the code above we have not put any URL or page address after the slash, so it blocks everything; it is usually used for a particular page only, by adding the address of the page you do not want indexed after the slash. See the example below (the page address used there is just a sample) to understand how to use this attribute.

User-agent: *
Disallow: /p/example-page.html

As you can see, in the code above a page URL has been added after the slash /. Do the same with the URL of any page you do not want the search engines to index.
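
If you are curious how these rules behave, here is a small optional sketch, not something you add to Blogger, that uses Python's built-in robots.txt parser to test which addresses the rules from this tutorial would allow or block (the blog URLs are only placeholders):

from urllib.robotparser import RobotFileParser

# The same rules used in this tutorial
rules = """
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Search/label pages are blocked, normal posts stay indexable
print(parser.can_fetch("*", "http://www.hackerspedia.com/search/label/seo"))      # False
print(parser.can_fetch("*", "http://www.hackerspedia.com/2016/01/my-post.html"))  # True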

Read :- Check Google Keyword Ranking

After today's helpful tutorial, I wish you all success in your hard work. Today's tutorial was all about the new function that came with the recent Google update to Blogger; just follow all the steps above and I am sure it will help you. Remember me in your prayers, and if you face any difficulty, please ask for help and our team will be there for you. Please share your valuable thoughts by commenting below so that we can do better.
