How to Add a Custom Robots.txt to Your Blog or Website - Trips & Tricks BD

Today we're going to enable a custom robots.txt file in Blogger, which can help Google index your blog faster and also make it more search engine friendly. A robots.txt file is used on just about every website, but with the old Blogger interface we were unable to add one to our blogs. The new interface, which is really incredible with a lot of good features, finally lets us enable it. If you don't know what a robots.txt file is, read the passage below first.
What Is a Robots.txt File?
It is a simple text file in which the website owner writes commands for search engine crawlers. With it, we can instruct a crawler which parts of our site may be indexed and which may not. The commands follow a special syntax that only crawlers read. You can view your own robots.txt file by visiting the URL below.

http://www.yourdomain.com/robots.txt
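
As a quick, minimal sketch of that syntax (the /private/ folder here is purely hypothetical, just to show the idea), a file that blocks every crawler from one folder looks like this:

User-agent: *
Disallow: /private/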

Which Areas Should You Disallow from Crawling on Your Blog?
The question on your mind is probably which areas of your blog you should block from crawling. In fact, you can disallow any part of your blog, but a few areas matter most: search result pages, archive pages and label pages. Blocking those keeps duplicate, low-value pages out of the index and makes your blog more search engine friendly. Below I'm giving you a search engine friendly robots.txt file that will help you a lot; a small illustration of the idea comes first.
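
As that illustration (this exact line is not part of the file we install below, which takes a slightly different approach): Blogger serves label pages under /search/label/, so a rule like the one here would keep crawlers out of all of them:

User-agent: *
Disallow: /search/label/
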
Enable the Custom Robots.txt File in Blogger
The process is very easy; just follow the steps below.
Go to Blogger >> Settings >> Search Preferences.
Look for the Custom robots.txt section at the bottom and click Edit.

Now a checkbox will appear; tick "Yes" and a text box will appear where you have to write the robots.txt file. If you want to use our file, copy the code below and paste it into that box.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Allow: /
Sitemap: http://www.yourdomain.com/feeds/posts/default?orderby=updated

Note: The first record, "User-agent: Mediapartners-Google", is for Google AdSense. If you are using AdSense on your blog, keep it as it is; otherwise remove it, along with the empty Disallow: line under it.
Click "Save Changes".
After adding the file, you'll probably want to understand what we have allowed and disallowed, so I've listed every command with details below.
Explanations
User-agent: Mediapartners-Google : This first record is for blogs that use Google AdSense; if you don't use AdSense, remove it. It addresses AdSense's separate robot and, together with the empty Disallow: under it, lets that robot crawl every page so it can match ads to your content.
User-agent: * : Here User-agent addresses a robot, and the asterisk (*) stands for all search engine robots, such as Google, Yahoo, etc.
Disallow: /search?q=* : This line tells search engine crawlers not to crawl the search result pages.
Disallow: /*?updated-max=* : This one tells crawlers not to crawl or index the label and navigation (older posts) pages.
Allow: / : This one allows the homepage of your blog to be indexed.
Sitemap : This last command points crawlers at your blog feed so that every new or updated post gets indexed.
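
To make that concrete (these URLs use a placeholder domain): with the file above, http://www.yourdomain.com/search?q=seo and any URL containing ?updated-max= would be blocked, while a post page like http://www.yourdomain.com/2013/01/my-post.html would still be crawled.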
Got it?
How Can I Add a Command to Disallow Any Page?
If you want to add your own command, you can do that too. Here is an example that disallows a contact page: Disallow: /p/contact-us.html. Note that /p/contact-us.html is the part of the URL after the domain: you drop the main domain and type only the remaining part of the page's address. Two things to remember: add your custom commands under User-agent: *, and write one command per line. A full example follows below. If you still haven't got it, please ask in the comments.
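
For instance, if you also wanted to block a hypothetical /p/privacy-policy.html page (that path is only an example; use your own page's path), the complete file would look like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search?q=*
Disallow: /*?updated-max=*
Disallow: /p/contact-us.html
Disallow: /p/privacy-policy.html
Allow: /
Sitemap: http://www.yourdomain.com/feeds/posts/default?orderby=updated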

If you like this post, please share it. Thanks!
