Add a Custom Robots.txt File in Blogger

What is Robots.txt?

Robots.txt is a text file containing a few lines of simple code. It is saved on your website's or blog's server and tells web crawlers how to crawl and index your blog in search results. That means you can restrict any web page on your blog from web crawlers so that it can't get indexed in search. Always remember that search crawlers scan robots.txt before crawling any web page.
Each blog on Blogger has a default robots.txt file, which looks like this:-


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://blogname.blogspot.com/feeds/posts/default?orderby=UPDATED
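
In this file, the Mediapartners-Google section applies to Google AdSense's crawler and leaves it unrestricted, while the rules under User-agent: * stop every crawler from indexing the /search pages (label and search result pages) and allow everything else. If you want to keep a particular page out of search results, you can add your own Disallow line under User-agent: *. For example (the page path here is only a placeholder for illustration):

User-agent: *
Disallow: /search
Disallow: /p/example-page.html
Allow: /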


If you have 500+ posts on your blog, then you can use two sitemaps:-
Sitemap: http://blogname.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://blogname.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
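
The pattern is simple: each sitemap feed starts where the previous one ended, 500 posts at a time (Blogger's feed serves at most 500 posts per request). If you want to generate these lines instead of typing them by hand, here is a minimal Python sketch of the same rule; the blogname address is just a placeholder for your own blog:

# Build paginated Blogger sitemap lines, 500 posts per feed page.
# "blogname" is a placeholder; replace it with your blog's address.
def sitemap_lines(total_posts, page_size=500):
    base = "http://blogname.blogspot.com/atom.xml?redirect=false"
    lines = []
    for start in range(1, total_posts + 1, page_size):
        lines.append(f"Sitemap: {base}&start-index={start}&max-results={page_size}")
    return lines

for line in sitemap_lines(1200):  # e.g. a blog with 1200 posts needs 3 sitemaps
    print(line)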

How to add a custom robots.txt file to your blog:-

Steps:-
Go to your blog's dashboard.
Navigate to Settings >>> Search Preferences >>> Crawlers and indexing >>> Custom robots.txt >>> Edit >>> Yes.
Now paste your code in the box.
Click the Save changes button.
You are done now. :)


How to check your Robots.txt file


You can check this file on your blog by adding /robots.txt at the end of your blog URL.
For example:-

http://dream-droid.blogspot.in/robots.txt
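
You can also test your rules programmatically. Below is a minimal sketch using Python's standard urllib.robotparser module; the blog URL is just the example above, so substitute your own:

# Download and parse the blog's robots.txt, then test a few URLs.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://dream-droid.blogspot.in/robots.txt")
rp.read()  # fetches and parses the file

# With the default Blogger rules, /search pages are blocked for all
# crawlers while normal post pages stay crawlable.
print(rp.can_fetch("*", "http://dream-droid.blogspot.in/search?q=test"))      # expect False
print(rp.can_fetch("*", "http://dream-droid.blogspot.in/2014/01/post.html"))  # expect True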


Note:-
I have tried my best to make this tutorial as simple as possible, but if you still have any doubt or query, feel free to ask me. Don't put any code in the custom robots.txt setting without knowing what it does.
