How to optimize custom Robots.txt for BlogSpot?




Robots.txt lets a website keep web crawlers and other web robots away from all or part of a site that is otherwise publicly viewable. Basically, a robots.txt file on a website acts as a request that specified robots ignore specified files or directories when crawling the site.
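For illustration, here is a minimal sketch of how a well-behaved crawler honours that request, using Python's built-in urllib.robotparser module (the domain is just the example blog used later in this post):

from urllib import robotparser

# A polite crawler downloads the site's robots.txt first...
rp = robotparser.RobotFileParser("http://blognucleus.com/robots.txt")
rp.read()  # fetch and parse the file

# ...and asks it before requesting any page on that site
if rp.can_fetch("Googlebot", "http://blognucleus.com/search?q=seo"):
    print("allowed to crawl this URL")
else:
    print("asked to stay away from this URL")

Keep in mind that robots.txt is only a request: a badly behaved robot is free to ignore it, so it is not a security mechanism.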

On a self-hosted platform such as WordPress, you can create the robots.txt file yourself through cPanel (File Manager). If you use the BlogSpot platform, however, a few steps are needed to activate a custom robots.txt:

1. Log in to your Blogger dashboard

2. Then click on Settings >> Search preferences

3. Under Crawlers and indexing, click Edit next to Custom robots.txt and select Yes to enable it on your blog.

4. Then fill in the blank area with the following rules:

User-agent: Mediapartners-Google
Allow: /

User-agent: Googlebot
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0

User-agent: Twitterbot
Allow: /

User-agent: *
Disallow: /search
Disallow: /p/*
Disallow: /view/*
Allow: /

Sitemap: http://blognucleus.com/feeds/posts/default?orderby=UPDATED

Note: Replace blognucleus.com with your own blog's address. If you prefer, you can use http://blognucleus.com/sitemap.xml as the Sitemap URL instead of the feed URL.

5. Now click Save changes. That's all.
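If you want to see what these rules actually do before saving them, you can run them through the same standard-library parser. One caveat: Python's parser follows the original prefix-matching rules and does not understand the * wildcards in the Googlebot group, so this sketch only exercises the plain prefix rules; "SomeBot" and the sample URLs are made up for illustration.

from urllib import robotparser

# Trimmed copy of the prefix rules from step 4
RULES = """\
User-agent: Mediapartners-Google
Allow: /

User-agent: Googlebot
Disallow: /?m=1
Disallow: /?m=0

User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# The AdSense crawler may fetch everything, including search pages
print(rp.can_fetch("Mediapartners-Google", "http://blognucleus.com/search?q=seo"))  # True
# Googlebot is asked to skip the duplicate mobile-view URL
print(rp.can_fetch("Googlebot", "http://blognucleus.com/?m=1"))                     # False
# Every other crawler is kept out of search and label pages...
print(rp.can_fetch("SomeBot", "http://blognucleus.com/search/label/seo"))           # False
# ...but may crawl ordinary posts
print(rp.can_fetch("SomeBot", "http://blognucleus.com/2016/01/sample-post.html"))   # True

Notice that because Googlebot has a group of its own, the generic User-agent: * rules do not apply to it; a crawler obeys the most specific group that names it.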

For websites that are built on several subdomains, each subdomain generally needs its own robots.txt file. If abc.com has a robots.txt file but sub.abc.com does not, the rules that apply to abc.com will not apply to sub.abc.com.
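In other words, robots.txt is resolved per host. A minimal sketch, using the hypothetical domains from the paragraph above:

from urllib import robotparser

# Rules are fetched per host: abc.com's file says nothing about sub.abc.com
for host in ("http://abc.com", "http://sub.abc.com"):
    rp = robotparser.RobotFileParser(host + "/robots.txt")
    rp.read()  # a missing file (404) is treated as "everything allowed"
    print(host, rp.can_fetch("Googlebot", host + "/private/"))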

That’s all. Share this post and leave your valuable comments below. To get more updates from us, subscribe to our RSS feed.

                               
