Poetry of Programming

It's about Ruby on Rails – Kiran Soumya


How to stop search engines from indexing your pages.

Got something you want to put online, but you don’t really want it showing up in search engine results? Here are two quick and easy solutions.
Use a specific meta tag

For each page you don't want to appear in search engine results, add a single meta tag. Not a description tag, not a keywords tag, just one tag for robots:

<meta name="robots" content="noindex,nofollow,noarchive" />

Pop that into the <head> of each page, and you're telling search engines not to index the page, not to follow any links from the page, and not to archive the page.
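If you're doing this in a Rails app, you'd probably build the tag in a helper rather than hard-coding it into every template. Here's a minimal sketch; the helper name robots_meta_tag is my own invention, not a Rails built-in:

```ruby
# Illustrative helper: builds a robots meta tag from a list of directives.
# The method name and defaults are assumptions, not part of Rails itself.
def robots_meta_tag(directives = %w[noindex nofollow noarchive])
  %(<meta name="robots" content="#{directives.join(',')}" />)
end

puts robots_meta_tag
# => <meta name="robots" content="noindex,nofollow,noarchive" />
```

In a real layout you'd call something like this from the head section, only on the pages you want hidden.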
Create a robots.txt file

If your pages are all in a separate directory, you can also block search engines by using a robots.txt file.

Create a text file and, in it, disallow all the directories you want protected:

User-agent: *
Disallow: /nameofdirectory
Disallow: /anothernameofdirectory

Do it for all the directories you want, then save the file as robots.txt, and upload it to your main directory. The search engine robots will hit the robots.txt, find out which directories you don’t want them sniffing in, and skip them.
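If you'd rather generate the file than write it by hand, a few lines of Ruby will do it. This is just a sketch; the directory names are placeholders:

```ruby
# Build a robots.txt body that blocks the given directories for all crawlers.
def robots_txt(directories)
  lines = ["User-agent: *"]
  directories.each { |dir| lines << "Disallow: /#{dir}" }
  lines.join("\n") + "\n"
end

# Write it to the current directory (placeholder directory names).
File.write("robots.txt", robots_txt(%w[nameofdirectory anothernameofdirectory]))
```

Upload the resulting file to your main directory, same as before.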

So there you go. Two little things that can save you a world of trouble.

However, these aren't completely effective solutions. Both the meta tag and robots.txt are only requests: well-behaved crawlers honor them, but nothing actually stops a badly behaved bot from fetching your pages. If you really want to block access, you can either password-protect your pages or keep them offline.
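For the password-protection route, one common option (assuming your site is served by Apache) is HTTP Basic auth via an .htaccess file in the protected directory. The paths here are illustrative:

```apache
# Illustrative .htaccess: require a valid user for this directory.
AuthType Basic
AuthName "Private area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

The .htpasswd file itself is created with the htpasswd utility that ships with Apache.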

The choice is yours. Have fun!
