I have a few pages on my website that I don't want crawled by any crawlers. How do I disallow a specific page or directory using robots.txt?
You can disallow a specific page (with its extension) or a directory in your robots.txt file using the Disallow directive, like this:

Disallow: /index_test.html
Disallow: /products
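Note that Disallow lines only take effect inside a User-agent group, so a minimal robots.txt blocking those paths for all crawlers would look like:

User-agent: *
Disallow: /index_test.html
Disallow: /products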
The following are ways to keep a page out of the index. In robots.txt:

Disallow: /index.html
Disallow: /prod

Or add a robots meta tag to the page's <head>:
<meta name="robots" content="noindex, nofollow" />
The next time your site is crawled, this tag will cause the page to drop out of Google's index.
You can also use the noarchive value, which blocks caching of your page. This is Google-specific:
<meta name="robots" content="noarchive" />
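If you want to sanity-check your robots.txt rules before deploying them, Python's standard urllib.robotparser can evaluate them. A quick sketch (the example.com URLs are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse the rules exactly as they would appear in robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /index_test.html",
    "Disallow: /products",
])

# Blocked paths return False; everything else returns True.
# Matching is prefix-based, so /products also covers /products/widget.
print(rp.can_fetch("*", "https://example.com/index_test.html"))  # False
print(rp.can_fetch("*", "https://example.com/products/widget"))  # False
print(rp.can_fetch("*", "https://example.com/about.html"))       # True
```

Keep in mind this only checks crawling rules; the noindex and noarchive meta tags above are read by the crawler when it fetches the page, not from robots.txt.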