How do I prevent pages from being crawled?
You can add the data-sj-noindex attribute anywhere in a page to prevent it from being indexed. Most commonly it is placed in a <meta> tag within the <head> of an HTML page, as follows:
- Locate the <head> tag of the page you want to prevent from being crawled.
- Add the following code within the <head>:
<meta name="robots" content="noindex" data-sj-noindex="">
- Save the changes. The crawler will ignore this page the next time it encounters it.
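For example, a minimal page that opts out of indexing might look like this (the title and body content are placeholders):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Internal page</title>
    <!-- data-sj-noindex tells the crawler to skip this page; the standard
         robots noindex directive is included for other crawlers -->
    <meta name="robots" content="noindex" data-sj-noindex="">
  </head>
  <body>
    ...
  </body>
</html>
```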
Additionally, you can use crawling rules to programmatically exclude sections or specific pages of your website. You can also set individual pages to not be indexed from the data sources tab of the admin console.