How do I prevent pages from being crawled?

You can add the data-sj-noindex attribute to any element on a page, and that page will not be indexed. It's fairly common to see the following in our customers' site headers:

 <meta name="robots" content="noindex" data-sj-noindex />
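The attribute doesn't have to live on a meta tag; placing it on any element in the page has the same effect. A minimal sketch, where the div and its class are hypothetical:

```html
<!-- Hypothetical example: the data-sj-noindex attribute can sit on
     any element, and the page carrying it will not be indexed. -->
<div class="internal-tools" data-sj-noindex>
  <p>Internal content we don't want in search results.</p>
</div>
```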

To prevent all pages from being indexed while still keeping the Sajari JS on the page, add the following to our install script:

 _sj.push(['noindex']);
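The push call sits on the command queue that the install snippet creates before the library loads. As a rough sketch (the _sj queue name comes from the line above; the surrounding bootstrap shown here is an assumption, not Sajari's actual install script):

```javascript
// Minimal sketch of the async command-queue pattern used by install
// snippets: commands are pushed onto a plain array, and the library
// drains the queue once it finishes loading.
var _sj = _sj || [];      // reuse the queue if the install script already created it
_sj.push(['noindex']);    // queue the command so no page carrying this script is indexed
```

This works because pushing onto a plain array is safe whether or not the library has loaded yet; the command is simply processed later.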

Additionally, you can use crawling rules to programmatically exclude sections or individual pages of your website. You can also set individual pages to not be indexed from the data sources tab of the admin console.
