You can prevent search engines from indexing selected pages by specifying the noindex metadata for the pages or overwriting MyCashflow's default robots.txt file.
If you'd like to prevent a small number of pages from being indexed, specifying the noindex metadata is the most efficient option. A good example of the use of the noindex metadata is a content page used as a thank you page for the contact form.
A larger number of pages is more easily excluded from indexing by using the robots.txt file.
Defining the noindex metadata for selected pages
Adding the noindex metadata requires modifying the store theme's HTML files. If you don't have any experience in editing HTML files, contact your theme's designer or our customer service.
Adding the noindex metadata for a page involves the following:
- Creating an alternative template for the theme
To be able to use alternative templates, you'll need the Web Designer extension.
- Adding the noindex metadata to the template's <head> element
You can add the noindex metadata to any of your online store's contents. The following instructions use a content page as an example:
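For example, the tag added to the template's <head> element could look like the following (the follow value is one common choice; it still lets search engines follow links on the page):

<meta name="robots" content="noindex, follow" />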
The next time search engines index your website, the noindex directive will indicate that the page shouldn't be included in search engine results.
If you'd like the page to be indexed in some store versions, add corresponding templates to those versions' store themes, and make the following change to the noindex meta tag:
<meta name="robots" content="index, follow" />
Overwriting the robots.txt file
If you'd like to prevent a substantial number of pages from being indexed by search engines, the easiest way to do it is by editing the robots.txt file.
The robots.txt file is a text file read by the search engine bots that crawl websites for search results. It specifies instructions and restrictions for bots regarding site indexing.
All MyCashflow online stores are equipped with a default robots.txt file, which can be found at https://www.storeaddress.fi/robots.txt
Google does not recommend using the robots.txt file for preventing indexing.
If other pages in your online store link to a page excluded from indexing in the robots.txt file, the page may be indexed nevertheless.
Here's how to overwrite the default MyCashflow robots.txt file:
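As a sketch, a robots.txt file that blocks bots from a campaign directory and a thank-you page might look like the following (the paths are hypothetical examples, not part of MyCashflow's default file):

User-agent: *
Disallow: /campaign/
Disallow: /thank-you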
The next time search engines index your online store, the settings defined in the robots.txt file will take effect and the pages you've indicated will be excluded from indexing.
If necessary, learn more about using robots.txt files from Google Help.
Version-specific robots.txt files
If you'd like to use different robots.txt files in different store versions, create a separate robots-VERSION.txt file for each of these versions and set up 301 redirects to the files.
In order to create redirects in your online store, you'll need the Redirects extension.
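For example, assuming an English store version under /en/ (the addresses below are hypothetical), the arrangement could look like this:

Version-specific file: https://www.storeaddress.fi/robots-en.txt
301 redirect: https://www.storeaddress.fi/en/robots.txt → https://www.storeaddress.fi/robots-en.txt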
The next time search engine robots index different versions of your store, the Redirects extension will redirect them to the version-specific robots.txt files.