Using robots.txt for Magento


Using a robots.txt file is essential for instructing bots and crawlers how, and at what rate, your shop should be indexed. In this article we explain how to configure your Hypernode to serve a robots.txt for one or multiple storefronts.

Manage multiple robots.txt files for multiple storefronts

If you want to use a different robots.txt for each storefront, you need to create some additional configuration:

  • Create a directory structure and copy the current robots.txt in place per storefront:
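The exact commands are not given here, so the following is a sketch of one possible layout, assuming each storefront is reached via its own hostname and using example.com and example.org as placeholder domains:

```
# Create a directory to hold the per-storefront robots files
mkdir /data/web/public/robots

# Copy the current robots.txt in place once per storefront,
# named after the hostname it should be served on
cp /data/web/public/robots.txt /data/web/public/robots/example.com.robots.txt
cp /data/web/public/robots.txt /data/web/public/robots/example.org.robots.txt
```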

  • Now create /data/web/nginx/server.robots with the following content:
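The snippet itself is not shown above; a minimal sketch, assuming the per-storefront files follow the `<hostname>.robots.txt` naming from the previous step, could look like this:

```
# /data/web/nginx/server.robots
# Serve a robots.txt per storefront, based on the requested hostname
location /robots.txt {
    rewrite ^/robots\.txt$ /robots/$host.robots.txt;
}
```

With this in place, a request for /robots.txt on example.com is internally rewritten to /robots/example.com.robots.txt, and likewise for every other storefront hostname.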

It’s recommended to give the robots.txt for each storefront a unique identifier (for example, the storefront name in a comment in the file), so it’s clear which robots.txt is served on which storefront.

  • Now test your robots.txt by requesting it on each storefront and verifying that the right file is served:
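For example, with curl (example.com and example.org are placeholder storefront domains; check that the unique identifier you added to each file shows up on the matching storefront):

```
curl https://www.example.com/robots.txt
curl https://www.example.org/robots.txt
```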

Now start editing your robots.txt for each store!

Manage one robots.txt for all storefronts

For Magento multi-site setups, it is also possible to manage a single robots.txt shared by all domains.
To do this, create a snippet in /data/web/nginx called server.robots with the following content:
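The snippet is not reproduced above; a minimal sketch, assuming the shared file lives at /data/web/public/robots.txt, could be:

```
# /data/web/nginx/server.robots
# Serve the same robots.txt on every storefront
location /robots.txt {
    alias /data/web/public/robots.txt;
}
```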

This will create a location /robots.txt which returns the same data on all storefronts.

Create a single robots.txt

If you only serve a single storefront (for example on a Hypernode Start plan), all you have to do is place a robots.txt file in /data/web/public and you’re done 🙂