Create a robots.txt for your Magento 1 shop

Using robots.txt for Magento

Introduction

Using a robots.txt is essential for instructing bots and crawlers how, and at what rate, your shop should be indexed. In this article we explain how to configure your Hypernode to serve a robots.txt for one or multiple storefronts.

Manage multiple robots.txt for multiple storefronts

If you want to use a different robots.txt for each storefront, some extra configuration is needed:

  • Create a directory structure and copy the current robots.txt in place per storefront:
# Loop over all store codes known to Magento (strip the CSV header, take the code column)
for CODE in $(n98-magerun sys:store:list --format csv | sed 1d | cut -d "," -f 2)
do
    # Create a directory per storefront and copy the current robots.txt into it
    mkdir -p /data/web/public/robots/$CODE
    cp /data/web/public/robots.txt /data/web/public/robots/$CODE/robots.txt
    # Append an identifier so you can tell which storefront the file belongs to
    echo -e "\n\n## robots for storefront: $CODE" >> /data/web/public/robots/$CODE/robots.txt
done
  • Now create /data/web/nginx/server.robots with the following content:
location /robots.txt {
    # Serve the robots.txt that matches the current storefront's $storecode
    rewrite ^/robots\.txt$ /robots/$storecode/robots.txt;
}

It’s recommended to give the robots.txt of each storefront a unique identifier (for example the storefront name in a comment in the file), so it’s clear which robots.txt is served on which storefront; the loop above already appends such a comment for you.

  • Now test your robots.txt by requesting it and verify that the right file is served:
curl -v https://www.example.com/robots.txt
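If the test returns a 404 or the wrong file, the $storecode variable used by the rewrite is probably not set on your node. It is normally defined by your multi-store nginx configuration; if you need to set it yourself, a hostname-to-store-code map along these lines could work (the file name, hostnames and store codes below are examples only):

# /data/web/nginx/http.storecodemap (example file name; http.* snippets end up in the http context)
map $http_host $storecode {
    # unmatched hostnames fall back to Magento's default store code
    default          default;
    www.example.nl   store_nl;
    www.example.de   store_de;
}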

Now start editing your robots.txt for each store!
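While editing, you can quickly check which file each domain serves by looking for the identifier comment added earlier; the hostnames below are placeholders for your own storefront domains:

for HOST in www.example.com www.example.nl www.example.de
do
    echo "== $HOST =="
    curl -s https://$HOST/robots.txt | grep '^## robots for storefront'
done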

Manage one robots.txt for all storefronts

For Magento multi-site setups, it is also possible to manage one single robots.txt for all domains.
To do this, create a snippet in /data/web/nginx called server.robots that serves the same file on every domain.
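A minimal sketch of such a snippet, assuming the shared file lives at /data/web/public/robots.txt, could look like this:

location /robots.txt {
    # every storefront gets the same shared robots.txt
    alias /data/web/public/robots.txt;
}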

This will create a location /robots.txt which returns the same data on all storefronts.

Create a single robots.txt

If you only want to serve a single storefront (for example on a Start Hypernode), all you have to do is place a robots.txt file in /data/web/public and you’re done 🙂
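As a starting point you could create the file directly on your node; the disallow rules and sitemap URL below are only placeholders to adapt to your own shop:

cat > /data/web/public/robots.txt <<'EOF'
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Sitemap: https://www.example.com/sitemap.xml
EOF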
