nginx rewrites for robots.txt in Drupal multisite with Aegir

We work with Aegir and nginx a lot. Aegir's nginx configuration files support the robotstxt module out of the box, but you still have to remember to delete the robots.txt file from the root of the Drupal platform and install the module for it to work. To avoid those steps, we swapped out the default robots.txt directives in the nginx_simple_include.conf that ships with Aegir for the following:

    ###
    ### Handle requests for /robots.txt.
    ### First we check for the existence of a site-specific robots.txt.
    ### If there is none, we fall back to the default Drupal robots.txt.
    ###
    location = /robots.txt {
      access_log    off;
      log_not_found off;
      try_files /sites/$server_name/robots.txt @robots;
    }

    location @robots {
      access_log    off;
      log_not_found off;
      try_files /robots.txt @cache;
    }

This basically says: when a request comes in for /robots.txt, check for the file /sites/$server_name/robots.txt and serve it if it exists; if it does not, fall through to the named location @robots. @robots in turn serves the default /robots.txt at the platform root, handing off to Aegir's @cache location if that is missing as well. Note that the last argument to try_files is an internal fallback rather than a file check, which is why each location chains to a named location.
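With this in place, giving one site in the multisite its own robots.txt is just a matter of dropping a file into its sites directory. A quick sketch of how you might test it; the platform path and domains here are hypothetical, so substitute your own:

    # Give example.com its own robots.txt (paths are illustrative)
    cd /var/aegir/platforms/example-platform
    echo "User-agent: *" >  sites/example.com/robots.txt
    echo "Disallow: /"   >> sites/example.com/robots.txt

    # This site now gets its custom file...
    curl http://example.com/robots.txt

    # ...while other sites on the same platform still get the default robots.txt
    curl http://other.example.com/robots.txt

Since $server_name matches the site's directory name under sites/, no per-site nginx configuration is needed.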

Hope this helps others!