How to create robots.txt file for all domains on Apache server
We have an XAMPP Apache development web server set up with virtual hosts and want to stop search engines from crawling all our sites. This is easily done with a robots.txt file. However, we'd rather not include a disallow robots.txt in every vhost and then have to remove it when we go live with the site on another server.
Is there a way with an apache config file to rewrite all requests to robots.txt on all vhosts to a single robots.txt file?
If so, could you give me an example? I think it would be something like this:
RewriteEngine On
RewriteRule .*robots\.txt$ C:\xampp\vhosts\override-robots.txt [L]
Thanks!
Solution 1:
Apache mod_alias is designed for this. It's part of the core Apache distribution, can be set in one place, and has almost no processing overhead, unlike mod_rewrite.
Alias /robots.txt C:/xampp/vhosts/override-robots.txt
With that line in the apache2.conf file, outside all the vhosts, a request for http://example.com/robots.txt on any website the server hosts will output the given file.
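A fuller sketch of the global config, assuming a default XAMPP layout (the paths and the Apache 2.4-style access directive are illustrative, not taken from the question):

```apache
# In httpd.conf / apache2.conf, outside any <VirtualHost> block
Alias /robots.txt "C:/xampp/vhosts/override-robots.txt"

# Apache 2.4+ must also be allowed to read the aliased file's directory
<Directory "C:/xampp/vhosts">
    Require all granted
</Directory>
```

On Apache 2.2 the access directives would instead be `Order allow,deny` followed by `Allow from all`.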
Solution 2:
Put your common global robots.txt file somewhere in your server's filesystem that is accessible to the Apache process. For the sake of illustration, I'll assume it's at /srv/robots.txt.
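For the goal described in the question (blocking compliant crawlers everywhere), the shared file would contain the standard disallow-all rules:

```
User-agent: *
Disallow: /
```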
Then, to set up mod_rewrite to serve that file to clients who request it, put the following rules into each vhost's <VirtualHost> config block:
RewriteEngine on
RewriteRule ^/robots\.txt$ /srv/robots.txt [NC,L]
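For context, a minimal vhost with the rules in place might look like this (the ServerName, DocumentRoot, and Linux-style /srv path are placeholders; on the XAMPP setup from the question the target would be a Windows path such as C:/xampp/vhosts/override-robots.txt):

```apache
<VirtualHost *:80>
    ServerName dev.example.com
    DocumentRoot "C:/xampp/htdocs/dev"

    RewriteEngine on
    # In per-server context, a substitution that is an existing
    # file path is served directly from the filesystem
    RewriteRule ^/robots\.txt$ /srv/robots.txt [NC,L]
</VirtualHost>
```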
If you're putting the rewrite rules into per-directory .htaccess files rather than <VirtualHost> blocks, you will need to modify the rules slightly:
RewriteEngine on
RewriteBase /
RewriteRule ^robots\.txt$ /srv/robots.txt [NC,L]