Nginx robots.txt configuration
I can't seem to get nginx to serve robots.txt properly. Ideally, I don't want the file at all and would rather serve the text content directly from the nginx configuration. Here's my config:
server {
    listen 80 default_server;
    listen [::]:80 default_server ipv6only=on;

    root /usr/share/nginx/html;
    index index.html index.htm;

    server_name localhost;

    location = /robots.txt {
        #return 200 "User-agent: *\nDisallow: /";
        #alias /media/ws/crx-apps/eap/welcome-content/robots.txt;
        alias /usr/share/nginx/html/dir/robots.txt;
    }

    location / {
        try_files $uri $uri/ =404;
    }
}
None of the alternatives in the = /robots.txt location work, and I don't get why. Accessing http://localhost/robots.txt gives a 404, while http://localhost/index.html is served properly.
Note that I haven't changed any of nginx's default settings as installed by apt-get install nginx, apart from adding the new location (for testing).
Solution 1:
Firstly, I think the problem in your config is the regular expression used for matching. It helps to write the location as an exact match, which prevents possible mistakes with pattern matching:
location = /robots.txt {
    alias /usr/share/nginx/html/dir/robots.txt;
}
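Since you said you'd ideally like to avoid the file altogether, the commented-out return approach from your config also works. Here is a minimal sketch; the default_type line is an assumption, added so the body is sent as text/plain rather than the server-wide default type:

location = /robots.txt {
    # Send the robots.txt body straight from the config; no file on disk is needed.
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}

Because return never touches the filesystem, this variant also sidesteps any permission problems for this location.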
Secondly, you should also check the permissions and the effective owner of /usr/share/nginx/html/dir/robots.txt: the nginx worker process needs read permission on the file and execute (search) permission on every directory in the path.
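A quick way to check that from a shell; www-data is assumed to be the worker user here, which is the default for the nginx package on Debian/Ubuntu:

# Show the owner and mode of every component in the path
namei -om /usr/share/nginx/html/dir/robots.txt

# Try to read the file as the assumed worker user (www-data)
sudo -u www-data cat /usr/share/nginx/html/dir/robots.txt

# The error log usually names the exact path nginx tried to open and why it failed
tail /var/log/nginx/error.log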