robots.txt directly from nginx

I have some websites published on a public IP for accessibility, but they are meant for my customers' internal, private use.
For these websites we don't need search engine indexing.
But every time I upload a new version to the remote server, I delete robots.txt and, sometimes, I forget to recreate it.

Since we don't need indexing at all, our robots.txt is a very, very simple file that just blocks every crawler:

User-agent: *
Disallow: /

So why not serve it directly from nginx, without adding a real file?

After a little searching I found a good example on Server Fault and modified all my nginx server blocks:

server {
  listen myport;
  root /my/www/root;
  server_name your.site.dot;
  ...
  # serve robots.txt from the configuration itself, no file on disk needed
  location = /robots.txt {
    add_header Content-Type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
  }
  ...
}
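
After editing the server blocks, it's worth validating the configuration and reloading nginx before deleting anything; a minimal sketch, assuming nginx is managed by systemd:

# check the configuration for syntax errors
sudo nginx -t
# reload nginx without dropping active connections
sudo systemctl reload nginx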

… now we can delete robots.txt from the web root and test it.

tips

To test it, remember to disable the browser cache or use curl -v https://your.site.dot/robots.txt
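
A quick sketch of how to check the headers and the body in a single call, assuming your.site.dot is reachable over HTTPS:

# -s silences the progress bar, -D - dumps the response headers to stdout before the body
curl -s -D - https://your.site.dot/robots.txt

The output should include the 200 status line, the Content-Type: text/plain header added in the config, and the two-line robots.txt body.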