
I'd like the robots.txt for the dev. subdomain to point to a specific static robots.txt file.

What routing rule do I need?

This is what I'm thinking:

if ($host = "dev.example.com") {
    location ^~ /robots.txt {
        allow all;
        root /static/disallow/robots.txt;
    }
}

Based on Nginx subdomain configuration, I believe I may just need to make separate server blocks and use includes. If a simple routing rule won't do it, is the include method how this is typically done?

1 Answer


If you want a specific robots.txt for each subdomain, you can do so with separate server blocks like this, which you allude (and link) to in your question:

server {
    server_name subdomainA.domain.com;
    include /common/config/path;
    location = /robots.txt {
        root /subdomainA/path;
    }
}
server {
    server_name subdomainB.domain.com;
    include /common/config/path;
    location = /robots.txt {
        root /subdomainB/path;
    }
}
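If you'd rather avoid duplicating server blocks, a single block can also select the document root per host with nginx's map directive, which sidesteps if entirely. A rough sketch (the hostnames and paths here are placeholders, not taken from your config):

    # In the http context: pick a root directory based on $host
    map $host $robots_root {
        default            /var/www/robots/default;
        dev.example.com    /var/www/robots/disallow;
    }

    server {
        server_name example.com dev.example.com;
        location = /robots.txt {
            root $robots_root;
        }
    }

Note that root must point to a directory, not to the robots.txt file itself; nginx appends the request URI (/robots.txt) to it.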

Regarding your other approach, have you read If is Evil?
