
How to Setup a Multi Environment Robots.txt file in Ruby on Rails

by Adrian Randall, July 3, 2016

Almost every web developer I know has, at some point, forgotten to block search engines and had a test site rank in Google. Here is a quick way to set up the robots.txt file in Ruby on Rails once so it is configured correctly for development, staging, and production.

1. In a common controller, such as PagesController or HomeController, create an action called robots.

2. Create one robots.txt file per environment and put each of them in the config/ directory:

config/robots.development.txt

config/robots.staging.txt

config/robots.production.txt
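As a sketch of typical contents: the development and staging files block all crawlers, while the production file allows everything. An empty Disallow line means nothing is blocked:

```
# config/robots.development.txt and config/robots.staging.txt
User-agent: *
Disallow: /
```

```
# config/robots.production.txt
User-agent: *
Disallow:
```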

3. Lastly, add a route that renders the correct file as plain text when someone (a search engine) requests the robots.txt file:

config/routes.rb

And you’re done.

About The Author
Adrian Randall
I'm a digital marketing specialist, love working on digital business and coding on just about anything. I'm the founder of Arcadian Digital and this site shares some of our knowledge and practices.
