I was asked to implement X-Robots-Tag on one of our dedicated servers the other day.
This Google document explains in great detail what it does and how it influences your site's presentation in search results.
I have my doubts about whether "otherbot" crawlers are actually willing to conform to the Google-defined rules, but that is beyond the scope of this post.
The rest is the implementation. There are 500+ virtual hosts on the server, and each of them should have the following added to its */public_html/.htaccess:
```apache
<IfModule mod_headers.c>
Header set X-Robots-Tag "googlebot: nofollow"
Header add X-Robots-Tag "otherbot: noindex, nofollow"
</IfModule>
```
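As an aside: if you have access to the main server configuration, the same headers could in principle be set once in a `<Directory>` block covering all the document roots, instead of editing 500 .htaccess files. A hedged sketch (the path pattern is an assumption; adjust to your layout, and mod_headers must be loaded):

```apache
# Hypothetical server-config equivalent; covers every /home/*/public_html
<Directory "/home/*/public_html">
    <IfModule mod_headers.c>
        Header set X-Robots-Tag "googlebot: nofollow"
        Header add X-Robots-Tag "otherbot: noindex, nofollow"
    </IfModule>
</Directory>
```

In this particular case the per-host .htaccess route was the requirement, so the rest of the post follows that approach.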
Now, adding these lines to all the .htaccess files while preserving whatever was there before is a little bit tricky.
First, we need to create a small shell script that will do the actual appending:
```bash
#!/bin/bash
# /usr/local/src/add.sh: append the X-Robots-Tag block to the file given as $1.
# Quoting "$1" protects against paths containing spaces.
echo '<IfModule mod_headers.c>' >> "$1"
echo 'Header set X-Robots-Tag "googlebot: nofollow"' >> "$1"
echo 'Header add X-Robots-Tag "otherbot: noindex, nofollow"' >> "$1"
echo '</IfModule>' >> "$1"
```
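The four echo calls could also be collapsed into a single heredoc, which keeps the appended block readable as one unit. A sketch of that alternative, wrapped in a function here purely for illustration (`add_robots_tag` is a hypothetical name, not part of the original script):

```shell
#!/bin/bash
# Alternative sketch of add.sh: append the whole X-Robots-Tag block with
# one quoted heredoc instead of four echo calls. Target file is the first
# argument; the quoted 'EOF' delimiter prevents any shell expansion inside.
add_robots_tag() {
  cat >> "$1" <<'EOF'
<IfModule mod_headers.c>
Header set X-Robots-Tag "googlebot: nofollow"
Header add X-Robots-Tag "otherbot: noindex, nofollow"
</IfModule>
EOF
}
```

Either form appends the same four lines; the heredoc is just less repetitive.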
And from the command line, something like this:
```bash
for i in /home/*/public_html; do
    sh /usr/local/src/add.sh "$i/.htaccess"
done
```
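One caveat: rerunning that loop appends the block a second time. A hedged sketch of an idempotent variant, which skips any .htaccess that already mentions X-Robots-Tag (`append_if_missing` is a hypothetical helper name, shown here with the append inlined so it is self-contained):

```shell
#!/bin/bash
# Idempotent sketch: append the X-Robots-Tag block only when the file
# does not already contain an X-Robots-Tag header, so the loop can be
# rerun safely. 2>/dev/null covers a not-yet-existing .htaccess.
append_if_missing() {
  local htaccess="$1"
  if ! grep -q 'X-Robots-Tag' "$htaccess" 2>/dev/null; then
    printf '%s\n' \
      '<IfModule mod_headers.c>' \
      'Header set X-Robots-Tag "googlebot: nofollow"' \
      'Header add X-Robots-Tag "otherbot: noindex, nofollow"' \
      '</IfModule>' >> "$htaccess"
  fi
}

# Usage would mirror the loop above, e.g.:
# for i in /home/*/public_html; do append_if_missing "$i/.htaccess"; done
```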
Done. I won't pretend this is the optimal way to do it, but it's quick and it works for me. YMMV.