
robots.txt the nightmare config

[root@ks4000391 ~]# curl
User-agent: *
Disallow: /

[root@ks4000391 ~]#

Starting the Google fetcher from Webmaster Tools gives:

The page could not be crawled at this time because it is blocked by the most recent robots.txt file Googlebot downloaded. Note that if you recently updated the robots.txt file, it may take up to two days before it's refreshed. You can find more information in the Help Center article about robots.txt.
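One way to confirm locally what those two rules actually block is Python's standard-library robot parser (shown here with Python 3 module names; the server in the transcript runs Python 2.6, where the module is `robotparser`):

```python
# Parse the same rules that curl returned and ask whether Googlebot
# may fetch a path. "Disallow: /" under "User-agent: *" blocks every
# path for every crawler, which matches Google's error message.
from urllib.robotparser import RobotFileParser

rules = "User-agent: *\nDisallow: /\n"

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/questions/"))  # False
print(parser.can_fetch("*", "/"))                    # False
```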

I have tried all combinations in admin/robots/rule, but with no success :( Please help!

zulp
asked 2013-02-28 16:02:13 -0500, updated 2013-02-28 16:02:41 -0500


I agree with you, it's quite an annoyance; we noticed this on one site only two weeks after upgrading the app.

Evgeny (2013-02-28 16:36:37 -0500)

2 Answers


The problem is that the django-robots app makes the default policy "disallow everything".

You can downgrade the app to version 0.8.0 and this issue will go away.

We will need to do something about this robots app - either replace it with a plain-text file or add a more reasonable default.
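For the plain-text-file route, one option is to take the Django app out of the loop entirely and serve robots.txt as a static file from the front-end web server. A sketch for nginx follows; the server choice and paths are assumptions, since the thread doesn't say what fronts this site:

```nginx
# Serve a hand-written robots.txt directly, bypassing Django and
# the django-robots app. The alias path is illustrative.
location = /robots.txt {
    alias /var/www/askbot/robots.txt;
    default_type text/plain;
}
```

With this in place, whatever rules django-robots generates are never reached for that URL.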

Evgeny
answered 2013-02-28 16:25:43 -0500



How do I downgrade, then? Remove and reinstall?

rm -rf /usr/lib/python2.6/site-packages/django_robots-0.9.1-py2.6.egg*

and reinstall 0.8.0?

Any other configuration needed?

coldsystem
answered 2013-02-28 17:15:17 -0500



To downgrade, type these two commands:

pip uninstall django-robots
pip install django-robots==0.8.0

Fitoria (2013-02-28 17:27:48 -0500)

Right, and you might also need to clear the build directory inside your virtual environment; it acts as a cache on some systems (like on my Mac).
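As a concrete sketch of that cleanup (the exact path is an assumption: virtualenv-era pip kept unpacked sources in a build/ directory under the environment root, which could shadow the version you asked for on reinstall):

```shell
# Remove pip's cached unpacked source for django-robots so the next
# "pip install django-robots==0.8.0" rebuilds from a fresh download.
# $VIRTUAL_ENV is set by virtualenv's activate script; the fallback
# below is a placeholder, not a real path.
VIRTUAL_ENV="${VIRTUAL_ENV:-/path/to/your/venv}"
rm -rf "$VIRTUAL_ENV/build/django-robots"
```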

Evgeny (2013-02-28 17:41:24 -0500)