
robots.txt the nightmare config
 

[root@ks4000391 ~]# curl ---.com/robots.txt
User-agent: *
Disallow: /


Sitemap: ------.com/sitemap.xml
[root@ks4000391 ~]#

Then I start the Google fetcher from Webmaster Tools:

The page could not be crawled at this time because it is blocked by the most recent robots.txt file Googlebot downloaded. Note that if you recently updated the robots.txt file, it may take up to two days before it's refreshed. You can find more information in the Help Center article about robots.txt.

I have tried every combination in admin/robots/rule, but with no success :( Please help!!
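For reference, the file shown above is the most restrictive one possible: "Disallow: /" under "User-agent: *" tells every crawler to fetch nothing, which is exactly why Googlebot refuses the page. A permissive robots.txt that allows everything would look like this (the sitemap URL here is a placeholder, not your real one):

User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml

An empty Disallow line means "nothing is disallowed".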

zulp
asked 12 years ago, updated 12 years ago

Comments

I agree, it's quite an annoyance; we noticed this on one site, and only two weeks after upgrading the app.

Evgeny (12 years ago)

2 Answers


The problem is that the django-robots app makes the default policy "disallow everything".

You can downgrade the app to version 0.8.0 and this issue will go away.

We will need to do something about this robots app: either replace it with a plain-text file or ship a more reasonable default.
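In the meantime, a plain-text replacement is easy to sketch. The helper below is hypothetical (the function name and sitemap URL are mine, not part of django-robots); it builds a permissive robots.txt body that you could serve from a simple view instead of the app's disallow-everything default:

```python
def permissive_robots_txt(sitemap_url=None):
    # An empty "Disallow:" line allows all paths for every crawler,
    # the opposite of the "Disallow: /" that django-robots emits by default.
    lines = ["User-agent: *", "Disallow:"]
    if sitemap_url:
        lines += ["", "Sitemap: " + sitemap_url]
    return "\n".join(lines) + "\n"

print(permissive_robots_txt("https://example.com/sitemap.xml"))
```

In Django you would return this string from a view with content_type="text/plain" and map it to /robots.txt in your URLconf.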

Evgeny
answered 12 years ago


How do I downgrade then? Remove and reinstall?

rm -rf /usr/lib/python2.6/site-packages/django_robots-0.9.1-py2.6.egg*

and reinstall 0.8.0?

Is any other configuration needed?

coldsystem
answered 12 years ago

Comments


To downgrade, type these two commands:

pip uninstall django-robots

pip install django-robots==0.8.0

Fitoria (12 years ago)

Right, and you might also need to clear the build directory inside your virtual environment; it can act as a cache on some systems (it did on my Mac).

Evgeny (12 years ago)