robots.txt: the nightmare config
[root@ks4000391 ~]# curl ---.com/robots.txt
User-agent: *
Disallow: /
Sitemap: ------.com/sitemap.xml
[root@ks4000391 ~]#
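The file being served above contains `Disallow: /` under `User-agent: *`, which blocks every crawler from every path on the site, and that is exactly what Googlebot is reporting. As a sketch (domain elided, as in the original output), a robots.txt that allows crawling while keeping the sitemap reference would instead look like:

```
User-agent: *
Disallow:
Sitemap: ------.com/sitemap.xml
```

An empty `Disallow:` value permits all URLs, so the fix is in whatever admin setting generates that `Disallow: /` line, not in the sitemap.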
Fetching the page with Fetch as Google from Webmaster Tools returns:
The page could not be crawled at this time because it is blocked by the most recent robots.txt file Googlebot downloaded. Note that if you recently updated the robots.txt file, it may take up to two days before it's refreshed. You can find more information in the Help Center article about robots.txt.
I have tried every combination of rules at admin/robots/rule, but with no success :( Please help!
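You can verify locally what Googlebot will conclude from a given robots.txt before waiting days for a re-crawl. A minimal sketch using Python's standard-library `urllib.robotparser`, with the rules hard-coded to match the file shown above:

```python
from urllib.robotparser import RobotFileParser

# The rules as served above: "Disallow: /" under the wildcard
# user-agent group blocks every path for every crawler.
robots_txt = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches the "*" group, so nothing on the site is fetchable.
print(rp.can_fetch("Googlebot", "/"))            # False
print(rp.can_fetch("Googlebot", "/sitemap.xml")) # False
```

Once the admin setting is corrected, re-running this check against the live file (via `rp.set_url(...)` and `rp.read()`) should print `True` before you ask Google to re-fetch.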
Comments
I agree with you, it's quite an annoyance. We noticed this on only one site, two weeks after upgrading the app.