
robots.txt: the nightmare config

[root@ks4000391 ~]# curl ---.com/robots.txt
User-agent: *
Disallow: /


Sitemap: ------.com/sitemap.xml
[root@ks4000391 ~]#
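
As far as I understand, "Disallow: /" in the output above tells every crawler to stay away from the whole site. What I want to end up with is something like the following, which allows crawling while still advertising the sitemap (example.com just stands in for my real domain, which I redacted above):

User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml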

Then I start the Google fetcher from Webmaster Tools:

The page could not be crawled at this time because it is blocked by the most recent robots.txt file Googlebot downloaded. Note that if you recently updated the robots.txt file, it may take up to two days before it's refreshed. You can find more information in the Help Center article about robots.txt.

I have tried every combination in admin/robots/rule, but with no success :( Please help!
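
In the meantime, this is how I keep double-checking what is actually being served, in case a cache is handing Googlebot an old copy (example.com again stands in for my real domain). The message above says a refreshed robots.txt can take up to two days to be picked up, so I re-run these after each change:

curl -s https://example.com/robots.txt
curl -sI https://example.com/robots.txt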
