Hi,

We have a sitemap at our site, http://www.gamezebo.com/sitemap.xml

Some of the URLs in the sitemap are being reported in Webmaster Central as blocked by our robots.txt (see gamezebo.com/robots.txt), although these URLs are not disallowed in robots.txt. There are other such URLs as well; for example, gamezebo.com/gamelinks is present in our sitemap, but it is being reported as "URL restricted by robots.txt".

Also, I have this parse result in Webmaster Central that says, "Line 21: Crawl-delay: 10 - Rule ignored by Googlebot". What does it mean?

I appreciate your help,

Thanks.

A: 

Crawl-delay is not part of the original robots.txt specification, so Googlebot ignores that line. If you want to limit how fast Google crawls your site, you can set a custom crawl rate in Google Webmaster Tools under Settings > Crawl rate.
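You can also sanity-check your robots.txt rules yourself before trusting any reporting tool. Here's a minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt content below is hypothetical (modeled on the situation described, not the actual gamezebo.com file):

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration only --
# the real gamezebo.com/robots.txt may differ.
robots_txt = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A URL not matched by any Disallow rule is allowed:
print(rp.can_fetch("Googlebot", "http://www.gamezebo.com/gamelinks"))

# The parser reads Crawl-delay, but Googlebot itself ignores the directive:
print(rp.crawl_delay("*"))
```

If `can_fetch` returns True here but Webmaster Central still reports the URL as restricted, the mismatch is on Google's side (often a stale cached copy of robots.txt), which is worth re-checking after a few days.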

Carter Cole