Edit: I learned that my error was unrelated to the robots file. Disregard.

I just learned the hard way that Google blocks access to the Maps API if you have a restrictive robots.txt file. I recently created a robots file with "Disallow: /". Now my site can no longer use Maps. Rats.

I removed the robots file, but I still cannot use Maps. I also tried creating a completely permissive file ("Disallow: "), and that has not solved the issue yet either.
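For reference, the two variants I tried look like this (a Disallow line is normally paired with a User-agent line; the exact crawler names are just the standard wildcard form, not something specific to my setup):

```
# Restrictive: tells compliant crawlers to skip the whole site
User-agent: *
Disallow: /

# Permissive: an empty Disallow allows everything
User-agent: *
Disallow:
```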

Can anyone tell me the next step? If at all possible, I'd prefer the site not show up in Google, since it's a staging site. But I also don't know how long before they rescan for a new robots file.

+2  A: 

I don't think this is your problem. I'm successfully running Google maps on an internal development server which Google can't crawl.

Are you getting an error message?

As for rescanning the robots.txt file, you can use the Google Webmaster tools app to request a rescan.

David
You were right. It was just a coincidence that I created the robots file in the same commit in which I changed some status checks for unrelated functions. This is embarrassing, but I had tried to standardize some of my status codes to lowercase 'ok', while Google sends 'OK'. Debugging JScript is not one of my strengths. :-) Thanks for the reply.
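The pitfall above can be sketched in a few lines. This is a simplified stand-in for a Maps API callback, not the real API surface; the point is only that string comparison is case-sensitive, so a lowercase 'ok' check never matches the uppercase 'OK' the API sends:

```javascript
// Hypothetical response handler; 'status' is whatever string the API
// delivers to the callback (Google Maps reports success as 'OK').
function handleResponse(status) {
  // Broken check: never true, because the API sends 'OK', not 'ok'.
  var brokenCheck = (status === 'ok');

  // Working check: compare against the exact uppercase value.
  var workingCheck = (status === 'OK');

  return { brokenCheck: brokenCheck, workingCheck: workingCheck };
}
```

Normalizing with `status.toUpperCase()` before comparing would also have masked the problem, but matching the API's documented value exactly is the less surprising fix.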
Mark