Disallow: /*“http:

is what I've been using; my guess is that I may need to escape the quotation mark somehow (see the sketch below). In Google Webmaster Tools, where it lets you view the robots.txt file and test it against a few URLs, it doesn't even read that quotation mark.

Google Webmaster Tools displays the robots.txt file with this line shown without the quotation mark:

Disallow: /*http:
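One possibility on the escaping question (an assumption on my part, not confirmed crawler behaviour) is that rules get compared against the percent-encoded form of the URL path, in which case the curly quote would need to appear in the rule as its UTF-8 percent-encoding rather than as a raw character. A quick way to compute that encoding in Python:

from urllib.parse import quote

# U+201C LEFT DOUBLE QUOTATION MARK, percent-encoded as UTF-8.
encoded = quote('\u201c')             # -> '%E2%80%9C'
print(f"Disallow: /*{encoded}http:")  # -> Disallow: /*%E2%80%9Chttp: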

Any suggestions would be appreciated.

The underlying issue is that a script was formatted incorrectly, which is generating crawl errors for the site:

http://www.domain.com/“http://www.domain.com/directory/directory/dir_ectory/dir_ectory/pagetitle"

is an example of one of the pages we get a crawl error for. My assumption is that fixing the robots.txt file will stop these pages from showing up in our crawl errors in Webmaster Tools.
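To sanity-check whether a given Disallow pattern would actually cover these malformed URLs before relying on it, here is a rough, self-contained matcher. It is a simplified model that assumes Google-style '*' wildcards and comparison against the percent-encoded path; it is not a statement of Googlebot's exact behaviour.

import re
from urllib.parse import quote, urlsplit

def rule_matches(disallow_pattern: str, url: str) -> bool:
    """Rough check of a robots.txt Disallow pattern against a URL.

    Simplified model: '*' is treated as a wildcard and the URL path is
    compared in its percent-encoded form (an assumption about how
    crawlers match, not Googlebot's documented implementation).
    """
    path = urlsplit(url).path
    # Percent-encode non-ASCII characters; keep '/' and ':' literal so
    # patterns like '/*...http:' can still line up with the path.
    encoded_path = quote(path, safe="/:")
    # Translate the pattern into a regex: '*' becomes '.*'.
    regex = re.escape(disallow_pattern).replace(r"\*", ".*")
    return re.match(regex, encoded_path) is not None

# The malformed URL from the crawl errors (raw curly quote in the path).
bad_url = ("http://www.domain.com/\u201chttp://www.domain.com/"
           "directory/directory/dir_ectory/dir_ectory/pagetitle\"")

print(rule_matches("/*%E2%80%9Chttp:", bad_url))  # True  (encoded rule matches)
print(rule_matches("/*\u201chttp:", bad_url))     # False (raw quote never lines up)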