views: 15
answers: 1

Hi,

It looks to me like crawlers are trying to resolve the index of every subfolder of the public folder, such as "/images/foo", which results in a 404 error. Should I do something about this, or is it normal?

+1  A: 

In your robots.txt (inside a User-agent group), add:

Disallow: /images/foo

Try to keep the crawlers away from anything they don't need in order to make them focus on the things you do need.
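
For reference, a minimal complete robots.txt along these lines might look like the sketch below; the /images/foo path comes from the question, and applying the rule to all crawlers via User-agent: * is an assumption rather than something stated in the thread.

# Sketch: keep every crawler out of the /images/foo folder.
# Adjust the path (and the User-agent line) to fit your own site.
User-agent: *
Disallow: /images/foo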

Trip
And what about images inside of foo, e.g. /images/foo/bar.png?
funkycottleti
No, just block out the entire folder with /images/. 99% of the time, you won't need crawlers to see or access this folder.
Trip
And mark my question right if you think it's worth it. :D
Trip
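
(A note on the follow-up about /images/foo/bar.png: Disallow rules match by path prefix, so a sketch like the one below would also keep compliant crawlers away from /images/foo/bar.png and anything else under the folder; the exact paths are only illustrative.)

User-agent: *
# Prefix match: this covers /images/, /images/foo/, /images/foo/bar.png, etc.
Disallow: /images/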