Hello. After several months of the site being missing from the results of every major search engine, I finally found a possible reason.
I used WebBug to inspect the server headers. Note the difference depending on whether the request is HEAD or GET.
HEAD Sent data:
HEAD / HTTP/1.1
Host: www.attu.it
Connection: close
Accept: */*
User-Agent: WebBug/5.0
HEAD Received data:
HTTP/1.1 403 Forbidden
Date: Tue, 10 Aug 2010 23:01:00 GMT
Server: Apache/2.2
Connection: close
Content-Type: text/html; charset=iso-8859-1
GET Sent data:
GET / HTTP/1.1
Host: www.attu.it
Connection: close
Accept: */*
User-Agent: WebBug/5.0
GET Received data:
HTTP/1.1 200 OK
Date: Tue, 10 Aug 2010 23:06:15 GMT
Server: Apache/2.2
Last-Modified: Fri, 08 Jan 2010 08:58:01 GMT
ETag: "671f91b-2d2-47ca362815840"
Accept-Ranges: bytes
Content-Length: 722
Connection: close
Content-Type: text/html
// HTML code here
Now, browsers send a GET request by default (at least that's what Firebug says). Is it possible that crawlers send a HEAD request instead? If so, why does only this server respond with a 403, while the servers of the other sites I'm maintaining do not?
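For anyone who wants to reproduce the comparison without WebBug, here is a minimal sketch using Python's standard library. The host is the one from the question; the User-Agent string is an arbitrary placeholder, not what any real crawler sends.

import http.client

HOST = "www.attu.it"  # host from the question

def probe(method):
    # Send a single request with the given method and print the status line.
    conn = http.client.HTTPConnection(HOST)
    conn.request(method, "/", headers={
        "User-Agent": "header-probe/1.0",  # placeholder UA, for illustration only
        "Accept": "*/*",
        "Connection": "close",
    })
    resp = conn.getresponse()
    print(method, resp.status, resp.reason)
    conn.close()

for method in ("HEAD", "GET"):
    probe(method)

With the behaviour described above, this should print a 403 for HEAD and a 200 for GET.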
In case it's important, the only line present in the .htaccess is the following (unless my client changed it, as they don't want to give me access to their server):
AddType text/x-component .htc
UPDATE
Thanks @Ryk. Firebug and Fiddler both send GET requests, which receive 200 (or 300) responses, as expected. So I guess it's either a bad server setting (which would be strange, as the hosting is with a major company with millions of clients) or something they put in the .htaccess. They will have to let me look into their account.
The second part of my question was whether that could be the reason the website doesn't appear in any search engine (site:www.attu.it gives no results). Any thoughts?
UPDATE 2
After some fiddling around, it turned out that the robots-blocking .htaccess shipped with phpMyAdmin had been placed in the root directory, which caused any request from a robot to be answered with a 403 Forbidden.
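A quick way to confirm that the block was User-Agent based is to repeat the same GET with a crawler-like and a browser-like User-Agent and compare the status codes. This is a sketch in the same vein as above; the User-Agent strings are just illustrative examples.

import http.client

HOST = "www.attu.it"  # host from the question

# Illustrative User-Agent strings; real crawlers may send different values.
AGENTS = {
    "crawler-like": "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "browser-like": "Mozilla/5.0 (Windows NT 6.1; rv:1.9.2) Gecko/20100101 Firefox/3.6",
}

for label, ua in AGENTS.items():
    conn = http.client.HTTPConnection(HOST)
    conn.request("GET", "/", headers={"User-Agent": ua})
    resp = conn.getresponse()
    print(label, resp.status, resp.reason)  # 403 for the crawler-like UA would confirm the block
    conn.close()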