views:

23

answers:

2

I have a dynamically generated RSS feed that is about 150MB in size (don't ask).
The problem is that it keeps crapping out sporadically, and there is no way to monitor it without downloading the entire feed to get a 200 status. Pingdom times out on it and returns a 'down' error.

So my question is: how do I check that this thing is up and running?

+1  A: 

What type of web server and server-side coding platform are you using (if any)? Is any of the content coming from a backend system/database to the web tier?

Are you sure the problem is not with the client code accessing the file? Most clients have timeouts, and downloading large files over the internet can be a problem depending on how the server behaves. That is why file download utilities track progress and download in chunks.

It is also possible that other load on the web server, or the number of users, is impacting the server. If the server has little memory available, it may not be able to serve a file of that size to many concurrent users. You should review how the server is sending the file and make sure it is chunking the response.
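The chunking idea above can be sketched as follows. This is a minimal illustration, not the author's PHP script: it streams a large file in fixed-size pieces so the whole response never has to sit in memory at once (the function name and chunk size are my own choices).

```python
def stream_file(path, chunk_size=64 * 1024):
    """Yield a large file in fixed-size chunks instead of
    reading it all into memory before sending."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk
```

A web framework would typically pass a generator like this straight to the response, so memory use stays constant regardless of file size.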

I would recommend that you do a HEAD request to check, at minimum, that the URL is accessible and the server is responding. The next step might be to set up your download test inside, or very close to, the data center hosting the file. This may reduce cost and will reduce interference.
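A HEAD-request check like the one suggested above could look like this (a sketch; the feed URL below is a placeholder, not the asker's actual feed):

```python
import urllib.request

def check_feed(url, timeout=10):
    """Send a HEAD request and return the HTTP status code,
    without downloading the response body."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status

# Example: treat anything other than 200 as "down".
# is_up = (check_feed("http://example.com/feed.rss") == 200)
```

Because no body is transferred, this completes in milliseconds even for a 150MB feed, which is exactly why it sidesteps the download timeout.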

BrianLy
Just a simple PHP script ... with the data all coming from a MySQL db. I don't control the script, but I need to be able to show that it's crapping out.
concept47
It also seems like Pingdom is using HTTP HEAD requests already too: http://uptime.pingdom.com/general/methodology ... I don't understand why it's timing out waiting for a response, though.
concept47
A: 

Found an online tool that does what I needed:
http://wasitup.com uses HEAD requests, so it doesn't time out waiting to download the whole 150MB file.
Thanks for the help BrianLy!

concept47