views:

46

answers:

2

Hi guys, I'm debugging my application here. In a nutshell: the application is dying on my online server, or maybe it's my server that's dying. But I have tested this application on three different servers and all exhibited similar results — the application would run fine for a while, but once I started opening more and more requests I'd get a network error or the site would fail to load.

I suspect it's my code, so I need to find out how I can make it less resource-intensive. In fact, I don't know why it is doing this in the first place. It runs fine on my localhost machine, though.

Or is it because I'm hosting it on a shared host? Should I look for specialised hosting for an application like this? There are a lot of complex database queries and Ajax requests in my application.

A: 

The answer is probably that your hosting company has a fairly restrictive php.ini configuration. They could, for example, limit the amount of time a script can run for, or limit the amount of memory a script can use.

What does your code attempt to do?

You might consider making use of memory_get_usage and/or memory_get_peak_usage.

Finbarr
+1  A: 

As far as checking how much memory your script is using, you can periodically call memory_get_usage(true) at points in your code to identify which parts of your script are using memory. memory_get_peak_usage(true) returns the maximum amount of memory that was used.
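As a minimal sketch of that kind of spot-checking (the checkpoint labels and the range() stand-in for a heavy query result are just illustrative):

```php
<?php
// Report current and peak memory at a named checkpoint.
// Passing true asks for the real size allocated from the system.
function logMemory(string $label): void
{
    printf(
        "%s: current %.2f MB, peak %.2f MB\n",
        $label,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    );
}

logMemory('before query');
$rows = range(1, 100000);   // stand-in for a large DB result set
logMemory('after query');
unset($rows);               // free the data as soon as you are done with it
logMemory('after cleanup');
```

Sprinkling calls like these around your heaviest queries will show you where the memory actually goes, and whether unset() is releasing it.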

You say your application runs OK for a while. Is this a single script which is running all this time, or many different page requests / visitors? There is usually a max_execution_time for each script (often defaulting to 30 seconds). This can be changed in code on a per-script basis by calling set_time_limit().

There is also an inherent memory_limit as set in php.ini. This could be 64M or lower on a shared host.
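You can check what your host is actually giving you with ini_get(); a sketch (whether ini_set is honoured depends on the host):

```php
<?php
// Inspect the memory ceiling imposed by php.ini (or a host override).
echo 'memory_limit: ', ini_get('memory_limit'), "\n";

// A script can sometimes raise its own limit, if the host permits it.
ini_set('memory_limit', '128M');
echo 'memory_limit now: ', ini_get('memory_limit'), "\n";
```

If the second line still prints the old value, the host has locked the setting down and you would need to ask them (or move) to get more headroom.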

"...once I'd be opening more and more requests..." - There is a limit to the number of simultaneous (ajax) requests a client can make with the server. Browsers could be set at 8 or even less (this can be altered in Firefox via about:config). This is to prevent a single client from swamping the server with requests. A server could be configured to ban clients that open too many requests!

A shared host could be restrictive. However, provided the host isn't hosting too many sites, shared servers can be quite powerful, giving you access to a lot of power for a short time. Emphasis on short time - it's in the interests of the host to control scripts that consume too many resources on a shared server, as other customers would be affected.

Should I look for specialised hosting for hosting an application?

You'll have to be more specific. Most websites these days are 'applications'. If you are doing more than simply serving web pages, and you are constantly running intensive scripts for periods of time, then you may need to go for dedicated hosting. Not just for your benefit, but for the benefit of others on the shared server!

w3d
Thanks for the answer - the thing is that I am running an application along the lines of 37signals' Basecamp and Highrise. I currently have it set up on a grid server at Media Temple, but my application seems to die and I get network connection errors during normal usage. It's not just serving web pages; it's an entire application of the kind I mentioned.
Ali
By the sounds of it, you are already on a reasonably powerful host. Do you have many users _hammering_ the server? If you are able to test this with a single user and it still fails, then I would say this is an issue with your script rather than the server.
w3d