Hi All

I have a PHP script which calls another script with PHP's exec function. The called script does a batch-processing job: it updates transaction statuses and notifies customers (it puts emails into a mail queue, which runs separately). It takes 20-30 minutes because of a very large table (500,000 rows). Right now I am running it on my desktop Windows machine, and PHP uses up to 50% CPU and MySQL 20% CPU. Is this normal practice? What if I put this script on shared hosting? Will I have trouble with that? Will it break the shared hosting rules? The batch process can be started by a user at any time (normally once a month, but it can be more frequent).
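A minimal sketch of the kickoff described above, with hypothetical script names, assuming the batch script is launched in the background so the calling page returns immediately:

```php
<?php
// Hypothetical kickoff: launch the batch script in the background so the
// web request returns right away. The script name is a placeholder.
$cmd = 'php ' . escapeshellarg(__DIR__ . '/batch_update.php');

if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
    // On Windows, "start /B" detaches the child from the request.
    pclose(popen('start /B ' . $cmd, 'r'));
} else {
    // On Unix-like hosts, redirect output and background with "&",
    // otherwise exec() blocks until the child finishes.
    exec($cmd . ' > /dev/null 2>&1 &');
}
echo "Batch job started.\n";
```

Without the redirection and `&` (or `start /B` on Windows), `exec` would block the web request for the full 20-30 minutes.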

Any suggestions on this?

Thanks for reading.

+1  A: 

The servers that run shared hosting are typically much faster than the average home computer, so it probably will not take as long as it does running on your local machine. I think the alarm bells would start to go off if you were sending 500,000 emails a day, but if you have a reasonable host, using that much CPU for a short time shouldn't be too much of an issue.

You could always talk to them about it first; I'm sure they will sort something out.

Sam152
No, they're not; they actually tend to be desktop-class machines. You can still find hosts that serve sites on Celerons. CPU has never been the deciding factor for most websites, especially shared ones.
gbjbaanb
+1 for the 500k-emails-a-day point. You'll find many providers will (with some justification) treat you as a spammer if you do that.
cletus
+2  A: 

Generally, shared hosts don't like you hogging their precious CPU power, so I would not recommend doing anything very intensive on one. If you are looking to perform batch jobs, you should consider investing in a dedicated server; it would also be useful to have in general.

Mr. Vile
+4  A: 

Almost certainly; however, it depends on how much CPU power is available to the shared host (probably about as much as your desktop, to be honest), in which case you'll see yourself booted off in short order. Shared hosts tend to be stack-'em-high affairs, where hundreds of sites can be hosted on a single machine.

You could go for a virtual host, where they will give you the freedom to do whatever you like, but you'll probably find that the amount of CPU power you're restricted to is limited, possibly too limited for you. Virtual hosts tend to split the physical host between several virtual guests, sometimes as few as 4, but more often (especially on the cheaper plans) as many as 64.

Your third option is to go for a fully dedicated server: you get a whole server all to yourself and can do what you like with it. These tend to be expensive if you want lots of server hardware (because it uses a lot of very expensive electricity) or lots of bandwidth.

Obviously those 3 options increase in cost; in the webhosting arena, you do get what you pay for.

You might like to find a webhosting forum and ask around there; be sure to specify the CPU requirements and how long the job is likely to take. You may find a host that'll be more than happy for you to run your script "out of hours" if you agree a time with them (one that doesn't interfere with backups, or that falls in their lower-traffic window, though websites tend to be accessed from every timezone around the world).

gbjbaanb
+2  A: 

I'm doing a similar thing with a Facebook application (PHP, 350,000 notifications). I did get one "alert" from my hosting company about CPU usage, but that "alert" came from sales, offering to work with me on an upgrade.

I changed the batch script to use less CPU at a time, essentially by spawning fewer simultaneous processes (10) and putting a sleep(10) call in a few spots. The sleep calls allow the CPU to drop back down to "normal" levels, so there are CPU spikes rather than constant usage.

On a shared host, CPU spikes are expected, but a constantly high CPU level will trigger alerts at the hosting company (if they are any good). You want to avoid running the CPU at 50%, or any high level, for any length of time.

If it doesn't matter that the script takes an extra few minutes, put some wait states (sleep) in your code. This is also just fair to the other users on the same machine.
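A throttled loop along these lines (the helper name and the no-op work callback are made up for illustration) keeps the load spiky rather than constant:

```php
<?php
// Hypothetical throttling helper: do the work in chunks, then sleep between
// chunks so CPU usage spikes briefly instead of staying pegged at 50%.
function process_in_chunks(array $ids, int $chunkSize, callable $work, int $pauseSeconds): int
{
    $done = 0;
    foreach (array_chunk($ids, $chunkSize) as $chunk) {
        foreach ($chunk as $id) {
            $work($id);       // e.g. update one transaction, queue one email
            $done++;
        }
        sleep($pauseSeconds); // let the CPU drop back to "normal" levels
    }
    return $done;
}

// Example: 5,000 fake row ids, chunks of 1,000, no-op work, no real pause here.
$count = process_in_chunks(range(1, 5000), 1000, function ($id) {}, 0);
echo $count . "\n"; // 5000
```

In the real script, `$pauseSeconds` would be something like 10, and `$work` would do the actual status update; the chunk size controls how long each CPU spike lasts.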

Brent Baisley
A: 

I would suggest using a service like Amazon EC2 or Mosso's Cloud Servers (owned by Rackspace). These are virtual servers which have reasonable hourly rates and excellent APIs. You get the power of a dedicated server without the minimum monthly commitment. For example, you can configure a virtual server with EC2, then have your regular web server run a weekly/monthly cron job to start that EC2 instance, run your job, then shut down the EC2 instance.
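The scheduling side of that arrangement could be as simple as a crontab entry on the regular web server (the path and script name here are placeholders):

```shell
# Hypothetical crontab entry: at 2 a.m. on the 1st of each month, run a
# wrapper script that starts the cloud instance, runs the batch job on it,
# and shuts the instance down again when the job finishes.
0 2 1 * * /usr/bin/php /home/user/run_monthly_batch.php
```

The five fields are minute, hour, day of month, month, and day of week, so `0 2 1 * *` fires once a month; the wrapper script keeps the billable instance time close to the actual 20-30 minute run.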