tags:

views: 231

answers: 4

My hosting provider (pairNetworks) has certain rules for scripts run on the server. I'm trying to compress a file for backup purposes, and would ideally like to use bzip2 to take advantage of its AWESOME compression ratio. However, when compressing this 90 MB file, the process sometimes runs upwards of 1.5 minutes. One of the resource rules is that a script may only execute for 30 CPU seconds.

If I use the nice command to 'nicefy' the process, does that break up the amount of total CPU processing time? Is there a different command I could use in place of nice? Or will I have to use a different compression utility that doesn't take as long?

Thanks!


EDIT: This is what their support page says:

  • Run any process that requires more than 16MB of memory space.
  • Run any program that requires more than 30 CPU seconds to complete.

EDIT: I run this in a bash script from the command line
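(For anyone checking this themselves: you can see how many CPU seconds a command actually uses by wrapping it in bash's `time` builtin. The file name below is just a placeholder.)

```shell
#!/bin/bash
# "user" + "sys" is the CPU time a 30-CPU-second limit would count,
# not the wall-clock "real" time. backup.sql is a hypothetical file.
TIMEFORMAT='user: %U s  sys: %S s  real: %R s'
time bzip2 -k backup.sql
```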

+5  A: 

nice won't help you - the amount of CPU seconds will still be the same, no matter how many actual seconds it takes.

bzlm
+1  A: 

No, nice will only affect how your process is scheduled. Put simply, a process that takes 30 CPU seconds will always take 30 CPU seconds even if it's preempted for hours.
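You can see this for yourself: run the same job with and without `nice` and compare the "user" CPU time reported by `time` (file name hypothetical) - it stays essentially the same.

```shell
#!/bin/bash
# nice only lowers the scheduling priority (19 is the lowest); the
# kernel still charges the same CPU seconds to the process, it just
# gets them when the machine is otherwise idle. backup.sql is a
# hypothetical file.
time nice -n 19 bzip2 -k -c backup.sql > backup.sql.bz2
```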

I always get a thrill when I load up all the cores of my machine with some hefty processing but have them all niced. I love seeing the CPU monitor maxed out while I surf the web without any noticeable lag.

Rich
+5  A: 

nice will change the process's priority, so it will get its CPU seconds sooner (or later). If the rule is really about CPU seconds, as you state in your question, nice will not help you at all - the process will just be killed at a different time.

As for a solution, you could split the file into three 30 MB pieces (see split(1)), each of which you can compress within the allotted time. To restore, you decompress the pieces and use cat to put them back together. Depending on whether the file is binary or text, you can use split's -b (bytes) or -l (lines) argument.
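A rough sketch of the above, assuming byte-mode splitting and a hypothetical file name:

```shell
#!/bin/bash
# Split a large file into ~30 MB byte chunks and compress each piece
# separately, so each bzip2 run stays under the CPU limit.
split -b 30M backup.sql backup.part.    # -> backup.part.aa, .ab, .ac
for p in backup.part.*; do
    bzip2 "$p"                          # -> backup.part.aa.bz2, ...
done

# To restore: decompress the pieces, then concatenate them in order.
bunzip2 backup.part.*.bz2
cat backup.part.* > backup.restored.sql
```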

Vinko Vrsalovic
a negative higher priority :D
Johannes Schaub - litb
Yes, I know nice can do that as well. It still doesn't work for what the OP wants, no matter what you do to the process' priority.
Vinko Vrsalovic
This is the only answer that seems like it might help, I don't understand why it was voted down.
Darron
+3  A: 

You have to find a compromise between compression ratio and CPU consumption. bzip2 takes options -1 through -9 - try tuning it (-1 is the fastest, -9 gives the best ratio). Another way is to consult your provider - maybe it is possible to grant your script special permission to run longer.
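A quick sketch of the trade-off (file name hypothetical): -1 uses 100 kB compression blocks and the least CPU, while -9 (the default) uses 900 kB blocks for the best ratio.

```shell
#!/bin/bash
# Compress the same hypothetical input at the fastest and the best
# settings, then compare the resulting sizes.
bzip2 -1 -c backup.sql > backup.fast.bz2
bzip2 -9 -c backup.sql > backup.best.bz2
ls -l backup.fast.bz2 backup.best.bz2
```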

Dmitry Khalatov
That is almost equivalent to changing the compression utility...
Vinko Vrsalovic
@Vinko but it has the important advantage that nothing else needs to change in case something depends on the data being bzip2-compressed.
Joachim Sauer