I am not sure whether this belongs on Stack Overflow or Super User, but I thought I would ask here.

I have a console-based application written in C which currently takes about an hour to finish on Windows 7 64-bit. Task Manager reports that the application is using only 25% of the available CPU. I would like to reduce the run time by increasing CPU usage.

Is there any way to let the application use all four cores (the laptop has a Core i5) instead of just one? I am assuming that Task Manager reports 25% because only one core is allocated to the program.

A: 

Unfortunately, there is nothing you can do here. The application itself needs to be re-written to take advantage of multiple cores.

morpheus
+4  A: 

You could look at using OpenMP or POSIX threads (pthreads). I'd recommend OpenMP because it has a fairly simple structure that, depending on your program, could make the conversion quick and easy.

Brett
This requires a rewrite though. :)
BobbyShaftoe
+7  A: 

Applications need to be written in a multithreaded manner in order to use multiple cores. There is no way to make a single threaded app use multiple cores.

Ben Robinson
never say never, splitting the input is almost always a possibility
fuzzy lollipop
+3  A: 

At this point in time there are no standard tools for converting an arbitrary C program to a multithreaded program. Therefore, if as a developer you do not explicitly write multithreaded code, do not expect your program to use more than one core at a time unless you invoke a third-party library function that is written using threading.

The most you could do (if relevant in your case) to make better use of your CPU is to slice your input into N chunks, run one instance of the program on each chunk, then combine the outputs once they are all done.

Uri
+13  A: 

Without re-writing the app to be multi-threaded, no, you can't change its behavior. The one thing you might be able to do, if the app can process ranges of input data, is launch 4 instances of the app with different ranges of input data and then combine the results after they are all done.

Imagine rendering a 3D animation where the renderer is single-threaded but lets you specify start and end frames. With 100 frames to render, you would start 4 instances with the frame ranges 1 - 25, 26 - 50, 51 - 75, and 76 - 100, then combine all the outputs into your final movie file.

fuzzy lollipop
+1 because this will definitely work for some cases. A classic case is operating on sequential data in a big data file. Split the file into 4 smaller files, run the program 4 times, and you're done in 15 minutes instead of an hour.
Chris Thornton
Yes! Clever! I always forget about that.
Andres Jaan Tack
+2  A: 

Pedantically speaking, your application is already using all cores. There's no one specific core allocated to the process, as you say above (unless you make an explicit effort to tie your process to one specific core, i.e. set processor affinity).

The process uses all cores, but it uses them sequentially: it runs a little on one core, then a little more on another core, and so on. That is, on average the CPU time your application consumes is spread evenly across all cores in the system. Since only one core is used at any given moment, the CPU load will never cross the 25% limit (your system apparently has 4 cores).

If you want to cross the 25% limit, you have to be able to use two or more cores simultaneously, meaning that at least some of the code has to run in parallel. In order to do that the application has to be written specifically to run in parallel. An ordinary single-threaded application will never do that by itself.

AndreyT
This is not completely true. This compiler/virtual machine (http://sourceforge.net/apps/mediawiki/ildjit/index.php?title=Main_Page) is able to parallelize (single-threaded) CLI code (and other languages as well). And the OS tries to keep the same application on the same core to avoid cache flushing.
Luca
Windows 7 tries really hard to keep a process affine to the CPU it is running on for as long as possible; the scheduler is completely different from the degenerate behavior displayed in XP and, before that, NT.
fuzzy lollipop
Yeah, but honestly this is a lot of words and doesn't really add much to answer the question. :)
BobbyShaftoe