multiprocessing

How to generate PDB files for parallel builds?

We are seeking ideas on resolving a problem with linking/PDB generation when running multiple instances of devenv.com using Visual Studio 2005. We are getting the following intermittent errors when doing parallel builds using devenv.com, i.e. when the following commands are run at the same time on the same build server: devenv.com master.sln /build "Relea...

python multiprocessing proxy

I have 2 processes: the first process, manager.py, starts in the background: from multiprocessing.managers import SyncManager, BaseProxy from CompositeDict import * class CompositeDictProxy(BaseProxy): _exposed_ = ('addChild', 'setName') def addChild(self, child): return self._callmethod('addChild', [child]) def ...
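
A minimal sketch of the registration side, assuming an auto-generated proxy is acceptable; Counter here is a hypothetical stand-in for CompositeDict, and a hand-written BaseProxy subclass like CompositeDictProxy could be supplied via register()'s proxytype argument instead:

    from multiprocessing.managers import BaseManager

    class Counter(object):  # hypothetical stand-in for CompositeDict
        def __init__(self):
            self.n = 0
        def add(self, k):
            self.n += k
        def value(self):
            return self.n

    class MyManager(BaseManager):
        pass

    # With no proxytype, a proxy exposing Counter's public methods is generated.
    MyManager.register('Counter', Counter)

    if __name__ == '__main__':
        manager = MyManager()
        manager.start()
        c = manager.Counter()  # returns a proxy; the object lives in the manager
        c.add(3)
        print(c.value())
        manager.shutdown()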

Log output of multiprocessing.Process

Is there a way to log the stdout output from a given Process when using the multiprocessing.Process class in Python? ...
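
A minimal sketch of one common workaround, assuming it is acceptable to reassign sys.stdout inside the child before the real work starts (the worker function and log path are hypothetical):

    import multiprocessing
    import sys

    def worker():
        # Reassign stdout inside the child before doing any work, so every
        # print in this process goes to the log file instead of the console.
        sys.stdout = open('worker.log', 'w')  # hypothetical log path
        print('hello from the child process')
        sys.stdout.flush()

    if __name__ == '__main__':
        p = multiprocessing.Process(target=worker)
        p.start()
        p.join()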

Python multiprocessing and database access with pyodbc "is not safe"?

The Problem: I am getting the following traceback and don't understand what it means or how to fix it: Traceback (most recent call last): File "<string>", line 1, in <module> File "C:\Python26\lib\multiprocessing\forking.py", line 342, in main self = load(from_parent) File "C:\Python26\lib\pickle.py", line 1370, in load r...

Dumping a multiprocessing.Queue into a list

I wish to dump a multiprocessing.Queue into a list. For that task I've written the following function: import Queue def dump_queue(queue): """ Empties all pending items in a queue and returns them in a list. """ result = [] # START DEBUG CODE initial_size = queue.qsize() print("Queue has %s items initially....
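
A hedged sketch of such a function, draining with get_nowait() until Empty; note that qsize() and emptiness are only approximate on a multiprocessing.Queue, since a feeder thread delivers items asynchronously, so a sentinel value is more reliable when producers are still running:

    import multiprocessing
    import time
    try:
        from Queue import Empty   # Python 2, as in the question
    except ImportError:
        from queue import Empty   # Python 3

    def dump_queue(q):
        """Drain all currently-pending items from q into a list."""
        result = []
        while True:
            try:
                result.append(q.get_nowait())
            except Empty:
                return result

    if __name__ == '__main__':
        q = multiprocessing.Queue()
        for i in range(5):
            q.put(i)
        time.sleep(0.1)  # give the feeder thread time to deliver the items
        print(dump_queue(q))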

String arguments in python multiprocessing

I'm trying to pass a string argument to a target function in a process. Somehow, the string is interpreted as a list of as many arguments as there are characters. This is the code: import multiprocessing def write(s): print s write('hello') p = multiprocessing.Process(target=write, args=('hello')) p.start() I get this output:...
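
The likely culprit is that ('hello') is not a tuple, just a parenthesized string, so Process unpacks it character by character; a one-element tuple needs a trailing comma:

    import multiprocessing

    def write(s):
        print(s)

    if __name__ == '__main__':
        # ('hello') is just the string 'hello', so Process unpacks it into
        # five single-character arguments; ('hello',) is a one-element tuple.
        p = multiprocessing.Process(target=write, args=('hello',))
        p.start()
        p.join()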

Python multiprocessing: restrict number of cores used

I want to know how to distribute N independent tasks to exactly M processors on a machine that has L cores, where L>M. I don't want to use all the processors because I still want to have I/O available. The solutions I've tried seem to get distributed to all processors, bogging down the system. I assume the multiprocessing module is the...
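
A minimal sketch, assuming Pool's processes argument covers the "exactly M workers" part; it caps the number of concurrent worker processes but does not pin them to particular cores, which would need OS-level affinity tools (e.g., taskset on Linux). The task function is a hypothetical stand-in:

    import multiprocessing

    def task(n):
        return n * n  # hypothetical CPU-bound task

    if __name__ == '__main__':
        M = 2  # number of worker processes, chosen below the core count L
        pool = multiprocessing.Pool(processes=M)
        results = pool.map(task, range(100))
        pool.close()
        pool.join()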

Python on multiprocessor machines: multiprocessing or a non-GIL interpreter

This is more a style question. For CPU-bound processes that really benefit from having multiple cores, do you typically use the multiprocessing module or use threads with an interpreter that doesn't have the GIL? I've used the multiprocessing library only lightly, but also have no experience with anything besides CPython. I'm curious w...

Using multiprocessing pool of workers

Hello, I have written the following code to put my lazy second CPU core to work. Basically, the code first finds the desired "sea" files in the directory hierarchy and then executes a set of external scripts to process these binary "sea" files, producing 50 to 100 text and binary files. As the title of the question s...
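
A hedged sketch of the usual shape for this, mapping the found file paths over a small pool whose workers shell out to the external script (the file list and script name are hypothetical stand-ins for the directory walk and real tooling):

    import multiprocessing
    import subprocess

    def process_file(path):
        # Run the external script on one "sea" file; blocks until it finishes.
        subprocess.call(['process_sea.sh', path])  # hypothetical script name
        return path

    if __name__ == '__main__':
        sea_files = ['a.sea', 'b.sea']  # stand-in for the directory walk
        pool = multiprocessing.Pool(processes=2)  # one worker per core
        for done in pool.imap_unordered(process_file, sea_files):
            print('finished %s' % done)
        pool.close()
        pool.join()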

Running code on different processor (x86 assembly)

In real mode on x86, what instructions would need to be used to run the code on a different processor, in a multiprocessor system? (I'm writing some pre-boot code in assembler that needs to set certain CPU registers, and do this on every CPU in the system, before the actual operating system boots.) ...

How to combine Pool.map with Array (shared memory) in Python multiprocessing?

I have a very large (read-only) array of data that I want to be processed by multiple processes in parallel. I like the Pool.map function and would like to use it to calculate functions on that data in parallel. I saw that one can use the Value or Array class to use shared memory data between processes. But when I try to use this I get...
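
A shared Array typically cannot be sent through Pool.map's argument list (it must be shared by inheritance rather than pickled per task); a common pattern, sketched here, hands it to each worker once via the pool's initializer and stores it in a module global:

    import multiprocessing

    shared = None  # set in each worker by the initializer

    def init_worker(arr):
        # Runs once in each worker process; stash the shared array in a
        # global instead of pickling it through Pool.map's argument list.
        global shared
        shared = arr

    def read_item(i):
        return shared[i] * 2  # read-only access to the shared memory

    if __name__ == '__main__':
        data = multiprocessing.Array('d', [1.0, 2.0, 3.0], lock=False)
        pool = multiprocessing.Pool(processes=4, initializer=init_worker,
                                    initargs=(data,))
        print(pool.map(read_item, range(len(data))))
        pool.close()
        pool.join()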

How does one properly use the Unix exec command from C(++)?

Specifically, I need to call a version of exec that maintains the current working directory and sends standard out to the same terminal as the program calling exec. I also have a vector of string arguments I need to pass somehow, and I'm wondering how I would go about doing all of this. I've been told that all of this is possible exclusi...

Why does my Python program average only 33% CPU per process? How can I make Python use all available CPU?

I use Python 2.5.4. My computer: CPU AMD Phenom X3 720BE, Mainboard 780G, 4GB RAM, Windows 7 32 bit. I use Python threading but cannot make every python.exe process consume 100% CPU. Why are they using only about 33-34% on average? I wish to direct all available computer resources toward these large calculations so as to complete t...
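
On a triple-core Phenom X3, roughly 33% per process suggests each python.exe is saturating one core: CPython's GIL lets only one thread execute Python bytecode at a time, so CPU-bound threads cannot spread across cores. A hedged sketch of the usual escape, splitting the work across processes with multiprocessing (crunch is a hypothetical stand-in for the real calculation):

    import multiprocessing

    def crunch(bounds):
        lo, hi = bounds
        total = 0
        for i in range(lo, hi):  # stand-in for the real heavy loop
            total += i * i
        return total

    if __name__ == '__main__':
        chunks = [(0, 10**6), (10**6, 2 * 10**6), (2 * 10**6, 3 * 10**6)]
        pool = multiprocessing.Pool(processes=3)  # one worker per core
        print(sum(pool.map(crunch, chunks)))
        pool.close()
        pool.join()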

Multiprocessing Pool inside a Process times out

Whenever I use the following code, the pool result always returns a timeout. Is there something logically incorrect I am doing? from multiprocessing import Pool, Process, cpu_count def add(num): return num+1 def add_wrap(num): new_num = ppool.apply_async(add, [num]) print new_num.get(timeout=3) ppool = Pool(processes=cpu_count(...
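
Without seeing the truncated part, one arrangement that does return, as a hedged sketch: create the Pool under the __main__ guard and call apply_async/get from the process that owns the pool. A Pool inherited by, or handed into, another Process generally cannot be driven from there:

    from multiprocessing import Pool, cpu_count

    def add(num):
        return num + 1

    if __name__ == '__main__':
        # The pool lives in, and is used from, the parent process only.
        pool = Pool(processes=cpu_count())
        result = pool.apply_async(add, [1])
        print(result.get(timeout=3))
        pool.close()
        pool.join()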

Python multiprocessing DB access is very slow

Hi, I have a GUI that interacts with a Postgres database, using psycopg2. I have the DB connection in a multiprocessing process, and send SQL via one multiprocessing queue and receive results via another. The problem is that the speed is very, very slow. A simple select * from a small table (30 rows) can take 1/10th of a second, or can take o...
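
A hedged sketch of the arrangement described, with the connection owned by the worker process and a blocking get() on the request queue so that no sleep()-based polling adds latency (the connection string, table name, and queue protocol are placeholders, and psycopg2 is assumed to be installed):

    import multiprocessing

    def db_worker(requests, responses):
        import psycopg2  # assumed installed; connection params are placeholders
        conn = psycopg2.connect('dbname=test')
        cur = conn.cursor()
        while True:
            sql = requests.get()   # blocks; no polling delay
            if sql is None:        # sentinel: shut down cleanly
                break
            cur.execute(sql)
            responses.put(cur.fetchall())
        conn.close()

    if __name__ == '__main__':
        requests = multiprocessing.Queue()
        responses = multiprocessing.Queue()
        p = multiprocessing.Process(target=db_worker, args=(requests, responses))
        p.start()
        requests.put('select * from small_table')  # hypothetical table
        print(responses.get())
        requests.put(None)
        p.join()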

Detach a subprocess started using python multiprocessing module

Hello, I would like to create a process using the multiprocessing module in Python, but ensure it continues running after the process that created the subprocess exits. I can get the required functionality using the subprocess module and Popen, but I want to run my code as a function, not as a script. The reason I want to do this is to ...

Can't pickle <type 'instancemethod'> when using python's multiprocessing Pool.map()

Hi, I'm trying to use multiprocessing's Pool.map() function to divide work among several processes simultaneously. When I use the following code, it works fine: import multiprocessing def f(x): return x*x def go(): pool = multiprocessing.Pool(processes=4) #result = pool.apply_async(self.f, [10]) #print result.get(timeou...
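
A hedged sketch of the usual workaround: in Python 2, bound methods such as self.f cannot be pickled, so give Pool.map a module-level function, which is pickled by name, instead:

    import multiprocessing

    def f(x):
        # Module-level functions pickle by name, so Pool.map can ship them
        # to the workers; bound methods (self.f) cannot be pickled in Python 2.
        return x * x

    class Calculator(object):
        def go(self):
            pool = multiprocessing.Pool(processes=4)
            result = pool.map(f, range(10))  # plain function, not self.f
            pool.close()
            pool.join()
            return result

    if __name__ == '__main__':
        print(Calculator().go())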

What's the best way to divide large files in Python for multiprocessing?

I run across a lot of "embarrassingly parallel" projects I'd like to parallelize with the multiprocessing module. However, they often involve reading in huge files (greater than 2 GB), processing them line by line, running basic calculations, and then writing results. What's the best way to split a file and process it using Python's multi...
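
A hedged sketch of one common pattern that avoids physically splitting the file: stream lines to a pool with imap and a large chunksize, so lines are shipped to workers in batches and the per-line IPC cost is amortized (the file name and per-line function are hypothetical):

    import multiprocessing

    def process_line(line):
        return len(line)  # hypothetical per-line calculation

    if __name__ == '__main__':
        pool = multiprocessing.Pool(processes=4)
        with open('big_input.txt') as f:  # hypothetical input file
            # imap pulls lines lazily from the file and hands them to the
            # workers in chunks, preserving input order in the results.
            total = sum(pool.imap(process_line, f, chunksize=1000))
        pool.close()
        pool.join()
        print(total)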

SQL Server Database Synchronization

Does SQL Server synchronize access for multiple processes, or do I have to implement synchronization myself so that more than one process can work with the database at the same time? ...

What is the best way to get a stacktrace when using multiprocessing?

I'm wondering about the best way to get a stacktrace when there is an exception inside a function executed via the multiprocessing module. Here's an example: import multiprocessing def square(x): raise Exception("Crash.") return x**2 if __name__ == '__main__': pool = multiprocessing.Pool(processes=4) results = pool.map...
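
A hedged sketch of one common approach: catch the exception in the worker, format the traceback there with traceback.format_exc(), and re-raise with that text attached, since the original traceback does not survive the trip back to the parent process:

    import multiprocessing
    import traceback

    def square(x):
        raise Exception('Crash.')

    def traced_square(x):
        # Re-raise with the worker-side traceback as the message; the
        # parent only ever sees the pickled exception, not its traceback.
        try:
            return square(x)
        except Exception:
            raise Exception(traceback.format_exc())

    if __name__ == '__main__':
        pool = multiprocessing.Pool(processes=4)
        try:
            print(pool.map(traced_square, range(4)))
        finally:
            pool.close()
            pool.join()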