multiprocessing

detection of communication failure when "put" in queue

I am using the Python multiprocessing module with a Queue for communication between processes. Some processes only send (i.e. queue.put), and I can't seem to find a way to detect when the receiving end gets terminated abruptly. Is there a way to detect if the process at the other end of the Queue gets terminated without having to get from...
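A minimal sketch of one workaround, assuming the sending side holds a handle to the consumer Process: put() itself never signals that the reader has died, so the sender can poll is_alive() before each put. The names, timings, and item counts below are illustrative only.

```python
import multiprocessing
import time

def consumer(q):
    # Simulate a receiving end that dies abruptly after a short while.
    time.sleep(1)

def main():
    q = multiprocessing.Queue()
    c = multiprocessing.Process(target=consumer, args=(q,))
    c.start()
    for item in range(1000):
        # Queue.put() does not report that the reader is gone, so the
        # sending side checks the consumer's liveness as a stand-in test.
        if not c.is_alive():
            print("consumer is gone; stop producing")
            break
        q.put(item)
        time.sleep(0.01)
    c.join()

if __name__ == "__main__":
    main()
```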

disabling "joining" when process shuts down

Is there a way to stop the Python multiprocessing module from trying to call and wait on join() for the child processes of a parent process that is shutting down? 2010-02-18 10:58:34,750 INFO calling join() for process procRx1 I want the process to which I sent a SIGTERM to exit as quickly as possible (i.e. "fail fast") instead of waiting for several...
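One workaround that is commonly suggested, sketched under the assumption that the receiver is a plain multiprocessing.Process: mark it daemonic before starting it. The rx_loop body and the two-second wait are placeholders.

```python
import multiprocessing
import time

def rx_loop():
    # Placeholder receive loop standing in for procRx1's real work.
    while True:
        time.sleep(1)

if __name__ == "__main__":
    p = multiprocessing.Process(target=rx_loop, name="procRx1")
    # At interpreter shutdown, multiprocessing terminate()s daemonic children
    # before joining them, so the "calling join() for process procRx1" step
    # returns almost immediately instead of waiting for the child to finish.
    p.daemon = True
    p.start()
    time.sleep(2)
    # Parent exits here; the daemonic child is terminated rather than waited on.
```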

The missing "Comparison of Parallel Processing API". How do I choose Multi-threading library?

I'm using the phrases "parallel processing" and "multi-threading" interchangeably because I feel there is no difference between them; if I'm wrong, please correct me. I'm not a pro in parallel processing/multi-threading. I'm familiar with and have used .NET threads and POSIX threads, nothing more than that. I was just browsing through archives of SO ...

Solving embarrassingly parallel problems using Python multiprocessing

How does one use multiprocessing to tackle embarrassingly parallel problems? Embarrassingly parallel problems typically consist of three basic parts: Read input data (from a file, database, TCP connection, etc.). Run calculations on the input data, where each calculation is independent of any other calculation. Write results of calcula...
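A minimal read/compute/write sketch of the pattern this question describes, using multiprocessing.Pool (Python 3 syntax); the file names and the compute() body are placeholders, not part of the question.

```python
import multiprocessing

def compute(line):
    # Independent calculation on one input record (placeholder).
    return len(line)

def main():
    # 1. Read input data.
    with open("input.txt") as fin:            # hypothetical input file
        lines = fin.readlines()
    # 2. Run the independent calculations in a pool of worker processes.
    with multiprocessing.Pool() as pool:
        results = pool.map(compute, lines)
    # 3. Write results.
    with open("output.txt", "w") as fout:     # hypothetical output file
        fout.writelines("%d\n" % r for r in results)

if __name__ == "__main__":
    main()
```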

Are indivisible operations still indivisible on multiprocessor and multicore systems?

As per the title, plus what are the limitations and gotchas. For example, on x86 processors, alignment for most data types is optional - an optimisation rather than a requirement. That means that a pointer may be stored at an unaligned address, which in turn means that pointer might be split over a cache page boundary. Obviously this c...

Python: better file I/O using os.fork?

My problem is quite simple: I have a 400MB file filled with 10,000,000 lines of data. I need to iterate over each line, do something, and remove the line from memory to avoid filling up too much RAM. Since my machine has several processors, my initial idea to optimize this process was to create two different processes. One would read the...
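A sketch of one way to split the work into a reader process and a worker process, here using multiprocessing and a bounded Queue rather than a raw os.fork; the file name and the per-line work are placeholders.

```python
import multiprocessing

def do_something(line):
    # Placeholder for the real per-line work.
    pass

def process_lines(q):
    while True:
        line = q.get()
        if line is None:       # sentinel: the reader is done
            break
        do_something(line)

def main():
    # Bounded queue, so the reader cannot race far ahead of the worker
    # and fill up RAM with buffered lines.
    q = multiprocessing.Queue(maxsize=1000)
    worker = multiprocessing.Process(target=process_lines, args=(q,))
    worker.start()
    with open("data.txt") as f:        # hypothetical 400MB input file
        for line in f:                 # streams one line at a time
            q.put(line)
    q.put(None)
    worker.join()

if __name__ == "__main__":
    main()
```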

Ruby multithreading / multiprocessing readings.

Can anyone recommend any good multithreading/multiprocessing books or sites which go into detail about the intricacies of Ruby multithreading/multiprocessing? I tried using Ruby threading, and code that was basically deadlock-free on the 1.9 VM ran into deadlocks on JRuby. Yes, I realize the differences are drastic (JRuby has no GIL), but I wanted t...

What does "pcntl_fork(): Error 12" mean?

I've searched until I was blue in the face and cannot find the answer to this question. Where can I find a table listing the meanings of all the error codes for pcntl_fork()? Or even for the C fork() function, for that matter. ...

Static and global variable in memory

Are static variables stored on the stack itself, similar to globals? If so, how are they protected to allow only local class access? In a multi-threaded context, is the fear that this memory can be directly accessed by other threads or the kernel? Or why can't we use statics/globals in a multi-process/multi-threaded environment? ...

Python calling pipe.communicate() in a thread

Using Python 2.6.1 on Mac OS X 10.6.2, I have the following problem: I have a threaded process (a Thread class), and each of those threads has a pipe (a subprocess.Popen), something like so: from threading import Thread cmd = "some_cmd" class Worker(Thread): def run(self): pipe = Popen(cmd, stdin=PIPE, stdou...
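A self-contained sketch of the pattern being described: each Thread owns its own Popen and drives it with communicate(), which feeds stdin and reads stdout/stderr to completion, avoiding deadlocks on full pipe buffers. The "cat" command and the input data are stand-ins for the question's some_cmd.

```python
from threading import Thread
from subprocess import Popen, PIPE

class Worker(Thread):
    def __init__(self, cmd, data):
        Thread.__init__(self)
        self.cmd = cmd
        self.data = data
        self.output = None

    def run(self):
        # communicate() writes the input, reads stdout/stderr, and waits
        # for the child, all from within this worker thread.
        pipe = Popen(self.cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
        self.output, _ = pipe.communicate(self.data)

if __name__ == "__main__":
    w = Worker(["cat"], b"hello\n")   # hypothetical command; cat echoes stdin
    w.start()
    w.join()
    print(w.output)
```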

Python-daemon doesn't kill its kids

When using python-daemon, I'm creating subprocesses like so: import multiprocessing class Worker(multiprocessing.Process): def __init__(self, queue): self.queue = queue # we wait for things from this in Worker.run() ... q = multiprocessing.Queue() with daemon.DaemonContext(): for i in xrange(3): Worker(q) ...
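A sketch of one way to make the daemon clean up its children explicitly, assuming the python-daemon package as in the question: start the workers, hand each a sentinel when the daemon is finished, and join them before exiting. Note that the excerpt above never calls start() on the Workers.

```python
import multiprocessing
import daemon   # python-daemon package, as used in the question

class Worker(multiprocessing.Process):
    def __init__(self, queue):
        multiprocessing.Process.__init__(self)
        self.queue = queue   # we wait for things from this in run()

    def run(self):
        while True:
            item = self.queue.get()
            if item is None:     # sentinel: time to shut down
                break

if __name__ == "__main__":
    with daemon.DaemonContext():
        # Create the queue after daemonizing so its pipe descriptors are not
        # closed by DaemonContext's file-descriptor cleanup.
        q = multiprocessing.Queue()
        workers = [Worker(q) for _ in range(3)]
        for w in workers:
            w.start()            # the excerpt never calls start()
        # ... hand work to the workers via q.put(...) ...
        for w in workers:
            q.put(None)          # tell each worker to exit before the daemon does
        for w in workers:
            w.join()             # reap the children so none are left behind
```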

How to synchronize a python dict with multiprocessing

I am using Python 2.6 and the multiprocessing module for multi-threading. Now I would like to have a synchronized dict (where the only atomic operation I really need is the += operator on a value). Should I wrap the dict with a multiprocessing.sharedctypes.synchronized() call? Or is there a better way to go? ...
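One common approach, sketched with a Manager-backed dict and an explicit lock: += on a shared value is a read-modify-write, so even a proxied dict needs a lock around the increment. The key name and process count here are illustrative.

```python
import multiprocessing

def worker(shared, lock, key):
    for _ in range(1000):
        # += is read-modify-write, so it must be guarded even on a proxy dict.
        with lock:
            shared[key] = shared.get(key, 0) + 1

if __name__ == "__main__":
    manager = multiprocessing.Manager()
    shared = manager.dict()
    lock = manager.Lock()
    procs = [multiprocessing.Process(target=worker, args=(shared, lock, "hits"))
             for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(shared["hits"])   # expected: 4000
```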

Python Process won't call atexit

I'm trying to use atexit in a Process, but unfortunately it doesn't seem to work. Here's some example code: import time import atexit import logging import multiprocessing logging.basicConfig(level=logging.DEBUG) class W(multiprocessing.Process): def run(self): logging.debug("%s Started" % self.name) @atexit.regis...
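A sketch of the usual workaround: children started by multiprocessing exit via os._exit(), which skips atexit handlers, so per-process cleanup is better placed in a try/finally inside run(). This mirrors the question's class but is not its exact code.

```python
import logging
import multiprocessing

logging.basicConfig(level=logging.DEBUG)

class W(multiprocessing.Process):
    def run(self):
        logging.debug("%s Started", self.name)
        try:
            # ... do the real work here ...
            pass
        finally:
            # multiprocessing children exit via os._exit(), which bypasses
            # atexit handlers, so cleanup lives in this finally block instead.
            logging.debug("%s Finished", self.name)

if __name__ == "__main__":
    w = W()
    w.start()
    w.join()
```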

pthread_setaffinity_np: Invalid argument?

I've managed to get my pthreads program sort of working. Basically I am trying to manually set the affinity of 4 threads such that thread 1 runs on CPU 1, thread 2 runs on CPU 2, thread 3 runs on CPU 3, and thread 4 runs on CPU 4. After compiling, my code works for a few threads but not others (seems like thread 1 never works) but runni...

Running multiprocess applications from MATLAB

I've written a multiprocess application in VC++ and tried to execute it with command-line arguments using the system command from MATLAB. It runs, but only on one core --- any suggestions? Update: In fact, it doesn't even see the second core. I used OpenMP and used omp_get_max_threads() and omp_get_thread_num() to check, and omp_get_max_t...

Python - multithreading / multiprocessing, very strange problem.

Module run via python myscript.py (not shell input) import uuid import time import multiprocessing def sleep_then_write(content): time.sleep(5) print(content) if __name__ == '__main__': for i in range(15): p = multiprocessing.Process(target=sleep_then_write, args=('Hello World',...
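For reference, a runnable reconstruction of what the truncated snippet appears to be doing; the start() and join() calls and the per-process message suffix are assumptions, since the excerpt is cut off.

```python
import time
import multiprocessing

def sleep_then_write(content):
    time.sleep(5)
    print(content)

if __name__ == '__main__':
    procs = []
    for i in range(15):
        p = multiprocessing.Process(target=sleep_then_write,
                                    args=('Hello World %d' % i,))
        p.start()            # presumably each process is started immediately
        procs.append(p)
    for p in procs:
        p.join()
```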

Does sending a dictionary through a multiprocessing.queue mutate it somehow?

I have a setup where I send a dictionary through a multiprocessing.queue and do some stuff with it. I was getting an odd "dictionary size changed while iterating over it" error when I wasn't changing anything in the dictionary. Here's the traceback, although it's not terribly helpful: Traceback (most recent call last): File "/usr/li...
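A hedged sketch of one plausible cause and fix: Queue.put() pickles the object on a background feeder thread, so if the sender keeps mutating the same dict, pickling can observe it changing size. Handing the queue an independent snapshot avoids sharing a live object with that thread. The data here is illustrative.

```python
import copy
import multiprocessing

def consumer(q):
    d = q.get()
    print(sorted(d))

if __name__ == "__main__":
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=consumer, args=(q,))
    p.start()
    data = {"a": 1, "b": 2}
    # put() pickles on a feeder thread; queue a snapshot so later mutations
    # of `data` cannot race with that pickling.
    q.put(copy.deepcopy(data))
    data["c"] = 3          # safe: the queued copy is independent
    p.join()
```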

Generate and merge data with python multiprocessing

I have a list of starting data. I want to apply a function to the starting data that creates a few pieces of new data for each element in the starting data. Some pieces of the new data are duplicates, and I want to remove them. The sequential version is essentially: def create_new_data_for(datum): """make a list of new data from some...
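A minimal sketch of a parallel version using a Pool for the generation step and a set for de-duplication in the parent; create_new_data_for() below is a placeholder generator, not the question's real function.

```python
import multiprocessing

def create_new_data_for(datum):
    """Make a few pieces of new data from one input; duplicates are possible."""
    return [datum % 5, datum % 3]      # placeholder generator

if __name__ == "__main__":
    starting_data = range(100)
    # Generate the new data in parallel, one task per starting element.
    with multiprocessing.Pool() as pool:
        chunks = pool.map(create_new_data_for, starting_data)
    # Merge in the parent and drop duplicates with a set.
    merged = set()
    for chunk in chunks:
        merged.update(chunk)
    print(sorted(merged))
```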

python multiprocessing member variable not set

In the following script, I get the "stop message received" output but the process never ends. Why is that? Is there another way to end a process besides terminate or os.kill that is along these lines? from multiprocessing import Process from time import sleep class Test(Process): def __init__(self): Process.__init__(self) ...
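A sketch of the usual explanation and fix: attributes assigned from the parent after start() only change the parent's copy of the object, so the child never sees the flag, while a multiprocessing.Event is visible on both sides. The class below mirrors the question's shape but is not its exact code.

```python
from multiprocessing import Process, Event
from time import sleep

class Test(Process):
    def __init__(self):
        Process.__init__(self)
        self.stop_requested = Event()   # shared between parent and child

    def run(self):
        while not self.stop_requested.is_set():
            sleep(0.1)
        print("stop message received, exiting run()")

if __name__ == "__main__":
    t = Test()
    t.start()
    sleep(1)
    # A plain attribute set here would only change the parent's copy of the
    # object; the Event is actually seen by the child, so run() can return.
    t.stop_requested.set()
    t.join()
```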

python multiprocessing server connections

I wish to get a list of connections to a manager. I can get last_accepted from the server's listener, but I want all connections. There HAS to be a method I am missing somewhere that returns all connections to a server or manager. Please help! ...