I want to send data from multiple client subprocesses to the server over a TLS TCP socket, so I share the same SSL socket with all subprocesses. Communication works with one subprocess, but if I use more than one subprocess, the TLS server crashes with an ssl.SSLError (SSL3_GET_RECORD:decryption failed or bad record mac).
M...
I want to build an application that spawns new processes to execute tasks, basically like below:
web----->pyro----->multiprocessing
A command is sent through the web layer, which communicates with Pyro; Pyro then spawns a new process to handle the task script.
What confuses me is: when I have sent a task through Pyro and then send another task, is that possible? ...
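Setting the Pyro layer aside (it only carries the command), the spawning side can be sketched with multiprocessing alone; nothing prevents a second task from starting while the first is still running. All names here are illustrative:

```python
import multiprocessing

def run_task(name, results):
    # Stand-in for the task script; in the real app this would be
    # whatever work the Pyro server dispatches.
    results.put((name, "done"))

def dispatch(task_names):
    # Spawn one process per incoming command; several tasks can
    # run concurrently this way.
    results = multiprocessing.Queue()
    procs = [multiprocessing.Process(target=run_task, args=(n, results))
             for n in task_names]
    for p in procs:
        p.start()
    # Drain the queue before joining to avoid a feeder-thread deadlock.
    out = dict(results.get() for _ in procs)
    for p in procs:
        p.join()
    return out

if __name__ == "__main__":
    print(dispatch(["task-a", "task-b"]))
```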
I have some tasks in an application that are CPU-bound, and I want to use the multiprocessing module to take advantage of multi-core processors.
I take a big task (a video file analysis) and I split it into several smaller tasks which are put in a queue and done by worker processes.
What I want to know is how to report progress to the main process...
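One way to report progress, assuming a Pool of workers, is the callback argument of apply_async, which runs in the main process each time a chunk finishes (the chunk count and function names below are made up):

```python
import multiprocessing

def analyze_chunk(chunk_id):
    # Stand-in for analyzing one slice of the video file.
    return chunk_id * 2

def main():
    done = []

    def on_done(result):
        # Runs in the main process each time a worker finishes a chunk.
        done.append(result)
        print("progress: %d/4 chunks" % len(done))

    pool = multiprocessing.Pool(2)
    for chunk_id in range(4):
        pool.apply_async(analyze_chunk, (chunk_id,), callback=on_done)
    pool.close()
    pool.join()  # result handler finishes all callbacks before this returns
    return sorted(done)

if __name__ == "__main__":
    print(main())  # -> [0, 2, 4, 6]
```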
I have a numpy array of 630 images, each of which is 640x480.
The total array is thus 630x480x640.
I want to generate an average image, as well as compute the standard deviation for
each pixel across all 630 images.
This is easily accomplished by
avg_image = numpy.mean(img_array, axis=0)
std_image = numpy.std(img_array, axi...
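On a toy stack (shapes shrunk from 630x480x640 to 3x2x2 for illustration, and assuming numpy is available), axis=0 collapses the image axis and leaves one mean and one standard deviation per pixel:

```python
import numpy as np

# Small stand-in for the 630x480x640 stack: 3 "images" of 2x2 pixels.
img_array = np.arange(12, dtype=np.float64).reshape(3, 2, 2)

# axis=0 averages across images, leaving one value per pixel position.
avg_image = np.mean(img_array, axis=0)
std_image = np.std(img_array, axis=0)

print(avg_image.shape, std_image.shape)  # (2, 2) (2, 2)
```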
I'm using Python 2.7 and trying to run some CPU-heavy tasks in their own processes. I would like to be able to send messages back to the parent process to keep it informed of the current status of the child process. The multiprocessing Queue seems perfect for this, but I can't figure out how to get it to work.
So, this is my basic working exampl...
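A minimal working sketch of parent-child status messages over a multiprocessing.Queue (the status strings are placeholders):

```python
import multiprocessing

def worker(status_queue):
    # Report status back to the parent as the work progresses.
    for step in ("started", "halfway", "finished"):
        status_queue.put(step)

def main():
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(q,))
    p.start()
    # Drain the queue before joining the child.
    messages = [q.get() for _ in range(3)]
    p.join()
    return messages

if __name__ == "__main__":
    print(main())  # -> ['started', 'halfway', 'finished']
```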
I want to put an instance of scapy.layers.dhcp.BOOTP on a multiprocessing.Queue. Every time I call put() the following exception occurs:
Traceback (most recent call last):
  File "/usr/lib/python2.6/multiprocessing/queues.py", line 242, in _feed
    send(obj)
PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.f...
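The root cause is that the packet object carries a function-valued attribute that pickle cannot serialize, and everything placed on a multiprocessing.Queue is pickled in a feeder thread. Below is a stdlib-only sketch of the failure and the usual workaround: send a plain serializable representation instead (for scapy that would typically be the packet's raw bytes, rebuilt on the other side; the Packet class and the byte string here are stand-ins):

```python
import multiprocessing
import pickle

class Packet(object):
    """Stand-in for a scapy packet that drags a function along."""
    def __init__(self):
        self.post_transforms = lambda x: x  # unpicklable attribute

def main():
    q = multiprocessing.Queue()
    pkt = Packet()

    # Putting the object itself fails in the queue's feeder thread,
    # because pickle cannot serialize the lambda attribute.
    try:
        pickle.dumps(pkt)
        failed = False
    except Exception:
        failed = True

    # Workaround: send a plain serializable representation instead.
    q.put(b"\x01\x02raw-packet-bytes")
    raw = q.get()
    return failed, raw

if __name__ == "__main__":
    print(main())
```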
I am having an issue when making a shell call from within a multiprocessing.Process(). The error seems to be coming from Git, but I just can't figure out why it only happens from within a multiprocessing.Process(). Note, below is an example to demonstrate what's happening... in real code there is a lot more going on within the Process(...
How can I get the return value of the function I run with pool.apply_async?
Supposing that I have the following code:
import multiprocessing

def fun(i):
    ...
    return value

my_pool = multiprocessing.Pool(2)
for i in range(5):
    result = my_pool.apply_async(fun, [i])
# some code going to be here....
my_pool.close()
my_pool.join()...
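apply_async returns an AsyncResult immediately; keeping those objects and calling .get() on each is how the return values come back. A sketch with a made-up fun:

```python
import multiprocessing

def fun(i):
    # Stand-in for the real work.
    return i * i

def main():
    pool = multiprocessing.Pool(2)
    # apply_async returns an AsyncResult immediately; keep them all.
    async_results = [pool.apply_async(fun, [i]) for i in range(5)]
    pool.close()
    pool.join()
    # .get() blocks until that task is done and returns fun's value.
    return [r.get() for r in async_results]

if __name__ == "__main__":
    print(main())  # -> [0, 1, 4, 9, 16]
```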
Hi,
I have a "master" process that needs to spawn some child processes.
How can I manage these child processes (for example, restart one if it dies)?
Thanks!
...
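A minimal supervision sketch, assuming polling is acceptable: check is_alive() after a timed join and start a fresh Process when the child has died (the child here exits immediately and the restart count is only for demonstration):

```python
import multiprocessing

def child(label):
    # Stand-in for a long-running child; this one exits immediately.
    return label

def main():
    restarts = 0
    p = multiprocessing.Process(target=child, args=("worker",))
    p.start()
    # Simple supervision loop: poll is_alive() and restart on death.
    for _ in range(2):
        p.join(timeout=2)
        if not p.is_alive():
            restarts += 1
            p = multiprocessing.Process(target=child, args=("worker",))
            p.start()
    p.join()
    return restarts

if __name__ == "__main__":
    print(main())
```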
Hi,
I would like to know if someone has ideas or feedback about sharing data between threads or processes (through a shared memory segment). I was thinking about passing an ownership object across the threads/processes (through a pipe/synchronized queue). The only thread that can access the data is the one holding the ownership of th...
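One way to sketch the ownership idea with multiprocessing: the data lives in a shared Array, and a token passed over queues decides who may touch it (the token string and values are illustrative):

```python
import multiprocessing

def worker(shared, token_in, token_out):
    # Wait for the ownership token before touching the shared memory.
    token_in.get()
    for i in range(len(shared)):
        shared[i] += 1
    # Hand ownership back to the parent.
    token_out.put("token")

def main():
    shared = multiprocessing.Array("i", [10, 20, 30])
    to_child = multiprocessing.Queue()
    to_parent = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker,
                                args=(shared, to_child, to_parent))
    p.start()
    to_child.put("token")  # grant ownership to the child
    to_parent.get()        # block until ownership comes back
    p.join()
    return list(shared)

if __name__ == "__main__":
    print(main())  # -> [11, 21, 31]
```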
I have some processes in a Python GUI application and I want them to connect to SQL Server.
I use the following modules:
from multiprocessing import Pool
import pymssql
conn = pymssql.connect(host=host, user=user, password=password, database=database)
for data in my_list:
    self.pool.apply_async(fun, data, conn)
Is it possible ...
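A connection object cannot be pickled, so it cannot be passed through apply_async; the usual pattern is to open one connection per worker in a Pool initializer. The sketch below uses sqlite3 as a stand-in for pymssql (an assumption made here so the example is self-contained; the pattern is the same):

```python
import multiprocessing
import sqlite3

_conn = None  # one connection per worker process

def init_worker():
    # Open the connection inside each worker: DB connections are not
    # picklable, so they cannot travel through apply_async arguments.
    # (sqlite3 stands in for pymssql here.)
    global _conn
    _conn = sqlite3.connect(":memory:")

def fun(data):
    cur = _conn.execute("SELECT ? + 1", (data,))
    return cur.fetchone()[0]

def main():
    pool = multiprocessing.Pool(2, initializer=init_worker)
    results = pool.map(fun, [1, 2, 3])
    pool.close()
    pool.join()
    return results

if __name__ == "__main__":
    print(main())  # -> [2, 3, 4]
```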
I've been looking for an API for simple Java-based multiprocessing and couldn't find anything.
I have legacy code which I need to integrate into my Java application. Since this legacy (native) code sometimes crashes, the whole JVM crashes with it. So what I want to do is run this code and its adapter in a different process (not thre...
Hi,
I now primarily write in Python; however, I am looking for a language that is more thread-friendly (not Java, C#, C, or C++).
Python's threads are good when the work is IO-bound, but they come up short when I am doing something CPU-intensive.
Any ideas?
Thanks,
James
...
I currently have a multithreaded application which runs in the following order:
1. Start up and change the XML file
2. Do work
3. Change the XML back to default values
Step 3 is very important and I have to ensure that it always happens. But if the application crashes, I might end up with the wrong XML.
The scenario where I am using it is:
My applicati...
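For crashes that raise an exception inside the process, a try/finally around step 2 guarantees step 3. Note that it does not survive a hard kill (e.g. kill -9); a restore-on-next-startup check is the usual fallback for that case. A sketch with the XML write stubbed out:

```python
def write_xml(state, log):
    # Stand-in for rewriting the XML file.
    log.append(state)

def run(log, fail=False):
    write_xml("working", log)      # step 1
    try:
        if fail:
            raise RuntimeError("app crashed mid-work")  # step 2 blows up
    finally:
        write_xml("default", log)  # step 3 still runs on the way out

log = []
try:
    run(log, fail=True)
except RuntimeError:
    pass
print(log)  # -> ['working', 'default']
```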
The following code is from the Python 2.6 manual.
from multiprocessing import Process
import os

def info(title):
    print(title)
    print('module name:', 'me')
    print('parent process:', os.getppid())
    print('process id:', os.getpid())

def f(name):
    info('function f')
    print('hello', name)

if __name__ == '__main__':
...
I have 2 laptops, one a Core 2 Duo and the other a Core i7. I heard somewhere that Linux is not able to utilize multiple processors. Is this true? Or is it that it can use 2 processors but not a quad-core processor?
...
I'm writing a simple browser-based front end that should be able to launch a background task and then get progress from it. I want the browser to receive a response saying whether the task launched successfully, and then poll to determine when it is done. However, the presence of a background task seems to be stopping the XMLHttpReques...
I'm trying to write a basic multiprocessing task and this is what I have. First of all, I don't know the right way to make this program non-blocking, because when I am waiting for the response of a child (with waitpid), the other processes also have to wait in the queue; but what will happen if some child processes die before...
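If the children are plain fork()ed processes, os.waitpid with os.WNOHANG is the non-blocking way to reap them: it returns (0, 0) while a child is still running instead of blocking on one pid at a time. A POSIX-only sketch (the exit codes are illustrative):

```python
import os
import time

def main():
    pids = []
    for i in range(3):
        pid = os.fork()
        if pid == 0:
            # Child: pretend to work, then exit with its index as status.
            os._exit(i)
        pids.append(pid)

    reaped = {}
    # Non-blocking reaping loop: WNOHANG lets us poll every child in
    # turn, and a child that dies early is picked up immediately.
    while len(reaped) < len(pids):
        for pid in pids:
            if pid in reaped:
                continue
            done_pid, status = os.waitpid(pid, os.WNOHANG)
            if done_pid != 0:
                reaped[pid] = os.WEXITSTATUS(status)
        time.sleep(0.01)
    return sorted(reaped.values())

if __name__ == "__main__":
    print(main())  # -> [0, 1, 2]
```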
Say I have a Ruby script called hello_world.rb that has one line:
puts "Hello, world"
And I want to call that from another Ruby script called spawn_hello_world.rb:
pipe = IO.popen("ruby1.9.1 hello_world.rb", 'w+')
if pipe
puts pipe.gets
end
My question is: is there a shorthand way of running another Ruby process without having to c...
How can I get a variable from a class that subclasses multiprocessing.Process in Python?
#!/usr/bin/env python
import multiprocessing
import os

class TestMultiprocess(multiprocessing.Process):
    def __init__(self):
        multiprocessing.Process.__init__(self)
        self.myvar = ''

    def myfunc(self):
        return os.getpid()

    def ...