views: 126
answers: 2
Hi folks,

Maybe because the child processes are not aware of my hash (see the code below), the hash %output isn't collecting anything. Is there any other way to collect the values, apart from writing a temp file?

foreach $Item (@AllItems) {
    $pid = $pm->start($Item) and next;
    $Temp = qx($Item);
    $output{$Item} = $Temp;  # This doesn't collect anything. :-(
    $pm->finish;
}

$pm->wait_all_children;

TIA, Tim

+2  A: 

Each process has its own memory, and data is not shared between processes. But you have several options:

  1. Write data from child processes to temp files to be read in parent, as you suggest
  2. Use sockets or pipes to accomplish the same thing
  3. Use threads with shared variables instead of fork()
  4. Use a shared memory facility (see this question, for example)
  5. Use a lightweight database (SQLite, or maybe DBD::CSV). This is a fancy way of using temporary files.

Any more? I have no idea how to use the builtin shmget/shmread/shmwrite functions, or whether they might be helpful here. Other commenters, please feel free to edit.
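Incidentally, if you're already using Parallel::ForkManager (the `$pm` in the question suggests so), versions 0.7.6 and later implement option 1 for you: a child can pass a reference to `finish`, and the module serializes it through a temp file and hands it to a `run_on_finish` callback in the parent. A minimal sketch, assuming @AllItems holds shell commands:

```perl
use strict;
use warnings;
use Parallel::ForkManager;   # needs 0.7.6+ for data retrieval

my @AllItems = ('echo hello', 'echo world');   # hypothetical commands
my %output;

my $pm = Parallel::ForkManager->new(4);

# Runs in the parent each time a child exits; the last argument is
# whatever reference the child passed to finish().
$pm->run_on_finish(sub {
    my ($pid, $exit_code, $ident, $signal, $core, $data_ref) = @_;
    $output{$ident} = $$data_ref if defined $data_ref;
});

foreach my $item (@AllItems) {
    $pm->start($item) and next;    # $item doubles as the child's ident
    my $result = qx($item);
    $pm->finish(0, \$result);      # serialized back to the parent
}

$pm->wait_all_children;
```

After `wait_all_children`, %output in the parent maps each command to its captured output.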

mobrule
Thanks mobrule.
Tim
Can you please suggest the right modules for this job? TIA.
Tim
Named pipes are also a good way to get the data back to the parent.
Grant Johnson
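For option 2, anonymous pipes are enough when the parent forks the children itself. This is a sketch using plain pipe()/fork() rather than Parallel::ForkManager, with hypothetical echo commands; since the parent reads only after starting every child, it assumes each child's output fits in the pipe buffer (typically 64 KB):

```perl
use strict;
use warnings;

my @AllItems = ('echo one', 'echo two');   # hypothetical commands
my (%output, %readers);

foreach my $item (@AllItems) {
    pipe(my $reader, my $writer) or die "pipe: $!";
    my $pid = fork();
    die "fork: $!" unless defined $pid;
    if ($pid == 0) {               # child: run the command, write result up the pipe
        close $reader;
        print {$writer} scalar qx($item);
        close $writer;
        exit 0;
    }
    close $writer;                 # parent: keep only the read end
    $readers{$item} = $reader;
}

foreach my $item (keys %readers) { # parent: slurp each child's output
    my $fh = $readers{$item};
    local $/;                      # read the whole stream at once
    $output{$item} = <$fh>;
    close $fh;
}

1 while wait() != -1;              # reap all children
```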
+7  A: 

Forked processes have their own copies (well, copies-on-write) of the parent process's memory. Writing to the hash in a child process won't affect the hash in the parent.

To do what you want, you'll need to employ some sort of IPC. See the perlipc manpage for a lengthy discussion of the various possibilities.

For something like this, I'd probably use something simple like an on-disk hash. DB_File provides a nice tied hash interface. Here's how you might do it:

use strict;
use warnings;

use DB_File;
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(4);

tie my %output, "DB_File", "output.dat";

foreach my $item (@AllItems) {
    my $pid = $pm->start and next;
    $output{$item} = qx($item);
    $pm->finish;
}

$pm->wait_all_children;
friedo
Thanks friedo, that's what I needed.
Tim