views: 101
answers: 4

There exists a Perl module that provides exactly the functionality my Python app needs. Is there any way for me to use it? (It's complicated; porting it would take me a month.)

I don't want to have to spawn a subprocess for every use, as I need it several hundred thousand times (it is a specific type of data parser).

Thanks for your advice.

EDIT: I was asked for the module. It's Mail::DeliveryStatus::BounceParser. It matches mail delivery status notifications against a list of strings that may indicate a bounced mail. (It runs the DSN body/headers through a mass of regexes as well as other tests. It's a seriously awesome module.)

+3  A: 

I am not sure if this is still actively maintained, but PyPerl may be of interest to you.

Still, there should be support for most kinds of data parsing in Python itself. It would help if you could point to the specific parser you are looking at.

Alternatively, you could run that Perl module in a single long-lived process and use IPC or socket mechanisms to pass data and results back and forth between your Python and Perl processes.
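The long-lived-process idea can be sketched in Python as follows. This is a minimal sketch, not the real thing: the worker below is a tiny Python stand-in launched with `sys.executable`; with the actual module you would launch a Perl wrapper script instead (e.g. `["perl", "bounce_worker.pl"]`, a hypothetical name), keeping one process alive and streaming one record per line through its pipes.

```python
# Sketch: keep ONE long-lived worker process and stream records through it,
# instead of forking a new process per message.
import subprocess
import sys

# Hypothetical stand-in for the Perl wrapper: reads lines from stdin,
# "parses" them (here: uppercases), writes one reply line per request.
WORKER_CODE = r"""
import sys
for line in sys.stdin:
    sys.stdout.write(line.strip().upper() + "\n")
    sys.stdout.flush()
"""

worker = subprocess.Popen(
    [sys.executable, "-c", WORKER_CODE],   # real case: ["perl", "bounce_worker.pl"]
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

def parse(record: str) -> str:
    """Send one record to the persistent worker and read back one reply."""
    worker.stdin.write(record + "\n")
    worker.stdin.flush()                   # flush so the worker sees it now
    return worker.stdout.readline().strip()

results = [parse(r) for r in ("bounced", "delivered", "deferred")]
print(results)

worker.stdin.close()
worker.wait()
```

The key point is that `Popen` runs once; each of the several hundred thousand calls is then just a pipe write and read, with no fork/exec cost. The line-per-record protocol with explicit flushes on both sides avoids pipe-buffering deadlocks.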

pyfunc
Thanks, Mr. Func! I am either going the TCP-socket way or sticking it into a Postfix policy server. Thanks for your answer!
threecheeseopera
+2  A: 

I know you can use Python from Perl with Inline::Python, but that isn't really your question. Perhaps there is similar functionality going the other direction. Perhaps something like perlmodule?

Joel
This answer (perlmodule) should definitely be recognized for posterity, though the one that got the checkmark was more appropriate to my particular situation. People, please upvote!
threecheeseopera
A: 

I don't want to have to spawn a subprocess for every usage, as I need it several hundred thousand times (it is a specific type of data parser).

Bad policy. The basic Linux shells do this kind of process forking all the time. Ruling out process spawning up front is an unnecessary limitation.

However, you can trivially do this.

python prepare_data.py | perl my_magic_module.pl | python whatever_else.py

Wrap your magic module in a simple Perl script that reads from stdin, does the magical thing, and writes to stdout.

Break your Python into two parts: the part that runs before calling Perl and the part that runs after.

Assemble a high-performance pipeline that (a) does all three steps concurrently and (b) doesn't fork a lot of processes.

This, BTW, will also use every core you own.
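Each stage in that pipeline is just a line filter. Here is a minimal sketch of what `prepare_data.py` might look like under that recipe; `prepare()` is a hypothetical transformation standing in for whatever your real pre-processing does before handing records to the Perl wrapper.

```python
# One pipeline stage: read stdin, transform each line, write stdout.
# The shell runs all three stages concurrently:
#   python prepare_data.py | perl my_magic_module.pl | python whatever_else.py
import sys

def prepare(line: str) -> str:
    # Hypothetical transformation; the real script would emit whatever
    # format the Perl wrapper expects, one record per line.
    return line.strip().lower()

def main(stdin=sys.stdin, stdout=sys.stdout):
    for line in stdin:
        stdout.write(prepare(line) + "\n")

if __name__ == "__main__":
    main()
```

`whatever_else.py` is the mirror image: read the Perl wrapper's output line by line and do the post-processing. Because each stage only buffers a line at a time, the pipeline streams and the three processes overlap.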

S.Lott
A: 

I'd use something like HTTP::Server::Simple to create a local web service. Then you just do queries against that. It's still an external process, but it's only one.
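The client side of that design is just one HTTP request per message. Below is a sketch of the whole round trip in Python terms, with a tiny `http.server` instance playing the role HTTP::Server::Simple would play on the Perl side; `parse_bounce()` is a hypothetical stand-in for the real Mail::DeliveryStatus::BounceParser call.

```python
# Sketch: one local HTTP service wraps the parser; clients POST a message
# body and get the parse result back.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def parse_bounce(body: bytes) -> bytes:
    # Hypothetical: the real service would run the DSN through the module.
    return b"BOUNCED" if b"bounce" in body else b"OK"

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        result = parse_bounce(body)
        self.send_response(200)
        self.send_header("Content-Length", str(len(result)))
        self.end_headers()
        self.wfile.write(result)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_port
reply = urllib.request.urlopen(url, data=b"this mail did bounce").read()
print(reply)

server.shutdown()
```

The process-startup cost is paid once when the service boots; after that, each of the hundreds of thousands of parses is a cheap localhost request.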

Jason