I'm trying to create a construct in Python 3 that will allow me to easily execute a function on a remote machine. Assuming I've already got a Python TCP server running on the remote machine that will run the functions it receives, I'm currently looking at using a decorator like

@execute_on(address, port)

This would create the necessary context required to execute the function it is decorating and then send the function and context to the TCP server on the remote machine, which then executes it. Firstly, is this somewhat sane? And if not, could you recommend a better approach? I've done some googling but haven't found anything that meets these needs.
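Roughly, the decorator I have in mind would look something like this (just a sketch; send_source is a placeholder for my existing TCP client):

import functools
import inspect

def execute_on(address, port):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # grab the source of the decorated function and ship it, plus any
            # call arguments, to the remote server for execution
            source = inspect.getsource(func)
            return send_source(address, port, source, func.__name__, args, kwargs)
        return wrapper
    return decorator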

I've got a quick-and-dirty implementation of the TCP server and client, so I'm fairly sure that part will work. I can get a string representation of the function (e.g. func) being passed to the decorator with

import inspect
string = inspect.getsource(func)

which can then be sent to the server where it can be executed. The problem is, how do I get all of the context information that the function requires to execute? For example, if func is defined as follows,

import MyModule
def func():
    result = MyModule.my_func()

MyModule will need to be available to func, either in the global context or in func's local context, on the remote server. In this case that's relatively trivial, but it can get much more complicated depending on when and how import statements are used. Is there an easy and elegant way to do this in Python? The best I've come up with at the moment is using the ast library to pull out all import statements, using the inspect module to get string representations of those modules, and then reconstructing the entire context on the remote server. Not particularly elegant, and I can see lots of room for error.
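For what it's worth, the ast idea I'm describing would look roughly like this (a sketch only, and it still misses imports done inside functions):

import ast
import inspect

def find_imported_names(func):
    # parse the source of the module that defines func and collect the
    # names of everything imported at module level
    module_source = inspect.getsource(inspect.getmodule(func))
    names = set()
    for node in ast.walk(ast.parse(module_source)):
        if isinstance(node, ast.Import):
            names.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module)
    return names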

Thanks for your time

A: 

What's your end goal with this? From your description I can see no reason why you can't just create a simple messaging class and send instances of that to command the remote machine to do 'stuff'?

Whatever you do, your remote machine is going to need the Python source in order to execute it, so why not distribute the code there first and then run it? You could create a simple server which accepts some Python source files, extracts them, imports the relevant modules and then runs a command?
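Something along these lines is all I mean (a sketch; Command and the handler lookup are made-up names):

import pickle

class Command:
    def __init__(self, name, args=(), kwargs=None):
        self.name = name
        self.args = args
        self.kwargs = kwargs or {}

# client side: pickle a Command and send it over your existing socket
def send_command(sock, command):
    sock.sendall(pickle.dumps(command))

# server side: look the command up in a dict of functions that are already
# deployed on the remote machine (only unpickle data from trusted clients!)
def handle_command(data, handlers):
    command = pickle.loads(data)
    return handlers[command.name](*command.args, **command.kwargs)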

Jon Cage
+2  A: 

The approach you outline is extremely risky unless the remote server is somehow very strongly protected or "extremely sandboxed" (e.g. a BSD "jail") -- anybody who can send functions to it would be able to run arbitrary code there.

Assuming you have an authentication system that you trust entirely, then comes the "fragility" problem that you've realized -- the function can depend on any globals defined in its module at the moment of execution (which can be different from those you can detect by inspection: determining the set of imported modules, and more generally of globals, at execution time, is a Turing-complete problem).

You can deal with the globals problem by serializing the function's globals, as well as the function itself, at the time you send it off for remote execution (whether you serialize all this stuff in readable string form, or otherwise, is a minor issue). But that still leaves you with the issue of imports performed inside the function.
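Just to illustrate the "serialize the globals as well" part (a rough sketch; it ignores closures, skips module objects, and assumes whatever is left over is picklable):

import builtins
import marshal
import pickle
import types

def serialize_function(func):
    # keep only the globals the code object actually names, and drop modules
    wanted = set(func.__code__.co_names)
    context = {k: v for k, v in func.__globals__.items()
               if k in wanted and not isinstance(v, types.ModuleType)}
    return pickle.dumps({'code': marshal.dumps(func.__code__),
                         'name': func.__name__,
                         'globals': context})

def deserialize_function(blob):
    payload = pickle.loads(blob)
    code = marshal.loads(payload['code'])
    g = dict(payload['globals'])
    g['__builtins__'] = builtins  # make sure builtins are reachable
    return types.FunctionType(code, g, payload['name'])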

Unless you're willing to put some limitations on the "remoted" function, such as "no imports inside the function (or in the functions it calls)", I'm thinking you could have the server override __import__ (the built-in function that is used by all import statements and is designed to be overridden for peculiar needs such as yours;-) so that it asks the sending client for the extra module. Of course, that requires said client to have "server-like" capabilities of its own, in that it must be able to respond to such "module requests" from the server.
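A rough sketch of that override, where fetch_module_source stands in for whatever callback asks the sending client for a module's source:

import builtins
import sys
import types

_real_import = builtins.__import__

def install_remote_import(fetch_module_source):
    def remote_import(name, globals=None, locals=None, fromlist=(), level=0):
        try:
            return _real_import(name, globals, locals, fromlist, level)
        except ImportError:
            # not available locally: ask the client for the source, build the
            # module here, and cache it so later imports find it
            source = fetch_module_source(name)
            module = types.ModuleType(name)
            exec(compile(source, "<remote:%s>" % name, "exec"), module.__dict__)
            sys.modules[name] = module
            return module
    builtins.__import__ = remote_import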

Can't you impose some restrictions on functions that are remoted, to bring this task back into the domain of sanity...?

Alex Martelli
+2  A: 

You may be interested in the execnet project.

execnet provides carefully tested means to easily interact with Python interpreters across version, platform and network barriers. It has a minimal and fast API targeting the following uses:

  • distribute tasks to local or remote CPUs
  • write and deploy hybrid multi-process applications
  • write scripts to administer a bunch of exec environments

http://codespeak.net/execnet/example/test_info.html#get-information-from-remote-ssh-account
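Basic usage, going by the docs, looks roughly like this (untested on my part):

import execnet

# makegateway() starts a local subprocess; "ssh=user@host" would start a
# remote interpreter over SSH instead
gw = execnet.makegateway()
channel = gw.remote_exec("""
    import sys
    channel.send(sys.platform)
""")
print(channel.receive())
gw.exit()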

I've seen a demo of it, but never used it myself.

Wai Yip Tung
+1 for execnet. I've used it to execute some test cases in a distributed fashion and it works great.
Noufal Ibrahim
A: 

This will probably be hard to do, and you'll run into issues with security, arguments, etc. Maybe you could just run a Python interpreter remotely that can be given code over a socket or an HTTP service, rather than doing it on a function-by-function level?
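For instance, a throwaway sketch of the HTTP variant (obviously only sane behind authentication or on a trusted network):

from http.server import BaseHTTPRequestHandler, HTTPServer

class ExecHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # run whatever Python source the client POSTs and send back
        # the repr of a 'result' name, if the code defined one
        length = int(self.headers.get("Content-Length", 0))
        source = self.rfile.read(length).decode("utf-8")
        namespace = {}
        exec(source, namespace)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(repr(namespace.get("result")).encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("", 8000), ExecHandler).serve_forever()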

Noufal Ibrahim
+1  A: 

It's not clear from your question whether there is some system limitation/requirement that makes you solve your problem in this way. If not, there may be much easier and quicker ways of doing this using some sort of messaging infrastructure.

For example, you might consider whether [Celery][1] will meet your needs.

[1]: http://ask.github.com/celery/getting-started/introduction.html
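With Celery, the remote work becomes an ordinary task function, roughly like this (the broker URL and task body are placeholders, and this uses the current Celery API rather than the one in the linked docs):

# tasks.py, deployed on the worker machine
from celery import Celery

app = Celery("tasks", broker="amqp://guest@localhost//")

@app.task
def my_func():
    import MyModule
    return MyModule.my_func()

# caller side: my_func.delay() queues the call for whichever worker picks it up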

rtmie