Our company (xyz) is moving a lot of our Flash code to Python.

In Flash, we have a shared library between our Flash apps - package xyz. We can make changes to the package without fear of breaking other apps when they are deployed because Flash compiles their code and includes the contents of the library. We deploy the final SWF via RPM, and we're done. Updates to App1 and App2 won't ever break App3.

How would you approach this shared-library dependency in Python?

App1, App2, and App3 could all require xyz-lib.rpm and share the same library files, but every updated xyz-lib.rpm would have to be explicitly tested against App1, App2, and App3, and that is onerous.

My current favorite solution: have app1.rpm include a copy of the library as it existed at packaging time - effectively static linking of the library. This feels inelegant, although the only added cost is hard-drive space, which is cheap.

I know that solid management of the shared library is probably the best solution, but I keep factoring in that all the developers are human and will make mistakes. We're going to make mistakes, and I don't want a deployment of App1 to break App2 and App3 - that just means more to test and debug.
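For concreteness, the "static linking" idea could be sketched like this (the directory names and helper below are hypothetical, not from any existing tool): each app's RPM ships a snapshot of the library in a private directory, and the app puts that directory ahead of the system path at startup.

```python
# Sketch of vendoring a library snapshot inside an app package.
# Hypothetical layout baked into app1.rpm at packaging time:
#   app1/
#     __main__.py
#     _vendor/
#       xyz/            <- copy of the shared library, frozen at build time
#         __init__.py
import os
import sys


def use_vendored_libs(app_dir):
    """Put the app's private _vendor directory first on sys.path,
    so `import xyz` resolves to the bundled snapshot rather than
    whatever xyz-lib.rpm happens to be installed system-wide."""
    vendor_dir = os.path.join(app_dir, "_vendor")
    if vendor_dir not in sys.path:
        sys.path.insert(0, vendor_dir)
    return vendor_dir
```

With this arrangement, updating xyz-lib.rpm on the host cannot change what App1 imports; each app only picks up a new library version when it is repackaged.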

+2  A: 

"explicitly tested against App1,2,3 every time there was a new library" actually isn't that onerous.

Two things.

  • You need a formal set of API unit tests that the library must pass. This is just the API, not every nuance of functionality. If this passes, then your changes are good to go. If this fails, your changes broke the API.

  • You also need a set of unit tests for functionality, separate from the API. This is bigger, and might be classified as "onerous".

Once you start unit testing, you get addicted. Once you have reasonably complete tests, this problem is easy to manage.
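An API-level test of the kind described above can be quite small. A sketch, where the `xyz.connect` function and its signature are purely illustrative stand-ins for whatever the real library exposes:

```python
import unittest


# Stand-in for the shared library; in real use this would be `import xyz`.
class xyz:
    @staticmethod
    def connect(host, port=8080):
        return (host, port)


class TestXyzApi(unittest.TestCase):
    """Contract tests: verify only that the public names, signatures,
    and return shapes are stable - not every nuance of behaviour.
    If these fail after a library change, the API broke."""

    def test_connect_is_public_and_callable(self):
        self.assertTrue(callable(xyz.connect))

    def test_connect_keeps_its_default_port(self):
        host, port = xyz.connect("localhost")
        self.assertEqual(host, "localhost")
        self.assertEqual(port, 8080)
```

Run with `python -m unittest` as part of building any app that depends on the library.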

S.Lott
+1  A: 

I've used variations of this cookbook entry to distribute Python apps. Basically, it involves zipping all your Python sources into a zip file, then concatenating that with a shell script which imports the source files.

This can be helpful if you need to give an app its own version of the library.

Jason Baker
+1  A: 

I also favour the solution of packing everything together and limiting the dependency on OS libraries to the minimum (glibc and that's it). Hard-drive space is cheap; customer and support time is not.

On Windows, it's trivial with py2exe + InnoSetup.

On Linux, it looks like bbfreeze is the right way to handle this. Quoting from its homepage, it offers:

  • zip/egg file import tracking: bbfreeze tracks imports from zip files and includes whole egg files if some module is used from an egg file. Packages using setuptools' pkg_resources module will now work (new in 0.95.0)
  • binary dependency tracking: bbfreeze will track binary dependencies and will include DLLs and shared libraries needed by a frozen program.
Bluebird75