views:

182

answers:

2

I'm using Python's unittest to test an external application, but it takes too much time to run the tests one by one.

I would like to know how I can speed up this process by using the power of multiple cores. Can I tweak unittest to execute tests in parallel? How?

This question is not about the Python GIL limitation, because it is not the Python code that takes the time but the external application, which I currently execute via os.system().
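For illustration, each test currently boils down to something like the following sketch (the command name and arguments are just placeholders, not my real application):

    import os
    import unittest

    class ExternalAppTest(unittest.TestCase):
        def test_scenario_one(self):
            # almost all of the runtime is spent inside the external
            # application, not in Python code
            exit_code = os.system("external_app --scenario one")
            self.assertEqual(exit_code, 0)

    if __name__ == "__main__":
        unittest.main()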

+3  A: 

If your tests are not too involved, you may be able to run them using py.test, which has support for distributed testing. If you are not running on Windows, then nose might also work for you.
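With the pytest-xdist plugin, the usual way to spread tests over several local CPUs is the -n option; nose offers something similar through its multiprocess plugin. A rough sketch of the invocations (option names are from memory, so check the plugin documentation for your version):

    # py.test with the xdist plugin: distribute tests across 4 local CPUs
    py.test -n 4

    # nose with its multiprocess plugin
    nosetests --processes=4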

Vinay Sajip
py.test looks very good; I will try to see how I can generate the tests dynamically.
Sorin Sbarnea
For some reason py.test doesn't run my tests well; take a look at http://bitbucket.org/hpk42/py-trunk/issue/76/distributed-test-do-not-work-with-pytest_generate_tests
Sorin Sbarnea
A: 

Maybe you can run each test in a different process using the multiprocessing library. This implies that each unit test (or group of unit tests) must be independent and not need to share state. Multiprocessing will spawn separate processes and so make use of the other cores.

Check specifically the 'Using a pool of workers' section on this page: http://docs.python.org/library/multiprocessing.html#using-a-pool-of-workers

EDIT: This module is included in the standard library since Python 2.6.
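A minimal sketch of that approach, assuming each test reduces to running one external command (the command list below is made up for illustration):

    import os
    from multiprocessing import Pool

    # placeholder commands; in practice these would be the external
    # application invocations currently run via os.system()
    COMMANDS = [
        "external_app --scenario one",
        "external_app --scenario two",
        "external_app --scenario three",
    ]

    def run_command(cmd):
        # each worker process runs one external command and reports its exit status
        return cmd, os.system(cmd)

    if __name__ == "__main__":
        pool = Pool()  # defaults to one worker per CPU core
        results = pool.map(run_command, COMMANDS)
        pool.close()
        pool.join()
        for cmd, status in results:
            print("%s -> %s" % (cmd, "OK" if status == 0 else "FAILED"))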

Khelben