My Django unit tests take a long time to run, so I'm looking for ways to speed them up. I'm considering installing an SSD, but I know that has its downsides too. There are things I could do with my code, of course, but I'm looking for a structural fix. Even running a single test is slow, since the database needs to be rebuilt / South-migrated every time. So here's my idea...

Since I know the test database will always be quite small, why can't I just configure the system to keep the entire test database in RAM and never touch the disk at all? Does anybody know how to configure this in Django? I'd prefer to keep using MySQL, since that's what I use in production, but if SQLite or something else makes this easy, I'd go that way.

Does SQLite or MySQL have an option to run entirely in memory? It should be possible to configure a RAM disk and then point the test database's data there, but I'm not sure how to tell Django / MySQL to use a different data directory for a certain database, especially since it keeps getting erased and recreated on each run. (I'm on a Mac, FWIW.)

Any pointers or experience would be appreciated. Apologies if this isn't literally a question about code, but it's definitely a software engineering problem, and I bet an elegant solution would benefit a lot of people.

+2  A: 

MySQL supports a storage engine called "MEMORY", which you can configure in your database config (settings.py) like this:

    'USER': 'root',
    'PASSWORD': '',
    'OPTIONS': {
        "init_command": "SET storage_engine=MEMORY",
    },

Note that the MEMORY storage engine doesn't support BLOB/TEXT columns, so if any of your models use django.db.models.TextField, this won't work for you.
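
For example, here's a minimal sketch (assuming Django 1.2-style DATABASES settings; 'mydb' and the credentials are placeholders) that only switches to the MEMORY engine when running tests:

    import sys

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'mydb',
            'USER': 'root',
            'PASSWORD': '',
        }
    }

    # Only set the MEMORY engine for test runs, so normal runs keep
    # the regular storage engine.
    if 'test' in sys.argv:
        DATABASES['default']['OPTIONS'] = {
            "init_command": "SET storage_engine=MEMORY",
        }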

muudscope
+3  A: 

I can't answer your main question, but there are a couple of things that you can do to speed things up.

Firstly, make sure your MySQL database is set up to use InnoDB. It can then use transactions to roll back the state of the database before each test, which in my experience has led to a massive speed-up. You can pass a database init command in your settings.py (Django 1.2 syntax):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'NAME': 'mydb',
        'USER': 'whoever',
        'PASSWORD': 'whatever',
        'OPTIONS': {"init_command": "SET storage_engine=INNODB"},
    }
}

Secondly, you don't need to run the South migrations each time. Set SOUTH_TESTS_MIGRATE = False in your settings.py and the test database will be created with a plain syncdb, which is much quicker than running through all the historical migrations.
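
That's a one-line addition to settings.py:

    # Build the test database with plain syncdb instead of replaying South migrations.
    SOUTH_TESTS_MIGRATE = False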

Daniel Roseman
+5  A: 

If you set your database engine to sqlite3 when you run your tests, Django will use an in-memory database.

I'm using code like this to set the engine to sqlite when running my tests:

import sys

if 'test' in sys.argv:
    DATABASE_ENGINE = 'sqlite3'
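
If you're on Django 1.2's DATABASES dict instead, a sketch of the same idea (assuming your normal config lives in DATABASES['default']) would be:

    import sys

    if 'test' in sys.argv:
        # Replace the whole entry so MySQL-specific OPTIONS don't leak into sqlite3.
        DATABASES['default'] = {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': ':memory:',  # the test runner keeps the sqlite3 test DB in memory
        }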
Etienne
Do you put that in your settings.py?
Leopd
Yes, exactly. I should have put that in my answer! Combine that with SOUTH_TESTS_MIGRATE = False and your tests should be a lot faster.
Etienne
Awesome! With these two changes, the overhead for running a single fast test on my system has gone from 12 seconds down to <100ms.
Leopd