My django unit tests take a long time to run, so I'm looking for ways to speed them up. I'm considering installing an SSD, but I know that has its downsides too. There are things I could do with my code, of course, but I'm looking for a structural fix. Even running a single test is slow, since the database has to be rebuilt / South-migrated every time. So here's my idea...
Since I know the test database will always be quite small, why can't I just configure the system to keep the entire test database in RAM and never touch the disk at all? Does anybody know how to configure this in django? I'd prefer to keep using mysql, since that's what I use in production, but if sqlite3 or something else makes this easy, I'd go that way.
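To make the sqlite3 route concrete, here's a sketch of what I'm imagining in settings — the database names and credentials are placeholders, and as far as I know django's test runner already builds the test database in memory when the sqlite3 backend is active:

```python
# settings.py (sketch) -- swap to sqlite3 only when running tests,
# so the test database lives entirely in RAM.
import sys

if 'test' in sys.argv:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            # django uses an in-memory test DB for sqlite by default;
            # ':memory:' just makes that explicit.
            'NAME': ':memory:',
        }
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'myapp',        # placeholder production settings
            'USER': 'myapp',
            'PASSWORD': 'secret',
            'HOST': 'localhost',
        }
    }
```

The obvious downside is that tests would then run against sqlite rather than the mysql engine I use in production, which is exactly the trade-off I'm unsure about.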
Does sqlite or mysql have an option to run entirely in memory? Alternatively, it should be possible to set up a RAM disk and point the test database's storage at it, but I'm not sure how to tell django / mysql to use a different datadir for one particular database, especially since that database keeps getting erased and recreated on each run. (I'm on a Mac, FWIW.)
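For the RAM-disk variant, this is roughly the incantation I've seen for creating one on macOS — the size, mount name, and mysql datadir path below are guesses, and I haven't confirmed that symlinking a single database directory out of the datadir plays nicely with InnoDB:

```shell
# Create a 64 MB RAM disk on macOS. ram://N is in 512-byte sectors,
# so 131072 sectors * 512 bytes = 64 MB.
DEV=$(hdiutil attach -nomount ram://131072)
diskutil erasevolume HFS+ 'ramdisk' "$DEV"   # mounts at /Volumes/ramdisk

# Then (untested idea): stop mysqld and symlink the test database's
# directory onto the RAM disk, e.g.:
#   mv /usr/local/var/mysql/test_myapp /Volumes/ramdisk/
#   ln -s /Volumes/ramdisk/test_myapp /usr/local/var/mysql/test_myapp
```

The wrinkle, as mentioned, is that django drops and recreates the test database on each run, which I'd expect to blow away the symlink.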
Any pointers or experience would be appreciated. Apologies if this isn't literally a question about code, but it's definitely a software engineering problem, and I bet an elegant solution would benefit a lot of people.