I'm building a web application using the Django 1.1 framework against an imposed database schema and data (in fact, the database already exists, in PostgreSQL). I have already written the models, and now I want to do some unit testing.

The problem: the test runner destroys and reconstructs the database (using information from the models) after every test method, which is undesirable. I'd like to preserve at least the schema the whole time; cleaning the data is acceptable. Is there a good way to obtain this behaviour?

(One solution is to use the pure unittest module and set up/clean everything manually, but that's unsatisfactory.)
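For reference, the manual route dismissed above could look roughly like this: a plain `unittest.TestCase` that wipes data (but not schema) around each test. This is only a sketch; `FakeConnection` stands in for a real DB-API connection (e.g. a `psycopg2` connection), and the table names are hypothetical.

```python
import unittest

TABLES = ['customer', 'order_item']  # hypothetical legacy tables

class FakeConnection:
    """Stand-in for a real DB-API connection; records executed SQL."""
    def __init__(self):
        self.executed = []

    def execute(self, sql):
        self.executed.append(sql)

class LegacyDBTestCase(unittest.TestCase):
    """Base class: keep the existing schema, truncate data after each test."""

    def setUp(self):
        # Replace with a real connection to the legacy database.
        self.conn = FakeConnection()

    def tearDown(self):
        # Clean the data but leave the schema untouched.
        for table in TABLES:
            self.conn.execute('TRUNCATE %s CASCADE' % table)

    def test_placeholder(self):
        self.assertTrue(True)
```

The obvious downside, as noted, is that every fixture and cleanup step must be maintained by hand.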

A: 

This doesn't give you the behavior you're asking for, just a potential alternate behavior:

While I deploy and run integration tests against my actual legacy database, I run unit tests against a SQLite database. Switching the DB engine is a small configuration change, and it ends up being faster and avoids clobbering any other work I'm doing.
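A sketch of that configuration switch, using the single-database settings names of the Django 1.1 era (the database names here are made-up examples); the helper function is just one way to pick the engine based on the command line:

```python
# settings.py (sketch): use SQLite when running tests, Postgres otherwise.
import sys

def database_settings(argv):
    """Return (engine, name) for this run: in-memory SQLite for tests."""
    if 'test' in argv:
        # Fast, and never touches the real database.
        return ('sqlite3', ':memory:')
    return ('postgresql_psycopg2', 'legacy_db')

DATABASE_ENGINE, DATABASE_NAME = database_settings(sys.argv)
```

With this in place, `python manage.py test` builds a throwaway SQLite database from the models, while `runserver` and friends keep using Postgres.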

Michael Greene
Would not help me much - I'd like to check (among other things) the validity of the mapping between the particular legacy DB and the brand-new Django models with unit tests, so switching to SQLite and letting Django construct the database from the models' schema information would defeat the purpose. Just a different use case. Thanks for posting, though.
gorsky
A: 

After some re-googling (the first attempt was several weeks ago, and I couldn't find this because it only appeared a month ago) I've found this topic, which led me to django-test-utils; its persistent database test runner (e.g. python manage.py quicktest) solves my case (and it seems to be a good app in general). In addition, I had to point the TEST_DATABASE_NAME option in settings.py at my main database to fit my needs.
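That settings tweak amounts to something like the fragment below (a sketch using Django 1.1-style single-database settings; the database names are placeholders for your own):

```python
# settings.py (sketch): make the test runner reuse the existing database,
# so its schema is preserved instead of being rebuilt from the models.
DATABASE_ENGINE = 'postgresql_psycopg2'
DATABASE_NAME = 'legacy_db'          # placeholder name
TEST_DATABASE_NAME = 'legacy_db'     # point tests at the same database
```

Be careful with this: pointing tests at a database you care about means test data cleanup is your responsibility.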

gorsky