I have a JSON fixture which loads in the development environment but fails to load in the server environment. The error says: "DatabaseError: value too long for type character varying(50)"

My development environment is Windows & Postgres 8.4. The server runs Debian and Postgres 8.3. Database encoding is UTF8 in both systems.

It is as if Unicode markers in the fixture count as characters on the server, causing some strings to exceed their field's max length. However, that does not happen in the dev environment.
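To illustrate the suspicion (a sketch in Python; the string is made up):

# A UTF8 database checks varchar(50) against the character count, but an
# SQL_ASCII database counts bytes, so multi-byte characters inflate the length.
s = u"Ünïcödé strïng löaded fröm the fixture"  # hypothetical fixture value
print(len(s))                  # character count
print(len(s.encode("utf-8")))  # byte count - larger whenever non-ASCII chars appear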

I appreciate any help. Thanks in advance.

A: 

Get the real SQL query on both systems and see what is different.
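For instance, from the Django side (a minimal sketch, assuming DEBUG = True so queries get recorded; the fixture name is hypothetical):

# Capture the SQL that loaddata actually sends to the database.
# Requires settings.DEBUG = True, otherwise connection.queries stays empty.
from django.core.management import call_command
from django.db import connection

call_command("loaddata", "myfixture.json", verbosity=2)  # hypothetical fixture
for q in connection.queries:
    print(q["sql"])

Alternatively, setting log_statement = 'all' in postgresql.conf makes the server itself log every statement.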

Frank Heikens
A: 

I just encountered this same issue this afternoon, and I have a fix (of sorts).

This post here implied it's a Django bug to do with the length of the value allowed for auth_permission. Further digging backs up that idea, as does this Django ticket (even though it's initially MySQL-related).

It's basically that a permission name is created from the verbose_name of a model plus a descriptive permission string, and that can exceed the 50 characters allowed in auth.models.Permission.name.

To quote a comment on the Django ticket:

The longest prefixes for the string value in the column auth_permission.name are "Can change " and "Can delete ", both with 11 characters. The column maximum length is 50 so the maximum length of Meta.verbose_name is 39.
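To make that arithmetic concrete (a simplified sketch, not Django's actual source; the verbose_name is made up):

# Default permissions are named "Can add/change/delete <verbose_name>",
# and the result must fit into auth_permission.name, a varchar(50).
verbose_name = "quite long and descriptive model name goes here"  # 47 chars, hypothetical

for action in ("add", "change", "delete"):
    name = "Can %s %s" % (action, verbose_name)
    status = "ok" if len(name) <= 50 else "too long"
    print("%d (%s): %s" % (len(name), status, name))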

One solution would be to hack that column to support more than 50 characters (ideally via a South migration, I say, so that it's easily repeatable), but the quickest, most reliable fix I could think of was simply to make my extra-long verbose_name definition a lot shorter (from 47 characters down to around 20). All works fine now.
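For example (a hypothetical model; the names are illustrative):

from django.db import models

class WorkflowStepTransitionAuditRecord(models.Model):  # hypothetical
    class Meta:
        # Keep this at 39 characters or fewer so "Can change <verbose_name>"
        # still fits within auth_permission.name's varchar(50)
        verbose_name = "workflow audit record"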

stevejalim
Thanks for the reply, but your solution is limited to auth. Unfortunately, any char field seems to be able to raise this error in my case.
shanyu
It won't be any CharField - it'll only be those which are 50 characters long. Run syncdb with --verbosity=2 to see which one it's choking on specifically, and that might give you a lead as to which models are actually causing the pain. Also, are the locales correct on the Debian box? I recall that it defaults to something other than UTF8 or UTF16 depending on where you are (e.g., in GB it defaults to LATIN1, which explodes when fed non-ASCII characters).
stevejalim
A: 

Well, what made the difference was the encoding of the template databases. On the production server they had ASCII encoding, while on the dev box it was UTF-8.

By default, Postgres creates a database using template1. My understanding is that if its encoding is not UTF-8, then any database you create will have this issue, even if you create it with UTF-8 encoding.
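You can confirm this quickly (a sketch using psycopg2; the connection details are assumptions):

# Compare the template encodings on the two boxes.
import psycopg2

conn = psycopg2.connect("dbname=postgres user=postgres")  # hypothetical credentials
cur = conn.cursor()
cur.execute("SELECT datname, pg_encoding_to_char(encoding) FROM pg_database")
for datname, encoding in cur.fetchall():
    print("%s: %s" % (datname, encoding))

psql's \l listing shows the same Encoding column.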

Therefore I dropped template1 and recreated it with its encoding set to UTF8. The snippet below does it (taken from here):

psql -U postgres

-- Allow connections to template0 so it can be used while template1 is gone
UPDATE pg_database SET datallowconn = TRUE WHERE datname = 'template0';
\c template0
-- template1 cannot be dropped while it is flagged as a template
UPDATE pg_database SET datistemplate = FALSE WHERE datname = 'template1';
DROP DATABASE template1;
-- Recreate template1 from template0 ('UNICODE' is an alias for UTF8)
CREATE DATABASE template1 WITH TEMPLATE = template0 ENCODING = 'UNICODE';
UPDATE pg_database SET datistemplate = TRUE WHERE datname = 'template1';
\c template1
-- Put template0 back to its default no-connections state
UPDATE pg_database SET datallowconn = FALSE WHERE datname = 'template0';

Now the fixture loads smoothly.

shanyu