Hello, I have a database in which all text fields and tables explicitly specify 'utf8' as the default character set, but the data in this database is actually stored as 'cp1257'. I can display the symbols correctly only when I use SET NAMES 'cp1257' or \C cp1257; without those instructions the display is broken, presumably because the server tries to return the data as 'utf8'. What could I do to get all the data out of the database (perhaps as a backup) as 'cp1257' and put it back as valid 'utf8'?

A: 

You need to convert the field to a binary type first and then change it back to a text type with cp1257 specified. This way MySQL will know which charset the field is actually stored in:

ALTER TABLE `table_name` CHANGE `field` `field` BLOB;
ALTER TABLE `table_name` CHANGE `field` `field` TEXT CHARACTER SET 'cp1257';

If you want to change the way the data is stored, you can now convert the field to any charset (MySQL will do the actual data conversion in this step):

ALTER TABLE `table_name` CHANGE `field` `field` TEXT CHARACTER SET 'utf8';
Michal Čihař
Thanks for your answer. I have many tables like this; can I do the conversion on the whole database without having to convert each table (and even each field) separately?
pk4r
You can't do this in SQL alone, but you can definitely script something that just generates the SQL queries for you; a rough sketch follows below.
Michal Čihař
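For illustration, here is a minimal sketch of that generate-the-queries idea, done directly against INFORMATION_SCHEMA (available in MySQL 5.0 and later); a short script in any language would work just as well. 'mydb' is a placeholder for the real database name, the generated statements still have to be saved and executed as a second step, and the same pattern covers the other two ALTERs:

-- emits one "convert to BLOB" statement per plain TEXT column in mydb
SELECT CONCAT('ALTER TABLE `', TABLE_NAME, '` CHANGE `', COLUMN_NAME,
              '` `', COLUMN_NAME, '` BLOB;') AS stmt
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'mydb' AND DATA_TYPE = 'text';
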
This seems a bit complicated, and the first ALTER (the conversion to BLOB) doesn't work for me, perhaps because `field` is used in a KEY(..). I am thinking about a trick with mysqldump: I've successfully exported my database to an .sql file, and I can even see valid characters in an editor (if I set the UTF-8 encoding option), so the file is Unicode. How could I import the data as it is now? Importing it back in the usual way (mysql -u user -p database < sqlfile.sql) gives the same result.
pk4r
If the field was originally `VARCHAR`, you should use `VARBINARY` rather than `BLOB` as the intermediate type for the conversion; an example follows below.
Michal Čihař
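For example, assuming a hypothetical column `name` declared as VARCHAR(100) in a table `table_name`, the equivalent sequence would look like this:

ALTER TABLE `table_name` CHANGE `name` `name` VARBINARY(100);
ALTER TABLE `table_name` CHANGE `name` `name` VARCHAR(100) CHARACTER SET 'cp1257';
ALTER TABLE `table_name` CHANGE `name` `name` VARCHAR(100) CHARACTER SET 'utf8';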