I am converting a website from ISO to UTF-8, so I need to convert the MySQL database too.
I've read various solutions online, but I don't know which one to choose.
Do I really need to convert each varchar column to binary and then back to UTF-8, like this:
ALTER TABLE t MODIFY col BINARY(150);
ALTER TABLE t MODIFY col CHAR(150) CHARACTER SET utf8;
Doing that for every column of every table in every database takes a long time.
I have 10 databases with 20 tables each, each with around 2-3 varchar columns (2 queries per column), which gives me around 1,000 queries to write! How can I do this?
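If you do go the per-column route, the repetitive statements don't have to be written by hand. Here is a sketch in plain POSIX shell (the emit_alters helper and the sample names mydb/t/col/150 are hypothetical, not from the original post) that prints the two ALTER statements for one column; in practice you could feed it a column list pulled from information_schema.columns:

```shell
#!/bin/sh
# Hypothetical helper: prints the two-step binary round-trip for one column.
# Caveat: BINARY/CHAR are fixed-length and pad values with trailing bytes;
# for VARCHAR columns the VARBINARY/VARCHAR pair is usually safer.
emit_alters() {
  db=$1; tbl=$2; col=$3; size=$4
  printf 'ALTER TABLE `%s`.`%s` MODIFY `%s` BINARY(%s);\n' "$db" "$tbl" "$col" "$size"
  printf 'ALTER TABLE `%s`.`%s` MODIFY `%s` CHAR(%s) CHARACTER SET utf8;\n' "$db" "$tbl" "$col" "$size"
}

# Example for a single column; to cover everything, loop over the output of
# a query such as:
#   SELECT table_schema, table_name, column_name, character_maximum_length
#   FROM information_schema.columns
#   WHERE data_type = 'varchar';
emit_alters mydb t col 150
```

Redirect the generated statements into a .sql file and run it once per server instead of typing 1,000 queries by hand.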
Resolved:
Here is the code I used:
#!/bin/sh
PASSWORD=""
db=$1

# Dump the database without any character-set information
# (the original command also passed --set-charset, which --skip-set-charset
# overrides, so only the latter is kept here)
mysqldump --password="$PASSWORD" --skip-set-charset --add-drop-table --databases "$db" > /home/dev/backup/bdd.sql

# Switch the database default to UTF-8
QUERY="ALTER DATABASE \`$db\` DEFAULT CHARACTER SET utf8;"
mysql --password="$PASSWORD" --database "$db" -e "$QUERY"

# Reload the dump; the utf8 connection charset recodes the data on the way in
mysql --password="$PASSWORD" --default-character-set=utf8 < /home/dev/backup/bdd.sql
See the answer below for more information.
Solution
You can do that very easily using a dump. Make a dump with:
mysqldump --skip-opt --skip-set-charset
Then create another database, set its default character set to UTF-8, and load your dump back with:
mysql --default-character-set=utf8
The main idea is to make a dump without any character-set information in it.
That way, at creation time the tables' encoding is inherited from the database default and becomes UTF-8, and with --default-character-set=utf8 we tell MySQL to recode the data automatically.
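The recoding step itself can be illustrated outside MySQL. In this standalone sketch (the file paths are arbitrary, and iconv merely stands in for the conversion the server performs via the connection character set), a Latin-1 byte sequence is turned into UTF-8 at the byte level:

```shell
#!/bin/sh
# Illustration only: recode a Latin-1 byte sequence to UTF-8.
# 0xE9 is "é" in ISO-8859-1; in UTF-8 the same character is the
# two bytes 0xC3 0xA9.
printf 'caf\351\n' > /tmp/latin1.txt
iconv -f ISO-8859-1 -t UTF-8 /tmp/latin1.txt > /tmp/utf8.txt
cat /tmp/utf8.txt
# prints: café
```

This is exactly why the dump must carry no charset declarations: only then is MySQL free to apply one consistent conversion as the data flows back in.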