python - Database agnostic way to say I want a column to be UTF-8 in SqlAlchemy?
I noticed my MySQL database isn't set to UTF-8 by default; the latin1_swedish_ci collation is selected instead.
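For reference, the default collation can be checked from Python with a short SQLAlchemy sketch like the following (the connection URL, driver, and credentials are placeholders, and Result.one() assumes SQLAlchemy 1.4+):

from sqlalchemy import create_engine, text

engine = create_engine('mysql+pymysql://user:password@localhost/mydb')
with engine.connect() as conn:
    # Ask MySQL for the database-level default collation.
    row = conn.execute(text("SHOW VARIABLES LIKE 'collation_database'")).one()
    print(row)  # e.g. ('collation_database', 'latin1_swedish_ci')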
So naturally I ran into a user-reported bug: the application didn't support special characters. I went and made sure the app handled UTF-8 correctly, wrote a test, and sure enough everything worked fine on the in-memory SQLite database but not on production MySQL. The solution, per the SQLAlchemy documentation, seems to be to pass a collation on the column:
column = db.Column(db.String(500, collation='utf8_general_ci'))

Sadly, this causes the unit tests built around SQLite to fail: utf8_general_ci isn't a collation SQLite supports; it is MySQL-specific.
SQLite seems fine with UTF-8 without specifying a collation. I can always test against MySQL, but an in-memory SQLite database is a much faster and more straightforward alternative for spot tests. Since I prioritize ease of testing, getting this to work when testing with SQLite is a big priority for me.
Other things I've tried

I've tried adding charset=utf8&use_unicode=1 to the connection string, and I've used db.Unicode instead of db.String. Neither seemed to make any difference.
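A minimal sketch of both attempts, assuming Flask-SQLAlchemy (which the db.Column/db.String style suggests) and the pymysql driver; the credentials, database name, and model are placeholders:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# Attempt 1: force the charset on the connection string itself.
app.config['SQLALCHEMY_DATABASE_URI'] = (
    'mysql+pymysql://user:password@localhost/mydb?charset=utf8&use_unicode=1'
)
db = SQLAlchemy(app)

class Message(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    # Attempt 2: db.Unicode instead of db.String; made no difference either.
    text = db.Column(db.Unicode(500))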
Is there a straightforward, database-agnostic way through SQLAlchemy to indicate that a column should be encoded as UTF-8?
What solved the problem for me was setting the collation on the table via __table_args__:
class Foo(Base):
    __tablename__ = "foo"
    __table_args__ = {'mysql_collate': 'utf8_general_ci'}
    ...
    column = db.Column(db.String(500))

SQLite happily ignores the MySQL-specific setting, and MySQL picks it up appropriately.
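For completeness, here is a minimal self-contained sketch of that table-level approach, written against plain SQLAlchemy (declarative_base rather than Flask-SQLAlchemy's db.Model, SQLAlchemy 1.4+ assumed) and run against an in-memory SQLite engine; the model and test string are illustrative only:

from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Foo(Base):
    __tablename__ = "foo"
    # Picked up by the MySQL dialect, silently ignored by SQLite and others.
    __table_args__ = {'mysql_collate': 'utf8_general_ci'}

    id = Column(Integer, primary_key=True)
    column = Column(String(500))

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)  # no error: SQLite skips mysql_collate

with Session(engine) as session:
    session.add(Foo(column="åäö and other special characters"))
    session.commit()

Against a MySQL engine the same model adds a COLLATE clause to the generated CREATE TABLE statement, so the table ends up with the desired collation without any SQLite-breaking column arguments.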
python mysql sqlite utf-8 sqlalchemy