handle special characters in mysql tables with latin charsets #23
Comments
Hm, good question. That's a tough one. My idea would be to tell MySQL on insert/update that the given string is UTF-8 encoded, which seems to be possible with SELECT _utf8'some text';. But that would require … And for reading (to compare the current DB state) the translation would be necessary again. OR, another approach would be to inspect the current DB schema and do the necessary conversions in Python code. Hm, I have to think about it and try different approaches. Is just using UTF-8 in your database not an option? I don't know of any workarounds. If the data are used to be shown on a web page, maybe encoding the umlauts as HTML entities could be a solution (e.g. …).
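For what it's worth, the character-set introducer idea could look roughly like this outside the module; a minimal sketch assuming pymysql and an invented settings table with a latin1 value column (using the hex-literal form of the introducer to keep the bytes unambiguous), not the module's actual code:

```python
# Sketch only: the "character-set introducer" idea from the comment above,
# tried outside the module. Connection details and the settings table are
# made up for illustration.
import pymysql

conn = pymysql.connect(host="localhost", user="user", password="secret",
                       database="legacy_db", charset="latin1")

value = "Grüße"  # UTF-8 text as it arrives from Ansible

# Send the exact UTF-8 bytes as a hex literal and declare their charset with
# the _utf8mb4 introducer; MySQL then converts them to the latin1 column.
hex_utf8 = value.encode("utf-8").hex()
sql = (f"INSERT INTO settings (name, value) "
       f"VALUES (%s, _utf8mb4 X'{hex_utf8}')")

with conn.cursor() as cur:
    cur.execute(sql, ("greeting",))
conn.commit()
```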
Thank you for looking into this so quickly! I think the conversion will be easiest to do in Python. Detecting the table's schema would be nice to have, but it is not necessary, as you normally know the encoding of the tables you deal with. A parameter to the ansible-mysql-query module, e.g. output_encoding, would do.
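Just to sketch what such a parameter could mean in Python terms (the helper and parameter names here are invented for illustration, not part of the module):

```python
# Hypothetical helpers for a proposed output_encoding parameter: re-encode the
# module's UTF-8 values into the table's charset before writing or comparing,
# and decode values read back out.
def to_db_encoding(value, output_encoding="latin-1"):
    """Encode a Python str into the target table's charset."""
    if isinstance(value, str):
        return value.encode(output_encoding)
    return value

def from_db_encoding(value, output_encoding="latin-1"):
    """Decode bytes read from the database back into a Python str."""
    if isinstance(value, (bytes, bytearray)):
        return bytes(value).decode(output_encoding)
    return value

# Round trip: "Grüße" survives storage in a latin1 column this way.
stored = to_db_encoding("Grüße")            # b'Gr\xfc\xdfe'
assert from_db_encoding(stored) == "Grüße"
```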
Ah, now I get your idea (concerning …).
Maybe the name …
May I take up your offer to ping you on this issue, and wish you a happy new year!
Ansible uses UTF-8. When using ansible-mysql-query to update MySQL tables with latin encodings, special characters (German umlauts in my use case) get messed up.
Is it possible to add functionality along the lines of Ansible issue #121 and Ansible pull request #42171?
Is there a workaround that can be used in the meantime?
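For illustration, the corruption described in this issue is what happens when UTF-8 bytes end up being interpreted as latin1 (a standalone Python example, independent of the module):

```python
# UTF-8 bytes read back as latin1 turn each umlaut into two characters.
original = "Müller"
utf8_bytes = original.encode("utf-8")    # b'M\xc3\xbcller'
misread = utf8_bytes.decode("latin-1")   # how a latin1 table/client shows it
print(misread)                           # -> MÃ¼ller
```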