On 22.03.2012 15:07, Oliver Welter wrote:
> What locale are you using - us or ru ?
Any of them, with equal success.
> Sure I tried FF and Chrome but also my Shell Testing Environment - with
> your utf8 flagging applied, the data passes the serialization
> successfully. I used a unicode aware regex (\p{Letter}) in
> Validator/CertSubjectParts to check what is moving in - surprisingly,
> setting only Encode::_utf8_on does leave the string in the "two bytes
> encoding" and the regex fails. I then applied a utf8:.decode and the
> regex passes - so at this point, I am sure it is valid utf8 data.
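The behaviour described above — a `\p{Letter}` match failing until a real decode is applied — comes down to whether the string holds raw UTF-8 bytes or decoded characters. A minimal sketch of the same distinction (in Python, which separates the two types explicitly; Perl's `Encode::_utf8_on` merely flips an internal flag without re-interpreting the bytes, which is why it is not enough on its own):

```python
import re

# "ü" on the wire is two bytes; a letters-only match fails on the raw bytes.
raw = "Müller".encode("utf-8")                  # b'M\xc3\xbcller'
assert re.fullmatch(rb"[A-Za-z]+", raw) is None

# Only an actual decode turns the bytes into characters a Unicode-aware
# pattern (the analogue of \p{Letter}) can see.
text = raw.decode("utf-8")
assert re.fullmatch(r"\w+", text) is not None   # \w is Unicode-aware on str
```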
Just in case: "unicode data" and "utf-8 data" are two different beasts.
And your shell testing environment makes me uneasy: the mechanism
employed explicitly relies on there being a standard browser on the
client side. It is the browser that:
1) converts text from your locale to UTF-8 when sending it to the
server (and this feature is essential for the rest of the data
processing),
2) converts text from UTF-8 to your locale when sending it to the
client (and this feature is useful if you want to read the text).
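The two browser-side conversions above can be sketched like this (a Python sketch; cp1251 is assumed as the Russian locale encoding purely for the example):

```python
s = "Привет"                        # what the user types in a ru locale
wire = s.encode("utf-8")            # (1) browser -> server: UTF-8 bytes
assert wire != s.encode("cp1251")   # UTF-8 bytes differ from the locale bytes

back = wire.decode("utf-8")         # server side sees characters again
assert back == s

shown = back.encode("cp1251")       # (2) on the way back, the browser
assert shown.decode("cp1251") == s  #     renders UTF-8 in the user's locale
```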
> The amazing point: If I do the decode call on the "global" data, I get a
> proper display on the UI but double encoding in the database and a
> broken character in the certificate. If I convert it only locally, the
> certificate is fine but the UI is broken - looks like there is some other
> kind of conversion happening.
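The "proper display on the UI but double encoding in the database" symptom is the classic signature of UTF-8 bytes being decoded and re-encoded a second time. A sketch of what double encoding looks like (Python; latin-1 is assumed as the mistaken intermediate encoding, and the byte values are what would end up in the database):

```python
s = "ü"
once = s.encode("utf-8")                         # b'\xc3\xbc' — correct storage

# If those bytes are mistaken for latin-1 text and encoded to UTF-8 again:
twice = once.decode("latin-1").encode("utf-8")   # b'\xc3\x83\xc2\xbc'
assert twice != once
assert twice.decode("utf-8") == "Ã¼"             # the familiar mojibake
```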
Can you tar your Apache configuration and send it to my private address
please?
_______________________________________________
OpenXPKI-devel mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/openxpki-devel