Maybe your JVM's default charset has changed?  Try -Dfile.encoding="UTF-8"
when you start Java.
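For example, with the standalone tools jar, the invocation would look roughly
like this (the jar and file names here are just placeholders for whatever
you're actually running):

  java -Dfile.encoding=UTF-8 -jar avro-tools.jar fromjson \
      --schema-file schema.avsc input.json > output.avro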

Even if that fixes things, it's perhaps still a bug.  The tool should
probably not depend on the default charset, but should explicitly set its
expected input encoding.  So, if that's the problem, please file an issue.
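By "explicitly set its expected input encoding" I mean something along these
lines when the tool opens its JSON input (just an illustrative sketch, not
the tool's actual source):

  import java.io.FileInputStream;
  import java.io.IOException;
  import java.io.InputStreamReader;
  import java.io.Reader;
  import java.nio.charset.StandardCharsets;

  class Utf8InputExample {
    // Fragile: new InputStreamReader(in) decodes with the JVM default
    // charset, which is what the -Dfile.encoding workaround papers over.
    static Reader openWithDefaultCharset(String path) throws IOException {
      return new InputStreamReader(new FileInputStream(path));
    }

    // Better: always decode the JSON input as UTF-8, regardless of the
    // platform default.
    static Reader openAsUtf8(String path) throws IOException {
      return new InputStreamReader(
          new FileInputStream(path), StandardCharsets.UTF_8);
    }
  }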

Doug

On Mar 17, 2017 2:06 AM, "nick lawson" <[email protected]> wrote:

> In the past I have used the avro-tools jar "fromjson" to convert a JSON file
> containing UTF-8 multibyte chars to Avro as expected.  This data is type
> "bytes" in the schema.
>
> Today this isn't working for me - instead the multibyte characters are each
> represented in my Avro output as a single ? (question mark).
>
> No doubt this is due to me changing something in my environment. Does anyone
> know what I need to set/download to get back to normal running?
>
> Thanks,
>
> Nick
>
