Hello,
Per the documentation, Spark's default character encoding is UTF-8. But
when I try to read non-ASCII characters, Spark tends to read them as
question marks. What am I doing wrong? Below is my syntax:
val ds = spark.read.textFile("a .bz2 file from hdfs");
ds.show();
An example of an affected string is "KøBENHAVN".
My terminal can display UTF-8 encoded characters; I already verified that,
but I will double-check.
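For what it's worth, question marks in place of non-ASCII characters often mean the bytes were decoded or rendered with a non-UTF-8 charset somewhere along the way (e.g. the JVM default charset or the terminal), not that the file itself is wrong. A minimal JVM-level sketch (plain Java, not Spark-specific; the class name is mine) of this effect:

```java
import java.nio.charset.StandardCharsets;

public class EncodingDemo {
    public static void main(String[] args) {
        String s = "KøBENHAVN";
        // The UTF-8 bytes are fine: 'ø' (U+00F8) encodes as 0xC3 0xB8.
        byte[] utf8 = s.getBytes(StandardCharsets.UTF_8);
        // Decoding those same bytes as US-ASCII replaces each invalid
        // byte with U+FFFD, which many terminals draw as '?'.
        String misread = new String(utf8, StandardCharsets.US_ASCII);
        System.out.println(misread);
        // The JVM's default charset is worth checking on the executors:
        System.out.println(System.getProperty("file.encoding"));
    }
}
```

If `file.encoding` is not UTF-8 on the driver or executors, passing `-Dfile.encoding=UTF-8` via the extra Java options may be worth trying.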
Thanks!
--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/