Hi,

I am having a similar problem and tried your solution with the Spark 1.2
build with Hadoop.

I am saving objects to Parquet files where some fields are of type Array.

When I fetch them as below, I get:

 java.lang.ClassCastException: [B cannot be cast to java.lang.CharSequence



def fetchTags(rows: SchemaRDD) = {
  rows.flatMap(x =>
    x.getAs[Buffer[CharSequence]](0).map(_.toString()))
}
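For reference, the exception suggests the values come back as raw byte arrays ([B is the JVM name for Array[Byte]) rather than CharSequence. A minimal sketch of decoding such values manually, assuming the strings were written as UTF-8 (the helper name is hypothetical, and plain byte arrays stand in for the Parquet values):

```scala
import scala.collection.mutable.Buffer

// Hypothetical helper: decode a field that arrives as a Buffer of
// byte arrays instead of a Buffer[CharSequence].
def decodeTags(raw: Buffer[Array[Byte]]): Seq[String] =
  raw.map(bytes => new String(bytes, "UTF-8"))

// Plain byte arrays standing in for values read back from Parquet.
val raw = Buffer("scala".getBytes("UTF-8"), "spark".getBytes("UTF-8"))
val tags = decodeTags(raw)
```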



The values I am fetching were stored as an Array of Strings. I have tried
replacing Buffer[CharSequence] with Array[String], Seq[String], and
Seq[Seq[Char]], but still got errors.
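For what it's worth, I also wonder whether this is the Parquet binary-as-string issue, where Spark SQL reads BINARY columns written without a UTF8 annotation back as byte arrays. A sketch of the setting, assuming a Spark 1.x SQLContext (the file path is hypothetical):

```scala
// Tell Spark SQL to interpret Parquet BINARY columns as strings
// (some writers store strings without the UTF8 annotation).
sqlContext.setConf("spark.sql.parquet.binaryAsString", "true")
val rows = sqlContext.parquetFile("/path/to/tags.parquet")
```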

Can you provide a clue?

Pankaj



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Reading-nested-JSON-data-with-Spark-SQL-tp19310p20933.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
