Nested Parquet is not supported in 1.0, but support is part of the upcoming
1.0.1 release. The "Unsupported datatype StructType(...)" error below is the
1.0 Parquet writer rejecting the nested LessTrivial column.
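
Until then, one workaround on 1.0 is to flatten the nested structure into a
single-level case class before converting to a SchemaRDD. A minimal sketch,
reusing the Trivial/LessTrivial classes from your code below (the FlatTrivial
class and the output path are made up for illustration):

  case class FlatTrivial(trivial: String, i: Int)

  // Flatten each nested Trivial into a single-level record, then convert
  // to a SchemaRDD and save; with only primitive columns, 1.0 can write it.
  val flat = sqlContext.createSchemaRDD(
    sc.parallelize(Seq(Trivial("s", LessTrivial(1)), Trivial("T", LessTrivial(2))))
      .map(t => FlatTrivial(t.trivial, t.lt.i)))
  flat.saveAsParquetFile("trivial_flat.parquet")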


On Thu, Jun 26, 2014 at 3:03 PM, anthonyjschu...@gmail.com <
anthonyjschu...@gmail.com> wrote:

> Hello all:
> I am attempting to persist a Parquet file composed of a SchemaRDD of
> nested case classes...
>
> Creating a SchemaRDD object seems to work fine, but an exception is thrown
> when I attempt to persist this object to a Parquet file...
>
> my code:
>
>   import org.apache.spark.{SparkConf, SparkContext}
>
>   case class LessTrivial(i: Int = 1)
>   case class Trivial(trivial: String = "trivial", lt: LessTrivial)
>
>   val conf = new SparkConf()
>     .setMaster("local[1]")
>     .setAppName("test")
>
>   val sc = new SparkContext(conf)
>   val sqlContext = new org.apache.spark.sql.SQLContext(sc)
>
>   import sqlContext._
>
>   // Converting to a SchemaRDD works; the schema is inferred from the
>   // case classes without error.
>   val rdd = sqlContext.createSchemaRDD(sc.parallelize(
>     Seq(Trivial("s", LessTrivial(1)), Trivial("T", LessTrivial(2)))))
>
>   // Writing fails with:
>   // java.lang.RuntimeException: Unsupported datatype
>   //   StructType(List(StructField(i,IntegerType,true)))
>   rdd.saveAsParquetFile("trivial.parquet1")
>
>
> Is persisting SchemaRDDs containing nested case classes supported for
> Parquet files?
>
>
>
