Hi all: I am attempting a simple test of SparkSQL's ability to persist data to Parquet files...
My code is:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("local[1]")
      .setAppName("test")
    implicit val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext._

    case class Trivial(trivial: String = "trivial")

    val rdd = sc.parallelize(Seq(Trivial("s"), Trivial("T")))
    rdd.saveAsParquetFile("trivial.parquet")

When this code executes, a trivial.parquet directory is created, along with a _temporary subdirectory, but neither contains any data -- only directories, no files. Is there an obvious mistake in my code that would cause this to fail?

Thank you--
Tony

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-saveAsParquetFile-tp8375.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.