I've read through that thread, and it seems that in his case the fix was adding a particular hadoop-client dependency. However, I don't think I should need to do that, since I'm not reading from HDFS.
I'm just running a straight-up minimal example, in local mode, out of the box. Here's a minimal project that reproduces the error: https://github.com/ktham/spark-parquet-example
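For reference, the workaround described in that thread amounts to pinning hadoop-client explicitly in the build so the Hadoop 1 vs. Hadoop 2 APIs on the classpath stay consistent (the `IncompatibleClassChangeError` typically comes from `TaskAttemptContext` being a class in Hadoop 1 but an interface in Hadoop 2). A minimal sbt sketch of that fix; the version numbers here are illustrative guesses, not taken from the thread:

```scala
// build.sbt -- sketch only; versions below are assumptions, adjust to match
// your actual Spark/Hadoop setup.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "1.0.0",
  // The workaround from the thread: declare hadoop-client explicitly so a
  // single, consistent Hadoop major version ends up on the classpath.
  "org.apache.hadoop" % "hadoop-client" % "2.4.0"
)
```

My point, though, is that this shouldn't be necessary for a local-mode job that never touches HDFS.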