Spark SQL (1.3.0): import sqlContext.implicits._ seems not to work for converting a case class RDD to a DataFrame

2015-03-24 Thread Zhiwei Chan
Hi all, I just upgraded Spark from 1.2.1 to 1.3.0 and changed import sqlContext.createSchemaRDD to import sqlContext.implicits._ in my code. (I scanned the programming guide, and it seems this is the only change I need to make.) But compilation now fails with the following error: [ERROR]
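
For reference, a minimal sketch of the 1.3.0 pattern is shown below. The names (Person, ToDfExample) are illustrative, and since the error message above is cut off, this only shows the general shape: the implicits are imported from a sqlContext value, and the RDD is converted with an explicit toDF() call instead of relying on the old automatic createSchemaRDD conversion.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // In 1.3.x the case class should be defined at the top level (not inside a
    // method) so that schema inference via TypeTags can see it.
    case class Person(name: String, age: Int)

    object ToDfExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("ToDfExample"))
        val sqlContext = new SQLContext(sc)
        // The implicits are tied to this particular sqlContext instance, so it
        // must be a stable val before the import.
        import sqlContext.implicits._

        val rdd = sc.parallelize(Seq(Person("alice", 30), Person("bob", 25)))
        // In 1.3.0 the conversion is no longer implicit on the RDD itself;
        // call .toDF() explicitly to get a DataFrame.
        val df = rdd.toDF()
        df.registerTempTable("people")
        sqlContext.sql("SELECT name FROM people WHERE age > 26").show()
      }
    }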

A confusing ClassNotFoundException error

2015-06-12 Thread Zhiwei Chan
Hi all, I encountered an error with Spark 1.4.0, and I have made a minimal example of it below. Both pieces of code run fine in spark-shell, but the second one fails when run with spark-submit. The only difference is that the second one uses a function literal in the map(), while the first one uses a
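
For context, here is a sketch of the contrast being described, with illustrative names (this is not the original poster's code): the same map() written once with a named function and once with a function literal. The function literal is compiled into a separate anonymous-function class (e.g. LiteralFnExample$$anonfun$1), and that class must be present on the executors' classpath; the post reports that this variant was the one that hit a ClassNotFoundException under spark-submit.

    import org.apache.spark.{SparkConf, SparkContext}

    object LiteralFnExample {
      // Named function: compiles to an ordinary method on this object.
      def double(x: Int): Int = x * 2

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("LiteralFnExample"))
        val nums = sc.parallelize(1 to 10)

        // Variant 1: map with the named function (reported to work).
        nums.map(double).collect().foreach(println)

        // Variant 2: map with a function literal; the compiler emits a separate
        // anonymous class that has to be shipped to the executors (reported to
        // fail with ClassNotFoundException when launched via spark-submit).
        nums.map(x => x * 2).collect().foreach(println)

        sc.stop()
      }
    }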