After digging into the source code where the error happens ( https://github.com/apache/carbondata/blob/master/integration/spark-common/src/main/scala/org/apache/spark/sql/optimizer/CarbonDecoderOptimizerHelper.scala#L87 ) and comparing spark1.6/spark2.1.x with spark2.2.0, I noticed that InsertIntoTable in spark1.6 and spark2.1.x has a child LogicalPlan attribute, while spark2.2.0 renamed it to query. But CarbonDecoderOptimizerHelper sits in the spark-common module, which means spark1.6/spark2.1 and spark2.2 all share this class. So I changed the code to i.query and rebuilt with spark2.2.0, and it builds fine now. Should this code be version compatible, or version independent?
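By the way, one version-independent workaround might be to avoid naming the renamed field at all. Just a sketch I haven't verified against all three versions (the object and method names here are mine): InsertIntoTable exposes its single input plan through the generic children sequence in spark1.6/spark2.1.x (child :: Nil) as well as in spark2.2.0 (query :: Nil), so spark-common could go through that instead:

    import org.apache.spark.sql.catalyst.plans.logical.{InsertIntoTable, LogicalPlan}

    // Sketch (untested): fetch the input plan of InsertIntoTable without
    // touching the field that was renamed from `child` (spark1.6/2.1.x)
    // to `query` (spark2.2.0). In all of these versions InsertIntoTable
    // has exactly one child plan, so children.head picks it up.
    object InsertIntoTableCompat {
      def inputPlan(i: InsertIntoTable): LogicalPlan = i.children.head
    }

With that, the match case at line 87 could read "case i: InsertIntoTable => process(InsertIntoTableCompat.inputPlan(i), nodeList)" and the same spark-common source should compile under both profiles.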
2017-08-03 14:52 GMT+08:00 john cheng <[email protected]>:
> If I build carbondata with spark2.1.x, it works. But our spark version is
> spark2.2.0. If I use the jar built with spark2.1.x and execute it on
> spark2.2.0, there are still errors when creating CarbonSession; the error
> is: ClassNotFoundException: o.a.s.sql.hive.HiveSessionState
>
> 2017-08-03 14:45 GMT+08:00 john cheng <[email protected]>:
>
>> Hi carbon guys, carbondata does not seem to support spark2.2.0 at the
>> moment. I added spark2.2 as a new profile and built like this:
>> mvn -DskipTests -Pspark-2.2 -Dspark.version=2.2.0 -Dhadoop.version=2.6.0 clean package
>>
>> But there are errors in the spark-common module:
>>
>> [ERROR] /Users/zhengqh/Github/carbondata-parent-1.1.1/integration/spark-common/src/main/scala/org/apache/spark/sql/optimizer/CarbonDecoderOptimizerHelper.scala:87: error: value child is not a member of org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable
>> [INFO] case i: InsertIntoTable => process(i.child, nodeList)
>> [INFO]                                      ^
>> [WARNING] 11 warnings found
>> [ERROR] one error found
>>
>> Do you guys plan to support spark2.2.0, or should I downgrade to
>> spark2.1.x for now?
