Hi, I am using the Stratio spark-mongodb library to get MongoDB working with Spark, but I get the following error:
java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.ScalaReflection

This is my code:

---------------------------------------------------------------------------------------
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class Test {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext("local[*]", "test spark-mongodb java");
        SQLContext sqlContext = new SQLContext(sc);

        Map<String, String> options = new HashMap<>();
        options.put("host", "xyz.mongolab.com:59107");
        options.put("database", "heroku_app3525385");
        options.put("collection", "datalog");
        options.put("credentials", "*****,****,****");

        DataFrame df = sqlContext.read()
                .format("com.stratio.datasource.mongodb")
                .options(options)
                .load();
        df.registerTempTable("datalog");
        df.show();
    }
}
---------------------------------------------------------------------------------------

My pom file is as follows:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-catalyst_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>com.stratio.datasource</groupId>
        <artifactId>spark-mongodb_2.11</artifactId>
        <version>0.10.3</version>
    </dependency>
</dependencies>

Regards