Spark 2.0 defaults to Scala 2.11, so if you didn't build it yourself you need the 2.11 artifact for the Spark Cassandra Connector.
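For example, when launching with spark-submit or pyspark you can pull the connector from Spark Packages using the _2.11 artifact. This is just a sketch: the script name and Cassandra host below are placeholders, and the coordinates shown are for the 2.0.0-M3 milestone mentioned below (substitute a newer version if one is available):

```shell
# Match a stock Spark 2.0 build (Scala 2.11) with the _2.11 connector artifact.
# my_script.py and the connection host are placeholders for your own values.
spark-submit \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3 \
  --conf spark.cassandra.connection.host=127.0.0.1 \
  my_script.py
```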
On Wed, Sep 14, 2016 at 7:44 PM Trivedi Amit <[email protected]> wrote:

> Hi,
>
> I am testing a pyspark program that will read from a csv file and write
> data into Cassandra table. I am using pyspark with
> spark-cassandra-connector 2.10:2.0.0-M3. I am using Spark v2.0.0.
>
> While executing below command
>
> ```
> df.write.format("org.apache.spark.sql.cassandra").mode('append').options(
>     table="test_table", keyspace="test").save()
> ```
>
> I am getting
>
> ```
> py4j.protocol.Py4JJavaError: An error occurred while calling o47.save.
> : java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
>     at com.datastax.spark.connector.types.TypeConverter$.<init>(TypeConverter.scala:116)
>     at com.datastax.spark.connector.types.TypeConverter$.<clinit>(TypeConverter.scala)
>     at com.datastax.spark.connector.types.BigIntType$.converterToCassandra(PrimitiveColumnType.scala:50)
>     at com.datastax.spark.connector.types.BigIntType$.converterToCassandra(PrimitiveColumnType.scala:46)
>     at com.datastax.spark.connector.writer.SqlRowWriter$$anonfun$3.apply(SqlRowWriter.scala:18)
>     at com.datastax.spark.connector.writer.SqlRowWriter$$anonfun$3.apply(SqlRowWriter.scala:18)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
>     at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>     at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>     at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>     at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>     at com.datastax.spark.connector.writer.SqlRowWriter.<init>(SqlRowWriter.scala:18)
>     at com.datastax.spark.connector.writer.SqlRowWriter$Factory$.rowWriter(SqlRowWriter.scala:36)
>     at com.datastax.spark.connector.writer.SqlRowWriter$Factory$.rowWriter(SqlRowWriter.scala:34)
>     at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:271)
>     at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
>     at org.apache.spark.sql.cassandra.CassandraSourceRelation.insert(CassandraSourceRelation.scala:66)
>     at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:85)
>     at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:429)
>     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:211)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:237)
>     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>     at py4j.Gateway.invoke(Gateway.java:280)
>     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:128)
>     at py4j.commands.CallCommand.execute(CallCommand.java:79)
>     at py4j.GatewayConnection.run(GatewayConnection.java:211)
>     at java.lang.Thread.run(Thread.java:745)
> ```
>
> Google search for this error lead to threads where folks were talking
> about using scala version 2.10 instead of 2.11 for this issue. However, I
> am not using Scala and I am assuming that 2.10 in spark-cassandra-connector
> is Scala version.
>
> Don't know how to fix or get around this issue. Appreciate any help.
>
> Thanks
> AT
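Your assumption is right: the `2.10` in the artifact name is the Scala binary version, following the standard sbt/Maven cross-build convention of suffixing the artifact id with `_<scalaBinaryVersion>`. A throwaway helper (hypothetical, just to make the naming convention concrete):

```python
def scala_binary_version(artifact_id):
    """Extract the Scala binary version from a cross-built artifact id,
    e.g. 'spark-cassandra-connector_2.10' -> '2.10'."""
    # The suffix after the last underscore is the Scala binary version.
    return artifact_id.rsplit("_", 1)[1]

print(scala_binary_version("spark-cassandra-connector_2.10"))  # 2.10
print(scala_binary_version("spark-cassandra-connector_2.11"))  # 2.11
```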
