You need to use the 2.0.0-M2-s_2.11 artifact, since Spark 2.0 is compiled
against Scala 2.11 by default.
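For pyspark the connector is usually pulled in at launch time; a minimal sketch, assuming the spark-packages coordinate for that artifact and a Cassandra node on localhost (both are assumptions to adjust for your setup):

```shell
# Launch pyspark with the Scala 2.11 build of the connector.
# datastax:spark-cassandra-connector:2.0.0-M2-s_2.11 is the spark-packages
# form of the coordinate; the Maven-style equivalent is
# com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M2.
pyspark \
  --packages datastax:spark-cassandra-connector:2.0.0-M2-s_2.11 \
  --conf spark.cassandra.connection.host=127.0.0.1
```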
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Write-to-Cassandra-table-from-pyspark-fails-with-scala-reflect-error-tp27723p27729.html
Sent from the Apache Spark User mailing list archive at Nabble.com.
> any conflicts with cached or older
> versions. SPARK_HOME is the only Spark- or Python-related environment
> variable I have set.
>
> --
> *From:* Russell Spitzer <russell.spit...@gmail.com>
> *To:* Trivedi Amit <amit_...@yahoo.com>; "user@spark.apache.org"
> <user@spark.apache.org>
Sent: Thursday, September 15, 2016 9:47 AM
Subject: Re: Write to Cassandra table from pyspark fails with scala reflect error
Thanks Russell. I didn't build this myself. I tried the Scala 2.11 artifacts
com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M(1-3) and I am getting
From: Russell Spitzer <russell.spit...@gmail.com>
To: Trivedi Amit <amit_...@yahoo.com>; "user@spark.apache.org"
<user@spark.apache.org>
Sent: Wednesday, September 14, 2016 11:24 PM
Subject: Re: Write to Cassandra table from pyspark fails with scala reflect error
Spark 2.0 defaults to Scala 2.11, so if you didn't build it yourself, you
need the 2.11 artifact for the Spark Cassandra Connector.
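The `_2.10`/`_2.11` suffix on the artifact name is the Scala version the jar was built against, and it has to match the Scala build of your Spark distribution or class loading fails with scala.reflect errors. A sketch of the matching invocation for a stock Spark 2.0 download (`my_job.py` is a hypothetical script name):

```shell
# Wrong for a stock Spark 2.0 download (that distribution ships Scala 2.11,
# so the _2.10 jar is binary-incompatible):
#   com.datastax.spark:spark-cassandra-connector_2.10:2.0.0-M3
# Matching Scala 2.11 artifact:
spark-submit \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3 \
  my_job.py
```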
On Wed, Sep 14, 2016 at 7:44 PM Trivedi Amit wrote:
Hi,
I am testing a pyspark program that will read from a csv file and write data
into a Cassandra table. I am using pyspark with
spark-cassandra-connector_2.10:2.0.0-M3 on Spark v2.0.0.
While executing the below command
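The message is cut off before the failing command, so as context only, here is a minimal sketch of the kind of csv-to-Cassandra write being attempted. The keyspace `ks`, table `kv`, file name, and host are all hypothetical, and it assumes a SparkSession launched with the connector on the classpath:

```python
# Sketch only: requires pyspark plus the spark-cassandra-connector jar
# (e.g. via --packages) and a live Cassandra node; it will not run standalone.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("csv-to-cassandra")
         .config("spark.cassandra.connection.host", "127.0.0.1")
         .getOrCreate())

# Read the csv with a header row; column names must match the Cassandra table.
df = spark.read.csv("data.csv", header=True)

# Write through the connector's DataFrame source.
(df.write
   .format("org.apache.spark.sql.cassandra")
   .options(table="kv", keyspace="ks")
   .mode("append")
   .save())
```

With a Scala-version mismatch between Spark and the connector jar, the `save()` step is typically where the scala.reflect error surfaces.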