java.lang.NoClassDefFoundError: scala/runtime/AbstractPartialFunction$mcJL$sp

2017-07-23 Thread Kaushal Shriyan
I am facing an issue while connecting Apache Spark to the Apache Cassandra datastore.
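For context, this error usually points to a Scala binary-version mismatch: `AbstractPartialFunction$mcJL$sp` is a specialized runtime class, and a spark-cassandra-connector jar built against a different Scala version than the Spark runtime will fail to link at class-load time. A hedged sketch of the usual check and fix follows; the connector coordinates, version numbers, host, and application name below are illustrative assumptions, not details taken from this thread:

```shell
# Check which Scala binary version this Spark build reports.
spark-shell --version 2>&1 | grep -i scala

# Then pull a spark-cassandra-connector artifact built for the SAME
# Scala binary version. The "_2.11" suffix and "2.0.3" version here
# are example values - match them to the version reported above and
# to your Spark release.
spark-submit \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.3 \
  --conf spark.cassandra.connection.host=127.0.0.1 \
  my_app.py
```

If the suffix on the connector artifact (e.g. `_2.10` vs `_2.11`) does not match Spark's own Scala version, a `NoClassDefFoundError` like the one above is the typical symptom.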

Re: Failed to find Spark jars directory

2017-07-20 Thread Kaushal Shriyan
On Thu, Jul 20, 2017 at 7:51 PM, ayan guha wrote:
> It depends on your need. There are clear instructions around how to run
> mvn with specific Hive and Hadoop bindings. However, if you are starting
> out, I suggest you use the prebuilt ones.

Hi Ayan, I am setting up Apache Spark with Cassandra
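The "specific hive and hadoop bindings" mentioned above refer to Maven build profiles. A minimal sketch of such a source build, assuming the Spark 2.2.0 source tree is unpacked at /opt/spark-2.2.0 (the profile names follow the Spark 2.x build documentation; adjust the Hadoop profile to your cluster's version):

```shell
# Build Spark from source with YARN, Hive, and a specific Hadoop
# profile. -DskipTests avoids running the (long) test suite.
cd /opt/spark-2.2.0
./build/mvn -Pyarn -Phadoop-2.7 -Phive -Phive-thriftserver \
  -DskipTests clean package
```

This populates the jars directory that start-master.sh later looks for. Note the build can take well over an hour on modest hardware, which is why a prebuilt package is recommended for newcomers.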

Re: Failed to find Spark jars directory

2017-07-20 Thread Kaushal Shriyan
On Thu, Jul 20, 2017 at 7:42 PM, ayan guha wrote:
> You should download a pre-built version. The code you have got is source
> code; you need to build it to generate the jar files.

Hi Ayan, can you please help me understand how to build to generate the jar files? Regards, Kaushal
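The simpler route suggested in the reply above is to fetch a pre-built package rather than build from source. A sketch, assuming Spark 2.2.0 with the Hadoop 2.7 build; the archive URL and Hadoop variant are example values, so pick the matching package from the Spark downloads page:

```shell
# Download and unpack a pre-built Spark distribution (no compile
# step needed - it already contains the jars directory).
curl -O https://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz
tar -xzf spark-2.2.0-bin-hadoop2.7.tgz -C /opt

# start-master.sh now finds its jars and starts immediately.
/opt/spark-2.2.0-bin-hadoop2.7/sbin/start-master.sh
```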

Failed to find Spark jars directory

2017-07-20 Thread Kaushal Shriyan
Hi, I have downloaded spark-2.2.0.tgz on CentOS 7.x, and when I invoke /opt/spark-2.2.0/sbin/start-master.sh, I get *Failed to find Spark jars directory (/opt/spark-2.2.0/assembly/target/scala-2.10/jars). You need to build Spark with the target "package" before running this program.* I am
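This error means the downloaded spark-2.2.0.tgz is the source distribution, which ships no compiled jars; the replies above cover the two fixes (build it, or download a pre-built package). A quick way to tell which kind of tree you have is to look for a populated jars directory, which only pre-built distributions contain (the /opt/spark-2.2.0 path below just follows the path in this message):

```shell
# A pre-built distribution ships jars under ./jars; a source
# download does not. Quick check:
ls /opt/spark-2.2.0/jars 2>/dev/null || echo "no jars - this is a source download"
```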