Getting this on the home machine as well. If I don't reference the spark cassandra
connector in libraryDependencies, it compiles.
I've recently updated IntelliJ to 14. Could that be causing an issue?
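Is there a way to see what's actually ending up on the compile classpath? I'm
guessing something like this from the sbt shell (assuming sbt 0.13 syntax) would
show whether the connector pulls in a second copy of spark-streaming:

show compile:externalDependencyClasspath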

From: as...@live.com
To: yuzhih...@gmail.com
CC: user@spark.apache.org
Subject: RE: Adding Spark Cassandra dependency breaks Spark Streaming?
Date: Fri, 5 Dec 2014 19:24:46 +0000

Sorry... I really don't have enough Maven know-how to do this quickly. I tried the
pom below, and IntelliJ could find org.apache.spark.streaming.StreamingContext
and org.apache.spark.streaming.Seconds, but not
org.apache.spark.streaming.receiver.Receiver. Is there something specific I can
try? I'll try sbt on the home machine in a couple of hours.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>untitled100</groupId>
    <artifactId>untitled100</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
    </dependencies>

</project>
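Would the output of something like

mvn dependency:tree -Dincludes=org.apache.spark

against this pom help narrow down whether the connector pulls in a conflicting
Spark artifact? I'm assuming dependency:tree is the right goal here; as I said,
my Maven knowledge is thin.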

Date: Fri, 5 Dec 2014 10:58:51 -0800
Subject: Re: Adding Spark Cassandra dependency breaks Spark Streaming?
From: yuzhih...@gmail.com
To: as...@live.com
CC: user@spark.apache.org

Can you try with maven?
diff --git a/streaming/pom.xml b/streaming/pom.xml
index b8b8f2e..6cc8102 100644
--- a/streaming/pom.xml
+++ b/streaming/pom.xml
@@ -68,6 +68,11 @@
       <artifactId>junit-interface</artifactId>
       <scope>test</scope>
     </dependency>
+    <dependency>
+      <groupId>com.datastax.spark</groupId>
+      <artifactId>spark-cassandra-connector_2.10</artifactId>
+      <version>1.1.0</version>
+    </dependency>
   </dependencies>
   <build>
     <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>

You can use the following command:

mvn -pl core,streaming package -DskipTests
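If the streaming module still builds cleanly with the connector added, that would
suggest the conflict is in your project's dependency resolution rather than in
Spark itself.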

Cheers
On Fri, Dec 5, 2014 at 9:35 AM, Ashic Mahtab <as...@live.com> wrote:

Hi,
It seems that adding the cassandra connector and spark streaming together causes
"issues". I've included my build and code files below. Running "sbt compile"
gives weird errors like "Seconds is not part of org.apache.spark.streaming" and
"object Receiver is not a member of package org.apache.spark.streaming.receiver".
If I take cassandraConnector out of the list of dependencies, "sbt compile"
succeeds.
How can adding a dependency remove things from the spark streaming packages? Is
there something I can do (perhaps in sbt) to stop this breaking?

Here's my build file:
import sbt.Keys._
import sbt._

name := "untitled99"

version := "1.0"

scalaVersion := "2.10.4"

val spark = "org.apache.spark" %% "spark-core" % "1.1.0"

val sparkStreaming = "org.apache.spark" %% "spark-streaming" % "1.1.0"

val cassandraConnector = "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()

libraryDependencies ++= Seq(
  cassandraConnector,
  spark,
  sparkStreaming
)

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
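One thing I haven't tried yet is excluding the connector's transitive Spark
dependencies so that only the versions pinned above are used. Something like
this (untested, using sbt's standard exclusion rules) is what I have in mind:

val cassandraConnector = "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" excludeAll ExclusionRule(organization = "org.apache.spark")

Would that be a sane workaround, or does the connector genuinely need its own
Spark artifacts?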
And here's my code:
import org.apache.spark.SparkContext
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.receiver.Receiver

object Foo {
  def main(args: Array[String]) {
    val context = new SparkContext()
    val ssc = new StreamingContext(context, Seconds(2))
  }
}

// Receiver's constructor takes the StorageLevel for received data;
// MEMORY_ONLY is assumed here.
class Bar extends Receiver[Int](StorageLevel.MEMORY_ONLY) {
  override def onStart(): Unit = ???
  override def onStop(): Unit = ???
}