Hi,
Adding the Cassandra connector alongside Spark Streaming seems to cause "issues". I've
attached my build and code files. Running "sbt compile" gives strange errors such as
"Seconds is not part of org.apache.spark.streaming" and "object Receiver is not a
member of package org.apache.spark.streaming.receiver". If I take
cassandraConnector out of the list of dependencies, "sbt compile" succeeds.
How can adding a dependency remove things from the spark streaming packages? Is
there something I can do (perhaps in sbt) to keep this from breaking?

Here's my build file:
import sbt.Keys._
import sbt._
name := "untitled99"
version := "1.0"
scalaVersion := "2.10.4"
val spark = "org.apache.spark" %% "spark-core" % "1.1.0"
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % "1.1.0"
val cassandraConnector = "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()
libraryDependencies ++= Seq(cassandraConnector, spark, sparkStreaming)
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
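For what it's worth, one workaround I'm considering (a sketch only, assuming the connector is pulling in its own transitive Spark artifacts that clash with the versions declared above; I haven't verified that this is the cause) is to exclude those transitives so my own spark-core and spark-streaming entries win:

```scala
// Sketch of a possible workaround, assuming the connector's transitive
// Spark dependencies are the conflict (unverified). The "_2.10" suffix
// matches the scalaVersion above.
val cassandraConnectorNoSpark =
  ("com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0")
    .exclude("org.apache.spark", "spark-core_2.10")
    .exclude("org.apache.spark", "spark-streaming_2.10")
```

I'd then list cassandraConnectorNoSpark in libraryDependencies in place of cassandraConnector. No idea yet if that's the right set of artifacts to exclude.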
And here's my code:
import org.apache.spark.SparkContext
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.receiver.Receiver

object Foo {
  def main(args: Array[String]) {
    val context = new SparkContext()
    val ssc = new StreamingContext(context, Seconds(2))
  }
}

// Receiver's constructor takes a StorageLevel, so pass one when extending it
class Bar extends Receiver[Int](StorageLevel.MEMORY_ONLY) {
  override def onStart(): Unit = ???
  override def onStop(): Unit = ???
}