It's fixed now; adding the dependency to pom.xml resolved it:

<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector-embedded_2.10</artifactId>
  <version>1.4.0-M1</version>
</dependency>
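For reference, the `-embedded` artifact is the test/embedded-Cassandra module; the package `com.datastax.spark.connector._` itself comes from the main connector artifact. A sketch of that dependency, assuming Scala 2.10 to match the `_2.10` suffix above:

```xml
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.10</artifactId>
  <version>1.4.0-M1</version>
</dependency>
```

The suffix after the underscore must match the Scala version your project is compiled with, which is what the sbt `%%` operator handles automatically.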



On Mon, Jun 22, 2015 at 10:46 AM, Koen Vantomme <koen.vanto...@gmail.com>
wrote:

> Hello,
>
> I'm writing an application in Scala to connect to Cassandra to read the
> data.
> My setup is IntelliJ with Maven. When I try to compile the application I
> get the following errors:
>
> error: object datastax is not a member of package com
> error: value cassandraTable is not a member of org.apache.spark.SparkContext
>
> libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0-M1"
>
>
>
> import com.datastax.spark.connector._
> import org.apache.spark.{SparkConf, SparkContext}
>
> object ReadCassandra {
>   def main(args: Array[String]): Unit = {
>
>     val conf =  new SparkConf()
>       .setAppName("Streaming Test")
>       .set("spark.executor.memory", "1g")
>       .set("spark.cassandra.connection.host", "ns6512097.ip-37-187-69.eu")
>       .set("spark.cassandra.auth.username","cassandra")
>       .set("spark.cassandra.auth.password","cassandra")
>
>     val sc = new SparkContext(conf)
>     val rdd_cassandra = sc.cassandraTable("tutorial","user")
>
>     // read the data
>     rdd_cassandra.collect()
>   }
> }
