Hi,

I am trying to read a Cassandra table as a DataFrame but hit the issue below:

spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.3.0 \
  --conf spark.cassandra.connection.host=******

val df = sqlContext.read.
     | format("org.apache.spark.sql.cassandra").
     | options(Map( "table" -> "****", "keyspace" -> "***")).
     | load()
java.util.NoSuchElementException: key not found: c_table
        at scala.collection.MapLike$class.default(MapLike.scala:228)
        at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.default(ddl.scala:151)
        at scala.collection.MapLike$class.apply(MapLike.scala:141)
        at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.apply(ddl.scala:151)
        at org.apache.spark.sql.cassandra.DefaultSource$.TableRefAndOptions(DefaultSource.scala:120)
        at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:56)
        at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
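For what it's worth, the NoSuchElementException itself is just Scala's plain Map.apply firing on an absent key inside the connector's option handling. A minimal sketch of that behavior in plain Scala (not the connector itself; opts and the looked-up key are stand-ins):

```scala
// Sketch: Map.apply throws NoSuchElementException for a missing key,
// which is the shape of the "key not found: c_table" failure above.
val opts = Map("table" -> "t", "keyspace" -> "k")

val result =
  try {
    Some(opts("c_table")) // hypothetical lookup of a key the caller never supplied
  } catch {
    case _: NoSuchElementException => None // surfaces as "key not found: c_table"
  }
```

So the options Map I pass in is fine by itself; the failure happens when something downstream looks up a key that was never set.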


When I use

sc.cassandraTable(keyspace, tablename)

it goes fine, but when I run an action it throws plenty of errors, for example:

com.datastax.driver.core.exceptions.InvalidQueryException: unconfigured
columnfamily size_estimates
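I understand the cassandraTable call alone succeeding: like any Spark RDD, it is lazy, so nothing touches the cluster until an action runs. A minimal laziness sketch in plain Scala (no cluster involved; the lazy val stands in for the deferred read):

```scala
// Sketch: defining a lazy value does no work; forcing it (the "action") does.
var touched = false
lazy val rows = { touched = true; Seq(1, 2, 3) } // stand-in for the deferred Cassandra read

val beforeAction = touched // still false: nothing has contacted the cluster yet
val total = rows.sum       // forcing the value plays the role of a Spark action
val afterAction = touched  // true: a connection error would only surface here
```

Which is why the errors only appear once I call an action on the RDD.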

-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து" ("Shun bribery; hold your head high")
