Hello all,
I'm using Spark 1.2 with the Spark Cassandra connector 1.2.3, and I'm trying to update some rows of a table.

example: 
CREATE TABLE myTable (
    a text,
    b text,
    c text,
    date timestamp,
    d text,
    e text static,
    f text static,
    PRIMARY KEY ((a, b, c), date, d)
) WITH CLUSTERING ORDER BY (date ASC, d ASC);

import com.datastax.spark.connector._   // brings in cassandraTable / saveToCassandra
import org.joda.time.DateTime           // assuming the joda DateTime type here

// read the 7 selected columns into a 7-tuple (the tuple arity has to match the select list)
val interactions = sc.cassandraTable[(String, String, String, DateTime, String, String, String)]("keySpace", "myTable")
  .select("a", "b", "c", "date", "d", "e", "f")
// rows where the static column "e" is null
val empty = interactions.filter(r => r._6 == null).cache()
empty.count()

I count the number of rows where "e" is null and then replace "e" with the value of "b":

// rebuild the full 7-column tuple: write the value of "b" into the static column "e", keep "f" as read
val update_inter = empty.map(r => (r._1, r._2, r._3, r._4, r._5, r._2, r._7))
update_inter.saveToCassandra("keySpace", "myTable",
  SomeColumns("a", "b", "c", "date", "d", "e", "f"))

This works when I check in cqlsh, but I still get null for those rows when I read them back through the Spark Cassandra connector.
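
For reference, the re-read on the Spark side looks roughly like this (a fresh cassandraTable call, not the cached "empty" RDD; keyspace, table and column names as above):

// re-read the same table with a new RDD (same SparkContext and imports as above)
val recheck = sc.cassandraTable[(String, String, String, DateTime, String, String, String)]("keySpace", "myTable")
  .select("a", "b", "c", "date", "d", "e", "f")
recheck.filter(r => r._6 == null).count()   // still non-zero, i.e. "e" still comes back as null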

Is this a bug in the Spark Cassandra connector? Thanks for your help.

