Michał Matłoka created SPARK-14751:
--------------------------------------

             Summary: SparkR fails on Cassandra map with numeric key
                 Key: SPARK-14751
                 URL: https://issues.apache.org/jira/browse/SPARK-14751
             Project: Spark
          Issue Type: Bug
          Components: SparkR
    Affects Versions: 1.6.1
            Reporter: Michał Matłoka


Hi,
I originally created an issue for the Spark Cassandra Connector (
https://datastax-oss.atlassian.net/projects/SPARKC/issues/SPARKC-366 ), but
after a bit of digging it seems this is a better place for it:

{code}
CREATE TABLE test.map (
    id text,
    somemap map<tinyint, decimal>,
    PRIMARY KEY (id)
);

insert into test.map(id, somemap) values ('a', { 0 : 12 }); 
{code}
{code}
  sqlContext <- sparkRSQL.init(sc)
  test <- read.df(sqlContext, source = "org.apache.spark.sql.cassandra",
                  keyspace = "test", table = "map")
  head(test)
{code}
Results in:
{code}
16/04/19 14:47:02 ERROR RBackendHandler: dfToCols on 
org.apache.spark.sql.api.r.SQLUtils failed
Error in readBin(con, raw(), stringLen, endian = "big") :
  invalid 'n' argument
{code}

The problem occurs even with an int key; with a text key it works. Every
scenario works fine under Scala & Python.
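A possible explanation (an assumption on my side, not verified against the SerDe code): SparkR appears to represent map columns as R environments on the R side, and environments only accept character keys, so a map keyed by tinyint/int may fail during deserialization. A minimal sketch of the R-side constraint:

{code}
# Sketch of the suspected limitation (assumption: SparkR uses R environments
# to represent map columns; environments require character keys).
e <- new.env()
assign("0", 12, envir = e)   # character key: works
get("0", envir = e)          # returns 12
# assign(0, 12, envir = e)   # numeric key: error "invalid first argument"
{code}

If that is the cause, the fix would presumably need to stringify non-character map keys during serialization rather than failing.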



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
