Francisco Guerrero created CASSANDRA-19223:
----------------------------------------------

             Summary: [Analytics] Column type mapping error for timestamp type during bulk writes
                 Key: CASSANDRA-19223
                 URL: https://issues.apache.org/jira/browse/CASSANDRA-19223
             Project: Cassandra
          Issue Type: Bug
          Components: Analytics Library
            Reporter: Francisco Guerrero
            Assignee: Francisco Guerrero


When doing bulk reads with the analytics library, a user can request the last 
modified timestamp column as an option, and the bulk reader adds that column to 
the resulting data frame. If the user then persists the bulk-read data frame 
with the bulk writer and feeds the last-modified column into the 
{{WriterOptions.TIMESTAMP}} feature, the bulk write fails with a data type 
mapping error.
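
For reference, a minimal Java/Spark sketch of the flow described above. The data source and sink class names, the {{lastModifiedColumnName}} reader option, and the {{TIMESTAMP}} writer option key are illustrative assumptions and may not match the exact API surface:

{code:java}
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BulkReadThenWrite
{
    public static void main(String[] args)
    {
        SparkSession spark = SparkSession.builder().getOrCreate();

        // Bulk read with the last-modified-timestamp column enabled.
        // The data source class and option key below are illustrative assumptions.
        Dataset<Row> df = spark.read()
                               .format("org.apache.cassandra.spark.sparksql.CassandraDataSource")
                               .option("keyspace", "ks")
                               .option("table", "tbl")
                               .option("lastModifiedColumnName", "last_modified")
                               .load();

        // Bulk write that reuses the last-modified column as the write timestamp.
        // The sink class and "TIMESTAMP" option key are likewise illustrative.
        // The added column arrives as java.sql.Timestamp, which the LONG converter
        // rejects, producing the stack trace below.
        df.write()
          .format("org.apache.cassandra.spark.sparksql.CassandraDataSink")
          .option("keyspace", "ks")
          .option("table", "tbl_copy")
          .option("TIMESTAMP", "last_modified")
          .mode("append")
          .save();
    }
}
{code}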

{code:java}
Caused by: java.lang.RuntimeException: Unsupported conversion for LONG from java.sql.Timestamp
        at org.apache.cassandra.spark.bulkwriter.SqlToCqlTypeConverter$LongConverter.convertInternal(SqlToCqlTypeConverter.java:245)
        at org.apache.cassandra.spark.bulkwriter.SqlToCqlTypeConverter$LongConverter.convertInternal(SqlToCqlTypeConverter.java:231)
        at org.apache.cassandra.spark.bulkwriter.SqlToCqlTypeConverter$Converter.convert(SqlToCqlTypeConverter.java:203)
        at org.apache.cassandra.spark.bulkwriter.SqlToCqlTypeConverter$NullableConverter.convert(SqlToCqlTypeConverter.java:212)
        at org.apache.cassandra.spark.bulkwriter.TableSchema.normalize(TableSchema.java:91)
        at org.apache.spark.api.java.JavaPairRDD$.$anonfun$toScalaFunction$1(JavaPairRDD.scala:1070)
        at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
{code}
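
One direction the fix could take, sketched here as an assumption rather than the actual patch: flatten {{java.sql.Timestamp}} values to microseconds since the epoch (the resolution used for CQL write timestamps) before handing them to the LONG path. The class and method names below are hypothetical:

{code:java}
import java.sql.Timestamp;
import java.util.concurrent.TimeUnit;

// Illustrative helper only; not part of SqlToCqlTypeConverter today.
public final class TimestampToMicros
{
    private TimestampToMicros()
    {
    }

    public static long toEpochMicros(Timestamp timestamp)
    {
        long millis = timestamp.getTime();                    // whole milliseconds since epoch
        long fractionMicros = timestamp.getNanos() / 1_000L;  // sub-second fraction in microseconds
        // getTime() already contains the millisecond part of getNanos(),
        // so keep only the sub-millisecond remainder to avoid double counting.
        long subMillisMicros = fractionMicros % 1_000L;
        return TimeUnit.MILLISECONDS.toMicros(millis) + subMillisMicros;
    }
}
{code}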



