chandu-1101 commented on issue #8333:
URL: https://github.com/apache/iceberg/issues/8333#issuecomment-1681814454

   Any update? I still get the exception:
   
   ```
   scala>     val _cdcDf =sess.sql(""" select *
        |                              from cdc c1
        |                              where cdc_pk in (
        |                                                 select max(cdc_pk)
        |                                                 from cdc c2
        |                                                 where _id.oid is not 
null
        |                                                   and _id.oid !=''
        |                                                   and 
c2.__created_date_=c1.__created_date_
        |                                                 group by _id.oid)  
""")
   _cdcDf: org.apache.spark.sql.DataFrame = [cdc_pk: string, cdc_oid: string 
... 174 more fields]
   
   scala>     _cdcDf.registerTempTable("_cdc")
   
   scala>  sess.sql(""" MERGE INTO x11  t
        |                 using (
        |                         select *
        |                         from _cdc )u
        |                 on t._id.oid = u._id.oid
        |                   when matched then update set *
        |                   when not matched then insert * """)
   23/08/17 07:00:37 WARN HiveConf: HiveConf of name hive.server2.thrift.url does not exist
   23/08/17 07:12:45 WARN TaskSetManager: Lost task 0.0 in stage 6.0 (TID 1764) (ip-172-25-26-76.prod.x.local executor 8): java.lang.ClassCastException: org.apache.spark.unsafe.types.UTF8String cannot be cast to java.lang.Long
        at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:107)
        at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getLong(rows.scala:42)
        at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getLong$(rows.scala:42)
        at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getLong(rows.scala:195)
        at org.apache.spark.sql.catalyst.expressions.JoinedRow.getLong(JoinedRow.scala:95)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields_32_62$(Unknown Source)
        at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
        at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.addMetadataColumnsIfNeeded(FileScanRDD.scala:291)
        at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.next(FileScanRDD.scala:318)
        at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
        at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
        at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
        at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
        at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:184)
        at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
        at org.apache.spark.scheduler.Task.run(Task.scala:138)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1516)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
   ```
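
   The UTF8String-to-Long cast suggests that the MERGE source and the target table disagree on the type of at least one column (string on one side, long/bigint on the other). As a minimal diagnostic sketch, assuming the same `sess` session and the `_cdc` view / `x11` table from the snippet above, the two schemas could be diffed to locate the offending column:

   ```scala
   // Hypothetical check: compare the MERGE source schema with the target table
   // schema and report any columns whose Spark data types differ or are missing.
   val sourceFields = sess.table("_cdc").schema.fields.map(f => f.name -> f.dataType).toMap
   val targetFields = sess.table("x11").schema.fields.map(f => f.name -> f.dataType).toMap

   targetFields.foreach { case (name, targetType) =>
     sourceFields.get(name) match {
       case Some(sourceType) if sourceType != targetType =>
         println(s"type mismatch on '$name': source=$sourceType, target=$targetType")
       case None =>
         println(s"column '$name' is missing from the MERGE source")
       case _ => // types agree, nothing to report
     }
   }
   ```

   If a column turns out to be string in the source but long in the target (or vice versa), casting it explicitly in the source query before running the MERGE should avoid the codegen cast failure.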

