[jira] [Commented] (SPARK-20977) NPE in CollectionAccumulator

2018-08-28 Thread howie yu (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-20977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16594659#comment-16594659 ] howie yu commented on SPARK-20977: -- I use pyspark 2.3.1 and also have this problem, but at a different line
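
For context on what the issue title refers to, here is a minimal, hypothetical sketch of CollectionAccumulator usage, written to be pasted into spark-shell (which already provides spark and sc). The NPE reported in SPARK-20977 is raised inside CollectionAccumulator itself (the comment above notes it surfaces at a different line), so this only illustrates the API involved, not the reporter's reproduction.

{code:scala}
// Hypothetical minimal use of CollectionAccumulator; names are made up.
// Paste into spark-shell, where spark and sc are already in scope.
val seen = sc.collectionAccumulator[String]("seen-values")

sc.parallelize(Seq("a", "b", "c"), 3)
  .foreach(v => seen.add(v))   // add() runs on the executors

// value should only be read on the driver, after the action has finished
println(seen.value)
{code}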

[jira] [Comment Edited] (SPARK-19512) codegen for compare structs fails

2018-05-08 Thread howie yu (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-19512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16467403#comment-16467403 ] howie yu edited comment on SPARK-19512 at 5/8/18 1:18 PM: -- Hi I think this is

[jira] [Commented] (SPARK-19512) codegen for compare structs fails

2018-05-08 Thread howie yu (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-19512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16467403#comment-16467403 ] howie yu commented on SPARK-19512: -- Hi, I think this is a similar issue, maybe not the same. I have
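
As a rough illustration of the kind of query the issue title ("codegen for compare structs fails") describes, the following hypothetical spark-shell sketch makes whole-stage codegen compare struct-typed columns. The column names and data are invented; this is not the reporter's original reproduction.

{code:scala}
// Hypothetical sketch: build a struct column and compare it, so that the
// generated code has to implement struct equality. Paste into spark-shell.
import org.apache.spark.sql.functions.struct

val df = Seq((1, "a"), (2, "b")).toDF("id", "name")
  .withColumn("key", struct($"id", $"name"))

// A self-join on the struct column forces codegen to compare structs.
df.as("l").join(df.as("r"), $"l.key" === $"r.key").show()
{code}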

[jira] [Commented] (SPARK-10925) Exception when joining DataFrames

2018-05-08 Thread howie yu (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-10925?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16467142#comment-16467142 ] howie yu commented on SPARK-10925: -- Same issue on Spark 2.3.0. Adding a checkpoint still gives the same error
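
For readers landing here, a hedged sketch of the general shape of the query involved: joining a DataFrame with another DataFrame derived from it. The data and column names below are invented, and the comment above reports that adding a checkpoint did not avoid the error in their case.

{code:scala}
// Hypothetical spark-shell sketch of joining a DataFrame with a frame
// derived from it, the general pattern this issue is about.
val base = Seq((1, 10), (2, 20), (1, 30)).toDF("id", "value")
val derived = base.groupBy($"id").count()

// Join the original frame with the aggregate derived from it.
base.join(derived, Seq("id")).show()

// The comment above notes that checkpointing the derived frame
// (derived.checkpoint()) still produced the same error for them.
{code}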

[jira] [Commented] (SPARK-19512) codegen for compare structs fails

2018-05-06 Thread howie yu (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-19512?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16465386#comment-16465386 ] howie yu commented on SPARK-19512: -- Hi, I still have this issue in 2.3.0

[jira] [Commented] (SPARK-15384) Codegen CompileException "mapelements_isNull" is not an rvalue

2018-05-04 Thread howie yu (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-15384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16464581#comment-16464581 ] howie yu commented on SPARK-15384: -- Hi, I have a similar error on Spark 2.3.0
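
To clarify where the variable name in the title comes from: "mapelements_isNull" is the kind of variable whole-stage codegen emits for the MapElements operator, i.e. a typed Dataset.map. A minimal, hypothetical spark-shell sketch of that operation follows; it is only meant to show the operator in play, not to reproduce the reported CompileException.

{code:scala}
// Hypothetical minimal Dataset.map, pasted into spark-shell.
// Dataset.map is planned as a MapElements node, which whole-stage codegen
// compiles; the reported CompileException surfaces in that generated code.
val ds = Seq(1, 2, 3).toDS()
ds.map(_ * 2).show()
{code}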