Paride Casulli created SPARK-46362:
--
Summary: calculation error
Key: SPARK-46362
URL: https://issues.apache.org/jira/browse/SPARK-46362
Project: Spark
Issue Type: Bug
Components:
Paride Casulli created SPARK-45908:
--
Summary: write empty parquet file while using partitioned write
Key: SPARK-45908
URL: https://issues.apache.org/jira/browse/SPARK-45908
Project: Spark
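The SPARK-45908 summary describes an empty parquet file appearing during a partitioned write. A minimal sketch of the write pattern in question, assuming a local SparkSession; the data, column names, and output path are hypothetical, not taken from the report:

```scala
import org.apache.spark.sql.SparkSession

object PartitionedWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("partitioned-write-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical data: the "country" column drives the directory layout.
    val df = Seq(("alice", "IT"), ("bob", "FR")).toDF("name", "country")

    // Partitioned write: one subdirectory per distinct value of "country".
    // The report says that under some conditions an empty parquet file is
    // also emitted alongside the partition directories.
    df.write
      .partitionBy("country")
      .mode("overwrite")
      .parquet("/tmp/partitioned-write-sketch")

    spark.stop()
  }
}
```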
[ https://issues.apache.org/jira/browse/SPARK-20852?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Paride Casulli closed SPARK-20852.
--
Resolution: Duplicate
duplicate issue of https://issues.apache.org/jira/browse/SPARK-18528
[ https://issues.apache.org/jira/browse/SPARK-20852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16020994#comment-16020994 ]
Paride Casulli commented on SPARK-20852:
Thank you Sean, you're right; without limit it works fine
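The comment above indicates the NullPointerException in SPARK-20852 appeared only when a limit preceded the distinct on a Dataset. A minimal sketch of the two patterns the exchange contrasts, assuming a local SparkSession; the sample data is hypothetical:

```scala
import org.apache.spark.sql.SparkSession

object LimitDistinctSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("limit-distinct-sketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val ds = Seq(1, 2, 2, 3, 3, 3).toDS()

    // The pattern reported as failing at the time: limit followed by distinct.
    val limitedDistinct = ds.limit(4).distinct()
    limitedDistinct.show()

    // The pattern reported as working: distinct without the preceding limit.
    val plainDistinct = ds.distinct()
    plainDistinct.show()

    spark.stop()
  }
}
```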
Paride Casulli created SPARK-20852:
--
Summary: NullPointerException on distinct (Dataset)
Key: SPARK-20852
URL: https://issues.apache.org/jira/browse/SPARK-20852
Project: Spark
Issue Type:
[
https://issues.apache.org/jira/browse/SPARK-20158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15949643#comment-15949643
]
Paride Casulli commented on SPARK-20158:
Hi,
our code uses spark streaming for copying rows from
Paride Casulli created SPARK-20158:
--
Summary: crash in Spark SQL insert in partitioned Hive tables
Key: SPARK-20158
URL: https://issues.apache.org/jira/browse/SPARK-20158
Project: Spark
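The SPARK-20158 summary describes a crash when Spark SQL inserts into a partitioned Hive table. A minimal sketch of that operation, assuming Hive support is enabled; the table name, columns, and partition value are hypothetical, not taken from the report:

```scala
import org.apache.spark.sql.SparkSession

object PartitionedHiveInsertSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("partitioned-hive-insert-sketch")
      .master("local[*]")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical partitioned Hive table.
    spark.sql(
      """CREATE TABLE IF NOT EXISTS events (id INT, payload STRING)
        |PARTITIONED BY (dt STRING)
        |STORED AS PARQUET""".stripMargin)

    // The kind of statement the report describes: a Spark SQL insert
    // targeting a specific partition of a Hive table.
    spark.sql(
      "INSERT INTO TABLE events PARTITION (dt = '2017-03-30') VALUES (1, 'a')")

    spark.stop()
  }
}
```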