Jason Hubbard created SPARK-6067:
------------------------------------
Summary: Spark SQL Hive dynamic partitions job will fail if task fails
Key: SPARK-6067
URL: https://issues.apache.org/jira/browse/SPARK-6067
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.2.0
Reporter: Jason Hubbard
Priority: Minor
When inserting into a Hive table from Spark SQL with dynamic partitioning, a single failed task causes every retry of that task to fail as well, eventually failing the whole job. The retries fail because the staging output file written by the first attempt is left behind, producing an error like:
/mytable/.hive-staging_hive_2015-02-27_11-53-19_573_222-3/-ext-10000/partition=2015-02-04/part-00001
for client <ip> already exists
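The failure mode can be sketched outside Spark. This is a hypothetical minimal model, not Spark's actual writer code: it assumes each task attempt writes to a fixed staging path (like the `part-00001` path above) with an exclusive create, so a retry collides with the stale file left by the failed first attempt.

```python
import os
import tempfile


def write_partition_output(staging_dir, attempt_fails):
    """Simulate one task attempt writing its dynamic-partition output file.

    Hypothetical sketch of the reported behavior: the staging path is fixed
    per task (not per attempt), and the file is created exclusively, so a
    retry fails with FileExistsError if the first attempt's file remains.
    """
    path = os.path.join(staging_dir, "part-00001")
    # O_EXCL makes the create fail if the file already exists,
    # mirroring the "... already exists" error in the report.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_EXCL)
    try:
        if attempt_fails:
            raise RuntimeError("task failed mid-write")
        os.write(fd, b"row data\n")
    finally:
        os.close(fd)


def run_with_retry(staging_dir):
    """First attempt fails and leaves its file; the retry then collides."""
    try:
        write_partition_output(staging_dir, attempt_fails=True)
    except RuntimeError:
        pass  # task attempt 1 failed; its partial output file remains
    # Attempt 2 reuses the same staging path and hits FileExistsError.
    write_partition_output(staging_dir, attempt_fails=False)
```

A fix along these lines would delete (or overwrite) any stale file from a previous attempt before creating the new one, so retries can make progress.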
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)