Su Qilong created SPARK-33855:
---------------------------------

             Summary: Add spark job maximum created files limit configuration
                 Key: SPARK-33855
                 URL: https://issues.apache.org/jira/browse/SPARK-33855
             Project: Spark
          Issue Type: New Feature
          Components: SQL
    Affects Versions: 3.0.1, 2.4.3
            Reporter: Su Qilong


Add a configuration item, similar to
[hive.exec.max.created.files|https://cwiki.apache.org/confluence/display/Hive/Configuration+Properties#ConfigurationProperties-hive.exec.max.created.files],
to limit the maximum number of HDFS files created by a single Spark job.

 

This is useful when dynamic partition insertion is enabled, or for jobs that
contain only a single stage with very high parallelism, as sketched below.
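
For illustration, a minimal sketch of the scenario this targets, using a hypothetical key {{spark.sql.maxCreatedFiles}} (the actual name, default value, and enforcement point would be decided in the patch): a dynamic partition insert where the output file count can approach (number of tasks) x (number of distinct partition values).

{code:scala}
import org.apache.spark.sql.SparkSession

// Sketch only: "spark.sql.maxCreatedFiles" is a hypothetical key used here
// to show where the proposed limit would be configured.
val spark = SparkSession.builder()
  .appName("max-created-files-sketch")
  .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
  .config("spark.sql.maxCreatedFiles", "100000") // hypothetical new limit
  .getOrCreate()

// Dynamic partition insert: with 5000 tasks and 2000 distinct partition
// values, the job can create up to 5000 x 2000 small files on HDFS.
// The proposed limit would fail such a job once the count is exceeded,
// instead of letting it flood the NameNode with small files.
spark.range(0L, 100000000L, 1L, numPartitions = 5000)
  .selectExpr("id", "id % 2000 AS part")
  .write
  .partitionBy("part")
  .mode("overwrite")
  .saveAsTable("many_files_table")
{code}

One natural enforcement mechanism would be an accumulator counting files created per task, checked on the driver; the issue itself does not prescribe how the limit is tracked.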



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
