LantaoJin edited a comment on issue #25840: [SPARK-29166][SQL] Add parameters to limit the number of dynamic partitions for data source table
URL: https://github.com/apache/spark/pull/25840#issuecomment-542093641

> Do you mean data source and Hive tables would have different configs for the same feature?

Yes, for now. Do you want the new configs to also restrict Hive tables? For example, if we set `spark.sql.dynamic.partition.maxPartitions=100`, an insert into a Hive table would also throw an exception when the number of dynamic partitions exceeds 100, just as if we had set `spark.hadoop.hive.exec.max.dynamic.partitions=100`. @cloud-fan
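A minimal sketch of the behavior being discussed. Note that `spark.sql.dynamic.partition.maxPartitions` is the config proposed in this PR (not a released Spark setting), and the table names `events` and `staging_events` are hypothetical:

```sql
-- Proposed config from this PR (assumption: final name/semantics may differ):
SET spark.sql.dynamic.partition.maxPartitions=100;

-- Existing Hive-side limit, which today applies only to Hive tables:
SET spark.hadoop.hive.exec.max.dynamic.partitions=100;

-- Under the behavior discussed above, this dynamic-partition insert would
-- fail with an exception once it produces more than 100 partitions,
-- regardless of whether `events` is a data source table or a Hive table:
INSERT INTO TABLE events PARTITION (dt)
SELECT id, payload, dt FROM staging_events;
```

The open question in the thread is exactly this unification: whether the new config should enforce the limit for Hive tables as well, mirroring what the Hive setting already does.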
