Github user MaxGekk commented on a diff in the pull request:
https://github.com/apache/spark/pull/22233#discussion_r213327251
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala ---
@@ -671,7 +674,7 @@ case class AlterTableRecoverPartitionsCommand(
            val value = ExternalCatalogUtils.unescapePathName(ps(1))
            if (resolver(columnName, partitionNames.head)) {
              scanPartitions(spark, fs, filter, st.getPath, spec ++ Map(partitionNames.head -> value),
-                partitionNames.drop(1), threshold, resolver)
+                partitionNames.drop(1), threshold, resolver, listFilesInParallel = false)
--- End diff --
> Can we use another implementation?

This is what @zsxwing proposed. Please see my comment:
https://github.com/apache/spark/pull/22233#discussion_r213075357
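
For context, the point of passing `listFilesInParallel = false` on the recursive call is that only the top-level invocation lists partition directories in parallel, while nested recursion stays sequential, presumably to avoid launching parallel listing at every recursion level. Below is a minimal, self-contained sketch of that pattern; it is not Spark's actual `scanPartitions` signature, and the names and the use of Scala parallel collections are illustrative assumptions:

```scala
import java.io.File

// Hypothetical sketch: parallelize directory listing only at the top level
// of the recursion, and force sequential listing for nested directories by
// passing listFilesInParallel = false downwards.
def scanDirs(
    dir: File,
    remainingDepth: Int,
    listFilesInParallel: Boolean = true): Seq[File] = {
  val children = Option(dir.listFiles()).map(_.toSeq).getOrElse(Seq.empty)
  if (remainingDepth <= 0) {
    children.filterNot(_.isDirectory)
  } else {
    val subDirs = children.filter(_.isDirectory)
    val scanned =
      if (listFilesInParallel) {
        // Top-level call: fan out over subdirectories in parallel
        // (Scala 2.11/2.12 parallel collections, assumed for illustration).
        subDirs.par.map(d => scanDirs(d, remainingDepth - 1, listFilesInParallel = false)).seq
      } else {
        // Recursive calls: plain sequential traversal, no nested parallelism.
        subDirs.map(d => scanDirs(d, remainingDepth - 1, listFilesInParallel = false))
      }
    scanned.flatten
  }
}
```

The diff above applies the same idea to `AlterTableRecoverPartitionsCommand`: the recursive `scanPartitions` call receives `listFilesInParallel = false`, so parallel listing can only be triggered by the outermost call.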
---