abellina commented on PR #36505:
URL: https://github.com/apache/spark/pull/36505#issuecomment-1124014672
Ok, as feared I am causing some issues here. Specifically:
```
java.lang.RuntimeException: Max iterations (100) reached for batch Operator
Optimization after Inferring Filters, please set
'spark.sql.optimizer.maxIterations' to a larger value.
```
That seems to indicate that the added join/filter is causing this later batch to reach the fixedPoint limit for a few tests (see the config sketch after the list):
- Spark vectorized reader - without partition data column - SPARK-38918: nested schema pruning with correlated subqueries *** FAILED ***
- Spark vectorized reader - with partition data column - SPARK-38918: nested schema pruning with correlated subqueries *** FAILED ***
- Non-vectorized reader - without partition data column - SPARK-38918: nested schema pruning with correlated subqueries *** FAILED ***
- Non-vectorized reader - with partition data column - SPARK-38918: nested schema pruning with correlated subqueries *** FAILED ***
- Case-insensitive parser - mixed-case schema - subquery filter with different-case column names *** FAILED ***
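As a stopgap while debugging, the limit named in the error can be raised. This is a minimal sketch, assuming a local session built from scratch; raising the limit only buys more iterations and will not help if the new join/filter makes two rules keep rewriting each other's output:
```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("maxIterations-debug")
  // Raise the fixed-point limit from the error message (100 in this run).
  .config("spark.sql.optimizer.maxIterations", "1000")
  .getOrCreate()
```
If the batch still fails to converge at a much larger value, the new rule is probably oscillating with an existing one rather than converging slowly.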
There are unfortunately other errors as well:
```
[info] - SPARK-32290: SingleColumn Null Aware Anti Join Optimize *** FAILED *** (15 milliseconds)
[info]   joinExec.asInstanceOf[org.apache.spark.sql.execution.joins.BroadcastHashJoinExec].isNullAwareAntiJoin was false (JoinSuite.scala:1161)
```
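For context, that assertion checks that a single-column NOT IN subquery is planned as a null-aware BroadcastHashJoinExec. A minimal sketch of that shape, with hypothetical tables t1/t2, assuming the spark.sql.optimizeNullAwareAntiJoin flag that gates the rewrite is on:
```scala
import spark.implicits._

// Hypothetical data; the single-column NOT IN below is the shape that
// SPARK-32290 plans as BroadcastHashJoinExec with isNullAwareAntiJoin = true.
Seq((1, "a"), (2, "b")).toDF("c1", "c2").createOrReplaceTempView("t1")
Seq((2, "b"), (3, "c")).toDF("c1", "c2").createOrReplaceTempView("t2")

val df = spark.sql("SELECT * FROM t1 WHERE c1 NOT IN (SELECT c1 FROM t2)")
df.explain() // per the failure above, the planned join is no longer null-aware
```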
And then there are a number of logical plans that no longer match the golden plan files checked into the repo. I am not entirely sure whether it is ok to update those checked-in plans.
I think I need to understand how to actually fix this issue first, before going off to fix most of the test failures.