cloud-fan commented on code in PR #50291:
URL: https://github.com/apache/spark/pull/50291#discussion_r1998724964
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##########
@@ -4090,7 +4090,7 @@ object SQLConf {
.createWithDefault(ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH)
val PRUNE_FILTERS_CAN_PRUNE_STREAMING_SUBPLAN =
- buildConf("spark.sql.optimizer.pruneFiltersCanPruneStreamingSubplan")
+    buildConf("spark.databricks.sql.optimizer.pruneFiltersCanPruneStreamingSubplan")
Review Comment:
This breaks CI because we now have a code-style check to ban
`spark.databricks` configs.
I'm going to revert the style check from branch-4.0 as well, since it depends
on a commit that should not have been merged. Alternatively, I'll close this
PR if the community can reach a consensus on the migration story for Spark 4.0
by tomorrow.
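
For context, a ban like the one described can be implemented as a simple
source scan. The sketch below is hypothetical (it is not the actual check in
the Spark build, and the file path is made up for illustration); it only shows
the kind of pattern such a style check would flag:

```shell
#!/bin/sh
# Hypothetical sketch of a style check that bans conf keys under the
# "spark.databricks" namespace. The sample file and its contents are
# invented for illustration.
cat > /tmp/Example.scala <<'EOF'
buildConf("spark.databricks.sql.optimizer.pruneFiltersCanPruneStreamingSubplan")
EOF

# Fail (non-zero message) if any conf is registered under the banned prefix.
if grep -n 'buildConf("spark\.databricks' /tmp/Example.scala; then
  echo "style check failed: spark.databricks conf found"
fi
```

Under this sketch, the renamed conf in the diff above would trip the check,
which is why CI fails.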
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]