dongjoon-hyun commented on PR #49905:
URL: https://github.com/apache/spark/pull/49905#issuecomment-2655211889

   I'm technically not okay with covering up our mistake silently in the 
community. At the same time, I agree with you guys that I really don't want to 
block the Apache Spark 4.0.0 release.
   
   For the record, given that we have well-known internal configs like the 
following, `internal` or not is not a criterion for the decision here.
   
   
https://github.com/apache/spark/blob/f7497425cd6d9e0f32847c19f5ec78a151f309b0/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L1080-L1081
   
   
https://github.com/apache/spark/blob/f7497425cd6d9e0f32847c19f5ec78a151f309b0/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L1639-L1640
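
   For context, Spark flags such entries with `.internal()` on its config builder. A minimal sketch of that pattern (a simplified mock for illustration, not the actual `SQLConf`/`ConfigBuilder` implementation, and with a hypothetical config key) might look like:

```scala
// Simplified mock of Spark's config-builder pattern (NOT the real
// org.apache.spark.internal.config.ConfigBuilder): shows how an entry is
// flagged as internal. The flag only hides the config from user-facing
// docs; once a key has shipped in a release, users can still read and
// set it, which is why "internal" alone does not settle the question.
final case class ConfigEntry(key: String, isInternal: Boolean, default: String)

final class ConfigBuilder(key: String) {
  private var internalFlag = false
  def internal(): ConfigBuilder = { internalFlag = true; this }
  def createWithDefault(default: String): ConfigEntry =
    ConfigEntry(key, internalFlag, default)
}

object InternalConfigDemo {
  // Hypothetical key, for illustration only.
  val entry: ConfigEntry =
    new ConfigBuilder("spark.sql.example.flag").internal().createWithDefault("true")

  def main(args: Array[String]): Unit =
    println(s"${entry.key} internal=${entry.isInternal}")
}
```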
   
   The bottom line is how we can handle this properly in the Apache way. Since 
this is our mistake, we are responsible for correcting it instead of ignoring 
any possibility.
   
   As I mentioned earlier, we cannot delete the exposed config in Apache Spark 
3.5.x; we can delete it only in Apache Spark 4.0.0+.
   https://github.com/apache/spark/pull/49905#pullrequestreview-2612266829


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
