Ngone51 commented on pull request #31684:
URL: https://github.com/apache/spark/pull/31684#issuecomment-787581992


   > I think if you propose a new approach, you should restore all removed 
configs and mark them as `.removed()` then. I don't think it is a good idea to 
maintain two approaches at the same time.
   
   Yeah, I was thinking that too. But I'd like to see people's feedback before 
going further.
   
   > IMHO, gathering all removed/deprecated configs in one place is convenient. 
   
   I think it's convenient (fast) for people to see which configs are currently 
deprecated or removed. But it's not convenient for developers to maintain. For 
example, to deprecate a config the current way, a developer has to manually add 
a `DeprecatedConfig` entry to the deprecated list. And if the config is later 
removed, the developer has to delete that `DeprecatedConfig` entry from the 
deprecated list and add a `RemovedConfig` entry to the removed list. This 
process is error-prone because the two config types live in two separate 
places; e.g., a developer may forget to delete the `DeprecatedConfig` entry. 
With the approach proposed in this PR, the developer only needs to call 
`deprecated()` on the config definition, and later change that `deprecated()` 
call to `removed()`.
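   To make the trade-off concrete, here is a minimal sketch (in Java, and not 
Spark's actual internals; all names such as `ConfigEntry`, `deprecated`, and 
`removed` are illustrative) of the builder-style lifecycle this PR proposes, 
where the status lives on the config entry itself:

```java
public class ConfigLifecycleSketch {

    enum Status { ACTIVE, DEPRECATED, REMOVED }

    static final class ConfigEntry {
        final String key;
        final Status status;
        final String since;  // version in which the status changed; null while active

        ConfigEntry(String key) { this(key, Status.ACTIVE, null); }

        private ConfigEntry(String key, Status status, String since) {
            this.key = key;
            this.status = status;
            this.since = since;
        }

        // Deprecating a config is one call on its own definition...
        ConfigEntry deprecated(String version) {
            return new ConfigEntry(key, Status.DEPRECATED, version);
        }

        // ...and removing it later means swapping that one call; there is
        // no separate removed-configs list that can drift out of sync.
        ConfigEntry removed(String version) {
            return new ConfigEntry(key, Status.REMOVED, version);
        }
    }

    public static void main(String[] args) {
        ConfigEntry dep = new ConfigEntry("spark.example.legacy.flag")
                .deprecated("3.1.0");
        ConfigEntry gone = new ConfigEntry("spark.example.legacy.flag")
                .removed("3.2.0");
        System.out.println(dep.key + " -> " + dep.status);
        System.out.println(gone.key + " -> " + gone.status);
    }
}
```

   Under this sketch, promoting a config from deprecated to removed is a 
one-line diff on the entry, rather than a delete-from-one-list, 
add-to-another-list change.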
   
   > Currently, removed configs are actually removed but you propose to keep 
them forever (just mark them as removed), am I right?
   
   Yes. But those removed configs don't disappear from Spark entirely; they 
are still tracked as `RemovedConfig` entries.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


