pan3793 commented on code in PR #46047:
URL: https://github.com/apache/spark/pull/46047#discussion_r1566609420


##########
core/src/main/scala/org/apache/spark/SparkConf.scala:
##########
@@ -640,7 +640,8 @@ private[spark] object SparkConf extends Logging {
       DeprecatedConfig("spark.blacklist.killBlacklistedExecutors", "3.1.0",
         "Please use spark.excludeOnFailure.killExcludedExecutors"),
      DeprecatedConfig("spark.yarn.blacklist.executor.launch.blacklisting.enabled", "3.1.0",
-        "Please use spark.yarn.executor.launch.excludeOnFailure.enabled")
+        "Please use spark.yarn.executor.launch.excludeOnFailure.enabled"),
+      DeprecatedConfig("spark.network.remoteReadNioBufferConversion", "3.5.2", "")

Review Comment:
   @dongjoon-hyun how should we fill in the message in this case? BTW, according to the comment below, this looks like an internal configuration; should it still go through the same deprecation procedure as user-facing configurations?
   
   > // SPARK-24307 undocumented "escape-hatch" in case there are any issues in converting to
   > // ChunkedByteBuffer, to go back to old code-path.  Can be removed post Spark 2.4 if
   > // new path is stable.
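
   For context, the deprecation table being extended here maps a config key to a version and an advisory message, and a lookup against that table is what produces the deprecation warning when the key is set. A minimal sketch of that pattern (simplified; `DeprecationDemo`, `warnIfDeprecated`, and the placeholder message text are illustrative, not Spark's actual code):

   ```scala
   // Simplified model of SparkConf's deprecated-config handling.
   case class DeprecatedConfig(key: String, version: String, deprecationMessage: String)

   object DeprecationDemo {
     // Table keyed by config name, as SparkConf does with its deprecatedConfigs map.
     private val deprecatedConfigs: Map[String, DeprecatedConfig] = Seq(
       DeprecatedConfig("spark.network.remoteReadNioBufferConversion", "3.5.2",
         "<placeholder: the message under discussion in this review>")
     ).map(c => c.key -> c).toMap

     // Returns the warning text if the key is deprecated, None otherwise.
     def warnIfDeprecated(key: String): Option[String] =
       deprecatedConfigs.get(key).map { cfg =>
         s"The configuration key '${cfg.key}' has been deprecated as of " +
           s"Spark ${cfg.version} and may be removed in the future. " +
           cfg.deprecationMessage
       }

     def main(args: Array[String]): Unit = {
       warnIfDeprecated("spark.network.remoteReadNioBufferConversion").foreach(println)
     }
   }
   ```

   An empty message (as in the diff above) would simply leave the advisory sentence off, which is why the review asks what to put there for a config that has no user-facing replacement.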



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
