HyukjinKwon commented on a change in pull request #25123: 
[SPARK-28355][CORE][PYTHON] Use Spark conf for threshold at which command is 
compressed by broadcast
URL: https://github.com/apache/spark/pull/25123#discussion_r302808224
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##########
 @@ -1246,6 +1246,14 @@ package object config {
       "mechanisms to guarantee data won't be corrupted during broadcast")
     .booleanConf.createWithDefault(true)
 
+  private[spark] val BROADCAST_FOR_UDF_COMPRESSION_THRESHOLD =
+    ConfigBuilder("spark.broadcast.UDFCompressionThreshold")
 +      .doc("The threshold at which a user-defined function (UDF) is compressed by broadcast, " +
 
 Review comment:
   I think this also applies to RDD APIs. We can just say, for instance, `The 
threshold at which Python commands for RDD APIs and user-defined functions (UDF) 
are serialized by broadcast ...`. Feel free to change the wording.
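
  If the suggestion is adopted, the conf entry might read roughly as follows. This is only a sketch: the unit handling and the default value shown are assumptions for illustration, not taken from the quoted diff.

```scala
private[spark] val BROADCAST_FOR_UDF_COMPRESSION_THRESHOLD =
  ConfigBuilder("spark.broadcast.UDFCompressionThreshold")
    .doc("The threshold at which Python commands for RDD APIs and " +
      "user-defined functions (UDF) are compressed by broadcast, in bytes.")
    .bytesConf(ByteUnit.BYTE)            // assumed unit handling; the PR may differ
    .createWithDefault(1L * 1024 * 1024) // assumed 1 MiB default, not in the quoted diff
```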

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
