yaooqinn commented on a change in pull request #34794:
URL: https://github.com/apache/spark/pull/34794#discussion_r762676403
##########
File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
##########
@@ -2267,4 +2267,13 @@ package object config {
.version("3.3.0")
.intConf
.createWithDefault(5)
+
+  private[spark] val MAX_RDD_NAME_LENGTH =
+    ConfigBuilder("spark.rdd.nameMaxLength")
+      .internal()
+      .doc("Maximum number of characters for RDD name. For example, some of the HadoopRDD APIs " +
+        "use the path parameter as RDD name, which could be extremely long.")
+      .version("3.3.0")
+      .intConf
+      .checkValue(_ > 3, "This value must be bigger than 3.")
+      .createWithDefault(256)
Review comment:
Yes, I had intended to change the Spark auto-generated names, which might
be much longer than users expect, such as
https://github.com/apache/spark/blob/abecdfe831805fb53fa949558ea7a5ca9042e465/core/src/main/scala/org/apache/spark/SparkContext.scala#L992
https://github.com/apache/spark/blob/abecdfe831805fb53fa949558ea7a5ca9042e465/core/src/main/scala/org/apache/spark/SparkContext.scala#L1044
https://github.com/apache/spark/blob/abecdfe831805fb53fa949558ea7a5ca9042e465/core/src/main/scala/org/apache/spark/SparkContext.scala#L1269
How about that? Also cc @srowen
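For illustration, here is a minimal sketch of how a cap like `spark.rdd.nameMaxLength` could be applied when an auto-generated RDD name is derived from a long input path. The `truncateName` helper and its marker style are hypothetical, not an actual Spark API; only the default of 256 and the `checkValue(_ > 3)` lower bound come from the proposed config.

```scala
// Hypothetical sketch: capping an auto-generated RDD name at a configured
// maximum length, as spark.rdd.nameMaxLength (default 256) would allow.
object RddNameTruncation {
  // Assumed default from the proposed config.
  val MaxRddNameLength = 256

  // Truncate a name that exceeds the cap, appending "..." so readers can
  // tell it was shortened. The config's checkValue(_ > 3) guarantees there
  // is always room for the three-character marker.
  def truncateName(name: String, maxLen: Int = MaxRddNameLength): String =
    if (name.length <= maxLen) name
    else name.take(maxLen - 3) + "..."

  def main(args: Array[String]): Unit = {
    // A path-derived name, like the ones HadoopRDD APIs produce.
    val longPath = "hdfs://nn/warehouse/" + ("part-" * 100)
    val short = truncateName(longPath, 32)
    println(short)
  }
}
```

With a cap of 32, the 520-character path above is reduced to 29 characters plus the `...` marker, so the displayed name never exceeds the configured limit.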
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]