hvanhovell commented on a change in pull request #23169: [SPARK-26103][SQL] Limit the length of debug strings for query plans
URL: https://github.com/apache/spark/pull/23169#discussion_r249370703
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
 ##########
 @@ -1625,6 +1625,14 @@ object SQLConf {
     .intConf
     .createWithDefault(25)
 
 +  val MAX_PLAN_STRING_LENGTH = buildConf("spark.sql.maxPlanLength")
 +    .doc("Maximum number of characters to output for a plan string.  If the plan is " +
 +      "longer, further output will be truncated.  The default setting always generates a full " +
 +      "plan.  Set this to a lower value such as 8192 if plan strings are taking up too much " +
 +      "memory or are causing OutOfMemory errors in the driver or UI processes.")
 +    .intConf
 
 Review comment:
   Can you check that the value being set is > 0? You can use the `checkValue(i => i > 0 && i <= ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH, "...")` method for this.
   
   Also, use `ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH` instead of `Int.MaxValue` as the default here.
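   
   For concreteness, a rough sketch of how the config definition from the diff could pick up both suggestions, keeping the `spark.sql.maxPlanLength` key and doc string from the patch (the exact error message and final config name are up to the PR author):
   
   ```scala
   import org.apache.spark.unsafe.array.ByteArrayMethods
   
   val MAX_PLAN_STRING_LENGTH = buildConf("spark.sql.maxPlanLength")
     .doc("Maximum number of characters to output for a plan string.  If the plan is " +
       "longer, further output will be truncated.  The default setting always generates a full " +
       "plan.  Set this to a lower value such as 8192 if plan strings are taking up too much " +
       "memory or are causing OutOfMemory errors in the driver or UI processes.")
     .intConf
     // Reject non-positive values as well as values larger than the maximum
     // array length Spark can safely allocate for the resulting string.
     .checkValue(i => i > 0 && i <= ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH,
       "Invalid value for 'spark.sql.maxPlanLength'.  Length must be a positive integer " +
         "no larger than ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH.")
     // Default to the largest allowed value so plans are effectively never truncated
     // unless the user lowers this setting.
     .createWithDefault(ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH)
   ```
   
   That way an invalid setting fails fast at configuration time, and the default still produces a full plan string.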
