Github user LantaoJin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22353#discussion_r216121128
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/DataSourceScanExec.scala ---
    @@ -54,7 +54,7 @@ trait DataSourceScanExec extends LeafExecNode with CodegenSupport {
       override def simpleString: String = {
         val metadataEntries = metadata.toSeq.sorted.map {
           case (key, value) =>
    -        key + ": " + StringUtils.abbreviate(redact(value), 100)
    --- End diff ---
    
    I think it’s overkill to parameterize it, and Spark users don’t care about
    it; no one will reset it before submitting an application. Besides, simply
    raising the limit to 1000 would also resolve the problem in most cases,
    while anything longer than 1000 characters is still meaningless.


---
