Github user MaxGekk commented on a diff in the pull request:
https://github.com/apache/spark/pull/22429#discussion_r224391712
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/package.scala ---
@@ -167,6 +172,58 @@ package object util {
     builder.toString()
   }
+
+  /**
+   * The performance overhead of creating and logging strings for wide schemas can be large. To
+   * limit the impact, we bound the number of fields to include by default. This can be overridden
+   * by setting the 'spark.debug.maxToStringFields' conf in SparkEnv or by setting the SQL config
+   * `spark.sql.debug.maxToStringFields`.
+   */
+  private[spark] def maxNumToStringFields: Int = {
+    val legacyLimit = if (SparkEnv.get != null) {
--- End diff ---
Taking into account that the old config wasn't well documented and was used
mostly for debugging, I think we can remove it in Spark 3.0. Initially the PR
targeted Spark 2.4, and removing a public config in a minor version could
potentially break user apps. If you are ok with removing it, I will do that.
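
For context, a minimal self-contained sketch of the fallback pattern the diff
implements: read the legacy limit from a SparkEnv-style conf when one is
available, otherwise use a default, and combine it with the SQL config. The
stub Conf class, the sparkEnvConf/sqlMaxToStringFields fields, the default of
25, and taking the larger of the two limits are illustrative assumptions, not
the PR's actual code:

object ConfFallbackSketch {
  // Stand-in for SparkEnv.get.conf; None models the case where no SparkEnv
  // exists (e.g. in unit tests), which the diff handles with a null check.
  final case class Conf(settings: Map[String, String]) {
    def getInt(key: String, default: Int): Int =
      settings.get(key).map(_.toInt).getOrElse(default)
  }

  private val DefaultMaxFields = 25                 // illustrative default

  var sparkEnvConf: Option[Conf] = None             // plays the role of SparkEnv.get
  var sqlMaxToStringFields: Int = DefaultMaxFields  // plays the role of the SQL config

  // Mirrors the shape of maxNumToStringFields in the diff: read the legacy
  // 'spark.debug.maxToStringFields' conf when a SparkEnv is present, then
  // let whichever limit is larger win.
  def maxNumToStringFields: Int = {
    val legacyLimit = sparkEnvConf
      .map(_.getInt("spark.debug.maxToStringFields", DefaultMaxFields))
      .getOrElse(DefaultMaxFields)
    math.max(sqlMaxToStringFields, legacyLimit)
  }

  def main(args: Array[String]): Unit = {
    println(maxNumToStringFields)  // 25: defaults everywhere
    sparkEnvConf = Some(Conf(Map("spark.debug.maxToStringFields" -> "100")))
    println(maxNumToStringFields)  // 100: the legacy conf raises the limit
  }
}

Modeling SparkEnv.get as an Option makes the diff's null check explicit; the
same shape works for any "new config supersedes legacy config" migration,
which is what removing the old key in 3.0 would complete.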
---