Github user AndrewKL commented on a diff in the pull request:
https://github.com/apache/spark/pull/22162#discussion_r216045605
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala
---
@@ -969,6 +969,22 @@ class DatasetSuite extends QueryTest with
SharedSQLContext {
checkShowString(ds, expected)
}
+
+ test("SPARK-2444git stat2 Show should follow
spark.show.default.number.of.rows") {
+ withSQLConf("spark.sql.show.defaultNumRows" -> "100") {
+ val ds = (1 to 1000).toDS().as[Int].show
--- End diff ---
The way show is currently implemented makes it difficult to fully test
this. The truncate and max-length parameters are passed into the showString
function that would normally be used to test the output. Unfortunately, the
resulting string is then passed to the println function without an
easy way to capture the result. This could be tested with a spying library,
but there isn't one in Spark (yet).
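As a possible workaround without a spying library, the println output could be
captured by redirecting Console.out inside the test. A minimal sketch (the
printed string here is a stand-in, not the actual show output):

```scala
import java.io.ByteArrayOutputStream

object CaptureStdoutSketch {
  def main(args: Array[String]): Unit = {
    val buffer = new ByteArrayOutputStream()
    // Console.withOut redirects anything println writes inside the block
    // into the buffer instead of the real stdout.
    Console.withOut(buffer) {
      // Stand-in for ds.show; a real test would call show here.
      println("only showing top 100 rows")
    }
    val captured = buffer.toString
    // The captured text can then be asserted on directly.
    assert(captured.contains("top 100 rows"))
    println(s"captured: ${captured.trim}")
  }
}
```

This avoids mocking entirely, though it is fragile if show ever writes to a
different stream.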
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]