[GitHub] spark issue #19387: [SPARK-22160][SQL] Make sample points per partition (in ...

2017-09-28 Thread xuanwang14
Github user xuanwang14 commented on the issue:

https://github.com/apache/spark/pull/19387
  
LGTM


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #19387: [SPARK-22160][SQL] Make sample points per partiti...

2017-09-28 Thread xuanwang14
Github user xuanwang14 commented on a diff in the pull request:

https://github.com/apache/spark/pull/19387#discussion_r141763083
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/ConfigBehaviorSuite.scala ---
@@ -0,0 +1,64 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql
+
+import org.apache.commons.math3.stat.inference.ChiSquareTest
+
+import org.apache.spark.sql.internal.SQLConf
+import org.apache.spark.sql.test.SharedSQLContext
+
+
+class ConfigBehaviorSuite extends QueryTest with SharedSQLContext {
+
+  import testImplicits._
+
+  test("SPARK-22160 spark.sql.execution.rangeExchange.sampleSizePerPartition") {
+    // In this test, we run a sort and compute the histogram for partition size post shuffle.
+    // With a high sample count, the partition size should be more evenly distributed, and has a
+    // low chi-sq test value.
+
+    val numPartitions = 4
+
+    def computeChiSquareTest(): Double = {
+      val n = 10000
+      // Trigger a sort
+      val data = spark.range(0, n, 1, 1).sort('id)
+        .selectExpr("SPARK_PARTITION_ID() pid", "id").as[(Int, Long)].collect()
+
+      // Compute histogram for the number of records per partition post sort
+      val dist = data.groupBy(_._1).map(_._2.length.toLong).toArray
+      assert(dist.length == 4)
+
+      new ChiSquareTest().chiSquare(
+        Array.fill(numPartitions) { n.toDouble / numPartitions },
+        dist)
+    }
+
+    withSQLConf(SQLConf.SHUFFLE_PARTITIONS.key -> numPartitions.toString) {
+      // The default chi-sq value should be low
+      assert(computeChiSquareTest() < 100)
+
+      withSQLConf(SQLConf.RANGE_EXCHANGE_SAMPLE_SIZE_PER_PARTITION.key -> "1") {
+        // If we only sample one point, the range boundaries will be pretty bad and the
+        // chi-sq value would be very high.
+        assert(computeChiSquareTest() > 1000)
--- End diff --

This test may be flaky as well. 
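For context on the flakiness concern: the statistic in question is Pearson's chi-square over the per-partition record counts. A minimal sketch in plain Python (not the Spark test itself; the partition counts below are made-up illustrations) of how even vs. badly skewed partition sizes move the statistic relative to the thresholds the test asserts on:

```python
def chi_square(observed, expected):
    """Pearson's chi-square statistic: sum((o - e)^2 / e)."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

n, num_partitions = 10000, 4
expected = [n / num_partitions] * num_partitions  # 2500.0 records per partition

# Near-even partition sizes, as with a generous sample count
even = [2450, 2550, 2480, 2520]
# Badly skewed sizes, as with poorly chosen range boundaries
skewed = [9000, 400, 300, 300]

print(chi_square(even, expected))    # 2.32 -- well under the 100 threshold
print(chi_square(skewed, expected))  # 22536.0 -- well over the 1000 threshold
```

Whether the test flakes then comes down to how far the even-case statistic can drift toward 100 across runs, which is governed by the sampling variance.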


---




[GitHub] spark pull request #19387: [SPARK-22160][SQL] Make sample points per partiti...

2017-09-28 Thread xuanwang14
Github user xuanwang14 commented on a diff in the pull request:

https://github.com/apache/spark/pull/19387#discussion_r141762963
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/ConfigBehaviorSuite.scala ---
@@ -0,0 +1,64 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql
+
+import org.apache.commons.math3.stat.inference.ChiSquareTest
+
+import org.apache.spark.sql.internal.SQLConf
+import org.apache.spark.sql.test.SharedSQLContext
+
+
+class ConfigBehaviorSuite extends QueryTest with SharedSQLContext {
+
+  import testImplicits._
+
+  test("SPARK-22160 spark.sql.execution.rangeExchange.sampleSizePerPartition") {
+    // In this test, we run a sort and compute the histogram for partition size post shuffle.
+    // With a high sample count, the partition size should be more evenly distributed, and has a
+    // low chi-sq test value.
+
+    val numPartitions = 4
+
+    def computeChiSquareTest(): Double = {
+      val n = 10000
+      // Trigger a sort
+      val data = spark.range(0, n, 1, 1).sort('id)
+        .selectExpr("SPARK_PARTITION_ID() pid", "id").as[(Int, Long)].collect()
+
+      // Compute histogram for the number of records per partition post sort
+      val dist = data.groupBy(_._1).map(_._2.length.toLong).toArray
+      assert(dist.length == 4)
+
+      new ChiSquareTest().chiSquare(
+        Array.fill(numPartitions) { n.toDouble / numPartitions },
+        dist)
+    }
+
+    withSQLConf(SQLConf.SHUFFLE_PARTITIONS.key -> numPartitions.toString) {
+      // The default chi-sq value should be low
+      assert(computeChiSquareTest() < 100)
--- End diff --

This test may be flaky. It depends on the ratio of `n / RANGE_EXCHANGE_SAMPLE_SIZE_PER_PARTITION`. What is the default value of `RANGE_EXCHANGE_SAMPLE_SIZE_PER_PARTITION`?
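To see why that ratio matters: range partitioning picks its boundaries from the quantiles of a sorted sample, so boundary quality degrades as the sample shrinks relative to `n`. A deliberately simplified sketch in Python (`range_boundaries` is a made-up helper modeling the idea, not Spark's actual weighted reservoir-sampling implementation):

```python
import random

def range_boundaries(data, num_partitions, sample_size):
    """Pick num_partitions - 1 range boundaries from the quantiles
    of a uniform random sample (simplified model of a range exchange)."""
    sample = sorted(random.sample(data, min(sample_size, len(data))))
    step = len(sample) / num_partitions
    return [sample[int(step * i)] for i in range(1, num_partitions)]

random.seed(42)
data = list(range(10000))

# A large sample lands near the ideal boundaries [2500, 5000, 7500]
print(range_boundaries(data, 4, 400))
# A sample of one point per partition is essentially random, which is
# what drives the chi-sq value above 1000 in the test above
print(range_boundaries(data, 4, 4))
```

With the full data as the sample the boundaries are exact, and they get noisier as `sample_size / len(data)` shrinks.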


---
