[GitHub] spark issue #21169: [SPARK-23715][SQL] the input of to/from_utc_timestamp ca...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21169 **[Test build #90099 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90099/testReport)** for PR 21169 at commit [`b6d91db`](https://github.com/apache/spark/commit/b6d91db2fd71b50389cf3647a31eefc83d5dbc44). --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21169: [SPARK-23715][SQL] the input of to/from_utc_timestamp ca...
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/21169 The CRAN issue seems to be fixed, given the response from the CRAN sysadmin to @viirya's request. Let me restart this.
[GitHub] spark issue #21169: [SPARK-23715][SQL] the input of to/from_utc_timestamp ca...
Github user HyukjinKwon commented on the issue: https://github.com/apache/spark/pull/21169 retest this please
[GitHub] spark issue #21225: [SPARK-24168][SQL] WindowExec should not access SQLConf ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21225 Merged build finished. Test PASSed.
[GitHub] spark issue #21225: [SPARK-24168][SQL] WindowExec should not access SQLConf ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21225 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2844/ Test PASSed.
[GitHub] spark issue #21226: [SPARK-24169][SQL] JsonToStructs should not access SQLCo...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21226 Merged build finished. Test PASSed.
[GitHub] spark issue #21226: [SPARK-24169][SQL] JsonToStructs should not access SQLCo...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21226 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2843/ Test PASSed.
[GitHub] spark issue #21226: [SPARK-24169][SQL] JsonToStructs should not access SQLCo...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21226 **[Test build #90098 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90098/testReport)** for PR 21226 at commit [`e1491f0`](https://github.com/apache/spark/commit/e1491f0aeb62d1eda8cd8c55f890c8f87eec5761).
[GitHub] spark issue #21224: [SPARK-24167][SQL] ParquetFilters should not access SQLC...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21224 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2842/ Test PASSed.
[GitHub] spark issue #21226: [SPARK-24169][SQL] JsonToStructs should not access SQLCo...
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/21226 cc @HyukjinKwon @gatorsmile
[GitHub] spark pull request #21226: [SPARK-24169][SQL] JsonToStructs should not acces...
GitHub user cloud-fan opened a pull request: https://github.com/apache/spark/pull/21226 [SPARK-24169][SQL] JsonToStructs should not access SQLConf at executor side ## What changes were proposed in this pull request? This PR is extracted from #21190, to make it easier to backport. `JsonToStructs` can be serialized to executors and evaluated there, so we should not call `SQLConf.get.getConf(SQLConf.FROM_JSON_FORCE_NULLABLE_SCHEMA)` in its body. ## How was this patch tested? Tested in #21190. You can merge this pull request into a Git repository by running: $ git pull https://github.com/cloud-fan/spark minor4 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/21226.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #21226 commit e1491f0aeb62d1eda8cd8c55f890c8f87eec5761 Author: Wenchen Fan Date: 2018-05-01T11:50:05Z JsonToStructs should not access SQLConf at executor side
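The fix this family of PRs applies can be sketched in plain Scala. This is an illustrative sketch, not Spark's actual code: `ConfLike` and the two classes below are hypothetical stand-ins. The idea is to resolve the config value into a field while the expression is constructed on the driver, so serialization ships the value itself rather than a lookup that is only valid driver-side.

```scala
// Hypothetical stand-in for SQLConf.get, which is only safe to read on the driver.
object ConfLike {
  val forceNullableSchema: Boolean = true
}

// Problematic shape: eval() reads global conf, but eval() runs on executors.
class ReadsConfAtEval extends Serializable {
  def eval(): Boolean = ConfLike.forceNullableSchema // unsafe if ConfLike were driver-only state
}

// Fixed shape: the value is captured once at construction (driver side) and is
// serialized with the instance, so eval() is a plain field read on the executor.
class CapturesConfAtConstruction extends Serializable {
  private val forceNullableSchema: Boolean = ConfLike.forceNullableSchema
  def eval(): Boolean = forceNullableSchema
}
```

The key point is that the conf read happens before the object crosses the serialization boundary; after that, executors only ever see the captured boolean.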
[GitHub] spark issue #21224: [SPARK-24167][SQL] ParquetFilters should not access SQLC...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21224 Merged build finished. Test PASSed.
[GitHub] spark issue #21224: [SPARK-24167][SQL] ParquetFilters should not access SQLC...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21224 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90096/ Test FAILed.
[GitHub] spark issue #21224: [SPARK-24167][SQL] ParquetFilters should not access SQLC...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21224 **[Test build #90096 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90096/testReport)** for PR 21224 at commit [`c58baad`](https://github.com/apache/spark/commit/c58baad051259d7d2d54f1eb5e84c4bdac0867a6). * This patch **fails to build**. * This patch merges cleanly. * This patch adds no public classes.
[GitHub] spark issue #21224: [SPARK-24167][SQL] ParquetFilters should not access SQLC...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21224 Merged build finished. Test FAILed.
[GitHub] spark issue #21225: [SPARK-24168][SQL] WindowExec should not access SQLConf ...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21225 **[Test build #90097 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90097/testReport)** for PR 21225 at commit [`c5ab777`](https://github.com/apache/spark/commit/c5ab77710de805eab3a0a815790a58cd2de56cb2).
[GitHub] spark issue #21225: [SPARK-24168][SQL] WindowExec should not access SQLConf ...
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/21225 cc @hvanhovell @viirya
[GitHub] spark pull request #21225: [SPARK-24168][SQL] WindowExec should not access S...
GitHub user cloud-fan opened a pull request: https://github.com/apache/spark/pull/21225 [SPARK-24168][SQL] WindowExec should not access SQLConf at executor side ## What changes were proposed in this pull request? This PR is extracted from #21190, to make it easier to backport. `WindowExec#createBoundOrdering` is called on the executor side, so we can't use `conf.sessionLocalTimezone` there. ## How was this patch tested? Tested in #21190. You can merge this pull request into a Git repository by running: $ git pull https://github.com/cloud-fan/spark minor3 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/21225.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #21225 commit c5ab77710de805eab3a0a815790a58cd2de56cb2 Author: Wenchen Fan Date: 2018-05-03T05:29:09Z WindowExec should not access SQLConf at executor side
[GitHub] spark issue #21224: [SPARK-24167][SQL] ParquetFilters should not access SQLC...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21224 **[Test build #90096 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90096/testReport)** for PR 21224 at commit [`c58baad`](https://github.com/apache/spark/commit/c58baad051259d7d2d54f1eb5e84c4bdac0867a6).
[GitHub] spark issue #21224: [SPARK-24167][SQL] ParquetFilters should not access SQLC...
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/21224 cc @gatorsmile
[GitHub] spark pull request #21224: [SPARK-24167][SQL] ParquetFilters should not acce...
GitHub user cloud-fan opened a pull request: https://github.com/apache/spark/pull/21224 [SPARK-24167][SQL] ParquetFilters should not access SQLConf at executor side ## What changes were proposed in this pull request? This PR is extracted from #21190, to make it easier to backport. `ParquetFilters` is used in the file scan function, which is executed on the executor side, so we can't call `conf.parquetFilterPushDownDate` there. ## How was this patch tested? Tested in #21190. You can merge this pull request into a Git repository by running: $ git pull https://github.com/cloud-fan/spark minor2 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/21224.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #21224 commit c58baad051259d7d2d54f1eb5e84c4bdac0867a6 Author: Wenchen Fan Date: 2018-05-03T05:20:06Z ParquetFilters should not access SQLConf at executor side
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21210 Merged build finished. Test PASSed.
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21210 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90090/ Test PASSed.
[GitHub] spark issue #21221: [SPARK-23429][CORE] Add executor memory metrics to heart...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21221 Merged build finished. Test FAILed.
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21210 **[Test build #90090 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90090/testReport)** for PR 21210 at commit [`51d4c0e`](https://github.com/apache/spark/commit/51d4c0ed72c15893a112c39d9e360e4cfabe6a62). * This patch passes all tests. * This patch merges cleanly. * This patch adds no public classes.
[GitHub] spark issue #21221: [SPARK-23429][CORE] Add executor memory metrics to heart...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21221 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90087/ Test FAILed.
[GitHub] spark issue #21221: [SPARK-23429][CORE] Add executor memory metrics to heart...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21221 **[Test build #90087 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90087/testReport)** for PR 21221 at commit [`ad10d28`](https://github.com/apache/spark/commit/ad10d2814bbfbaf8c21fcbb1abe83ef7a8e9ffe7). * This patch **fails Spark unit tests**. * This patch merges cleanly. * This patch adds no public classes.
[GitHub] spark issue #21223: [SPARK-24166][SQL] InMemoryTableScanExec should not acce...
Github user kiszk commented on the issue: https://github.com/apache/spark/pull/21223 LGTM
[GitHub] spark issue #21223: [SPARK-24166][SQL] InMemoryTableScanExec should not acce...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21223 Merged build finished. Test PASSed.
[GitHub] spark issue #21223: [SPARK-24166][SQL] InMemoryTableScanExec should not acce...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21223 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2841/ Test PASSed.
[GitHub] spark issue #21223: [SPARK-24166][SQL] InMemoryTableScanExec should not acce...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21223 **[Test build #90095 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90095/testReport)** for PR 21223 at commit [`d900b4c`](https://github.com/apache/spark/commit/d900b4c04f5def9bac6cec33c1c7753761a19658).
[GitHub] spark issue #21223: [SPARK-24166][SQL] InMemoryTableScanExec should not acce...
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/21223 cc @kiszk @viirya
[GitHub] spark pull request #21223: [SPARK-24166][SQL] InMemoryTableScanExec should n...
GitHub user cloud-fan opened a pull request: https://github.com/apache/spark/pull/21223 [SPARK-24166][SQL] InMemoryTableScanExec should not access SQLConf at executor side ## What changes were proposed in this pull request? This PR is extracted from https://github.com/apache/spark/pull/21190, to make it easier to backport. `InMemoryTableScanExec#createAndDecompressColumn` is executed inside `rdd.map`, so we can't access `conf.offHeapColumnVectorEnabled` there. ## How was this patch tested? Tested in #21190. You can merge this pull request into a Git repository by running: $ git pull https://github.com/cloud-fan/spark minor1 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/21223.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #21223 commit d900b4c04f5def9bac6cec33c1c7753761a19658 Author: Wenchen Fan Date: 2018-05-03T05:03:23Z InMemoryTableScanExec should not access SQLConf at executor side
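For closures handed to RDD operations, the same rule takes a slightly different shape: read the conf into a local val before the `rdd.map`, so the closure captures a primitive instead of performing a conf lookup on the executor. A minimal sketch in plain Scala (the `ConfLike` object and `process` function are illustrative stand-ins, not Spark's API):

```scala
// Hypothetical stand-in for a driver-side conf read such as
// conf.offHeapColumnVectorEnabled.
object ConfLike {
  val offHeapColumnVectorEnabled: Boolean = false
}

def process(batches: Seq[Int]): Seq[String] = {
  // Capture the value here, outside the closure, the way the fix does
  // before calling rdd.map; only this Boolean is serialized into the task.
  val offHeapEnabled = ConfLike.offHeapColumnVectorEnabled
  batches.map { b => // stands in for the rdd.map running on executors
    if (offHeapEnabled) s"offheap:$b" else s"onheap:$b"
  }
}
```

Capturing the val also keeps the serialized closure small: it references one boolean rather than the whole enclosing operator.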
[GitHub] spark issue #21187: [SPARK-24035][SQL] SQL syntax for Pivot
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21187 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2840/ Test PASSed.
[GitHub] spark issue #21187: [SPARK-24035][SQL] SQL syntax for Pivot
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21187 Merged build finished. Test PASSed.
[GitHub] spark issue #21187: [SPARK-24035][SQL] SQL syntax for Pivot
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21187 **[Test build #90094 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90094/testReport)** for PR 21187 at commit [`c7eacf5`](https://github.com/apache/spark/commit/c7eacf50ee3ea0e7da34ceb521aeb13ed5f5f84c).
[GitHub] spark pull request #21073: [SPARK-23936][SQL] Implement map_concat
Github user ueshin commented on a diff in the pull request: https://github.com/apache/spark/pull/21073#discussion_r185694644

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala ---

@@ -116,6 +117,169 @@ case class MapValues(child: Expression)

      override def prettyName: String = "map_values"
    }

    /**
     * Returns the union of all the given maps.
     */
    @ExpressionDescription(
      usage = "_FUNC_(map, ...) - Returns the union of all the given maps",
      examples = """
        Examples:
          > SELECT _FUNC_(map(1, 'a', 2, 'b'), map(2, 'c', 3, 'd'));
           [[1 -> "a"], [2 -> "c"], [3 -> "d"]]
      """, since = "2.4.0")
    case class MapConcat(children: Seq[Expression]) extends Expression {

      override def checkInputDataTypes(): TypeCheckResult = {
        // check key types and value types separately to allow valueContainsNull to vary
        if (children.exists(!_.dataType.isInstanceOf[MapType])) {
          TypeCheckResult.TypeCheckFailure(
            s"The given input of function $prettyName should all be of type map, " +
              "but they are " + children.map(_.dataType.simpleString).mkString("[", ", ", "]"))
        } else if (children.map(_.dataType.asInstanceOf[MapType].keyType)
            .exists(_.isInstanceOf[MapType])) {
          // map_concat needs to pick a winner when multiple maps contain the same key. map_concat
          // can do that only if it can detect when two keys are the same. SPARK-9415 states
          // "map type should not support equality, hash". As a result, map_concat does not
          // support a map type as a key
          TypeCheckResult.TypeCheckFailure(
            s"The given input maps of function $prettyName cannot have a map type as a key")
        } else if (children.map(_.dataType.asInstanceOf[MapType].keyType).distinct.length > 1) {
          TypeCheckResult.TypeCheckFailure(
            s"The given input maps of function $prettyName should all be the same type, " +
              "but they are " + children.map(_.dataType.simpleString).mkString("[", ", ", "]"))
        } else if (children.map(_.dataType.asInstanceOf[MapType].valueType).distinct.length > 1) {
          TypeCheckResult.TypeCheckFailure(
            s"The given input maps of function $prettyName should all be the same type, " +
              "but they are " + children.map(_.dataType.simpleString).mkString("[", ", ", "]"))
        } else {
          TypeCheckResult.TypeCheckSuccess
        }
      }

      override def dataType: MapType = {
        MapType(
          keyType = children.headOption
            .map(_.dataType.asInstanceOf[MapType].keyType).getOrElse(StringType),
          valueType = children.headOption
            .map(_.dataType.asInstanceOf[MapType].valueType).getOrElse(StringType),
          valueContainsNull = children.map { c =>
            c.dataType.asInstanceOf[MapType]
          }.exists(_.valueContainsNull)
        )
      }

      override def nullable: Boolean = children.exists(_.nullable)

      override def eval(input: InternalRow): Any = {
        val union = new util.LinkedHashMap[Any, Any]()
        children.map(_.eval(input)).foreach { raw =>
          if (raw == null) {
            return null
          }
          val map = raw.asInstanceOf[MapData]
          map.foreach(dataType.keyType, dataType.valueType, (k, v) =>
            union.put(k, v)
          )
        }
        val (keyArray, valueArray) = union.entrySet().toArray().map { e =>
          val e2 = e.asInstanceOf[java.util.Map.Entry[Any, Any]]
          (e2.getKey, e2.getValue)
        }.unzip
        new ArrayBasedMapData(new GenericArrayData(keyArray), new GenericArrayData(valueArray))

--- End diff --

nit: we can use `ArrayBasedMapData.apply()`.
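The `eval` implementation quoted above merges the input maps left to right into a `java.util.LinkedHashMap`, so a duplicate key takes the value from the later map while keeping its first-insertion position. This standalone snippet reproduces that merge rule for the docstring's example inputs (plain Scala pairs stand in for Spark's `MapData`; this is a sketch of the semantics, not the PR's code):

```scala
// Merge map(1,'a', 2,'b') with map(2,'c', 3,'d') the way MapConcat#eval does:
// LinkedHashMap.put on an existing key replaces the value in place.
val union = new java.util.LinkedHashMap[Int, String]()
for ((k, v) <- Seq(1 -> "a", 2 -> "b")) union.put(k, v)
for ((k, v) <- Seq(2 -> "c", 3 -> "d")) union.put(k, v)
// Duplicate key 2 keeps its original position but takes the later value "c",
// giving [1 -> "a", 2 -> "c", 3 -> "d"] as in the ExpressionDescription example.
```

This "last map wins" behavior is exactly why the type check above must reject map-typed keys: picking a winner requires key equality, which SPARK-9415 rules out for map types.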
[GitHub] spark pull request #21073: [SPARK-23936][SQL] Implement map_concat
Github user ueshin commented on a diff in the pull request: https://github.com/apache/spark/pull/21073#discussion_r185695887

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala ---

(quotes the same `MapConcat` hunk as above, continuing into `doGenCode`:)

      override def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = {
        val mapCodes = children.map(c => c.genCode(ctx))
        val keyType = dataType.keyType
        val valueType = dataType.valueType
        val mapRefArrayName = ctx.freshName("mapRefArray")
        val unionMapName = ctx.freshName("union")

        val mapDataClass = classOf[MapData].getName
        val arrayBasedMapDataClass = classOf[ArrayBasedMapData].getName
        val arrayDataClass = classOf[ArrayData].getName
        val genericArrayDataClass = classOf[GenericArrayData].getName
        val hashMapClass = classOf[util.LinkedHashMap[Any, Any]].getName
        val entryClass = classOf[util.Map.Entry[Any, Any]].getName

        val init =
          s"""
             |$mapDataClass[] $mapRefArrayName = new $mapDataClass[${mapCodes.size}];
             |boolean ${ev.isNull} = false;
             |$mapDataClass ${ev.value} = null;
           """.stripMargin

        val assignments = mapCodes.zipWithIndex.map { case (m, i) =>
          val initCode = mapCodes(i).code
[GitHub] spark pull request #21073: [SPARK-23936][SQL] Implement map_concat
Github user ueshin commented on a diff in the pull request: https://github.com/apache/spark/pull/21073#discussion_r185695875

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala ---

(quotes the same `MapConcat` hunk as above)
[GitHub] spark pull request #21073: [SPARK-23936][SQL] Implement map_concat
Github user ueshin commented on a diff in the pull request: https://github.com/apache/spark/pull/21073#discussion_r185696085 --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala --- @@ -116,6 +117,169 @@ case class MapValues(child: Expression) override def prettyName: String = "map_values" } +/** + * Returns the union of all the given maps. + */ +@ExpressionDescription( +usage = "_FUNC_(map, ...) - Returns the union of all the given maps", +examples = """ +Examples: + > SELECT _FUNC_(map(1, 'a', 2, 'b'), map(2, 'c', 3, 'd')); + [[1 -> "a"], [2 -> "c"], [3 -> "d"] + """, since = "2.4.0") +case class MapConcat(children: Seq[Expression]) extends Expression { + + override def checkInputDataTypes(): TypeCheckResult = { +// check key types and value types separately to allow valueContainsNull to vary +if (children.exists(!_.dataType.isInstanceOf[MapType])) { + TypeCheckResult.TypeCheckFailure( +s"The given input of function $prettyName should all be of type map, " + + "but they are " + children.map(_.dataType.simpleString).mkString("[", ", ", "]")) +} else if (children.map(_.dataType.asInstanceOf[MapType].keyType) + .exists(_.isInstanceOf[MapType])) { + // map_concat needs to pick a winner when multiple maps contain the same key. map_concat + // can do that only if it can detect when two keys are the same. SPARK-9415 states "map type + // should not support equality, hash". 
As a result, map_concat does not support a map type + // as a key + TypeCheckResult.TypeCheckFailure( +s"The given input maps of function $prettyName cannot have a map type as a key") +} else if (children.map(_.dataType.asInstanceOf[MapType].keyType).distinct.length > 1) { + TypeCheckResult.TypeCheckFailure( +s"The given input maps of function $prettyName should all be the same type, " + + "but they are " + children.map(_.dataType.simpleString).mkString("[", ", ", "]")) +} else if (children.map(_.dataType.asInstanceOf[MapType].valueType).distinct.length > 1) { + TypeCheckResult.TypeCheckFailure( +s"The given input maps of function $prettyName should all be the same type, " + + "but they are " + children.map(_.dataType.simpleString).mkString("[", ", ", "]")) +} else { + TypeCheckResult.TypeCheckSuccess +} + } + + override def dataType: MapType = { +MapType( + keyType = children.headOption + .map(_.dataType.asInstanceOf[MapType].keyType).getOrElse(StringType), + valueType = children.headOption + .map(_.dataType.asInstanceOf[MapType].valueType).getOrElse(StringType), + valueContainsNull = children.map { c => +c.dataType.asInstanceOf[MapType] + }.exists(_.valueContainsNull) +) + } + + override def nullable: Boolean = children.exists(_.nullable) + + override def eval(input: InternalRow): Any = { +val union = new util.LinkedHashMap[Any, Any]() +children.map(_.eval(input)).foreach { raw => + if (raw == null) { +return null + } + val map = raw.asInstanceOf[MapData] + map.foreach(dataType.keyType, dataType.valueType, (k, v) => +union.put(k, v) + ) +} +val (keyArray, valueArray) = union.entrySet().toArray().map { e => + val e2 = e.asInstanceOf[java.util.Map.Entry[Any, Any]] + (e2.getKey, e2.getValue) +}.unzip +new ArrayBasedMapData(new GenericArrayData(keyArray), new GenericArrayData(valueArray)) + } + + override def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = { +val mapCodes = children.map(c => c.genCode(ctx)) +val keyType = dataType.keyType +val valueType = 
dataType.valueType +val mapRefArrayName = ctx.freshName("mapRefArray") +val unionMapName = ctx.freshName("union") + +val mapDataClass = classOf[MapData].getName +val arrayBasedMapDataClass = classOf[ArrayBasedMapData].getName +val arrayDataClass = classOf[ArrayData].getName +val genericArrayDataClass = classOf[GenericArrayData].getName +val hashMapClass = classOf[util.LinkedHashMap[Any, Any]].getName +val entryClass = classOf[util.Map.Entry[Any, Any]].getName + +val init = + s""" +|$mapDataClass[] $mapRefArrayName = new $mapDataClass[${mapCodes.size}]; +|boolean ${ev.isNull} = false; +|$mapDataClass ${ev.value} = null; + """.stripMargin + +val assignments = mapCodes.zipWithIndex.map { case (m, i) => + val initCode = mapCodes(i).code +
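For readers following the diff above: the `eval` path unions the input maps into a `java.util.LinkedHashMap`, so a later map's value wins for a duplicate key while the key keeps its original insertion position. A standalone Java sketch of just that semantics (this is an illustration, not the Spark code; `MapConcatSketch` and `concat` are invented names):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MapConcatSketch {
    // Union maps left-to-right; for a duplicate key the later map's value
    // overwrites the earlier one, mirroring the LinkedHashMap.put calls
    // in the quoted MapConcat.eval.
    @SafeVarargs
    static <K, V> Map<K, V> concat(Map<K, V>... maps) {
        LinkedHashMap<K, V> union = new LinkedHashMap<>();
        for (Map<K, V> m : maps) {
            if (m == null) {
                return null; // any null input makes the whole result null
            }
            union.putAll(m);
        }
        return union;
    }

    public static void main(String[] args) {
        Map<Integer, String> a = new LinkedHashMap<>();
        a.put(1, "a"); a.put(2, "b");
        Map<Integer, String> b = new LinkedHashMap<>();
        b.put(2, "c"); b.put(3, "d");
        System.out.println(concat(a, b)); // {1=a, 2=c, 3=d}
    }
}
```

Note that `LinkedHashMap` preserves first-insertion order even when a key is re-put, which matches the example in the `@ExpressionDescription` docstring.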
[GitHub] spark issue #21219: [SPARK-24160] ShuffleBlockFetcherIterator should fail if...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21219 Merged build finished. Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21219: [SPARK-24160] ShuffleBlockFetcherIterator should fail if...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21219 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2839/ Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21215: [SPARK-24148][SQL] Overloading array function to support...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21215 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90086/ Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21215: [SPARK-24148][SQL] Overloading array function to support...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21215 Merged build finished. Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21215: [SPARK-24148][SQL] Overloading array function to support...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21215 **[Test build #90086 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90086/testReport)** for PR 21215 at commit [`9c12457`](https://github.com/apache/spark/commit/9c124574a3fefe2e63dcd95bd03e47f1f8d5071a). * This patch **fails PySpark unit tests**. * This patch merges cleanly. * This patch adds no public classes. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21219: [SPARK-24160] ShuffleBlockFetcherIterator should fail if...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21219 **[Test build #90093 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90093/testReport)** for PR 21219 at commit [`41d06e1`](https://github.com/apache/spark/commit/41d06e13d0f95f1dd146b6b512a0becc88eb2caa). --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21219: [SPARK-24160] ShuffleBlockFetcherIterator should fail if...
Github user JoshRosen commented on the issue: https://github.com/apache/spark/pull/21219 jenkins retest this please --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21193: [SPARK-24121][SQL][WIP] Add API for handling expression ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21193 Merged build finished. Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21193: [SPARK-24121][SQL][WIP] Add API for handling expression ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21193 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2838/ Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21164: [SPARK-24098][SQL] ScriptTransformationExec should wait ...
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/21164 cc @dongjoon-hyun @gatorsmile --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21193: [SPARK-24121][SQL][WIP] Add API for handling expression ...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21193 **[Test build #90092 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90092/testReport)** for PR 21193 at commit [`5fe425c`](https://github.com/apache/spark/commit/5fe425c2d2837f00bdfe9ba5e6f446829fba32c1). --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user dongjoon-hyun commented on the issue: https://github.com/apache/spark/pull/21210 Thank you for review, @cloud-fan ! --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21193: [SPARK-24121][SQL][WIP] Add API for handling expression ...
Github user maropu commented on the issue: https://github.com/apache/spark/pull/21193 retest this please. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21073: [SPARK-23936][SQL] Implement map_concat
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21073 **[Test build #90091 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90091/testReport)** for PR 21073 at commit [`77ae014`](https://github.com/apache/spark/commit/77ae014bea3d9fbc20fbd11d5508c3606e26343d). --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #21164: [SPARK-24098][SQL] ScriptTransformationExec shoul...
Github user liutang123 commented on a diff in the pull request: https://github.com/apache/spark/pull/21164#discussion_r185693669 --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala --- @@ -137,13 +137,12 @@ case class ScriptTransformationExec( throw writerThread.exception.get } - if (!proc.isAlive) { -val exitCode = proc.exitValue() -if (exitCode != 0) { - logError(stderrBuffer.toString) // log the stderr circular buffer - throw new SparkException(s"Subprocess exited with status $exitCode. " + -s"Error: ${stderrBuffer.toString}", cause) -} + proc.waitFor() --- End diff -- Although writerThread._exception is a volatile variable, writerThread.exception may sometimes be called before writerThread._exception has been assigned, because the reader thread and the writer thread run in parallel. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
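The race liutang123 describes can be sketched in plain Java: a volatile field guarantees visibility of a write, not that the write has already happened when the reader looks. Joining the writer thread before reading the field closes the race. (This is an illustrative sketch with invented names, not the Spark code.)

```java
public class WriterRaceSketch {
    static class Writer extends Thread {
        // volatile gives visibility, but the reader can still observe null
        // if it checks before run() reaches the assignment.
        volatile Throwable exception = null;

        @Override public void run() {
            try {
                throw new RuntimeException("write failed");
            } catch (Throwable t) {
                exception = t;
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Writer w = new Writer();
        w.start();
        w.join(); // without this join, w.exception could still be null here
        System.out.println(w.exception.getMessage()); // write failed
    }
}
```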
[GitHub] spark pull request #21010: [SPARK-23900][SQL] format_number support user spe...
Github user ueshin commented on a diff in the pull request: https://github.com/apache/spark/pull/21010#discussion_r185691812 --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala --- @@ -2108,35 +2133,53 @@ case class FormatNumber(x: Expression, d: Expression) // SPARK-13515: US Locale configures the DecimalFormat object to use a dot ('.') // as a decimal separator. val usLocale = "US" - val i = ctx.freshName("i") - val dFormat = ctx.freshName("dFormat") - val lastDValue = -ctx.addMutableState(CodeGenerator.JAVA_INT, "lastDValue", v => s"$v = -100;") - val pattern = ctx.addMutableState(sb, "pattern", v => s"$v = new $sb();") val numberFormat = ctx.addMutableState(df, "numberFormat", v => s"""$v = new $df("", new $dfs($l.$usLocale));""") - s""" -if ($d >= 0) { - $pattern.delete(0, $pattern.length()); - if ($d != $lastDValue) { -$pattern.append("#,###,###,###,###,###,##0"); - -if ($d > 0) { - $pattern.append("."); - for (int $i = 0; $i < $d; $i++) { -$pattern.append("0"); + right.dataType match { +case IntegerType => + val pattern = ctx.addMutableState(sb, "pattern", v => s"$v = new $sb();") + val i = ctx.freshName("i") + val lastDIntValue = +ctx.addMutableState(CodeGenerator.JAVA_INT, "lastDValue", v => s"$v = -100;") + s""" +if ($d >= 0) { + $pattern.delete(0, $pattern.length()); + if ($d != $lastDIntValue) { +$pattern.append("$defaultFormat"); + +if ($d > 0) { + $pattern.append("."); + for (int $i = 0; $i < $d; $i++) { +$pattern.append("0"); + } +} +$lastDIntValue = $d; +$numberFormat.applyLocalizedPattern($pattern.toString()); } + ${ev.value} = UTF8String.fromString($numberFormat.format(${typeHelper(num)})); +} else { + ${ev.value} = null; + ${ev.isNull} = true; } -$lastDValue = $d; -$numberFormat.applyLocalizedPattern($pattern.toString()); - } - ${ev.value} = UTF8String.fromString($numberFormat.format(${typeHelper(num)})); -} else { - ${ev.value} = null; - ${ev.isNull} = true; -} - """ + """ +case StringType => + 
val lastDStringValue = +ctx.addMutableState("String", "lastDValue", v => s"""$v = "$defaultFormat";""") + val dValue = ctx.addMutableState("String", "dValue") --- End diff -- Do we need to make this mutable state? --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #21010: [SPARK-23900][SQL] format_number support user spe...
Github user ueshin commented on a diff in the pull request: https://github.com/apache/spark/pull/21010#discussion_r185691861 --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala --- @@ -2108,35 +2133,53 @@ case class FormatNumber(x: Expression, d: Expression) // SPARK-13515: US Locale configures the DecimalFormat object to use a dot ('.') // as a decimal separator. val usLocale = "US" - val i = ctx.freshName("i") - val dFormat = ctx.freshName("dFormat") - val lastDValue = -ctx.addMutableState(CodeGenerator.JAVA_INT, "lastDValue", v => s"$v = -100;") - val pattern = ctx.addMutableState(sb, "pattern", v => s"$v = new $sb();") val numberFormat = ctx.addMutableState(df, "numberFormat", v => s"""$v = new $df("", new $dfs($l.$usLocale));""") - s""" -if ($d >= 0) { - $pattern.delete(0, $pattern.length()); - if ($d != $lastDValue) { -$pattern.append("#,###,###,###,###,###,##0"); - -if ($d > 0) { - $pattern.append("."); - for (int $i = 0; $i < $d; $i++) { -$pattern.append("0"); + right.dataType match { +case IntegerType => + val pattern = ctx.addMutableState(sb, "pattern", v => s"$v = new $sb();") + val i = ctx.freshName("i") + val lastDIntValue = +ctx.addMutableState(CodeGenerator.JAVA_INT, "lastDValue", v => s"$v = -100;") + s""" +if ($d >= 0) { + $pattern.delete(0, $pattern.length()); + if ($d != $lastDIntValue) { +$pattern.append("$defaultFormat"); + +if ($d > 0) { + $pattern.append("."); + for (int $i = 0; $i < $d; $i++) { +$pattern.append("0"); + } +} +$lastDIntValue = $d; +$numberFormat.applyLocalizedPattern($pattern.toString()); } + ${ev.value} = UTF8String.fromString($numberFormat.format(${typeHelper(num)})); +} else { + ${ev.value} = null; + ${ev.isNull} = true; } -$lastDValue = $d; -$numberFormat.applyLocalizedPattern($pattern.toString()); - } - ${ev.value} = UTF8String.fromString($numberFormat.format(${typeHelper(num)})); -} else { - ${ev.value} = null; - ${ev.isNull} = true; -} - """ + """ +case StringType => + 
val lastDStringValue = +ctx.addMutableState("String", "lastDValue", v => s"""$v = "$defaultFormat";""") + val dValue = ctx.addMutableState("String", "dValue") + s""" +$dValue = $d.toString(); +if (!$dValue.equals($lastDStringValue)) { --- End diff -- What if the first `dValue` is the same as the default format? Can you add the test case? --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
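The generated code being reviewed above caches a `DecimalFormat` and only rebuilds its pattern when `d` changes. The pattern-building step itself can be sketched in plain Java, without the caching, to show what the grouped-integer prefix plus `d` trailing zeros produces (a simplified sketch; the method name is invented):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class FormatNumberSketch {
    // Build the same kind of pattern the generated code assembles:
    // "#,###,###,###,###,###,##0" plus "." and d zeros when d > 0,
    // applied with US symbols so '.' is the decimal separator.
    static String format(double x, int d) {
        StringBuilder pattern = new StringBuilder("#,###,###,###,###,###,##0");
        if (d > 0) {
            pattern.append(".");
            for (int i = 0; i < d; i++) {
                pattern.append("0");
            }
        }
        DecimalFormat nf = new DecimalFormat("", new DecimalFormatSymbols(Locale.US));
        nf.applyLocalizedPattern(pattern.toString());
        return nf.format(x);
    }

    public static void main(String[] args) {
        System.out.println(format(12345.6789, 2)); // 12,345.68
        System.out.println(format(1234567.0, 0));  // 1,234,567
    }
}
```

The mutable-state caching in the generated code exists so this pattern rebuild (and `applyLocalizedPattern`) is skipped on every row where `d` is unchanged.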
[GitHub] spark issue #21169: [SPARK-23715][SQL] the input of to/from_utc_timestamp ca...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21169 Merged build finished. Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21169: [SPARK-23715][SQL] the input of to/from_utc_timestamp ca...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21169 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90085/ Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21169: [SPARK-23715][SQL] the input of to/from_utc_timestamp ca...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21169 **[Test build #90085 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90085/testReport)** for PR 21169 at commit [`b6d91db`](https://github.com/apache/spark/commit/b6d91db2fd71b50389cf3647a31eefc83d5dbc44). * This patch **fails SparkR unit tests**. * This patch merges cleanly. * This patch adds no public classes. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #21165: [Spark-20087][CORE] Attach accumulators / metrics...
Github user advancedxy commented on a diff in the pull request: https://github.com/apache/spark/pull/21165#discussion_r185690094 --- Diff: core/src/main/scala/org/apache/spark/TaskEndReason.scala --- @@ -212,9 +212,15 @@ case object TaskResultLost extends TaskFailedReason { * Task was killed intentionally and needs to be rescheduled. */ @DeveloperApi -case class TaskKilled(reason: String) extends TaskFailedReason { +case class TaskKilled( +reason: String, +accumUpdates: Seq[AccumulableInfo] = Seq.empty, +private[spark] val accums: Seq[AccumulatorV2[_, _]] = Nil) --- End diff -- I'm ok with not keeping `Seq[AccumulableInfo]`. But it means inconsistent logic and API, and may make future refactoring a bit difficult. Let's see what I can do. > I'd like to not keep the Seq[AccumulableInfo], we may deprecate it in the existing APIs in the near future. BTW, I think we have already deprecated `AccumulableInfo`. Unless we are planning to remove it in Spark 3.0 and Spark 3.0 is the next release, `AccumulableInfo` will be there for a long time. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21187: [SPARK-24035][SQL] SQL syntax for Pivot
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21187 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90084/ Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21187: [SPARK-24035][SQL] SQL syntax for Pivot
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21187 Merged build finished. Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21187: [SPARK-24035][SQL] SQL syntax for Pivot
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21187 **[Test build #90084 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90084/testReport)** for PR 21187 at commit [`edd0eb1`](https://github.com/apache/spark/commit/edd0eb1f879423a436c33164e8777252a9c0e1da). * This patch **fails SparkR unit tests**. * This patch merges cleanly. * This patch adds no public classes. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21010: [SPARK-23900][SQL] format_number support user specifed f...
Github user wangyum commented on the issue: https://github.com/apache/spark/pull/21010 retest please. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21021: [SPARK-23921][SQL] Add array_sort function
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21021 Merged build finished. Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21021: [SPARK-23921][SQL] Add array_sort function
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21021 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90082/ Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21021: [SPARK-23921][SQL] Add array_sort function
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21021 **[Test build #90082 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90082/testReport)** for PR 21021 at commit [`9f63a76`](https://github.com/apache/spark/commit/9f63a766dc7308c564a7d59cbad58ee8c0a15faa). * This patch **fails SparkR unit tests**. * This patch merges cleanly. * This patch adds no public classes. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21193: [SPARK-24121][SQL][WIP] Add API for handling expression ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21193 Merged build finished. Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21193: [SPARK-24121][SQL][WIP] Add API for handling expression ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21193 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90083/ Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21193: [SPARK-24121][SQL][WIP] Add API for handling expression ...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21193 **[Test build #90083 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90083/testReport)** for PR 21193 at commit [`5fe425c`](https://github.com/apache/spark/commit/5fe425c2d2837f00bdfe9ba5e6f446829fba32c1). * This patch **fails SparkR unit tests**. * This patch merges cleanly. * This patch adds the following public classes _(experimental)_: * `case class ExprCode(var code: Block, var isNull: ExprValue, var value: ExprValue)` * `trait Block extends JavaCode ` * ` implicit class BlockHelper(val sc: StringContext) extends AnyVal ` * `case class CodeBlock(codeParts: Seq[String], exprValues: Seq[Any]) extends Block ` * `case class Blocks(blocks: Seq[Block]) extends Block ` * `trait ExprValue extends JavaCode ` --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #21165: [Spark-20087][CORE] Attach accumulators / metrics...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/21165#discussion_r185688449 --- Diff: core/src/main/scala/org/apache/spark/TaskEndReason.scala --- @@ -212,9 +212,15 @@ case object TaskResultLost extends TaskFailedReason { * Task was killed intentionally and needs to be rescheduled. */ @DeveloperApi -case class TaskKilled(reason: String) extends TaskFailedReason { +case class TaskKilled( +reason: String, +accumUpdates: Seq[AccumulableInfo] = Seq.empty, +private[spark] val accums: Seq[AccumulatorV2[_, _]] = Nil) --- End diff -- now the question is: shall we keep the unnecessary Seq[AccumulableInfo] in new APIs, to make the API consistent? I'd like to not keep the Seq[AccumulableInfo], we may deprecate it in the existing APIs in the near future. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21210 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2837/ Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21210 Merged build finished. Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21210 **[Test build #90090 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90090/testReport)** for PR 21210 at commit [`51d4c0e`](https://github.com/apache/spark/commit/51d4c0ed72c15893a112c39d9e360e4cfabe6a62). --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/21210 LGTM --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21210: [SPARK-23489][SQL][TEST] HiveExternalCatalogVersionsSuit...
Github user cloud-fan commented on the issue: https://github.com/apache/spark/pull/21210 retest this please --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21222: [SPARK-24161][SS] Enable debug package feature on struct...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21222 Can one of the admins verify this patch? --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #21214: [SPARK-23775][TEST] Make DataFrameRangeSuite not ...
Github user cloud-fan commented on a diff in the pull request: https://github.com/apache/spark/pull/21214#discussion_r185687677 --- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameRangeSuite.scala --- @@ -153,23 +153,17 @@ class DataFrameRangeSuite extends QueryTest with SharedSQLContext with Eventuall test("Cancelling stage in a query with Range.") { val listener = new SparkListener { - override def onJobStart(jobStart: SparkListenerJobStart): Unit = { -eventually(timeout(10.seconds), interval(1.millis)) { - assert(DataFrameRangeSuite.stageToKill > 0) -} -sparkContext.cancelStage(DataFrameRangeSuite.stageToKill) + override def onTaskStart(taskStart: SparkListenerTaskStart): Unit = { +sparkContext.cancelStage(taskStart.stageId) } } sparkContext.addSparkListener(listener) for (codegen <- Seq(true, false)) { withSQLConf(SQLConf.WHOLESTAGE_CODEGEN_ENABLED.key -> codegen.toString()) { -DataFrameRangeSuite.stageToKill = -1 val ex = intercept[SparkException] { - spark.range(0, 1000L, 1, 1).map { x => -DataFrameRangeSuite.stageToKill = TaskContext.get().stageId() -x - }.toDF("id").agg(sum("id")).collect() + spark.range(0, 1000L, 1, 1) --- End diff -- +1 --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21217: [SPARK-24151][SQL] Fix CURRENT_DATE, CURRENT_TIMESTAMP t...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21217 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90081/ Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21217: [SPARK-24151][SQL] Fix CURRENT_DATE, CURRENT_TIMESTAMP t...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21217 Merged build finished. Test FAILed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark pull request #21222: [SPARK-24161][SS] Enable debug package feature on...
GitHub user HeartSaVioR opened a pull request: https://github.com/apache/spark/pull/21222 [SPARK-24161][SS] Enable debug package feature on structured streaming ## What changes were proposed in this pull request? Currently, the debug package has an implicit class "DebugQuery" which matches Dataset to provide debug features on the Dataset class. It doesn't work with structured streaming: it requires the query to be already started, and the information can be retrieved from StreamingQuery, not Dataset. I guess that's why "explain" had to be placed on StreamingQuery even though it already exists on Dataset. This patch adds a new implicit class "DebugStreamQuery" which matches StreamingQuery to provide similar debug features on the StreamingQuery class. ## How was this patch tested? Added relevant unit tests. You can merge this pull request into a Git repository by running: $ git pull https://github.com/HeartSaVioR/spark SPARK-24161 Alternatively you can review and apply these changes as the patch at: https://github.com/apache/spark/pull/21222.patch To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message: This closes #21222 commit c1ad1c557e6165455457adb6f148d6d9616548a1 Author: Jungtaek Lim Date: 2018-05-03T02:26:48Z SPARK-24161 Enable debug package feature on structured streaming * added implicit class which adds debug features for StreamingQuery * added unit tests for new functionalities --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21217: [SPARK-24151][SQL] Fix CURRENT_DATE, CURRENT_TIMESTAMP t...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21217 **[Test build #90081 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90081/testReport)** for PR 21217 at commit [`662dd2e`](https://github.com/apache/spark/commit/662dd2e1dcd9de2ada87811e116b420f138cfa13). * This patch **fails SparkR unit tests**. * This patch merges cleanly. * This patch adds no public classes. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21190: [SPARK-22938][SQL][followup] Assert that SQLConf.get is ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21190 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2836/ Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21190: [SPARK-22938][SQL][followup] Assert that SQLConf.get is ...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21190 Merged build finished. Test PASSed. --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
[GitHub] spark issue #21219: [SPARK-24160] ShuffleBlockFetcherIterator should fail if...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21219 Merged build finished. Test FAILed.
[GitHub] spark issue #21219: [SPARK-24160] ShuffleBlockFetcherIterator should fail if...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21219 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90077/
[GitHub] spark issue #21216: [SPARK-24149][YARN] Retrieve all federated namespaces to...
Github user jerryshao commented on the issue: https://github.com/apache/spark/pull/21216 I'm not so familiar with federated HDFS, but is it transparent to downstream applications like Spark, or does Spark need to know all the configured NNs? If it is transparent, then I think the token acquisition mechanism should also be transparent, and Spark wouldn't need to know all the configured NNs.
[GitHub] spark issue #21190: [SPARK-22938][SQL][followup] Assert that SQLConf.get is ...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21190 **[Test build #90089 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90089/testReport)** for PR 21190 at commit [`0503118`](https://github.com/apache/spark/commit/0503118d6a9109021d360291a9913bff205b9fe4).
[GitHub] spark issue #21219: [SPARK-24160] ShuffleBlockFetcherIterator should fail if...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21219 **[Test build #90077 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90077/testReport)** for PR 21219 at commit [`41d06e1`](https://github.com/apache/spark/commit/41d06e13d0f95f1dd146b6b512a0becc88eb2caa).
* This patch **fails SparkR unit tests**.
* This patch merges cleanly.
* This patch adds no public classes.
[GitHub] spark issue #21213: [SPARK-24120] Show `Jobs` page when `jobId` is missing
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21213 Test PASSed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/2835/
[GitHub] spark issue #21213: [SPARK-24120] Show `Jobs` page when `jobId` is missing
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21213 Merged build finished. Test PASSed.
[GitHub] spark issue #21213: [SPARK-24120] Show `Jobs` page when `jobId` is missing
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21213 **[Test build #90088 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90088/testReport)** for PR 21213 at commit [`5e7d4a3`](https://github.com/apache/spark/commit/5e7d4a34cf4103300b4fd083ff709866edc8c2d4).
[GitHub] spark issue #21220: [SPARK-24157][SS] Enabled no-data batches in MicroBatchE...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21220 Merged build finished. Test FAILed.
[GitHub] spark issue #21220: [SPARK-24157][SS] Enabled no-data batches in MicroBatchE...
Github user AmplabJenkins commented on the issue: https://github.com/apache/spark/pull/21220 Test FAILed. Refer to this link for build results (access rights to CI server needed): https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/90078/
[GitHub] spark issue #21221: [SPARK-23429][CORE] Add executor memory metrics to heart...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21221 **[Test build #90087 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90087/testReport)** for PR 21221 at commit [`ad10d28`](https://github.com/apache/spark/commit/ad10d2814bbfbaf8c21fcbb1abe83ef7a8e9ffe7).
[GitHub] spark issue #21220: [SPARK-24157][SS] Enabled no-data batches in MicroBatchE...
Github user SparkQA commented on the issue: https://github.com/apache/spark/pull/21220 **[Test build #90078 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/90078/testReport)** for PR 21220 at commit [`7fa11c0`](https://github.com/apache/spark/commit/7fa11c0ac362ace43ce02dee6309a3a632b0c3ee).
* This patch **fails SparkR unit tests**.
* This patch merges cleanly.
* This patch adds the following public classes _(experimental)_:
  * `class WatermarkTracker extends Logging`
  * `trait MemorySinkBase extends BaseStreamingSink`
  * `class MemorySink(val schema: StructType, outputMode: OutputMode) extends Sink`
  * `class MemorySinkV2 extends DataSourceV2 with StreamWriteSupport with MemorySinkBase with Logging`
[GitHub] spark issue #21213: [SPARK-24120] Show `Jobs` page when `jobId` is missing
Github user jongyoul commented on the issue: https://github.com/apache/spark/pull/21213 test this please
[GitHub] spark issue #20940: [SPARK-23429][CORE] Add executor memory metrics to heart...
Github user squito commented on the issue: https://github.com/apache/spark/pull/20940 thanks, can you close this pull request now?