[GitHub] spark pull request #22685: [SQL][MINOR][Refactor] Refactor on sql/core
Github user sadhen closed the pull request at: https://github.com/apache/spark/pull/22685
Github user HyukjinKwon commented on a diff in the pull request: https://github.com/apache/spark/pull/22685#discussion_r224317853

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSource.scala ---
@@ -96,7 +95,7 @@ case class DataSource(
   private val caseInsensitiveOptions = CaseInsensitiveMap(options)
   private val equality = sparkSession.sessionState.conf.resolver
-  bucketSpec.map { bucket =>
+  bucketSpec.foreach { bucket =>
--- End diff ---

Yea, this is a legitimate change.
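The idiom behind this change, sketched outside Spark (the `Option[Int]` here is a hypothetical stand-in for the real `bucketSpec: Option[BucketSpec]`): calling `map` on an `Option` purely for a side effect builds an `Option[Unit]` that is immediately discarded, whereas `foreach` returns `Unit` and states the intent directly.

```scala
// Minimal sketch of the map-vs-foreach idiom; not Spark code.
object MapVsForeach {
  def main(args: Array[String]): Unit = {
    val bucketSpec: Option[Int] = Some(4)

    // `map` wraps the side effect's Unit result in an Option that nobody reads.
    val discarded: Option[Unit] = bucketSpec.map { n => println(s"validating $n buckets") }

    // `foreach` runs the same side effect and returns Unit -- the intended shape.
    bucketSpec.foreach { n => println(s"validating $n buckets") }
  }
}
```

Both forms behave identically at runtime; the change is about signaling that the result is never used.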
Github user sadhen commented on a diff in the pull request: https://github.com/apache/spark/pull/22685#discussion_r224054521

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/expressions/UserDefinedFunction.scala ---
@@ -147,7 +147,7 @@ private[sql] object SparkUserDefinedFunction {
     f: AnyRef,
     dataType: DataType,
     inputSchemas: Option[Seq[ScalaReflection.Schema]]): UserDefinedFunction = {
-  val udf = new UserDefinedFunction(f, dataType, inputSchemas.map(_.map(_.dataType)))
+  val udf = UserDefinedFunction(f, dataType, inputSchemas.map(_.map(_.dataType)))
--- End diff ---

Redundant `new` for a case class.
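The point here, in a self-contained sketch (the `UdfLike` case class is hypothetical, standing in for `UserDefinedFunction`): every case class gets a synthesized companion `apply`, so `new` is legal but redundant noise.

```scala
// Case classes get a companion `apply`, so `new` is unnecessary (sketch, not Spark code).
case class UdfLike(name: String, arity: Int)

object CaseClassApply {
  def main(args: Array[String]): Unit = {
    val a = new UdfLike("f", 1) // compiles, but the `new` adds nothing
    val b = UdfLike("f", 1)     // companion apply; same construction semantics
    println(a == b)             // case-class equality is structural
  }
}
```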
Github user sadhen commented on a diff in the pull request: https://github.com/apache/spark/pull/22685#discussion_r224054362

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/metric/SQLMetrics.scala ---
@@ -95,7 +95,7 @@ object SQLMetrics {
   def createMetric(sc: SparkContext, name: String): SQLMetric = {
     val acc = new SQLMetric(SUM_METRIC)
-    acc.register(sc, name = Some(name), countFailedValues = false)
+    acc.register(sc, name = Some(name))
--- End diff ---

Redundant argument: `countFailedValues = false` is already the default.
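The underlying idiom, sketched with a hypothetical `MetricLike` class rather than Spark's real accumulator API: explicitly passing a parameter the same value as its declared default changes nothing, so dropping it shortens the call site without altering behavior.

```scala
// Sketch of the default-argument point; `MetricLike.register` is hypothetical,
// loosely modeled on an accumulator-style register method.
class MetricLike {
  var registeredName: Option[String] = None
  var countFailed: Boolean = true

  def register(name: Option[String] = None, countFailedValues: Boolean = false): Unit = {
    registeredName = name
    countFailed = countFailedValues
  }
}

object DefaultArgs {
  def main(args: Array[String]): Unit = {
    val m = new MetricLike
    m.register(name = Some("rows")) // countFailedValues = false is implied by the default
    println(m.countFailed)
  }
}
```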
Github user sadhen commented on a diff in the pull request: https://github.com/apache/spark/pull/22685#discussion_r224053894

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/KeyValueGroupedDataset.scala ---
@@ -567,10 +567,10 @@ class KeyValueGroupedDataset[K, V] private[sql](
   override def toString: String = {
     val builder = new StringBuilder
     val kFields = kExprEnc.schema.map {
-      case f => s"${f.name}: ${f.dataType.simpleString(2)}"
+      f => s"${f.name}: ${f.dataType.simpleString(2)}"
--- End diff ---

This need not be a partial function; a plain function literal suffices.
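A standalone sketch of the idiom (the `Field` case class is hypothetical, standing in for a schema field): a `case`-only block compiles to a pattern-matching anonymous function, which is only warranted when there is something to destructure or filter. When the pattern is a bare binder, a plain function literal expresses the same thing.

```scala
// Pattern-matching anonymous function vs. plain function literal (sketch, not Spark code).
object PlainLambda {
  case class Field(name: String, tpe: String)

  def main(args: Array[String]): Unit = {
    val schema = Seq(Field("id", "bigint"), Field("name", "string"))

    // `case f => ...` builds a pattern match that matches everything -- unnecessary here.
    val viaCase = schema.map { case f => s"${f.name}: ${f.tpe}" }

    // Equivalent plain function literal.
    val viaLambda = schema.map { f => s"${f.name}: ${f.tpe}" }

    println(viaCase == viaLambda)
  }
}
```

A `case` block does earn its keep when it destructures, e.g. `pairs.map { case (k, v) => s"$k=$v" }`.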
Github user sadhen commented on a diff in the pull request: https://github.com/apache/spark/pull/22685#discussion_r224054020

--- Diff: sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala ---
@@ -188,7 +188,7 @@ private[sql] object SQLUtils extends Logging {
   dataType match {
     case 's' =>
       // Read StructType for DataFrame
-      val fields = SerDe.readList(dis, jvmObjectTracker = null).asInstanceOf[Array[Object]]
+      val fields = SerDe.readList(dis, jvmObjectTracker = null)
--- End diff ---

Redundant `asInstanceOf`: the method already returns the target type.
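The same point in miniature (the `readList` here is a hypothetical stand-in for the SerDe call): casting a value to the static type it already has compiles to a no-op and only adds noise.

```scala
// A cast to the value's own declared type is a no-op (sketch, not Spark code).
object RedundantCast {
  def readList(): Array[Object] = Array[Object]("a", "b")

  def main(args: Array[String]): Unit = {
    val withCast    = readList().asInstanceOf[Array[Object]] // compiles, changes nothing
    val withoutCast = readList()                             // identical static type
    println(withCast.sameElements(withoutCast))
  }
}
```

Unlike a genuine downcast, this form can never throw `ClassCastException`, which is a good sign it is doing no work.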
Github user sadhen commented on a diff in the pull request: https://github.com/apache/spark/pull/22685#discussion_r224053653

--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/StaticSQLConf.scala ---
@@ -123,7 +123,7 @@ object StaticSQLConf {
   val UI_RETAINED_EXECUTIONS = buildStaticConf("spark.sql.ui.retainedExecutions")
     .doc("Number of executions to retain in the Spark UI.")
-    .intConf
+    .longConf
--- End diff ---

This involves an implicit `int2long` cast.
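The widening the comment refers to, in isolation: assigning an `Int` value to a `Long`-typed binding goes through Scala's implicit `int2long` numeric widening. Reading the conf as a `Long` in the first place removes that conversion at `Long`-typed use sites. (The variable names below are illustrative, not Spark's.)

```scala
// Implicit Int-to-Long widening at a Long-typed use site (sketch).
object IntToLong {
  def main(args: Array[String]): Unit = {
    val retainedAsInt: Int = 1000
    val capacity: Long = retainedAsInt // implicit int2long widening happens here
    val retainedAsLong: Long = 1000L   // declared Long: no conversion at use sites
    println(capacity == retainedAsLong)
  }
}
```

The widening is lossless, so this is a readability and consistency point rather than a correctness one.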