[GitHub] spark pull request #20016: SPARK-22830 Scala Coding style has been improved ...
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/20016 --- - To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157941778

--- Diff: examples/src/main/scala/org/apache/spark/examples/HdfsTest.scala ---
@@ -39,7 +39,7 @@ object HdfsTest {
       val start = System.currentTimeMillis()
       for (x <- mapped) { x + 2 }
       val end = System.currentTimeMillis()
-      println("Iteration " + iter + " took " + (end-start) + " ms")
+      println(s"Iteration ${iter} took ${(end-start)} ms")
--- End diff --

@HyukjinKwon `$end-start` won't work: `end` and `start` are two separate variables, so the interpolator would pick up only `$end` and leave `-start` as literal text. I made the changes.
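The point about `$end-start` can be seen in a minimal, self-contained sketch (illustrative names, not Spark code): without braces, the interpolator stops at the end of the identifier, so the subtraction never happens.

```scala
object InterpolationBraces {
  def main(args: Array[String]): Unit = {
    val start = 100L
    val end = 250L
    // `$end-start` interpolates only the identifier `end`; "-start" stays literal text.
    println(s"Iteration took $end-start ms")      // prints "Iteration took 250-start ms"
    // Braces make the whole expression part of the interpolation.
    println(s"Iteration took ${end - start} ms")  // prints "Iteration took 150 ms"
  }
}
```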
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157941813

--- Diff: examples/src/main/scala/org/apache/spark/examples/SparkALS.scala ---
@@ -100,7 +100,7 @@ object SparkALS {
         ITERATIONS = iters.getOrElse("5").toInt
         slices = slices_.getOrElse("2").toInt
       case _ =>
-        System.err.println("Usage: SparkALS [M] [U] [F] [iters] [partitions]")
+        System.err.println(s"Usage: SparkALS [M] [U] [F] [iters] [partitions]")
--- End diff --

@HyukjinKwon Addressed! Kindly review.
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157816044

--- Diff: examples/src/main/scala/org/apache/spark/examples/LocalALS.scala ---
@@ -95,7 +95,7 @@ object LocalALS {
   def showWarning() {
     System.err.println(
-      """WARN: This is a naive implementation of ALS and is given as an example!
+      s"""WARN: This is a naive implementation of ALS and is given as an example!
--- End diff --

@mgaido91 Thank you for the feedback; changes addressed.
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157794109

--- Diff: examples/src/main/scala/org/apache/spark/examples/DFSReadWriteTest.scala ---
@@ -127,11 +125,11 @@ object DFSReadWriteTest {
     spark.stop()

     if (localWordCount == dfsWordCount) {
-      println(s"Success! Local Word Count ($localWordCount) " +
-        s"and DFS Word Count ($dfsWordCount) agree.")
+      println(s"Success! Local Word Count ($localWordCount)
+        and DFS Word Count ($dfsWordCount) agree.")
--- End diff --

@srowen Thanks for the review; I addressed the changes. Please review.
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157941808

--- Diff: examples/src/main/scala/org/apache/spark/examples/SparkALS.scala ---
@@ -80,7 +80,7 @@ object SparkALS {
   def showWarning() {
     System.err.println(
-      """WARN: This is a naive implementation of ALS and is given as an example!
+      s"""WARN: This is a naive implementation of ALS and is given as an example!
--- End diff --

@HyukjinKwon Addressed! Kindly review.
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157816026

--- Diff: examples/src/main/scala/org/apache/spark/examples/DFSReadWriteTest.scala ---
@@ -49,12 +49,10 @@ object DFSReadWriteTest {
   }

   private def printUsage(): Unit = {
-    val usage: String = "DFS Read-Write Test\n" +
-      "\n" +
-      "Usage: localFile dfsDir\n" +
-      "\n" +
-      "localFile - (string) local file to use in test\n" +
-      "dfsDir - (string) DFS directory for read/write tests\n"
+    val usage = s"""DFS Read-Write Test
--- End diff --

@mgaido91 Thank you for the feedback; changes addressed.
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157794196

--- Diff: examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala ---
@@ -58,10 +58,10 @@ object LocalFileLR {
     // Initialize w to a random value
     val w = DenseVector.fill(D) {2 * rand.nextDouble - 1}

-    println("Initial w: " + w)
+    println(s"Initial w: ${w}")
--- End diff --

@srowen Thanks for the review; I addressed the changes. Please review.
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157925500

--- Diff: examples/src/main/scala/org/apache/spark/examples/SparkALS.scala ---
@@ -80,7 +80,7 @@ object SparkALS {
   def showWarning() {
     System.err.println(
-      """WARN: This is a naive implementation of ALS and is given as an example!
+      s"""WARN: This is a naive implementation of ALS and is given as an example!
--- End diff --

Seems we don't need `s`.
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157925520

--- Diff: examples/src/main/scala/org/apache/spark/examples/SparkALS.scala ---
@@ -100,7 +100,7 @@ object SparkALS {
         ITERATIONS = iters.getOrElse("5").toInt
         slices = slices_.getOrElse("2").toInt
       case _ =>
-        System.err.println("Usage: SparkALS [M] [U] [F] [iters] [partitions]")
+        System.err.println(s"Usage: SparkALS [M] [U] [F] [iters] [partitions]")
--- End diff --

ditto for `s`.
Github user HyukjinKwon commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157925457

--- Diff: examples/src/main/scala/org/apache/spark/examples/HdfsTest.scala ---
@@ -39,7 +39,7 @@ object HdfsTest {
       val start = System.currentTimeMillis()
       for (x <- mapped) { x + 2 }
       val end = System.currentTimeMillis()
-      println("Iteration " + iter + " took " + (end-start) + " ms")
+      println(s"Iteration ${iter} took ${(end-start)} ms")
--- End diff --

Let's just write as `$iter` and `$end-start`
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157810171

--- Diff: examples/src/main/scala/org/apache/spark/examples/LocalALS.scala ---
@@ -95,7 +95,7 @@ object LocalALS {
   def showWarning() {
     System.err.println(
-      """WARN: This is a naive implementation of ALS and is given as an example!
+      s"""WARN: This is a naive implementation of ALS and is given as an example!
--- End diff --

Please revert the change here, and do the same in all similar places, since there is no variable to interpolate.
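The reviewers' objection can be checked with a minimal sketch (illustrative, not Spark code): with no `$`-expression in the literal, the `s` prefix produces an identical string and only obscures whether interpolation is intended.

```scala
object NoOpInterpolator {
  def main(args: Array[String]): Unit = {
    // No $-expression here, so `s` is a no-op on the result.
    val plain = "Usage: SparkALS [M] [U] [F] [iters] [partitions]"
    val interpolated = s"Usage: SparkALS [M] [U] [F] [iters] [partitions]"
    println(plain == interpolated)  // prints "true"
  }
}
```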
Github user mgaido91 commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157809630

--- Diff: examples/src/main/scala/org/apache/spark/examples/DFSReadWriteTest.scala ---
@@ -49,12 +49,10 @@ object DFSReadWriteTest {
   }

   private def printUsage(): Unit = {
-    val usage: String = "DFS Read-Write Test\n" +
-      "\n" +
-      "Usage: localFile dfsDir\n" +
-      "\n" +
-      "localFile - (string) local file to use in test\n" +
-      "dfsDir - (string) DFS directory for read/write tests\n"
+    val usage = s"""DFS Read-Write Test
--- End diff --

here you should use

```
"""
  |...
""".stripMargin
```

otherwise you introduce a lot of spaces
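The `stripMargin` suggestion can be illustrated with a small self-contained sketch (illustrative strings, not the actual `printUsage` code): a plain triple-quoted literal keeps the source indentation, while `|` margins plus `stripMargin` give flush-left lines.

```scala
object StripMarginDemo {
  def main(args: Array[String]): Unit = {
    // Without stripMargin, the source indentation becomes part of the string.
    val indented =
      """DFS Read-Write Test
        Usage: localFile dfsDir"""
    // With | margins plus stripMargin, each line starts flush left.
    val clean =
      """DFS Read-Write Test
        |Usage: localFile dfsDir""".stripMargin
    println(indented.split("\n")(1))  // second line carries the leading source indentation
    println(clean.split("\n")(1))     // prints "Usage: localFile dfsDir"
  }
}
```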
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157793283

--- Diff: examples/src/main/scala/org/apache/spark/examples/DFSReadWriteTest.scala ---
@@ -97,22 +95,22 @@ object DFSReadWriteTest {
   def main(args: Array[String]): Unit = {
     parseArgs(args)

-    println("Performing local word count")
+    println(s"Performing local word count")
--- End diff --

@srowen Thanks for the review; I addressed the changes. Please review.
Github user chetkhatri commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157792884

--- Diff: examples/src/main/scala/org/apache/spark/examples/BroadcastTest.scala ---
@@ -42,7 +42,7 @@ object BroadcastTest {
     val arr1 = (0 until num).toArray

     for (i <- 0 until 3) {
-      println("Iteration " + i)
+      println(s"Iteration ${i}")
--- End diff --

@markhamstra Thank you for the valuable suggestion; I addressed it and pushed a new commit.
Github user markhamstra commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157790330

--- Diff: examples/src/main/scala/org/apache/spark/examples/BroadcastTest.scala ---
@@ -42,7 +42,7 @@ object BroadcastTest {
     val arr1 = (0 until num).toArray

     for (i <- 0 until 3) {
-      println("Iteration " + i)
+      println(s"Iteration ${i}")
--- End diff --

Beyond the unnecessary `{ }` that @srowen has already mentioned, this isn't really a style improvement. `"a string " + anotherString` is arguably at least as good stylistically as using string interpolation for such a simple concatenation of a string reference onto the end of a string literal. It's only when there are multiple concatenations and/or multiple string references that interpolation is clearly the better way.
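The trade-off described above can be shown in a small sketch (illustrative values, not the example's actual code): both styles yield identical strings for a single trailing value, and interpolation only clearly wins once several values are substituted.

```scala
object ConcatVsInterp {
  def main(args: Array[String]): Unit = {
    val i = 3
    // For a single value appended to a literal, the two styles produce
    // identical strings; neither is clearly better.
    assert("Iteration " + i == s"Iteration $i")
    // Interpolation earns its keep once several values are substituted.
    val local = 42
    val dfs = 42
    val concat = "Local Word Count (" + local + ") and DFS Word Count (" + dfs + ") agree."
    val interp = s"Local Word Count ($local) and DFS Word Count ($dfs) agree."
    assert(concat == interp)
    println(interp)  // prints "Local Word Count (42) and DFS Word Count (42) agree."
  }
}
```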
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157756696

--- Diff: examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala ---
@@ -58,10 +58,10 @@ object LocalFileLR {
     // Initialize w to a random value
     val w = DenseVector.fill(D) {2 * rand.nextDouble - 1}

-    println("Initial w: " + w)
+    println(s"Initial w: ${w}")
--- End diff --

You can just write `$w` in cases like this
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157756545

--- Diff: examples/src/main/scala/org/apache/spark/examples/DFSReadWriteTest.scala ---
@@ -127,11 +125,11 @@ object DFSReadWriteTest {
     spark.stop()

     if (localWordCount == dfsWordCount) {
-      println(s"Success! Local Word Count ($localWordCount) " +
-        s"and DFS Word Count ($dfsWordCount) agree.")
+      println(s"Success! Local Word Count ($localWordCount)
+        and DFS Word Count ($dfsWordCount) agree.")
--- End diff --

This should be reverted; you've added a bunch of space to the string.
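The whitespace problem behind this comment can be sketched minimally (illustrative values, not the example's actual code): a literal spanning source lines embeds the newline and indentation into the string, whereas concatenating two interpolated pieces keeps the message on one line.

```scala
object SplitMessageDemo {
  def main(args: Array[String]): Unit = {
    val localWordCount = 42
    val dfsWordCount = 42
    // A triple-quoted literal spanning source lines embeds the newline
    // and the indentation into the resulting string.
    val multiline = s"""Success! Local Word Count ($localWordCount)
        and DFS Word Count ($dfsWordCount) agree."""
    // Concatenating two interpolated pieces keeps the message on one line.
    val oneLine = s"Success! Local Word Count ($localWordCount) " +
      s"and DFS Word Count ($dfsWordCount) agree."
    println(multiline.contains("\n"))  // prints "true"
    println(oneLine.contains("\n"))    // prints "false"
  }
}
```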
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/20016#discussion_r157756418

--- Diff: examples/src/main/scala/org/apache/spark/examples/DFSReadWriteTest.scala ---
@@ -97,22 +95,22 @@ object DFSReadWriteTest {
   def main(args: Array[String]): Unit = {
     parseArgs(args)

-    println("Performing local word count")
+    println(s"Performing local word count")
--- End diff --

There is no interpolation here; revert changes like this.
GitHub user chetkhatri opened a pull request:

https://github.com/apache/spark/pull/20016

SPARK-22830 Scala Coding style has been improved in Spark Examples

## What changes were proposed in this pull request?

* Under the Spark Scala examples: some of the syntax was written in a Java-like way; it has been rewritten per the Scala style guide.
* Most of the changes apply to println() statements.

## How was this patch tested?

Since all proposed changes rewrite println statements in the Scala style, the println output was verified with manual runs.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/chetkhatri/spark scala-style-spark-examples

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/20016.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #20016

commit 4ac1cb1c2aa6f72eee339e8b8b647647e879d91f
Author: chetkhatri
Date: 2017-12-19T07:17:37Z

    SPARK-22830 Scala Coding style has been improved in Spark Examples