This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new c0ef52760de9 [SPARK-48127][INFRA] Fix `dev/scalastyle` to check `hadoop-cloud` and `jvm-profiler` modules
c0ef52760de9 is described below
commit c0ef52760de90a5d843e40c2fa990599d01bc798
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri May 3 22:01:48 2024 -0700
[SPARK-48127][INFRA] Fix `dev/scalastyle` to check `hadoop-cloud` and `jvm-profiler` modules
### What changes were proposed in this pull request?
This PR aims to fix `dev/scalastyle` to check the `hadoop-cloud` and `jvm-profiler` modules.
Also, the detected scalastyle issues are fixed.
### Why are the changes needed?
To prevent future scalastyle issues.
A Scala style violation was introduced here, but we missed it because we didn't check all optional modules.
- https://github.com/apache/spark/pull/46022
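The violation fixed in this diff is an import-ordering one: `import org.apache.spark.internal.{Logging, MDC}` must come before `import org.apache.spark.internal.LogKeys.PATH`. The sketch below is a simplified illustration of that ordering rule, keying each import on its first imported name and comparing case-insensitively; it is NOT scalastyle's actual `ImportOrderChecker` implementation, just a plausible model consistent with this diff.

```scala
// Simplified illustration (not scalastyle's real algorithm) of the
// import-ordering rule this commit fixes: within an import group,
// imports are expected in case-insensitive lexicographic order,
// keyed on the first imported member.
object ImportOrderSketch {
  // Derive a sort key: expand a selector import like `a.b.{X, Y}`
  // to its first member `a.b.X`, then lowercase the result.
  def sortKey(imp: String): String = {
    val expanded = imp.indexOf('{') match {
      case -1 => imp
      case i =>
        val first = imp.substring(i + 1).takeWhile(c => c != ',' && c != '}').trim
        imp.substring(0, i) + first
    }
    expanded.toLowerCase
  }

  def main(args: Array[String]): Unit = {
    val imports = Seq(
      "org.apache.spark.internal.LogKeys.PATH",
      "org.apache.spark.internal.{Logging, MDC}"
    )
    // "...internal.logging" sorts before "...internal.logkeys.path",
    // so the selector import must appear first, as in this diff.
    println(imports.sortBy(sortKey).head)
  }
}
```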
The `jvm-profiler` module was newly added in Apache Spark 4.0.0, but we missed adding it to `dev/scalastyle`. Note that there were no Scala style issues in that module at that time.
- #44021
`hadoop-cloud` module was added at Apache Spark 2.3.0.
- #17834
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the CIs with newly revised `dev/scalastyle`.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #46376 from dongjoon-hyun/SPARK-48127.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
.../scala/org/apache/spark/executor/profiler/ExecutorJVMProfiler.scala | 2 +-
.../org/apache/spark/executor/profiler/ExecutorProfilerPlugin.scala | 2 +-
dev/scalastyle | 2 +-
3 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorJVMProfiler.scala b/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorJVMProfiler.scala
index 4c3f467fa15b..20b6db5221fa 100644
--- a/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorJVMProfiler.scala
+++ b/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorJVMProfiler.scala
@@ -25,8 +25,8 @@ import org.apache.hadoop.fs.{FileSystem, FSDataOutputStream, Path}
import org.apache.spark.SparkConf
import org.apache.spark.deploy.SparkHadoopUtil
-import org.apache.spark.internal.LogKeys.PATH
import org.apache.spark.internal.{Logging, MDC}
+import org.apache.spark.internal.LogKeys.PATH
import org.apache.spark.util.ThreadUtils
diff --git a/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorProfilerPlugin.scala b/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorProfilerPlugin.scala
index ff753f04868f..b6b622127796 100644
--- a/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorProfilerPlugin.scala
+++ b/connector/profiler/src/main/scala/org/apache/spark/executor/profiler/ExecutorProfilerPlugin.scala
@@ -23,8 +23,8 @@ import scala.util.Random
import org.apache.spark.SparkConf
import org.apache.spark.api.plugin.{DriverPlugin, ExecutorPlugin, PluginContext, SparkPlugin}
-import org.apache.spark.internal.LogKeys.EXECUTOR_ID
import org.apache.spark.internal.{Logging, MDC}
+import org.apache.spark.internal.LogKeys.EXECUTOR_ID
/**
diff --git a/dev/scalastyle b/dev/scalastyle
index 12457af1ae7b..9de1fd1c9d9d 100755
--- a/dev/scalastyle
+++ b/dev/scalastyle
@@ -17,7 +17,7 @@
# limitations under the License.
#
-SPARK_PROFILES=${1:-"-Pkubernetes -Pyarn -Pspark-ganglia-lgpl -Pkinesis-asl -Phive-thriftserver -Phive -Pvolcano"}
+SPARK_PROFILES=${1:-"-Pkubernetes -Pyarn -Pspark-ganglia-lgpl -Pkinesis-asl -Phive-thriftserver -Phive -Pvolcano -Pjvm-profiler -Phadoop-cloud"}
# NOTE: echo "q" is needed because SBT prompts the user for input on encountering a build file
# with failure (either resolution or compilation); the "q" makes SBT quit.
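The changed line relies on bash's `${1:-default}` parameter expansion: the first positional argument, if given, overrides the profile list; otherwise the default (now including `-Pjvm-profiler` and `-Phadoop-cloud`) is used. A minimal sketch of that pattern, using the profile names from this diff:

```shell
#!/bin/bash
# Mirrors dev/scalastyle's pattern: take the profile list from the first
# argument if provided, else fall back to the default list from this diff.
SPARK_PROFILES=${1:-"-Pkubernetes -Pyarn -Pspark-ganglia-lgpl -Pkinesis-asl -Phive-thriftserver -Phive -Pvolcano -Pjvm-profiler -Phadoop-cloud"}
echo "$SPARK_PROFILES"
```

So `./dev/scalastyle` checks all default profiles, while `./dev/scalastyle "-Pjvm-profiler"` would restrict the check to an explicit list.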