LuciferYang commented on code in PR #43526:
URL: https://github.com/apache/spark/pull/43526#discussion_r1382330727
##########
project/SparkBuild.scala:
##########
@@ -247,8 +247,6 @@ object SparkBuild extends PomBuild {
"-Wconf:cat=deprecation&msg=Auto-application to \\`\\(\\)\\` is
deprecated&site=org.apache.spark.streaming.kafka010.KafkaRDDSuite:s",
// SPARK-35574 Prevent the recurrence of compilation warnings related
to `procedure syntax is deprecated`
"-Wconf:cat=deprecation&msg=procedure syntax is deprecated:e",
- // SPARK-40497 Upgrade Scala to 2.13.11 and suppress `Implicit
definition should have explicit type`
- "-Wconf:msg=Implicit definition should have explicit type:s",
Review Comment:
The corresponding entries also need to be removed from the parent `pom.xml`.
##########
core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala:
##########
@@ -60,7 +60,7 @@ private[spark] class CoarseGrainedExecutorBackend(
import CoarseGrainedExecutorBackend._
- private implicit val formats = DefaultFormats
+ private implicit val formats: DefaultFormats.type = DefaultFormats
Review Comment:
Does `CoarseGrainedExecutorBackend` still need this implicit variable? Can
we just delete this line of code?
##########
sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2Suites.scala:
##########
@@ -441,7 +441,7 @@ class HiveThriftBinaryServerSuite extends
HiveThriftServer2Test {
s"LOAD DATA LOCAL INPATH '${TestData.smallKv}' OVERWRITE INTO TABLE
test_map")
queries.foreach(statement.execute)
- implicit val ec = ExecutionContext.fromExecutorService(
+ implicit val ec: ExecutionContextExecutorService =
ExecutionContext.fromExecutorService(
Review Comment:
Maybe annotating it as `ExecutionContext` would also work?
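For context, the two annotations differ only in what API they expose, not in implicit resolution. A minimal hypothetical sketch (stand-in names, not the suite's code):

```scala
import java.util.concurrent.Executors
import scala.concurrent.{ExecutionContext, ExecutionContextExecutorService}

// Either annotation satisfies an implicit ExecutionContext parameter,
// since ExecutionContextExecutorService is a subtype of ExecutionContext.
val ecs: ExecutionContextExecutorService =
  ExecutionContext.fromExecutorService(Executors.newSingleThreadExecutor())

// A method that only needs the implicit ExecutionContext contract:
def needsEc(implicit ec: ExecutionContext): Boolean = ec != null
val resolved = needsEc(ecs)

// But only the narrower type keeps the ExecutorService API callable,
// e.g. for cleanup at the end of a test:
ecs.shutdown()
```

So annotating the val as `ExecutionContext` would compile for the implicit use, but would lose `shutdown()`/`shutdownNow()` if the suite calls them later.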
##########
connector/kafka-0-10-sql/src/main/scala/org/apache/spark/sql/kafka010/JsonUtils.scala:
##########
@@ -18,17 +18,18 @@
package org.apache.spark.sql.kafka010
import scala.collection.mutable.HashMap
+import scala.math._
Review Comment:
Why do we need to add this import?
##########
core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala:
##########
@@ -340,7 +341,7 @@ private object FaultToleranceTest extends App with Logging {
private class TestMasterInfo(val ip: String, val dockerId: DockerId, val
logFile: File)
Review Comment:
`FaultToleranceTest` is a strange file: it is a test class, but it lives in
the production source directory (so it has probably never been run by a GA
task). Can we just delete this file? @srowen @dongjoon-hyun @HyukjinKwon
##########
resource-managers/yarn/src/main/scala/org/apache/spark/scheduler/cluster/YarnSchedulerBackend.scala:
##########
@@ -64,15 +64,15 @@ private[spark] abstract class YarnSchedulerBackend(
private val yarnSchedulerEndpointRef = rpcEnv.setupEndpoint(
YarnSchedulerBackend.ENDPOINT_NAME, yarnSchedulerEndpoint)
- private implicit val askTimeout = RpcUtils.askRpcTimeout(sc.conf)
+ private implicit val askTimeout: Any = RpcUtils.askRpcTimeout(sc.conf)
/**
* Declare implicit single thread execution context for futures
doRequestTotalExecutors and
* doKillExecutors below, avoiding using the global execution context that
may cause conflict
* with user code's execution of futures.
*/
- private implicit val schedulerEndpointEC =
ExecutionContext.fromExecutorService(
- ThreadUtils.newDaemonSingleThreadExecutor("yarn-scheduler-endpoint"))
+ private implicit val schedulerEndpointEC: ExecutionContextExecutorService =
ExecutionContext
Review Comment:
Is this implicit variable actually used? We need to confirm this further.
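One quick way to check (a hypothetical sketch, not the YARN backend code): an implicit `ExecutionContext` is consumed whenever a `Future` is created or transformed in its scope without an explicit executor argument, so grepping the class for `Future`/`ask` call sites answers the question.

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, ExecutionContextExecutorService, Future}
import scala.concurrent.duration._

implicit val ec: ExecutionContextExecutorService =
  ExecutionContext.fromExecutorService(Executors.newSingleThreadExecutor())

// This Future constructor resolves the implicit ec above; if no such
// call site exists in the class, the implicit val is dead code.
val f = Future { 1 + 1 }
val result = Await.result(f, 5.seconds)

ec.shutdown()
```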
##########
core/src/main/scala/org/apache/spark/resource/ResourceInformation.scala:
##########
@@ -69,7 +69,7 @@ private[spark] object ResourceInformation {
* Parses a JSON string into a [[ResourceInformation]] instance.
*/
def parseJson(json: String): ResourceInformation = {
- implicit val formats = DefaultFormats
+ implicit val formats: DefaultFormats.type = DefaultFormats
Review Comment:
Does it have to be `DefaultFormats.type`? Could it be `DefaultFormats` or
`Formats` instead?
Judging from the signature of the function `extract`, it may be fine to
unify these annotations as `Formats`:
```scala
def extract[A](implicit formats: Formats, mf: scala.reflect.Manifest[A]): A
=
Extraction.extract(jv)(formats, mf)
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]