HyukjinKwon commented on a change in pull request #23439:
[SPARK-26454][CORE][Minor] lowering the log level to warn of exceptio…
URL: https://github.com/apache/spark/pull/23439#discussion_r244967400
##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -1780,7 +1780,7 @@ class SparkContext(config: SparkConf) extends Logging {
env.rpcEnv.fileServer.addJar(file)
} catch {
case NonFatal(e) =>
- logError(s"Failed to add $path to Spark environment", e)
+ logWarning(s"Failed to add $path to Spark environment", e)
Review comment:
@vanzin, I suggested this change - WDYT?
From my reading, it looks like it should be a warning rather than an error, since the API
call itself still works. I don't feel super strongly about it, though.
For the issue itself,
```sql
CREATE FUNCTION generic_udf1 AS '...GenericUDF1' USING JAR 'udfs.jar'
CREATE FUNCTION generic_udf2 AS '...GenericUDF2' USING JAR 'udfs.jar'
```
and it produces, for instance, a log like:
```
18/12/31 12:28:23 ERROR SparkContext: Failed to add /opt/sparkclient/Spark2x/tmp/spark-ed12eb5e-b7b9-49d0-a7a4-a0dba9141ac9/custom.jar to Spark environment
java.lang.IllegalArgumentException: requirement failed: File custom.jar was already registered with a different path (old path = /opt/sparkclient/Spark2x/tmp/spark-10ea8f59-fa23-46c5-af12-aa029bf2f5cb/custom.jar, new path = /opt/sparkclient/Spark2x/tmp/spark-ed12eb5e-b7b9-49d0-a7a4-a0dba9141ac9/custom.jar)
	at scala.Predef$.require(Predef.scala:224)
	at org.apache.spark.rpc.netty.NettyStreamManager.addJar(NettyStreamManager.scala:78)
	at org.apache.spark.SparkContext.addJarFile$1(SparkContext.scala:1829)
	at org.apache.spark.SparkContext.addJar(SparkContext.scala:1851)
```
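For context, the `require` that fails here keys the registered jar by its file *name*, so adding a jar with the same name from a different temp directory trips the check even though the contents may be identical. A minimal, hypothetical sketch of that kind of check (the `JarRegistry` object and the `spark://host:port` placeholder are made up for illustration, not Spark's actual `NettyStreamManager` code):

```scala
import java.io.File
import scala.collection.concurrent.TrieMap

// Hypothetical sketch: jars are keyed by file name, so the same name
// registered from a different path fails the require(), matching the
// "was already registered with a different path" message above.
object JarRegistry {
  private val jars = new TrieMap[String, File]()

  def addJar(file: File): String = {
    // putIfAbsent returns Some(old) if the name was already registered.
    val existing = jars.putIfAbsent(file.getName, file)
    require(existing.forall(_ == file),
      s"File ${file.getName} was already registered with a different path " +
      s"(old path = ${existing.get.getAbsolutePath}, " +
      s"new path = ${file.getAbsolutePath})")
    s"spark://host:port/jars/${file.getName}"
  }
}
```

Under that sketch, the second `CREATE FUNCTION ... USING JAR 'udfs.jar'` would hit the `IllegalArgumentException` because each statement stages `udfs.jar` under a fresh temp directory.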
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services