LuciferYang opened a new pull request #34759:
URL: https://github.com/apache/spark/pull/34759
### What changes were proposed in this pull request?
The `scalatest-maven-plugin` configuration uses
`file:src/test/resources/log4j.properties` as the unit-test log configuration,
so this PR adds the missing `log4j.properties` file to the mesos module for UTs.
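The added file is not quoted in this message; as a rough sketch, the other Spark modules follow a pattern along these lines (appender name and pattern here are illustrative, not the exact committed contents):

```properties
# Route all test logging to target/unit-tests.log instead of the console.
log4j.rootCategory=INFO, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.append=true
log4j.appender.file.file=target/unit-tests.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss.SSS} %t %p %c{1}: %m%n
```

With a file appender on the root category, ScalaTest output stays readable while full logs remain available under `target/` for debugging failures.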
### Why are the changes needed?
Supplies the missing log4j configuration file for the mesos module.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
- Pass the Jenkins or GitHub Actions pipeline
- Manual test
**Before**
Running
```
mvn clean install -pl resource-managers/mesos -Pmesos -am -DskipTests
mvn test -pl resource-managers/mesos -Pmesos
```
prints the following log:
```
log4j:ERROR Could not read configuration file from URL [file:src/test/resources/log4j.properties].
java.io.FileNotFoundException: src/test/resources/log4j.properties (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at java.io.FileInputStream.<init>(FileInputStream.java:93)
	at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
	at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
	at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
	at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
	at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
	at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
	at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
	at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:222)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:127)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.SparkFunSuite.initializeLogIfNecessary(SparkFunSuite.scala:62)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:102)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:101)
	at org.apache.spark.SparkFunSuite.initializeLogIfNecessary(SparkFunSuite.scala:62)
	at org.apache.spark.internal.Logging.log(Logging.scala:49)
	at org.apache.spark.internal.Logging.log$(Logging.scala:47)
	at org.apache.spark.SparkFunSuite.log(SparkFunSuite.scala:62)
	at org.apache.spark.SparkFunSuite.<init>(SparkFunSuite.scala:74)
	at org.apache.spark.scheduler.cluster.mesos.MesosCoarseGrainedSchedulerBackendSuite.<init>(MesosCoarseGrainedSchedulerBackendSuite.scala:43)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at java.lang.Class.newInstance(Class.java:442)
	at org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:66)
	at org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
	at scala.collection.Iterator.foreach(Iterator.scala:943)
	at scala.collection.Iterator.foreach$(Iterator.scala:943)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
	at scala.collection.TraversableLike.map(TraversableLike.scala:286)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
	at scala.collection.AbstractTraversable.map(Traversable.scala:108)
	at org.scalatest.tools.DiscoverySuite.<init>(DiscoverySuite.scala:37)
	at org.scalatest.tools.Runner$.genDiscoSuites$1(Runner.scala:1132)
	at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1226)
	at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
	at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
	at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1482)
	at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
	at org.scalatest.tools.Runner$.main(Runner.scala:775)
	at org.scalatest.tools.Runner.main(Runner.scala)
log4j:ERROR Ignoring configuration file [file:src/test/resources/log4j.properties].
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
```
and the test log is printed to the console.
**After**
The above errors no longer appear in the console, and the test log is written to
`resource-managers/mesos/target/unit-tests.log`, as in the other modules.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]