LuciferYang opened a new pull request, #37454:
URL: https://github.com/apache/spark/pull/37454

   ### What changes were proposed in this pull request?
   This PR adds `assume(isPythonAvailable)` to the `testPySpark` method in 
`YarnClusterSuite` so that the suite can pass (with the Python tests canceled 
rather than the whole suite aborting) in an environment where Python 3 is not 
configured.
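   In ScalaTest, `assume(condition)` throws a `TestCanceledException` when the 
condition is false, so the test is reported as canceled instead of failing or 
aborting the suite. A minimal, self-contained sketch of the kind of availability 
check involved (the object and helper names here are illustrative, not the 
actual Spark code, which uses its own `isPythonAvailable` inside the suite):

   ```scala
   import scala.sys.process._
   import scala.util.Try

   object PythonCheck {
     // Returns true when the given executable runs `--version` with exit
     // code 0. A missing binary makes the process launch throw an
     // IOException, which Try converts into a non-zero fallback, so the
     // result is false rather than an error (illustrative helper).
     def isExecutableAvailable(cmd: String): Boolean = {
       val silent = ProcessLogger(_ => ()) // discard the version output
       Try(Seq(cmd, "--version").!(silent)).getOrElse(1) == 0
     }

     def main(args: Array[String]): Unit =
       println(s"python3 available: ${isExecutableAvailable("python3")}")
   }
   ```

   Guarding `testPySpark` with such a check means environments without 
`python3` skip only the Python tests instead of losing the whole suite.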
   
   
   ### Why are the changes needed?
   `YarnClusterSuite` should not be `ABORTED` when `python3` is not configured; 
only the Python tests should be canceled.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   
   ### How was this patch tested?
   
   - Pass GitHub Actions
   - Manual test
   
   Run 
   
   ```
   mvn clean test -pl resource-managers/yarn -am -Pyarn 
-DwildcardSuites=org.apache.spark.deploy.yarn.YarnClusterSuite  -Dtest=none
   ``` 
   in an environment without Python 3 configured:
   
   **Before**
   
   ```
   YarnClusterSuite:
   org.apache.spark.deploy.yarn.YarnClusterSuite *** ABORTED ***
     java.lang.RuntimeException: Unable to load a Suite class that was 
discovered in the runpath: org.apache.spark.deploy.yarn.YarnClusterSuite
     at 
org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:81)
     at 
org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
     at 
scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
     at scala.collection.Iterator.foreach(Iterator.scala:943)
     at scala.collection.Iterator.foreach$(Iterator.scala:943)
     at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
     at scala.collection.IterableLike.foreach(IterableLike.scala:74)
     at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
     at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
     at scala.collection.TraversableLike.map(TraversableLike.scala:286)
     ...
   Run completed in 833 milliseconds.
   Total number of tests run: 0
   Suites: completed 1, aborted 1
   Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
   *** 1 SUITE ABORTED ***
   ```
   
   **After**
   
   ```
   YarnClusterSuite:
   - run Spark in yarn-client mode
   - run Spark in yarn-cluster mode
   - run Spark in yarn-client mode with unmanaged am
   - run Spark in yarn-client mode with different configurations, ensuring 
redaction
   - run Spark in yarn-cluster mode with different configurations, ensuring 
redaction
   - yarn-cluster should respect conf overrides in SparkHadoopUtil 
(SPARK-16414, SPARK-23630)
   - SPARK-35672: run Spark in yarn-client mode with additional jar using URI 
scheme 'local'
   - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI 
scheme 'local'
   - SPARK-35672: run Spark in yarn-client mode with additional jar using URI 
scheme 'local' and gateway-replacement path
   - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI 
scheme 'local' and gateway-replacement path
   - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI 
scheme 'local' and gateway-replacement path containing an environment variable
   - SPARK-35672: run Spark in yarn-client mode with additional jar using URI 
scheme 'file'
   - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI 
scheme 'file'
   - run Spark in yarn-cluster mode unsuccessfully
   - run Spark in yarn-cluster mode failure after sc initialized
   - run Python application in yarn-client mode !!! CANCELED !!!
     YarnClusterSuite.this.isPythonAvailable was false 
(YarnClusterSuite.scala:376)
   - run Python application in yarn-cluster mode !!! CANCELED !!!
     YarnClusterSuite.this.isPythonAvailable was false 
(YarnClusterSuite.scala:376)
   - run Python application in yarn-cluster mode using spark.yarn.appMasterEnv 
to override local envvar !!! CANCELED !!!
     YarnClusterSuite.this.isPythonAvailable was false 
(YarnClusterSuite.scala:376)
   - user class path first in client mode
   - user class path first in cluster mode
   - monitor app using launcher library
   - running Spark in yarn-cluster mode displays driver log links
   - timeout to get SparkContext in cluster mode triggers failure
   - executor env overwrite AM env in client mode
   - executor env overwrite AM env in cluster mode
   - SPARK-34472: ivySettings file with no scheme or file:// scheme should be 
localized on driver in cluster mode
   - SPARK-34472: ivySettings file with no scheme or file:// scheme should 
retain user provided path in client mode
   - SPARK-34472: ivySettings file with non-file:// schemes should throw an 
error
   Run completed in 7 minutes, 2 seconds.
   Total number of tests run: 25
   Suites: completed 2, aborted 0
   Tests: succeeded 25, failed 0, canceled 3, ignored 0, pending 0
   All tests passed.
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

