dongjoon-hyun edited a comment on issue #27347: [SPARK-30626][K8S] Add SPARK_APPLICATION_ID into driver pod env
URL: https://github.com/apache/spark/pull/27347#issuecomment-578278747
 
 
   Oh, the second run looks good — all tests passed:
   ```
   KubernetesSuite:
   - Run SparkPi with no resources
   - Run SparkPi with a very long application name.
   - Use SparkLauncher.NO_RESOURCE
   - Run SparkPi with a master URL without a scheme.
   - Run SparkPi with an argument.
   - Run SparkPi with custom labels, annotations, and environment variables.
   - All pods have the same service account by default
   - Run extraJVMOptions check on driver
   - Run SparkRemoteFileTest using a remote data file
   - Run SparkPi with env and mount secrets.
   - Run PySpark on simple pi.py example
   - Run PySpark with Python2 to test a pyfiles example
   - Run PySpark with Python3 to test a pyfiles example
   - Run PySpark with memory customization
   - Run in client mode.
   - Start pod creation from template
   - PVs with local storage
   - Launcher client dependencies
   - Run SparkR on simple dataframe.R example
   Run completed in 17 minutes, 27 seconds.
   Total number of tests run: 19
   Suites: completed 2, aborted 0
   Tests: succeeded 19, failed 0, canceled 0, ignored 0, pending 0
   All tests passed.
   ```