Yikun edited a comment on pull request #35422:
URL: https://github.com/apache/spark/pull/35422#issuecomment-1037039764


   @dongjoon-hyun @william-wang @martin-g Thanks for all the help. It also works and passed all integration tests on my arm64 env.
   
   <details>
   <summary> Env info: </summary>
   
   ```
   ubuntu@yikun-aarch64:~$ uname -a
   Linux yikun-aarch64 5.4.0-91-generic #102-Ubuntu SMP Fri Nov 5 16:30:45 UTC 2021 aarch64 aarch64 aarch64 GNU/Linux
   ubuntu@yikun-aarch64:~$ kubectl version
   Client Version: version.Info{Major:"1", Minor:"23", GitVersion:"v1.23.1", GitCommit:"86ec240af8cbd1b60bcc4c03c20da9b98005b92e", GitTreeState:"clean", BuildDate:"2021-12-16T11:41:01Z", GoVersion:"go1.17.5", Compiler:"gc", Platform:"linux/arm64"}
   Server Version: version.Info{Major:"1", Minor:"23", GitVersion:"v1.23.1", GitCommit:"86ec240af8cbd1b60bcc4c03c20da9b98005b92e", GitTreeState:"clean", BuildDate:"2021-12-16T11:34:54Z", GoVersion:"go1.17.5", Compiler:"gc", Platform:"linux/arm64"}
   ```
   
   </details>
   
   <details>
   <summary> Test result on arm64 env: </summary>
   
   ```
   [info] KubernetesSuite:
   [info] - Run SparkPi with no resources (18 seconds, 751 milliseconds)
   [info] - Run SparkPi with no resources & statefulset allocation (18 seconds, 422 milliseconds)
   [info] - Run SparkPi with a very long application name. (18 seconds, 151 milliseconds)
   [info] - Use SparkLauncher.NO_RESOURCE (18 seconds, 450 milliseconds)
   [info] - Run SparkPi with a master URL without a scheme. (18 seconds, 180 milliseconds)
   [info] - Run SparkPi with an argument. (18 seconds, 295 milliseconds)
   [info] - Run SparkPi with custom labels, annotations, and environment variables. (18 seconds, 372 milliseconds)
   [info] - All pods have the same service account by default (18 seconds, 694 milliseconds)
   [info] - Run extraJVMOptions check on driver (10 seconds, 331 milliseconds)
   [info] - Run SparkRemoteFileTest using a remote data file (19 seconds, 330 milliseconds)
   [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (31 seconds, 586 milliseconds)
   [info] - Run SparkPi with env and mount secrets. (32 seconds, 28 milliseconds)
   [info] - Run PySpark on simple pi.py example (20 seconds, 393 milliseconds)
   [info] - Run PySpark to test a pyfiles example (23 seconds, 430 milliseconds)
   [info] - Run PySpark with memory customization (19 seconds, 258 milliseconds)
   [info] - Run in client mode. (12 seconds, 282 milliseconds)
   [info] - Start pod creation from template (18 seconds, 549 milliseconds)
   [info] - Test basic decommissioning (52 seconds, 144 milliseconds)
   [info] - Test basic decommissioning with shuffle cleanup (52 seconds, 793 milliseconds)
   [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 53 seconds)
   [info] - Test decommissioning timeouts (51 seconds, 503 milliseconds)
   [info] - SPARK-37576: Rolling decommissioning (1 minute, 12 seconds)
   [info] VolcanoSuite:
   [info] - Run SparkPi with no resources (18 seconds, 506 milliseconds)
   [info] - Run SparkPi with no resources & statefulset allocation (18 seconds, 262 milliseconds)
   [info] - Run SparkPi with a very long application name. (18 seconds, 283 milliseconds)
   [info] - Use SparkLauncher.NO_RESOURCE (18 seconds, 467 milliseconds)
   [info] - Run SparkPi with a master URL without a scheme. (18 seconds, 303 milliseconds)
   [info] - Run SparkPi with an argument. (19 seconds, 254 milliseconds)
   [info] - Run SparkPi with custom labels, annotations, and environment variables. (18 seconds, 223 milliseconds)
   [info] - All pods have the same service account by default (18 seconds, 397 milliseconds)
   [info] - Run extraJVMOptions check on driver (9 seconds, 304 milliseconds)
   [info] - Run SparkRemoteFileTest using a remote data file (19 seconds, 217 milliseconds)
   [info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (32 seconds, 603 milliseconds)
   [info] - Run SparkPi with env and mount secrets. (35 seconds, 860 milliseconds)
   [info] - Run PySpark on simple pi.py example (20 seconds, 385 milliseconds)
   [info] - Run PySpark to test a pyfiles example (24 seconds, 638 milliseconds)
   [info] - Run PySpark with memory customization (19 seconds, 467 milliseconds)
   [info] - Run in client mode. (11 seconds, 180 milliseconds)
   [info] - Start pod creation from template (18 seconds, 587 milliseconds)
   [info] - Test basic decommissioning (53 seconds, 381 milliseconds)
   [info] - Test basic decommissioning with shuffle cleanup (52 seconds, 668 milliseconds)
   [info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 53 seconds)
   [info] - Test decommissioning timeouts (52 seconds, 605 milliseconds)
   [info] - SPARK-37576: Rolling decommissioning (1 minute, 12 seconds)
   [info] - Run SparkPi with volcano scheduler (18 seconds, 529 milliseconds)
   ``` 
   
   </details>
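
   For anyone reproducing this on their own machine, a quick host sanity check before launching the integration tests might look like the sketch below. This is a minimal, illustrative snippet (not part of Spark's own test tooling); it only mirrors the `uname` check from the env info above.

   ```shell
   # Minimal sketch: confirm the host is arm64 before running the K8s integration tests.
   arch="$(uname -m)"
   if [ "$arch" = "aarch64" ] || [ "$arch" = "arm64" ]; then
     echo "arm64 host detected: $arch"
   else
     echo "not an arm64 host ($arch); results may differ from this report" >&2
   fi
   ```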
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
