LuciferYang opened a new pull request #34286:
URL: https://github.com/apache/spark/pull/34286


   ### What changes were proposed in this pull request?
   The `UseCompressedOops` option behaves differently on Java 8 and Java 17. If we execute the command
   
   ```
   java -XX:+PrintFlagsFinal -XX:-UseCompressedOops -version | grep Compressed
   ```
   
   The result on Java 8 is:
   
   ```
    bool UseCompressedClassPointers                 = false                 {lp64_product}
    bool UseCompressedOops                         := false                 {lp64_product}
   ```
   
   The result on Java 11 is:
   
   ```
    bool UseCompressedClassPointers                 = false                 {lp64_product} {default}
    bool UseCompressedOops                          = false                 {lp64_product} {command line}
   ```
   
   The result on Java 17 is:
   
   ```
    bool UseCompressedClassPointers                 = true          {product lp64_product} {ergonomic}
    bool UseCompressedOops                          = false         {product lp64_product} {command line}
   ```
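   
   The same divergence can also be observed at runtime, without parsing `-XX:+PrintFlagsFinal` output, through the HotSpot diagnostic MXBean. The following is only a minimal, hypothetical Scala sketch for illustration (the `CompressedFlagsProbe` name is made up and is not part of this PR); running it with `-XX:-UseCompressedOops` on the three JDKs reproduces the outputs above.
   
   ```scala
   import java.lang.management.ManagementFactory
   
   import com.sun.management.HotSpotDiagnosticMXBean
   
   // Hypothetical probe, for illustration only: prints the current value and
   // origin of the two flags when launched with e.g. -XX:-UseCompressedOops.
   object CompressedFlagsProbe {
     def main(args: Array[String]): Unit = {
       val diag = ManagementFactory.getPlatformMXBean(classOf[HotSpotDiagnosticMXBean])
       Seq("UseCompressedOops", "UseCompressedClassPointers").foreach { name =>
         val opt = diag.getVMOption(name)
         // On Java 8/11, disabling UseCompressedOops also reports
         // UseCompressedClassPointers = false; on Java 17 it stays true.
         println(s"$name = ${opt.getValue} (origin: ${opt.getOrigin})")
       }
     }
   }
   ```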
   
   We found that on Java 17, `-XX:-UseCompressedOops` no longer implicitly disables `UseCompressedClassPointers`, so when we execute
   
   ```
   mvn clean install -pl sql/core -am -Dtest=none -DwildcardSuites=org.apache.spark.sql.execution.WholeStageCodegenSparkSubmitSuite
   ```
   
   with Java 17, the test failed with `Exception in thread "main" org.scalatest.exceptions.TestFailedException: 16 was not greater than 16`. This PR therefore replaces `UseCompressedOops` with `UseCompressedClassPointers` so that `WholeStageCodegenSparkSubmitSuite` passes with Java 17.
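   
   For context on why the flag matters here: these options change the object header layout, which is reflected in `org.apache.spark.unsafe.Platform.BYTE_ARRAY_OFFSET` (the base offset of a `byte[]`). The snippet below is only a hedged illustration of that relationship, not code from the suite; the `HeaderSizeProbe` name and the expected offsets are assumptions about typical 64-bit HotSpot layouts.
   
   ```scala
   import org.apache.spark.unsafe.Platform
   
   // Hypothetical probe, not part of WholeStageCodegenSparkSubmitSuite: prints the
   // byte[] base offset, which reflects the object header size chosen by the JVM.
   object HeaderSizeProbe {
     def main(args: Array[String]): Unit = {
       // Typical HotSpot values (assumption):
       //  - compressed class pointers enabled  -> 16 (8-byte mark word + 4-byte klass + 4-byte length)
       //  - compressed class pointers disabled -> 24 (8 + 8 + 4, padded to 8-byte alignment)
       // On Java 17, -XX:-UseCompressedOops alone leaves class pointers compressed, so the
       // offset stays at 16, which appears to be where the 16 in the failure message comes from.
       println(s"byte[] base offset = ${Platform.BYTE_ARRAY_OFFSET}")
     }
   }
   ```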
   
   ### Why are the changes needed?
   Pass UT with JDK 17
   
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   
   - Pass the Jenkins or GitHub Actions
   - Manual test with Java 11 passed
   - Manual test with Java 17:
   
   ```
   mvn clean install -pl sql/core -am -Dtest=none -DwildcardSuites=org.apache.spark.sql.execution.WholeStageCodegenSparkSubmitSuite
   ```
   
   **Before**
   
   ```
   2021-10-14 04:32:38.038 - stderr> Exception in thread "main" org.scalatest.exceptions.TestFailedException: 16 was not greater than 16
     2021-10-14 04:32:38.038 - stderr>  at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
     2021-10-14 04:32:38.038 - stderr>  at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
     2021-10-14 04:32:38.038 - stderr>  at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
     2021-10-14 04:32:38.038 - stderr>  at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.sql.execution.WholeStageCodegenSparkSubmitSuite$.main(WholeStageCodegenSparkSubmitSuite.scala:82)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.sql.execution.WholeStageCodegenSparkSubmitSuite.main(WholeStageCodegenSparkSubmitSuite.scala)
     2021-10-14 04:32:38.038 - stderr>  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     2021-10-14 04:32:38.038 - stderr>  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
     2021-10-14 04:32:38.038 - stderr>  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     2021-10-14 04:32:38.038 - stderr>  at java.base/java.lang.reflect.Method.invoke(Method.java:568)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
     2021-10-14 04:32:38.038 - stderr>  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   ```
   
   **After**
   
   ```
   Run completed in 11 seconds, 15 milliseconds.
   Total number of tests run: 1
   Suites: completed 2, aborted 0
   Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
   All tests passed.
   ```

