LuciferYang opened a new pull request, #40566:
URL: https://github.com/apache/spark/pull/40566

   ### What changes were proposed in this pull request?
   When testing `OrcEncryptionSuite` with Maven, all of its tests are always 
skipped. This PR moves `spark.hadoop.hadoop.security.key.provider.path` from 
the `systemPropertyVariables` of `maven-surefire-plugin` to the 
`systemProperties` of `scalatest-maven-plugin` so that `OrcEncryptionSuite` 
can be run with Maven.
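   
   A rough sketch of what the `sql/core/pom.xml` change looks like — the plugin coordinates and the `systemProperties` element follow the scalatest-maven-plugin's documented configuration, while the property value itself is elided here since it is unchanged from the surefire configuration:
   
   ```xml
   <!-- Before: the property lived under maven-surefire-plugin, which ScalaTest
        suites launched by scalatest-maven-plugin never see. -->
   <plugin>
     <groupId>org.scalatest</groupId>
     <artifactId>scalatest-maven-plugin</artifactId>
     <configuration>
       <systemProperties>
         <!-- value unchanged; copied from systemPropertyVariables of surefire -->
         <spark.hadoop.hadoop.security.key.provider.path>...</spark.hadoop.hadoop.security.key.provider.path>
       </systemProperties>
     </configuration>
   </plugin>
   ```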
   
   ### Why are the changes needed?
   Make `OrcEncryptionSuite` runnable with Maven.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No, this only affects Maven testing.
   
   
   ### How was this patch tested?
   
   - Pass GitHub Actions
   - Manual testing:
   
   Run:
   ```
   build/mvn clean install -pl sql/core -DskipTests -am
   build/mvn test -pl sql/core -Dtest=none -DwildcardSuites=org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite
   ```
   
   **Before**
   
   ```
   Discovery starting.
   Discovery completed in 3 seconds, 218 milliseconds.
   Run starting. Expected test count is: 4
   OrcEncryptionSuite:
   21:57:58.344 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
   - Write and read an encrypted file !!! CANCELED !!!
     [] was empty org.apache.orc.impl.NullKeyProvider@5af5d76f doesn't has the 
test keys. ORC shim is created with old Hadoop libraries 
(OrcEncryptionSuite.scala:37)
   - Write and read an encrypted table !!! CANCELED !!!
     [] was empty org.apache.orc.impl.NullKeyProvider@5ad6cc21 doesn't has the 
test keys. ORC shim is created with old Hadoop libraries 
(OrcEncryptionSuite.scala:65)
   - SPARK-35325: Write and read encrypted nested columns !!! CANCELED !!!
     [] was empty org.apache.orc.impl.NullKeyProvider@691124ee doesn't has the 
test keys. ORC shim is created with old Hadoop libraries 
(OrcEncryptionSuite.scala:116)
   - SPARK-35992: Write and read fully-encrypted columns with default masking 
!!! CANCELED !!!
     [] was empty org.apache.orc.impl.NullKeyProvider@5403799b doesn't has the 
test keys. ORC shim is created with old Hadoop libraries 
(OrcEncryptionSuite.scala:166)
   21:58:00.035 WARN 
org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite: 
   
   ===== POSSIBLE THREAD LEAK IN SUITE 
o.a.s.sql.execution.datasources.orc.OrcEncryptionSuite, threads: rpc-boss-3-1 
(daemon=true), shuffle-boss-6-1 (daemon=true) =====
   
   Run completed in 5 seconds, 41 milliseconds.
   Total number of tests run: 0
   Suites: completed 2, aborted 0
   Tests: succeeded 0, failed 0, canceled 4, ignored 0, pending 0
   No tests were executed.
   ```
   
   **After**
   
   ```
   Discovery starting.
   Discovery completed in 3 seconds, 185 milliseconds.
   Run starting. Expected test count is: 4
   OrcEncryptionSuite:
   21:58:46.540 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
   - Write and read an encrypted file
   - Write and read an encrypted table
   - SPARK-35325: Write and read encrypted nested columns
   - SPARK-35992: Write and read fully-encrypted columns with default masking
   21:58:51.933 WARN 
org.apache.spark.sql.execution.datasources.orc.OrcEncryptionSuite: 
   
   ===== POSSIBLE THREAD LEAK IN SUITE 
o.a.s.sql.execution.datasources.orc.OrcEncryptionSuite, threads: rpc-boss-3-1 
(daemon=true), shuffle-boss-6-1 (daemon=true) =====
   
   Run completed in 8 seconds, 708 milliseconds.
   Total number of tests run: 4
   Suites: completed 2, aborted 0
   Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
   All tests passed.
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

