andersonm-ibm commented on pull request #32146:
URL: https://github.com/apache/spark/pull/32146#issuecomment-821945651


   > It seems that some Parquet unit tests fail. Is there a side-effect on 
Parquet encryption, @andersonm-ibm and @ggershinsky ?
   > 
   > Let me re-trigger the CIs. Just FYI, a similar situation happened 
before when I added `OrcEncryptionSuite`; `OrcEncryptionSuite` was reverted 
once.
   > 
   > ```
   > $ git log --oneline | grep OrcEn
   > 03f4cf5845 [SPARK-34029][SQL][TESTS] Add OrcEncryptionSuite and 
FakeKeyProvider
   > 194edc86a2 Revert "[SPARK-34029][SQL][TESTS] Add OrcEncryptionSuite and 
FakeKeyProvider"
   > 8bb70bf0d6 [SPARK-34029][SQL][TESTS] Add OrcEncryptionSuite and 
FakeKeyProvider
   > ```
   > 
   > In this suspicious case, it seems that we are in a hard-to-verify 
situation. To make sure that we break nothing, I'd like to recommend the 
following.
   > 
   >     1. Jenkins CI should pass with at least the following combinations.
   >        
   >        * SBT / Hadoop 3.2 / Java8 (the default)
   >        * SBT / Hadoop 3.2 / Java11 by adding `[test-java11]` to the PR 
title.
   >        * SBT / Hadoop 2.7 / Java8 by adding `[test-hadoop2.7]` to the PR 
title.
   >        * Maven / Hadoop 3.2 / Java8 by adding `[test-maven]` to the PR 
title.
   >        * Maven / Hadoop 2.7 / Java8 by adding 
`[test-maven][test-hadoop2.7]` to the PR title.
   > 
   >     2. GitHub Actions should pass.
   >        
   >        * Recently, the Apache Spark community enabled the `PR-owner 
GitHub Action runner` feature (by @HyukjinKwon). This means the `GitHub 
Action` jobs should run on @andersonm-ibm's own GitHub Actions. Currently, 
@andersonm-ibm seems to have the GitHub Actions feature disabled in his fork.
   >        * To see the GitHub Action results, please turn on GitHub Actions 
in your Apache Spark fork, @andersonm-ibm.
   
   @dongjoon-hyun  I've run GitHub Actions on my fork, and the only failure 
there is `Run TPC-DS queries with SF=1`, which doesn't seem related:
    java.net.BindException: Cannot assign requested address: Service 
'sparkDriver' failed after 100 retries (on a random free port)! Consider 
explicitly setting the appropriate binding address for the service 
'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the 
correct binding address.
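
   For what it's worth, this kind of `sparkDriver` `BindException` on CI 
hosts is typically worked around by explicitly binding the driver to the 
loopback address, as the exception message itself suggests. A minimal sketch 
of such a configuration (hypothetical CI-only settings, not part of this PR):

   ```
   # conf/spark-defaults.conf -- hypothetical CI-only workaround
   # Pin the driver to loopback so it does not try to bind a hostname
   # the CI container cannot resolve to a local address.
   spark.driver.bindAddress   127.0.0.1
   spark.driver.host          127.0.0.1
   ```

   The same settings can also be passed per-run, e.g. 
`--conf spark.driver.bindAddress=127.0.0.1` on `spark-submit`.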


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


