[GitHub] [spark] LuciferYang commented on pull request #39124: [DON'T MERGE] Test build and test with hadoop 3.3.5-RC0
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1411954193

Thanks @steveloughran

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org For additional commands, e-mail: reviews-h...@spark.apache.org
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1364827388

> so the k8s integration test doesn't pick up any -Psnapshots-and-staging profile?

Yes.
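For context, a `snapshots-and-staging` style profile lets Maven resolve RC artifacts such as Hadoop 3.3.5-RC0 before they reach Maven Central. A minimal sketch of what such a profile typically looks like — the repository `id`s are illustrative and the URLs are the standard ASF staging/snapshot locations, not copied from Spark's actual `pom.xml`:

```xml
<profile>
  <id>snapshots-and-staging</id>
  <repositories>
    <!-- RC artifacts promoted to the ASF staging group -->
    <repository>
      <id>asf-staging</id>
      <url>https://repository.apache.org/content/groups/staging/</url>
    </repository>
    <!-- Nightly SNAPSHOT builds -->
    <repository>
      <id>asf-snapshots</id>
      <url>https://repository.apache.org/content/repositories/snapshots/</url>
      <snapshots><enabled>true</enabled></snapshots>
    </repository>
  </repositories>
</profile>
```

A build that activates the profile with `-Psnapshots-and-staging` picks these repositories up, but a separately launched step (such as the k8s integration test here) will not, unless it is passed the same profile flag.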
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1363686415

I will keep this PR open so the next RC or the final release can be tested promptly.
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1363544363

@sunchao @steveloughran @dongjoon-hyun All GitHub Actions tasks have now passed except `Spark on Kubernetes Integration test`, and I expect that one to pass as well once Hadoop 3.3.5 can be downloaded from `https://repo1.maven.org/maven2/`.
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1361207053

Re-triggering GitHub Actions showed that the Hadoop 3.3.5 dependencies could not be downloaded. Let's wait until they can be downloaded again before re-analyzing the test failures.
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1361086643

After merging [efec8ce](https://github.com/apache/spark/pull/39124/commits/efec8cecb4fc8be9e341ba627a69737fb6e7ad52) with master, the `org.apache.spark.sql.hive.execution.command.AlterTableAddColumnsSuite` test passes locally. Let's retry with GitHub Actions.
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1361081097

@steveloughran Do you know the correct class type that `XMLUtils.newSecureTransformerFactory` should return? I want to try configuring `javax.xml.transform.TransformerFactory`.
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1361011180

Hmm, maybe there is some conflict — the warning is `The attribute ACCESS_EXTERNAL_DTD is not recognized by TransformerFactory`, logged here: https://github.com/apache/hadoop/blob/5f08e51b72330b2dd2405896b39179a64a3a7efe/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/util/XMLUtils.java#L141
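One way to see where that warning comes from is to probe whether the `TransformerFactory` implementation on the classpath accepts the JAXP 1.5 `ACCESS_EXTERNAL_*` attributes that Hadoop's `XMLUtils.newSecureTransformerFactory` sets. A minimal sketch (the `TransformerFactoryProbe` class name is mine; the stock JDK factory accepts these attributes, while an old Xalan pulled in by another dependency typically throws `IllegalArgumentException`):

```java
import javax.xml.XMLConstants;
import javax.xml.transform.TransformerFactory;

public class TransformerFactoryProbe {
    // Returns "supported" if the resolved factory accepts the JAXP 1.5
    // ACCESS_EXTERNAL_* attributes, otherwise the rejection message.
    static String probe() {
        TransformerFactory factory = TransformerFactory.newInstance();
        try {
            // Hadoop's XMLUtils sets these same attributes to lock down
            // external DTD/stylesheet resolution; an old Xalan rejects them.
            factory.setAttribute(XMLConstants.ACCESS_EXTERNAL_DTD, "");
            factory.setAttribute(XMLConstants.ACCESS_EXTERNAL_STYLESHEET, "");
            return "supported";
        } catch (IllegalArgumentException e) {
            return "unsupported: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // Which implementation did the JAXP lookup actually pick?
        System.out.println("factory: "
                + TransformerFactory.newInstance().getClass().getName());
        System.out.println(probe());
    }
}
```

Printing the factory class name shows which implementation the JAXP lookup resolved; a specific one can be pinned for testing with the standard `-Djavax.xml.transform.TransformerFactory=<impl class>` system property, which is the lookup mechanism the earlier question about configuring `javax.xml.transform.TransformerFactory` refers to.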
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1360979354

Maybe this is due to https://github.com/apache/hadoop/pull/4940/files? Some XML parser features are disabled there, possibly to fix CVE-2022-34169.
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1358800857

also cc @wangyum
LuciferYang commented on PR #39124: URL: https://github.com/apache/spark/pull/39124#issuecomment-1358794388

Many tests failed as follows:

```
2022-12-20T03:15:37.0609530Z [info] org.apache.spark.sql.hive.execution.command.AlterTableAddColumnsSuite *** ABORTED *** (28 milliseconds)
2022-12-20T03:15:37.0701184Z [info]   java.lang.reflect.InvocationTargetException:
2022-12-20T03:15:37.0701846Z [info]   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2022-12-20T03:15:37.0702983Z [info]   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2022-12-20T03:15:37.0703732Z [info]   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2022-12-20T03:15:37.0704398Z [info]   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2022-12-20T03:15:37.0705400Z [info]   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:315)
2022-12-20T03:15:37.0706077Z [info]   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:514)
2022-12-20T03:15:37.0706751Z [info]   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:374)
2022-12-20T03:15:37.0707378Z [info]   at org.apache.spark.sql.hive.test.TestHiveExternalCatalog.$anonfun$client$1(TestHive.scala:90)
2022-12-20T03:15:37.0707917Z [info]   at scala.Option.getOrElse(Option.scala:189)
2022-12-20T03:15:37.0708804Z [info]   at org.apache.spark.sql.hive.test.TestHiveExternalCatalog.client$lzycompute(TestHive.scala:90)
2022-12-20T03:15:37.0709589Z [info]   at org.apache.spark.sql.hive.test.TestHiveExternalCatalog.client(TestHive.scala:88)
2022-12-20T03:15:37.0710320Z [info]   at org.apache.spark.sql.hive.test.TestHiveSingleton.$init$(TestHiveSingleton.scala:33)
2022-12-20T03:15:37.0711253Z [info]   at org.apache.spark.sql.hive.execution.command.AlterTableAddColumnsSuite.<init>(AlterTableAddColumnsSuite.scala:27)
2022-12-20T03:15:37.0712160Z [info]   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2022-12-20T03:15:37.0712844Z [info]   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2022-12-20T03:15:37.0713829Z [info]   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2022-12-20T03:15:37.0714480Z [info]   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2022-12-20T03:15:37.0714972Z [info]   at java.lang.Class.newInstance(Class.java:442)
2022-12-20T03:15:37.0715625Z [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:454)
2022-12-20T03:15:37.0716141Z [info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
2022-12-20T03:15:37.0716638Z [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
2022-12-20T03:15:37.0717222Z [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2022-12-20T03:15:37.0718079Z [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
2022-12-20T03:15:37.0718637Z [info]   at java.lang.Thread.run(Thread.java:750)
2022-12-20T03:15:37.0719260Z [info]   Cause: java.lang.RuntimeException: Failed to initialize default Hive configuration variables!
2022-12-20T03:15:37.0719939Z [info]   at org.apache.hadoop.hive.conf.HiveConf.getConfVarInputStream(HiveConf.java:3638)
2022-12-20T03:15:37.0720558Z [info]   at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:4057)
2022-12-20T03:15:37.0721115Z [info]   at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:4014)
2022-12-20T03:15:37.0721873Z [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$.newHiveConf(HiveClientImpl.scala:1309)
2022-12-20T03:15:37.0722615Z [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:176)
2022-12-20T03:15:37.0723562Z [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:141)
2022-12-20T03:15:37.0724265Z [info]   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2022-12-20T03:15:37.0725154Z [info]   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2022-12-20T03:15:37.0815583Z [info]   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2022-12-20T03:15:37.0816308Z [info]   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2022-12-20T03:15:37.0817005Z [info]   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:315)
2022-12-20T03:15:37.0817691Z [info]   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:514)
2022-12-20T03:15:37.0818294Z [info]   at
```