HiuKwok commented on PR #45154:
URL: https://github.com/apache/spark/pull/45154#issuecomment-1953601887

   > > Re-introduce the javax.servlet-api and jaxb-api jars, as the Hive-related jars still reference the old javax.servlet classes internally; without them, tests and runtime throw a `NoClassDefFoundError`. This also lets us decouple the Hive upgrade from this PR.
   > 
   > Could you please elaborate more on the invocation chains? e.g., provide a stack trace.
   
   This is the stack trace from one of the old builds during development; it mainly asks for `javax/servlet/Filter`.
   
   ```
   [info] org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInBinarySuite *** ABORTED *** (4 milliseconds)
   [info]   java.lang.NoClassDefFoundError: javax/servlet/Filter
   [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.logAuditEvent(HiveMetaStore.java:297)
   [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.logInfo(HiveMetaStore.java:782)
   [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.access$1100(HiveMetaStore.java:228)
   [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore.cleanupRawStore(HiveMetaStore.java:7283)
   [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore.access$600(HiveMetaStore.java:163)
   [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.shutdown(HiveMetaStore.java:844)
   [info]   at jdk.internal.reflect.GeneratedMethodAccessor74.invoke(Unknown Source)
   [info]   at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [info]   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
   [info]   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
   [info]   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
   [info]   at jdk.proxy2/jdk.proxy2.$Proxy35.shutdown(Unknown Source)
   [info]   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:553)
   [info]   at jdk.internal.reflect.GeneratedMethodAccessor73.invoke(Unknown Source)
   [info]   at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [info]   at java.base/java.lang.reflect.Method.invoke(Method.java:568)
   [info]   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
   [info]   at jdk.proxy2/jdk.proxy2.$Proxy36.close(Unknown Source)
   [info]   at org.apache.hadoop.hive.ql.metadata.Hive.close(Hive.java:416)
   [info]   at org.apache.hadoop.hive.ql.metadata.Hive.access$000(Hive.java:169)
   [info]   at org.apache.hadoop.hive.ql.metadata.Hive$1.remove(Hive.java:190)
   [info]   at org.apache.hadoop.hive.ql.metadata.Hive.closeCurrent(Hive.java:383)
   [info]   at org.apache.spark.sql.hive.thriftserver.SharedThriftServer.afterAll(SharedThriftServer.scala:77)
   [info]   at org.apache.spark.sql.hive.thriftserver.SharedThriftServer.afterAll$(SharedThriftServer.scala:69)
   [info]   at org.apache.spark.sql.hive.thriftserver.ThriftServerWithSparkContextInBinarySuite.afterAll(ThriftServerWithSparkContextSuite.scala:275)
   [info]   at org.scalatest.BeforeAndAfterAll.$anonfun$run$1(BeforeAndAfterAll.scala:225)
   [info]   at org.scalatest.Status.$anonfun$withAfterEffect$1(Status.scala:377)
   [info]   at org.scalatest.Status.$anonfun$withAfterEffect$1$adapted(Status.scala:373)
   [info]   at org.scalatest.FailedStatus$.whenCompleted(Status.scala:505)
   [info]   at org.scalatest.Status.withAfterEffect(Status.scala:373)
   [info]   at org.scalatest.Status.withAfterEffect$(Status.scala:371)
   [info]   at org.scalatest.FailedStatus$.withAfterEffect(Status.scala:477)
   [info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:224)
   [info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
   [info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)
   [info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
   [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
   [info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
   [info]   at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
   [info]   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
   [info]   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
   [info]   at java.base/java.lang.Thread.run(Thread.java:840)
   ```
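   
   For context, here is a minimal sketch of what re-introducing the two dependencies amounts to, expressed in sbt notation (the actual change lives in the Maven poms; the coordinates are the standard Maven Central ones, and the versions below are only illustrative):
   
   ```scala
   // Illustrative sketch only -- the PR itself edits the Maven poms.
   // Versions are examples, not the ones pinned by Spark.
   libraryDependencies ++= Seq(
     "javax.servlet" % "javax.servlet-api" % "4.0.1", // provides javax.servlet.Filter
     "javax.xml.bind" % "jaxb-api" % "2.3.1"          // provides javax.xml.bind.*
   )
   ```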
   
   Also, here is the old build that threw the exception, in case it provides more clarity on the issue:
   https://github.com/HiuKwok/spark/actions/runs/7940788196/job/21687327232
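   
   As a quick sanity check (a hypothetical standalone snippet, not part of this PR), one can verify whether the missing class is resolvable on the current classpath before the metastore shutdown path trips over it:
   
   ```scala
   // Hypothetical check: confirms javax.servlet.Filter is loadable, which is
   // what HiveMetaStore's audit-logging code needs at shutdown.
   object ServletApiCheck {
     def main(args: Array[String]): Unit =
       try {
         Class.forName("javax.servlet.Filter")
         println("javax.servlet.Filter resolved OK")
       } catch {
         case _: ClassNotFoundException =>
           println("javax.servlet.Filter missing; re-add javax.servlet-api")
       }
   }
   ```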

