LuciferYang commented on PR #52496:
URL: https://github.com/apache/spark/pull/52496#issuecomment-3494600687

   > @hvanhovell @xi-db, unfortunately, the [daily maven test](https://github.com/apache/spark/actions/workflows/build_maven.yml) started to fail after this patch
   > 
   > ```
   > ClientE2ETestSuite:
   > - throw SparkException with null filename in stack trace elements *** FAILED ***
   >   null was not instance of org.apache.spark.SparkException (ClientE2ETestSuite.scala:81)
   >   ...
   > - throw SparkException with large cause exception *** FAILED ***
   >   null was not instance of org.apache.spark.SparkException (ClientE2ETestSuite.scala:134)
   >   ...
   > ```
   > 
   > After a closer look, I think this is a test-only issue related to the Maven classpath and won't cause problems in real deployments
   > 
   > ```
   > org.apache.spark.SparkException: org.apache.spark.SparkClassNotFoundException: [INTERNAL_ERROR] Failed to load class: io.grpc.ClientInterceptor. Make sure the artifact where the class is defined is installed by calling session.addArtifact. SQLSTATE: XX000
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.unpackScalaUDF(SparkConnectPlanner.scala:2086)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.org$apache$spark$sql$connect$planner$SparkConnectPlanner$$unpackUdf(SparkConnectPlanner.scala:2064)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.transformScalaFunction(SparkConnectPlanner.scala:2130)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.transformScalaUDF(SparkConnectPlanner.scala:2108)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.transformCommonInlineUserDefinedFunction(SparkConnectPlanner.scala:2036)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.doTransformExpression(SparkConnectPlanner.scala:1917)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.$anonfun$transformExpression$1(SparkConnectPlanner.scala:1837)
   >    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:107)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.transformExpression(SparkConnectPlanner.scala:1837)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.transformExpression(SparkConnectPlanner.scala:1816)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.$anonfun$transformWithColumns$1(SparkConnectPlanner.scala:1303)
   >    at scala.collection.immutable.List.map(List.scala:236)
   >    at scala.collection.immutable.List.map(List.scala:79)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.transformWithColumns(SparkConnectPlanner.scala:1292)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.$anonfun$transformRelation$1(SparkConnectPlanner.scala:194)
   >    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$usePlanCache$4(SessionHolder.scala:589)
   >    at scala.Option.getOrElse(Option.scala:201)
   >    at org.apache.spark.sql.connect.service.SessionHolder.usePlanCache(SessionHolder.scala:588)
   >    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.transformRelation(SparkConnectPlanner.scala:146)
   >    at org.apache.spark.sql.connect.execution.SparkConnectPlanExecution.handlePlan(SparkConnectPlanExecution.scala:73)
   >    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1(ExecuteThreadRunner.scala:225)
   >    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1$adapted(ExecuteThreadRunner.scala:197)
   >    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$2(SessionHolder.scala:396)
   >    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:804)
   >    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$1(SessionHolder.scala:396)
   >    at org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:94)
   >    at org.apache.spark.sql.artifact.ArtifactManager.$anonfun$withResources$1(ArtifactManager.scala:112)
   >    at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:185)
   >    at org.apache.spark.sql.artifact.ArtifactManager.withClassLoaderIfNeeded(ArtifactManager.scala:102)
   >    at org.apache.spark.sql.artifact.ArtifactManager.withResources(ArtifactManager.scala:111)
   >    at org.apache.spark.sql.connect.service.SessionHolder.withSession(SessionHolder.scala:395)
   >    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.executeInternal(ExecuteThreadRunner.scala:197)
   >    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.org$apache$spark$sql$connect$execution$ExecuteThreadRunner$$execute(ExecuteThreadRunner.scala:126)
   >    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:334)
   > ```
   > 
   > For reference, there was a similar issue, #41622, but I'm afraid that solution is not applicable to this PR.
   > 
   > also cc @LuciferYang @dongjoon-hyun
   
   @xi-db Do you have time to fix this problem?
   Also cc @hvanhovell and @HyukjinKwon 
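   
   For anyone hitting the same error outside of CI: the error message itself points at `session.addArtifact` as the way to make a missing class visible to the Connect server. A minimal sketch of that workaround (the remote address and jar path below are hypothetical, and it likely doesn't apply to the Maven test-classpath case above, where the jar should already be on the server side):
   
   ```scala
   import org.apache.spark.sql.SparkSession
   
   // Hypothetical sketch: register the jar that contains the class the
   // server failed to load (here io.grpc.ClientInterceptor) with the
   // session, so the Connect server's artifact manager can resolve it.
   val spark = SparkSession.builder()
     .remote("sc://localhost:15002") // assumed local Spark Connect server
     .getOrCreate()
   spark.addArtifact("/path/to/grpc-api.jar") // path is an assumption
   ```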
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

