[ https://issues.apache.org/jira/browse/SPARK-36774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17415867#comment-17415867 ]

Apache Spark commented on SPARK-36774:
--------------------------------------

User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/34013

> Move SparkSubmitTestUtils to core and use it in SparkSubmitSuite
> ----------------------------------------------------------------
>
>                 Key: SPARK-36774
>                 URL: https://issues.apache.org/jira/browse/SPARK-36774
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>    Affects Versions: 3.0.0
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>            Priority: Minor
>
> {{SparkSubmitSuite}} test failures can be painful to debug: currently, failed 
> tests result in messages like
> {code:java}
> org.scalatest.exceptions.TestFailedException: Process returned with exit code 
> 1. See the log4j logs for more detail.
> {code}
> which force the developer to hunt through the test logs to identify the 
> root cause of the failure.
> In contrast, {{HiveSparkSubmitSuite}} packs far more useful information 
> into its test failure messages:
> {code:java}
> [info] - temporary Hive UDF: define a UDF and use it *** FAILED *** (4 
> seconds, 135 milliseconds)
> [info]   spark-submit returned with exit code 101.
> [info]   Command line: '/Users/joshrosen/oss-spark/bin/spark-submit' 
> '--class' 'wrongClassName' '--name' 'TemporaryHiveUDFTest' '--master' 
> 'local-cluster[2,1,1024]' '--conf' 'spark.ui.enabled=false' '--conf' 
> 'spark.master.rest.enabled=false' '--driver-java-options' 
> '-Dderby.system.durability=test' '--jars' 
> 'file:/Users/joshrosen/oss-spark/target/tmp/spark-32d0a47c-33eb-4488-866a-9994b3727b5b/testJar-1631766233448.jar,file:/Users/joshrosen/oss-spark/target/tmp/spark-e9e32588-83fa-43bd-a60f-9538fd30ab9a/testJar-1631766233601.jar'
>  
> 'file:/Users/joshrosen/oss-spark/target/tmp/spark-daef4628-d953-41e3-b935-a726a7796418/testJar-1631766232701.jar'
>  'SparkSubmitClassA' 'SparkSubmitClassB'
> [info]
> [info]   2021-09-15 21:23:54.752 - stderr> SLF4J: Class path contains 
> multiple SLF4J bindings.
> [info]   2021-09-15 21:23:54.752 - stderr> SLF4J: Found binding in 
> [jar:file:/Users/joshrosen/oss-spark/assembly/target/scala-2.12/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> [info]   2021-09-15 21:23:54.752 - stderr> SLF4J: Found binding in 
> [jar:file:/Users/joshrosen/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> [info]   2021-09-15 21:23:54.752 - stderr> SLF4J: See 
> http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> [info]   2021-09-15 21:23:54.753 - stderr> SLF4J: Actual binding is of type 
> [org.slf4j.impl.Log4jLoggerFactory]
> [info]   2021-09-15 21:23:55.623 - stderr> Error: Failed to load class 
> wrongClassName. (SparkSubmitTestUtils.scala:97)
> [info]   org.scalatest.exceptions.TestFailedException:
> [info]   at 
> org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
> [...]
> {code}
> This richer output is produced by {{SparkSubmitTestUtils}}, which currently 
> lives in the {{sql/hive}} module. I propose refactoring it into the 
> {{core}} module so that it can be shared by both {{SparkSubmitSuite}} and 
> {{HiveSparkSubmitSuite}}, giving the former the same detailed failure 
> messages; a rough sketch of the underlying technique appears below.
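>
> As a rough illustration of that technique (this is a minimal sketch, not 
> the actual {{SparkSubmitTestUtils}} source): the helper launches 
> spark-submit as a child process, captures its stdout/stderr, and folds the 
> command line plus the captured output into the assertion failure itself. 
> The object name {{SparkSubmitOutputCapture}} and its signature are invented 
> for illustration; the sketch assumes only {{scala.sys.process}} and 
> ScalaTest:
> {code:scala}
> import scala.collection.mutable.ArrayBuffer
> import scala.sys.process.{Process, ProcessLogger}
>
> import org.scalatest.Assertions.fail
>
> // Minimal sketch: run spark-submit, capture its output, and surface
> // everything in the failure message instead of pointing at separate logs.
> object SparkSubmitOutputCapture {
>   def runSparkSubmit(sparkHome: String, args: Seq[String]): Unit = {
>     val commandLine = Seq(s"$sparkHome/bin/spark-submit") ++ args
>     val output = new ArrayBuffer[String]()
>     // Tag each captured line with the stream it came from; the callbacks
>     // may run on separate reader threads, hence the synchronization.
>     val logger = ProcessLogger(
>       line => output.synchronized { output += s"stdout> $line" },
>       line => output.synchronized { output += s"stderr> $line" })
>     val exitCode = Process(commandLine).!(logger)
>     if (exitCode != 0) {
>       // Pack the command line and captured output into the test failure.
>       fail(
>         s"""spark-submit returned with exit code $exitCode.
>            |Command line: ${commandLine.mkString("'", "' '", "'")}
>            |
>            |${output.mkString("\n")}""".stripMargin)
>     }
>   }
> }
> {code}
> A test would then call {{runSparkSubmit}} and get the full command line and 
> process output directly in the ScalaTest failure, which is the behavior the 
> real utility provides today for the Hive suite.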


