[ https://issues.apache.org/jira/browse/SPARK-23776?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-23776.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 2.4.0

Issue resolved by pull request 21628
[https://github.com/apache/spark/pull/21628]

> pyspark-sql tests should display build instructions when components are 
> missing
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-23776
>                 URL: https://issues.apache.org/jira/browse/SPARK-23776
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Bruce Robbins
>            Assignee: Bruce Robbins
>            Priority: Minor
>             Fix For: 2.4.0
>
>
> This is a follow up to SPARK-23417.
> The pyspark-streaming tests print useful build instructions when certain 
> components are missing from the build.
> pyspark-sql's UDF and readwrite tests also have specific build requirements: 
> the build must compile the test Scala files, and it must also create the 
> Hive assembly. When those class or jar files are absent, the tests throw 
> only partially helpful exceptions, e.g.:
> {noformat}
> AnalysisException: u'Can not load class 
> test.org.apache.spark.sql.JavaStringLength, please make sure it is on the 
> classpath;'
> {noformat}
> or
> {noformat}
> IllegalArgumentException: u"Error while instantiating 
> 'org.apache.spark.sql.hive.HiveExternalCatalog':"
> {noformat}
> You end up in this situation when you follow Spark's build instructions and 
> then attempt to run the pyspark tests.
> It would be nice if the pyspark-sql tests provided helpful build 
> instructions in these cases.
>   
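
A rough sketch of the kind of guard such a test could use, in the spirit of the
pyspark-streaming checks. The helper name, search path, and suggested build
command below are illustrative assumptions, not the code merged in pull
request 21628:

{noformat}
import glob
import os
import unittest


def _test_udf_classes_present(spark_home):
    # Look for the compiled test class that the UDF tests register, e.g.
    # test.org.apache.spark.sql.JavaStringLength (search path is hypothetical).
    pattern = os.path.join(
        spark_home, "sql", "core", "target", "*", "test-classes",
        "test", "org", "apache", "spark", "sql", "JavaStringLength.class")
    return len(glob.glob(pattern)) > 0


_spark_home = os.environ.get("SPARK_HOME", os.getcwd())
_have_test_udfs = _test_udf_classes_present(_spark_home)


@unittest.skipIf(
    not _have_test_udfs,
    "Missing compiled Spark SQL test classes (e.g. "
    "test.org.apache.spark.sql.JavaStringLength); build them first, "
    "for example with: build/sbt 'sql/test:package'")
class UDFTests(unittest.TestCase):
    def test_java_udf_registration(self):
        # The real test would register the Java UDF and assert on its result;
        # the point here is that a missing build now yields a clear skip
        # message instead of the AnalysisException quoted above.
        pass


if __name__ == "__main__":
    unittest.main()
{noformat}

A similar check could look for the Hive assembly jar before the readwrite
tests run, pointing the user at a -Phive build when it is missing.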



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
