[ https://issues.apache.org/jira/browse/SPARK-22940?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16310717#comment-16310717 ]
Apache Spark commented on SPARK-22940:
--------------------------------------

User 'bersprockets' has created a pull request for this issue:
https://github.com/apache/spark/pull/20147

> Test suite HiveExternalCatalogVersionsSuite fails on platforms that don't have wget installed
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-22940
>                 URL: https://issues.apache.org/jira/browse/SPARK-22940
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>    Affects Versions: 2.2.1
>         Environment: MacOS Sierra 10.12.6
>            Reporter: Bruce Robbins
>            Priority: Minor
>
> On platforms that don't have wget installed (e.g., Mac OS X), the test suite
> org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite throws an
> exception and aborts:
>
> java.io.IOException: Cannot run program "wget": error=2, No such file or directory
>
> HiveExternalCatalogVersionsSuite uses wget to download older versions of
> Spark for compatibility testing: first to find a suitable mirror, and then
> to download a tar file from that mirror.
>
> There are several ways to fix this (in increasing order of implementation difficulty):
> 1. Require Mac OS X users to install wget if they wish to run unit tests (or
> at the very least if they wish to run HiveExternalCatalogVersionsSuite), and
> update the documentation to make this requirement explicit.
> 2. Fall back on curl when wget is not available.
> 3. Use an HTTP library to query for a suitable mirror and download the tar
> file.
>
> Option 2 is easy to implement, and I did so to get the unit test to run, but
> it still relies on another external program when wget is not installed.
> Option 3 is slightly more complex to implement and requires more
> corner-case handling (e.g., redirects).
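The option-2 fallback described above can be sketched in shell. This is a minimal illustration only, not the code from the actual pull request; the `pick_fetcher` and `fetch_file` helper names are hypothetical:

```shell
# Pick whichever downloader is installed: prefer wget, fall back to curl,
# and fail with a clear message if neither is available.
pick_fetcher() {
  if command -v wget >/dev/null 2>&1; then
    echo wget
  elif command -v curl >/dev/null 2>&1; then
    echo curl
  else
    echo "Neither wget nor curl is installed" >&2
    return 1
  fi
}

# Download $1 (URL) to $2 (output path) using the selected tool.
fetch_file() {
  url="$1"
  out="$2"
  case "$(pick_fetcher)" in
    wget) wget -q -O "$out" "$url" ;;
    curl) curl -sSL -o "$out" "$url" ;;
    *)    return 1 ;;
  esac
}
```

On a stock Mac OS X install, `pick_fetcher` would select curl, which ships with the OS, so the suite could run without any extra setup.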
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org