[ 
https://issues.apache.org/jira/browse/SPARK-19836?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15899292#comment-15899292
 ] 

Song Jun commented on SPARK-19836:
----------------------------------

I have done something similar in https://github.com/apache/spark/pull/16803

> Customizable remote repository url for hive versions unit test
> --------------------------------------------------------------
>
>                 Key: SPARK-19836
>                 URL: https://issues.apache.org/jira/browse/SPARK-19836
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>    Affects Versions: 2.1.0
>            Reporter: Elek, Marton
>              Labels: ivy, unittest
>
> When the VersionSuite test runs from sql/hive it downloads different versions 
> from hive.
> Unfortunately the IsolatedClientClassloader (which is used by the 
> VersionSuite) uses a hardcoded, fixed set of repositories:
> {code}
>     val classpath = quietly {
>       SparkSubmitUtils.resolveMavenCoordinates(
>         hiveArtifacts.mkString(","),
>         SparkSubmitUtils.buildIvySettings(
>           Some("http://www.datanucleus.org/downloads/maven2"),
>           ivyPath),
>         exclusions = version.exclusions)
>     }
> {code}
> The problem is with the hard-coded repositories:
>  1. it is hard to run the unit tests in an environment where only one 
> internal Maven repository is available (and central/datanucleus are not)
>  2. it is impossible to run the unit tests against custom-built hive/hadoop 
> artifacts (which are not available from the central repository)
> VersionSuite already has a specific SPARK_VERSIONS_SUITE_IVY_PATH environment 
> variable to define a custom local repository as the ivy cache.
> I suggest adding an additional environment variable 
> (SPARK_VERSIONS_SUITE_IVY_REPOSITORIES) to HiveClientBuilder.scala, to make 
> it possible to add new remote repositories for testing the different hive 
> versions.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
