[https://issues.apache.org/jira/browse/SPARK-8410?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14593453#comment-14593453]
Josiah Samuel Sathiadass commented on SPARK-8410:
-------------------------------------------------
My findings so far:
1. On most of our machines where the suite runs successfully, these jars
(netty-3.2.2.Final.jar, groovy-all-2.1.6.jar, asm-3.2.jar) are already
present in the following locations:
~/.ivy2/jars
~/.ivy2/cache
~/.m2/repository/
On a fresh installation these jars are absent and have to be downloaded
explicitly from the Maven repository. That download is not triggered when
Spark is built/packaged/installed/tested, since these dependencies seem to
be missing from one of the pom.xml files (please correct me if I'm wrong).
2. To confirm this, clear those jars from the directories listed above;
resolution then consistently fails to retrieve them from the repositories
on every machine.
3. The following code throws the exception while trying to resolve the
Maven coordinates via the "SparkSubmitUtils.resolveMavenCoordinates" call:
>>>
// resolve dependencies
val rr: ResolveReport = ivy.resolve(md, resolveOptions)
if (rr.hasError) {
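  // getAllProblemMessages carries the "download failed: ..." entries that
  // surface in the VersionsSuite failure quoted below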
  throw new RuntimeException(rr.getAllProblemMessages.toString)
}
<<<
Here is the verbose output of that call:
Ivy Default Cache set to: /home/joe/.ivy2/cache
The jars for the packages stored in: /home/joe/.ivy2/jars
http://www.datanucleus.org/downloads/maven2 added as a remote repository with the name: repo-1
org.apache.hive#hive-metastore added as a dependency
org.apache.hive#hive-exec added as a dependency
org.apache.hive#hive-common added as a dependency
org.apache.hive#hive-serde added as a dependency
com.google.guava#guava added as a dependency
org.apache.hadoop#hadoop-client added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found org.apache.hive#hive-metastore;0.13.1 in central
found org.apache.hive#hive-serde;0.13.1 in central
found org.apache.hive#hive-common;0.13.1 in central
found org.apache.hive#hive-shims;0.13.1 in central
found org.apache.hive.shims#hive-shims-common;0.13.1 in central
...
..
..
.
:: resolution report :: resolve 4149ms :: artifacts dl 100ms
:: modules in use:
antlr#antlr;2.7.7 from local-m2-cache in [default]
aopalliance#aopalliance;1.0 from local-m2-cache in [default]
asm#asm;3.2 from local-m2-cache in [default]
com.google.code.findbugs#jsr305;1.3.9 from local-m2-cache in [default]
com.google.guava#guava;14.0.1 from local-m2-cache in [default]
com.google.inject#guice;3.0 from local-m2-cache in [default]
...
..
..
.
:: evicted modules:
log4j#log4j;1.2.16 by [log4j#log4j;1.2.17] in [default]
com.google.guava#guava;11.0.2 by [com.google.guava#guava;14.0.1] in [default]
commons-lang#commons-lang;2.4 by [commons-lang#commons-lang;2.6] in [default]
commons-collections#commons-collections;3.1 by [commons-collections#commons-collections;3.2.1] in [default]
commons-httpclient#commons-httpclient;3.0.1 by [commons-httpclient#commons-httpclient;3.1] in [default]
org.codehaus.jackson#jackson-core-asl;1.8.8 by [org.codehaus.jackson#jackson-core-asl;1.9.2] in [default]
org.codehaus.jackson#jackson-mapper-asl;1.8.8 by [org.codehaus.jackson#jackson-mapper-asl;1.9.2] in [default]
org.apache.avro#avro;1.7.4 by [org.apache.avro#avro;1.7.5] in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 108 | 0 | 0 | 8 || 100 | 0 |
---------------------------------------------------------------------
:: problems summary ::
:::: WARNINGS
[NOT FOUND ] org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle) (2ms)
==== local-m2-cache: tried
file:/home/joe/.m2/repository/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar
[NOT FOUND ] org.codehaus.groovy#groovy-all;2.1.6!groovy-all.jar (0ms)
==== local-m2-cache: tried
file:/home/joe/.m2/repository/org/codehaus/groovy/groovy-all/2.1.6/groovy-all-2.1.6.jar
[NOT FOUND ] asm#asm;3.2!asm.jar (0ms)
==== local-m2-cache: tried
file:/home/joe/.m2/repository/asm/asm/3.2/asm-3.2.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: FAILED DOWNLOADS ::
:: ^ see resolution messages for details ^ ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle)
:: org.codehaus.groovy#groovy-all;2.1.6!groovy-all.jar
:: asm#asm;3.2!asm.jar
::::::::::::::::::::::::::::::::::::::::::::::
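For completeness, the failing resolution can be reproduced outside of the
test suite with the plain Ivy 2.x API. The sketch below is my own code, not
Spark source: it builds the same synthetic "spark-submit-parent" module and
resolves the coordinates listed at the top of the log, roughly what
SparkSubmitUtils.resolveMavenCoordinates does internally. It registers only
Maven Central (spark-submit also chains the local m2/ivy caches), and the
hadoop-client version is an assumption on my part, since the log above is
truncated before it appears.
>>>
// Standalone reproduction sketch (not Spark source). Assumes Ivy 2.x on
// the classpath; versions are taken from the resolution log above, except
// hadoop-client (assumed).
import org.apache.ivy.Ivy
import org.apache.ivy.core.module.descriptor.{DefaultDependencyDescriptor, DefaultModuleDescriptor}
import org.apache.ivy.core.module.id.ModuleRevisionId
import org.apache.ivy.core.report.ResolveReport
import org.apache.ivy.core.resolve.ResolveOptions
import org.apache.ivy.core.settings.IvySettings
import org.apache.ivy.plugins.resolver.IBiblioResolver

object ResolveHiveDeps {
  def main(args: Array[String]): Unit = {
    // An m2-compatible resolver against Maven Central.
    val central = new IBiblioResolver
    central.setM2compatible(true)
    central.setUsepoms(true)
    central.setName("central")

    val settings = new IvySettings
    settings.addResolver(central)
    settings.setDefaultResolver(central.getName)
    val ivy = Ivy.newInstance(settings)

    // Synthetic parent module, matching
    // "org.apache.spark#spark-submit-parent;1.0" in the log.
    val md = DefaultModuleDescriptor.newDefaultInstance(
      ModuleRevisionId.newInstance("org.apache.spark", "spark-submit-parent", "1.0"))

    // The dependencies listed at the top of the log.
    val coordinates = Seq(
      ("org.apache.hive", "hive-metastore", "0.13.1"),
      ("org.apache.hive", "hive-exec", "0.13.1"),
      ("org.apache.hive", "hive-common", "0.13.1"),
      ("org.apache.hive", "hive-serde", "0.13.1"),
      ("com.google.guava", "guava", "14.0.1"),
      ("org.apache.hadoop", "hadoop-client", "2.4.0")) // version assumed
    coordinates.foreach { case (org, name, rev) =>
      val dd = new DefaultDependencyDescriptor(
        md, ModuleRevisionId.newInstance(org, name, rev), false, false, true)
      dd.addDependencyConfiguration("default", "default")
      md.addDependency(dd)
    }

    // Same resolve + error check as the Spark snippet quoted earlier.
    val rr: ResolveReport = ivy.resolve(md, new ResolveOptions().setConfs(Array("default")))
    if (rr.hasError) {
      throw new RuntimeException(rr.getAllProblemMessages.toString)
    }
  }
}
<<<
Judging from the NOT FOUND entries above, Ivy appears to find the module
metadata in the local m2 cache but not the jars themselves, and it does not
fall back to another repository for the missing artifacts; that would
explain why the failure only shows up on machines with a partially
populated ~/.m2/repository.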
Kindly help me find a solution.
> Hive VersionsSuite RuntimeException
> -----------------------------------
>
> Key: SPARK-8410
> URL: https://issues.apache.org/jira/browse/SPARK-8410
> Project: Spark
> Issue Type: Question
> Components: SQL
> Affects Versions: 1.3.1, 1.4.0
> Environment: IBM Power system - P7
> running Ubuntu 14.04LE
> with IBM JDK version 1.7.0
> Reporter: Josiah Samuel Sathiadass
> Priority: Minor
>
> While testing Spark Project Hive, the following RuntimeException occurs:
> VersionsSuite:
> - success sanity check *** FAILED ***
> java.lang.RuntimeException: [download failed:
> org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
> org.codehaus.groovy#groovy-all;2.1.6!groovy-all.jar, download failed:
> asm#asm;3.2!asm.jar]
> at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
> at org.apache.spark.sql.catalyst.util.package$.quietly(package.scala:38)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$.org$apache$spark$sql$hive$client$IsolatedClientLoader$$downloadVersion(IsolatedClientLoader.scala:61)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$1.apply(IsolatedClientLoader.scala:44)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$1.apply(IsolatedClientLoader.scala:44)
> at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189)
> at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$.forVersion(IsolatedClientLoader.scala:44)
> ...
> The tests are executed with the following set of options:
> build/mvn --pl sql/hive --fail-never -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 test
> Adding the following dependencies to the "spark/sql/hive/pom.xml" file
> solves this issue:
> <dependency>
>   <groupId>org.jboss.netty</groupId>
>   <artifactId>netty</artifactId>
>   <version>3.2.2.Final</version>
>   <scope>test</scope>
> </dependency>
> <dependency>
>   <groupId>org.codehaus.groovy</groupId>
>   <artifactId>groovy-all</artifactId>
>   <version>2.1.6</version>
>   <scope>test</scope>
> </dependency>
> <dependency>
>   <groupId>asm</groupId>
>   <artifactId>asm</artifactId>
>   <version>3.2</version>
>   <scope>test</scope>
> </dependency>
> The question is: is this the correct way to fix this RuntimeException?
> If yes, can a pull request fix this issue permanently?
> If not, suggestions are welcome.