Hi,

> Unfortunately I am new to Scala so I don’t even know what ‘implicit default’
> means, if indeed you’re referring to Scala!
>

I was merely referring to the fact that maven-central is often set as the
default repository somewhere, either internally (i.e. in the source) or
externally (Zeppelin config, Maven/Aether config...), and figuring out where
this setting comes from is going to take some... patience.
Nothing Scala-specific in this case, but rather something inherited from
sonatype-aether.
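
As a possible workaround (untested on my side, and assuming the `%dep`
interpreter API as I understand it from the docs), the repository list can
apparently be overridden from a notebook paragraph before loading anything,
pointing at an internal mirror instead of Maven Central. The repository name
and URL below are placeholders for your own mirror:

```scala
%dep
// Hypothetical mirror name and URL -- substitute your internal repository.
z.reset()
z.addRepo("internal-mirror").url("http://nexus.example.com/content/groups/public")
z.load("com.example:some-artifact:1.0.0")
```

This wouldn't help with the unit tests, though, since those presumably pick
up the implicit default without going through the notebook API.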

Also, for info/comparison, here are the traces from the failing unit tests:

Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 102.584 sec
<<< FAILURE! - in org.apache.zeppelin.spark.SparkInterpreterTest
testZContextDependencyLoading(org.apache.zeppelin.spark.SparkInterpreterTest)
Time elapsed: 63.612 sec  <<< FAILURE!
java.lang.AssertionError: expected:<SUCCESS> but was:<ERROR>
        at org.junit.Assert.fail(Assert.java:88)
        at org.junit.Assert.failNotEquals(Assert.java:743)
        at org.junit.Assert.assertEquals(Assert.java:118)
        at org.junit.Assert.assertEquals(Assert.java:144)
        at
org.apache.zeppelin.spark.SparkInterpreterTest.testZContextDependencyLoading(SparkInterpreterTest.java:159)

which is followed by

Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 69.667 sec
<<< FAILURE! - in org.apache.zeppelin.spark.DepInterpreterTest
testDefault(org.apache.zeppelin.spark.DepInterpreterTest)  Time elapsed:
69.528 sec  <<< ERROR!
java.lang.NullPointerException: null
        at
org.sonatype.aether.impl.internal.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:352)
        at
org.apache.zeppelin.spark.dep.DependencyContext.fetchArtifactWithDep(DependencyContext.java:141)
        at
org.apache.zeppelin.spark.dep.DependencyContext.fetch(DependencyContext.java:98)
        at
org.apache.zeppelin.spark.DepInterpreter.interpret(DepInterpreter.java:189)
        at
org.apache.zeppelin.spark.DepInterpreterTest.testDefault(DepInterpreterTest.java:88)


I'll give this a few hours yet before opening an issue; maybe I'm just
"holding it wrong". I doubt an issue will push this along much faster
either, unless one of us actually submits a patch/PR to go along with it ;)

Best,

Rick



> *From:* Rick Moritz [mailto:rah...@gmail.com]
> *Sent:* 21 September 2015 15:19
> *To:* users@zeppelin.incubator.apache.org
> *Cc:* Partridge, Lucas (GE Aviation)
> *Subject:* RE: Can't load a dependency (because I'm behind a proxy?)
>
>
>
> Hello Lucas, hello list,
>
> hopefully this message will thread properly.
>
> This problem can actually be reproduced by the corresponding unit tests -
> at least on my "disconnected" system, the corresponding tests for the
> SparkInterpreter fail in exactly the same way as your code does. This is
> also an issue for me, since I will probably have to get those tests to
> pass, in order to deploy Zeppelin on our production system.
>
> Since this actually fails unit tests, I think creating a corresponding
> issue is a logical next step.
>
> I'm currently looking at the code of the test to figure out which
> component is responsible for directing the dependency lookup to the target,
> and how this can be overridden, but there's probably some implicit default
> in use, which makes figuring out the root cause slightly trickier.
>
> Have you had a look at where this could be overridden yet? Filed an issue
> already?
>
> Unless we get some progress going in this thread, we should start the
> usual procedures...
>
> Thanks and Best,
>
> Rick
>
