Your proxy or DNS could be blocking access to repo1.maven.org.
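If you are behind an HTTP proxy, sbt also needs the standard JVM proxy
properties. A rough sketch using SBT_OPTS (proxy.example.com and 8080 are
placeholders for your own proxy host and port):

    export SBT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 \
      -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"
    sbt package

You can also check that the host resolves at all from that machine, e.g. with
ping repo1.maven.org or curl -I https://repo1.maven.org/maven2/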

Thanks
Best Regards

On Tue, Oct 28, 2014 at 4:06 PM, Yanbo Liang <yanboha...@gmail.com> wrote:

> Maybe your sbt proxy configuration is wrong.
>
> 2014-10-28 18:27 GMT+08:00 nl19856 <hanspeter.sl...@gmail.com>:
>
>> Hi,
>> I have downloaded the binary Spark distribution.
>> When I build the package with sbt package, I get the following:
>> [root@nlvora157 ~]# sbt package
>> [info] Set current project to Simple Project (in build file:/root/)
>> [info] Updating {file:/root/}root...
>> [info] Resolving org.apache.spark#spark-core_2.10;1.1.0 ...
>> [warn] Host repo1.maven.org not found.
>> url=
>> https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/spark-core_2.10-1.1.0.pom
>> [info] You probably access the destination server through a proxy server
>> that is not well configured.
>> [warn]  module not found: org.apache.spark#spark-core_2.10;1.1.0
>> [warn] ==== local: tried
>> [warn]
>> /root/.ivy2/local/org.apache.spark/spark-core_2.10/1.1.0/ivys/ivy.xml
>> [warn] ==== public: tried
>> [warn]
>>
>> https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10/1.1.0/spark-core_2.10-1.1.0.pom
>> [info] Resolving org.fusesource.jansi#jansi;1.4 ...
>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>> [warn]  ::          UNRESOLVED DEPENDENCIES         ::
>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>> [warn]  :: org.apache.spark#spark-core_2.10;1.1.0: not found
>> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
>> [warn]
>> [warn]  Note: Unresolved dependencies path:
>> [warn]          org.apache.spark:spark-core_2.10:1.1.0
>> (/root/simple.sbt#L7-8)
>> [warn]            +- simple-project:simple-project_2.10:1.0
>> sbt.ResolveException: unresolved dependency:
>> org.apache.spark#spark-core_2.10;1.1.0: not found
>>
>> What am I doing wrong?
>>
>> Regards Hans-Peter
>>
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/newbie-question-quickstart-example-sbt-issue-tp17477.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>