So yes, the individual artifacts are released; however, there is no
prebuilt deployable bundle for Spark 1.5.1 and Scala 2.11.7, something
like spark-1.5.1-bin-hadoop-2.6_scala-2.11.tgz. The Spark site even
states this:
*Note: Scala 2.11 users should download the Spark source package and build
A dependency couldn't be downloaded:
[INFO] +- com.h2database:h2:jar:1.4.183:test
Have you checked your network settings?
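If the network is fine and the artifact is simply stale or corrupt in your local cache, something along these lines may help (a sketch assuming Maven 3 on the PATH, using the coordinates from the log above):

```shell
# Drop any corrupt copy of the h2 artifact from the local Maven cache
rm -rf ~/.m2/repository/com/h2database/h2/1.4.183

# Re-fetch just that artifact from the remote repositories
mvn dependency:get -Dartifact=com.h2database:h2:1.4.183
```

Alternatively, re-running the build with `mvn -U` forces Maven to re-check remote repositories for updated artifacts.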
Cheers
On Sun, Oct 25, 2015 at 10:22 AM, Bilinmek Istemiyor wrote:
Thank you for the quick reply. You are a godsend. I have not programmed
in Java for a long time and know nothing about Maven, Scala, sbt, and Spark.
I used Java 7 since the build failed with Java 8. Which Java version do you
advise in general for Spark? I can downgrade the Scala version as well. Can
you
Hi Bilinmek,
Spark 1.5.x does not support Scala 2.11.7, so the easiest thing to do is to
build it like you're trying. Here are the steps I followed to build it on a
Mac OS X 10.10.5 environment; it should be very similar on Ubuntu.
1. Set the JAVA_HOME environment variable in my bash session via export
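Step 1 might look like this (the JDK path is an assumption; adjust it to wherever your JDK 7 is installed):

```shell
# Point JAVA_HOME at a JDK 7 install; this path is an assumption -- adjust to your machine
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

# On Mac OS X you could instead use the system helper:
#   export JAVA_HOME=$(/usr/libexec/java_home -v 1.7)

# Put that JDK's tools first on the PATH so the build picks them up
export PATH="$JAVA_HOME/bin:$PATH"
```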
Hm, why do you say it doesn't support 2.11? It does.
It is not even that difficult; you just need a source distribution,
and then run "./dev/change-scala-version.sh 2.11" as you say. Then
build as normal.
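Concretely, from the root of a Spark 1.5.1 source distribution, the build might look like this (a sketch; the profile names follow the 1.5.x build docs, and the Hadoop profile and distribution name are assumptions you should adjust to your environment):

```shell
# Switch the build to Scala 2.11
./dev/change-scala-version.sh 2.11

# Produce a deployable .tgz, roughly analogous to the prebuilt bundles
./make-distribution.sh --name hadoop2.6-scala2.11 --tgz \
  -Phadoop-2.6 -Pyarn -Dscala-2.11 -DskipTests
```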
On Sun, Oct 25, 2015 at 4:00 PM, Todd Nist wrote:
Sorry Sean, you are absolutely right, it supports 2.11. All I meant is that
there is no release available as a standard download and that one has to
build it. Thanks for the clarification.
-Todd
On Sunday, October 25, 2015, Sean Owen wrote:
No, 2.11 artifacts are in fact published:
http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22spark-parent_2.11%22
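For example, a published 2.11 artifact should resolve directly from Maven Central (assuming Maven 3 is on the PATH):

```shell
# Fetch a published Scala 2.11 Spark artifact into the local repository
mvn dependency:get -Dartifact=org.apache.spark:spark-core_2.11:1.5.1
```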
On Sun, Oct 25, 2015 at 7:37 PM, Todd Nist wrote: