Re: spark 2.4.3 build fails using java 8 and scala 2.11 with NumberFormatException: Not a version: 9

2019-05-19 Thread Bulldog20630405
after blowing away my m2 repo cache, i was able to build just fine... i don't know why, but now it works :-) On Sun, May 19, 2019 at 10:22 PM Bulldog20630405 wrote: > i am trying to build spark 2.4.3 with the following env: > >- fedora 29 >- 1.8.0_202 >- spark 2.4.3 >- scala 2.11.

spark 2.4.3 build fails using java 8 and scala 2.11 with NumberFormatException: Not a version: 9

2019-05-19 Thread Bulldog20630405
i am trying to build spark 2.4.3 with the following env: - fedora 29 - 1.8.0_202 - spark 2.4.3 - scala 2.11.12 - maven 3.5.4 - hadoop 2.6.5 according to the documentation this can be done with the following commands: *export TERM=xterm-color* *./build/mvn -Pyarn -DskipTests clea
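The documented Maven build for a 2.4.x checkout looks roughly like the following. This is a sketch based on Spark's building-spark docs; the profile choices (-Pyarn, -Phadoop-2.6) are assumptions matching the poster's stated environment, and the cache wipe reflects the fix the poster eventually found:

```shell
# Sketch: build Spark 2.4.x with Java 8 and Scala 2.11 (the default
# Scala version for the 2.4 line). Run from the Spark source root.
# If the build fails with odd version-parse errors, clearing cached
# Spark artifacts from the local Maven repo resolved it in this thread.
rm -rf ~/.m2/repository/org/apache/spark
export TERM=xterm-color
./build/mvn -Pyarn -Phadoop-2.6 -DskipTests clean package
```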

Re: Build spark source code with scala 2.11

2019-03-12 Thread Stephen Boesch
You might have better luck downloading the 2.4.X branch. On Tue, 12 Mar 2019 at 16:39, swastik mittal wrote: > Then is the MLlib of spark compatible with scala 2.12? Or can I change the > spark version from spark3.0 to 2.3 or 2.4 in local spark/master? > > > > -- > Sent from: http://apache

Re: Build spark source code with scala 2.11

2019-03-12 Thread swastik mittal
Then is the MLlib of spark compatible with scala 2.12? Or can I change the spark version from spark3.0 to 2.3 or 2.4 in local spark/master? -- Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/ - To unsubscribe e-m

Re: Build spark source code with scala 2.11

2019-03-12 Thread Stephen Boesch
I think scala 2.11 support was removed with the spark3.0/master. On Tue, 12 Mar 2019 at 16:26, swastik mittal wrote: > I am trying to build my spark using build/sbt package, after changing the > scala versions to 2.11 in pom.xml because my applications jar files use > scala

Build spark source code with scala 2.11

2019-03-12 Thread swastik mittal
I am trying to build my spark using build/sbt package, after changing the scala versions to 2.11 in pom.xml because my applications jar files use scala 2.11. But building the spark code gives an error in sql saying "A method with a varargs annotation produces a forwarder method with the

Re: Spark 2.0 Scala 2.11 and Kafka 0.10 Scala 2.10

2017-02-08 Thread Cody Koeninger
com wrote: > Dear devs, > > is it possible to use Spark 2.0.2 Scala 2.11 and consume messages from kafka > server 0.10.0.2 running on Scala 2.10? > I tried this for the last two days using createDirectStream and can't get any > message out of kafka?! > > I'm using HDP 2.

Spark 2.0 Scala 2.11 and Kafka 0.10 Scala 2.10

2017-02-08 Thread u...@moosheimer.com
Dear devs, is it possible to use Spark 2.0.2 Scala 2.11 and consume messages from kafka server 0.10.0.2 running on Scala 2.10? I tried this for the last two days using createDirectStream and can't get any message out of kafka?! I'm using HDP 2.5.3 running kafka_2.10-0.10.0.2.5.3.0-37

Re: Assembly for Kafka >= 0.10.0, Spark 2.2.0, Scala 2.11

2017-01-18 Thread Cody Koeninger
at 2:21 AM, Karamba wrote: > Hi, I am looking for an assembly for Spark 2.2.0 with Scala 2.11. I > can't find one in MVN Repository. Moreover, "org.apache.spark" %% > "spark-streaming-kafka-0-10_2.11" % "2.1.0" shows that even sbt does not > find one:

Assembly for Kafka >= 0.10.0, Spark 2.2.0, Scala 2.11

2017-01-18 Thread Karamba
Hi, I am looking for an assembly for Spark 2.2.0 with Scala 2.11. I can't find one in MVN Repository. Moreover, "org.apache.spark" %% "spark-streaming-kafka-0-10_2.11" % "2.1.0" shows that even sbt does not find one: [error] (*:update) sbt.ResolveException: unresolve
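The kafka-0-10 integration is published as an ordinary dependency rather than an assembly jar, and its version tracks the Spark release, not the Kafka broker version. A hedged build.sbt sketch (version strings are assumptions for Spark 2.2.0 on Scala 2.11):

```scala
// build.sbt sketch: the artifact version should match the *Spark*
// release in use (2.2.0 here), not the Kafka server version.
// %% appends the Scala binary suffix (_2.11) automatically.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming"            % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
)
```

The streaming-kafka artifact is deliberately left unscoped so it is bundled into the application jar, since it is not part of the Spark distribution.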

Re: Is there a way to run a jar built for scala 2.11 on spark 1.6.1 (which is using 2.10?)

2016-05-18 Thread Ted Yu
Depending on the version of hadoop you use, you may find tar ball prebuilt with Scala 2.11: https://s3.amazonaws.com/spark-related-packages FYI On Wed, May 18, 2016 at 3:35 PM, Koert Kuipers wrote: > no but you can trivially build spark 1.6.1 for scala 2.11 > > On Wed, May 18, 2016 a

Re: Is there a way to run a jar built for scala 2.11 on spark 1.6.1 (which is using 2.10?)

2016-05-18 Thread Koert Kuipers
no but you can trivially build spark 1.6.1 for scala 2.11 On Wed, May 18, 2016 at 6:11 PM, Sergey Zelvenskiy wrote: > >

Is there a way to run a jar built for scala 2.11 on spark 1.6.1 (which is using 2.10?)

2016-05-18 Thread Sergey Zelvenskiy

Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
Jenkins jobs have been running against Scala 2.11: > > [INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ > java8-tests_2.11 --- > > > FYI > > > On Mon, May 16, 2016 at 2:45 PM, Eric Richardson > wrote: > >> On Thu, May 12, 2016 at

Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Ted Yu
For 2.0, I believe that is the case. Jenkins jobs have been running against Scala 2.11: [INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ java8-tests_2.11 --- FYI On Mon, May 16, 2016 at 2:45 PM, Eric Richardson wrote: > On Thu, May 12, 2016 at 9:23 PM, Luci

Re: sbt for Spark build with Scala 2.11

2016-05-16 Thread Eric Richardson
On Thu, May 12, 2016 at 9:23 PM, Luciano Resende wrote: > Spark has moved to build using Scala 2.11 by default in master/trunk. > Does this mean that the pre-built binaries for download will also move to 2.11 as well? > > > As for the 2.0.0-SNAPSHOT, it is actually the version

Re: sbt for Spark build with Scala 2.11

2016-05-13 Thread Raghava Mutharaju
Dependencies += sparksql ) Regards, Raghava. On Fri, May 13, 2016 at 12:23 AM, Luciano Resende wrote: > Spark has moved to build using Scala 2.11 by default in master/trunk. > > As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and > you might be missing

Re: sbt for Spark build with Scala 2.11

2016-05-12 Thread Luciano Resende
Spark has moved to build using Scala 2.11 by default in master/trunk. As for the 2.0.0-SNAPSHOT, it is actually the version of master/trunk and you might be missing some modules/profiles for your build. What command did you use to build ? On Thu, May 12, 2016 at 9:01 PM, Raghava Mutharaju

sbt for Spark build with Scala 2.11

2016-05-12 Thread Raghava Mutharaju
Hello All, I built Spark from the source code available at https://github.com/apache/spark/. Although I haven't specified the "-Dscala-2.11" option (to build with Scala 2.11), from the build messages I see that it ended up using Scala 2.11. Now, for my application sbt, what shou
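For the application's own sbt build against a locally built Spark, a minimal sketch (the version numbers are illustrative assumptions, not taken from the thread):

```scala
// Minimal build.sbt for an app compiled against a Scala 2.11 Spark build.
// %% resolves artifacts with the matching _2.11 suffix automatically,
// so no suffix is hard-coded in the artifact name.
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0-SNAPSHOT" % "provided"

// A SNAPSHOT installed locally (mvn install) is picked up from the
// local Maven repository.
resolvers += Resolver.mavenLocal
```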

spark w/ scala 2.11 and PackratParsers

2016-05-04 Thread matd
Hi folks, Our project is a mess of scala 2.10 and 2.11, so I tried to switch everything to 2.11. I had some exasperating errors like this: java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/datasources/DDLParser at org.apache.spark.sql.SQLContext.(SQLContext.scala:208) at org.apache

Re: Launching EC2 instances with Spark compiled for Scala 2.11

2016-01-25 Thread Darren Govoni
Why not deploy it. Then build a custom distribution with Scala 2.11 and just overlay it. Sent from my Verizon Wireless 4G LTE smartphone Original message From: Nuno Santos Date: 01/25/2016 7:38 AM (GMT-05:00) To: user@spark.apache.org Subject: Re: Launching EC2

Re: Launching EC2 instances with Spark compiled for Scala 2.11

2016-01-25 Thread Nuno Santos
Hello, Any updates on this question? I'm also very interested in a solution, as I'm trying to use Spark on EC2 but need Scala 2.11 support. The scripts in the ec2 directory of the Spark distribution install use Scala 2.10 by default and I can't see any obvious option to chang

"impossible to get artifacts " error when using sbt to build 1.6.0 for scala 2.11

2016-01-07 Thread Lin Zhao
I tried to build 1.6.0 for yarn and scala 2.11, but have an error. Any help is appreciated. [warn] Strategy 'first' was applied to 2 files [info] Assembly up to date: /Users/lin/git/spark/network/yarn/target/scala-2.11/spark-network-yarn-1.6.0-hadoop2.7.1.jar java.lang.IllegalStat

Re: Scala 2.11 and Akka 2.4.0

2015-12-07 Thread RodrigoB
Hi Manas, Thanks for the reply. I've done that. The problem lies with Spark + akka 2.4.0 build. Seems the maven shader plugin is altering some class files and breaking the Akka runtime. Seems the Spark build on Scala 2.11 using SBT is broken. I'm getting build errors using sbt due to

Re: Scala 2.11 and Akka 2.4.0

2015-12-05 Thread manasdebashiskar
There are steps to build spark using scala 2.11 in the spark docs. The first step is dev/change-scala-version.sh 2.11, which changes the scala version to 2.11. I have not tried compiling spark with akka 2.4.0. ..Manas -- View this message in context: http://apache-spark-user-list.1001560.n3
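A sketch of those steps for a 1.5-era source tree (the akka 2.4.0 bump itself is untested, as Manas notes; profile choices are assumptions):

```shell
# 1) Rewrite the POMs so artifact names carry the _2.11 suffix.
dev/change-scala-version.sh 2.11
# 2) Build with the scala-2.11 property set, which activates the
#    matching profile.
build/mvn -Dscala-2.11 -DskipTests clean package
```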

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Iulian Dragoș
ge- > From: Jacek Laskowski [mailto:ja...@japila.pl] > Sent: 01 December 2015 18:17 > To: Boavida, Rodrigo > Cc: user > Subject: Re: Scala 2.11 and Akka 2.4.0 > > On Tue, Dec 1, 2015 at 2:32 PM, RodrigoB > wrote: > > > I'm currently trying to build spar

RE: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Boavida, Rodrigo
's not available yet. Any suggestions are very welcomed. Tnks, Rod -Original Message- From: Jacek Laskowski [mailto:ja...@japila.pl] Sent: 01 December 2015 18:17 To: Boavida, Rodrigo Cc: user Subject: Re: Scala 2.11 and Akka 2.4.0 On Tue, Dec 1, 2015 at 2:32 PM, RodrigoB wro

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Jacek Laskowski
On Tue, Dec 1, 2015 at 2:32 PM, RodrigoB wrote: > I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0. Why? AFAIK Spark's leaving Akka's boat and joins Netty's. Jacek - To unsubscribe

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
From the dependency tree, akka 2.4.0 was in effect. Maybe check the classpath of master to see if there is an older version of akka. Cheers

RE: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Boavida, Rodrigo
Thanks, that worked! I'll let you know the results. Tnks, Rod From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: 01 December 2015 15:36 To: Boavida, Rodrigo Cc: user@spark.apache.org Subject: Re: Scala 2.11 and Akka 2.4.0 Please specify the following in your maven commands: -Dscala-2.11 Cheers This

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
Please specify the following in your maven commands: -Dscala-2.11 Cheers

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
p 1] > > So it seems somehow it's still pulling some 2.10 dependencies. Do you > think this could be the cause for the observed problem? > > tnks, > Rod > > -Original Message- > From: Ted Yu [mailto:yuzhih...@gmail.com] > Sent: 01 December 2015 14:13 >

RE: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Boavida, Rodrigo
So it seems somehow it's still pulling some 2.10 dependencies. Do you think this could be the cause for the observed problem? tnks, Rod -Original Message- From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: 01 December 2015 14:13 To: Boavida, Rodrigo Cc: user@spark.apache.org Subject:

Re: Scala 2.11 and Akka 2.4.0

2015-12-01 Thread Ted Yu
Have you run 'mvn dependency:tree' and examined the output ? There should be some hint. Cheers > On Dec 1, 2015, at 5:32 AM, RodrigoB wrote: > > Hi, > > I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0. > I've changed the main pom.x

Scala 2.11 and Akka 2.4.0

2015-12-01 Thread RodrigoB
Hi, I'm currently trying to build spark with Scala 2.11 and Akka 2.4.0. I've changed the main pom.xml files to corresponding akka version and am getting the following exception when starting the master on standalone: Exception Details: Location: akk

Re: Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Sean Owen
Did you switch the build to Scala 2.11 by running the script in dev/? It won't work otherwise, but does work if you do. @Ted 2.11 was supported in 1.4, not just 1.5. On Mon, Oct 26, 2015 at 2:13 PM, Bryan Jeffrey wrote: > All, > > The error resolved to a bad version of jline pull

Re: Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Bryan Jeffrey
an Jeffrey On Mon, Oct 26, 2015 at 9:01 AM, Bryan Jeffrey wrote: > All, > > I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & Hive > support. Any ideas? > > mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive > -Phive-thriftserver pac

Re: Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Ted Yu
Scala 2.11 is supported in 1.5.1 release: http://search.maven.org/#search%7Cga%7C1%7Ca%3A%22spark-parent_2.11%22 Can you upgrade ? Cheers On Mon, Oct 26, 2015 at 6:01 AM, Bryan Jeffrey wrote: > All, > > I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & H

Error Compiling Spark 1.4.1 w/ Scala 2.11 & Hive Support

2015-10-26 Thread Bryan Jeffrey
All, I'm seeing the following error compiling Spark 1.4.1 w/ Scala 2.11 & Hive support. Any ideas? mvn -Dhadoop.version=2.6.1 -Dscala-2.11 -DskipTests -Pyarn -Phive -Phive-thriftserver package [INFO] Spark Project Parent POM .. SUCCESS [4.124s] [INFO] Spark

Re: Building with SBT and Scala 2.11

2015-10-14 Thread Adrian Tanase
ers On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase mailto:atan...@adobe.com>> wrote: Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works. -adrian Sent from my iPhone On 14 Oct 2015, a

Re: Building with SBT and Scala 2.11

2015-10-14 Thread Jakob Odersky
Tanase wrote: > >> Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also >> compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works. >> >> -adrian >> >> Sent from my iPhone >> >> On 14 Oct 2015, at 03:53, Jak

Re: Building with SBT and Scala 2.11

2015-10-14 Thread Ted Yu
Adrian: Likely you were using maven. Jakob's report was with sbt. Cheers On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase wrote: > Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also > compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it wo

Re: Building with SBT and Scala 2.11

2015-10-13 Thread Adrian Tanase
Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works. -adrian Sent from my iPhone On 14 Oct 2015, at 03:53, Jakob Odersky mailto:joder...@gmail.com>> wrote: I'm having trouble comp

Re: Building with SBT and Scala 2.11

2015-10-13 Thread Ted Yu
See this thread: http://search-hadoop.com/m/q3RTtY7aX22B44dB On Tue, Oct 13, 2015 at 5:53 PM, Jakob Odersky wrote: > I'm having trouble compiling Spark with SBT for Scala 2.11. The command I > use is: > > dev/change-version-to-2.11.sh > build/sbt -Pyarn -Phado

Building with SBT and Scala 2.11

2015-10-13 Thread Jakob Odersky
I'm having trouble compiling Spark with SBT for Scala 2.11. The command I use is: dev/change-version-to-2.11.sh build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11 followed by compile in the sbt shell. The error I get specifically is: spark/core/src/main/scala/org/apache/spark/rpc/
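The -Phadoop-2.11 flag above looks like the culprit: Hadoop profiles are named after Hadoop versions (hadoop-2.4, hadoop-2.6, and so on), and no hadoop-2.11 profile exists, which matches Adrian's question in the replies. A hedged corrected invocation for a 1.5-era tree (-Phadoop-2.6 is an assumption; substitute your Hadoop version):

```shell
# Rewrite POMs for Scala 2.11, then build with a real Hadoop profile.
dev/change-version-to-2.11.sh
build/sbt -Pyarn -Phadoop-2.6 -Dscala-2.11 compile
```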

Re: Launching EC2 instances with Spark compiled for Scala 2.11

2015-10-08 Thread Aniket Bhatnagar
here is an easy way launch EC2 instances which have a > Spark built for Scala 2.11. > > The only way I can think of is to prepare the sources for 2.11 as shown in > the Spark build instructions ( > http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211), >

Launching EC2 instances with Spark compiled for Scala 2.11

2015-10-08 Thread Theodore Vasiloudis
Hello, I was wondering if there is an easy way to launch EC2 instances which have a Spark built for Scala 2.11. The only way I can think of is to prepare the sources for 2.11 as shown in the Spark build instructions ( http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211

Re: Spark does not yet support its JDBC component for Scala 2.11.

2015-09-21 Thread Ted Yu
I think the document should be updated to reflect the integration of SPARK-8013 Cheers On Mon, Sep 21, 2015 at 3:48 AM, Petr Novak wrote: > Nice, thanks. > > So the note in build instruction for 2.11 is obsolete? Or there are still > some limit

Re: Spark does not yet support its JDBC component for Scala 2.11.

2015-09-21 Thread Petr Novak
Nice, thanks. So the note in build instruction for 2.11 is obsolete? Or there are still some limitations? http://spark.apache.org/docs/latest/building-spark.html#building-for-scala-211 On Fri, Sep 11, 2015 at 2:19 PM, Petr Novak wrote: > Nice, thanks. > > So the note in build instruction for 2

Re: Spark does not yet support its JDBC component for Scala 2.11.

2015-09-11 Thread Ted Yu
Have you looked at: https://issues.apache.org/jira/browse/SPARK-8013 > On Sep 11, 2015, at 4:53 AM, Petr Novak wrote: > > Does it still apply for 1.5.0? > > What actual limitation does it mean when I switch to 2.11? No JDBC > Thriftserver? No JDBC DataSource? No JdbcRDD (which is already obs

Spark does not yet support its JDBC component for Scala 2.11.

2015-09-11 Thread Petr Novak
Does it still apply for 1.5.0? What actual limitation does it mean when I switch to 2.11? No JDBC Thriftserver? No JDBC DataSource? No JdbcRDD (which is already obsolete I believe)? Some more? What library is the blocker to upgrade JDBC component to 2.11? Is there any estimate when it could be a

Re: Issue with building Spark v1.4.1-rc4 with Scala 2.11

2015-08-26 Thread Ted Yu
Have you run dev/change-version-to-2.11.sh ? Cheers On Wed, Aug 26, 2015 at 7:07 AM, Felix Neutatz wrote: > Hi everybody, > > I tried to build Spark v1.4.1-rc4 with Scala 2.11: > ../apache-maven-3.3.3/bin/mvn -Dscala-2.11 -DskipTests clean install > > Before running this, I

Fwd: Issue with building Spark v1.4.1-rc4 with Scala 2.11

2015-08-26 Thread Felix Neutatz
Hi everybody, I tried to build Spark v1.4.1-rc4 with Scala 2.11: ../apache-maven-3.3.3/bin/mvn -Dscala-2.11 -DskipTests clean install Before running this, I deleted: ../.m2/repository/org/apache/spark ../.m2/repository/org/spark-project My changes to the code: I just changed line 174 of

Re: spark and scala-2.11

2015-08-24 Thread Lanny Ripple
We're going to be upgrading from spark 1.0.2 and using hadoop-1.2.1 so need to build by hand. (Yes, I know. Use hadoop-2.x but standard resource constraints apply.) I want to build against scala-2.11 and publish to our artifact repository but finding build/spark-2.10.4 and tracing down

Re: spark and scala-2.11

2015-08-24 Thread Jonathan Coveney
I've used the instructions and it worked fine. Can you post exactly what you're doing, and what it fails with? Or are you just trying to understand how it works? 2015-08-24 15:48 GMT-04:00 Lanny Ripple : > Hello, > > The instructions for building spark against scala-

Re: spark and scala-2.11

2015-08-24 Thread Sean Owen
The property "scala-2.11" triggers the profile "scala-2.11" -- and additionally disables the scala-2.10 profile, so that's the way to do it. But yes, you also need to run the script before-hand to set up the build for Scala 2.11 as well. On Mon, Aug 24, 2015 at 8:4

spark and scala-2.11

2015-08-24 Thread Lanny Ripple
Hello, The instructions for building spark against scala-2.11 indicate using -Dspark-2.11. When I look in the pom.xml I find a profile named 'spark-2.11' but nothing that would indicate I should set a property. The sbt build seems to need the -Dscala-2.11 property set. Finally build/

Re: Spark on scala 2.11 build fails due to incorrect jline dependency in REPL

2015-08-17 Thread Stephen Boesch
In 1.4 it is change-scala-version.sh 2.11. But the problem was it is a -Dscala-2.11 not a -P. I misread the docs. 2015-08-17 14:17 GMT-07:00 Ted Yu : > You were building against 1.4.x, right ? > > In master branch, switch-to-scala-2.11.sh is gone. There is scala-2.11 >

Re: Spark on scala 2.11 build fails due to incorrect jline dependency in REPL

2015-08-17 Thread Ted Yu
You were building against 1.4.x, right ? In master branch, switch-to-scala-2.11.sh is gone. There is scala-2.11 profile. FYI On Sun, Aug 16, 2015 at 11:12 AM, Stephen Boesch wrote: > > I am building spark with the following options - most notably the > **scala-2.11**: > >

Spark on scala 2.11 build fails due to incorrect jline dependency in REPL

2015-08-16 Thread Stephen Boesch
I am building spark with the following options - most notably the **scala-2.11**: . dev/switch-to-scala-2.11.sh mvn -Phive -Pyarn -Phadoop-2.6 -Dhadoop2.6.2 -Pscala-2.11 -DskipTests -Dmaven.javadoc.skip=true clean package The build goes pretty far but fails in one of the minor modules

Re: master compile broken for scala 2.11

2015-07-14 Thread Josh Rosen
I've opened a PR to fix this; please take a look: https://github.com/apache/spark/pull/7405 On Tue, Jul 14, 2015 at 11:22 AM, Koert Kuipers wrote: > it works for scala 2.10, but for 2.11 i get: > > [ERROR] > /home/koert/src/spark/sql/catalyst/src/main/java/org/apache/spark/sql/execution/UnsafeEx

master compile broken for scala 2.11

2015-07-14 Thread Koert Kuipers
it works for scala 2.10, but for 2.11 i get: [ERROR] /home/koert/src/spark/sql/catalyst/src/main/java/org/apache/spark/sql/execution/UnsafeExternalRowSorter.java:135: error: is not abstract and does not override abstract method minBy(Function1,Ordering) in TraversableOnce [ERROR] return new

Re: Roadmap for Spark with Kafka on Scala 2.11?

2015-06-04 Thread Tathagata Das
On Tue, May 26, 2015 at 10:09 AM, algermissen1971 < algermissen1...@icloud.com> wrote: > Hi, > > I am setting up a project that requires Kafka support and I wonder what > the roadmap is for Scala 2.11 Support (including Kafka

Re: Roadmap for Spark with Kafka on Scala 2.11?

2015-06-04 Thread algermissen1971
Hi Iulian, On 26 May 2015, at 13:04, Iulian Dragoș wrote: > > On Tue, May 26, 2015 at 10:09 AM, algermissen1971 > wrote: > Hi, > > I am setting up a project that requires Kafka support and I wonder what the > roadmap is for Scala 2.11 Support (including Kafka). >

Re: Roadmap for Spark with Kafka on Scala 2.11?

2015-05-26 Thread Iulian Dragoș
On Tue, May 26, 2015 at 10:09 AM, algermissen1971 < algermissen1...@icloud.com> wrote: > Hi, > > I am setting up a project that requires Kafka support and I wonder what > the roadmap is for Scala 2.11 Support (including Kafka). > > Can we expect to see 2.11 support anyti

Roadmap for Spark with Kafka on Scala 2.11?

2015-05-26 Thread algermissen1971
Hi, I am setting up a project that requires Kafka support and I wonder what the roadmap is for Scala 2.11 Support (including Kafka). Can we expect to see 2.11 support anytime soon? Jan - To unsubscribe, e-mail: user-unsubscr

Re: spark-shell breaks for scala 2.11 (with yarn)?

2015-05-08 Thread Koert Kuipers
i searched the jiras but couldn't find any recent mention of this. let me try with the 1.4.0 branch and see if it goes away... On Wed, May 6, 2015 at 3:05 PM, Koert Kuipers wrote: > hello all, > i build spark 1.3.1 (for cdh 5.3 with yarn) twice: for scala 2.10 and > scala 2.11. i am run

Re: branch-1.4 scala 2.11

2015-05-07 Thread Iulian Dragoș
There's an open PR to fix it: https://github.com/apache/spark/pull/5966 On Thu, May 7, 2015 at 6:07 PM, Koert Kuipers wrote: > i am having no luck using the 1.4 branch with scala 2.11 > > $ build/mvn -DskipTests -Pyarn -Dscala-2.11 -Pscala-2.11 clean package > > [erro

branch-1.4 scala 2.11

2015-05-07 Thread Koert Kuipers
i am having no luck using the 1.4 branch with scala 2.11 $ build/mvn -DskipTests -Pyarn -Dscala-2.11 -Pscala-2.11 clean package [error] /home/koert/src/opensource/spark/core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala:78: in object RDDOperationScope, multiple overloaded

spark-shell breaks for scala 2.11 (with yarn)?

2015-05-06 Thread Koert Kuipers
hello all, i build spark 1.3.1 (for cdh 5.3 with yarn) twice: for scala 2.10 and scala 2.11. i am running on a secure cluster. the deployment configs are identical. i can launch jobs just fine on both the scala 2.10 and scala 2.11 versions. spark-shell works on the scala 2.10 version, but not on

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Michael Allman
ilding Spark against 2.11.2 >> and still saw the problems with the REPL. I've created a bug report: >> >> https://issues.apache.org/jira/browse/SPARK-6989 >> >> I hope this helps. >> >> Cheers, >> >> Michael >> >> On Apr

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Sean Owen
015, at 1:41 AM, Sean Owen wrote: > > Doesn't this reduce to "Scala isn't compatible with itself across > maintenance releases"? Meaning, if this were "fixed" then Scala > 2.11.{x < 6} would have similar failures. It's not not-ready; it's > jus

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Michael Allman
ache.org/jira/browse/SPARK-6989> I hope this helps. Cheers, Michael > On Apr 17, 2015, at 1:41 AM, Sean Owen wrote: > > Doesn't this reduce to "Scala isn't compatible with itself across > maintenance releases"? Meaning, if this were "fixed" then Scal

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Sean Owen
Doesn't this reduce to "Scala isn't compatible with itself across maintenance releases"? Meaning, if this were "fixed" then Scala 2.11.{x < 6} would have similar failures. It's not not-ready; it's just not the Scala 2.11.6 REPL. Still, sure I'd f

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-17 Thread Michael Allman
'm experiencing some serious stability problems simply trying to run the Spark 1.3 Scala 2.11 REPL. Most of the time it fails to load and spews a torrent of compiler assertion failures, etc. See attached. spark@dp-cluster-master-node-001:~/spark/bin$ spark-shell Spark Command: java -cp /opt/spark

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread Alex Nakos
ssues.apache.org/jira/browse/SPARK-2988 >> >> Any help in getting this working would be much appreciated! >> >> Thanks >> Alex >> >> On Thu, Apr 9, 2015 at 11:32 AM, Prashant Sharma >> wrote: >> >>> You are right this needs to be done. I

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread Prashant Sharma
a > wrote: > >> You are right this needs to be done. I can work on it soon, I was not >> sure if there is any one even using scala 2.11 spark repl. Actually there >> is a patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), >> which has to be porte

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread Alex Nakos
> if there is any one even using scala 2.11 spark repl. Actually there is a > patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), which > has to be ported for scala 2.11 too. If however, you(or anyone else) are > planning to work, I can help you ? > > Prashant Sharma

Re: External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread Prashant Sharma
You are right this needs to be done. I can work on it soon, I was not sure if there is any one even using scala 2.11 spark repl. Actually there is a patch in scala 2.10 shell to support adding jars (Lost the JIRA ID), which has to be ported for scala 2.11 too. If however, you(or anyone else) are

External JARs not loading Spark Shell Scala 2.11

2015-04-09 Thread anakos
Hi- I am having difficulty getting the 1.3.0 Spark shell to find an external jar. I have build Spark locally for Scala 2.11 and I am starting the REPL as follows: bin/spark-shell --master yarn --jars data-api-es-data-export-4.0.0.jar I see the following line in the console output: 15/04/09 09

Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-12 Thread Fernando O.
> doesn't matter. > > This sounds like you are trying to only build core without building > everything else, which you can't do in general unless you already > built and installed these snapshot artifacts locally. > > On Fri, Mar 6, 2015 at 12:46 AM, Night Wolf >

Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-06 Thread Sean Owen
't do in general unless you already built and installed these snapshot artifacts locally. On Fri, Mar 6, 2015 at 12:46 AM, Night Wolf wrote: > Hey guys, > > Trying to build Spark 1.3 for Scala 2.11. > > I'm running with the following Maven command; > > -DskipTests -Dscal

Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-05 Thread Marcelo Vanzin
Ah, and you may have to use dev/change-version-to-2.11.sh. (Again, never tried compiling with scala 2.11.) On Thu, Mar 5, 2015 at 4:52 PM, Marcelo Vanzin wrote: > I've never tried it, but I'm pretty sure in the very least you want > "-Pscala-2.11" (not -D). > >

Re: Building Spark 1.3 for Scala 2.11 using Maven

2015-03-05 Thread Marcelo Vanzin
I've never tried it, but I'm pretty sure at the very least you want "-Pscala-2.11" (not -D). On Thu, Mar 5, 2015 at 4:46 PM, Night Wolf wrote: > Hey guys, > > Trying to build Spark 1.3 for Scala 2.11. > > I'm running with the following Maven command;

Building Spark 1.3 for Scala 2.11 using Maven

2015-03-05 Thread Night Wolf
Hey guys, Trying to build Spark 1.3 for Scala 2.11. I'm running with the following Maven command; -DskipTests -Dscala-2.11 clean install package *Exception*: [ERROR] Failed to execute goal on project spark-core_2.10: Could not resolve dependencies for project org.apache.spark:spark-core

Re: Spark 1.2.1: ClassNotFoundException when running hello world example in scala 2.11

2015-02-19 Thread Akhil Das
Can you downgrade your scala dependency to 2.10 and give it a try? Thanks Best Regards On Fri, Feb 20, 2015 at 12:40 AM, Luis Solano wrote: > I'm having an issue with spark 1.2.1 and scala 2.11. I detailed the > symptoms in this stackoverflow question. > > > http://stackov

Spark 1.2.1: ClassNotFoundException when running hello world example in scala 2.11

2015-02-19 Thread Luis Solano
I'm having an issue with spark 1.2.1 and scala 2.11. I detailed the symptoms in this stackoverflow question. http://stackoverflow.com/questions/28612837/spark-classnotfoundexception-when-running-hello-world-example-in-scala-2-11 Has anyone experienced anything similar? Thank you!
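A frequent cause of this kind of ClassNotFoundException is mixing artifacts built for different Scala binary versions. The binary version is just the major.minor part of the full Scala version, and it must appear consistently in the suffix of every Spark artifact on the classpath. A small illustration (version strings are examples, not taken from the thread):

```shell
# Derive the Scala binary version (major.minor) from a full version
# string; every Spark dependency must carry the same _<binary> suffix,
# e.g. spark-core_2.11, or classes fail to load at runtime.
scala_version="2.11.7"
binary_version="${scala_version%.*}"   # strips the patch: 2.11.7 -> 2.11
echo "spark-core_${binary_version}"
```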

Re: maven doesn't build dependencies with Scala 2.11

2015-02-05 Thread Ted Yu
e what the OP means by "maven doesn't build Spark's >> dependencies" because Ted indicates it does, and of course you can see >> that these artifacts are published. >> >> On Sun, Jan 18, 2015 at 2:46 AM, Ted Yu wrote: >> > There're 3 jars und

Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Ted Yu
> > + if (!fileSystem.isDirectory(new Path(logBaseDir))) { > > > > When there is no schema associated with logBaseDir, local path should be > > assumed. > > > > On Fri, Jan 30, 2015 at 8:37 AM, Stephen Haberman > > wrote: > >> > >> Hi K

Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Sean Owen
Krishna/all, >> >> I think I found it, and it wasn't related to Scala-2.11... >> >> I had "spark.eventLog.dir=/mnt/spark/work/history", which worked >> in Spark 1.2, but now am running Spark master, and it wants a >> Hadoop URI, e.g. file:///m

Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Stephen Haberman
> Looking at https://github.com/apache/spark/pull/1222/files , > the following change may have caused what Stephen described: > > + if (!fileSystem.isDirectory(new Path(logBaseDir))) { > > When there is no schema associated with logBaseDir, local path > should be assumed. Yes, that looks right.

Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-31 Thread Ted Yu
, Stephen Haberman < stephen.haber...@gmail.com> wrote: > Hi Krishna/all, > > I think I found it, and it wasn't related to Scala-2.11... > > I had "spark.eventLog.dir=/mnt/spark/work/history", which worked > in Spark 1.2, but now am running Spark master, and it

Re: spark-shell working in scala-2.11 (breaking change?)

2015-01-30 Thread Stephen Haberman
Hi Krishna/all, I think I found it, and it wasn't related to Scala-2.11... I had "spark.eventLog.dir=/mnt/spark/work/history", which worked in Spark 1.2, but now am running Spark master, and it wants a Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to commit 4

Re: spark-shell working in scala-2.11

2015-01-28 Thread Krishna Sankar
Stephen, Scala 2.11 worked fine for me. Did the dev change and then compile. Not using in production, but I go back and forth between 2.10 & 2.11. Cheers On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman < stephen.haber...@gmail.com> wrote: > Hey, > > I recently co

spark-shell working in scala-2.11

2015-01-28 Thread Stephen Haberman
Hey, I recently compiled Spark master against scala-2.11 (by running the dev/change-versions script), but when I run spark-shell, it looks like the "sc" variable is missing. Is this a known/unknown issue? Are others successfully using Spark with scala-2.11, and specifically spark-she

Re: maven doesn't build dependencies with Scala 2.11

2015-01-18 Thread Ted Yu
at 2:46 AM, Ted Yu wrote: > > There're 3 jars under lib_managed/jars directory with and without > > -Dscala-2.11 flag. > > > > Difference between scala-2.10 and scala-2.11 profiles is that scala-2.10 > > profile has the following: > > > >

Re: maven doesn't build dependencies with Scala 2.11

2015-01-18 Thread Sean Owen
tes it does, and of course you can see that these artifacts are published. On Sun, Jan 18, 2015 at 2:46 AM, Ted Yu wrote: > There're 3 jars under lib_managed/jars directory with and without > -Dscala-2.11 flag. > > Difference between scala-2.10 and scala-2.11 profiles

Re: maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Ted Yu
There're 3 jars under lib_managed/jars directory with and without -Dscala-2.11 flag. Difference between scala-2.10 and scala-2.11 profiles is that scala-2.10 profile has the following: external/kafka FYI On Sat, Jan 17, 2015 at 4:07 PM, Ted Yu wrote: > I

Re: maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Ted Yu
building for Scala 2.11 On Sat, Jan 17, 2015 at 3:43 PM, Walrus theCat wrote: > Hi, > > When I run this: > > dev/change-version-to-2.11.sh > mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package > > as per here > <https://spark.apache.org/docs/latest/bu

maven doesn't build dependencies with Scala 2.11

2015-01-17 Thread Walrus theCat
make maven do this? - How can I specify the use of Scala 2.11 in my own .pom files? Thanks
