Re: Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-29 Thread Iulian Dragoș
On Mon, Jun 29, 2015 at 3:02 AM, Alessandro Baretta alexbare...@gmail.com
wrote:

 I am building the current master branch with Scala 2.11 following these
 instructions:

 Building for Scala 2.11

 To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11
  property:

 dev/change-version-to-2.11.sh
 mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package


 Here's what I'm seeing:

 log4j:WARN No appenders could be found for logger
 (org.apache.hadoop.security.Groups).
 log4j:WARN Please initialize the log4j system properly.
 log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
 more info.
 Using Spark's repl log4j profile:
 org/apache/spark/log4j-defaults-repl.properties
 To adjust logging level use sc.setLogLevel(INFO)
 Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

 Using *Scala version 2.10.4* (OpenJDK 64-Bit Server VM, Java 1.7.0_79)
 Type in expressions to have them evaluated.


Something is deeply wrong with your build.

iulian



 [remainder of quoted message elided; the full stack trace appears in the original post below]




--
Iulian Dragos
Reactive Apps on the JVM
www.typesafe.com


Re: Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-29 Thread Steve Loughran

On 29 Jun 2015, at 11:27, Iulian Dragoș iulian.dra...@typesafe.com wrote:

On Mon, Jun 29, 2015 at 3:02 AM, Alessandro Baretta alexbare...@gmail.com wrote:
I am building the current master branch with Scala 2.11 following these 
instructions:



Type :help for more information.
15/06/29 00:42:20 ERROR ActorSystemImpl: Uncaught fatal error from thread 
[sparkDriver-akka.remote.default-remote-dispatcher-6] shutting down ActorSystem 
[sparkDriver]
java.lang.VerifyError: class akka.remote.WireFormats$AkkaControlMessage 
overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
[stack trace elided; the full trace appears in the original post below]

What am I doing wrong?




Oh, that's just the version of the protoc protobuf compiler generating code
that the implementation classes aren't compatible with, and/or the version of
protobuf.jar on the classpath. Google's libraries are turning out to be
surprisingly brittle that way.

When you type protoc --version on the command line, you should expect to see
'libprotoc 2.5.0', and you should have protobuf-2.5.0 on the classpath. If
either of those conditions is not met, fix it.

-Steve
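The two checks Steve describes can be scripted. A rough sketch, with the caveat that the local Maven repository path (~/.m2) and the helper-function name are assumptions, not taken from the thread:

```shell
# Sketch of the two checks: (1) protoc on PATH reports libprotoc 2.5.0,
# (2) no stray protobuf-java jars of another version lurk in the local
# Maven repository (path ~/.m2 is an assumption; adjust to your setup).

EXPECTED="2.5.0"

# Extract "X.Y.Z" from protoc's "libprotoc X.Y.Z" output line.
parse_protoc_version() {
    echo "$1" | awk '{print $2}'
}

if command -v protoc >/dev/null 2>&1; then
    actual=$(parse_protoc_version "$(protoc --version)")
    if [ "$actual" = "$EXPECTED" ]; then
        echo "protoc OK: libprotoc $actual"
    else
        echo "protoc mismatch: found $actual, want $EXPECTED"
    fi
else
    echo "protoc not found on PATH"
fi

# List protobuf-java jars that could end up on the classpath; for this
# build, any version other than 2.5.0 is suspect.
REPO="$HOME/.m2/repository/com/google/protobuf"
if [ -d "$REPO" ]; then
    find "$REPO" -name 'protobuf-java-*.jar'
fi
```

On a correctly configured machine this prints "protoc OK: libprotoc 2.5.0" and only 2.5.0 jars; any other output points at the component to fix.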



Re: Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-29 Thread Alessandro Baretta
Steve,

It was indeed a protocol buffers issue. I am able to build spark now.
Thanks.

On Mon, Jun 29, 2015 at 7:37 AM, Steve Loughran ste...@hortonworks.com
wrote:


 [quoted message elided]




Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-28 Thread Alessandro Baretta
I am building the current master branch with Scala 2.11 following these
instructions:

Building for Scala 2.11

To produce a Spark package compiled with Scala 2.11, use the -Dscala-2.11
 property:

dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package


Here's what I'm seeing:

log4j:WARN No appenders could be found for logger
(org.apache.hadoop.security.Groups).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.
Using Spark's repl log4j profile:
org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel(INFO)
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
15/06/29 00:42:20 ERROR ActorSystemImpl: Uncaught fatal error from thread
[sparkDriver-akka.remote.default-remote-dispatcher-6] shutting down
ActorSystem [sparkDriver]
java.lang.VerifyError: class akka.remote.WireFormats$AkkaControlMessage
overrides final method
getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at
akka.remote.transport.AkkaPduProtobufCodec$.constructControlMessagePdu(AkkaPduCodec.scala:231)
at
akka.remote.transport.AkkaPduProtobufCodec$.<init>(AkkaPduCodec.scala:153)
at akka.remote.transport.AkkaPduProtobufCodec$.<clinit>(AkkaPduCodec.scala)
at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:733)
at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:703)

What am I doing wrong?
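The symptom above, a -Dscala-2.11 build whose REPL banner still reports Scala 2.10.4, can be caught by a scripted check on that banner line. A minimal sketch; the banner text is taken from the log above, and the helper name is my own:

```shell
# Sanity check: after building with -Dscala-2.11, the spark-shell banner
# should report a 2.11.x Scala version. Parse the version out of the
# "Using Scala version ..." line.
scala_version_from_banner() {
    echo "$1" | sed -n 's/.*Using Scala version \([0-9][0-9.]*\).*/\1/p'
}

banner="Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_79)"
v=$(scala_version_from_banner "$banner")
case "$v" in
    2.11.*) echo "build matches Scala 2.11" ;;
    *)      echo "wrong Scala version in build: $v" ;;
esac
# prints: wrong Scala version in build: 2.10.4
```

The same check against a healthy build's banner would report a 2.11.x version, which is the quickest way to confirm the change-version script and the -Dscala-2.11 property actually took effect.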


Re: Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-28 Thread Ted Yu
Spark-Master-Scala211-Compile build is green.

However, it is not clear what the actual command is:

[EnvInject] - Variables injected successfully.
[Spark-Master-Scala211-Compile] $ /bin/bash /tmp/hudson8945334776362889961.sh


FYI


On Sun, Jun 28, 2015 at 6:02 PM, Alessandro Baretta alexbare...@gmail.com
wrote:

 [quoted original message elided]




Re: Spark 1.5.0-SNAPSHOT broken with Scala 2.11

2015-06-28 Thread Josh Rosen
The 2.11 compile build is going to be green because this is an issue with
tests, not compilation.

On Sun, Jun 28, 2015 at 6:30 PM, Ted Yu yuzhih...@gmail.com wrote:

 [quoted message elided]