Hi,

It appears that the last commit [1] broke the build (command and full output below). Is anyone already working on a fix? I'm happy to pick it up if not.

➜  spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver -DskipTests clean install
...
[info] Compiling 8 Scala sources and 1 Java source to /Users/jacek/dev/oss/spark/external/flume/target/scala-2.11/classes...
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:33: object jboss is not a member of package org
[error] import org.jboss.netty.handler.codec.compression._
[error]            ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:31: object jboss is not a member of package org
[error] import org.jboss.netty.channel.{ChannelPipeline, ChannelPipelineFactory, Channels}
[error]            ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:32: object jboss is not a member of package org
[error] import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory
[error]            ^
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelPipelineFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.group.ChannelGroup not found - continuing with a stub.
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:149: not found: type NioServerSocketChannelFactory
[error]       val channelFactory = new NioServerSocketChannelFactory(Executors.newCachedThreadPool(),
[error]                                ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:196: not found: type ChannelPipelineFactory
[error]   class CompressionChannelPipelineFactory extends ChannelPipelineFactory {
[error]                                                   ^
[error] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[error] Class org.jboss.netty.channel.ChannelPipelineFactory not found - continuing with a stub.
[error] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:197: not found: type ChannelPipeline
[error]     def getPipeline(): ChannelPipeline = {
[error]                        ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:198: not found: value Channels
[error]       val pipeline = Channels.pipeline()
[error]                      ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:199: not found: type ZlibEncoder
[error]       val encoder = new ZlibEncoder(6)
[error]                         ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:29: object jboss is not a member of package org
[error] import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory
[error]            ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:73: not found: type NioClientSocketChannelFactory
[error]     new NioClientSocketChannelFactory(channelFactoryExecutor, channelFactoryExecutor)
[error]         ^
[warn] Class org.jboss.netty.channel.ChannelFuture not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[warn] Class org.jboss.netty.channel.ChannelUpstreamHandler not found - continuing with a stub.
[error] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:33: object jboss is not a member of package org
[error] import org.jboss.netty.channel.ChannelPipeline
[error]            ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:34: object jboss is not a member of package org
[error] import org.jboss.netty.channel.socket.SocketChannel
[error]            ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:35: object jboss is not a member of package org
[error] import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory
[error]            ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:36: object jboss is not a member of package org
[error] import org.jboss.netty.handler.codec.compression.{ZlibDecoder, ZlibEncoder}
[error]            ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:107: not found: type NioClientSocketChannelFactory
[error]     extends NioClientSocketChannelFactory {
[error]             ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:79: overloaded method constructor NettyTransceiver with alternatives:
[error]   (x$1: java.net.InetSocketAddress,x$2: org.jboss.netty.channel.ChannelFactory)org.apache.avro.ipc.NettyTransceiver <and>
[error]   (x$1: java.net.InetSocketAddress,x$2: Long)org.apache.avro.ipc.NettyTransceiver
[error]  cannot be applied to (java.net.InetSocketAddress, FlumeTestUtils.this.CompressionChannelFactory)
[error]         new NettyTransceiver(testAddress, new CompressionChannelFactory(6))
[error]         ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:109: not found: type SocketChannel
[error]     override def newChannel(pipeline: ChannelPipeline): SocketChannel = {
[error]                                                         ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:109: not found: type ChannelPipeline
[error]     override def newChannel(pipeline: ChannelPipeline): SocketChannel = {
[error]                                       ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:110: not found: type ZlibEncoder
[error]       val encoder = new ZlibEncoder(compressionLevel)
[error]                         ^
[error] /Users/jacek/dev/oss/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeTestUtils.scala:113: value newChannel is not a member of AnyRef
[error]       super.newChannel(pipeline)
[error]             ^
[warn] 13 warnings found
[error] 24 errors found
[error] Compile failed at Jan 11, 2016 8:43:02 AM [0.602s]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  5.427 s]
[INFO] Spark Project Test Tags ............................ SUCCESS [  4.430 s]
[INFO] Spark Project Launcher ............................. SUCCESS [  8.698 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 16.379 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  8.800 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 11.254 s]
[INFO] Spark Project Core ................................. SUCCESS [02:04 min]
[INFO] Spark Project GraphX ............................... SUCCESS [ 16.071 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 35.991 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [01:34 min]
[INFO] Spark Project SQL .................................. SUCCESS [01:08 min]
[INFO] Spark Project ML Library ........................... SUCCESS [01:18 min]
[INFO] Spark Project Tools ................................ SUCCESS [  5.614 s]
[INFO] Spark Project Hive ................................. SUCCESS [ 40.699 s]
[INFO] Spark Project Docker Integration Tests ............. SUCCESS [  2.102 s]
[INFO] Spark Project REPL ................................. SUCCESS [  6.258 s]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [  6.547 s]
[INFO] Spark Project YARN ................................. SUCCESS [ 12.898 s]
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [  9.361 s]
[INFO] Spark Project Assembly ............................. SUCCESS [ 40.149 s]
[INFO] Spark Project External Twitter ..................... SUCCESS [  7.137 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [  6.203 s]
[INFO] Spark Project External Flume ....................... FAILURE [  1.010 s]
[INFO] Spark Project External Flume Assembly .............. SKIPPED
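
FWIW, all of the errors boil down to the org.jboss.netty (Netty 3.x) classes no longer being on the flume module's compile classpath, so I assume [1] touched that module's dependencies. For whoever picks this up, the break should reproduce much faster by building only the affected module instead of the whole reactor (untested guess, same profiles/flags as above):

./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver -DskipTests -pl external/flume -am clean install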

[1] 
https://github.com/apache/spark/commit/3ab0138b0fe0f9208b4b476855294a7c729583b7
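
If reverting [1] locally turns out to be the quickest way to confirm it is the culprit, something along these lines should do (untested sketch):

git revert --no-commit 3ab0138b0fe0f9208b4b476855294a7c729583b7
./build/mvn ... (same command as above)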

Best regards,
Jacek

Jacek Laskowski | https://medium.com/@jaceklaskowski/
Mastering Apache Spark
==> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskowski
