I'm trying to compile the latest code with the Hadoop version set to
2.0.0-mr1-cdh4.6.0.

I'm getting the following error, which does not appear when I leave the
Hadoop version unset:

[error]
/data/hdfs/1/home/nkronenfeld/git/spark-ndk/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeInputDStream.scala:156:
overloaded method constructor NioServerSocketChannelFactory with
alternatives:
[error]   (x$1: java.util.concurrent.Executor,x$2:
java.util.concurrent.Executor,x$3:
Int)org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory <and>
[error]   (x$1: java.util.concurrent.Executor,x$2:
java.util.concurrent.Executor)org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory
[error]  cannot be applied to ()
[error]       val channelFactory = new NioServerSocketChannelFactory
[error]                            ^
[error] one error found


I don't know Flume from a hole in the wall - does anyone know what I can do
to fix this?
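For what it's worth, the overloads listed in the error suggest that the
no-arg constructor simply doesn't exist in whatever Netty version the
cdh4 Hadoop artifacts pull in, so passing the boss and worker thread
pools explicitly might compile. This is just my guess, not something
I've verified against that dependency tree:

```scala
import java.util.concurrent.Executors
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory

// Guess: with the older Netty 3.x that the cdh4 Hadoop dependency
// appears to bring in transitively, NioServerSocketChannelFactory has
// no zero-arg constructor, so supply the two executors the error
// message says it accepts.
val channelFactory = new NioServerSocketChannelFactory(
  Executors.newCachedThreadPool(), // boss threads (accept connections)
  Executors.newCachedThreadPool()) // worker threads (handle I/O)
```

If that's right, the deeper fix is presumably aligning the Netty
version (or excluding the transitive one) rather than changing the
Flume receiver code, but I don't know this dependency chain well
enough to say.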


Thanks,
         -Nathan


-- 
Nathan Kronenfeld
Senior Visualization Developer
Oculus Info Inc
2 Berkeley Street, Suite 600,
Toronto, Ontario M5A 4J5
Phone:  +1-416-203-3003 x 238
Email:  nkronenf...@oculusinfo.com
