Hm. Problem is, core depends directly on it:
[error] /Users/srowen/Documents/spark/core/src/main/scala/org/apache/spark/SecurityManager.scala:25: object sasl is not a member of package org.apache.spark.network
[error] import org.apache.spark.network.sasl.SecretKeyHolder
[error] ^
[error] /Users/srowen/Documents/spark/core/src/main/scala/org/apache/spark/SecurityManager.scala:147: not found: type SecretKeyHolder
[error] private[spark] class SecurityManager(sparkConf: SparkConf) extends Logging with SecretKeyHolder {
[error] ^
[error] /Users/srowen/Documents/spark/core/src/main/scala/org/apache/spark/network/netty/NettyBlockTransferService.scala:29: object RetryingBlockFetcher is not a member of package org.apache.spark.network.shuffle
[error] import org.apache.spark.network.shuffle.{RetryingBlockFetcher, BlockFetchingListener, OneForOneBlockFetcher}
[error] ^
[error] /Users/srowen/Documents/spark/core/src/main/scala/org/apache/spark/deploy/worker/StandaloneWorkerShuffleService.scala:23: object sasl is not a member of package org.apache.spark.network
[error] import org.apache.spark.network.sasl.SaslRpcHandler
[error] ...
[error] /Users/srowen/Documents/spark/core/src/main/scala/org/apache/spark/storage/BlockManager.scala:124: too many arguments for constructor ExternalShuffleClient: (x$1: org.apache.spark.network.util.TransportConf, x$2: String)org.apache.spark.network.shuffle.ExternalShuffleClient
[error] new ExternalShuffleClient(SparkTransportConf.fromSparkConf(conf), securityManager,
[error] ^
[error] /Users/srowen/Documents/spark/core/src/main/scala/org/apache/spark/storage/BlockManager.scala:39: object protocol is not a member of package org.apache.spark.network.shuffle
[error] import org.apache.spark.network.shuffle.protocol.ExecutorShuffleInfo
[error] ^
[error] /Users/srowen/Documents/spark/core/src/main/scala/org/apache/spark/storage/BlockManager.scala:214: not found: type ExecutorShuffleInfo
[error] val shuffleConfig = new ExecutorShuffleInfo(
[error] ...
Is more refactoring needed, either to support YARN alpha as a separate
shuffle module, or to sever this dependency?
Of course, this goes away when yarn-alpha goes away too.
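For concreteness, one sketch of the isolation option: list the shuffle service module only inside the newer-API profile in the parent pom.xml, so -Pyarn-alpha never tries to build it. The profile id and module path here are assumptions inferred from the build output above, not verified config.

```xml
<!-- Sketch only, in the parent pom.xml: the network/yarn module is built
     solely when the (newer-API) yarn profile is active, so the yarn-alpha
     profile never compiles the AuxiliaryService-based shuffle service. -->
<profile>
  <id>yarn</id>
  <modules>
    <module>network/yarn</module>
  </modules>
</profile>
```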
On Sat, Nov 8, 2014 at 7:45 AM, Patrick Wendell <[email protected]> wrote:
> I bet it doesn't work. +1 on isolating its inclusion to only the
> newer YARN APIs.
>
> - Patrick
>
> On Fri, Nov 7, 2014 at 11:43 PM, Sean Owen <[email protected]> wrote:
>> I noticed that this doesn't compile:
>>
>> mvn -Pyarn-alpha -Phadoop-0.23 -Dhadoop.version=0.23.7 -DskipTests clean package
>>
>> [error] warning: [options] bootstrap class path not set in conjunction with -source 1.6
>> [error] /Users/srowen/Documents/spark/network/yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:26: error: cannot find symbol
>> [error] import org.apache.hadoop.yarn.server.api.AuxiliaryService;
>> [error] ^
>> [error] symbol: class AuxiliaryService
>> [error] location: package org.apache.hadoop.yarn.server.api
>> [error] /Users/srowen/Documents/spark/network/yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:27: error: cannot find symbol
>> [error] import org.apache.hadoop.yarn.server.api.ApplicationInitializationContext;
>> [error] ^
>> ...
>>
>> Should it work? If not, shall I propose enabling the service only with
>> -Pyarn?
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: [email protected]
>> For additional commands, e-mail: [email protected]
>>