Hi,
YARN has already reached its stable 2.2.0 release, but due to API changes
Spark cannot build against it.
Here is the build log:
root@master:~/spark# SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true ./sbt/sbt assembly
[info] Loading project definition from /root/spark/project/project
[info] Loading project definition from /root/spark/project
[info] Set current project to root (in build file:/root/spark/)
[info] Compiling 8 Scala sources to
/root/spark/yarn/target/scala-2.9.3/classes...
[error]
/root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:42:
not found: type AMRMProtocol
[error] private var resourceManager: AMRMProtocol = null
[error] ^
[error]
/root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:126:
not found: type AMRMProtocol
[error] private def registerWithResourceManager(): AMRMProtocol = {
[error] ^
[error]
/root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:119:
value AM_CONTAINER_ID_ENV is not a member of object
org.apache.hadoop.yarn.api.ApplicationConstants
[error] val containerIdString =
envs.get(ApplicationConstants.AM_CONTAINER_ID_ENV)
[error] ^
[error]
/root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:131:
not found: type AMRMProtocol
[error] return rpc.getProxy(classOf[AMRMProtocol], rmAddress,
conf).asInstanceOf[AMRMProtocol]
[error] ^
[error]
/root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:131:
not found: type AMRMProtocol
[error] return rpc.getProxy(classOf[AMRMProtocol], rmAddress,
conf).asInstanceOf[AMRMProtocol]
[error]
^
[error]
/root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:138:
value setApplicationAttemptId is not a member of
org.apache.hadoop.yarn.api.protocolrecords.RegisterApplicationMasterRequest
[error] appMasterRequest.setApplicationAttemptId(appAttemptId)
[error] ^
... (CUT)
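For context, the errors above all trace back to YARN's API reshuffle in 2.2.0: the unstable AMRMProtocol was retired in favor of the AMRMClient library, and AM_CONTAINER_ID_ENV was dropped in favor of the CONTAINER_ID environment variable. A rough, untested sketch of what the replacement calls might look like (names taken from the Hadoop 2.2.0 javadoc as I read it, so please double-check):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.yarn.api.ApplicationConstants.Environment
import org.apache.hadoop.yarn.client.api.AMRMClient
import org.apache.hadoop.yarn.util.ConverterUtils

// AMRMProtocol is gone in 2.2.0; the AMRMClient library wraps the new
// ApplicationMasterProtocol and handles registration and heartbeats.
val conf = new Configuration()
val amClient = AMRMClient.createAMRMClient()
amClient.init(conf)
amClient.start()

// AM_CONTAINER_ID_ENV was removed; the AM's container ID now arrives
// in the CONTAINER_ID environment variable.
val containerIdString = System.getenv(Environment.CONTAINER_ID.name())
val containerId = ConverterUtils.toContainerId(containerIdString)
val appAttemptId = containerId.getApplicationAttemptId

// registerApplicationMaster replaces building a
// RegisterApplicationMasterRequest by hand; the attempt ID is inferred
// from the environment rather than set on the request.
amClient.registerApplicationMaster("localhost", 0, "")
```

If that is roughly the shape of the new API, the yarn module would presumably need a fairly mechanical port along these lines.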
Is there a plan to support YARN 2.2.0?
Cheers,
--
Pei-Lun Lee