I've heard of an open source project called Apache Mesos, which may be able
to help out in this kind of situation.


On Thu, Oct 24, 2013 at 10:28 AM, Matei Zaharia <[email protected]> wrote:

> Yup, unfortunately YARN changed its API upon releasing 2.2, which puts us
> in an awkward position because all the major current users are on the old
> YARN API (from 0.23.x and 2.0.x) but new users will try this one. We'll
> probably change the default version in Spark 0.8.1 or 0.8.2. If you look on
> this list, there were some earlier messages on how to patch Spark to
> compile against the new YARN.
>
> Matei
>
> On Oct 23, 2013, at 11:39 PM, Pei-Lun Lee <[email protected]> wrote:
>
> > Hi,
> >
> > YARN has already reached its stable 2.2.0 release, but due to API changes
> > Spark cannot be built against it.
> >
> > Here is the build log:
> >
> > root@master:~/spark# SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true ./sbt/sbt assembly
> > [info] Loading project definition from /root/spark/project/project
> > [info] Loading project definition from /root/spark/project
> > [info] Set current project to root (in build file:/root/spark/)
> > [info] Compiling 8 Scala sources to /root/spark/yarn/target/scala-2.9.3/classes...
> > [error] /root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:42: not found: type AMRMProtocol
> > [error]   private var resourceManager: AMRMProtocol = null
> > [error]                                ^
> > [error] /root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:126: not found: type AMRMProtocol
> > [error]   private def registerWithResourceManager(): AMRMProtocol = {
> > [error]                                              ^
> > [error] /root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:119: value AM_CONTAINER_ID_ENV is not a member of object org.apache.hadoop.yarn.api.ApplicationConstants
> > [error]     val containerIdString = envs.get(ApplicationConstants.AM_CONTAINER_ID_ENV)
> > [error]                                                           ^
> > [error] /root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:131: not found: type AMRMProtocol
> > [error]     return rpc.getProxy(classOf[AMRMProtocol], rmAddress, conf).asInstanceOf[AMRMProtocol]
> > [error]                                 ^
> > [error] /root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:131: not found: type AMRMProtocol
> > [error]     return rpc.getProxy(classOf[AMRMProtocol], rmAddress, conf).asInstanceOf[AMRMProtocol]
> > [error]                                                                              ^
> > [error] /root/spark/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala:138: value setApplicationAttemptId is not a member of org.apache.hadoop.yarn.api.protocolrecords.RegisterApplicationMasterRequest
> > [error]     appMasterRequest.setApplicationAttemptId(appAttemptId)
> > [error]                      ^
> >
> > ... (CUT)
> >
> > Is there a plan to support YARN 2.2.0?
> >
> > Cheers,
> > --
> > Pei-Lun Lee
>
>
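
For anyone hitting the errors above: as far as I understand the stable YARN API in Hadoop 2.2.0, AMRMProtocol was replaced by ApplicationMasterProtocol (with AMRMClient as the supported wrapper), ApplicationConstants.AM_CONTAINER_ID_ENV became ApplicationConstants.Environment.CONTAINER_ID, and RegisterApplicationMasterRequest no longer carries the application attempt id (it is taken from the AM's tokens instead). Below is a rough, untested sketch of what the ApplicationMaster side could look like against the new API -- the host, port and tracking URL are placeholders, not what Spark actually passes.

import org.apache.hadoop.yarn.api.ApplicationConstants
import org.apache.hadoop.yarn.api.records.FinalApplicationStatus
import org.apache.hadoop.yarn.client.api.AMRMClient
import org.apache.hadoop.yarn.conf.YarnConfiguration
import org.apache.hadoop.yarn.util.ConverterUtils

object NewYarnAmSketch {
  def main(args: Array[String]) {
    val conf = new YarnConfiguration()

    // AM_CONTAINER_ID_ENV is gone; the container id now arrives through the
    // CONTAINER_ID environment variable.
    val containerIdString = System.getenv(ApplicationConstants.Environment.CONTAINER_ID.name())
    val appAttemptId = ConverterUtils.toContainerId(containerIdString).getApplicationAttemptId
    println("Running as application attempt " + appAttemptId)

    // AMRMProtocol is replaced by ApplicationMasterProtocol; the supported way to
    // talk to the ResourceManager is the AMRMClient wrapper. There is no
    // setApplicationAttemptId on RegisterApplicationMasterRequest any more -- the
    // attempt id is derived from the AM's security tokens.
    val amClient = AMRMClient.createAMRMClient()
    amClient.init(conf)
    amClient.start()

    amClient.registerApplicationMaster("localhost", 0, "")  // placeholder host/port/URL

    // ... request containers and run the job here ...

    amClient.unregisterApplicationMaster(FinalApplicationStatus.SUCCEEDED, "", "")
    amClient.stop()
  }
}

The raw ApplicationMasterProtocol is still available, but AMRMClient is what the Hadoop 2.2 "Writing YARN Applications" guide points AMs at, so it seems like the safer target for a patch.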
