I think Cloudera only started bundling Spark with CDH4 as of 4.6, so that
is probably the minimum release if you want to try out Spark on CDH4.
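
For context on the compile errors quoted below: somewhere between the "YARN
alpha" API (Hadoop 2.0.0-alpha, roughly CDH 4.2) and the API that later CDH4
releases ship, AMResponse was folded into AllocateResponse, which is exactly
what the two errors complain about. A rough Scala sketch of the difference
(class and method names are from the Hadoop YARN API as I remember them, so
treat it as illustrative rather than authoritative):

// Old "alpha" API, which Spark's yarn-alpha profile codes against:
//   val amResp     = allocateResponse.getAMResponse       // AMResponse
//   val containers = amResp.getAllocatedContainers        // List[Container]
//
// Later API, which 2.0.0-cdh4.6.0 effectively provides: AMResponse is gone
// and its accessors live directly on AllocateResponse.

import org.apache.hadoop.yarn.api.protocolrecords.AllocateResponse
import org.apache.hadoop.yarn.api.records.Container

object AllocateResponseSketch {
  // Compiles only against a post-merge (YARN beta/stable) hadoop-yarn-api
  // jar, which is why the yarn-alpha sources fail against cdh4.6.0.
  def allocatedContainers(resp: AllocateResponse): java.util.List[Container] =
    resp.getAllocatedContainers
}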

On Thu, Aug 7, 2014 at 11:22 AM, Sean Owen <so...@cloudera.com> wrote:
> Yep, this command given in the Spark docs is correct:
>
> mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.2.0 -DskipTests clean package
>
> and while I would also hope that this one works, it doesn't compile:
>
> mvn -Pyarn -Dhadoop.version=2.0.0-cdh4.6.0 -DskipTests clean package
>
> I believe later 4.x releases effectively ship "YARN beta", which Spark
> doesn't support. In fact, I think it's just a bonus that ~4.2 is close
> enough to "YARN alpha", which as I understand it is supported as a
> one-off, to work at all.
>
> All bets are off before YARN stable really, in my book.
>
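For what it's worth, once you are on a YARN-stable Hadoop (CDH5, or Apache
Hadoop 2.2+), the plain -Pyarn profile is the supported route, along these
lines (the version strings below are only examples; pick whichever matches
your cluster):

# Apache Hadoop 2.2.0
mvn -Pyarn -Dhadoop.version=2.2.0 -DskipTests clean package

# Cloudera CDH 5.0.0 (example version string)
mvn -Pyarn -Dhadoop.version=2.3.0-cdh5.0.0 -DskipTests clean package
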
> On Thu, Aug 7, 2014 at 6:32 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
>> Can you try with "-Pyarn" instead of "-Pyarn-alpha"?
>>
>> I'm pretty sure CDH4 ships with the newer YARN API.
>>
>> On Thu, Aug 7, 2014 at 8:11 AM, linkpatrickliu <linkpatrick...@live.com> 
>> wrote:
>>> Hi,
>>>
>>> Following the Spark build documentation:
>>>
>>> # Cloudera CDH 4.2.0
>>> mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.2.0 -DskipTests clean package
>>>
>>> I compiled Spark 1.0.2 with this command:
>>> mvn -Pyarn-alpha -Dhadoop.version=2.0.0-cdh4.6.0 -DskipTests clean package
>>>
>>> However, I got two errors:
>>>
>>> [INFO] Compiling 14 Scala sources to
>>> /Users/liuyufan/Develop/github/apache/spark-1.0.2/yarn/alpha/target/scala-2.10/classes...
>>> [ERROR]
>>> /Users/liuyufan/Develop/github/apache/spark-1.0.2/yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocationHandler.scala:36:
>>> object AMResponse is not a member of package
>>> org.apache.hadoop.yarn.api.records
>>> [ERROR] import org.apache.hadoop.yarn.api.records.{AMResponse,
>>> ApplicationAttemptId}
>>> [ERROR]        ^
>>> [ERROR]
>>> /Users/liuyufan/Develop/github/apache/spark-1.0.2/yarn/alpha/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocationHandler.scala:114:
>>> value getAMResponse is not a member of
>>> org.apache.hadoop.yarn.api.protocolrecords.AllocateResponse
>>> [ERROR]     val amResp =
>>> allocateExecutorResources(executorsToRequest).getAMResponse
>>> [ERROR]                                                                ^
>>> [ERROR] two errors found
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Reactor Summary:
>>> [INFO]
>>> [INFO] Spark Project Parent POM .......................... SUCCESS [2.243s]
>>> [INFO] Spark Project YARN Parent POM ..................... SUCCESS [4.139s]
>>> [INFO] Spark Project YARN Alpha API ...................... FAILURE [8.906s]
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] BUILD FAILURE
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Total time: 15.587s
>>> [INFO] Finished at: Thu Aug 07 23:06:21 CST 2014
>>> [INFO] Final Memory: 21M/145M
>>> [INFO]
>>> ------------------------------------------------------------------------
>>>
>>>
>>> Can anyone help solve this?
>>>
>>>
>>>
>>
>>
>>
>> --
>> Marcelo
>>



-- 
Marcelo
