Again, this is probably not the place for CDH-specific questions, and
this one is already answered at
http://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/CDH-5-3-0-container-cannot-be-fetched-because-of/m-p/23497#M478
On Fri, Jan 9, 2015 at 9:23 AM, Mukesh Jha wrote:
> I am using pre
I am using the pre-built *spark-1.2.0-bin-hadoop2.4* from *[1] *to submit Spark
applications to YARN; I cannot find a pre-built Spark for *CDH-5.x*
versions. So, in my case the org.apache.hadoop.yarn.util.ConverterUtils class
is coming from the spark-assembly-1.1.0-hadoop2.4.0.jar, which is part of
th
Just to add to Sandy's comment, check your client configuration
(generally in /etc/spark/conf). If you're using CM, you may need to
run the "Deploy Client Configuration" command on the cluster to update
the configs to match the new version of CDH.
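To confirm which client configuration the submitting process actually sees, a quick check of the relevant environment variables can help. This is a minimal sketch, not a definitive check: SPARK_CONF_DIR, HADOOP_CONF_DIR, and SPARK_HOME are standard Spark/Hadoop client settings, but whether they are set (or whether defaults like /etc/spark/conf apply) depends on your deployment.

```java
// Minimal sketch: print the config directories the Spark/YARN client
// will consult. If these are unset, deployment defaults such as
// /etc/spark/conf (as deployed by CM's "Deploy Client Configuration")
// typically apply.
public class ConfCheck {
    static String show(String name) {
        String v = System.getenv(name);
        return name + " = " + (v == null ? "(not set)" : v);
    }
    public static void main(String[] args) {
        System.out.println(show("SPARK_CONF_DIR"));
        System.out.println(show("HADOOP_CONF_DIR"));
        System.out.println(show("SPARK_HOME"));
    }
}
```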
On Thu, Jan 8, 2015 at 11:38 AM, Sandy Ryza wrote:
Hi Mukesh,
Those line numbers in ConverterUtils in the stack trace don't appear to
line up with CDH 5.3:
https://github.com/cloudera/hadoop-common/blob/cdh5-2.5.0_5.3.0/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common/src/main/java/org/apache/hadoop/yarn/util/ConverterUtils.java
Is it possible
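One way to check whether the ConverterUtils on the application classpath matches what CDH 5.3 ships is to ask the classloader where the class file actually comes from. A minimal sketch (the fallback resource below is a JDK class so the snippet runs anywhere; on a cluster node you would pass the YARN resource path instead):

```java
// Minimal sketch: report which jar (or runtime module) provides a given
// class. On a Spark-on-YARN node you would look up
//   "org/apache/hadoop/yarn/util/ConverterUtils.class"
// to see whether it resolves to the spark-assembly jar or to the CDH
// Hadoop jars.
public class WhichJar {
    static String locate(String resource) {
        java.net.URL url = WhichJar.class.getClassLoader().getResource(resource);
        return url == null ? "(not found on classpath)" : url.toString();
    }
    public static void main(String[] args) {
        String res = args.length > 0 ? args[0]
                   : "java/lang/Object.class";  // stand-in so this runs anywhere
        System.out.println(locate(res));
    }
}
```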
On Thu, Jan 8, 2015 at 5:08 PM, Mukesh Jha wrote:
> Hi Experts,
>
> I am running spark inside YARN job.
>
> The spark-streaming job is running fine in CDH-5.0.0 but after the upgrade
> to 5.3.0 it cannot fetch containers with the below errors. Looks like the
> container id is incorrect and a stri
Hi Experts,
I am running Spark as a YARN job.
The spark-streaming job runs fine on CDH-5.0.0, but after the upgrade
to 5.3.0 it cannot fetch containers and fails with the errors below. It
looks like the container ID is malformed: a string is present in a place
where a number is expected.
java.l
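For what it's worth, the "string where a number is expected" symptom is consistent with a container-ID format mismatch: newer YARN releases can emit container IDs carrying an extra epoch segment (e.g. container_e17_...), which an older ConverterUtils tries to parse as a numeric field. The sketch below is illustrative only; the ID strings and splitting logic are simplified assumptions, not Hadoop's actual parsing code, and whether this is the trigger in this thread is not confirmed.

```java
// Illustrative sketch only: mimic how a naive parser trips over a
// container ID that carries an epoch segment. The real parsing lives in
// org.apache.hadoop.yarn.util.ConverterUtils; this just shows why a
// string token in a numeric field raises NumberFormatException.
public class ContainerIdParse {
    static String parse(String containerId) {
        String[] parts = containerId.split("_");
        try {
            long clusterTs = Long.parseLong(parts[1]); // fails on "e17"
            return "ok, cluster timestamp = " + clusterTs;
        } catch (NumberFormatException e) {
            return "invalid numeric field: " + parts[1];
        }
    }
    public static void main(String[] args) {
        // old-style ID parses; the epoch-style ID's "e17" token breaks it
        System.out.println(parse("container_1420542_0077_01_000003"));
        System.out.println(parse("container_e17_1420542_0077_01_000003"));
    }
}
```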