Marcelo,

The error below is from the application logs. The Spark Streaming context is 
initialized and actively processing data at the point where YARN claims that 
the context was never initialized.

There are a number of errors in the logs, but they are all associated with 
the StreamingContext (ssc) shutting down.
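
One note: the 100000 ms in the error matches the default of the
spark.yarn.am.waitTime setting (100s), which is how long the YARN
ApplicationMaster waits in yarn-cluster mode for the SparkContext to be
created. If the context were merely slow to initialize rather than failing
outright, the timeout could be raised at submit time, along these lines
(the class and jar names below are placeholders):

  spark-submit --master yarn-cluster \
    --conf spark.yarn.am.waitTime=200s \
    --class com.example.StreamingJob streaming-job.jar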

Regards,

Bryan Jeffrey

-----Original Message-----
From: "Marcelo Vanzin" <van...@cloudera.com>
Sent: 9/23/2015 5:55 PM
To: "Bryan Jeffrey" <bryan.jeff...@gmail.com>
Cc: "user" <user@spark.apache.org>
Subject: Re: Yarn Shutting Down Spark Processing

Did you look at your application's logs (using the "yarn logs" command)?
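
Assuming log aggregation is enabled on your cluster, something like:

  yarn logs -applicationId <your application ID>

will dump the container logs once the application has finished.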

That error means your application is failing to create a SparkContext.
Either you have a bug in your code, or there will be an error earlier in
the log pointing at the actual reason for the failure.
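
Also note that in yarn-cluster mode the SparkContext has to be created by
your main() before the ApplicationMaster's timeout fires. A minimal sketch
of the expected driver shape (the object name and the socket source are
made-up stand-ins; substitute your Kafka input):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object StreamingJob {
    def main(args: Array[String]): Unit = {
      // Create the context first; the YARN AM is waiting on this.
      val conf = new SparkConf().setAppName("StreamingJob")
      val ssc = new StreamingContext(conf, Seconds(10))

      // Define the processing (socket source used here for brevity).
      val lines = ssc.socketTextStream("localhost", 9999)
      lines.count().print()

      // Only block once the context is up and started.
      ssc.start()
      ssc.awaitTermination()
    }
  }

If main() blocks (or throws) before the StreamingContext is constructed,
the AM never sees a context and fails the application with exactly this
message.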

On Tue, Sep 22, 2015 at 5:49 PM, Bryan Jeffrey <bryan.jeff...@gmail.com> wrote:
> Hello.
>
> I have a Spark Streaming job running on a cluster managed by YARN.  The
> job starts and receives data from Kafka.  It processes data correctly, and
> then after several seconds I see the following error:
>
> 15/09/22 14:53:49 ERROR yarn.ApplicationMaster: SparkContext did not
> initialize after waiting for 100000 ms. Please check earlier log output for
> errors. Failing the application.
> 15/09/22 14:53:49 INFO yarn.ApplicationMaster: Final app status: FAILED,
> exitCode: 13, (reason: Timed out waiting for SparkContext.)
>
> The Spark process is then (obviously) shut down by YARN.
>
> What do I need to change to allow YARN to initialize Spark Streaming (vs.
> batch) jobs?
>
> Thank you,
>
> Bryan Jeffrey



-- 
Marcelo
