Hi All,
I am trying to submit a Spark application using the YARN REST API. I am
able to submit the application, but the final status shows as 'UNDEFINED'.
A couple of other observations:
The user shows as dr.who
The application type is empty even though I specify it as Spark
Has anyone had this problem before?
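For anyone comparing payloads: the ResourceManager REST submission is a two-step flow, first a POST to /ws/v1/cluster/apps/new-application to reserve an application id, then a POST of the submission JSON to /ws/v1/cluster/apps. Below is a minimal sketch of building that body in Python; the application id, name, and launch command are hypothetical placeholders, and one thing to double-check is that "application-type" is set explicitly in the JSON, since YARN does not infer it:

```python
import json

def build_submission(app_id, app_name, command):
    """Build the JSON body for POST /ws/v1/cluster/apps.

    'application-type' must be supplied explicitly in the payload;
    otherwise YARN records it as empty.
    """
    return {
        "application-id": app_id,
        "application-name": app_name,
        "application-type": "SPARK",
        "am-container-spec": {
            "commands": {
                # Placeholder launch command for the application master.
                "command": command,
            },
        },
    }

payload = build_submission(
    "application_1425500000000_0001",  # id returned by /apps/new-application
    "my-spark-app",
    "{{SPARK_HOME}}/bin/spark-submit --master yarn ...",
)
print(json.dumps(payload, indent=2))
```

If the type still shows up empty after setting this field, the payload is worth dumping and diffing against what the RM actually received.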
I am
> private val UPDATE_INTERVAL_MS =
> conf.getInt("spark.history.fs.updateInterval",
> conf.getInt("spark.history.updateInterval", 10)) * 1000
>
> private val retainedApplications =
> conf.getInt("spark.history.retainedApplications", 50)
>
>
> On Tue, Mar 10, 2015 at 12:37
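So per that code, the defaults are 10 (seconds, multiplied by 1000 into milliseconds) for the update interval and 50 retained applications. If you want them explicit rather than relying on the defaults, a spark-defaults.conf sketch would be:

```
# spark-defaults.conf
spark.history.fs.updateInterval      10
spark.history.retainedApplications   50
```

Note spark.history.updateInterval is only the older fallback name; spark.history.fs.updateInterval takes precedence when both are set.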
Hi All,
What are the default values for the following conf properties if we don't
set them in the conf file?
# spark.history.fs.updateInterval 10
# spark.history.retainedApplications 500
Regards,
Srini.
this file manually it is showing
application in the UI. So the problem is with the application, which is
not calling the stop() method on the SparkContext.
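For anyone hitting the same thing: the application-side fix is to make sure stop() runs even when the job fails, otherwise the event log is never marked complete and the history server skips it. A sketch of the pattern, using a stand-in class here instead of a real SparkContext so it runs anywhere:

```python
class FakeSparkContext:
    """Stand-in for SparkContext, only to illustrate the shutdown pattern."""
    def __init__(self):
        self.stopped = False

    def stop(self):
        # In real Spark this closes out the event log so the
        # history server will list the application.
        self.stopped = True

sc = FakeSparkContext()
try:
    pass  # ... application logic; may raise ...
finally:
    sc.stop()  # runs on success and on failure alike

assert sc.stopped
```

The same try/finally shape applies verbatim in a Scala or Java driver around the real SparkContext.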
Thank you and Todd for helping. Hopefully I will be able to apply these on
the actual cluster.
Regards,
Srini.
On Wed, Mar 4, 2015 at 10:20 AM, Srini Ka
-bin-hadoop2.4/bin/tmp/spark-events
On Wed, Mar 4, 2015 at 10:15 AM, Marcelo Vanzin wrote:
> On Wed, Mar 4, 2015 at 10:08 AM, Srini Karri wrote:
> > spark.executor.extraClassPath
> >
> D:\\Apache\\spark-1.2.1-bin-hadoop2\\spark-1.2.1-bin-hadoop2.4\\bin\\classes
> > sp
ished, and will
> take you to the history server to view the app's UI.
>
>
>
> On Tue, Mar 3, 2015 at 9:47 AM, Srini Karri wrote:
> > Hi All,
> >
> > I am having trouble finding data related to my requirement. Here is the
> > context, I have trie
Hi All,
I am having trouble finding data related to my requirement. Here is the
context: I have tried a standalone Spark installation on Windows, and I am
able to submit the logs and see the history of events. My question is: is
it possible to achieve the same monitoring UI experience with Yarn Clu