How did you specify the HDFS path?  When I put

spark.eventLog.dir       hdfs://crosby.research.intel-research.net:54310/tmp/spark-events

in my spark-defaults.conf file, I receive the following error:

An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.IOException: Call to crosby.research.intel-research.net/10.212.84.53:54310
failed on local exception: java.io.EOFException

-Brad


On Thu, Aug 28, 2014 at 12:26 PM, SK <skrishna...@gmail.com> wrote:

> I was recently able to solve this problem for standalone mode. For that
> mode, I did not use a history server. Instead, I set spark.eventLog.dir (in
> conf/spark-defaults.conf) to a directory in HDFS; this directory must be
> writable by the master and accessible to all the nodes.
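>
> For reference, a minimal sketch of what that looks like in
> conf/spark-defaults.conf (the NameNode host and port below are
> placeholders, not actual cluster values; the port must match the
> fs.defaultFS setting in the cluster's Hadoop configuration):
>
>     spark.eventLog.enabled   true
>     spark.eventLog.dir       hdfs://<namenode-host>:8020/tmp/spark-events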
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-webUI-application-details-page-tp3490p13055.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>