Yes, currently if the logs are aggregated, then accessing them through the
UI does not work. You can create a JIRA to improve this if you would like.
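
(For what it's worth, the "Log Server url may not be configured" part of
the error usually means yarn.log.server.url is not set on the NodeManagers.
A minimal yarn-site.xml sketch, with a placeholder JobHistoryServer host,
would be:

  <property>
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
  </property>
  <property>
    <name>yarn.log.server.url</name>
    <!-- placeholder host; point this at your actual log server -->
    <value>http://jhs.example.com:19888/jobhistory/logs</value>
  </property>

With that set, the NM page can at least redirect to the aggregated logs on
the MapReduce JobHistoryServer; making the Spark History Server itself read
aggregated logs would still need the JIRA above.)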

On Thu, Jun 8, 2017 at 1:43 PM, ckhari4u <ckhar...@gmail.com> wrote:

> Hey Guys,
>
> I am hitting the below issue when trying to access the STDOUT/STDERR logs
> in the Spark History Server for the executors of a Spark application run
> in YARN mode. I have enabled YARN log aggregation.
>
> Repro Steps:
>
> 1) Run spark-shell in YARN client mode, or run the Pi job in YARN mode.
> 2) Once the job has completed (in the case of spark-shell, exit after
> doing some simple operations), try to access the STDOUT or STDERR logs of
> the application from the Executors tab in the Spark History Server UI.
> 3) If YARN log aggregation is enabled, the logs are no longer available
> in the NodeManager's local log location
> (${yarn.nodemanager.log-dirs}/application_${appid}), but the History
> Server still tries to read them from there, giving the below error in
> the UI:
>
>
>
> Failed redirect for container_e31_1496881617682_0003_01_000002
> Failed while trying to construct the redirect url to the log server. Log
> Server url may not be configured
> java.lang.Exception: Unknown container. Container either has not started or
> has already completed or doesn't belong to this node at all.
>
>
> Either the Spark History Server should be able to read the aggregated
> logs and display them in the UI, or it should give a graceful message. As
> of now it redirects to the NM webpage and tries to fetch the logs from
> the NodeManager's local location.
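>
> (As a workaround, the aggregated logs are still reachable from the
> command line; a sketch using the yarn logs CLI with the application id
> from the error above:
>
>   yarn logs -applicationId application_1496881617682_0003
>
> Depending on the Hadoop version, a single container's logs can also be
> fetched by adding -containerId, possibly together with -nodeAddress.)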
>
>
>
>
>
> --
> View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-History-Server-does-not-redirect-to-Yarn-aggregated-logs-for-container-logs-tp21706.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
