[ 
https://issues.apache.org/jira/browse/SPARK-19316?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15840348#comment-15840348
 ] 

Jisoo Kim commented on SPARK-19316:
-----------------------------------

Found the duplicate and opened a PR to resolve the issue: 
https://github.com/apache/spark/pull/16714. I wasn't sure whether I needed to 
include this JIRA ticket in the PR title, so I only mentioned the older one.

> Spark event logs are huge compared to 1.5.2
> -------------------------------------------
>
>                 Key: SPARK-19316
>                 URL: https://issues.apache.org/jira/browse/SPARK-19316
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>            Reporter: Jisoo Kim
>
> I have a Spark application with many tasks (more than 40k). The event logs 
> for such an application used to be around 2 GB when I was running on a Spark 
> 1.5.2 standalone cluster. Now that I am using Spark 2.0 with Mesos, the size 
> of the event log for a comparable application has drastically increased from 
> 2 GB to 60 GB with a similar number of tasks. This is affecting the Spark 
> History Server, since it has trouble reading such a huge event log. I wonder 
> whether this increase in event log size is expected in Spark 2.0.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
