Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/880#discussion_r13158307
  
    --- Diff: docs/configuration.md ---
    @@ -201,54 +282,41 @@ Apart from these, the following properties are also available, and may be useful
       </td>
     </tr>
     <tr>
    -  <td><code>spark.ui.filters</code></td>
    -  <td>None</td>
    +  <td><code>spark.ui.killEnabled</code></td>
    +  <td>true</td>
       <td>
    -    Comma separated list of filter class names to apply to the Spark web ui. The filter should be a
    -    standard javax servlet Filter. Parameters to each filter can also be specified by setting a
    -    java system property of spark.&lt;class name of filter&gt;.params='param1=value1,param2=value2'
    -    (e.g. -Dspark.ui.filters=com.test.filter1 -Dspark.com.test.filter1.params='param1=foo,param2=testing')
    +    Allows stages and corresponding jobs to be killed from the web ui.
       </td>
     </tr>
     <tr>
    -  <td><code>spark.ui.acls.enable</code></td>
    +  <td><code>spark.eventLog.enabled</code></td>
       <td>false</td>
       <td>
    -    Whether spark web ui acls should are enabled. If enabled, this checks to see if the user has
    -    access permissions to view the web ui. See <code>spark.ui.view.acls</code> for more details.
    -    Also note this requires the user to be known, if the user comes across as null no checks
    -    are done. Filters can be used to authenticate and set the user.
    -  </td>
    -</tr>
    -<tr>
    -  <td><code>spark.ui.view.acls</code></td>
    -  <td>Empty</td>
    -  <td>
    -    Comma separated list of users that have view access to the spark web ui. By default only the
    -    user that started the Spark job has view access.
    -  </td>
    -</tr>
    -<tr>
    -  <td><code>spark.ui.killEnabled</code></td>
    -  <td>true</td>
    -  <td>
    -    Allows stages and corresponding jobs to be killed from the web ui.
    +    Whether to log spark events, useful for reconstructing the Web UI after the application has
    +    finished.
       </td>
     </tr>
     <tr>
    -  <td><code>spark.shuffle.compress</code></td>
    -  <td>true</td>
    +  <td><code>spark.eventLog.compress</code></td>
    +  <td>false</td>
       <td>
    -    Whether to compress map output files. Generally a good idea.
    +    Whether to compress logged events, if <code>spark.eventLog.enabled</code> is true.
    --- End diff --
    
    It doesn't do application-aware compression; it just uses a standard stream compression algorithm.
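    
    For context, the properties this diff documents are typically set in `conf/spark-defaults.conf`. A minimal sketch (the values here are illustrative, not recommendations):
    
    ```
    # Log Spark events so the Web UI can be reconstructed after the app finishes
    spark.eventLog.enabled   true
    # Stream-compress the event log (only takes effect when eventLog is enabled)
    spark.eventLog.compress  true
    # Allow killing stages and their jobs from the web ui
    spark.ui.killEnabled     true
    ```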

