Github user mateiz commented on a diff in the pull request:

    https://github.com/apache/spark/pull/880#discussion_r13110289
  
    --- Diff: docs/configuration.md ---
    @@ -705,40 +698,69 @@ Apart from these, the following properties are also available, and may be useful
       </td>
     </tr>
     <tr>
    -  <td><code>spark.task.cpus</code></td>
    -  <td>1</td>
    +  <td><code>spark.ui.filters</code></td>
    +  <td>None</td>
       <td>
    -    Number of cores to allocate for each task.
    +    Comma-separated list of filter class names to apply to the Spark web UI. The filter should be a
    +    standard javax servlet Filter. Parameters to each filter can also be specified by setting a
    +    Java system property of spark.&lt;class name of filter&gt;.params='param1=value1,param2=value2'
    +    (e.g. -Dspark.ui.filters=com.test.filter1
    +    -Dspark.com.test.filter1.params='param1=foo,param2=testing')
       </td>
     </tr>
     <tr>
    -  <td><code>spark.executor.extraJavaOptions</code></td>
    -  <td>(none)</td>
    +  <td><code>spark.ui.acls.enable</code></td>
    +  <td>false</td>
       <td>
    -    A string of extra JVM options to pass to executors. For instance, GC settings or other
    -    logging. Note that it is illegal to set Spark properties or heap size settings with this
    -    option. Spark properties should be set using a SparkConf object or the
    -    spark-defaults.conf file used with the spark-submit script. Heap size settings can be set
    -    with spark.executor.memory.
    +    Whether Spark web UI ACLs are enabled. If enabled, this checks to see if the user has
    +    access permissions to view the web UI. See <code>spark.ui.view.acls</code> for more details.
    +    Also note this requires the user to be known; if the user comes across as null, no checks
    +    are done. Filters can be used to authenticate and set the user.
       </td>
     </tr>
     <tr>
    -  <td><code>spark.executor.extraClassPath</code></td>
    -  <td>(none)</td>
    +  <td><code>spark.ui.view.acls</code></td>
    +  <td>Empty</td>
       <td>
    -    Extra classpath entries to append to the classpath of executors. This exists primarily
    -    for backwards-compatibility with older versions of Spark. Users typically should not need
    -    to set this option.
    +    Comma-separated list of users that have view access to the Spark web UI. By default only the
    +    user that started the Spark job has view access.
       </td>
     </tr>
    +</table>
    +
    +#### Spark Streaming
    +<table class="table">
    +<tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
     <tr>
    -  <td><code>spark.executor.extraLibraryPath</code></td>
    -  <td>(none)</td>
    +  <td><code>spark.cleaner.ttl</code></td>
    --- End diff --
    
    This is not really a streaming-specific setting. I'd just move it into execution behavior since it's also useful for other long-running apps.
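    
    For reference, a minimal sketch (assuming Spark 1.x) of how the properties from this diff could be set programmatically through SparkConf rather than via -D flags or spark-defaults.conf. com.test.filter1 is the placeholder filter class from the example above; the TTL value and user names are made up for illustration:
    
        // Minimal sketch: not from the diff, just showing the properties in use.
        import org.apache.spark.{SparkConf, SparkContext}
        
        val conf = new SparkConf()
          .setAppName("config-example")
          // Periodic metadata cleanup; useful for any long-running app, not just streaming.
          .set("spark.cleaner.ttl", "3600")
          // Servlet filter applied to the web UI, plus its parameters.
          .set("spark.ui.filters", "com.test.filter1")
          .set("spark.com.test.filter1.params", "param1=foo,param2=testing")
          // Restrict web UI access to the listed users (requires the filter to set the user).
          .set("spark.ui.acls.enable", "true")
          .set("spark.ui.view.acls", "alice,bob")
        
        val sc = new SparkContext(conf)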


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
