Github user rxin commented on a diff in the pull request:
https://github.com/apache/spark/pull/856#discussion_r12977642
--- Diff: docs/configuration.md ---
@@ -750,12 +779,95 @@ The following variables can be set in `spark-env.sh`:
 </tr>
 </table>

-In addition to the above, there are also options for setting up the Spark [standalone cluster scripts](spark-standalone.html#cluster-launch-scripts), such as number of cores to use on each machine and maximum memory.
+In addition to the above, there are also options for setting up the Spark [standalone cluster
+scripts](spark-standalone.html#cluster-launch-scripts), such as number of cores to use on each
+machine and maximum memory.

-Since `spark-env.sh` is a shell script, some of these can be set programmatically -- for example, you might
-compute `SPARK_LOCAL_IP` by looking up the IP of a specific network interface.
+Since `spark-env.sh` is a shell script, some of these can be set programmatically -- for example,
+you might compute `SPARK_LOCAL_IP` by looking up the IP of a specific network interface.

 # Configuring Logging

-Spark uses [log4j](http://logging.apache.org/log4j/) for logging. You can configure it by adding a `log4j.properties`
-file in the `conf` directory. One way to start is to copy the existing `log4j.properties.template` located there.
+Spark uses [log4j](http://logging.apache.org/log4j/) for logging. You can configure it by adding a
+`log4j.properties` file in the `conf` directory. One way to start is to copy the existing
+`log4j.properties.template` located there.
+
+# Configuring ports for network security
--- End diff ---
Capitalize the title to be consistent with the rest of the file
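(Not part of the quoted diff.) As an illustration of the re-wrapped paragraph above -- computing `SPARK_LOCAL_IP` in `conf/spark-env.sh` by looking up the IP of a specific network interface -- here is a minimal sketch. The interface name `eth0` and the use of the Linux `ip` utility are assumptions for illustration only, not something the PR specifies:

```sh
# Hypothetical spark-env.sh fragment: derive SPARK_LOCAL_IP from the first
# IPv4 address bound to eth0 (interface name and `ip` availability assumed).
SPARK_LOCAL_IP="$(ip -4 addr show eth0 | awk '/inet / {print $2}' | cut -d/ -f1)"
export SPARK_LOCAL_IP
```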