Repository: spark
Updated Branches:
  refs/heads/branch-1.6 47a918661 -> 5153356d5


[MINOR][DOCS] typo in docs/configuration.md

`</code>` end tag missing slash in
docs/configuration.md (L308-L339)

ref #8795

Author: Kai Jiang <jiang...@gmail.com>

Closes #9715 from vectorijk/minor-typo-docs.

(cherry picked from commit 9a73b33a9a440d7312b92df9f6a9b9e17917b582)
Signed-off-by: Sean Owen <so...@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5153356d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5153356d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5153356d

Branch: refs/heads/branch-1.6
Commit: 5153356d5df37c6b38fd9cb6184a84d87387dd49
Parents: 47a9186
Author: Kai Jiang <jiang...@gmail.com>
Authored: Sat Nov 14 11:59:37 2015 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Sat Nov 14 11:59:50 2015 +0000

----------------------------------------------------------------------
 docs/configuration.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/5153356d/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index c276e8e..d961f43 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -305,7 +305,7 @@ Apart from these, the following properties are also available, and may be useful
   <td>daily</td>
   <td>
     Set the time interval by which the executor logs will be rolled over.
-    Rolling is disabled by default. Valid values are <code>daily</code>, <code>hourly<code>, <code>minutely<code> or
+    Rolling is disabled by default. Valid values are <code>daily</code>, <code>hourly</code>, <code>minutely</code> or
     any interval in seconds. See <code>spark.executor.logs.rolling.maxRetainedFiles</code>
     for automatic cleaning of old logs.
   </td>
@@ -330,13 +330,13 @@ Apart from these, the following properties are also available, and may be useful
   <td><code>spark.python.profile</code></td>
   <td>false</td>
   <td>
-    Enable profiling in Python worker, the profile result will show up by <code>sc.show_profiles()<code>,
+    Enable profiling in Python worker, the profile result will show up by <code>sc.show_profiles()</code>,
     or it will be displayed before the driver exiting. It also can be dumped into disk by
-    <code>sc.dump_profiles(path)<code>. If some of the profile results had been displayed manually,
+    <code>sc.dump_profiles(path)</code>. If some of the profile results had been displayed manually,
     they will not be displayed automatically before driver exiting.
 
-    By default the <code>pyspark.profiler.BasicProfiler<code> will be used, but this can be overridden by
-    passing a profiler class in as a parameter to the <code>SparkContext<code> constructor.
+    By default the <code>pyspark.profiler.BasicProfiler</code> will be used, but this can be overridden by
+    passing a profiler class in as a parameter to the <code>SparkContext</code> constructor.
   </td>
 </tr>
 <tr>


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
