mridulm commented on code in PR #46571:
URL: https://github.com/apache/spark/pull/46571#discussion_r1603972872


##########
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##########
@@ -1317,6 +1317,16 @@ package object config {
           s" be less than or equal to ${ByteArrayMethods.MAX_ROUNDED_ARRAY_LENGTH}.")
       .createWithDefault(64 * 1024 * 1024)
 
+  private[spark] val CHECKPOINT_DIR =
+    ConfigBuilder("spark.checkpoint.dir")
+      .doc(
+        "Equivalent with SparkContext.setCheckpointDir. If set, the path becomes" +
+          "the default directory for checkpointing. It can be overwritten by" +
+          "SparkContext.setCheckpointDir.")

Review Comment:
   nit:
    ```suggestion
            "Set the default directory for checkpointing. It can be overwritten by " +
              "SparkContext.setCheckpointDir.")
    ```
   
   (I missed this in my earlier pass, my bad)



##########
docs/configuration.md:
##########
@@ -1795,6 +1795,16 @@ Apart from these, the following properties are also available, and may be useful
   </td>
   <td>0.6.0</td>
 </tr>
+<tr>
+  <td><code>spark.checkpoint.dir</code></td>
+  <td>(none)</td>
+  <td>
+    Equivalent with SparkContext.setCheckpointDir. If set, the path becomes
+    the default directory for checkpointing. It can be overwritten by
+    SparkContext.setCheckpointDir.

Review Comment:
   Same as above.
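   For reference, a minimal sketch of the precedence described in this doc entry, assuming the `spark.checkpoint.dir` key added in this PR (the directory paths are purely illustrative): the configured value acts as the default checkpoint directory, and an explicit `SparkContext.setCheckpointDir` call overrides it.

   ```scala
   import org.apache.spark.{SparkConf, SparkContext}

   object CheckpointDirPrecedence {
     def main(args: Array[String]): Unit = {
       val conf = new SparkConf()
         .setMaster("local[2]")
         .setAppName("checkpoint-dir-precedence")
         // Default checkpoint directory, as documented for the new config.
         .set("spark.checkpoint.dir", "/tmp/spark-default-checkpoints")

       val sc = new SparkContext(conf)

       // An explicit call takes precedence over the configured default.
       sc.setCheckpointDir("/tmp/spark-override-checkpoints")

       val rdd = sc.parallelize(1 to 10)
       rdd.checkpoint()   // marks the RDD for checkpointing
       rdd.count()        // the action triggers the actual checkpoint write

       sc.stop()
     }
   }
   ```

   Without the explicit `setCheckpointDir` call, checkpoint data would be written under the configured default directory instead.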



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
