holdenk commented on a change in pull request #31707:
URL: https://github.com/apache/spark/pull/31707#discussion_r586811183
##########
File path: core/src/main/scala/org/apache/spark/rdd/RDD.scala
##########
@@ -2013,7 +2013,7 @@ abstract class RDD[T: ClassTag](
* Returns the deterministic level of this RDD's output. Please refer to [[DeterministicLevel]]
* for the definition.
*
- * By default, an reliably checkpointed RDD, or RDD without parents(root RDD) is DETERMINATE. For
+ *l By default, an reliably checkpointed RDD, or RDD without parents(root RDD) is DETERMINATE. For
Review comment:
nit: is the stray `l` after the `*` intentional?
##########
File path: core/src/test/scala/org/apache/spark/util/JsonProtocolSuite.scala
##########
@@ -1171,6 +1171,7 @@ private[spark] object JsonProtocolSuite extends Assertions {
| "Replication": 1
| },
| "Barrier" : false,
+ | "DeterministicLevel" : "DETERMINATE",
Review comment:
It might be good to add an INDETERMINATE test case to this file.
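A rough sketch of what such a case might look like, assuming only the field value changes from the DETERMINATE line added above (the surrounding expected-JSON lines are elided, and the fragment name is made up for illustration):

// Sketch only: expected-JSON fragment for an RDDInfo whose output level is
// INDETERMINATE, mirroring the "DeterministicLevel" field added in this PR.
val indeterminateRddInfoFragment =
  """|    "Barrier" : false,
     |    "DeterministicLevel" : "INDETERMINATE",
     |""".stripMargin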
##########
File path: core/src/main/scala/org/apache/spark/rdd/RDD.scala
##########
@@ -2022,7 +2022,7 @@ abstract class RDD[T: ClassTag](
// TODO: make it public so users can set deterministic level to their custom RDDs.
// TODO: this can be per-partition. e.g. UnionRDD can have different deterministic level for
// different partitions.
- private[spark] final lazy val outputDeterministicLevel: DeterministicLevel.Value = {
+ private[spark] final def outputDeterministicLevel: DeterministicLevel.Value = {
Review comment:
Should this be exposed, given the TODO above about making it public?
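If that TODO were followed through, a custom RDD could declare its own level. A minimal sketch, assuming a public or protected override point is exposed; today the member is private[spark] and final, so the override shown below is hypothetical and kept commented out:

import scala.reflect.ClassTag
import org.apache.spark.{Partition, TaskContext}
import org.apache.spark.rdd.RDD

// Sketch only: a pass-through RDD whose output order could depend on runtime
// timing, so it would want to report DeterministicLevel.INDETERMINATE.
class TimingDependentRDD[T: ClassTag](parent: RDD[T]) extends RDD[T](parent) {

  override def compute(split: Partition, context: TaskContext): Iterator[T] =
    firstParent[T].iterator(split, context)

  override protected def getPartitions: Array[Partition] =
    firstParent[T].partitions

  // Hypothetical hook, assuming the TODO ("make it public") is implemented:
  // override def outputDeterministicLevel: DeterministicLevel.Value =
  //   DeterministicLevel.INDETERMINATE
}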