tgravescs commented on a change in pull request #28874:
URL: https://github.com/apache/spark/pull/28874#discussion_r444392875



##########
File path: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala
##########
@@ -181,23 +181,23 @@ private[history] class FsHistoryProvider(conf: SparkConf, clock: Clock)
     processing.remove(path.getName)
   }
 
-  private val blacklist = new ConcurrentHashMap[String, Long]
+  private val ignoreList = new ConcurrentHashMap[String, Long]
 
   // Visible for testing
-  private[history] def isBlacklisted(path: Path): Boolean = {
-    blacklist.containsKey(path.getName)
+  private[history] def isIgnored(path: Path): Boolean = {
+    ignoreList.containsKey(path.getName)
   }
 
-  private def blacklist(path: Path): Unit = {
-    blacklist.put(path.getName, clock.getTimeMillis())
+  private def ignore(path: Path): Unit = {
+    ignoreList.put(path.getName, clock.getTimeMillis())
   }
 
   /**
-   * Removes expired entries in the blacklist, according to the provided `expireTimeInSeconds`.
+   * Removes expired entries in the ignoreList, according to the provided `expireTimeInSeconds`.
    */
-  private def clearBlacklist(expireTimeInSeconds: Long): Unit = {
+  private def clearIgnoreList(expireTimeInSeconds: Long): Unit = {
     val expiredThreshold = clock.getTimeMillis() - expireTimeInSeconds * 1000
-    blacklist.asScala.retain((_, creationTime) => creationTime >= expiredThreshold)
+    ignoreList.asScala.retain((_, creationTime) => creationTime >= expiredThreshold)

Review comment:
       Since this is internal it's not as big of a deal, but I'm not sure if we want to come to consensus on https://issues.apache.org/jira/browse/SPARK-32037 terminology first. Here ignoreList seems OK, although blocklist or notAllowed (since it's an access control exception) might fit better.
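For reference, the expiry pattern in the hunk above (a `ConcurrentHashMap` keyed by file name, with entries dropped once older than a threshold) can be sketched standalone. This is a minimal illustration, not Spark's actual code; the class and method names here are made up, and the clock is injected as a function so the expiry logic is testable:

```scala
import java.util.concurrent.ConcurrentHashMap

// Minimal sketch of an expiring ignore list, keyed by entry name.
// `now` supplies the current time in milliseconds (injected for testability).
class IgnoreList(now: () => Long) {
  private val entries = new ConcurrentHashMap[String, Long]

  def ignore(name: String): Unit = entries.put(name, now())

  def isIgnored(name: String): Boolean = entries.containsKey(name)

  // Drop entries older than `expireTimeInSeconds`, mirroring clearIgnoreList.
  // ConcurrentHashMap's iterator supports remove() during iteration, so no
  // collection converters are needed.
  def clearExpired(expireTimeInSeconds: Long): Unit = {
    val threshold = now() - expireTimeInSeconds * 1000
    val it = entries.entrySet().iterator()
    while (it.hasNext) {
      if (it.next().getValue < threshold) it.remove()
    }
  }
}
```

Iterating the entry set directly sidesteps the `asScala`/`retain` call in the diff, which is deprecated on mutable maps in Scala 2.13 in favor of `filterInPlace`.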

##########
File path: docs/streaming-programming-guide.md
##########
@@ -1162,7 +1162,7 @@ joinedStream = stream1.join(stream2)
 {% endhighlight %}
 </div>
 </div>
-Here, in each batch interval, the RDD generated by `stream1` will be joined with the RDD generated by `stream2`. You can also do `leftOuterJoin`, `rightOuterJoin`, `fullOuterJoin`. Furthermore, it is often very useful to do joins over windows of the streams. That is pretty easy as well.

Review comment:
       try to minimize changes not directly related to the renaming to make 
reviewing easier

##########
File path: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveQueryFileTest.scala
##########
@@ -45,25 +45,34 @@ abstract class HiveQueryFileTest extends HiveComparisonTest {
     runOnlyDirectories.nonEmpty ||
     skipDirectories.nonEmpty
 
-  val whiteListProperty: String = "spark.hive.whitelist"
-  // Allow the whiteList to be overridden by a system property
-  val realWhiteList: Seq[String] =
-    Option(System.getProperty(whiteListProperty)).map(_.split(",").toSeq).getOrElse(whiteList)
+  val deprecatedIncludeListProperty: String = "spark.hive.whitelist"
+  val includeListProperty: String = "spark.hive.includelist"

Review comment:
       did you see what was using this? I don't see it used anywhere internally.

##########
File path: sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala
##########
@@ -2488,7 +2488,7 @@ abstract class JsonSuite extends QueryTest with SharedSparkSession with TestJson
           .json(testFile("test-data/utf16LE.json"))
           .count()
       }
-      assert(exception.getMessage.contains("encoding must not be included in the blacklist"))
+      assert(exception.getMessage.contains("encoding must not be included in the denyList"))

Review comment:
       inconsistent on how we capitalize denyList.  I've seen it this way, 
DenyList, Denylist. We should try to be consistent

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JSONOptions.scala
##########
@@ -188,7 +188,7 @@ private[sql] object JSONOptionsInRead {
   // only the first lines will have the BOM which leads to impossibility for reading
   // the rest lines. Besides of that, the lineSep option must have the BOM in such
   // encodings which can never present between lines.
-  val blacklist = Seq(
+  val DenyList = Seq(

Review comment:
       why is first letter capital here? we maybe inconsistent about it but 
usually constants are all caps or regular variable syntax.
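The two conventions the comment refers to can be shown side by side. This is illustrative only (the object name and values are made up, not Spark's actual `JSONOptionsInRead` code):

```scala
// Illustrative only: the two naming conventions for constants.
object JsonEncodingConstants {
  // all-caps (UPPER_SNAKE_CASE) constant style
  val DENY_LIST: Seq[String] = Seq("UTF-16", "UTF-32")

  // or regular variable syntax (camelCase)
  val denyList: Seq[String] = Seq("UTF-16", "UTF-32")
}
```

A capitalized camelCase name like `DenyList` matches neither convention, which is the point being raised.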



