dongjoon-hyun commented on code in PR #39015:
URL: https://github.com/apache/spark/pull/39015#discussion_r1044989412


##########
dev/sparktestsupport/utils.py:
##########
@@ -34,19 +34,22 @@ def determine_modules_for_files(filenames):
     Given a list of filenames, return the set of modules that contain those files.
     If a file is not associated with a more specific submodule, then this method will consider that
     file to belong to the 'root' module. `.github` directory is counted only in GitHub Actions,
-    and `appveyor.yml` is always ignored because this file is dedicated only to AppVeyor builds.
+    and `appveyor.yml` is always ignored because this file is dedicated only to AppVeyor builds,
+    and `README.md` is always ignored too.
 
     >>> sorted(x.name for x in determine_modules_for_files(["python/pyspark/a.py", "sql/core/foo"]))
     ['pyspark-core', 'sql']
     >>> [x.name for x in determine_modules_for_files(["file_not_matched_by_any_subproject"])]
     ['root']
-    >>> [x.name for x in determine_modules_for_files(["appveyor.yml"])]
+    >>> [x.name for x in determine_modules_for_files(["appveyor.yml", "sql/README.md"])]
     []
     """
     changed_modules = set()
     for filename in filenames:
         if filename in ("appveyor.yml",):
             continue
+        if filename.endswith("README.md"):

Review Comment:
   Since we need `endswith` for `README.md`, we have a new `if` statement, @HyukjinKwon.
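
   For context, here is a minimal sketch of how the two filters might read once the hunk is complete. The diff above is cut off right after the new `if`, so skipping the file (`continue`-style behavior, shown here as `return True`) is an assumption, and the `_ignored` helper is hypothetical, not code from the PR. It only illustrates why `endswith` cannot be folded into the existing tuple membership test:

   ```python
   def _ignored(filename):
       """Sketch: return True for files that should not map to any module.

       The `endswith` branch mirrors the new `if` in the diff above; treating
       it like the `appveyor.yml` branch (skipping the file) is an assumption,
       since the hunk shown in this review stops at that line.
       """
       # Exact-name check: tuple membership only matches whole filenames.
       if filename in ("appveyor.yml",):
           return True
       # Suffix check: `endswith` cannot be expressed in the tuple test above,
       # hence a separate `if` is needed for `README.md` files.
       if filename.endswith("README.md"):
           return True
       return False


   print(_ignored("appveyor.yml"))         # True
   print(_ignored("sql/README.md"))        # True
   print(_ignored("python/pyspark/a.py"))  # False
   ```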


