Github user shivaram commented on a diff in the pull request:
https://github.com/apache/spark/pull/7883#discussion_r36054829
--- Diff: dev/run-tests.py ---
@@ -385,6 +385,10 @@ def run_sparkr_tests():
if which("R"):
run_cmd([os.path.join(SPARK_HOME, "R", "install-dev.sh")])
+ # The R style check should be executed after `install-dev.sh`, since
+ # warnings about `no visible global function definition` appear
+ # without the installation. SEE ALSO: SPARK-9121.
+ run_cmd([os.path.join(SPARK_HOME, "dev", "lint-r")])
--- End diff --
Yeah, that's a valid point, but the R lint tests have this problem where they
don't resolve internal / private functions unless the corresponding package is
included (i.e. SparkR in this case). This is SPARK-9121, which is described in
the comment.
FWIW I think we can do this right after the Maven build finishes, as the
SparkR package would have been built at that point -- so this will run before
the Scala unit tests at least.
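
For context, here is a minimal sketch (not the actual patch) of where the lint
step could go so that it runs after the SparkR package has been installed and
before the Scala unit tests. Helper names such as which(), run_cmd(),
SPARK_HOME, build_apache_spark(), and run_scala_tests() are assumed to mirror
what dev/run-tests.py already provides; treat them as placeholders rather than
the exact API.

    # Hypothetical sketch -- assumes the which()/run_cmd() helpers and the
    # SPARK_HOME constant that dev/run-tests.py already defines.
    import os

    def run_sparkr_style_checks():
        if which("R"):
            # install-dev.sh builds and installs the SparkR package locally,
            # so lint-r can then resolve SparkR's internal / private
            # functions (SPARK-9121).
            run_cmd([os.path.join(SPARK_HOME, "R", "install-dev.sh")])
            run_cmd([os.path.join(SPARK_HOME, "dev", "lint-r")])

    def main():
        # ... environment setup, changed-module detection, other style
        # checks, etc. ...
        build_apache_spark(build_tool, hadoop_version)  # Maven/SBT build
        run_sparkr_style_checks()                       # proposed placement
        run_scala_tests(build_tool, hadoop_version, test_modules)  # Scala tests run afterwards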