Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/8051#issuecomment-129227952
I see the point that having some tests outside the `org.apache.spark` package
would be useful, since they can only exercise public methods. MiMa already
covers that concern for the project to some extent. The rest of the Spark code
does not do this, with the exception of a random class or two in core, and this
set of tests does not look like it's supposed to be checking public API stability.
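For reference, the pattern under discussion looks roughly like the sketch below (the suite name and test body are made up for illustration, not taken from this PR): a suite declared in `test.org.apache.spark` can only compile against Spark's public API, because anything marked `private[spark]` is out of scope from that package.

```scala
// Hypothetical suite living outside org.apache.spark; only the public API is reachable.
package test.org.apache.spark

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.FunSuite

class PublicApiSmokeSuite extends FunSuite {

  test("basic RDD operations through the public API only") {
    val conf = new SparkConf().setAppName("public-api-smoke").setMaster("local[2]")
    val sc = new SparkContext(conf)
    try {
      // A reference to anything private[spark] would fail to compile here,
      // which is what makes this kind of suite a public-API check.
      val sum = sc.parallelize(1 to 10).map(_ * 2).reduce(_ + _)
      assert(sum === 110)
    } finally {
      sc.stop()
    }
  }
}
```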
It doesn't look like this code was trying to be in `test.org.apache.spark`,
since it didn't declare that as its package. But then I don't know how it ever
compiled, given that its package doesn't match the directory. Fixing that is
fine, but I'm also wondering whether the other ~10 instances of this in the
code are accidental or intentional.