Github user CodingCat commented on the pull request:
https://github.com/apache/spark/pull/5252#issuecomment-87502439
@rxin, thanks for the comments.
I changed the code in SparkBuild and found several related
problems in the current code:
1. The original SparkBuild filters out all classes under `collection`.
However, some of the classes there are marked as `DeveloperApi`; shouldn't we
document all DeveloperApi classes? If so, there is a complication: some classes
carry the `DeveloperApi` annotation but are also `private[spark]`, so they are
not documented anyway (see the sketch after this list).
2. Regarding the classes under `org.apache.spark.sql.types`: nearly all
classes defined in dataTypes.scala are `DeveloperApi`s, but they were originally
excluded from the scaladoc. Similar to 1, we should actually document them,
right? (In the current version of the patch I exclude them temporarily.)
If we do document the classes under `org.apache.spark.sql.types`, shall we
also document the other helper classes in the same package, e.g.
DataTypeConversions? (It is `protected[sql]` now.)
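To make the two issues above concrete, here is a minimal sketch (the package, object, and class names are made up, and this is not the exact SparkBuild code) of the path-based source filtering and of a class that is annotated `DeveloperApi` yet stays `private[spark]`, which is why scaladoc skips it either way:

```scala
package org.apache.spark.example // made-up package, only so private[spark] compiles

import java.io.File

import org.apache.spark.annotation.DeveloperApi

object DocFilterSketch {
  // Simplified sketch of the path-based filtering in SparkBuild's unidoc
  // settings: sources whose path contains "collection" are dropped from
  // the scaladoc input. (Not the exact build code.)
  def ignoreUndocumentedSources(sources: Seq[Seq[File]]): Seq[Seq[File]] =
    sources.map(_.filterNot(_.getCanonicalPath.contains("collection")))
}

// Illustration of the conflict in point 1: annotated as a DeveloperApi,
// but also private[spark], so it never shows up in the generated docs.
// (ExampleHelper is a made-up name, not a real Spark class.)
@DeveloperApi
private[spark] class ExampleHelper
```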
also cc: @liancheng and @marmbrus