Github user jkbradley commented on a diff in the pull request:
https://github.com/apache/spark/pull/6791#discussion_r34220035
--- Diff: python/pyspark/mllib/clustering.py ---
@@ -562,5 +564,67 @@ def _test():
exit(-1)
+class LDAModel(JavaModelWrapper):
--- End diff ---
@davies In Scala, LDAModel is abstract. LocalLDAModel and
DistributedLDAModel inherit from it. We should eventually have this same setup
in Python. What is needed to maintain backwards compatibility? If we add this
API in Spark 1.5, can we later make LDAModel abstract, and have LocalLDAModel
and DistributedLDAModel inherit from it?
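
To make the question concrete, here is a minimal sketch of one backwards-compatible path. All names and signatures below are hypothetical illustrations mirroring the Scala hierarchy, not the actual pyspark implementation: LDAModel becomes a base class carrying the shared API, so existing `isinstance` checks and method calls on it keep working after the subclasses are introduced.

```python
# Hypothetical sketch only -- these are NOT the real pyspark.mllib classes.
# The idea: keep the shared API on LDAModel so code written against the
# Spark 1.5 concrete class keeps working once subclasses exist.

class LDAModel:
    """Base class holding the API shared by all LDA model types."""

    def __init__(self, k, vocab_size):
        self._k = k
        self._vocab_size = vocab_size

    def k(self):
        # Number of topics.
        return self._k

    def vocabSize(self):
        # Size of the vocabulary (number of terms).
        return self._vocab_size

    def describeTopics(self, maxTermsPerTopic=None):
        # Subclasses decide how topic descriptions are materialized.
        raise NotImplementedError("implemented by subclasses")


class LocalLDAModel(LDAModel):
    """Model whose topics matrix is stored locally on the driver."""

    def __init__(self, k, vocab_size, topics_matrix):
        super().__init__(k, vocab_size)
        # topics_matrix: one list of term weights per topic (illustrative).
        self._topics = topics_matrix

    def describeTopics(self, maxTermsPerTopic=None):
        # Return (term indices, weights) per topic, sorted by weight desc.
        result = []
        for topic in self._topics:
            ranked = sorted(range(len(topic)), key=lambda i: -topic[i])
            if maxTermsPerTopic is not None:
                ranked = ranked[:maxTermsPerTopic]
            result.append((ranked, [topic[i] for i in ranked]))
        return result


class DistributedLDAModel(LDAModel):
    """Model whose per-document topic distributions stay on the cluster
    (left as a stub in this sketch)."""
    pass
```

Because old code holds references to `LDAModel` only through its public methods, promoting it to an (effectively) abstract base and returning `LocalLDAModel` instances from training would not break `isinstance(model, LDAModel)` checks or calls to the shared methods.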