Github user mengxr commented on a diff in the pull request:
https://github.com/apache/spark/pull/6499#discussion_r32784734
--- Diff: python/pyspark/mllib/clustering.py ---
@@ -264,6 +270,190 @@ def train(cls, rdd, k, convergenceTol=1e-3, maxIterations=100, seed=None, initia
return GaussianMixtureModel(weight, mvg_obj)
+class StreamingKMeansModel(KMeansModel):
+ """
+ .. note:: Experimental
+ Clustering model which can perform an online update of the centroids.
+
+ The update formula for each centroid is given by
+ c_t+1 = [(c_t * n_t * a) + (x_t * m_t)] / [(n_t * a) + m_t]
+ n_t+1 = n_t * a + m_t
+
+ where
+ c_t: Centroid at the t-th iteration.
+ n_t: Number of samples (or weight) associated with the centroid
+ at the t-th iteration.
+ x_t: Centroid of the new data closest to c_t.
+ m_t: Number of samples (or weight) of the new data closest to c_t.
+ c_t+1: New centroid.
+ n_t+1: New weight.
+ a: Decay factor, which controls how quickly past data is forgotten.
+
+ Note that if a is set to 1, the result is the weighted mean of the
+ previous and new data. If it is set to zero, the old centroids are
+ completely forgotten.
+
+ >>> initCenters, initWeights = [[0.0, 0.0], [1.0, 1.0]], [1.0, 1.0]
+ >>> stkm = StreamingKMeansModel(initCenters, initWeights)
+ >>> data = sc.parallelize([[-0.1, -0.1], [0.1, 0.1],
+ ... [0.9, 0.9], [1.1, 1.1]])
+ >>> stkm = stkm.update(data, 1.0, u"batches")
+ >>> stkm.centers
+ array([[ 0., 0.],
+ [ 1., 1.]])
+ >>> stkm.predict([-0.1, -0.1]) == stkm.predict([0.1, 0.1]) == 0
+ True
--- End diff --
It should be sufficient to show only one example, and simulate what users
would type in a console:
~~~python
>>> stkm.predict([-0.1, -0.1])
0
~~~
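For reference, the update rule quoted in the diff can be sketched as a small standalone function. This is a minimal illustration of the formula only, not code from the PR; the helper name `update_centroid` and its signature are made up here:

```python
import numpy as np

def update_centroid(c_t, n_t, x_t, m_t, a):
    """One streaming k-means step for a single centroid.

    c_t: current centroid, n_t: its accumulated weight,
    x_t: mean of the new points assigned to this centroid,
    m_t: number (or weight) of those new points,
    a:   decay factor in [0, 1].
    """
    c_t = np.asarray(c_t, dtype=float)
    x_t = np.asarray(x_t, dtype=float)
    # n_t+1 = n_t * a + m_t
    n_next = n_t * a + m_t
    # c_t+1 = [(c_t * n_t * a) + (x_t * m_t)] / [(n_t * a) + m_t]
    c_next = (c_t * n_t * a + x_t * m_t) / n_next
    return c_next, n_next
```

With a = 0 the old weight drops out and `c_next` collapses to `x_t`, matching the note that the old centroids are completely forgotten; with a = 1 it is the plain weighted mean of old and new data.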