Hi,
I computed a windowed mean as follows:

  val counts = myStream.map(x => (x.toDouble, 1))
    .reduceByWindow(
      (a, b) => (a._1 + b._1, a._2 + b._2),
      (a, b) => (a._1 - b._1, a._2 - b._2),
      Seconds(2), Seconds(2))
  val windowMean = counts.map(x => x._1.toFloat / x._2)

Now I want to find the cumulative moving average (CMA) over all past values after a certain time. I am thinking of using updateStateByKey with the same key for all values.
Has anyone done CMA with Spark Streaming before? Any ideas? Also, is it feasible?
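For what it's worth, the updateStateByKey idea can be sketched roughly as below. This is a minimal illustration, not the definitive answer: the state is assumed to be a running (sum, count) pair under a single dummy key, and the update function is written as a plain Scala function (Spark-free) so the arithmetic can be checked on its own. The names updateCma and cma are made up for the example.

```scala
// State for the CMA: (running sum, running count).
// newValues holds the (sum, count) pairs arriving in the current batch;
// state holds the accumulated (sum, count) so far, if any.
def updateCma(newValues: Seq[(Double, Long)],
              state: Option[(Double, Long)]): Option[(Double, Long)] = {
  // Combine all contributions from this batch into one (sum, count) pair.
  val (batchSum, batchCount) =
    newValues.foldLeft((0.0, 0L)) { case ((s, c), (vs, vc)) => (s + vs, c + vc) }
  // Add them to the previous state (or to zero on the first batch).
  val (prevSum, prevCount) = state.getOrElse((0.0, 0L))
  Some((prevSum + batchSum, prevCount + batchCount))
}

// The CMA at any point is simply running sum / running count.
def cma(state: (Double, Long)): Double = state._1 / state._2
```

In the streaming job this would be wired up roughly like (untested sketch, checkpointing required for updateStateByKey):

  val keyed = myStream.map(x => ("cma", (x.toDouble, 1L)))
  val state = keyed.updateStateByKey(updateCma _)
  val cmaStream = state.mapValues(s => s._1 / s._2)

Using one constant key ("cma") funnels every value into the same state entry, which matches the "same key for all values" idea; the obvious trade-off is that all state updates for that key happen in a single task.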
Regards,
Laeeq
