I solved the problem by passing the HLL object into the function, updating it
there, and returning it as the new state. That was obviously a mental block on
my part... ;-)
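The approach above — update the object inside the function and hand it back as the new state — can be sketched roughly like this in Python, in the spirit of Spark Streaming's `updateStateByKey`. A plain set stands in for a real HLL, and all names here are illustrative, not an actual Spark API:

```python
# Sketch of the "pass state in, return new state" pattern.
# A plain set plays the role of the HLL; hll.offer(v) becomes set.add(v).

def update_state(new_values, state):
    """Merge a batch of new values into the previous state; return the new state."""
    if state is None:
        state = set()          # first batch: start from an empty "HLL"
    merged = set(state)        # copy rather than mutate shared state in place
    for v in new_values:
        merged.add(v)          # stand-in for offering a value to the HLL
    return merged              # the returned value becomes the next state

# Simulate three micro-batches flowing through the update function.
state = None
for batch in [["a", "b"], ["b", "c"], ["c", "d"]]:
    state = update_state(batch, state)

print(sorted(state))  # -> ['a', 'b', 'c', 'd']
```

The key point is that the function never relies on a shared global object: each invocation receives the previous state and returns the next one, which is exactly the contract a stateful stream transformation expects.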
>>> purposes...
>>>
>>> So how can I manage to use one globally defined HLL object in a Spark
>>> stream transformation? I also tried to implement a custom Accumulator,
>>> but this also failed because I don't understand how to use the
>>> AccumulableParam interface. I implemented the Accumulator and overrode
>>> the add and value methods. But what do I have to do in the
>>> AccumulableParam with addAccumulator, addInPlace and zero?
>>>
>>> Thanks in advance for your help and your advice!
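For what it's worth, the three methods the quoted question asks about form a small contract: `zero` produces an identity element, `addAccumulator` folds a single input value into a partial result, and `addInPlace` merges two partial results (e.g. from different partitions). A rough Python stand-in of that contract — again with a plain set instead of a real HLL; the class and method names are illustrative, not Spark's API:

```python
from functools import reduce

class SetAccumulatorParam:
    """Illustrative stand-in for the AccumulableParam contract,
    accumulating distinct elements into a set (in place of an HLL)."""

    def zero(self, initial_value):
        # Identity element: merging it into anything changes nothing.
        return set(initial_value)

    def add_accumulator(self, acc, value):
        # Fold one input value into a partial accumulator.
        acc.add(value)
        return acc

    def add_in_place(self, acc1, acc2):
        # Merge two partial accumulators, e.g. results from two partitions.
        acc1 |= acc2
        return acc1

# Simulate what the framework does: each partition builds its own partial
# accumulator starting from zero(), then the partials are merged.
param = SetAccumulatorParam()
partitions = [["a", "b"], ["b", "c"], ["c", "d"]]
partials = [
    reduce(param.add_accumulator, part, param.zero(set()))
    for part in partitions
]
total = reduce(param.add_in_place, partials, param.zero(set()))
print(sorted(total))  # -> ['a', 'b', 'c', 'd']
```

For a real HLL, `zero` would return an empty sketch, `addAccumulator` would offer a value to it, and `addInPlace` would merge two sketches — HLL sketches merge cleanly, which is what makes them a good fit for this contract.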
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Sharing-object-state-accross-transformations-tp25544.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.