Excuse me - I should have mentioned: I am running Spark 1.4.1, Scala 2.11.
I am running in streaming mode, receiving data from Kafka.
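
For reference, the windowed reduce is wired up roughly like the sketch below. The key class, the function names ('reduceAdd', 'inverseReduce'), the durations, and the checkpoint path are placeholders matching the description in the quoted message, not our exact code:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.DStream

// Hypothetical case-class key standing in for the key class we actually use.
case class EventKey(host: String, metric: String)

object WindowedCounts {
  // 'reduceAdd': merge each new per-batch map into the running windowed map.
  def reduceAdd(left: Map[String, Long], right: Map[String, Long]): Map[String, Long] =
    right.foldLeft(left) { case (acc, (k, v)) => acc.updated(k, acc.getOrElse(k, 0L) + v) }

  // 'inverseReduce': subtract the map for the batch leaving the window, dropping zeroed entries.
  def inverseReduce(current: Map[String, Long], old: Map[String, Long]): Map[String, Long] =
    old.foldLeft(current) { case (acc, (k, v)) =>
      val remaining = acc.getOrElse(k, 0L) - v
      if (remaining == 0L) acc - k else acc.updated(k, remaining)
    }

  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("WindowedCounts"), Seconds(10))
    ssc.checkpoint("/tmp/checkpoint") // the invertible form of reduceByKeyAndWindow requires checkpointing

    // Placeholder for the keyed DStream we actually build from the Kafka input.
    val events: DStream[(EventKey, Map[String, Long])] = ???

    val windowed = events.reduceByKeyAndWindow(
      reduceAdd _,
      inverseReduce _,
      Seconds(300), // window length (example value)
      Seconds(10),  // slide interval (example value)
      filterFunc = (kv: (EventKey, Map[String, Long])) => kv._2.nonEmpty // drop keys whose map is empty
    )

    windowed.print()
    ssc.start()
    ssc.awaitTermination()
  }
}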

Regards,

Bryan Jeffrey

On Mon, Feb 1, 2016 at 9:19 PM, Bryan Jeffrey <bryan.jeff...@gmail.com>
wrote:

> Hello.
>
> I have a reduceByKeyAndWindow call with an invertible reduce function and a
> filter function defined.  I am seeing an error as follows:
>
> "Neither previous window has value for key, nor new values found. Are you
> sure your key classhashes consistently?"
>
> We're using case classes, and so I am sure we're doing consistent
> hashing.  The 'reduceAdd' function is adding to a map. The
> 'inverseReduceFunction' is subtracting from the map. The filter function is
> removing items where the number of entries in the map is zero.  Has anyone
> seen this error before?
>
> Regards,
>
> Bryan Jeffrey
>
>
