I'm missing something simpler (I think). That is, why do I need a Some instead
of Tuple2? Because a Some might or might not be there, but a Tuple2 must be
there? Or something like that?
From: Adrian Mocanu <amoc...@verticalscope.com>
To: ...@spark.incubator.apache.org
Subject: Re: "overloaded method value updateStateByKey ... cannot be applied to
..." when Key is a Tuple2

You are correct; the filtering I’m talking about i
Adrian, do you know if this is documented somewhere? I was also under the
impression that setting a key's value to None would cause the key to be
discarded from the state.
> Sent: November-12-14 2:25 PM
> To: u...@spark.incubator.apache.org
> Subject: Re: "overloaded method value updateStateByKey ... cannot be
> applied to ..." when Key is a Tuple2
>
> After comparing with previous code, I got it to work by making the return a
> Some instead of Tuple2. Perhaps some day I will understand this.
spr wrote
> --code
> // (currentCount + previousCount, Seq(minTime, newMinTime).min)  // <== old
> Some(currentCount + previousCount, Seq(minTime, newMinTime).min) // <== new
> }
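For context on why the compiler insists on Some rather than a bare Tuple2: the update function passed to updateStateByKey must return Option[S], where Some(s) keeps the key with new state s and None removes the key from the state. Below is a minimal, Spark-free sketch of an update function with that contract. The names (updateDnsCount, the (count, minTime) state) follow the thread, but the body is an assumed reconstruction, not the original code.

```scala
// Sketch of an update function matching the shape updateStateByKey expects:
// (Seq[V], Option[S]) => Option[S]. State here is (count, minTime), as
// suggested by the snippet in the thread.
type State = (Int, Long) // (count, minTime)

def updateDnsCount(newValues: Seq[(Int, Long)], prev: Option[State]): Option[State] = {
  if (newValues.isEmpty && prev.isEmpty) {
    None // returning None drops the key from the state entirely
  } else {
    val currentCount = newValues.map(_._1).sum
    val newMinTime   = (newValues.map(_._2) :+ Long.MaxValue).min
    val (previousCount, minTime) = prev.getOrElse((0, Long.MaxValue))
    // A bare Tuple2 here would not type-check: the contract is Option[State]
    Some((currentCount + previousCount, math.min(minTime, newMinTime)))
  }
}

// Usage: the first batch creates state, later batches fold into it
val s1 = updateDnsCount(Seq((2, 100L), (3, 50L)), None) // Some((5, 50))
val s2 = updateDnsCount(Seq((1, 40L)), s1)              // Some((6, 40))
val s3 = updateDnsCount(Seq.empty, None)                // None: key dropped
```

Returning a Tuple2 directly fails because the two overloads of updateStateByKey both require Option in the return type, which is what lets the state machinery distinguish "keep this key" from "forget this key".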
[error] var DnsSvrCum = DnsSvr.updateStateByKey[(Int, Time)](updateDnsCount)
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/overloaded-method-value-updateStateByKey-cannot-be-applied-to-when-Key-is-a-Tuple2-tp1864