Shixiong Zhu created SPARK-4481:
-----------------------------------

             Summary: Some comments for `updateStateByKey` are wrong
                 Key: SPARK-4481
                 URL: https://issues.apache.org/jira/browse/SPARK-4481
             Project: Spark
          Issue Type: Documentation
          Components: Streaming
    Affects Versions: 1.1.0
            Reporter: Shixiong Zhu


The following `updateStateByKey` overloads have a wrong description of `updateFunc`:

{code:java}
  /**
   * @param updateFunc State update function. If `this` function returns None, then
   *                   corresponding state key-value pair will be eliminated. Note, that
   *                   this function may generate a different a tuple with a different key
   *                   than the input key. It is up to the developer to decide whether to
   *                   remember the partitioner despite the key being changed.
   */
  def updateStateByKey[S: ClassTag](
      updateFunc: (Iterator[(K, Seq[V], Option[S])]) => Iterator[(K, S)],
      partitioner: Partitioner,
      rememberPartitioner: Boolean
    ): DStream[(K, S)]

  /**
   * @param updateFunc State update function. If `this` function returns None, then
   *                   corresponding state key-value pair will be eliminated. Note, that
   *                   this function may generate a different a tuple with a different key
   *                   than the input key. It is up to the developer to decide whether to
   *                   remember the partitioner despite the key being changed.
   */
  def updateStateByKey[S: ClassTag](
      updateFunc: (Iterator[(K, Seq[V], Option[S])]) => Iterator[(K, S)],
      partitioner: Partitioner,
      rememberPartitioner: Boolean,
      initialRDD: RDD[(K, S)]
    ): DStream[(K, S)]
{code}

The sentence "If `this` function returns None, then corresponding state key-value pair will be eliminated." should be removed: in these overloads `updateFunc` returns an `Iterator[(K, S)]`, not an `Option[S]`, so it cannot return None. A state key-value pair is eliminated by omitting its key from the returned iterator.
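For context, a minimal sketch (plain Scala, no Spark dependency; the object and key/value types are made up for illustration) of what an `updateFunc` with this iterator-based signature actually does, showing that state is dropped by omission rather than by returning None:

```scala
// Sketch of an updateFunc matching the shape in the overloads above:
// Iterator[(K, Seq[V], Option[S])] => Iterator[(K, S)].
// A state pair is eliminated by leaving its key out of the output iterator.
object UpdateFuncSketch {
  type K = String
  type V = Int
  type S = Int

  val updateFunc: Iterator[(K, Seq[V], Option[S])] => Iterator[(K, S)] =
    records => records.flatMap { case (key, values, state) =>
      val newState = state.getOrElse(0) + values.sum
      // Hypothetical rule: a zero state means the key's state is eliminated,
      // which we express by emitting nothing for that key.
      if (newState == 0) Iterator.empty else Iterator((key, newState))
    }

  def main(args: Array[String]): Unit = {
    val input = Iterator(
      ("a", Seq(1, 2), Some(3)),     // updated: 3 + 1 + 2 = 6
      ("b", Seq.empty[Int], Some(0)) // eliminated: omitted from the output
    )
    println(updateFunc(input).toList) // List((a,6))
  }
}
```

Note the function never returns None anywhere; "returning None" only makes sense for the other `updateStateByKey` overloads whose `updateFunc` has result type `Option[S]`.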



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
