Github user koeninger commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14340#discussion_r72251359
  
    --- Diff: external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/KafkaUtils.scala ---
    @@ -286,6 +285,30 @@ private[kafka010] class KafkaUtilsPythonHelper extends Logging {
         val kafkaRDD = kafkaRDDs.head.asInstanceOf[KafkaRDD[_, _]]
         kafkaRDD.offsetRanges.toSeq.asJava
       }
    +
    +  def commitAsyncForKafkaDStream(dstream: DStream[_], offsetRanges: ju.List[OffsetRange]): Unit = {
    +    val dstreams = new mutable.HashSet[DStream[_]]()
    +
    +    def visit(parent: DStream[_]): Unit = {
    +      val parents = parent.dependencies
    +      parents.filterNot(dstreams.contains).foreach { p =>
    +        dstreams.add(p)
    +        visit(p)
    +      }
    +    }
    +    visit(dstream)
    +
    +    val kafkaDStreams = dstreams.filter(s => s.isInstanceOf[DirectKafkaInputDStream[_, _]])
    +    require(
    +      kafkaDStreams.size == 1,
    +      "Cannot commit offset ranges to DirectKafkaInputDStream, as there may be multiple Kafka " +
    --- End diff --
    
    It's slightly odd to refer to private classes in an error message, but it may make more sense than referring to the CanCommitOffsets / HasOffsetRanges interfaces. From a Python user's point of view, which do you think is more likely to make sense?
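    For context, the traversal in the diff walks the DStream dependency graph depth-first, de-duplicating visited nodes with a mutable set, and then requires that exactly one Kafka-backed stream was found. A minimal standalone sketch of the same pattern, using a toy `Node` class in place of Spark's `DStream` (the names here are illustrative stand-ins, not Spark APIs):
    
    ```scala
    import scala.collection.mutable
    
    // Toy stand-in for DStream: each node knows its upstream dependencies.
    case class Node(name: String, isKafka: Boolean, dependencies: Seq[Node])
    
    // Collect every ancestor of `root` exactly once (same shape as the
    // visit() helper in the diff), then keep only the Kafka-backed ones.
    def findKafkaSources(root: Node): Set[Node] = {
      val seen = new mutable.HashSet[Node]()
      def visit(parent: Node): Unit =
        parent.dependencies.filterNot(seen.contains).foreach { p =>
          seen.add(p)
          visit(p)
        }
      visit(root)
      seen.filter(_.isKafka).toSet
    }
    
    // Example chain: mapped <- windowed <- kafka
    val kafka    = Node("kafka", isKafka = true, Nil)
    val windowed = Node("windowed", isKafka = false, Seq(kafka))
    val mapped   = Node("mapped", isKafka = false, Seq(windowed))
    
    val sources = findKafkaSources(mapped)
    require(sources.size == 1, s"expected exactly one Kafka source, got ${sources.size}")
    ```
    
    Note that, as in the diff, only ancestors of the starting node are collected, never the starting node itself, so the traversal assumes the caller passes a derived stream rather than the Kafka input stream directly.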

