Github user jose-torres commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20253#discussion_r161359025
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/continuous/ContinuousDataSourceRDDIter.scala
 ---
    @@ -177,6 +176,7 @@ class DataReaderThread(
       private[continuous] var failureReason: Throwable = _
     
       override def run(): Unit = {
    +    TaskContext.setTaskContext(context)
    --- End diff --
    
    TaskContext is stored in a ThreadLocal, so it does not automatically carry over to a thread we spawn ourselves. We need to be able to retrieve it in the Kafka data reader to check whether an interrupt has arrived.
    
    I'd argue this isn't an abstraction failure in the V2 API, but a workaround for the fact that Kafka consumers are badly behaved and do not respect interrupts.
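    The ThreadLocal behavior this hinges on can be illustrated without Spark. The sketch below (plain Java, with hypothetical names; `ThreadLocalDemo` and `readInChild` are not from the PR) shows that a value set on the parent thread is invisible to a child thread unless the child re-sets it itself, which is exactly what the added `TaskContext.setTaskContext(context)` line does for the reader thread.

```java
// Minimal sketch: a ThreadLocal set on the parent thread is NOT visible
// from a child thread unless the child re-sets it (the setTaskContext pattern).
public class ThreadLocalDemo {
    static final ThreadLocal<String> context = new ThreadLocal<>();

    // Spawns a child thread and returns what the child sees in the ThreadLocal.
    // If reset is true, the child re-sets the value first, analogous to
    // calling TaskContext.setTaskContext(context) at the top of run().
    static String readInChild(boolean reset) throws InterruptedException {
        context.set("parent-task-context");
        final String[] seen = new String[1];
        Thread child = new Thread(() -> {
            if (reset) {
                context.set("parent-task-context"); // re-establish in this thread
            }
            seen[0] = context.get();
        });
        child.start();
        child.join();
        return seen[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(readInChild(false)); // null: ThreadLocal does not cross threads
        System.out.println(readInChild(true));  // parent-task-context
    }
}
```

    Without the re-set, `TaskContext.get()` inside the reader thread would return null, and the Kafka reader would have no way to poll for the interrupt flag.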


---
