rdblue commented on issue #24129: [SPARK-27190][SQL] add table capability for streaming
URL: https://github.com/apache/spark/pull/24129#issuecomment-483357024
 
 
   I've already suggested several ways to add checks (a rough sketch of these checks follows the list):
   * For any streaming scan of a V2Relation, validate that the underlying table 
supports either micro-batch or continuous mode.
   * For any streaming write to a V2Relation, validate that the underlying 
table supports streaming writes.
   * For the set of streaming scans in the plan, validate that the intersection of supported modes is non-empty. For example, fail if s1 supports only micro-batch and s2 supports only continuous.
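
   For illustration, here is a minimal, self-contained Scala sketch of those three checks. The `Capability` and `Table` types below are simplified stand-ins (not Spark's `TableCapability` enum or its internal relation classes), and the method names are hypothetical; this only shows the shape of the validation logic.

```scala
object StreamingCapabilityCheckSketch {

  // Simplified stand-ins for the streaming capabilities this PR is about.
  sealed trait Capability
  case object MicroBatchRead extends Capability
  case object ContinuousRead extends Capability
  case object StreamingWrite extends Capability

  // Simplified stand-in for a v2 table that reports what it supports.
  final case class Table(name: String, capabilities: Set[Capability])

  private def failAnalysis(msg: String): Nothing =
    throw new IllegalArgumentException(msg)

  // Check 1: a streaming scan is only valid if the underlying table supports
  // at least one streaming read mode (micro-batch or continuous).
  def checkStreamingScan(table: Table): Unit =
    if (!table.capabilities(MicroBatchRead) && !table.capabilities(ContinuousRead)) {
      failAnalysis(s"Table ${table.name} does not support streaming reads")
    }

  // Check 2: a streaming write is only valid if the underlying table supports
  // streaming writes.
  def checkStreamingWrite(table: Table): Unit =
    if (!table.capabilities(StreamingWrite)) {
      failAnalysis(s"Table ${table.name} does not support streaming writes")
    }

  // Check 3: all streaming sources in one plan must share at least one read
  // mode; fail if, e.g., s1 supports only micro-batch and s2 only continuous.
  def checkCommonReadMode(sources: Seq[Table]): Unit = {
    val readModes: Set[Capability] = Set(MicroBatchRead, ContinuousRead)
    val common = sources
      .map(_.capabilities intersect readModes)
      .reduceOption(_ intersect _)
      .getOrElse(readModes)
    if (common.isEmpty) {
      failAnalysis("Streaming sources do not share a common execution mode")
    }
  }
}
```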
   
   > the batch scan capability check is done in DataFrameReader, not an analyzer rule
   
   This isn't true. Validation is done by [`V2WriteSupportCheck`](https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2WriteSupportCheck.scala), which is an analyzer check rule, so it catches all plans, not just the plans produced by the DataFrame APIs.
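
   On the "analyzer rule" point: a check like `V2WriteSupportCheck` is essentially a `LogicalPlan => Unit` function registered as an analyzer check rule, which is why it runs for every analyzed plan regardless of entry point. As a rough sketch of that shape (the class name is hypothetical and the rule body is a placeholder; the built-in rule is wired into the session's check rules rather than through an extension):

```scala
import org.apache.spark.sql.SparkSessionExtensions
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

// Hypothetical extension that installs a streaming-capability check rule.
// Because it is an analyzer check rule, it sees plans from SQL and other
// entry points, not only plans built through the DataFrame APIs.
class StreamingCapabilityExtensions extends (SparkSessionExtensions => Unit) {
  override def apply(extensions: SparkSessionExtensions): Unit = {
    extensions.injectCheckRule { _ =>
      (plan: LogicalPlan) =>
        plan.foreach {
          // Placeholder: match the v2 streaming relation and write nodes here
          // and fail analysis when the required capabilities are missing.
          case _ => ()
        }
    }
  }
}
```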
