HeartSaVioR edited a comment on issue #24738: [WIP][SPARK-23098][SQL] Migrate 
Kafka Batch source to v2.
URL: https://github.com/apache/spark/pull/24738#issuecomment-497677198
 
 
   It looks like the Table interface requires a **static schema** shared by read and write, for both batch and streaming. That holds for many table-like sources, or sources with a simple schema, but it doesn't hold for the Kafka source, whose reader and writer have different schemas.
   
   Another thing I've observed: although the Table interface requires a schema, Spark doesn't throw an error on a streaming write whose data doesn't match that schema (unlike a batch write). For a streaming query, the `schema` method doesn't appear to be referenced at all.
   
   @cloud-fan @rdblue Could you please review this specific case for the Kafka source? I'd like to know whether it is a missed spot or a known limitation.
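   To make the mismatch concrete, here is a minimal sketch (plain Java with hypothetical stand-ins, not the actual Spark DSv2 classes) of why a single `schema()` method is awkward for Kafka: the columns a Kafka reader produces are not the same set a Kafka writer accepts, so one static schema cannot describe both directions.

   ```java
   import java.util.Arrays;
   import java.util.HashSet;
   import java.util.List;

   // Hypothetical stand-in for a Table whose read and write schemas differ.
   // Column lists model Spark's Kafka source: the reader emits metadata
   // columns (partition, offset, ...) that the writer does not accept.
   class SchemaSketch {
       static final List<String> READ_SCHEMA = Arrays.asList(
           "key", "value", "topic", "partition", "offset",
           "timestamp", "timestampType");
       static final List<String> WRITE_SCHEMA = Arrays.asList(
           "key", "value", "topic");

       // Sketch of the single schema() the Table contract exposes; it can
       // only report one of the two shapes (here, the read side).
       static List<String> schema() {
           return READ_SCHEMA;
       }

       public static void main(String[] args) {
           // The write columns are a subset of the read columns...
           System.out.println(new HashSet<>(schema()).containsAll(WRITE_SCHEMA)); // prints true
           // ...but the two schemas are not equal, so one schema() cannot
           // faithfully describe both read and write.
           System.out.println(schema().equals(WRITE_SCHEMA)); // prints false
       }
   }
   ```

   The column names above are illustrative of the Kafka source's shape; the real interfaces live under `org.apache.spark.sql.sources.v2` and use `StructType` rather than plain string lists.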

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
