[ https://issues.apache.org/jira/browse/FLINK-12256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17036279#comment-17036279 ]

Leonard Xu commented on FLINK-12256:
------------------------------------

Hi [~phoenixjiangnan],
I found that the SQL Kafka connector cannot consume Avro data that was serialized by 
`KafkaAvroSerializer`; it can only consume Row data with an Avro schema, because we 
use `AvroRowDeserializationSchema`/`AvroRowSerializationSchema` to serialize/deserialize data in 
`AvroRowFormatFactory`.

If I'm right, I think we should consider this. What do you think?
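
For illustration only, here is a minimal sketch (not existing Flink code; the class name is hypothetical) of one way to bridge the two formats: a `DeserializationSchema<Row>` that strips the Confluent wire-format header written by `KafkaAvroSerializer` (one magic byte followed by a 4-byte schema id) and then hands the remaining Avro payload to `AvroRowDeserializationSchema`. It assumes the writer schema equals the reader schema given to the constructor, so it ignores the schema id and does not support schema evolution.

{code:java}
import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.formats.avro.AvroRowDeserializationSchema;
import org.apache.flink.types.Row;

/**
 * Hypothetical wrapper: strips the Confluent wire-format header
 * (1 magic byte + 4-byte schema id) produced by KafkaAvroSerializer and
 * delegates the remaining Avro bytes to AvroRowDeserializationSchema.
 * Assumes the writer schema matches the given reader schema, so the
 * schema id is ignored and schema evolution is not handled.
 */
public class ConfluentWireFormatRowDeserializationSchema implements DeserializationSchema<Row> {

    private static final long serialVersionUID = 1L;
    private static final byte MAGIC_BYTE = 0x0;

    private final AvroRowDeserializationSchema inner;

    public ConfluentWireFormatRowDeserializationSchema(String avroSchemaString) {
        this.inner = new AvroRowDeserializationSchema(avroSchemaString);
    }

    @Override
    public Row deserialize(byte[] message) throws IOException {
        ByteBuffer buffer = ByteBuffer.wrap(message);
        if (buffer.get() != MAGIC_BYTE) {
            throw new IOException("Record was not written with KafkaAvroSerializer (unknown magic byte).");
        }
        buffer.getInt(); // skip the 4-byte schema id registered in the Schema Registry
        byte[] avroPayload = new byte[buffer.remaining()];
        buffer.get(avroPayload);
        return inner.deserialize(avroPayload);
    }

    @Override
    public boolean isEndOfStream(Row nextElement) {
        return false;
    }

    @Override
    public TypeInformation<Row> getProducedType() {
        return inner.getProducedType();
    }
}
{code}
A proper fix would presumably resolve the schema id against the registry instead of discarding it, so that records written with older schema versions can still be read.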

> Implement Confluent Schema Registry Catalog
> -------------------------------------------
>
>                 Key: FLINK-12256
>                 URL: https://issues.apache.org/jira/browse/FLINK-12256
>             Project: Flink
>          Issue Type: New Feature
>          Components: Connectors / Kafka, Table SQL / Client
>    Affects Versions: 1.9.0
>            Reporter: Artsem Semianenka
>            Assignee: Bowen Li
>            Priority: Major
>             Fix For: 1.11.0
>
>
>  KafkaReadableCatalog is a special implementation of the ReadableCatalog 
> interface (which was introduced in 
> [FLIP-30|https://cwiki.apache.org/confluence/display/FLINK/FLIP-30%3A+Unified+Catalog+APIs]
>  ) to retrieve meta information, such as the topic name and the schema of the topic, from 
> Apache Kafka and Confluent Schema Registry. 
> The new ReadableCatalog allows a user to run SQL queries like:
> {code:java}
> Select * from kafka.topic_name  
> {code}
> without the need to manually define the table schema.
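
The description above says the catalog should pull the topic name and schema from Apache Kafka and Confluent Schema Registry. As a rough illustration of the registry lookups such a catalog would depend on (the registry URL, topic name, and class name below are placeholders, not part of the proposal), a small example using Confluent's `CachedSchemaRegistryClient`:

{code:java}
import java.io.IOException;
import java.util.Collection;

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException;

/**
 * Illustration only: list the registered "<topic>-value" subjects and fetch
 * the latest Avro schema for one of them. A catalog implementation would map
 * the subjects to table names and translate the Avro schema into a Flink
 * TableSchema instead of printing them.
 */
public class SchemaRegistryLookupExample {

    public static void main(String[] args) throws IOException, RestClientException {
        // Placeholder registry URL; 100 is the client's identity-map capacity.
        CachedSchemaRegistryClient client =
                new CachedSchemaRegistryClient("http://localhost:8081", 100);

        // Topics serialized with KafkaAvroSerializer typically register a
        // "<topic>-value" subject for the record value.
        Collection<String> subjects = client.getAllSubjects();
        System.out.println("Registered subjects: " + subjects);

        // Resolve the latest Avro schema for one topic's value subject.
        SchemaMetadata metadata = client.getLatestSchemaMetadata("topic_name-value");
        System.out.println("Latest schema for topic_name: " + metadata.getSchema());
    }
}
{code}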



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
