Re: upgrade kafka-schema-registry-client to 6.1.2 for Flink-avro-confluent-registry

2021-06-25 Thread Lian Jiang
Thanks Fabian. FLINK-23160 (upgrade kafka-schema-registry-client to 6.1.2
for flink-avro-confluent-registry)
<https://issues.apache.org/jira/browse/FLINK-23160> has been created. I will
create a private build of Flink to try out the fix. If it goes well, I can
contribute it back. Thanks. Regards!

On Fri, Jun 25, 2021 at 2:02 AM Fabian Paul 
wrote:

> Hi,
>
> Thanks for bringing this up. It looks to me like something we definitely
> want to fix. Unfortunately, I also do not see an easy workaround
> besides building your own flink-avro-confluent-registry and bumping the
> dependency.
>
> Can you create a JIRA ticket for bumping the dependencies, and would you be
> willing to work on this? A few things are still a bit unclear,
> e.g. whether the newer Confluent Schema Registry versions are compatible with our
> Kafka version (2.4.1).
>
> Best,
> Fabian





Re: upgrade kafka-schema-registry-client to 6.1.2 for Flink-avro-confluent-registry

2021-06-25 Thread Fabian Paul
Hi,

Thanks for bringing this up. It looks to me like something we definitely want 
to fix. Unfortunately, I also do not see an easy workaround
besides building your own flink-avro-confluent-registry and bumping the 
dependency.

Can you create a JIRA ticket for bumping the dependencies, and would you be
willing to work on this? A few things are still a bit unclear,
e.g. whether the newer Confluent Schema Registry versions are compatible with our Kafka
version (2.4.1).

Best,
Fabian
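
For anyone attempting that workaround, a quick sanity check after the rebuild is to log which kafka-schema-registry-client jar actually ends up on the job's classpath. A minimal sketch, assuming the client is not relocated/shaded in your build (CachedSchemaRegistryClient is the standard client class shipped with the client artifact; jar path and manifest metadata vary by packaging):

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import java.security.CodeSource;

public class RegistryClientVersionCheck {
    public static void main(String[] args) {
        // Where the client class was loaded from; the jar file name usually
        // reveals which kafka-schema-registry-client version was bundled.
        CodeSource src = CachedSchemaRegistryClient.class
                .getProtectionDomain().getCodeSource();
        System.out.println("kafka-schema-registry-client jar: "
                + (src != null ? src.getLocation() : "unknown (no code source)"));
        // Implementation-Version may be null if the jar manifest does not set it.
        System.out.println("Implementation-Version: "
                + CachedSchemaRegistryClient.class.getPackage().getImplementationVersion());
    }
}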

upgrade kafka-schema-registry-client to 6.1.2 for Flink-avro-confluent-registry

2021-06-24 Thread Lian Jiang
Hi,

I am using ConfluentRegistryAvroSerializationSchema and am blocked by the bug
described in https://github.com/confluentinc/schema-registry/issues/1352,
whose final fix was merged in
https://github.com/confluentinc/schema-registry/pull/1839. I did not find
any easy workaround.

flink-avro-confluent-registry (
https://mvnrepository.com/artifact/org.apache.flink/flink-avro-confluent-registry)
still uses kafka-schema-registry-client 5.5.2
<https://mvnrepository.com/artifact/io.confluent/kafka-schema-registry-client>,
which does not contain this fix. Could the next Flink release upgrade this
dependency to 6.1.2, which has the fix? Meanwhile, does the Flink community have
any recommended workaround until the dependency is upgraded? Thanks for any help!
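
For context, a minimal sketch of how the serialization schema is typically wired up (the topic, subject, registry URL, and broker address below are placeholders, and the FlinkKafkaProducer wiring is just one possible setup): every serialize() call goes through the bundled kafka-schema-registry-client, which is where the bug above surfaces.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroSerializationSchema;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

import java.util.Properties;

public class RegistrySinkSketch {
    public static FlinkKafkaProducer<GenericRecord> buildProducer(Schema avroSchema) {
        // Placeholder values, for illustration only.
        String topic = "my-topic";
        String subject = "my-topic-value";
        String registryUrl = "http://schema-registry:8081";

        // Serializes GenericRecord and registers/looks up the schema via the
        // bundled kafka-schema-registry-client.
        ConfluentRegistryAvroSerializationSchema<GenericRecord> serializer =
                ConfluentRegistryAvroSerializationSchema.forGeneric(subject, avroSchema, registryUrl);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "kafka:9092"); // placeholder

        return new FlinkKafkaProducer<>(topic, serializer, props);
    }
}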


Re: Avro and Kafka Schema Registry Client versions out of date

2020-07-09 Thread Arvid Heise
Hi Lucas,

Avro 1.9.2 [1] support depends on Hive upgrading as well [2]. You could cast a
vote on both tickets to accelerate it.

Schema Registry 5.5.0 depends on Avro 1.9.2, and I'm not sure what the
implications of a downgrade are.

Of course, you could build the module yourself with 5.5.0, test, and report
back. [3]

[1] https://issues.apache.org/jira/browse/FLINK-12532
[2] https://issues.apache.org/jira/browse/HIVE-21737
[3]
https://github.com/apache/flink/blob/master/flink-formats/flink-avro-confluent-registry/pom.xml#L33

On Thu, Jul 9, 2020 at 8:09 PM Lucas Heimberg 
wrote:

> Hello,
>
> I noticed that even in Flink 1.11, Avro in flink-avro and the Kafka Schema
> Registry client in flink-avro-confluent-registry are still at versions 1.8.2
> and 4.1.0, respectively.
>
> Avro 1.9.2 brings a lot of improvements and bugfixes, in particular in
> respect to logical types.
> The Kafka Schema Registry Client 5.5.0 finally supports schema references,
> i.e., schemas that are composed from different subjects of the Schema
> Registry, which is a very useful feature for the reuse of schemas.
>
> I would like to ask if there are plans to bump both version numbers in the
> near future, or whether there are specific obstacles for that?
>
> Thank you very much & kind regards,
> Lucas
>
> --
> Dr. Lucas Heimberg
>


-- 

Arvid Heise | Senior Java Developer

<https://www.ververica.com/>

Follow us @VervericaData

--

Join Flink Forward <https://flink-forward.org/> - The Apache Flink
Conference

Stream Processing | Event Driven | Real Time

--

Ververica GmbH | Invalidenstrasse 115, 10115 Berlin, Germany

--
Ververica GmbH
Registered at Amtsgericht Charlottenburg: HRB 158244 B
Managing Directors: Timothy Alexander Steinert, Yip Park Tung Jason, Ji
(Toni) Cheng


Fwd: Avro and Kafka Schema Registry Client versions out of date

2020-07-09 Thread Lucas Heimberg
Hello,

I noticed that even in Flink 1.11, Avro in flink-avro and the Kafka Schema
Registry client in flink-avro-confluent-registry are still at versions 1.8.2
and 4.1.0, respectively.

Avro 1.9.2 brings a lot of improvements and bugfixes, in particular in
respect to logical types.
The Kafka Schema Registry Client 5.5.0 finally supports schema references,
i.e., schemas that are composed from different subjects of the Schema
Registry, which is a very useful feature for the reuse of schemas.

I would like to ask if there are plans to bump both version numbers in the
near future, or whether there are specific obstacles for that?

Thank you very much & kind regards,
Lucas

-- 
Dr. Lucas Heimberg


Re: Kafka Schema registry

2020-01-14 Thread aj
ConfluentRegistryAvroDeserializationSchema.forGeneric() requires a reader
schema. How can we use it to deserialize with the writer schema?
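
For reference, a minimal sketch of the forGeneric() call (the schema string and registry URL are placeholders): as described in the documentation excerpt quoted below, the writer's schema is looked up in the registry at runtime and each record is then converted to the statically supplied reader schema, so a reader schema still has to be provided.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;

public class ReaderSchemaSketch {
    public static ConfluentRegistryAvroDeserializationSchema<GenericRecord> build() {
        // Placeholder reader schema; in practice this is generated from your Avro
        // definition or fetched as the latest version from the registry.
        Schema readerSchema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"User\",\"fields\":"
                        + "[{\"name\":\"name\",\"type\":\"string\"}]}");

        // The writer schema is resolved at runtime from the schema id embedded in
        // each record; the decoded record is then converted to readerSchema.
        return ConfluentRegistryAvroDeserializationSchema.forGeneric(
                readerSchema, "http://schema-registry:8081"); // placeholder URL
    }
}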

On Fri, Sep 13, 2019 at 12:04 AM Lasse Nedergaard 
wrote:

> Hi Elias
>
> Thanks for letting me know. I have found it, but we also need the option to
> register Avro schemas and use the registry when we write to Kafka. So we
> will create a serialisation version and, when it works, implement it in
> Flink and create a pull request for the community.
>
> Med venlig hilsen / Best regards
> Lasse Nedergaard
>
>
> On 12 Sep 2019 at 17:45, Elias Levy wrote:
>
> Just for a Kafka source:
>
>
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema
>
>
>- There is also a version of this schema available that can look up the
>writer’s schema (the schema which was used to write the record) in Confluent
>Schema Registry
><https://docs.confluent.io/current/schema-registry/docs/index.html>.
>Using this deserialization schema, the record will be read with the schema
>that was retrieved from Schema Registry and transformed to a statically
>provided one (either through
>ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or
>ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
>
>
> On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard <
> lassenederga...@gmail.com> wrote:
>
>> Hi.
>> Does Flink have out-of-the-box support for the Kafka Schema Registry for both
>> sources and sinks?
>> If not, does anyone know about an implementation we can build on so we
>> can help make it generally available in a future release?
>>
>> Med venlig hilsen / Best regards
>> Lasse Nedergaard
>>
>>

-- 
Thanks & Regards,
Anuj Jain
Mob. : +91- 8588817877
Skype : anuj.jain07


Re: Kafka Schema registry

2019-09-12 Thread Lasse Nedergaard
Hi Elias

Thanks for letting me know. I have found it, but we also need the option to
register Avro schemas and use the registry when we write to Kafka. So we will
create a serialisation version and, when it works, implement it in Flink and
create a pull request for the community.

Med venlig hilsen / Best regards
Lasse Nedergaard


> On 12 Sep 2019 at 17:45, Elias Levy wrote:
> 
> Just for a Kafka source:
> 
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema
> 
> There is also a version of this schema available that can look up the writer’s
> schema (the schema which was used to write the record) in Confluent Schema
> Registry. Using this deserialization schema, the record will be read with the
> schema that was retrieved from Schema Registry and transformed to a
> statically provided one (either through
> ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or
> ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
> 
>> On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard  
>> wrote:
>> Hi. 
>> Does Flink have out-of-the-box support for the Kafka Schema Registry for both
>> sources and sinks?
>> If not, does anyone know about an implementation we can build on so we can
>> help make it generally available in a future release?
>> 
>> Med venlig hilsen / Best regards
>> Lasse Nedergaard
>> 


Re: Kafka Schema registry

2019-09-12 Thread Elias Levy
Just for a Kafka source:

https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema


   - There is also a version of this schema available that can look up the
   writer’s schema (the schema which was used to write the record) in Confluent
   Schema Registry
   <https://docs.confluent.io/current/schema-registry/docs/index.html>.
   Using this deserialization schema, the record will be read with the schema
   that was retrieved from Schema Registry and transformed to a statically
   provided one (either through
   ConfluentRegistryAvroDeserializationSchema.forGeneric(...) or
   ConfluentRegistryAvroDeserializationSchema.forSpecific(...)).
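
As a hedged illustration of the above (topic, group id, reader schema, broker address, and registry URL are placeholders, and FlinkKafkaConsumer is just one way to wire a Kafka source), the registry-aware deserialization schema plugs straight into the consumer:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

import java.util.Properties;

public class RegistrySourceSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder reader schema for the topic's records.
        Schema readerSchema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"User\",\"fields\":"
                        + "[{\"name\":\"name\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "kafka:9092"); // placeholder
        props.setProperty("group.id", "example-group");       // placeholder

        // The deserialization schema fetches the writer schema from the registry
        // and converts each record to the reader schema above.
        FlinkKafkaConsumer<GenericRecord> consumer = new FlinkKafkaConsumer<>(
                "my-topic",
                ConfluentRegistryAvroDeserializationSchema.forGeneric(
                        readerSchema, "http://schema-registry:8081"),
                props);

        DataStream<GenericRecord> stream = env.addSource(consumer);
        stream.print();
        env.execute("confluent-registry-source-sketch");
    }
}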


On Wed, Sep 11, 2019 at 1:48 PM Lasse Nedergaard 
wrote:

> Hi.
> Does Flink have out-of-the-box support for the Kafka Schema Registry for both
> sources and sinks?
> If not, does anyone know about an implementation we can build on so we can
> help make it generally available in a future release?
>
> Med venlig hilsen / Best regards
> Lasse Nedergaard
>
>