Re: state schema evolution for case classes

2020-04-02 Thread Tzu-Li (Gordon) Tai
Hi, I see the problem that you are bumping into as the following: - In your previous job, you seem to be falling back to Kryo for the state serialization. - In your new job, you are trying to change that to use a custom serializer. You can confirm this by looking at the stack trace of the "new
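A quick way to confirm the Kryo fallback described here, without waiting for a restore-time stack trace, is to disable generic types so that job-graph construction fails as soon as any type would fall back to Kryo. A minimal sketch using the standard ExecutionConfig switch (environment setup assumed):

  import org.apache.flink.streaming.api.scala._

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  // Fails at job-graph construction time if any stream or state type resolves to
  // GenericTypeInfo, i.e. would be serialized with Kryo.
  env.getConfig.disableGenericTypes()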

Re: state schema evolution for case classes

2020-04-02 Thread Apoorv Upadhyay
Hi Gordon, thanks for your response. I have done a POC on state migration using Avro, and it seems to work out well. I am using a custom Avro serializer (with an Avro schema and a (TypeSerializer, TypeSerializerSnapshot) pair), and based on that I have written my own custom serializer for the Scala case class that
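The heart of the Avro-based approach is that Avro itself can decide whether a new (reader) schema can still read data written with the old (writer) schema. A self-contained sketch of that check, with hypothetical schemas standing in for the case class before and after the change:

  import org.apache.avro.{Schema, SchemaCompatibility}

  object SchemaCheck extends App {
    // Hypothetical schema as written into the savepoint
    val oldSchema = new Schema.Parser().parse(
      """{"type":"record","name":"MyCaseClass","fields":[
        |  {"name":"a","type":"boolean"}]}""".stripMargin)

    // New schema: an added field with a default, so old records remain readable
    val newSchema = new Schema.Parser().parse(
      """{"type":"record","name":"MyCaseClass","fields":[
        |  {"name":"a","type":"boolean"},
        |  {"name":"d","type":"string","default":""}]}""".stripMargin)

    val result = SchemaCompatibility.checkReaderWriterCompatibility(newSchema, oldSchema)
    println(result.getType) // COMPATIBLE
  }

A custom TypeSerializerSnapshot built on Avro typically stores the writer schema in the savepoint and runs exactly this kind of check in resolveSchemaCompatibility.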

Re: state schema evolution for case classes

2020-03-27 Thread Tzu-Li (Gordon) Tai
Hi Apoorv, sorry for the late reply; I have been quite busy with backlog items the past few days.

Re: state schema evolution for case classes

2020-03-20 Thread Apoorv Upadhyay
Thanks Gordon for the suggestion. I am going by this repo: https://github.com/mrooding/flink-avro-state-serialization. So far I am able to alter the Scala case classes and restore from a savepoint using the memory state backend, but when I use RocksDB as the state backend and try to restore
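For reference, switching the POC from the memory state backend to RocksDB is a small change (the checkpoint URI below is a placeholder; the flink-statebackend-rocksdb dependency is assumed). The heap-based backends eagerly deserialize all state into objects on restore, while RocksDB keeps the serialized bytes and resolves serializer compatibility against the stored snapshot, which is one plausible reason a restore can appear to work on the memory backend yet fail on RocksDB.

  import org.apache.flink.contrib.streaming.state.RocksDBStateBackend
  import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

  val env = StreamExecutionEnvironment.getExecutionEnvironment
  // Placeholder URI; the second argument enables incremental checkpoints.
  env.setStateBackend(new RocksDBStateBackend("file:///tmp/flink-checkpoints", true))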

Re: state schema evolution for case classes

2020-03-17 Thread Tzu-Li (Gordon) Tai
Hi Apoorv, Flink currently does not natively support schema evolution for state types using Scala case classes [1]. So, as Roman has pointed out, there are two possible ways for you to do that: - Implementing a custom serializer that supports schema evolution for your specific Scala case classes,
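To make the custom-serializer option concrete, below is a minimal skeleton of a hand-written TypeSerializer for a hypothetical two-field case class (the names Device and DeviceSerializer are illustrative, not from the thread). As written it uses SimpleTypeSerializerSnapshot, which is only correct while the wire format never changes; supporting real schema evolution means swapping in a snapshot that records the written schema (for example an Avro schema, as in the approach above) and resolves compatibility against it.

  import org.apache.flink.api.common.typeutils.{SimpleTypeSerializerSnapshot, TypeSerializer, TypeSerializerSnapshot}
  import org.apache.flink.core.memory.{DataInputView, DataOutputView}

  // Hypothetical state type standing in for one of the real case classes.
  case class Device(id: String, active: Boolean)

  class DeviceSerializer extends TypeSerializer[Device] {
    override def isImmutableType: Boolean = true
    override def duplicate(): TypeSerializer[Device] = this        // stateless, safe to share
    override def createInstance(): Device = Device("", active = false)
    override def copy(from: Device): Device = from                 // immutable, no deep copy needed
    override def copy(from: Device, reuse: Device): Device = from
    override def getLength: Int = -1                                // variable-length records
    override def serialize(record: Device, target: DataOutputView): Unit = {
      target.writeUTF(record.id)
      target.writeBoolean(record.active)
    }
    override def deserialize(source: DataInputView): Device =
      Device(source.readUTF(), source.readBoolean())
    override def deserialize(reuse: Device, source: DataInputView): Device =
      deserialize(source)
    override def copy(source: DataInputView, target: DataOutputView): Unit =
      serialize(deserialize(source), target)
    override def snapshotConfiguration(): TypeSerializerSnapshot[Device] =
      new DeviceSerializerSnapshot
    override def equals(obj: Any): Boolean = obj.isInstanceOf[DeviceSerializer]
    override def hashCode(): Int = classOf[DeviceSerializer].hashCode
    // Still declared abstract on some older Flink versions; a harmless helper otherwise.
    def canEqual(obj: AnyRef): Boolean = obj.isInstanceOf[DeviceSerializer]
  }

  // Sufficient only while the binary layout above never changes; evolving Device
  // needs a snapshot that records the old schema and checks compatibility instead.
  class DeviceSerializerSnapshot extends SimpleTypeSerializerSnapshot[Device](
    new java.util.function.Supplier[TypeSerializer[Device]] {
      override def get(): TypeSerializer[Device] = new DeviceSerializer
    })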

Re: state schema evolution for case classes

2020-03-17 Thread Apoorv Upadhyay
Thanks a lot. Also, can you share an example where this has been implemented? I have gone through the docs, but it still does not work for me.

Re: state schema evolution for case classes

2020-02-26 Thread Khachatryan Roman
Hi Apoorv, You can achieve this by implementing custom serializers for your state. Please refer to https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/stream/state/custom_serialization.html Regards, Roman
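Once such a serializer exists (hand-written like the DeviceSerializer sketched earlier in this digest, or Avro-backed), it gets picked up by declaring the state with an explicit serializer rather than letting Flink derive one from the type. A short usage sketch reusing those hypothetical names:

  import org.apache.flink.api.common.state.ValueStateDescriptor

  // Flink snapshots this serializer together with the state and uses its
  // TypeSerializerSnapshot for compatibility checks on restore.
  val deviceState = new ValueStateDescriptor[Device]("device-state", new DeviceSerializer)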

Re: state schema evolution for case classes

2020-02-25 Thread Apoorv Upadhyay
Hi Roman, I have successfully migrated to Flink 1.8.2 with the savepoint created by Flink 1.6.2. Now I have to modify a few case classes due to a new requirement. I have created a savepoint, and when I run the app with the modified classes from that savepoint, it throws the error "state not compatible". Previously

Re: state schema evolution for case classes

2020-02-25 Thread Khachatryan Roman
Hi ApoorvK, I understand that you have a savepoint created by Flink 1.6.2 and you want to use it with Flink 1.8.2. The classes themselves weren't modified. Is that correct? Which serializer did you use? Regards, Roman

state schema evolution for case classes

2020-02-24 Thread ApoorvK
Hi Team, Earlier we developed on Flink 1.6.2, so there are lots of case classes which have a Map or a nested case class within them, for example the one below: case class MyCaseClass(var a: Boolean, var b: Boolean, var c: Boolean,
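The message is truncated above; purely as a hypothetical illustration (not the actual classes from the thread), the shape being described is a case class that carries a Map and a nested case class alongside its flat fields:

  // Illustrative only; field names beyond a, b, c are invented for the example.
  case class Inner(x: Int, y: String)
  case class MyCaseClass(var a: Boolean,
                         var b: Boolean,
                         var c: Boolean,
                         var tags: Map[String, String] = Map.empty,
                         var inner: Inner = Inner(0, ""))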