On Wed, Mar 31, 2021 at 05:33:19PM +0800, Tzu-Li (Gordon) Tai wrote:
> You can try using TypeInfo annotations to specify a TypeInformationFactory
> for your key class [1].
> This allows you to "plug in" the TypeInformation extracted by Flink for a
> given class. In that custom TypeInformation, you can then return the
> serializer that you want Flink to use for the key class.
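
To make that suggestion concrete, a rough and untested sketch of such a
factory could look like the following, with a hypothetical two-field key
MyKey standing in for the avrohugger-generated class. The factory hand-builds
the CaseClassTypeInfo / CaseClassSerializer that the Scala macro would
otherwise derive, so the plain Java type extraction used by the State
Processor API resolves the key to the same serializer:

import java.lang.reflect.Type
import java.util

import org.apache.flink.api.common.ExecutionConfig
import org.apache.flink.api.common.typeinfo.{TypeInfo, TypeInfoFactory, TypeInformation, Types}
import org.apache.flink.api.common.typeutils.TypeSerializer
import org.apache.flink.api.scala.typeutils.{CaseClassSerializer, CaseClassTypeInfo}

// Hypothetical stand-in for the avrohugger-generated key class.
// The @TypeInfo annotation makes the Java TypeExtractor consult the factory
// instead of falling back to its Avro-specific-record handling.
@TypeInfo(classOf[MyKeyTypeInfoFactory])
case class MyKey(id: String, bucket: Int)

class MyKeyTypeInfoFactory extends TypeInfoFactory[MyKey] {
  override def createTypeInfo(
      t: Type,
      genericParameters: util.Map[String, TypeInformation[_]]): TypeInformation[MyKey] = {
    // Hand-built equivalent of the TypeInformation the Scala macro derives for
    // the case class, so the key is (de)serialized with the CaseClassSerializer.
    val fieldTypes: Seq[TypeInformation[_]] = Seq(Types.STRING, Types.INT)
    new CaseClassTypeInfo[MyKey](
        classOf[MyKey], Array.empty[TypeInformation[_]], fieldTypes, Seq("id", "bucket")) {
      override def createSerializer(config: ExecutionConfig): TypeSerializer[MyKey] = {
        val fieldSerializers = fieldTypes.map(_.createSerializer(config)).toArray
        new CaseClassSerializer[MyKey](classOf[MyKey], fieldSerializers) {
          override def createInstance(fields: Array[AnyRef]): MyKey =
            MyKey(fields(0).asInstanceOf[String], fields(1).asInstanceOf[Int])
        }
      }
    }
  }
}

Whether that hand-built serializer is accepted as compatible with the one
already written into the savepoint should still be verified against the Flink
version in use.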
Hi CZ,
The issue here is that the Scala DataStream API uses Scala macros to decide
which serializer to use. Since those macros recognize Scala case classes, the
CaseClassSerializer will be used.
However, in the State Processor API, those Scala macros do not come into
play, and therefore the key class goes directly through Flink's Java type
extraction, which recognizes it as an Avro specific record and ends up with a
different serializer than the one your streaming job wrote the state with.
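
Given that, one way to line the two up is to hand the macro-derived
TypeInformation to the State Processor API explicitly, via the readKeyedState
overload that accepts TypeInformation for the key and output types. A rough,
untested sketch against the DataSet-based State Processor API (the key class,
state name, operator uid and savepoint path below are placeholders):

import org.apache.flink.api.common.state.{ValueState, ValueStateDescriptor}
import org.apache.flink.api.java.ExecutionEnvironment
import org.apache.flink.api.scala.createTypeInformation
import org.apache.flink.configuration.Configuration
import org.apache.flink.runtime.state.memory.MemoryStateBackend
import org.apache.flink.state.api.Savepoint
import org.apache.flink.state.api.functions.KeyedStateReaderFunction
import org.apache.flink.util.Collector

// Hypothetical stand-ins: the avrohugger-generated key class and the rows we emit.
case class MyKey(id: String, bucket: Int)
case class KeyAndCount(key: MyKey, count: Long)

class CountReader extends KeyedStateReaderFunction[MyKey, KeyAndCount] {
  @transient private var countState: ValueState[java.lang.Long] = _

  override def open(parameters: Configuration): Unit = {
    // Must match the descriptor used by the streaming job ("count" is a placeholder).
    countState = getRuntimeContext.getState(
      new ValueStateDescriptor[java.lang.Long]("count", classOf[java.lang.Long]))
  }

  override def readKey(key: MyKey,
                       ctx: KeyedStateReaderFunction.Context,
                       out: Collector[KeyAndCount]): Unit =
    out.collect(KeyAndCount(key, countState.value()))
}

object ReadSavepoint {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val savepoint =
      Savepoint.load(env, "file:///path/to/savepoint", new MemoryStateBackend())

    // Passing the macro-derived TypeInformation explicitly means the key is read
    // with the same CaseClassSerializer the streaming job used, rather than the
    // serializer the Java type extraction would pick for the Avro class.
    val counts = savepoint.readKeyedState(
      "my-operator-uid",
      new CountReader,
      createTypeInformation[MyKey],
      createTypeInformation[KeyAndCount])

    counts.print()
  }
}

Whether the resulting serializer is considered compatible with what the
savepoint contains depends on the Flink version, so this would need a quick
test against a copy of the savepoint.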
Hi,
Currently we use sbt-avrohugger [0] to generate the key class for our keyed
state. The key class generated by sbt-avrohugger is both a case class and an
Avro specific record (a simplified sketch of its shape is below). However, in
the following scenarios, Flink uses different serializers:
* In the streaming application, Flink uses the CaseClassSerializer for the key.
* In the State Processor API, Flink uses the AvroSerializer for the same key.
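
For reference, the generated key class has roughly the following shape (a
heavily simplified, hand-written stand-in, not actual avrohugger output).
Because it is a case class that also implements the Avro SpecificRecord API,
both serializers are plausible picks depending on which extraction path runs:

import org.apache.avro.Schema
import org.apache.avro.specific.SpecificRecordBase

// Simplified stand-in for the avrohugger "SpecificRecord" output:
// a Scala case class that also implements the Avro SpecificRecord API.
case class MyKey(var id: String, var bucket: Int) extends SpecificRecordBase {
  def this() = this("", 0)

  override def getSchema: Schema = MyKey.SCHEMA$

  override def get(field: Int): AnyRef = field match {
    case 0 => id
    case 1 => java.lang.Integer.valueOf(bucket)
    case _ => throw new org.apache.avro.AvroRuntimeException("Bad index")
  }

  override def put(field: Int, value: Any): Unit = field match {
    case 0 => id = value.toString
    case 1 => bucket = value.asInstanceOf[Int]
    case _ => throw new org.apache.avro.AvroRuntimeException("Bad index")
  }
}

object MyKey {
  val SCHEMA$ = new Schema.Parser().parse(
    """{"type":"record","name":"MyKey","namespace":"example","fields":[{"name":"id","type":"string"},{"name":"bucket","type":"int"}]}""")
}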