Hi Naehee,
The serializer for case classes is generated by the Scala macro that is
also responsible for implicitly extracting the TypeInformation in your
DataStream API program.
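For context, a minimal sketch of how that implicit extraction is
triggered (MyCaseClass and the job below are just placeholders, not from
your program):

import org.apache.flink.streaming.api.scala._

case class MyCaseClass(id: Long, name: String)

object TypeInfoSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // The wildcard import brings createTypeInformation (the macro) into
    // implicit scope, so TypeInformation[MyCaseClass] is derived at compile time.
    val stream: DataStream[MyCaseClass] = env.fromElements(MyCaseClass(1L, "a"))
    stream.print()
    env.execute("type-info-sketch")
  }
}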
It should be possible to use the POJO serializer with case classes. But
wouldn't it be easier to just use regular POJOs?
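For example, a case class shaped like the sketch below should satisfy
the POJO rules quoted further down in this thread (UserEvent is
illustrative). Note that the implicit macro would still pick the case
class serializer for it, so the POJO serializer has to be requested
explicitly:

// Mutable (var) fields plus an auxiliary no-arg constructor make this
// case class conform to Flink's POJO rules.
case class UserEvent(var userId: Long, var action: String) {
  def this() = this(0L, "")  // public no-argument constructor
}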
Hi Dawid,
Thanks for your reply. Good to know it is due to historic and compatibility
reasons.
The reason I started looking into the POJO rules is to understand
whether Scala case classes can conform to them and thus support schema
evolution. In our case, we store several Scala case classes in RocksDB.
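Roughly what I have in mind is a sketch like this (the state name is a
placeholder, and UserEvent refers to the POJO-shaped case class from
the sketch above): register the state with an explicitly constructed
TypeInformation so that, for a conforming class, Flink picks the POJO
serializer, which supports state schema evolution.

import org.apache.flink.api.common.state.ValueStateDescriptor
import org.apache.flink.api.common.typeinfo.TypeInformation

object StateSetup {
  // TypeInformation.of goes through the Java TypeExtractor instead of
  // the Scala macro; for a class that meets the POJO rules this yields
  // the POJO type information and serializer.
  val eventInfo: TypeInformation[UserEvent] = TypeInformation.of(classOf[UserEvent])
  val descriptor = new ValueStateDescriptor[UserEvent]("last-event", eventInfo)
}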
Hi Naehee,
The short answer would be: for historic and compatibility reasons. It
was implemented that way back in the days, and we don't want to change
the default type extraction logic. Otherwise, user jobs that rely on the
default type extraction logic for storing state would end up with an
incompatible serializer and could no longer restore their existing state.
According to the Flink doc,
Flink recognizes a data type as a POJO type (and allows “by-name” field
referencing) if the following conditions are fulfilled:
- The class is public and standalone (no non-static inner class)
- The class has a public no-argument constructor
- All non-static, non-transient fields in the class (and all
superclasses) are either public (and non-final) or have a public getter
and setter method that follows the Java beans naming conventions for
getters and setters.
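For instance, a minimal Scala class meeting all three conditions
(SensorReading is illustrative):

// Public standalone class with a no-arg constructor and public,
// non-final (var) fields: Flink recognizes this as a POJO.
class SensorReading {
  var sensorId: String = _
  var temperature: Double = 0.0
}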