Hi Steve,
The in-memory catalog does not persist metadata, so it needs to be repopulated
every time.
However, you can implement a catalog that persists the metadata to a file
or a database.
There is an ongoing effort to implement the Catalog interface on top of Hive's metastore.
A preview is available in the latest
Hi Fabian,
thank you for your answer; it is indeed the solution that I am currently
testing.
I use TypeInformation<Row> convert =
JsonRowSchemaConverter.convert(JSON_SCHEMA); provided by
flink-json, and pass the TypeInformation to the operator stream.
It seems to work :) with this solution.
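For reference, the pattern Steve describes could look roughly like this (a minimal sketch, assuming Flink 1.9 with the flink-json artifact on the classpath; the class name and the JSON schema are illustrative placeholders, not details from the thread):

```java
// Sketch: derive Flink TypeInformation for Row from a JSON schema string,
// using JsonRowSchemaConverter from the flink-json module.
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.formats.json.JsonRowSchemaConverter;
import org.apache.flink.types.Row;

public class JsonSchemaToTypeInfo {

    // Hypothetical schema describing the REST payload.
    static final String JSON_SCHEMA =
        "{ \"type\": \"object\","
        + " \"properties\": { \"id\": { \"type\": \"integer\" },"
        + "                   \"name\": { \"type\": \"string\" } } }";

    public static void main(String[] args) {
        // The resulting TypeInformation<Row> can then be handed to the
        // DataStream operator (e.g. via stream.returns(typeInfo)) or to a
        // JSON deserialization schema.
        TypeInformation<Row> typeInfo = JsonRowSchemaConverter.convert(JSON_SCHEMA);
        System.out.println(typeInfo);
    }
}
```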
Hi Steve,
Maybe you could implement a custom TableSource that queries the data from
the REST API and converts the JSON directly into a Row data type.
This would also avoid going through the DataStream API just for ingesting
the data.
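A custom TableSource along those lines might be sketched as follows (a rough outline against the Flink 1.9 interfaces; `RestApiTableSource`, `RestPollingSource`, and the polling logic are hypothetical names I've introduced for illustration, not code from the thread):

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.sources.StreamTableSource;
import org.apache.flink.types.Row;

// Hypothetical TableSource that polls a REST API and emits Rows directly,
// so no separate DataStream ingestion step is needed.
public class RestApiTableSource implements StreamTableSource<Row> {

    private final TypeInformation<Row> returnType;

    public RestApiTableSource(TypeInformation<Row> returnType) {
        // e.g. the TypeInformation produced by JsonRowSchemaConverter.convert(...)
        this.returnType = returnType;
    }

    @Override
    public DataStream<Row> getDataStream(StreamExecutionEnvironment env) {
        return env.addSource(new RestPollingSource(), returnType);
    }

    @Override
    public TypeInformation<Row> getReturnType() {
        return returnType;
    }

    @Override
    public TableSchema getTableSchema() {
        return TableSchema.fromTypeInfo(returnType);
    }

    // Placeholder source: a real implementation would call the REST
    // endpoint, parse the JSON response into Row instances, and emit them.
    private static class RestPollingSource implements SourceFunction<Row> {
        private volatile boolean running = true;

        @Override
        public void run(SourceContext<Row> ctx) throws Exception {
            while (running) {
                // fetch JSON from the REST API, convert each record to a Row,
                // then: ctx.collect(row);
                Thread.sleep(1_000L);
            }
        }

        @Override
        public void cancel() {
            running = false;
        }
    }
}
```

The source could then be registered with the table environment, e.g. `tEnv.registerTableSource("rest_table", new RestApiTableSource(typeInfo));`, and queried like any other table.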
Best, Fabian
On Wed., Sept. 4, 2019 at 15:57, Steve wrote
Hi guys,
I have been studying the Table API for a while now, to integrate it into my
system.
When I look at this documentation:
https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/table/connect.html#connectors
I understand that it is possible to apply a JSON FORMAT on the