. It’s a production blocker for us.
Regards
Aswin
From: Sivaraman Venkataraman, Aswin Ram
Date: Monday, February 15, 2021 at 12:15 AM
To: dev@flink.apache.org
Cc: Sivaraman Venkataraman, Aswin Ram
Subject: Out of Memory Error (Heap) when storing Parquet files using Flink Table
API (Flink version 1.11.2)
Hi Everyone,
Hope everything is well. We are using Flink's Table API to read data from Kafka
and write it to Google Cloud Storage in Parquet file format. The Flink version we
are using is 1.11.2, and the checkpointing interval we specified is 3 minutes. The
issue we are facing is, though we generated