Hey Ingo,
Thanks for the suggestion. It's definitely an issue with the Parquet
connector; when we try the CSV or Blackhole connectors, everything works
fine. I will try this approach and report back.
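For reference, the isolation test boils down to pointing the same schema
at the blackhole connector, which discards rows and involves neither the
Parquet format nor Hadoop (a minimal sketch; the table name is
hypothetical):

-- same schema as the failing table, but with a sink that has no
-- format or filesystem dependencies; the table name is made up
CREATE TABLE sink_test (
  `a` BIGINT,
  `b` BIGINT,
  `c` BIGINT
) WITH (
  'connector' = 'blackhole'
);
-- if INSERT INTO sink_test SELECT ... runs cleanly, the problem is
-- in the format/filesystem stack, not the query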
Thanks,
Natu
On Wed, Dec 8, 2021 at 7:02 PM Ingo Bürk wrote:
Hi Natu,
Something you could try is removing the packaged parquet format and
defining a custom format [1]. For this custom format you can then fix the
dependencies by packaging all of the following into the format (see the
sketch after this list):
* flink-sql-parquet
* flink-shaded-hadoop-2-uber
* hadoop-aws
*
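As a rough sketch of that packaging step, assuming a Maven build with the
Shade plugin: the versions below are assumptions chosen to match Flink
1.13.1, and the original dependency list was cut off, so this is not a
complete recipe.

<!-- pom.xml fragment: bundle the format and its Hadoop dependencies
     into one JAR that can be uploaded as a custom format.
     Versions are assumptions, not taken from this thread. -->
<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-parquet_2.12</artifactId>
    <version>1.13.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-shaded-hadoop-2-uber</artifactId>
    <version>2.8.3-10.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>2.8.3</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <!-- shade everything into a single fat JAR at package time -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>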
Hey Timo and Flink community,
I wonder if there is a fix for this issue. Last time I rolled back to
Flink 1.12 and downgraded Ververica.
I am really keen to leverage the new features in the latest versions of
Ververica (2.5+). I have tried a myriad of the suggested tricks (example:
Could this be related to https://issues.apache.org/jira/browse/FLINK-22414?
On Thu, Jul 22, 2021 at 3:53 PM Timo Walther wrote:
Thanks, this should definitely work with the pre-packaged connectors of
Ververica platform.
I guess we have to investigate what is going on. Until then, a
workaround could be to add Hadoop manually and set the HADOOP_CLASSPATH
environment variable. The root cause seems to be that Hadoop cannot be
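For reference, the Flink docs recommend exporting that variable from an
existing Hadoop installation before the cluster starts (a minimal sketch;
it assumes a 'hadoop' binary is available on the PATH):

# expose Hadoop's classes to Flink; assumes 'hadoop' is on the PATH
export HADOOP_CLASSPATH=`hadoop classpath`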
Sure.
This is what the table DDL looks like:
CREATE TABLE tablea (
  `a` BIGINT,
  `b` BIGINT,
  `c` BIGINT
)
COMMENT ''
WITH (
  'auto-compaction' = 'false',
  'connector' = 'filesystem',
  'format' = 'parquet',
  'parquet.block.size' = '134217728',
  'parquet.compression' = 'SNAPPY',
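The message was cut off at this point. For reference, a complete
filesystem/Parquet DDL of this shape would close the WITH clause with at
least a path; in the sketch below the S3 bucket and path are hypothetical
placeholders, not values from the thread:

CREATE TABLE tablea (
  `a` BIGINT,
  `b` BIGINT,
  `c` BIGINT
)
COMMENT ''
WITH (
  'auto-compaction' = 'false',
  'connector' = 'filesystem',
  'format' = 'parquet',
  'parquet.block.size' = '134217728',
  'parquet.compression' = 'SNAPPY',
  -- hypothetical: the original message was truncated before this option
  'path' = 's3://my-bucket/tablea/'
);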
Maybe you can also share which connector/format you are using? What is
the DDL?
Regards,
Timo
On 22.07.21 14:11, Natu Lauchande wrote:
Hey Timo,
Thanks for the reply.
No custom JAR; we are using Flink SQL and submitting the job directly
through the SQL Editor UI. We are using Flink 1.13.1 as the supported
Flink version. No custom code at all; everything is Flink SQL through the
UI, no JARs.
Thanks,
Natu
On Thu, Jul 22, 2021 at 2:08 PM Timo
Hi Natu,
Ververica Platform 2.5 has updated the bundled Hadoop version, but this
should not result in a NoClassDefFoundError exception. How are you
submitting your SQL jobs? You don't use Ververica's SQL service but have
built a regular JAR file, right? If this is the case, can you share your
Good day Flink community,
Apache Flink/Ververica Community Edition - Question
I am having an issue with my Flink SQL jobs since updating from Flink
1.12/Ververica 2.4 to Ververica 2.5. For all the jobs running on Parquet
and S3 I am getting the following error continuously:
INITIALIZING to