Re: CVE-2021-44228 - Log4j2 vulnerability

2021-12-21 Thread Debraj Manna
Any idea when can we expect https://issues.apache.org/jira/browse/FLINK-25375 to be released? On Mon, Dec 20, 2021 at 8:18 PM Martijn Visser wrote: > Hi, > > The status and Flink ticket for upgrading to Log4j 2.17.0 can be tracked > at https://issues.apache.org/jira/browse/FLINK-25375. > > Best

Re: Flink KafkaConsumer metrics in DataDog

2021-09-05 Thread Debraj Manna
:44 AM Debraj Manna wrote: > Yes we are also facing the same problem and not able to find any solution. > > On Thu, Aug 26, 2021 at 5:59 PM Chesnay Schepler > wrote: > >> AFAIK this metric is directly forwarded from Kafka as-is; so Flink isn't >> calculating anyth

Re: Flink KafkaConsumer metrics in DataDog

2021-08-26 Thread Debraj Manna
Yes we are also facing the same problem and not able to find any solution. On Thu, Aug 26, 2021 at 5:59 PM Chesnay Schepler wrote: > AFAIK this metric is directly forwarded from Kafka as-is; so Flink isn't > calculating anything. > > I suggest to reach out to the Kafka folks. > > On 25/08/2021

Re: Flink 1.13.1 Kafka Producer Error

2021-08-24 Thread Debraj Manna
Fabian, I am running it inside YARN. Thanks. On Tue, Aug 24, 2021 at 5:27 PM Fabian Paul wrote: > Hi Debraj > > How do you run your application? If you run it from an IDE you can set a > breakpoint and inspect the serializer class which is used. > > Best, > Fabian

Re: Flink 1.13.1 Kafka Producer Error

2021-08-24 Thread Debraj Manna
Yes, I initially did not add `ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG` or `ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG`. I was getting the same error, so I tried setting them explicitly. I ran mvn dependency:tree | grep -i kafka and did not see any other versions of Kafka in non-test dependencies, and

Re: Flink 1.13.1 Kafka Producer Error

2021-08-24 Thread Debraj Manna
exception-bytearrayserializer-is-not-an-in> on Stackoverflow. Does anyone have any suggestions? On Mon, Aug 23, 2021 at 9:07 PM Debraj Manna wrote: > I am trying to use flink kafka producer like below > > public static FlinkKafkaProducer createProducer() > { > Proper

Flink 1.13.1 Kafka Producer Error

2021-08-23 Thread Debraj Manna
I am trying to use the Flink Kafka producer like below public static FlinkKafkaProducer createProducer() { Properties props = new Properties(); props.setProperty("bootstrap.servers", ""); props.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
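The snippet above is cut off; as a point of comparison, here is a minimal sketch of a Flink 1.13 producer that does not set the Kafka serializer configs at all. The connector configures ByteArraySerializer internally, so overriding `key.serializer`/`value.serializer` (or having a second kafka-clients version on the classpath) is the usual trigger for "ByteArraySerializer is not an instance of Serializer". The topic name and the serialization-schema parameter are placeholders, not the original poster's code.

```java
import java.util.Properties;

import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;

public class ProducerFactory {
    // Sketch only: "my-topic" and the schema argument are placeholders.
    public static FlinkKafkaProducer<byte[]> createProducer(
            KafkaSerializationSchema<byte[]> schema) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        // Deliberately NOT setting key.serializer / value.serializer here:
        // the connector installs ByteArraySerializer itself, and a conflicting
        // kafka-clients version on the classpath produces the reported error.
        return new FlinkKafkaProducer<>(
                "my-topic",
                schema,
                props,
                FlinkKafkaProducer.Semantic.AT_LEAST_ONCE);
    }
}
```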

Re: How to use RichAsyncFunction with Test harness in flink 1.13.1?

2021-08-06 Thread Debraj Manna
t test the logic of some > methods of your AsyncFlowLookup). If you want to test how it interacts with > Flink, you get an integration test by definition. > > > > On Fri, Aug 6, 2021 at 10:11 AM Debraj Manna > wrote: > >> Thanks it worked. >> >&

Re: How to use RichAsyncFunction with Test harness in flink 1.13.1?

2021-08-06 Thread Debraj Manna
final List result = stream.executeAndCollect(1); > assertThat(result, containsInAnyOrder(LongStream.rangeClosed(1, > 1000).boxed().toArray())); > } > > See also > https://github.com/apache/flink/blob/master/flink-tests/src/test/java/org/apache/flink/api/connector/source/lib/

How to use RichAsyncFunction with Test harness in flink 1.13.1?

2021-08-04 Thread Debraj Manna
Hi, I am trying to use RichAsyncFunction with Flink's test harness. My code looks like below final MetadataEnrichment.AsyncFlowLookup fn = new MetadataEnrichment.AsyncFlowLookup(); final AsyncWaitOperatorFactory> operator = new AsyncWaitOperatorFactory<>(fn, 2000, 1,
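For reference, a sketch of driving an async function through the operator test harness, assuming the Flink 1.13 test utilities (the flink-streaming-java test-jar) and assuming AsyncFlowLookup maps Long to Long; the types and element values are illustrative, not the original code.

```java
import org.apache.flink.api.common.typeutils.base.LongSerializer;
import org.apache.flink.streaming.api.datastream.AsyncDataStream;
import org.apache.flink.streaming.api.operators.async.AsyncWaitOperatorFactory;
import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
import org.apache.flink.streaming.util.OneInputStreamOperatorTestHarness;

// Build the async operator from the factory, then wrap it in a harness.
AsyncWaitOperatorFactory<Long, Long> factory = new AsyncWaitOperatorFactory<>(
        new MetadataEnrichment.AsyncFlowLookup(),
        2000L,                                // timeout in ms
        1,                                    // capacity
        AsyncDataStream.OutputMode.ORDERED);

OneInputStreamOperatorTestHarness<Long, Long> harness =
        new OneInputStreamOperatorTestHarness<>(factory, LongSerializer.INSTANCE);
harness.open();                               // invokes RichAsyncFunction#open
harness.processElement(new StreamRecord<>(42L, 1L));
// ... wait for the async result, then inspect harness.getOutput()
harness.close();
```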

Re: Flink 1.13.1 - Vulnerabilities CVE-2019-12900 for librocksdbjni

2021-07-17 Thread Debraj Manna
Anyone have any thoughts on this? On Fri, 16 Jul 2021, 15:22 Debraj Manna wrote: > Hi > > I am observing flink-1.13.1 is being flagged for CVE-2019-12900 for > librocksdbjni > > The issue seems to have been fixed in RocksDB in > > https://github.com/facebook/rocksdb/issues/6

Flink 1.13.1 - Vulnerabilities CVE-2019-12900 for librocksdbjni

2021-07-16 Thread Debraj Manna
Hi I am observing flink-1.13.1 is being flagged for CVE-2019-12900 for librocksdbjni The issue seems to have been fixed in RocksDB in https://github.com/facebook/rocksdb/issues/6703. Can someone let me know if flink is affected by this? I see this has been discussed a bit on

Re: Flink 1.13.1 PartitionNotFoundException

2021-07-14 Thread Debraj Manna
I have increased it to 9 and it seems to be running fine. If I still see the failure when I add some load, I will post back in this thread. On Wed, Jul 14, 2021 at 7:19 PM Debraj Manna wrote: > Yes I forgot to mention in my first email. I have tried increasing > taskmanager.network.r
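The setting referenced above appears to be the partition-request backoff; a sketch of the relevant flink-conf.yaml entries (the values shown are illustrative, not the poster's actual numbers, which are truncated in the message):

```yaml
# flink-conf.yaml — give downstream tasks more time to locate upstream
# result partitions before failing with PartitionNotFoundException.
# Defaults are 100 ms initial and 10000 ms maximum backoff.
taskmanager.network.request-backoff.initial: 100
taskmanager.network.request-backoff.max: 90000
```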

Re: Flink 1.13.1 PartitionNotFoundException

2021-07-14 Thread Debraj Manna
.2336050.n4.nabble.com/Network-PartitionNotFoundException-when-run-on-multi-nodes-td21683.html > > https://issues.apache.org/jira/browse/FLINK-9413 > > Let us know if it helped. > > Regards, > Timo > > > > On 14.07.21 14:51, Debraj Manna wrote: > > Hi > > > > I am observi

Flink 1.13.1 PartitionNotFoundException

2021-07-14 Thread Debraj Manna
Hi I am observing my flink jobs is failing with the below error 2021-07-14T12:07:00.918Z INFO runtime.executiongraph.Execution flink-akka.actor.default-dispatcher-29 transitionState:1446 MetricAggregateFunction -> (Sink: LateMetricSink10, Sink: TSDBSink9) (12/30)

Re: owasp-dependency-check is flagging flink 1.13 for scala 2.12.7

2021-07-03 Thread Debraj Manna
ich > Hadoop version you are using, because we expect the user to provide Hadoop > (and you can use later and more secure versions if you wish). IOW, the > Hadoop 2.4 dependency in flink-hadoop-fs is just a hint to the user that > this version _can_ be used. > > On 7/3/2021 8:03 PM,

Re: owasp-dependency-check is flagging flink 1.13 for scala 2.12.7

2021-07-03 Thread Debraj Manna
o be relevant for you since the vulnerability only affects > the scaladocs, i.e., documentation. > > On 7/2/2021 2:10 PM, Debraj Manna wrote: > > Hi, > > I was running owasp-dependency-check > <https://owasp.org/www-project-dependency-check/> in a java application >

owasp-dependency-check is flagging flink 1.13 for scala 2.12.7

2021-07-02 Thread Debraj Manna
Hi, I was running owasp-dependency-check in a java application based on flink-1.13.0 (scala 2.12). scala 2.12.7 was getting flagged for this

Re: "Legacy Source Thread" line in logs

2021-06-24 Thread Debraj Manna
Thanks Fabian again for the clarification. On Thu, Jun 24, 2021 at 8:16 PM Fabian Paul wrote: > Hi Debraj, > > Sorry for the confusion the FlinkKafkaConsumer is the old source and the > overhauled one you can find here [1]. > You would need to replace the FlinkKafkaConsumer with the KafkaSource

Re: "Legacy Source Thread" line in logs

2021-06-24 Thread Debraj Manna
Thanks Fabian for replying. But I am using KafkaSource only. The code is something like below. class MetricSource { final Set metricSdms = new HashSet(); ... env.addSource(MetricKafkaSourceFactory.createConsumer(jobParams)) .name(MetricSource.class.getSimpleName())
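Note that env.addSource(...) always goes through the legacy SourceFunction path (hence "Legacy Source Thread" in the logs); the overhauled connector is wired in with env.fromSource(...). A hedged sketch, assuming Flink 1.13 (builder method names varied slightly across 1.12–1.14); the broker address, topic, and group id are placeholders:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;

// The new unified source: used via fromSource, not addSource.
KafkaSource<String> source = KafkaSource.<String>builder()
        .setBootstrapServers("localhost:9092")            // placeholder
        .setTopics("metrics")                             // placeholder
        .setGroupId("metric-source")                      // placeholder
        .setStartingOffsets(OffsetsInitializer.latest())
        .setValueOnlyDeserializer(new SimpleStringSchema())
        .build();

env.fromSource(source, WatermarkStrategy.noWatermarks(), "MetricSource");
```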

"Legacy Source Thread" line in logs

2021-06-23 Thread Debraj Manna
Hi I am seeing the below logs in flink 1.13.0 running in YARN 2021-06-23T13:41:45.761Z WARN grid.task.MetricSdmStalenessUtils Legacy Source Thread - Source: MetricSource -> Filter -> MetricStoreMapper -> (Filter -> Timestamps/Watermarks -> Map -> Flat Map, Sink: FlinkKafkaProducer11, Sink:

Flink Protobuf serialization messages does not contain a setter for field bitField0_

2021-06-21 Thread Debraj Manna
Hi As mentioned in the documentation I have registered the Protobuf Serializer like below env.getConfig().registerTypeWithKryoSerializer(SelfDescribingMessageDO.class,
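The "no setter for field bitField0_" failure happens when Kryo falls back to reflective field-by-field serialization of a generated Protobuf class. The fix the Flink documentation describes is registering chill-protobuf's serializer; a sketch, assuming the com.twitter:chill-protobuf dependency is on the classpath:

```java
import com.twitter.chill.protobuf.ProtobufSerializer;

// Register ProtobufSerializer so Kryo serializes via the message's own
// toByteArray/parseFrom instead of reflecting over generated fields such
// as bitField0_, which have no setters.
env.getConfig().registerTypeWithKryoSerializer(
        SelfDescribingMessageDO.class, ProtobufSerializer.class);
```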

flink all events getting dropped as late

2021-05-19 Thread Debraj Manna
Crossposting from Stack Overflow. My Flink pipeline looks like below WatermarkStrategy watermarkStrategy = WatermarkStrategy .forBoundedOutOfOrderness(Duration.ofSeconds(900))
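When every event is dropped as late despite a 900-second bound, the usual suspects are an idle partition dragging the watermark and timestamps assigned in the wrong unit. A hedged sketch of a strategy covering both (MyEvent and getEpochMillis are placeholders for the poster's actual types):

```java
import java.time.Duration;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;

// Sketch: declare idleness so a quiet partition stops holding the watermark,
// and assign timestamps explicitly so they are epoch MILLISECONDS — values in
// seconds make every event look ~50 years late.
WatermarkStrategy<MyEvent> strategy = WatermarkStrategy
        .<MyEvent>forBoundedOutOfOrderness(Duration.ofSeconds(900))
        .withIdleness(Duration.ofMinutes(1))
        .withTimestampAssigner((event, ts) -> event.getEpochMillis());
```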

Hadoop Integration Link broken in downloads page

2021-03-08 Thread Debraj Manna
Hi It appears the Hadoop Integration link is broken on the downloads page. Apache Flink® 1.12.2 is our latest stable release. > If you plan to use Apache Flink together

Re: Flink 1.12 Compatibility with hbase-shaded-client 2.1 in application jar

2021-03-03 Thread Debraj Manna
at 10:12 AM Debraj Manna wrote: > Hi > > I am trying to deploy an application in flink 1.12 having > hbase-shaded-client 2.1.0 as dependency in application mode > <https://ci.apache.org/projects/flink/flink-docs-stable/deployment/resource-providers/yarn.html#application-mod

Flink 1.12 Compatibility with hbase-shaded-client 2.1 in application jar

2021-03-02 Thread Debraj Manna
Hi I am trying to deploy an application in Flink 1.12 having hbase-shaded-client 2.1.0 as a dependency in application mode. On deploying the application I am seeing the below

Re: Flink 1.12.1 example applications failing on a single node yarn cluster

2021-02-26 Thread Debraj Manna
how you've set up the Single-Node YARN cluster, I would still > guess that it is a configuration issue on the YARN side. Flink does not > know about a .flink folder. Hence, there is no configuration to set this > folder. > > Best, > Matthias > > On Fri, Feb 26, 2021 at 2

Re: Flink 1.12.1 example applications failing on a single node yarn cluster

2021-02-26 Thread Debraj Manna
directory to create a .flink folder even when the job is submitted in YARN. It is working fine for me if I run Flink as the yarn user in my setup. Just for my knowledge, is there any config in Flink to specify the location of the .flink folder? On Thu, Feb 25, 2021 at 10:48 AM Debraj Manna wrote

Re: Flink 1.12.1 example applications failing on a single node yarn cluster

2021-02-24 Thread Debraj Manna
The same has been asked in StackOverflow <https://stackoverflow.com/questions/66355206/flink-1-12-1-example-application-failing-on-a-single-node-yarn-cluster> also. Any suggestions here? On Wed, Feb 24, 2021 at 10:25 PM Debraj Manna wrote: > I am trying out flink example as explained

Flink 1.12.1 example applications failing on a single node yarn cluster

2021-02-24 Thread Debraj Manna
FileUploader[] - Copying from file:/home/ubuntu/build-target/flink/lib/flink-dist_2.12-1.12.1.jar to file:/home/ubuntu/.flink/application_1614159836384_0045/flink-dist_2.12-1.12.1.jar with replication factor 1 The entire DEBUG logs are placed here <https://gist.github.c

Re: How to debug flink serialization error?

2021-02-13 Thread Debraj Manna
oach of using tagging fields as "transient" is absolutely correct. > There's also this message: NotSerializableException: > java.lang.reflect.Method, but I can not find a field of type Method. > > Can you provide a minimal reproducible example of this issue? > > On F

How to debug flink serialization error?

2021-02-11 Thread Debraj Manna
Hi, I have a ProcessFunction like below which is throwing an error like below whenever I try to use it in an operator. My understanding is that when Flink initializes the operator DAG, it serializes things and sends them over to the task managers. So I have marked the operator state transient, since
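One way to debug this without submitting a job is to round-trip the function through plain Java serialization, which is the same mechanism Flink uses to ship operators to the task managers; the resulting NotSerializableException names the offending type directly. This is a self-contained sketch in which MyFunction is a hypothetical stand-in for the real ProcessFunction, with a java.lang.reflect.Method field matching the reported error:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.lang.reflect.Method;

public class SerializationProbe {

    // Hypothetical stand-in for the real function: a Method field is not
    // serializable, matching "NotSerializableException:
    // java.lang.reflect.Method". Marking it transient, as here, is the fix;
    // remove the transient keyword and probe() throws.
    static class MyFunction implements Serializable {
        transient Method cachedAccessor;
        String name = "ok";
    }

    // Round-trip an object through Java serialization, as Flink does when
    // shipping an operator; surfaces the failure before the job is submitted.
    static byte[] probe(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        System.out.println("serialized bytes: " + probe(new MyFunction()).length);
    }
}
```

Running this against each suspect field in turn narrows down which member needs to be transient or replaced with a serializable alternative.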

How to implement a WindowableTask(similar to samza) in Apache flink?

2020-12-22 Thread Debraj Manna
I am new to Flink and this is my first post in the community. Samza has a concept of windowing where a stream processing job needs to do something at regular intervals, regardless of how many incoming
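The closest Flink analogue to Samza's WindowableTask is a processing-time timer in a KeyedProcessFunction: the timer callback plays the role of the periodic window() method. A hedged sketch, with placeholder key/value types and interval:

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

// Sketch: emit a per-key count every minute, independent of input volume.
public class PeriodicFlush extends KeyedProcessFunction<String, Long, Long> {
    private static final long INTERVAL_MS = 60_000;
    private transient ValueState<Long> count;

    @Override
    public void open(Configuration parameters) {
        count = getRuntimeContext().getState(
                new ValueStateDescriptor<>("count", Long.class));
    }

    @Override
    public void processElement(Long value, Context ctx, Collector<Long> out)
            throws Exception {
        Long c = count.value();
        count.update(c == null ? 1L : c + 1L);
        // Align the timer to the interval boundary: timers are deduplicated
        // per key and timestamp, so at most one fires per interval.
        long now = ctx.timerService().currentProcessingTime();
        ctx.timerService()
           .registerProcessingTimeTimer(now - now % INTERVAL_MS + INTERVAL_MS);
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx, Collector<Long> out)
            throws Exception {
        out.collect(count.value());  // the periodic "window()" action
        count.clear();               // the next element re-arms the timer
    }
}
```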