Hi Branislav,

thanks - the issue was caused by an incorrect extraction of the timestamp field and should now be fixed [1]. The fix is in the latest dev version and will be part of release 0.68.0, which is currently under vote.
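In case it helps to understand what was going on: the sink resolves the configured timestamp property via a field selector on the incoming runtime event, and when that selector does not match a key in the (possibly nested) event, the lookup fails with exactly the "Key not found" exception you saw. The snippet below is only a simplified, self-contained sketch of that behaviour - the map-based event and the "s0::timestamp" selector format are illustrative assumptions, not the actual StreamPipes classes:

import java.util.Map;
import java.util.Optional;

// Simplified sketch (not the real StreamPipes implementation): resolving a
// field selector against an event map and failing when the key is missing.
public class FieldSelectorSketch {

  // Hypothetical selector format: "s0::timestamp" -> stream prefix "s0",
  // followed by the runtime name(s) of the (possibly nested) property.
  static Object getFieldBySelector(Map<String, Object> event, String selector) {
    String[] parts = selector.split("::");
    Object current = event;
    for (int i = 1; i < parts.length; i++) { // skip the stream prefix
      if (!(current instanceof Map)) {
        throw new IllegalArgumentException("Key not found");
      }
      final Object next = ((Map<?, ?>) current).get(parts[i]);
      current = Optional.ofNullable(next)
          .orElseThrow(() -> new IllegalArgumentException("Key not found"));
    }
    return current;
  }

  public static void main(String[] args) {
    Map<String, Object> event = Map.of("timestamp", 1625227636000L, "mass_flow", 3.14);
    System.out.println(getFieldBySelector(event, "s0::timestamp")); // resolves fine
    try {
      getFieldBySelector(event, "s0::time"); // selector does not match the schema
    } catch (IllegalArgumentException e) {
      System.out.println(e.getMessage()); // prints "Key not found"
    }
  }
}

The actual fix tracked in STREAMPIPES-394 is of course in the real sink code, not in this sketch.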
Thanks for reporting this!
Dominik

[1] https://issues.apache.org/jira/browse/STREAMPIPES-394

On 2021/07/02 12:07:16, Branislav Jovicic <[email protected]> wrote:
> Hello, Dominik,
>
> Thank you for such a fast response!
>
> We tried using both the Flow Rate 1 Data Stream and a data stream created with
> the Machine Data Simulator adapter, both resulting in the same error.
> As for the data sink, we were using the existing InfluxDB data sink.
>
> Let me know if you require any additional information.
>
> Kindest regards,
> Branislav
> ________________________________
> From: Dominik Riemer <[email protected]>
> Sent: Friday, July 2, 2021 13:37
> To: Branislav Jovicic <[email protected]>; [email protected] <[email protected]>
> Cc: Canteri, Michele <[email protected]>; [email protected] <[email protected]>; [email protected] <[email protected]>
> Subject: RE: Issue with InfluxDB
>
> Hi Branislav,
>
> thanks for reporting this!
> At first glance, it seems the problem comes from an error while reading the
> key from a nested event schema before data is written to Influx.
> Can you please show an example event that you are trying to persist to
> Influx? Then I'll take a closer look at this issue. Are you using the
> standard InfluxDB sink or a custom-written Influx sink?
>
> Best,
> Dominik
>
> From: Branislav Jovicic <[email protected]>
> Sent: Friday, July 2, 2021 1:07 PM
> To: [email protected]
> Cc: Canteri, Michele <[email protected]>; [email protected]; Dominik Riemer <[email protected]>; [email protected]
> Subject: Issue with InfluxDB
>
> Hello, everyone,
>
> My partners (Michele and Federico, in cc) and I are trying to use InfluxDB
> for one of our projects.
> Since we ran into an issue while trying to set everything up, we decided to
> recreate the test case presented here
> <https://cwiki.apache.org/confluence/display/STREAMPIPES/Description+for+E2E+Tests>,
> under the InfluxDB section.
>
> We have configured the InfluxDB data sink according to the instructions.
> However, when we start the pipeline, we get the following error:
>
> pipeline-elements-all-jvm_1 | Exception in thread "Thread-7" java.lang.IllegalArgumentException: Key not found
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.model.runtime.Event.lambda$getNestedItem$2(Event.java:97)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.model.runtime.Event$$Lambda$885/0x00000000541938f0.get(Unknown Source)
> pipeline-elements-all-jvm_1 |   at java.util.Optional.orElseThrow(Optional.java:290)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.model.runtime.Event.getNestedItem(Event.java:97)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.model.runtime.Event.getFieldBySelector(Event.java:90)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.model.runtime.Event.getFieldBySelector(Event.java:82)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.sinks.databases.jvm.influxdb.InfluxDbClient.save(InfluxDbClient.java:166)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.sinks.databases.jvm.influxdb.InfluxDb.onEvent(InfluxDb.java:54)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.wrapper.standalone.runtime.StandaloneEventSinkRuntime.process(StandaloneEventSinkRuntime.java:50)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.wrapper.standalone.routing.StandaloneSpInputCollector.send(StandaloneSpInputCollector.java:54)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.wrapper.standalone.routing.StandaloneSpInputCollector.lambda$onEvent$0(StandaloneSpInputCollector.java:48)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.wrapper.standalone.routing.StandaloneSpInputCollector$$Lambda$881/0x0000000054170f00.accept(Unknown Source)
> pipeline-elements-all-jvm_1 |   at java.util.concurrent.ConcurrentHashMap.forEach(ConcurrentHashMap.java:1597)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.wrapper.standalone.routing.StandaloneSpInputCollector.onEvent(StandaloneSpInputCollector.java:48)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.wrapper.standalone.routing.StandaloneSpInputCollector.onEvent(StandaloneSpInputCollector.java:29)
> pipeline-elements-all-jvm_1 |   at org.apache.streampipes.messaging.kafka.SpKafkaConsumer.run(SpKafkaConsumer.java:120)
> pipeline-elements-all-jvm_1 |   at java.lang.Thread.run(Thread.java:823)
>
> ... and no data gets stored in it.
>
> We checked that data can, in fact, be stored in this DB (we manually
> stored some data in it) and retrieved with the InfluxDB Data Stream. So we are
> certain that the DB works properly and that the issue lies in the InfluxDB
> data sink.
>
> In addition, on an unrelated note, we noticed that when using a pipeline
> with a Notification Sink and receiving some notifications, the
> notification icon still displays the number of notifications even
> after we have viewed and deleted them. It acts as if they were unread
> the whole time.
> We believe this is strange behavior and wanted to report it as well.
>
> Let us know if you need anything else.
>
> Kindest regards,
> Branislav
