Hi Till,

That solved my issue! Many, many thanks for the solution and for the useful
StackOverflow link! ☺️

Cheers,
Sébastien 

> On 30 March 2021, at 18:16, Till Rohrmann <trohrm...@apache.org> wrote:
> 
> Hi Sebastien,
> 
> I think the Scala compiler infers the most specific type for deepCopy(),
> which is Nothing (Nothing is a subtype of every type) [1], because you
> haven't specified a type here. To make it work, you have to specify the
> concrete type:
> 
> event.get("value").deepCopy[ObjectNode]()
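> 
> Applied to your snippet (a minimal sketch reusing your variable names), the
> map becomes:
> 
> val eventsStream: DataStream[ObjectNode] = env.addSource(flinkKafkaConsumer)
>   .name("Event Stream : Kafka consumer")
>   // the explicit type parameter replaces the inferred Nothing, so no
>   // cast to scala.runtime.Nothing$ is generated at runtime
>   .map(event => event.get("value").deepCopy[ObjectNode]())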
> 
> [1] https://stackoverflow.com/a/18008591/4815083
> 
> Cheers,
> Till
> 
> On Tue, Mar 30, 2021 at 3:45 PM Lehuede sebastien <lehued...@gmail.com> wrote:
> Hi all,
> 
> I’m currently trying to use Scala to set up a simple Kafka consumer that
> receives JSON-formatted events and just sends them to Elasticsearch. This is
> the first step; afterwards I want to add some processing logic.
> 
> My code works well, but the interesting fields from my JSON-formatted events
> are under the key "value". So I tried to use a map function to get the
> fields under "value" and send them to ES.
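> 
> (For context: JSONKeyValueDeserializationSchema(false) turns each Kafka
> record into an ObjectNode of the form {"key": ..., "value": {...}}, which
> is why the fields I need sit under "value".)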
> 
> But when I try to use this map function, I always get a ClassCastException.
> 
> Additional information:
> 
> • My job runs on a Flink Kubernetes Application cluster with a standalone job.
> • If I remove the map function, everything works fine and I can find my
> events in Elasticsearch.
> • I tried replacing 'JSONKeyValueDeserializationSchema' with
> 'SimpleStringSchema' and then using a mapper in my map function, but I still
> had the same issue (see the sketch after this list).
> • I’m using Scala version 2.12.12.
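> 
> Roughly the shape of that SimpleStringSchema attempt (an illustrative
> sketch, not my exact code; it uses the Jackson classes that Flink bundles):
> 
> import org.apache.flink.api.common.serialization.SimpleStringSchema
> import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper
> import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
> 
> val rawConsumer = new FlinkKafkaConsumer(topics, new SimpleStringSchema(), kafkaProperties)
> 
> val rawStream: DataStream[ObjectNode] = env.addSource(rawConsumer)
>   .map { json =>
>     val mapper = new ObjectMapper() // created per record to keep the sketch simple
>     // an explicit target type is needed here too; an untyped deepCopy()
>     // runs into the same Nothing inference
>     mapper.readTree(json).asInstanceOf[ObjectNode]
>   }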
> 
> Here is my code:
> 
> 
> 
> // Create the Flink Kafka connector
> val flinkKafkaConsumer = new FlinkKafkaConsumer(topics,
>   new JSONKeyValueDeserializationSchema(false), kafkaProperties)
> flinkKafkaConsumer.setStartFromEarliest()
> 
> val eventsStream: DataStream[ObjectNode] = env.addSource(flinkKafkaConsumer)
>   .name("Event Stream : Kafka consumer")
>   .map(event => event.get("value").deepCopy()) // <- the ClassCastException is thrown here
> 
> eventsStream.map(_.toString)
>   .addSink(esSinkBuilder.build())
>   .name("Event Stream : Elasticsearch Stream")
> 
> // execute program
> env.execute("Kafka to ES")
> 
> 
> 
> Here is the error:
> 
> [nybble-jobmanager] org.apache.flink.runtime.JobException: Recovery is 
> suppressed by NoRestartBackoffTimeStrategy
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:665)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:447)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
> [nybble-jobmanager]    at 
> jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:?]
> [nybble-jobmanager]    at 
> jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) 
> ~[?:?]
> [nybble-jobmanager]    at java.lang.reflect.Method.invoke(Unknown Source) 
> ~[?:?]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:306)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:159)
>  ~[flink-runtime_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> scala.PartialFunction.applyOrElse(PartialFunction.scala:127) 
> [scala-library-2.12.12.jar:?]
> [nybble-jobmanager]    at 
> scala.PartialFunction.applyOrElse$(PartialFunction.scala:126) 
> [scala-library-2.12.12.jar:?]
> [nybble-jobmanager]    at 
> akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175) 
> [scala-library-2.12.12.jar:?]
> [nybble-jobmanager]    at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176) 
> [scala-library-2.12.12.jar:?]
> [nybble-jobmanager]    at 
> scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176) 
> [scala-library-2.12.12.jar:?]
> [nybble-jobmanager]    at akka.actor.Actor.aroundReceive(Actor.scala:517) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at akka.actor.Actor.aroundReceive$(Actor.scala:515) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.actor.ActorCell.receiveMessage(ActorCell.scala:592) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at akka.actor.ActorCell.invoke(ActorCell.scala:561) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at akka.dispatch.Mailbox.run(Mailbox.scala:225) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at akka.dispatch.Mailbox.exec(Mailbox.scala:235) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) 
> [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager]    at 
> akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>  [akka-actor_2.12-2.5.21.jar:2.5.21]
> [nybble-jobmanager] Caused by: java.lang.ClassCastException: class 
> org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
>  cannot be cast to class scala.runtime.Nothing$ 
> (org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode
>  and scala.runtime.Nothing$ are in unnamed module of loader 'app')
> [nybble-jobmanager]    at 
> io.nybble.NybbleMain$.$anonfun$main$4(NybbleMain.scala:138) ~[?:?]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.scala.DataStream$$anon$4.map(DataStream.scala:625)
>  ~[flink-streaming-scala_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.operators.StreamMap.processElement(StreamMap.java:38)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:71)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:46)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.collect(CopyingChainingOutput.java:26)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:50)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.operators.CountingOutput.collect(CountingOutput.java:28)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$ManualWatermarkContext.processAndCollectWithTimestamp(StreamSourceContexts.java:322)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.operators.StreamSourceContexts$WatermarkContext.collectWithTimestamp(StreamSourceContexts.java:426)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecordsWithTimestamps(AbstractFetcher.java:365)
>  ~[flink-connector-kafka_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.partitionConsumerRecordsHandler(KafkaFetcher.java:183)
>  ~[flink-connector-kafka_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.connectors.kafka.internals.KafkaFetcher.runFetchLoop(KafkaFetcher.java:142)
>  ~[flink-connector-kafka_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:826)
>  ~[flink-connector-kafka_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:110)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:66)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> [nybble-jobmanager]    at 
> org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:241)
>  ~[flink-streaming-java_2.12-1.12.1.jar:1.12.1]
> 
> 
> Has anyone already run into this kind of issue? I think I’ve missed
> something, even though it’s a really basic use case and code; I’m a
> beginner in Scala.
> 
> Thanks in advance for your help!
> Sebastien
