[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2021-04-29 Flink Jira Bot (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17336942#comment-17336942
 ] 

Flink Jira Bot commented on FLINK-20617:


This issue was labeled "stale-critical" 7 days ago and has not received any updates 
so it is being deprioritized. If this ticket is actually Critical, please raise 
the priority and ask a committer to assign you the issue or revive the public 
discussion.


> Kafka Consumer Deserializer Exception on application mode
> -
>
> Key: FLINK-20617
> URL: https://issues.apache.org/jira/browse/FLINK-20617
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Kafka
>Affects Versions: 1.11.2
> Environment: application mode
> flink 1.11.2 with  hadoop 2.6.0-cdh5.15.0
>Reporter: Georger
>Priority: Critical
>  Labels: stale-critical
> Attachments: taskmanager.out
>
>
> The Kafka source may have an issue in application mode.
>  
> When I run the job in application mode on Flink 1.11.2 it can't start up.
> The detailed exception is:
> org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
> at 
> org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:789)
> at 
> org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:643)
> at 
> org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:623)
> at 
> org.apache.flink.streaming.connectors.kafka.internal.KafkaPartitionDiscoverer.initializeConnections(KafkaPartitionDiscoverer.java:58)
> at 
> org.apache.flink.streaming.connectors.kafka.internals.AbstractPartitionDiscoverer.open(AbstractPartitionDiscoverer.java:94)
> at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.open(FlinkKafkaConsumerBase.java:550)
> at 
> org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
> at 
> org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
> at 
> org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:291)
> at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$beforeInvoke$0(StreamTask.java:479)
> at 
> org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.runThrowing(StreamTaskActionExecutor.java:92)
> at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.beforeInvoke(StreamTask.java:475)
> at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:528)
> at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:721)
> at org.apache.flink.runtime.taskmanager.Task.run(Task.java:546)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.kafka.common.KafkaException: 
> org.apache.kafka.common.serialization.ByteArrayDeserializer is not an 
> instance of org.apache.kafka.common.serialization.Deserializer
> at 
> org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:263)
> at 
> org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:688)
> ... 15 more
> The pom is:
> <dependency>
>   <groupId>org.apache.flink</groupId>
>   <artifactId>flink-connector-kafka_2.11</artifactId>
>   <version>${flink.version}</version>
>   <exclusions>
>     <exclusion>
>       <groupId>org.slf4j</groupId>
>       <artifactId>slf4j-api</artifactId>
>     </exclusion>
>     <exclusion>
>       <groupId>org.apache.kafka</groupId>
>       <artifactId>kafka-clients</artifactId>
>     </exclusion>
>   </exclusions>
> </dependency>
> <dependency>
>   <groupId>org.apache.kafka</groupId>
>   <artifactId>kafka-clients</artifactId>
>   <version>1.0.1</version>
> </dependency>
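
For context, a minimal sketch of how a job like this is typically submitted in
application mode on YARN with Flink 1.11; the entry class and jar path are
hypothetical placeholders, not taken from this report:

    # Hypothetical application-mode submission on YARN (Flink 1.11).
    # Entry class and jar path are placeholders.
    ./bin/flink run-application -t yarn-application \
        -c com.example.KafkaJob \
        ./flink-examples-1.0-SNAPSHOT.jar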



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2021-04-22 Flink Jira Bot (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17329009#comment-17329009
 ] 

Flink Jira Bot commented on FLINK-20617:


This critical issue is unassigned, and neither it nor any of its Sub-Tasks has 
been updated for 7 days, so it has been labeled "stale-critical". If this 
ticket is indeed critical, please either assign yourself or give an update, 
and then remove the label. In 7 days the issue will be deprioritized.




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2020-12-16 Yang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17250874#comment-17250874
 ] 

Yang Wang commented on FLINK-20617:
---

[~Georger] Thanks for your verification. I believe this is a limitation of the 
current application mode. Actually, we do not need to add the user jar to the 
system classpath in application mode, since it will always be added to the 
distributed cache and then pulled by the TaskManagers.

 

cc [~kkl0u], [~aljoscha], do you think it makes sense to not put user jars in 
the system classpath in application mode?
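
As an illustration of the workaround mentioned elsewhere in this thread, the
same option can presumably also be set in the cluster configuration instead of
on the command line; a sketch to verify against your Flink version (this is an
assumption, not something stated in the thread):

    # flink-conf.yaml (sketch): keep the user jar out of the system classpath
    # so the Kafka classes are only loaded by the user-code classloader.
    yarn.per-job-cluster.include-user-jar: DISABLED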




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2020-12-16 Georger (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17250851#comment-17250851
 ] 

Georger commented on FLINK-20617:
-

[~fly_in_gis] I have tried it; it works, but it looks so weird...




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2020-12-16 Yang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17250821#comment-17250821
 ] 

Yang Wang commented on FLINK-20617:
---

[~Georger] Have you tried adding 
"-Dyarn.per-job-cluster.include-user-jar=DISABLED" to your submission command?

 

I am copying the analysis from the ML.

In application mode, job submission happens on the JobManager side via an 
embedded client, so the user jar is added to the distributed cache. When a 
task is deployed to a TaskManager, the jar is downloaded again and run in the 
user classloader even though it is already on the system classpath. I think 
that is why these classes end up being loaded by different classloaders.

 

In per-job mode, we recover the job graph instead, and the user jars are not 
added to the distributed cache.

 

Please also refer to the discussion here [1].

[1]. 
https://lists.apache.org/list.html?u...@flink.apache.org:lte=1M:failed%20w%2F%20Application%20Mode%20but%20succeeded%20w%2F%20Per-Job%20Cluster%20Mode
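
To make the classloader point above concrete, here is an illustrative
diagnostic (not part of the original report) that prints which classloader and
code source provide the Kafka deserializer interface and implementation; if
the interface visible to kafka-clients differs from the one that loaded the
configured deserializer, the "is not an instance of" failure above is expected:

    // Illustrative sketch: run inside the job (e.g. in main() or an operator's
    // open()) to see where the Kafka serialization classes come from.
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;
    import org.apache.kafka.common.serialization.Deserializer;

    public class ClassLoaderProbe {
        public static void main(String[] args) {
            System.out.println("Deserializer loaded by "
                    + Deserializer.class.getClassLoader()
                    + " from "
                    + Deserializer.class.getProtectionDomain().getCodeSource());
            System.out.println("ByteArrayDeserializer loaded by "
                    + ByteArrayDeserializer.class.getClassLoader()
                    + " from "
                    + ByteArrayDeserializer.class.getProtectionDomain().getCodeSource());
            System.out.println("Assignable: "
                    + Deserializer.class.isAssignableFrom(ByteArrayDeserializer.class));
        }
    }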




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2020-12-16 Georger (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17250762#comment-17250762
 ] 

Georger commented on FLINK-20617:
-

It seems org.apache.kafka.common.serialization.Deserializer has been loaded 
twice; here is the detailed log:
[Loaded org.apache.kafka.common.serialization.Deserializer from 
file:/mnt/data1/yarn/nm/usercache/hdfs/appcache/application_1606119213964_1657/blobStore-a0f843a0-737f-491c-9d8e-a7ec76568e07/job_24c0a330e49ff08ff3355fc9efeaa7bf/blob_p-bf8f8652268b5550585fcc184b2925c5e613a255-932296ba280a1ae7a544c24b249e68cd]

[Loaded org.apache.kafka.common.serialization.Deserializer from 
file:/mnt/data0/yarn/nm/usercache/hdfs/appcache/application_1606119213964_1657/filecache/14/flink-examples-1.0-SNAPSHOT201217_104230_459_r57.jar]
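
(For reference, class-loading logs like the two lines above can typically be
produced by enabling JVM class-loading tracing on the Flink JVMs, for example
via env.java.opts; treat this as a sketch to verify for your setup:)

    # flink-conf.yaml (sketch): print "[Loaded <class> from <jar>]" lines,
    # which then show up in the TaskManager .out files.
    env.java.opts: -verbose:class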




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2020-12-16 Georger (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17250761#comment-17250761
 ] 

Georger commented on FLINK-20617:
-

It seems that is not the problem. I have used only flink-connector-kafka-0.11_2.11 
or flink-connector-kafka_2.11, with the same result, and only in application mode. 
[~aljoscha]




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2020-12-16 Aljoscha Krettek (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17250216#comment-17250216
 ] 

Aljoscha Krettek commented on FLINK-20617:
--

I think this is because of a dependency mismatch. You're excluding 
{{kafka-clients}} and including a custom version.
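
A sketch of the dependency block without the mismatch, i.e. relying on the
kafka-clients version that flink-connector-kafka pulls in transitively instead
of excluding it and pinning 1.0.1 (whether this fits the reporter's setup is an
assumption to verify):

    <!-- Sketch: drop the kafka-clients exclusion and the explicit 1.0.1 pin,
         letting flink-connector-kafka bring in its own matching kafka-clients. -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>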




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20617) Kafka Consumer Deserializer Exception on application mode

2020-12-16 Yang Wang (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20617?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17250187#comment-17250187
 ] 

Yang Wang commented on FLINK-20617:
---

Could you add "-Dyarn.per-job-cluster.include-user-jar=DISABLED" to your 
submission command and give it a try? There is a similar discussion on the user ML.
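
A sketch of what such a submission might look like with the suggested option;
the entry class and jar path are placeholders, not taken from this issue:

    # Hypothetical application-mode submission with the suggested option.
    ./bin/flink run-application -t yarn-application \
        -Dyarn.per-job-cluster.include-user-jar=DISABLED \
        -c com.example.KafkaJob \
        ./flink-examples-1.0-SNAPSHOT.jar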




--
This message was sent by Atlassian Jira
(v8.3.4#803005)