Liting Liu (litiliu)
Subject: Re: Re: get NoSuchMethodError when using flink
flink-sql-connector-hive-2.2.0_2.11-1.14.4.jar
Hi, Liu.
It seems that you may be using other jars of your own that contain
commons-lang3 in a different version, which can cause a version conflict.
My suggestion is that you shade the commons-lang3 dependency in your jar.
Best regards,
Yuxia
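The shading suggested above could be done with a maven-shade-plugin relocation along these lines (a sketch, not taken from the thread; the plugin version and the shaded package prefix are illustrative):

```xml
<!-- Relocate the commons-lang3 bundled in the user jar so it cannot clash
     with the version shipped inside flink-sql-connector-hive. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.commons.lang3</pattern>
            <shadedPattern>myshaded.org.apache.commons.lang3</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```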
From: "Liting Liu (litiliu)"
To: "User"
Sent: Wednesday, August 31, 2022, 5:14:35 PM
Subject: get NoSuchMethodError when using flink
flink-sql-connector-hive-2.2.0_2.11-1.14.4.jar
Hi, I got a NoSuchMethodError when using
flink-sql-connector-hive-2.2.0_2.11-1.14.4.jar.
Exception in thread "main" org.apache.flink.table.client.SqlClientException:
Unexpected exception. This is a bug. Please consider filing an issue.
Hi!
I am having continued issues running a StateFun job in an existing Flink
cluster. My Flink cluster is using Flink version 1.14.3 and the StateFun job is
using version 3.2.0 of the Java SDK and StateFun distribution. I get the
following error:
Caused by: java.lang.NoSuchMethodError:
Hi,
the hbase-client package is 2.1.0 and Flink is 1.12.4.
The HBase code (it extends TableInputFormat<...>) is as follows:
try {
    connection = ConnectionFactory.createConnection(hbaseConf);
    // Table table = connection.getTable(TableName.valueOf(tableName));
    table = (HTable) connection.getTable(TableName.valueOf(tableName));
Depending on the build system used, you could check the dependency tree,
e.g. for Maven it would be `mvn dependency:tree
-Dincludes=org.apache.parquet`
Matthias
On Wed, Jun 30, 2021 at 8:40 AM Thomas Wang wrote:
> Thanks Matthias. Could you advise how I can confirm this in my environment?
>
Thanks Matthias. Could you advise how I can confirm this in my environment?
Thomas
On Tue, Jun 29, 2021 at 1:41 AM Matthias Pohl
wrote:
> Hi Rommel, Hi Thomas,
> Apache Parquet was bumped from 1.10.0 to 1.11.1 for Flink 1.12 in
> FLINK-19137 [1]. The error you're seeing looks like some
Hi Rommel, Hi Thomas,
Apache Parquet was bumped from 1.10.0 to 1.11.1 for Flink 1.12 in
FLINK-19137 [1]. The error you're seeing looks like some dependency issue
where you have a version other than 1.11.1
of org.apache.parquet:parquet-column:jar on your classpath?
Matthias
[1]
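To confirm which artifact a conflicting class is actually loaded from, a small stdlib-only probe can help (a sketch; the default class name `java.lang.String` is only a placeholder so the snippet runs anywhere, pass the suspect class, e.g. a parquet-column class, as an argument):

```java
// Prints the location a class is loaded from, which reveals the jar that
// "wins" on the classpath when two versions of a dependency are present.
public class WhichJar {
    public static void main(String[] args) {
        // Replace the placeholder with the class named in the
        // NoSuchMethodError when diagnosing a real conflict.
        String name = args.length > 0 ? args[0] : "java.lang.String";
        java.net.URL url = ClassLoader.getSystemClassLoader()
                .getResource(name.replace('.', '/') + ".class");
        System.out.println(name + " -> " + url);
    }
}
```

Running it on the job's classpath shows the jar (or JDK module) path the class resolves to.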
To give more information: we were using
parquet-avro version 1.10.0 with Flink 1.11.2 and it was running fine.
Now with Flink 1.12.1, the error message shows up.
Thank you for the help.
Rommel
On Tue, Jun 22, 2021 at 2:41 PM Thomas Wang wrote:
> Hi,
>
> We recently upgraded our Flink version from 1.11.2 to 1.12.1
Hi,
We recently upgraded our Flink version from 1.11.2 to 1.12.1 and one of our
jobs that used to run ok, now sees the following error. This error doesn't
seem to be related to any user code. Can someone help me take a look?
Thanks.
Thomas
java.lang.NoSuchMethodError:
bin/start-scala-shell.sh yarn
scala> Exception in thread "main" java.lang.NoSuchMethodError:
jline.console.completer.CandidateListCompletionHandler.setPrintSpaceAfterFullCompletion(Z)V
at
scala.tools.nsc.interpreter.jline.JLineConsoleReader.initCompletion(JLineReader.scala:139)
at
It looks like a conflict between jline and some Scala artifact. I don't know Scala well, but you could try the following:
1. In pom.xml (or wherever relevant), exclude the transitive jline dependency from hadoop (and from any other dependency that pulls in jline), and add jline as a direct dependency instead.
   The problem I ran into was a version conflict on hadoop-common: one of my dependencies included hadoop-common, so I excluded hadoop-common from that dependency and added hadoop-common directly, which solved it.
2. Change (upgrade) the Scala version.
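Step 1 above could look like this in pom.xml (a sketch; the hadoop-common and jline versions are illustrative, not taken from the thread):

```xml
<!-- Exclude the transitive jline pulled in via hadoop-common... -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.6.0</version>
  <exclusions>
    <exclusion>
      <groupId>jline</groupId>
      <artifactId>jline</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- ...and pin jline explicitly as a direct dependency. -->
<dependency>
  <groupId>jline</groupId>
  <artifactId>jline</artifactId>
  <version>2.14.6</version>
</dependency>
```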
Thanks!
Jacob
Thanks for the reply.
I have already integrated Hadoop via the `hadoop classpath` approach. The current problem occurs in a CDH 5.16.2 + Flink environment.
Adding the screenshot information that was lost:
Running the Scala REPL in YARN mode throws a NoSuchMethodError; the detailed error is:
$ ./bin/start-scala-shell.sh yarn
scala> Exception in thread "main" java.lang.NoSuchMethodError:
hi,
your screenshot doesn't seem to have uploaded. From your description it's probably a NoSuchMethodError. I hit a similar problem a few days ago while upgrading the Flink version; it was solved by exporting the hadoop classpath (export HADOOP_CLASSPATH=`hadoop classpath`). If that doesn't solve your problem, try putting flink-shaded-hadoop-2-uber*-*.jar under flink/lib.
Thanks!
Jacob
--
Sent from: http://apache-flink.147419.n8.nabble.com/
Hi all, could you help me look into a problem?
Running the Scala REPL in YARN mode throws a NoSuchMethodError; screenshot below:
$ ./bin/start-scala-shell.sh yarn
Environment:
CDH 5.16.2
The problem reproduces on both Flink 1.10.2 and 1.11.2.
What I have analyzed so far:
Inspecting the loaded classes with Arthas shows that the CDH dependencies are the ones loaded.
After deleting the CDH jar jline-2.11.jar, the NoSuchMethodError no longer occurs. But Arthas did not find
hi,
from the error message it looks like a jar conflict. Could you post the relevant dependencies?
-
Best Wishes
JasonLee
Hi,
which Flink version are you using? From the error it looks like you are using the legacy planner; you could try the blink planner instead.
Best,
Hailong
On 2020-12-03 10:02:08, "18293503878" <18293503...@163.com> wrote:
>When using Flink SQL's tumble function and converting the result table to a stream, has anyone seen the following exception?
>Exception in thread "main" java.lang.NoSuchMethodError:
Hi,
In 1.10 there is no
'Lorg/apache/flink/fs/s3presto/common/HadoopConfigLoader' . So I think
there might be a legacy S3FileSystemFactory in your jar. You could check
whether there is a 'org.apache.flink.fs.s3presto.common.HadoopConfigLoader'
in your jar or not. If there is one you could remove
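A quick way to check whether such a class is reachable on the classpath at all is a stdlib-only probe like the following (a sketch; the default class name is the one discussed above and will normally print NOT unless your jar really bundles it):

```java
// Reports whether a class name can be resolved on the current classpath,
// without initializing the class. Useful for spotting stale classes that
// a leftover legacy jar still provides.
public class ClassPresence {
    public static void main(String[] args) {
        String name = args.length > 0 ? args[0]
                : "org.apache.flink.fs.s3presto.common.HadoopConfigLoader";
        try {
            // 'false' = do not run static initializers, just resolve it.
            Class.forName(name, false, ClassPresence.class.getClassLoader());
            System.out.println(name + " IS on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println(name + " is NOT on the classpath");
        }
    }
}
```

Run it with the job's full classpath (or inside the image) so it sees the same jars the job does.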
Hello,
I'm trying to upgrade Flink from 1.7 to 1.10 retaining our Hadoop
integration. I copied the jar
file flink-shaded-hadoop-2-uber-2.7.5-10.0.jar into /opt/flink/lib. I also
copied the files flink-s3-fs-hadoop-1.10.0.jar and
flink-s3-fs-presto-1.10.0.jar into /opt/flink/plugins/s3 folder.
in rocksdbjni-5.17.2 (or we can say
> frocksdbjni-5.17.2-artisans-1.0 in Flink-1.8). That's why you come across
> this NoSuchMethodError exception.
>
> If not necessary, please do not assemble the rocksdbjni package in your user
> code jar, as flink-dist already provides all needed classes.
(or we can say
frocksdbjni-5.17.2-artisans-1.0 in Flink-1.8). That's why you come across this
NoSuchMethodError exception.
If not necessary, please do not assemble the rocksdbjni package in your user code
jar, as flink-dist already provides all needed classes. Moreover, adding
dependency of flink
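One way to keep rocksdbjni out of the user fat jar, as advised above, is to give the state backend dependency provided scope in the POM (a sketch; the version matches the Flink 1.8.0 mentioned in the thread, and the Scala suffix is an assumption):

```xml
<!-- Provided scope: compile against the RocksDB state backend, but let
     flink-dist supply the classes at runtime instead of bundling them. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-statebackend-rocksdb_2.11</artifactId>
  <version>1.8.0</version>
  <scope>provided</scope>
</dependency>
```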
Hi,
I am using Apache Beam 2.14.0 with Flink 1.8.0 and I have included the
RocksDB dependency in my project's pom.xml as well as baked it into the
Dockerfile like this:
FROM flink:1.8.0-scala_2.11
ADD --chown=flink:flink
Hi Lakeshen,
Thanks for trying out blink planner.
First question, are you using blink-1.5.1 or flink-1.9-table-planner-blink
?
We suggest using the latter because blink-1.5.1 is no longer maintained; you
can try Flink 1.9 instead.
Best,
Jark
On Tue, 30 Jul 2019 at 17:02, LakeShen wrote:
> Hi
Hi all, when I use the blink flink-sql-parser module, the Maven dependency looks
like this:
    <dependency>
      <groupId>com.alibaba.blink</groupId>
      <artifactId>flink-sql-parser</artifactId>
      <version>1.5.1</version>
    </dependency>
I also import the Flink 1.9 blink-table-planner module. I
use FlinkPlannerImpl to parse the SQL to get the List. But
when I run the program, it throws an exception like
Hi All,
I use the following code to try to build a RestClient:
org.elasticsearch.client.RestClient.builder(new HttpHost(xxx, xxx, "http")).build()
but at run time a NoSuchMethodError is thrown. I think the
reason is:
there are two RestClient classes, one in the jar I include, and the other
in flink-connector-elasticsearch5, but the argument of the build method
Hi,
I am trying to run the code examples from the Gelly documentation, in
particular this code:
import org.apache.flink.api.scala._
import org.apache.flink.graph.generator.GridGraph
object SampleObject {
def main(args: Array[String]) {
val env =
Hi All,
After successfully writing the word count program, I was trying to create a
streaming application but am getting the error below when submitting the job in
local mode.
Vishnus-MacBook-Pro:flink vishnu$ flink run
target/scala-2.11/flink-vishnu_2.11-1.0.jar
java.lang.NoSuchMethodError:
I'm sorry that we changed the method name between minor versions.
We'll soon bring some infrastructure in place to a) mark the audience of
classes and b) ensure that public APIs are stable.
On Wed, Sep 2, 2015 at 9:04 PM, Ferenc Turi wrote:
> Ok. As I see only the method name
Ok. As I see, only the method name was changed. It was an unnecessary
modification which caused the incompatibility.
F.
On Wed, Sep 2, 2015 at 8:53 PM, Márton Balassi
wrote:
> Dear Ferenc,
>
> The Kafka consumer implementation was modified from 0.9.0 to 0.9.1,
>
Hi,
I tried to use the latest 0.9.1 release but I got:
java.lang.NoSuchMethodError:
org.apache.flink.util.NetUtils.ensureCorrectHostnamePort(Ljava/lang/String;)V
at
com.nventdata.kafkaflink.sink.FlinkKafkaTopicWriterSink.(FlinkKafkaTopicWriterSink.java:69)
at
Dear Ferenc,
The Kafka consumer implementation was modified from 0.9.0 to 0.9.1; please
use the new code. [1]
I suspect that your com.nventdata.kafkaflink.sink.FlinkKafkaTopicWriterSink
depends on the way the Flink code used to look in 0.9.0; if you take a
closer look, Robert changed the