Hi,
Judging from the error log, the filesystem-related configuration (or jar) seems to be the problem. Take a look at this mailing-list thread [1] to see whether it resolves your issue.
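For context, the "Hadoop is not in the classpath/dependencies" error usually means the Flink image ships without any Hadoop filesystem support. A minimal sketch of one common fix, assuming you build a custom image on top of the official Flink image (the shaded-Hadoop jar version below is only an example; pick one compatible with your Hadoop cluster):

```dockerfile
# Sketch: bake a shaded Hadoop uber jar into Flink's lib/ directory
# so the 'hdfs://' scheme can be loaded. Version here is illustrative.
FROM flink:1.11.1
ADD https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar /opt/flink/lib/
```

Alternatively, if a Hadoop distribution is already installed inside the image, exporting `HADOOP_CLASSPATH=$(hadoop classpath)` in the container environment should have the same effect.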

[1]
http://apache-flink.147419.n8.nabble.com/Flink-1-11-1-on-k8s-hadoop-td5779.html#a5834
Best,
Congxian


superainbower <[email protected]> wrote on Wed, Sep 30, 2020 at 3:04 PM:

> To add more detail, here is my error log:
> Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException:
> Hadoop is not in the classpath/dependencies.
> Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException:
> Could not find a file system implementation for scheme 'hdfs'. The scheme
> is not directly supported by Flink and no Hadoop file system to support
> this scheme could be loaded. For a full list of supported file systems,
> please see
> https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/.
>
>
> It looks like the Hadoop path is missing. How should this be configured on K8s?
> superainbower
> [email protected]
>
>
> On Sep 30, 2020 at 14:33, superainbower <[email protected]> wrote:
> Hi all,
> Could anyone advise on configuring the state backend for Flink on K8s? Besides adding the following settings to flink-conf.yaml, what else needs to be done?
> state.backend: rocksdb
> state.checkpoints.dir: hdfs://master:8020/flink/checkpoints
> state.savepoints.dir: hdfs://master:8020/flink/savepoints
> state.backend.incremental: true
>
>
>
>
