
Error when submitting a Flink job

I wrote a Flink CDC program, uploaded it to the server, and got the following error when running it:

Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded. For a full list of supported file systems, please see https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/.
        at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:531)
        at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:408)
        at org.apache.flink.core.fs.Path.getFileSystem(Path.java:274)
        at org.apache.flink.runtime.state.filesystem.FsCheckpointStorageAccess.<init>(FsCheckpointStorageAccess.java:64)
        at org.apache.flink.runtime.state.filesystem.FsStateBackend.createCheckpointStorage(FsStateBackend.java:518)
        at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(CheckpointCoordinator.java:331)
        ... 19 more
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
        at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(UnsupportedSchemeFactory.java:55)
        at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:527)
        ... 24 more

The message above means the job cannot access HDFS.

But I had just installed Flink 1.12, so why couldn't it access HDFS? It turns out that Flink needs the matching Hadoop jars on its classpath before it can connect to HDFS, so the fix is simply to download them:

https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop-3-uber/3.1.1.7.2.8.0-224-9.0

https://mvnrepository.com/artifact/commons-cli/commons-cli/1.4

After downloading the two jars above, upload them to the lib folder of your Flink 1.12 installation and the problem is solved. Remember to restart the Flink cluster.
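The steps above can be sketched as a few shell commands. This is a minimal sketch, assuming a standalone Flink 1.12 cluster whose install directory is in `$FLINK_HOME` (an assumed variable; adjust to your environment) and that the two jars from the Maven pages above have already been downloaded into the current directory:

```shell
# Assumption: Flink 1.12 is installed at $FLINK_HOME; change to match your setup.
FLINK_HOME=${FLINK_HOME:-/opt/flink-1.12.0}

# 1. Copy the two downloaded jars into lib/, which Flink adds to the
#    classpath of its JobManager and TaskManager processes at startup.
cp flink-shaded-hadoop-3-uber-3.1.1.7.2.8.0-224-9.0.jar "$FLINK_HOME/lib/"
cp commons-cli-1.4.jar "$FLINK_HOME/lib/"

# 2. Restart the cluster so the running JVMs pick up the new jars.
"$FLINK_HOME/bin/stop-cluster.sh"
"$FLINK_HOME/bin/start-cluster.sh"
```

The restart matters: lib/ is only scanned when the Flink processes start, so copying the jars into a running cluster has no effect until it is restarted.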

Reprinted from www.mshxw.com.
Original article: https://www.mshxw.com/it/735640.html