
Flink Web UI: errors when submitting a jar packaged with dependencies ("xxxx-jar-with-dependencies")


Approach 1:

Error 1:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in the classpath.
Reason: Required context properties mismatch. The following properties are requested: connector.driver=com.mysql.jdbc.Driver

Fix: upload mysql-connector-java-xxx.jar (the matching version) to the lib directory of the Flink cluster and restart the cluster (if you use several connectors, a different Maven packaging plugin is required; see the solution at the end).

Then submit and run the jar built without dependencies (the jar with bundled dependencies can cause jar conflicts).

Error 2:
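The upload-and-restart step can be sketched as the following shell commands (a sketch for a standalone cluster; the /opt/flink install path is an assumption, and the copy must be repeated on every node):

```shell
# Assumption: FLINK_HOME points at the Flink installation on each node.
FLINK_HOME=/opt/flink
# Copy the JDBC driver into Flink's lib directory (repeat on all nodes).
cp mysql-connector-java-5.1.44.jar "$FLINK_HOME/lib/"
# Restart the standalone cluster so the new jar is picked up.
"$FLINK_HOME/bin/stop-cluster.sh"
"$FLINK_HOME/bin/start-cluster.sh"
```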

Caused by: java.lang.NoClassDefFoundError: org/apache/flink/table/catalog/hive/HiveCatalog
    at com.MysqlToHiveSql.main(MysqlToHiveSql.java:90)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at ...

Fix: upload flink-connector-hive-xxx.jar and hive-exec-xxx.jar (matching versions) to the Flink cluster's lib directory and restart the cluster.

Then submit and run the jar built without dependencies (the jar with bundled dependencies can cause jar conflicts).

Error 3:

Caused by: java.lang.ClassNotFoundException: org.apache.thrift.TException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

Fix: upload hive-exec-xxx.jar (the matching version) to the Flink cluster's lib directory and restart the cluster.
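All three errors come down to a class missing from the classpath at runtime. A minimal preflight check (a hypothetical helper, not part of Flink) can verify that a required class is loadable; run it with the same classpath you expect the job to see:

```java
// Hypothetical helper: check whether a class can be loaded from the
// current classpath before submitting the job.
public class ClasspathCheck {
    static boolean isOnClasspath(String className) {
        try {
            // initialize=false: just locate the class, do not run static initializers
            Class.forName(className, false, ClasspathCheck.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isOnClasspath("java.lang.String"));             // always true
        // true only if hive-exec (or libthrift) is on the classpath:
        System.out.println(isOnClasspath("org.apache.thrift.TException"));
    }
}
```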

Then submit and run the jar built without dependencies (the jar with bundled dependencies can cause jar conflicts).

Approach 2:

Error 1:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in the classpath.
Reason: Required context properties mismatch. The following properties are requested: connector.driver=com.mysql.jdbc.Driver

Error 2:

Caused by: java.lang.NoClassDefFoundError: org/apache/flink/table/catalog/hive/HiveCatalog
    at com.MysqlToHiveSql.main(MysqlToHiveSql.java:90)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at ...

Error 3:

Caused by: java.lang.ClassNotFoundException: org.apache.thrift.TException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

Fix: use Flink's remote submission to submit the program.

The pom file needs the following dependencies:


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.44</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>2.3.6</version>
        </dependency>
Submission code (it differs slightly from the code inside the task jar because the execution environment is created differently; everything else must be identical, otherwise the job will not run):
public static void main(String[] args) throws Exception {
    // Inside the task jar the environment is created locally;
    // that code is omitted here (used only by the task jar itself).

    String mavenPath = "D:/apache-maven/maven-repository/";

    // Environment for remote submission: the first jar is the task jar
    // built WITHOUT dependencies; the remaining jars are its dependencies.
    StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment("192.168.88.108",
            8081,
            1,
            "Flink-Test/target/Flink-Test-1.0-SNAPSHOT.jar",
            mavenPath + "org/apache/flink/flink-connector-jdbc_2.11/1.12.1/flink-connector-jdbc_2.11-1.12.1.jar",
            mavenPath + "org/apache/flink/flink-connector-hive_2.11/1.12.1/flink-connector-hive_2.11-1.12.1.jar",
            mavenPath + "org/apache/hive/hive-exec/2.3.6/hive-exec-2.3.6.jar",
            mavenPath + "mysql/mysql-connector-java/5.1.44/mysql-connector-java-5.1.44.jar");

    EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
    StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);

    // ... (large block omitted; identical to the code in the task jar)

    HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
    tableEnv.registerCatalog(name, hive);
    tableEnv.useCatalog(name);
    tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
    tableEnv.useDatabase("test");

    // ... (large block omitted; identical to the code in the task jar)

    env.execute();
}
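The hard-coded paths above follow the standard local Maven repository layout: groupId with dots replaced by slashes, then artifactId, version, and the jar file name. A small helper (hypothetical, not part of Flink) can build them instead of concatenating strings by hand:

```java
// Hypothetical helper: build the path of a dependency jar inside a local
// Maven repository from its coordinates (groupId:artifactId:version).
public class MavenJarPath {
    static String jarPath(String repoRoot, String groupId, String artifactId, String version) {
        // repoRoot is assumed to end with "/", e.g. "D:/apache-maven/maven-repository/"
        return repoRoot + groupId.replace('.', '/') + "/" + artifactId + "/"
                + version + "/" + artifactId + "-" + version + ".jar";
    }

    public static void main(String[] args) {
        String mavenPath = "D:/apache-maven/maven-repository/";
        System.out.println(jarPath(mavenPath, "org.apache.hive", "hive-exec", "2.3.6"));
        // D:/apache-maven/maven-repository/org/apache/hive/hive-exec/2.3.6/hive-exec-2.3.6.jar
    }
}
```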
              

Other

Error 4:

Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
    at com.MysqlToHiveSql.main(MysqlToHiveSql.java:110)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

Fix: configure the environment variable:
# Place this after HADOOP_HOME is set and after export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_CLASSPATH=$(hadoop classpath)
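For example, in /etc/profile (or ~/.bashrc) the ordering would look like this (a sketch; /opt/hadoop is an assumed install path):

```shell
export HADOOP_HOME=/opt/hadoop           # assumed install path
export PATH=$PATH:$HADOOP_HOME/bin
# Must come after the two lines above so the `hadoop` command is resolvable:
export HADOOP_CLASSPATH=$(hadoop classpath)
```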

Restart the Flink cluster and the error is resolved.

Note: the Maven packaging plugin used is:

         
             
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>3.3.0</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                    <archive>
                        <manifest>
                            <mainClass>com.MysqlToHiveSqlProp</mainClass>
                        </manifest>
                    </archive>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
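With this plugin bound to the package phase, one build produces both jars; the thin one is the one to submit once the dependencies sit in the cluster's lib directory:

```shell
mvn clean package
# target/ then contains both the thin jar (Flink-Test-1.0-SNAPSHOT.jar)
# and the fat jar (Flink-Test-1.0-SNAPSHOT-jar-with-dependencies.jar)
```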

Source: https://www.mshxw.com/it/746911.html