Upload mysql-connector-jdbc-xxx.jar (the matching version) to the lib directory of the Flink cluster and restart the cluster (if you use multiple connectors, you need a different Maven packaging plugin to resolve the conflict). Then submit the jar built without dependencies (a jar bundled with dependencies may cause jar conflicts).

Error 2: Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in the classpath. Reason: Required context properties mismatch. The following properties are requested: connector.driver=com.mysql.jdbc.Driver

Upload flink-connector-hive-xxx.jar and hive-exec-xxx.jar (the matching versions) to the lib directory of the Flink cluster and restart the cluster. Then submit the jar built without dependencies (a jar bundled with dependencies may cause jar conflicts).

Error 3: Caused by: java.lang.NoClassDefFoundError: org/apache/flink/table/catalog/hive/HiveCatalog at com.MysqlToHiveSql.main(MysqlToHiveSql.java:90) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at

Upload hive-exec-xxx.jar (the matching version) to the lib directory of the Flink cluster and restart the cluster. Then submit the jar built without dependencies (a jar bundled with dependencies may cause jar conflicts).

Method 2:
Error 1: Submitting the program via Flink's remote submission fails with: Caused by: java.lang.ClassNotFoundException: org.apache.thrift.TException at java.net.URLClassLoader.findClass(URLClassLoader.java:381) at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
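All of the errors above boil down to a class missing from the classpath. Before re-submitting, it can help to verify which of the classes named in the stack traces are actually loadable. Below is a small stdlib-only sketch (a hypothetical helper, not part of the original post) that could be run with the same classpath as the job:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: probe which of the classes behind the errors
// above are actually visible on the current classpath.
public class ClasspathProbe {
    public static List<String> missing(String... classNames) {
        List<String> absent = new ArrayList<>();
        for (String cls : classNames) {
            try {
                Class.forName(cls);   // loadable: the matching jar is on the classpath
            } catch (ClassNotFoundException e) {
                absent.add(cls);      // not loadable: the matching jar still needs uploading
            }
        }
        return absent;
    }

    public static void main(String[] args) {
        // The classes named in errors 1-3 and method 2 error 1
        System.out.println(missing(
                "com.mysql.jdbc.Driver",
                "org.apache.flink.table.catalog.hive.HiveCatalog",
                "org.apache.thrift.TException"));
    }
}
```

Any class it reports as missing tells you which jar (mysql-connector, flink-connector-hive/hive-exec, etc.) still has to go into the cluster's lib directory.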
The pom file needs the relevant dependencies:

    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-jdbc_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.44</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>2.3.6</version>
    </dependency>

Job submission - related code (slightly different from the code in the job jar because the environment differs; everything else must be identical, otherwise it will not run):
public static void main(String[] args) throws Exception {
    //Path of the local Maven repository holding the dependency jars
    String mavenPath = "D:/apache-maven/maven-repository/";
    //Environment for submitting the job (used for remote submission; inside the job jar itself, the normal local environment is used instead).
    //The first jar is the job jar built without dependencies; the jars after it are the dependencies the job needs.
    StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment("192.168.88.108",
            8081,
            1,
            "Flink-Test/target/Flink-Test-1.0-SNAPSHOT.jar",
            mavenPath + "org/apache/flink/flink-connector-jdbc_2.11/1.12.1/flink-connector-jdbc_2.11-1.12.1.jar",
            mavenPath + "org/apache/flink/flink-connector-hive_2.11/1.12.1/flink-connector-hive_2.11-1.12.1.jar",
            mavenPath + "org/apache/hive/hive-exec/2.3.6/hive-exec-2.3.6.jar",
            mavenPath + "mysql/mysql-connector-java/5.1.44/mysql-connector-java-5.1.44.jar");
    EnvironmentSettings settings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
    StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);
    //Large block of code omitted here (identical to the code in the job jar)
    ..........................................
    HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
    tableEnv.registerCatalog(name, hive);
    tableEnv.useCatalog(name);
    tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
    tableEnv.useDatabase("test");
    //Large block of code omitted here (identical to the code in the job jar)
    .....................................
    env.execute();
}
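The jar paths passed to createRemoteEnvironment above follow the standard local Maven repository layout: groupId with dots replaced by slashes, then artifactId, then version, then artifactId-version.jar. A small hypothetical helper (not from the original post) illustrating that layout:

```java
// Hypothetical helper: turn Maven coordinates "groupId:artifactId:version"
// into the repository-relative jar path used in the createRemoteEnvironment call.
public class MavenJarPath {
    public static String jarPath(String coords) {
        String[] p = coords.split(":");         // [groupId, artifactId, version]
        String group = p[0].replace('.', '/');  // org.apache.hive -> org/apache/hive
        return group + "/" + p[1] + "/" + p[2] + "/" + p[1] + "-" + p[2] + ".jar";
    }

    public static void main(String[] args) {
        System.out.println(jarPath("org.apache.hive:hive-exec:2.3.6"));
        // org/apache/hive/hive-exec/2.3.6/hive-exec-2.3.6.jar
    }
}
```

Prefixing the result with mavenPath reproduces the dependency paths shown above, which makes it easier to add further connector jars without typos.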
Other
Error 4: Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration at com.MysqlToHiveSql.main(MysqlToHiveSql.java:110) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Configure the environment variable:
# Place this after HADOOP_HOME is set and after export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_CLASSPATH=$(hadoop classpath)
Example (maven-assembly-plugin, building the jar-with-dependencies):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.3.0</version>
        <configuration>
            <descriptorRefs>
                <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <archive>
                <manifest>
                    <mainClass>com.MysqlToHiveSqlProp</mainClass>
                </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                <id>make-assembly</id>
                <phase>package</phase>
                <goals>
                    <goal>single</goal>
                </goals>
            </execution>
        </executions>
    </plugin>
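The note above about multiple connectors needing a different Maven packaging plugin usually points at maven-shade-plugin: several Flink connectors ship META-INF/services SPI files under the same name, and the assembly plugin keeps only one of them, so the other connectors' table factories are lost. A sketch of such a configuration (plugin version and main class are illustrative assumptions):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- merge META-INF/services files so every connector's factory survives -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.MysqlToHiveSqlProp</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```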



