Configuring Flink to connect to Hive, and fixing the error "Embedded metastore is not allowed. Make sure you have set a valid value for hive.metastore.uris"
1. Import the dependencies in Maven
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hive_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>3.1.2</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.1.3</version>
</dependency>
2. Start the Hive metastore service on the server where Hive is installed
nohup hive --service metastore >/dev/null 2>&1 &
3. Connect to Hive
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.SqlDialect;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

System.setProperty("HADOOP_USER_NAME", "root");
// 1. Stream execution environment
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setParallelism(1);
// 2. Table execution environment
StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);
// TODO 3. Create the HiveCatalog
String name = "myhive";                // catalog name
String defaultDatabase = "flink_test"; // default database
// Directory holding the Hive config; hive-site.xml must be copied into it.
// Note the backslashes must be escaped in a Java string literal.
String hiveConfDir = "D:\\java\\learn\\flink\\hive";
HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
// TODO 4. Register the catalog
tableEnv.registerCatalog(name, hive);
// TODO 5. Make it the current catalog and database
tableEnv.useCatalog(name);
tableEnv.useDatabase(defaultDatabase);
// TODO 6. Switch the SQL dialect to Hive syntax
tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
// TODO 7. Query the data
tableEnv.executeSql("select * from stu").print();
The following error appeared when connecting:
Exception in thread "main" java.lang.IllegalArgumentException: Embedded metastore is not allowed. Make sure you have set a valid value for hive.metastore.uris
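The error means that hive.metastore.uris is missing from the configuration that HiveCatalog loads: without a remote metastore URI, the Hive client would fall back to an embedded (local Derby) metastore, which Flink rejects. A minimal sketch of that kind of validation, using a hypothetical helper class (not Flink's actual code):

```java
import java.util.HashMap;
import java.util.Map;

public class MetastoreUriCheck {
    // Loosely mimics the check behind the error: if hive.metastore.uris is
    // missing or blank, fail fast instead of falling back to an embedded
    // metastore. Hypothetical helper for illustration only.
    static void validate(Map<String, String> hiveConf) {
        String uris = hiveConf.get("hive.metastore.uris");
        if (uris == null || uris.trim().isEmpty()) {
            throw new IllegalArgumentException(
                "Embedded metastore is not allowed. "
                + "Make sure you have set a valid value for hive.metastore.uris");
        }
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        try {
            validate(conf); // no uris set -> same error as in this post
        } catch (IllegalArgumentException e) {
            System.out.println("caught: " + e.getMessage());
        }
        // Once the property is present, the check passes.
        conf.put("hive.metastore.uris", "thrift://111.230.175.183:9083");
        validate(conf);
        System.out.println("ok");
    }
}
```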
Further checking showed that the following property (not a dependency) had to be added to hive-site.xml:
<property>
    <name>hive.metastore.uris</name>
    <value>thrift://111.230.175.183:9083</value>
</property>
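Before restarting the job, the value can be sanity-checked by parsing it with the JDK's java.net.URI: a typo in the scheme or a malformed host/port shows up immediately as a null host or a port of -1. A quick sketch using this post's address (the check itself is not part of the fix):

```java
import java.net.URI;

public class ParseMetastoreUri {
    public static void main(String[] args) {
        // The value from hive-site.xml; a well-formed thrift URI should
        // yield a non-null host and a positive port.
        URI uri = URI.create("thrift://111.230.175.183:9083");
        System.out.println(uri.getScheme()); // expected: thrift
        System.out.println(uri.getHost());   // expected: 111.230.175.183
        System.out.println(uri.getPort());   // expected: 9083
    }
}
```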



