Metastore schema initialization command:
schematool -dbType mysql -initSchema
Console log (metastore initialization failed):
# bin/hive — entering the client and querying data fails with:
Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHivemetaStor
# error reported during metastore initialization:
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
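A `Communications link failure` almost always means Hive cannot reach the MySQL server named in `javax.jdo.option.ConnectionURL`. A minimal sketch of a reachability probe, assuming bash (for its `/dev/tcp` pseudo-device) and coreutils `timeout` are available; the host and port are the ones from this article's configuration:

```shell
#!/usr/bin/env bash
# Probe a TCP port to see whether the MySQL server is reachable from this host.
# Returns 0 if the port accepts a connection, non-zero otherwise.
mysql_reachable() {
  local host="$1" port="$2"
  # /dev/tcp/<host>/<port> is a bash pseudo-device; opening it fails if the
  # port is closed, and `timeout` caps the wait for unresponsive hosts.
  timeout 2 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null
}

# Host/port taken from the ConnectionURL in this article's hive-site.xml.
if mysql_reachable 192.168.1.212 3306; then
  echo "MySQL port is reachable"
else
  echo "MySQL port is NOT reachable -- check mysqld, the firewall, and bind-address"
fi
```

If the port is closed, check that `mysqld` is running, that no firewall blocks 3306, and that MySQL's `bind-address` allows remote connections.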
Console log (metastore initialization succeeded):
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/sweet/software/apache-hive-2.3.6-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/sweet/software/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
metastore connection URL: jdbc:mysql://master:3306/metastore?createDatabaseIfNotExist=true
metastore Connection Driver : com.mysql.jdbc.Driver
metastore connection User: root
Starting metastore schema initialization to 2.3.0
Initialization script hive-schema-2.3.0.mysql.sql
Initialization script completed
schemaTool completed
The log echoes the metastore-related settings read from hive-site.xml (connection URL, driver class, user).
This is a success log: the run ends with the line "schemaTool completed".
On failure, the log states the cause instead; the failure log for this run was not saved.
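Since success is signaled by the literal line "schemaTool completed", the check can be scripted. A minimal sketch; the log file path and the wrapper function name are hypothetical, not part of the schematool CLI:

```shell
#!/usr/bin/env bash
# Decide whether a schematool run succeeded by scanning its captured output.
# schematool prints "schemaTool completed" on success; on failure it prints
# the cause (e.g. a CommunicationsException when MySQL is unreachable).
schema_init_ok() {
  grep -q "schemaTool completed" "$1"
}

# Hypothetical usage: capture the output, then test it.
#   schematool -dbType mysql -initSchema 2>&1 | tee /tmp/schematool.log
#   schema_init_ok /tmp/schematool.log && echo "metastore initialized"
```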
The Hive configuration (hive-site.xml) used here:
hive.exec.scratchdir = /home/hadoop/hive/tmp -- HDFS scratch (root) directory for Hive jobs
hive.scratch.dir.permission = 733 -- write permission used when creating the scratch directory
hive.metastore.warehouse.dir = /home/hadoop/hive/warehouse -- HDFS location where Hive data is stored
javax.jdo.option.ConnectionURL = jdbc:mysql://192.168.1.212:3306/hive?createDatabaseIfNotExist=true -- JDBC connection URL for the metastore database
javax.jdo.option.ConnectionDriverName = com.mysql.jdbc.Driver -- JDBC driver class
javax.jdo.option.ConnectionUserName = root -- database user name
javax.jdo.option.ConnectionPassword = root -- database user password
hive.cli.print.header = true -- show column headers for query results in the CLI
hive.cli.print.current.db = true -- show the current database name in the CLI prompt
hive.metastore.schema.verification = false -- disable schema version verification
hive.metastore.local = false -- use a remote (not embedded) metastore
hive.metastore.uris = thrift://master:9083 -- metastore service on the master node
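In hive-site.xml form, the connection-related settings above correspond to a fragment like the following (values copied from the list; only the metastore/JDBC subset is shown, the remaining properties follow the same `<property>` pattern):

```xml
<!-- Excerpt of hive-site.xml corresponding to the settings listed above -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.1.212:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>root</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://master:9083</value>
  </property>
</configuration>
```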



