
Installing Hive 3.1.2

Hive download: https://downloads.apache.org/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz
Mirror: https://dlcdn.apache.org/hive/hive-3.1.2/

Hive wiki home: https://cwiki.apache.org/confluence/display/Hive/Home
Installation guide: https://cwiki.apache.org/confluence/display/Hive/GettingStarted
Tutorial: https://cwiki.apache.org/confluence/display/Hive/Tutorial

Prerequisites:
Hadoop + JDK + environment variables + MySQL 8.0.23

Hive's SQL dialect is similar to MySQL's.

1. Extract the archive and set environment variables

Extract:
tar -zxvf apache-hive-3.1.2-bin.tar.gz
Rename:
mv apache-hive-3.1.2-bin apache-hive-3.1.2
Edit the profile:
vi /etc/profile
Append:
#hive
export HIVE_HOME=/apps/bigdata/apache-hive-3.1.2
export PATH=$HIVE_HOME/bin:$PATH
Reload:
source /etc/profile
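A quick sanity check that the variables took effect in the current shell (the expected values assume the paths used in this guide):

```shell
# Re-read the profile, then confirm HIVE_HOME is set and its bin is on PATH.
source /etc/profile
echo "$HIVE_HOME"                                  # expect /apps/bigdata/apache-hive-3.1.2
echo "$PATH" | tr ':' '\n' | grep -x "$HIVE_HOME/bin"
```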

2. Metastore backend: the default is embedded Derby; production should use MySQL (used here)

Embedded Derby (testing only):
Starting with bin/hive versus cd-ing into bin and running ./hive behaves differently: Derby creates its metastore_db in the current working directory, so every launch location gets its own metadata that cannot be shared.

MySQL setup:
Edit conf/hive-site.xml. Copy the template and rename it:
cp hive-default.xml.template hive-site.xml


<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.189.10:3306/hive?createDatabaseIfNotExist=true</value>
  <description>
    JDBC connect string for a JDBC metastore.
    To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the connection URL.
    For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
  </description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.cj.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
  <description>Username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>rootpassword</value>
  <description>password to use against metastore database</description>
</property>

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/hive/warehouse</value>
</property>

<property>
  <name>hive.exec.scratchdir</name>
  <value>/hive/tmp</value>
</property>

<property>
  <name>hive.querylog.location</name>
  <value>/hive/log</value>
</property>

Edit conf/hive-env.sh. Copy the template and rename it:
cp hive-env.sh.template hive-env.sh

Point it at Hadoop:
export HADOOP_HOME=/apps/bigdata/hadoop-3.2.2
Point it at Hive's conf directory:
export HIVE_CONF_DIR=/apps/bigdata/apache-hive-3.1.2/conf
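The two exports can also be appended non-interactively instead of editing the file by hand; a small sketch, assuming the install paths used in this guide:

```shell
# Append both exports to hive-env.sh without opening an editor.
HIVE_ENV=/apps/bigdata/apache-hive-3.1.2/conf/hive-env.sh
cat >> "$HIVE_ENV" <<'EOF'
export HADOOP_HOME=/apps/bigdata/hadoop-3.2.2
export HIVE_CONF_DIR=/apps/bigdata/apache-hive-3.1.2/conf
EOF
```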

3. Install MySQL 8.0.23 (on hadoop1)

Note: skip-grant-tables temporarily disables authentication so the root password can be reset; it is appended as the last line of /etc/my.cnf on purpose, because sed -i '$d' below removes exactly the last line.

echo "---------1----------"
echo "Installing the MySQL RPM packages"
rpm -ivh mysql-community-common-8.0.23-1.el7.x86_64.rpm
rpm -ivh mysql-community-client-plugins-8.0.23-1.el7.x86_64.rpm
rpm -ivh mysql-community-libs-8.0.23-1.el7.x86_64.rpm --force --nodeps
rpm -ivh mysql-community-client-8.0.23-1.el7.x86_64.rpm
rpm -ivh mysql-community-server-8.0.23-1.el7.x86_64.rpm --force --nodeps
echo "---------2----------"
echo "Appending settings to /etc/my.cnf"
echo "#default server character set: UTF-8" >> /etc/my.cnf
echo "character_set_server=utf8" >> /etc/my.cnf
echo "#force UTF-8 on new client connections" >> /etc/my.cnf
echo "init_connect='SET NAMES utf8'" >> /etc/my.cnf
echo "#table name case sensitivity: 1 = insensitive (default 0)" >> /etc/my.cnf
echo "lower_case_table_names = 1" >> /etc/my.cnf
echo "skip-grant-tables" >> /etc/my.cnf
sleep 5s
echo "---------3----------"
systemctl start mysqld
echo "---------4----------"
mysql
update mysql.user set host='%' where user='root';
SHOW VARIABLES LIKE 'validate_password%';
set global validate_password.policy=0;
set global validate_password.length=1;
flush privileges;
alter user root identified with mysql_native_password by 'rootpassword';
flush privileges;
#quit the mysql client
exit
#check the service
systemctl status mysqld
systemctl stop mysqld
echo 'Removing skip-grant-tables'
sed -i '$d' /etc/my.cnf
systemctl start mysqld
#log in with the new password
mysql -uroot -prootpassword
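To verify that root now accepts remote connections (Hive on hadoop2/hadoop3 will reach this metastore over the network), a quick check using the credentials set above:

```shell
# host should show '%' and plugin should show mysql_native_password.
mysql -uroot -prootpassword -e "SELECT user, host, plugin FROM mysql.user WHERE user='root';"
```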

4. Create the HDFS directories

hive.metastore.warehouse.dir defaults to /user/hive/warehouse. The official guide says to create /tmp and /user/hive/warehouse on HDFS and make them group-writable; since hive-site.xml above overrides these paths, create /hive/tmp, /hive/warehouse, and /hive/log instead. Because the environment variables are configured, hadoop can be run directly; there is no need to cd into hadoop-3.2.2 and run bin/hadoop fs -mkdir /tmp.

hadoop fs -mkdir -p /hive/tmp
hadoop fs -mkdir -p /hive/warehouse
hadoop fs -mkdir -p /hive/log

hadoop fs -chmod 777 /hive/tmp
hadoop fs -chmod 777 /hive/warehouse
hadoop fs -chmod 777 /hive/log
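If the commands above succeeded, listing /hive should show all three directories with rwxrwxrwx permissions:

```shell
# Expect log, tmp, and warehouse, each with mode drwxrwxrwx.
hadoop fs -ls /hive
```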

5. Initialize the metastore schema (run from /apps/bigdata/apache-hive-3.1.2/bin)

./schematool -initSchema -dbType mysql
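To confirm the initialization reached MySQL, the hive database should now contain the metastore tables (VERSION, DBS, TBLS, and so on); a quick check with the credentials configured above:

```shell
# Lists the metastore tables schematool just created.
mysql -uroot -prootpassword -e "USE hive; SHOW TABLES;"
```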

6. Start Hive

Change into the bin directory of apache-hive-3.1.2:
cd /apps/bigdata/apache-hive-3.1.2/bin

Starting Hive may fail with:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380)
...
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Cause:
Hadoop and Hive bundle different versions of guava.jar, located in these two directories:
/apps/bigdata/apache-hive-3.1.2/lib/
/apps/bigdata/hadoop-3.2.2/share/hadoop/common/lib/
Fix:
Delete the lower version and copy the higher version into its directory: replace guava-19.0.jar under /apps/bigdata/apache-hive-3.1.2/lib/ with Hadoop's guava-27.0-jre.jar.
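The replacement boils down to two commands; a sketch, assuming the install paths used in this guide:

```shell
# Paths from this install; adjust HIVE_LIB / HADOOP_LIB to your layout.
HIVE_LIB=/apps/bigdata/apache-hive-3.1.2/lib
HADOOP_LIB=/apps/bigdata/hadoop-3.2.2/share/hadoop/common/lib

# Drop the older guava bundled with Hive...
rm -f "$HIVE_LIB/guava-19.0.jar"
# ...and copy in Hadoop's newer one.
cp "$HADOOP_LIB/guava-27.0-jre.jar" "$HIVE_LIB/"
```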

7. Sync the files to hadoop2 and hadoop3

xsync /etc/profile
xsync /apps/bigdata/apache-hive-3.1.2

8. Use Hive

Hive can now be used on hadoop2 and hadoop3 as well.
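A minimal smoke test from any of the three nodes (the database name smoke_test is just an illustrative placeholder):

```shell
# List databases, then create and drop a throwaway one.
hive -e "SHOW DATABASES;"
hive -e "CREATE DATABASE IF NOT EXISTS smoke_test; DROP DATABASE smoke_test;"
```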

Reprinted from www.mshxw.com; original article: https://www.mshxw.com/it/312733.html