
Installing MySQL, JDK, Hadoop, and Hive on CentOS, with SSM and Multithreaded Log Generation and Hive Data Processing



Set the hostname

hostnamectl set-hostname cjy

Verify:

hostname

Install vim

yum -y install vim

Configure a static IP

vim /etc/sysconfig/network-scripts/ifcfg-ens33

Change:

BOOTPROTO="static"

Add:

IPADDR="192.168.100.151"
NETMASK="255.255.255.0"
GATEWAY="192.168.100.2"
DNS1="114.114.114.114"
DNS2="8.8.8.8"
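A quick sanity check for the settings above can save a round of network restarts. This is a sketch: the `check_ifcfg` helper is hypothetical, and it only verifies that `BOOTPROTO` is static and that the address keys are present, not that the values are valid for your network.

```shell
#!/bin/bash
# Sanity-check an ifcfg-style file: BOOTPROTO must be static and the
# address keys must all be present. check_ifcfg is an illustrative helper.
check_ifcfg() {
    local file="$1" key missing=0
    grep -Eq '^BOOTPROTO="?static"?$' "$file" || { echo "BOOTPROTO is not static"; missing=1; }
    for key in IPADDR NETMASK GATEWAY DNS1; do
        grep -q "^${key}=" "$file" || { echo "missing ${key}"; missing=1; }
    done
    [ "$missing" -eq 0 ] && echo "ifcfg looks complete"
    return "$missing"
}

# Dry run against a copy of the settings from this guide
cfg=$(mktemp)
printf '%s\n' 'BOOTPROTO="static"' 'IPADDR="192.168.100.151"' \
    'NETMASK="255.255.255.0"' 'GATEWAY="192.168.100.2"' 'DNS1="114.114.114.114"' > "$cfg"
check_ifcfg "$cfg"
```

Run it against the real file with `check_ifcfg /etc/sysconfig/network-scripts/ifcfg-ens33` before restarting the network service.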

Restart the network service

service network restart

Check the IP address

ip addr

Connect with an SSH client

Disable the firewall

systemctl stop firewalld
systemctl disable firewalld 

Switch the yum repository to the Aliyun mirror

yum install -y wget
cd /etc/yum.repos.d/
mv CentOS-Base.repo CentOS-Base.repo_bak
wget -O /etc/yum.repos.d/CentOS-Base.repo http://mirrors.aliyun.com/repo/Centos-7.repo
yum clean all
yum makecache

Install the JDK (place the JDK tarball in /opt)

Create the install directory

mkdir -p /opt/soft/jdk180
tar -zxf /opt/jdk-8u60-linux-x64.tar.gz -C /opt/soft/jdk180 --strip-components 1
vim /etc/profile

Add the Java environment to /etc/profile:

#java environment
export JAVA_HOME=/opt/soft/jdk180
export CLASSPATH=.:${JAVA_HOME}/jre/lib/rt.jar:${JAVA_HOME}/lib/dt.jar:${JAVA_HOME}/lib/tools.jar
export PATH=$PATH:${JAVA_HOME}/bin
source /etc/profile
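The `--strip-components 1` flag in the tar command above removes the archive's leading directory (e.g. jdk1.8.0_60/), so the JDK contents land directly in /opt/soft/jdk180 rather than a nested folder. A throwaway demonstration with a tiny archive:

```shell
#!/bin/bash
# Demonstrate tar --strip-components 1: the top-level directory is removed
# from each extracted path. Uses a disposable archive in a temp directory.
set -e
work=$(mktemp -d)
mkdir -p "$work/jdk1.8.0_60/bin"
echo demo > "$work/jdk1.8.0_60/bin/java"
tar -czf "$work/jdk.tar.gz" -C "$work" jdk1.8.0_60

mkdir -p "$work/jdk180"
tar -zxf "$work/jdk.tar.gz" -C "$work/jdk180" --strip-components 1

# The file is now at jdk180/bin/java, not jdk180/jdk1.8.0_60/bin/java
ls "$work/jdk180/bin/java"
```

Without the flag you would end up with /opt/soft/jdk180/jdk1.8.0_60/bin, and the JAVA_HOME set above would not resolve.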

Install MySQL

cd /opt/

Check whether MySQL is already installed.

rpm -qa | grep mysql
An empty result means MySQL is not installed.

Check the installed MariaDB packages.

rpm -qa|grep -i mariadb
Remove the bundled MariaDB packages.

rpm -qa|grep mariadb|xargs rpm -e --nodeps
Check again to confirm MariaDB is completely removed.

rpm -qa|grep -i mariadb
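The removal pipeline works because `xargs` collects every package name from stdin and passes them as arguments to a single `rpm -e` call. The same pattern, with `echo` standing in for `rpm -e` so nothing is actually removed:

```shell
#!/bin/bash
# xargs gathers stdin lines into arguments for one command invocation,
# which is how the pipeline above removes all MariaDB packages at once.
# The package names below are only sample data for the dry run.
packages='mariadb-libs-5.5.68
mariadb-server-5.5.68'

# echo stands in for "rpm -e --nodeps" in this dry run
printf '%s\n' "$packages" | xargs echo rpm -e --nodeps
```

If `rpm -qa | grep mariadb` prints nothing, `xargs` receives no input; adding `-r` (`xargs -r rpm -e --nodeps`) skips running the command entirely in that case.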

wget -c http://dev.mysql.com/get/mysql57-community-release-el7-10.noarch.rpm
yum -y install mysql57-community-release-el7-10.noarch.rpm
yum -y install mysql-community-server

Fix character encoding (use UTF-8 to avoid garbled Chinese text)

vim /etc/my.cnf

Add the following:

[mysqld]
character-set-server=utf8
skip-grant-tables
[client]
default-character-set=utf8
[mysql]
default-character-set=utf8
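If the setup is scripted, these lines should only be appended once. A hedged sketch (the `add_utf8_settings` helper is hypothetical; it assumes a whole-file grep is enough to detect a previous run, and it deliberately leaves out `skip-grant-tables`, which is a temporary password-reset aid rather than a permanent setting):

```shell
#!/bin/bash
# Append the UTF-8 settings to a my.cnf-style file only once, so
# re-running the setup does not duplicate them.
add_utf8_settings() {
    local cnf="$1"
    if grep -q 'character-set-server=utf8' "$cnf" 2>/dev/null; then
        echo "utf8 settings already present"
        return 0
    fi
    cat >> "$cnf" <<'EOF'
[mysqld]
character-set-server=utf8
[client]
default-character-set=utf8
[mysql]
default-character-set=utf8
EOF
    echo "utf8 settings added"
}

# e.g. add_utf8_settings /etc/my.cnf
```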

systemctl start  mysqld.service

Enter MySQL (run `mysql`):

use mysql;
update mysql.user set authentication_string=password('okok') where user='root';
flush privileges;
GRANT ALL PRIVILEGES ON *.* TO root@'%' IDENTIFIED BY 'okok';

Exit MySQL, then restart the service

systemctl restart mysqld.service

Edit my.cnf again and remove the skip-grant-tables line (as appropriate for your setup)

Log in to verify

mysql -u root -p

Install Hadoop

Place the Hadoop tarball in /opt

cd /opt/
mkdir -p soft/hadoop260
tar -zxf hadoop-2.6.0-cdh5.14.2.tar.gz -C /opt/soft/hadoop260 --strip-components 1
yum install -y vim

Edit hadoop-env.sh

cd /opt/soft/hadoop260/etc/hadoop
vim hadoop-env.sh

Change JAVA_HOME to:

export JAVA_HOME=/opt/soft/jdk180

Edit core-site.xml

vim core-site.xml

Add the following (change the IP address to your own):


        
<configuration>
        <property>
                <name>fs.defaultFS</name>
                <value>hdfs://192.168.100.110:9000</value>
        </property>
        <property>
                <name>hadoop.tmp.dir</name>
                <value>/opt/soft/hadoop260/tmp</value>
        </property>
        <property>
                <name>hadoop.proxyuser.root.groups</name>
                <value>*</value>
        </property>
        <property>
                <name>hadoop.proxyuser.root.hosts</name>
                <value>*</value>
        </property>
        <property>
                <name>hadoop.proxyuser.root.users</name>
                <value>*</value>
        </property>
</configuration>

Edit hdfs-site.xml

vim hdfs-site.xml

Add the following:


        
<configuration>
        <property>
                <name>dfs.replication</name>
                <value>1</value>
        </property>
</configuration>

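When debugging a site file it is handy to pull a single property value out of this name/value layout. A quick-and-dirty sketch (the `get_property` helper is hypothetical and assumes each name and value sits on its own line, as in the files above; it is not a general XML parser):

```shell
#!/bin/bash
# Read the <value> that follows a given <name> in a Hadoop-style site file.
# Works for the flat one-tag-per-line layout used in this guide.
get_property() {
    local name="$1" file="$2"
    awk -v n="<name>${name}</name>" '
        index($0, n) { want = 1; next }
        want && match($0, /<value>[^<]*<\/value>/) {
            # strip the 7-char <value> prefix and 8-char </value> suffix
            print substr($0, RSTART + 7, RLENGTH - 15)
            exit
        }' "$file"
}

# Demo against a copy of the hdfs-site.xml content above
site=$(mktemp)
cat > "$site" <<'EOF'
<configuration>
        <property>
                <name>dfs.replication</name>
                <value>1</value>
        </property>
</configuration>
EOF
get_property dfs.replication "$site"
```

For anything beyond a quick check, `hdfs getconf -confKey dfs.replication` asks Hadoop itself.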
Edit mapred-site.xml

cp mapred-site.xml.template mapred-site.xml
vim mapred-site.xml

Add the following:


        
<configuration>
        <property>
                <name>mapreduce.framework.name</name>
                <value>yarn</value>
        </property>
</configuration>

Edit yarn-site.xml

vim yarn-site.xml

Add the following:




        
<configuration>
        <property>
                <name>yarn.resourcemanager.hostname</name>
                <value>localhost</value>
        </property>
        <property>
                <name>yarn.nodemanager.aux-services</name>
                <value>mapreduce_shuffle</value>
        </property>
</configuration>

Edit /etc/profile

vim /etc/profile

Append at the end:

#hadoop environment
export HADOOP_HOME=/opt/soft/hadoop260
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_INSTALL=$HADOOP_HOME

Run the following (format the NameNode, then start and check the daemons):

source /etc/profile
hadoop namenode -format
start-all.sh
jps
stop-all.sh
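After `start-all.sh`, `jps` on this single-node setup should list NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager. A small checker sketch (the `check_daemons` helper is hypothetical; it reads jps-style output and reports anything missing):

```shell
#!/bin/bash
# Report which expected Hadoop daemons are absent from jps-style output.
check_daemons() {
    local jps_output="$1" d missing=""
    for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
        # -w avoids matching NameNode inside SecondaryNameNode
        echo "$jps_output" | grep -qw "$d" || missing="$missing $d"
    done
    if [ -n "$missing" ]; then
        echo "missing:$missing"
        return 1
    fi
    echo "all daemons running"
}

# Sample jps output for a healthy single node (PIDs are illustrative)
sample='2101 NameNode
2245 DataNode
2467 SecondaryNameNode
2633 ResourceManager
2789 NodeManager
2900 Jps'
check_daemons "$sample"
```

On a live machine, run it as `check_daemons "$(jps)"`.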

Set up passwordless SSH

ssh-keygen -t rsa -P ''
cd .ssh/
ll
cat id_rsa.pub
ssh-copy-id root@cjy
ll
cat authorized_keys
ssh cjy
exit
cd /opt/soft/hadoop260/bin
ll
cd ..
cd sbin/
start-dfs.sh
start-yarn.sh
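Passwordless login silently falls back to password prompts if sshd's StrictModes check finds the key files too permissive: ~/.ssh should be 700 and authorized_keys 600. A checker sketch (the `check_ssh_perms` helper is hypothetical and uses GNU `stat -c %a`, as on CentOS):

```shell
#!/bin/bash
# Verify the .ssh permissions that sshd requires for key-based login.
check_ssh_perms() {
    local dir="$1" bad=0
    # stat -c %a prints the octal mode (GNU coreutils)
    [ "$(stat -c %a "$dir")" = "700" ] || { echo "$dir should be 700"; bad=1; }
    [ "$(stat -c %a "$dir/authorized_keys")" = "600" ] \
        || { echo "authorized_keys should be 600"; bad=1; }
    [ "$bad" -eq 0 ] && echo "permissions ok"
    return "$bad"
}

# Demo against a throwaway directory with the correct modes
demo=$(mktemp -d)/dotssh
mkdir -p "$demo"; chmod 700 "$demo"
touch "$demo/authorized_keys"; chmod 600 "$demo/authorized_keys"
check_ssh_perms "$demo"
```

On the real machine, run `check_ssh_perms ~/.ssh` and re-run `ssh cjy` afterwards.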

Install Hive and Zeppelin

Place the tarballs in /opt

The MySQL JDBC driver jar goes under /opt/soft/hive110/lib

[root@cjy opt]# tar -zxf hive-1.1.0-cdh5.14.2.tar.gz
[root@cjy opt]# mv hive-1.1.0-cdh5.14.2 soft/hive110
[root@cjy opt]# cd /opt/soft/hive110/conf
[root@cjy conf]# touch hive-site.xml
[root@cjy conf]# vim hive-site.xml

Edit hive-site.xml as follows (change the IP address to your own):




	
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
	<property>
		<name>hive.metastore.warehouse.dir</name>
		<value>hive110/warehouse</value>
	</property>
	<property>
		<name>hive.metastore.local</name>
		<value>false</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionURL</name>
		<value>jdbc:mysql://192.168.100.155:3306/hive?useSSL=false&amp;createDatabaseIfNotExist=true</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionDriverName</name>
		<value>com.mysql.jdbc.Driver</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionUserName</name>
		<value>root</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionPassword</name>
		<value>okok</value>
	</property>
	<property>
		<name>hive.server2.authentication</name>
		<value>NONE</value>
	</property>
	<property>
		<name>hive.server2.thrift.client.user</name>
		<value>root</value>
	</property>
	<property>
		<name>hive.server2.thrift.client.password</name>
		<value>root</value>
	</property>
</configuration>
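The repetitive name/value stanzas in these site files can also be generated from key=value pairs, which avoids hand-editing mistakes like unbalanced tags. A sketch (the `emit_properties` helper is hypothetical; the sample pairs are taken from the file above):

```shell
#!/bin/bash
# Emit a Hadoop/Hive-style <property> stanza for each key=value argument.
emit_properties() {
    local kv
    for kv in "$@"; do
        # ${kv%%=*} is the key; ${kv#*=} is everything after the first "="
        printf '  <property>\n    <name>%s</name>\n    <value>%s</value>\n  </property>\n' \
            "${kv%%=*}" "${kv#*=}"
    done
}

emit_properties \
    "javax.jdo.option.ConnectionUserName=root" \
    "javax.jdo.option.ConnectionPassword=okok"
```

Using `#*=` for the value keeps keys whose values themselves contain `=` (such as JDBC URLs) intact.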
[root@cjy conf]# vim /etc/profile
[root@cjy conf]# source /etc/profile

Add the following to /etc/profile:

#hive environment
export HIVE_HOME=/opt/soft/hive110
export PATH=$PATH:$HIVE_HOME/bin

Initialize the metastore schema (from the bin directory):

schematool -dbType mysql -initSchema

Start Hive (from bin):

hive --service hiveserver2 & 
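HiveServer2 can take a while after launch before port 10000 actually accepts connections, so connecting immediately often fails. A polling sketch using bash's built-in /dev/tcp (the `wait_for_port` helper is hypothetical; host and port below are the defaults from this setup):

```shell
#!/bin/bash
# Poll a TCP port until it accepts connections, or give up after a timeout.
wait_for_port() {
    local host="$1" port="$2" timeout="${3:-60}" i=0
    while [ "$i" -lt "$timeout" ]; do
        # bash treats opening /dev/tcp/host/port as a connection attempt;
        # the subshell closes the probe descriptor automatically
        if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
            echo "up after ${i}s"
            return 0
        fi
        sleep 1
        i=$((i + 1))
    done
    echo "timed out after ${timeout}s"
    return 1
}

# e.g. wait_for_port 127.0.0.1 10000 120 && echo "ready for beeline"
```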

Beeline usage

beeline -u jdbc:hive2://192.168.100.154:10000/mydemo

Install Zeppelin
tar -zvxf zeppelin-0.8.1-bin-all.tgz
mv zeppelin-0.8.1-bin-all soft/zeppelin081
 cd soft/zeppelin081/conf

Edit the configuration files

cp zeppelin-site.xml.template zeppelin-site.xml
vim zeppelin-site.xml

<property>
  <name>zeppelin.helium.registry</name>
  <value>helium</value>
</property>
cp zeppelin-env.sh.template zeppelin-env.sh

Add JAVA_HOME and HADOOP_CONF_DIR (pointing at your own Java and Hadoop install directories):

vim zeppelin-env.sh

export JAVA_HOME=/opt/soft/jdk180
export HADOOP_CONF_DIR=/opt/soft/hadoop260/etc/hadoop
vim /etc/profile

Append:

#zeppelin environment
export ZEPPELIN_HOME=/opt/soft/zeppelin081
export PATH=$PATH:$ZEPPELIN_HOME/bin

source /etc/profile
[root@cjy bin]# cp /opt/soft/hive110/conf/hive-site.xml /opt/soft/zeppelin081/conf/

Copy the required jars

The hadoop-common jar from /opt/soft/hadoop260/share/hadoop/common:

[root@cjy bin]# cp /opt/soft/hadoop260/share/hadoop/common/hadoop-common-2.6.0-cdh5.14.2.jar /opt/soft/zeppelin081/interpreter/jdbc
cp /opt/soft/hive110/lib/hive-jdbc-1.1.0-cdh5.14.2-standalone.jar /opt/soft/zeppelin081/interpreter/jdbc/
zeppelin-daemon.sh start

Startup script

#! /bin/bash

my_start(){
	if [ "$1" == "start" ]; then
		#start hadoop
		sh /opt/soft/hadoop260/sbin/start-dfs.sh
		sh /opt/soft/hadoop260/sbin/start-yarn.sh
		#start hive
		nohup  /opt/soft/hive110/bin/hive --service hiveserver2 &
		#start zeppelin
		sh /opt/soft/zeppelin081/bin/zeppelin-daemon.sh start
		echo "start over"
	else
		#close zeppelin
		sh /opt/soft/zeppelin081/bin/zeppelin-daemon.sh stop
		#close hive
		hiveprocess=`jps | grep RunJar | awk '{print $1}'`

		for no in $hiveprocess
		do
			kill -9 "$no"  # if several RunJar processes exist, kill each in turn
		done

		#stop hadoop
		sh /opt/soft/hadoop260/sbin/stop-dfs.sh
		sh /opt/soft/hadoop260/sbin/stop-yarn.sh

		echo "stop over"
	fi
}

my_start "$1"
	
chmod +x run.sh
source run.sh start
source run.sh stop
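The `jps | grep RunJar | awk '{print $1}'` line in the stop branch isolates the PID column, since both HiveServer2 and the metastore show up in jps as RunJar. The same extraction against a captured sample:

```shell
#!/bin/bash
# jps prints "PID ClassName"; grep selects the RunJar lines and awk keeps
# only the first (PID) column. The sample output below is illustrative.
sample='3011 RunJar
3154 RunJar
3302 Jps'

pids=$(echo "$sample" | grep RunJar | awk '{print $1}')
echo "$pids"
```

Each resulting PID is then passed to `kill -9` by the loop in the script above.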

Open http://192.168.100.154:8080/#/ in a browser

 

Set the Zeppelin jdbc interpreter properties:

default.driver   org.apache.hive.jdbc.HiveDriver
default.url      jdbc:hive2://192.168.42.200:10000
default.user     hive
