Environment: CentOS-7, JDK
Software version: spark-2.3.4-bin-hadoop2.6
Resource: Baidu Netdisk (extraction code: zzzz)
1. Upload the installation package to the /opt directory.
2. Extract the package. Create the target directory first; --strip-components 1 drops the archive's top-level folder so the files land directly in the target directory:
mkdir -p /opt/soft/spark-standby
tar -zxvf /opt/spark-2.3.4-bin-hadoop2.6.tgz -C /opt/soft/spark-standby/ --strip-components 1
3. Copy /opt/soft/spark-standby/conf/slaves.template to slaves, then edit it and add the IP addresses or hostname mappings of your worker machines:
cp /opt/soft/spark-standby/conf/slaves.template /opt/soft/spark-standby/conf/slaves
vim /opt/soft/spark-standby/conf/slaves
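The slaves file lists one worker hostname per line. For a single-node standalone setup it can simply contain the master host itself; hadoop101 below is the hostname used elsewhere in this guide, so replace it with your own workers. A minimal sketch (written to /tmp so the real conf/slaves is not touched):

```shell
# Sketch: an example slaves file -- one worker hostname per line.
# hadoop101 is assumed from this guide; substitute your own hosts.
cat > /tmp/slaves.example <<'EOF'
hadoop101
EOF
cat /tmp/slaves.example
```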
4. Copy /opt/soft/spark-standby/conf/spark-env.sh.template to /opt/soft/spark-standby/conf/spark-env.sh and add the following configuration:
cp /opt/soft/spark-standby/conf/spark-env.sh.template /opt/soft/spark-standby/conf/spark-env.sh
vim /opt/soft/spark-standby/conf/spark-env.sh
export SPARK_MASTER_HOST=hadoop101
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_CORES=2
export SPARK_WORKER_MEMORY=3g
export SPARK_MASTER_WEBUI_PORT=8888
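Before starting the master, it can help to confirm that the ports chosen above (7077 for the master RPC, 8888 for the web UI) are not already in use. A quick bash-only probe using /dev/tcp (no extra tools assumed; a successful connect means something is already listening):

```shell
# Probe each port on localhost; connection success => port already taken.
status="$(for p in 7077 8888; do
  if (exec 3<>"/dev/tcp/localhost/$p") 2>/dev/null; then
    echo "port $p in use"
  else
    echo "port $p free"
  fi
done)"
echo "$status"
```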
5. Edit /opt/soft/spark-standby/sbin/spark-config.sh and add the JAVA_HOME path:
vim /opt/soft/spark-standby/sbin/spark-config.sh
export JAVA_HOME=/opt/soft/jdk180
6. Configure the history server: copy /opt/soft/spark-standby/conf/spark-defaults.conf.template to /opt/soft/spark-standby/conf/spark-defaults.conf and edit it:
cp /opt/soft/spark-standby/conf/spark-defaults.conf.template /opt/soft/spark-standby/conf/spark-defaults.conf
vim /opt/soft/spark-standby/conf/spark-defaults.conf
Note!!! The directory hdfs://hadoop101:9000/spark-log-standby referenced in this configuration must be created in HDFS in advance.
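The original screenshot of the spark-defaults.conf contents is not reproduced here; the event-log settings that reference this directory typically look like the following (spark.eventLog.enabled and spark.eventLog.dir are standard Spark properties; the hostname and port are the ones used throughout this guide):

```
spark.eventLog.enabled          true
spark.eventLog.dir              hdfs://hadoop101:9000/spark-log-standby
```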
7. Continue editing /opt/soft/spark-standby/conf/spark-env.sh and add the following:
vim /opt/soft/spark-standby/conf/spark-env.sh
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080 -Dspark.history.retainedApplications=30 -Dspark.history.fs.logDirectory=hdfs://hadoop101:9000/spark-log-standby"
8. Start HDFS and create the log directory configured above (note that the path must match the one in spark-defaults.conf and SPARK_HISTORY_OPTS):
start-dfs.sh
hdfs dfs -mkdir -p /spark-log-standby
9. Start Spark:
/opt/soft/spark-standby/sbin/start-all.sh
10. Start the history server:
/opt/soft/spark-standby/sbin/start-history-server.sh
11. Open the master web UI:
http://192.168.1.101:8888
12. Open the history server web UI:
http://192.168.1.101:18080
13. Stop Spark:
/opt/soft/spark-standby/sbin/stop-all.sh