Required components: Hadoop, Hive, Spark
All packages are extracted under path
Everything is done as a regular user; environment configuration lives in .bashrc
The JDK is unpacked to: path/java/jdk1.8.0/
vim .bashrc
export JAVA_HOME=path/java/jdk1.8.0
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
source .bashrc
Type java in the shell; if it responds, the configuration worked.
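A slightly more informative check (not in the original notes) is java -version, which also shows which JDK the PATH actually resolves to:

java -version
# expect output mentioning: java version "1.8.0_..."
which java
# should point inside path/java/jdk1.8.0/bin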
Hadoop is installed only to support Spark, and Spark depends on HDFS and YARN, so Hadoop is set up in pseudo-distributed mode. Just follow the official guide; copying and pasting it here would add nothing:
https://hadoop.apache.org/docs/r2.10.1/hadoop-project-dist/hadoop-common/SingleCluster.html
vim .bashrc
export HADOOP_HOME=path/hadoop-2.10.1
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
During setup you may hit the warning "Unable to load native-hadoop library"; the JAVA_LIBRARY_PATH line above is what fixes it.
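To verify the fix took effect, Hadoop ships a checknative utility (not mentioned in the original notes, but handy here):

hadoop checknative -a
# "hadoop: true .../lib/native/libhadoop.so" means the native library loads;
# all "false" means JAVA_LIBRARY_PATH still isn't being picked up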
Downloading and extracting Hive is omitted here ...
Update the environment variables:
# HIVE_HOME
export HIVE_HOME=path/hive-2.3.7
export PATH=$PATH:$HIVE_HOME/bin
In $HIVE_HOME/conf, open vi hive-site.xml and add the following:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hivemetadata?createDatabaseIfNotExist=true&amp;useSSL=false</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>12345678</value>
    <description>password to use against metastore database</description>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
  <property>
    <name>hive.cli.print.current.db</name>
    <value>true</value>
    <description>Whether to include the current database in the Hive prompt.</description>
  </property>
  <property>
    <name>hive.cli.print.header</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.exec.mode.local.auto</name>
    <value>true</value>
    <description>Let Hive determine whether to run in local mode automatically</description>
  </property>
</configuration>
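One assumption this config makes: a MySQL account hive with password 12345678 must already exist (the hivemetadata database itself is created on demand by createDatabaseIfNotExist=true). A minimal sketch to create that account, assuming MySQL 5.7+ and root access:

mysql -u root -p <<'SQL'
-- metastore account matching ConnectionUserName/ConnectionPassword above
CREATE USER IF NOT EXISTS 'hive'@'localhost' IDENTIFIED BY '12345678';
GRANT ALL PRIVILEGES ON hivemetadata.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;
SQL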
Download the JDBC connector package from https://dev.mysql.com/downloads/file/?id=509716 and install it by double-clicking, or via sudo apt install ./mysql-connector-java.xxx.deb. The installed jar ends up in /usr/share/java; copy it into $HIVE_HOME/lib:
cp mysql-connector-java-8.0.27.jar path/hive-2.3.7/lib
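One step these notes skip: Hive 2.x does not create its metastore tables on first launch by itself, so once the config and driver above are in place, initialize the schema with schematool (shipped in $HIVE_HOME/bin):

schematool -dbType mysql -initSchema
# prints "schemaTool completed" on success; rerunning it against an
# already-initialized metastore fails, which is expected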
Change the log file directory:
vi $HIVE_HOME/conf/hive-log4j2.properties
property.hive.log.dir = path/hive-2.3.7/logs
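With everything wired up, a quick smoke test (my own suggestion, not from the original steps) is to bring up HDFS and YARN and open the Hive CLI; the prompt and header behavior comes from the hive.cli.* settings above:

start-dfs.sh
start-yarn.sh
hive
# with hive.cli.print.current.db=true the prompt shows the database:
# hive (default)> show databases;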



