I tried many approaches before with no luck; reinstalling the VM and rolling back to a snapshot did not help either, and putting the Hadoop environment variables in my.sh kept producing the same error.
Solution:
# Environment variables (source the file afterwards to activate them)
vim /opt/software/hadoop313/my.sh

Add the following two lines to my.sh, then run source /opt/software/hadoop313/my.sh:

#hadoop 3.1.3
export HADOOP_HOME=/opt/software/hadoop313
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/lib

#--------------------------------------------------------------------
Go to the sbin directory under hadoop313:
cd /opt/software/hadoop313/sbin/

Edit each of the following four files:
vim start-dfs.sh
vim stop-dfs.sh
vim start-yarn.sh
vim stop-yarn.sh

Insert the lines below at the very top of each script. Note: there is no "export" in front!!
#--------------------------------------------------------------------
HDFS_NAMENODE_USER=root
HDFS_DATANODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root
HDFS_JOURNALNODE_USER=root
HDFS_ZKFC_USER=root
YARN_RESOURCEMANAGER_USER=root
YARN_NODEMANAGER_USER=root
HADOOP_MAPRED_HOME=$HADOOP_HOME
HADOOP_COMMON_HOME=$HADOOP_HOME
HADOOP_HDFS_HOME=$HADOOP_HOME
HADOOP_YARN_HOME=$HADOOP_HOME
HADOOP_INSTALL=$HADOOP_HOME
HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
HADOOP_LIBEXEC_DIR=$HADOOP_HOME/libexec
JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
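Editing the four scripts by hand works, but the same insertion can be scripted. Below is a minimal sketch under the assumptions of this note: the helper name insert_vars and the SBIN override are my own inventions, and only the HDFS/YARN user variables are shown (the HADOOP_*_HOME lines from the note can be appended to the heredoc the same way). It keeps the shebang on the first line and skips files that were already patched, so running it twice does not duplicate the block.

```shell
#!/bin/sh
# Sketch: prepend the root-user variables to each Hadoop start/stop script.
# SBIN defaults to the path used in this note; override it for testing.
SBIN="${SBIN:-/opt/software/hadoop313/sbin}"

insert_vars() {
  f="$1"
  # Already patched? Then do nothing (idempotent).
  grep -q '^HDFS_NAMENODE_USER=root' "$f" && return 0
  tmp="$f.tmp"
  {
    # Keep the first line (the shebang) where it is.
    head -n 1 "$f"
    cat <<'EOF'
HDFS_NAMENODE_USER=root
HDFS_DATANODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root
HDFS_JOURNALNODE_USER=root
HDFS_ZKFC_USER=root
YARN_RESOURCEMANAGER_USER=root
YARN_NODEMANAGER_USER=root
EOF
    # Then the rest of the original script.
    tail -n +2 "$f"
  } > "$tmp" && mv "$tmp" "$f"
}

for s in start-dfs.sh stop-dfs.sh start-yarn.sh stop-yarn.sh; do
  if [ -f "$SBIN/$s" ]; then
    insert_vars "$SBIN/$s"
  fi
done
```

Inserting the variables after the shebang (rather than exporting them in my.sh) matters because Hadoop 3.x start/stop scripts check these *_USER variables themselves before launching the daemons.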



