When setting up a pseudo-distributed Hadoop cluster, after configuring the base environment and successfully starting the HDFS component, jps shows the NameNode and DataNode processes running:
[hadoop@master hadoop]$ jps
8994 NameNode
10396 Jps
9087 DataNode
9279 SecondaryNameNode
Then, attempting to start the YARN component fails:
[hadoop@master hadoop]$ start-yarn.sh
starting yarn daemons
mkdir: cannot create directory '/usr/local/hadoop': Permission denied
chown: cannot access '/usr/local/hadoop/logs': No such file or directory
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hadoop-resourcemanager-master.out
/home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 123: cd: /usr/local/hadoop: No such file or directory
/home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hadoop-resourcemanager-master.out: No such file or directory
head: cannot open '/usr/local/hadoop/logs/yarn-hadoop-resourcemanager-master.out' for reading: No such file or directory
/home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hadoop-resourcemanager-master.out: No such file or directory
/home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hadoop-resourcemanager-master.out: No such file or directory
localhost: bash: line 0: cd: /usr/local/hadoop: No such file or directory
localhost: mkdir: cannot create directory '/usr/local/hadoop': Permission denied
localhost: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hadoop-nodemanager-master.out
localhost: /home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 123: cd: /usr/local/hadoop: No such file or directory
localhost: /home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hadoop-nodemanager-master.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop/logs/yarn-hadoop-nodemanager-master.out' for reading: No such file or directory
localhost: /home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hadoop-nodemanager-master.out: No such file or directory
localhost: /home/hadoop/app/hadoop-2.7.5/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hadoop-nodemanager-master.out: No such file or directory
The errors occur because the YARN startup script (sbin/yarn-daemon.sh, which sources yarn-env.sh) fell back to a default path and could not find or create the log directory there (/usr/local/hadoop/logs/yarn-hadoop-resourcemanager-master.out: No such file or directory). The environment variables that yarn-env.sh relies on, such as $HADOOP_YARN_HOME, had never been configured, so the fix is to set them and reload the shell environment. A fairly complete set of variables that resolves the problem:
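The failure pattern matches the log-directory fallback in Hadoop 2.x's yarn-daemon.sh. Sketched below (simplified, not the verbatim script): when YARN_LOG_DIR is unset, the script derives it from $HADOOP_YARN_HOME, and with that variable unset or stale it tries to mkdir/chown under a path like /usr/local/hadoop, where the hadoop user has no write permission. A temp directory is used here so the sketch is runnable:

```shell
# Simplified sketch of the yarn-daemon.sh log-dir fallback (Hadoop 2.x).
# In the failing setup, HADOOP_YARN_HOME effectively resolved to
# /usr/local/hadoop, so mkdir failed with "Permission denied".
HADOOP_YARN_HOME="${HADOOP_YARN_HOME:-$(mktemp -d)}"   # temp dir for this demo
if [ "$YARN_LOG_DIR" = "" ]; then
  YARN_LOG_DIR="$HADOOP_YARN_HOME/logs"                # the fallback in question
fi
mkdir -p "$YARN_LOG_DIR"                               # this is the step that failed
echo "log dir: $YARN_LOG_DIR"
```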
export JAVA_HOME=/home/hadoop/app/jdk1.8.0_131
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export HADOOP_HOME=/home/hadoop/app/hadoop-2.7.5
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_PREFIX
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_PREFIX/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop
export HADOOP_HDFS_HOME=$HADOOP_PREFIX
export HADOOP_MAPRED_HOME=$HADOOP_PREFIX
export HADOOP_YARN_HOME=$HADOOP_HOME
#export HADOOP_ROOT_LOGGER=DEBUG,console
export LD_LIBRARY_PATH=$HADOOP_PREFIX/lib/native
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
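After appending the exports to a shell config file (for example ~/.bashrc) and reloading it with `source ~/.bashrc`, the derived variables can be sanity-checked before restarting YARN. This sketch re-creates the key exports from the list above and verifies they resolve as expected:

```shell
# Re-create the key exports (values from the walkthrough above) and
# confirm the derived variables expand to real Hadoop paths.
export HADOOP_HOME=/home/hadoop/app/hadoop-2.7.5
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME        # the variable yarn-daemon.sh needs
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
echo "YARN home: $HADOOP_YARN_HOME"
echo "Conf dir:  $HADOOP_CONF_DIR"
```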
Adjust the JDK and Hadoop paths to match your own installation. Starting the YARN component again now succeeds, and jps shows the ResourceManager and NodeManager processes running, confirming that Hadoop started successfully.
[hadoop@master hadoop]$ start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /home/hadoop/app/hadoop-2.7.5/logs/yarn-hadoop-resourcemanager-master.out
localhost: starting nodemanager, logging to /home/hadoop/app/hadoop-2.7.5/logs/yarn-hadoop-nodemanager-master.out
[hadoop@master hadoop]$ jps
8994 NameNode
9715 ResourceManager
10084 Jps
9915 NodeManager
9087 DataNode
9279 SecondaryNameNode
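As a final sanity check, the jps output can be filtered for the five daemons a pseudo-distributed setup should run. The sketch below uses the captured sample output so it is self-contained; in practice, substitute `jps_output="$(jps)"`:

```shell
# Verify all five expected Hadoop daemons appear in the jps output.
# The sample here is the output captured above.
jps_output="8994 NameNode
9715 ResourceManager
9915 NodeManager
9087 DataNode
9279 SecondaryNameNode"
missing=0
for daemon in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
  echo "$jps_output" | grep -q "$daemon" || { echo "missing: $daemon"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "all daemons running"
```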



