- 1. Install the JDK and Hadoop in advance and configure the environment variables
- 2. Modify the configuration files
- 3. Format the NameNode
- 4. Start Hadoop
- 5. View HDFS information in the web UI
1. Install the JDK and Hadoop and configure the environment variables

vim /etc/profile.d/my_env.sh
# JAVA_HOME
export JAVA_HOME=/opt/module/jdk1.8.0_212
export CLASSPATH=$CLASSPATH:$JAVA_HOME/lib/
export PATH=$PATH:$JAVA_HOME/bin

# HADOOP_HOME
export HADOOP_HOME=/opt/module/hadoop-3.1.3
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
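After saving the file, the variables only take effect once the script is sourced (or after a re-login). A minimal sketch of that step, written against a temporary copy so it can run anywhere; the /opt/module paths are the tutorial's and need not actually exist just to set the variables:

```shell
# Mirror the exports above into a temp file (stand-in for /etc/profile.d/my_env.sh)
cat > /tmp/my_env.sh <<'EOF'
export JAVA_HOME=/opt/module/jdk1.8.0_212
export HADOOP_HOME=/opt/module/hadoop-3.1.3
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
EOF

# On the real machine: source /etc/profile.d/my_env.sh
. /tmp/my_env.sh
echo "JAVA_HOME=$JAVA_HOME"
echo "$PATH" | tr ':' '\n' | grep 'hadoop-3.1.3'
```

On the real machine, `java -version` and `hadoop version` should then both work from any directory.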
2. Modify the configuration files

core-site.xml
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop112:9820</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/module/hadoop-3.1.3/data</value>
    </property>
</configuration>
hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.http-address</name>
        <value>hadoop112:9870</value>
    </property>
    <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>hadoop112:9868</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/opt/module/hadoop-3.1.3/data/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/opt/module/hadoop-3.1.3/data/dfs/data</value>
    </property>
</configuration>
yarn-site.xml (this one can also be left unchanged)
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
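Hadoop aborts at startup with a parse error if any of these XML files is malformed, so a quick well-formedness check saves a restart cycle. A sketch of one way to do it, run here against a heredoc copy of core-site.xml so it is self-contained, and assuming python3 is on the PATH (xmllint works too, if installed):

```shell
# Write a copy of core-site.xml's contents and parse it; any tag mismatch raises an error
cat > /tmp/core-site-check.xml <<'EOF'
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop112:9820</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/opt/module/hadoop-3.1.3/data</value>
    </property>
</configuration>
EOF
python3 -c 'import xml.etree.ElementTree as ET; ET.parse("/tmp/core-site-check.xml"); print("well-formed")'
```

On the real machine, point the same one-liner at the actual files under $HADOOP_HOME/etc/hadoop/ instead of the temp copy.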
3. Format the NameNode

hdfs namenode -format
If formatting succeeds, you will see the messages "has been successfully formatted" and "Exiting with status 0". Note: format the NameNode only once; reformatting generates a new clusterID, and existing DataNodes will refuse to start until their data directories are cleared.
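The "Exiting with status 0" line corresponds to the command's shell exit status, so a script can check `$?` instead of scanning the log output. A sketch of the pattern, with `true` standing in for the format command so it runs without a cluster:

```shell
# Stand-in command: on the real node this line would be `hdfs namenode -format`
true
if [ $? -eq 0 ]; then
    echo "format succeeded (exit status 0)"
else
    echo "format failed, check the NameNode logs" >&2
fi
```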
4. Start Hadoop

./sbin/start-dfs.sh

5. View HDFS information in the web UI

Open http://hadoop112:9870 in a browser (the dfs.namenode.http-address configured above).
[Image: HDFS web UI]
[Image: DataNode information]
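The same information can be cross-checked from the command line. A small probe sketch against the NameNode HTTP address configured earlier; it reports either way, so it is safe to run even when the cluster is down (assumes curl is installed):

```shell
# Probe the NameNode web UI; -f makes curl fail on HTTP errors, --max-time bounds the wait
if curl -sf --max-time 3 http://hadoop112:9870/ >/dev/null 2>&1; then
    echo "NameNode web UI is reachable"
else
    echo "NameNode web UI is not reachable (has start-dfs.sh finished, and does hadoop112 resolve?)"
fi
```

The DataNode list shown in the second screenshot lives under the Datanodes tab of the same UI, or can be printed with `hdfs dfsadmin -report`.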