1. List all supported commands (hadoop fs is equivalent to hdfs dfs)
hadoop fs
[root@hadoop-node1 hadoop-3.3.2]# hadoop fs
Usage: hadoop fs [generic options]
[-appendToFile <localsrc> ... <dst>]
[-cat [-ignoreCrc] <src> ...]
[-checksum [-v] <src> ...]
[-chgrp [-R] GROUP PATH...]
[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
[-chown [-R] [OWNER][:[GROUP]] PATH...]
[-concat <target path> <src path> <src path> ...]
[-copyFromLocal [-f] [-p] [-l] [-d] [-t <thread count>] [-q <thread pool queue size>] <localsrc> ... <dst>]
[-copyToLocal [-f] [-p] [-crc] [-ignoreCrc] [-t <thread count>] [-q <thread pool queue size>] <src> ... <localdst>]
[-count [-q] [-h] [-v] [-t [<storage type>]] [-u] [-x] [-e] [-s] <path> ...]
[-cp [-f] [-p | -p[topax]] [-d] [-t <thread count>] [-q <thread pool queue size>] <src> ... <dst>]
[-createSnapshot <snapshotDir> [<snapshotName>]]
[-deleteSnapshot <snapshotDir> <snapshotName>]
[-df [-h] [<path> ...]]
[-du [-s] [-h] [-v] [-x] <path> ...]
[-expunge [-immediate] [-fs <path>]]
[-find <path> ... <expression> ...]
[-get [-f] [-p] [-crc] [-ignoreCrc] [-t <thread count>] [-q <thread pool queue size>] <src> ... <localdst>]
[-getfacl [-R] <path>]
[-getfattr [-R] {-n name | -d} [-e en] <path>]
[-getmerge [-nl] [-skip-empty-file] <src> <localdst>]
[-head <file>]
[-help [cmd ...]]
[-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [-e] [<path> ...]]
[-mkdir [-p] <path> ...]
[-moveFromLocal [-f] [-p] [-l] [-d] <localsrc> ... <dst>]
[-moveToLocal <src> <localdst>]
[-mv <src> ... <dst>]
[-put [-f] [-p] [-l] [-d] [-t <thread count>] [-q <thread pool queue size>] <localsrc> ... <dst>]
[-renameSnapshot <snapshotDir> <oldName> <newName>]
[-rm [-f] [-r|-R] [-skipTrash] [-safely] <src> ...]
[-rmdir [--ignore-fail-on-non-empty] <dir> ...]
[-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
[-setfattr {-n name [-v value] | -x name} <path>]
[-setrep [-R] [-w] <rep> <path> ...]
[-stat [format] <path> ...]
[-tail [-f] [-s <sleep interval>] <file>]
[-test -[defswrz] <path>]
[-text [-ignoreCrc] <src> ...]
[-touch [-a] [-m] [-t TIMESTAMP (yyyyMMdd:HHmmss) ] [-c] FILE...]
[-touchz <path> ...]
[-truncate [-w] <length> <path> ...]
[-usage [cmd ...]]
Generic options supported are:
-conf <configuration file>          specify an application configuration file
-D <property=value>                 define a value for a given property
-fs <file:///|hdfs://namenode:port> specify default filesystem URL to use, overrides 'fs.defaultFS' property from configurations.
-jt <local|resourcemanager:port>    specify a ResourceManager
-files <file1,...>                  specify a comma-separated list of files to be copied to the map reduce cluster
-libjars <jar1,...>                 specify a comma-separated list of jar files to be included in the classpath
-archives <archive1,...>            specify a comma-separated list of archives to be unarchived on the compute machines

The general command line syntax is:
command [genericOptions] [commandOptions]
You can also run hadoop fs -help to list every command along with its usage guide; the output is too long to reproduce here.
2. View the detailed usage of a single command: hadoop fs -help <command>
[root@hadoop-node1 hadoop-3.3.2]# hadoop fs -help ls
-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [-e] [<path> ...] :
  List the contents that match the specified file pattern. If path is not
  specified, the contents of /user/<currentUser> will be listed. For a directory a
  list of its direct children is returned (unless -d option is specified).

  Directory entries are of the form:
        permissions - userId groupId sizeOfDirectory(in bytes)
  modificationDate(yyyy-MM-dd HH:mm) directoryName

  and file entries are of the form:
        permissions numberOfReplicas userId groupId sizeOfFile(in bytes)
  modificationDate(yyyy-MM-dd HH:mm) fileName

  -C  Display the paths of files and directories only.
  -d  Directories are listed as plain files.
  -h  Formats the sizes of files in a human-readable fashion rather than a number of bytes.
  -q  Print ? instead of non-printable characters.
  -R  Recursively list the contents of directories.
  -t  Sort files by modification time (most recent first).
  -S  Sort files by size.
  -r  Reverse the order of the sort.
  -u  Use time of last access instead of modification for display and sorting.
  -e  Display the erasure coding policy of files and directories.
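The fixed column layout described in the help text makes the listing easy to post-process with standard text tools. A minimal sketch, using a hypothetical sample listing rather than live cluster output, that sums the size column with awk:

```shell
# Hypothetical sample of `hadoop fs -ls` output; for file entries,
# field 5 is sizeOfFile in bytes, per the column layout above.
sample_listing='-rw-r--r--   3 root supergroup       1366 2022-04-01 10:00 /input/a.txt
-rw-r--r--   3 root supergroup        634 2022-04-01 10:01 /input/b.txt'

# Sum the size column (field 5) across all entries.
total_bytes=$(printf '%s\n' "$sample_listing" | awk '{sum += $5} END {print sum}')
echo "$total_bytes"   # prints 2000
```

Against a real cluster you would pipe `hadoop fs -ls /input` into the same awk program instead of the sample string.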
Example: list only the paths of the entries under the root directory
[root@hadoop-node1 hadoop-3.3.2]# hadoop fs -ls -C /
/input
/tmp
/wordcount-out1

3. Examples
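Combining -ls -R with awk gives a quick recursive search by file name. A sketch, assuming `hadoop` is on PATH and a cluster is reachable; the function name and the .txt pattern are illustrative:

```shell
# Print the full paths of .txt files under a directory. The path is the
# last whitespace-separated field ($NF) of each line of the recursive listing.
hdfs_find_txt() {
  hadoop fs -ls -R "$1" | awk '$NF ~ /\.txt$/ {print $NF}'
}
```

The built-in alternative is `hadoop fs -find <path> -name '*.txt' -print`, which avoids the awk post-processing.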
A few commands are selected as examples.
| Action | Command |
|---|---|
| Create a directory named /foodir | hadoop fs -mkdir /foodir |
| Delete the directory named /foodir | hadoop fs -rm -R /foodir |
| Copy a local file into the /foodir directory | hadoop fs -put test.txt /foodir<br>hadoop fs -copyFromLocal test.txt /foodir |
| Download a file to the local filesystem | hadoop fs -get /foodir/test.txt ./localdir<br>hadoop fs -copyToLocal /foodir/test.txt ./localdir |
| View the contents of the file /foodir/test.txt | hadoop fs -cat /foodir/test.txt |
| Set the replication factor of a file | hadoop fs -setrep 5 /foodir/test.txt |
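These commands compose well in scripts because they report success through exit codes; in particular, -test exits 0 when the check passes and non-zero otherwise. A minimal sketch of an idempotent upload, assuming a reachable cluster (hdfs_put_fresh, test.txt, and /foodir are illustrative names reusing the table above):

```shell
# Upload a local file into an HDFS directory, creating the directory on
# first use and overwriting the previous copy on later runs.
hdfs_put_fresh() {
  local src="$1" dest="$2"
  if hadoop fs -test -d "$dest"; then     # exit 0 when $dest exists and is a directory
    hadoop fs -put -f "$src" "$dest"      # -f overwrites an existing file
  else
    hadoop fs -mkdir -p "$dest" && hadoop fs -put "$src" "$dest"
  fi
}
```

Running `hdfs_put_fresh test.txt /foodir` twice leaves exactly one copy of test.txt in /foodir, whereas a bare -put would fail on the second run because the file already exists.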



