Shell Operations on HDFS (a key topic for development)

1. Basic Syntax

bin/hadoop fs <command>  OR  bin/hdfs dfs <command>
Here, dfs is an implementation class of fs; the two forms are equivalent.

2. Full Command Reference

[hadoop@hadoop103 hadoop-2.7.2]$ bin/hadoop fs
Usage: hadoop fs [generic options]
	[-appendToFile <localsrc> ... <dst>]
	[-cat [-ignoreCrc] <src> ...]
	[-checksum <src> ...]
	[-chgrp [-R] GROUP PATH...]
	[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
	[-chown [-R] [OWNER][:[GROUP]] PATH...]
	[-copyFromLocal [-f] [-p] [-l] <localsrc> ... <dst>]
	[-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
	[-count [-q] [-h] <path> ...]
	[-cp [-f] [-p | -p[topax]] <src> ... <dst>]
	[-createSnapshot <snapshotDir> [<snapshotName>]]
	[-deleteSnapshot <snapshotDir> <snapshotName>]
	[-df [-h] [<path> ...]]
	[-du [-s] [-h] <path> ...]
	[-expunge]
	[-find <path> ... <expression> ...]
	[-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
	[-getfacl [-R] <path>]
	[-getfattr [-R] {-n name | -d} [-e en] <path>]
	[-getmerge [-nl] <src> <localdst>]
	[-help [cmd ...]]
	[-ls [-d] [-h] [-R] [<path> ...]]
	[-mkdir [-p] <path> ...]
	[-moveFromLocal <localsrc> ... <dst>]
	[-moveToLocal <src> <localdst>]
	[-mv <src> ... <dst>]
	[-put [-f] [-p] [-l] <localsrc> ... <dst>]
	[-renameSnapshot <snapshotDir> <oldName> <newName>]
	[-rm [-f] [-r|-R] [-skipTrash] <src> ...]
	[-rmdir [--ignore-fail-on-non-empty] <dir> ...]
	[-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
	[-setfattr {-n name [-v value] | -x name} <path>]
	[-setrep [-R] [-w] <rep> <path> ...]
	[-stat [format] <path> ...]
	[-tail [-f] <file>]
	[-test -[defsz] <path>]
	[-text [-ignoreCrc] <src> ...]
	[-touchz <path> ...]
	[-truncate [-w] <length> <path> ...]
	[-usage [cmd ...]]

Generic options supported are
-conf <configuration file>     specify an application configuration file
-D <property=value>            use value for given property
-fs <local|namenode:port>      specify a namenode
-jt <local|resourcemanager:port>    specify a ResourceManager
-files <comma separated list of files>    specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>    specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines.

The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]

3. Hands-on with Common Commands

First, start the cluster:

On hadoop102:

[hadoop@hadoop102 ~]$ cd /opt/module/hadoop-2.7.2/
[hadoop@hadoop102 hadoop-2.7.2]$ sbin/start-dfs.sh 

On hadoop103:

[hadoop@hadoop103 ~]$ cd /opt/module/hadoop-2.7.2/
[hadoop@hadoop103 hadoop-2.7.2]$ sbin/start-yarn.sh 

(1) -help: print usage information for a command

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -help rm

(2) -ls: list directory contents

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -ls /

(3) -mkdir: create a directory on HDFS

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -mkdir -p /chongqing/chengkou

(4) -moveFromLocal: cut and paste a file from the local file system to HDFS

[hadoop@hadoop103 hadoop-2.7.2]$ touch bossxiang.txt
[hadoop@hadoop103 hadoop-2.7.2]$ vi bossxiang.txt 
woshibossxiang
[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -moveFromLocal ./bossxiang.txt /chongqing/chengkou
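The "cut and paste" semantics mean the local source file is removed once it lands on HDFS (unlike -copyFromLocal, which leaves the source in place). A minimal local sketch of the same semantics using plain mv, with hypothetical filenames, not the hadoop CLI:

```shell
# Local analogy for -moveFromLocal's cut-and-paste semantics
# (hypothetical filenames): after the move, the source no longer
# exists at its old path, only the destination copy remains.
printf 'woshibossxiang\n' > bossxiang_demo.txt
mkdir -p dest
mv bossxiang_demo.txt dest/        # like -moveFromLocal: source is removed
cat dest/bossxiang_demo.txt        # prints: woshibossxiang
```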

(5) -appendToFile: append a local file to the end of an existing HDFS file

[hadoop@hadoop103 hadoop-2.7.2]$ touch yuan.txt
[hadoop@hadoop103 hadoop-2.7.2]$ vi yuan.txt 
wo shi chen yuan
[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -appendToFile ./yuan.txt /chongqing/chengkou/bossxiang.txt

(6) -cat: display file contents

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -cat /chongqing/chengkou/bossxiang.txt

(7) -chgrp / -chmod / -chown: same usage as in the Linux file system; change a file's group, permissions, or owner

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -chgrp hadoop /chongqing/chengkou/bossxiang.txt

(8) -copyFromLocal: copy a file from the local file system to an HDFS path

[hadoop@hadoop102 hadoop-2.7.2]$ touch xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ vi xinyue.txt 
wo shi xinyue
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -copyFromLocal ./xinyue.txt  /chongqing/chengkou/

(9) -copyToLocal: copy from HDFS to the local file system

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -copyToLocal /chongqing/chengkou/bossxiang.txt  ./

(10) -cp: copy from one HDFS path to another HDFS path

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -cp /chongqing/chengkou/xinyue.txt /chongqing/


(11) -mv: move files within HDFS

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -mv /chongqing/xinyue.txt /


(12) -get: equivalent to copyToLocal; download a file from HDFS to the local file system

[hadoop@hadoop102 hadoop-2.7.2]$ rm xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -get /xinyue.txt

(13) -getmerge: download and merge multiple files; for example, when the HDFS directory /user/atguigu/test contains several files: log.1, log.2, log.3, …

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -getmerge /chongqing/chengkou/* ./zaiyiqi.txt
[hadoop@hadoop102 hadoop-2.7.2]$ cat zaiyiqi.txt 
woshibossxiang
wo shi xinyue

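As the zaiyiqi.txt output above shows, -getmerge simply concatenates the contents of the matched source files into a single local file. A local sketch of the resulting content using plain cat, with hypothetical filenames, not the hadoop CLI:

```shell
# Local sketch of what -getmerge produces: the source files'
# contents concatenated, in order, into one local file
# (hypothetical filenames).
printf 'woshibossxiang\n' > part1.txt
printf 'wo shi xinyue\n'  > part2.txt
cat part1.txt part2.txt > merged_demo.txt
cat merged_demo.txt
```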
(14) -put: equivalent to copyFromLocal

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -put ./LICENSE.txt /chongqing/chengkou

(15) -tail: display the end of a file

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -tail /chongqing/chengkou/LICENSE.txt
mentation and/or other materials provided with the
   distribution.

   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

   You can contact the author at :
   - LZ4 source repository : http://code.google.com/p/lz4/
   - LZ4 public forum : https://groups.google.com/forum/#!forum/lz4c
*/

(16) -rm: delete files or directories

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -rm /chongqing/chengkou/LICENSE.txt
20/08/08 04:21:42 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 0 minutes, Emptier interval = 0 minutes.
Deleted /chongqing/chengkou/LICENSE.txt

(17) -rmdir: delete an empty directory

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -mkdir /test
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -rmdir /test

(18) -du: report directory and file size information

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -du /
29         /chongqing
212046774  /hadoop-2.7.2.tar.gz
13         /sanguo
37         /wc.input
14         /xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -du -h /
29       /chongqing
202.2 M  /hadoop-2.7.2.tar.gz
13       /sanguo
37       /wc.input
14       /xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -du -h -s /
202.2 M  /
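The numbers -du reports are logical file lengths in bytes (with -h they are shown in human-readable units); for instance, the 14 reported for /xinyue.txt is "wo shi xinyue" (13 characters) plus a trailing newline. A minimal local check of that arithmetic with a hypothetical local copy, using wc -c rather than the hadoop CLI:

```shell
# Verify the 14-byte figure: "wo shi xinyue" is 13 characters
# plus a trailing newline (hypothetical local copy of the file).
printf 'wo shi xinyue\n' > xinyue_demo.txt
wc -c < xinyue_demo.txt              # prints 14
```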

(19) -setrep: set the replication factor of a file in HDFS. Note that the value set here is only recorded in the NameNode's metadata; whether that many replicas actually exist depends on the number of DataNodes available.

[hadoop@hadoop102 hadoop-2.7.2]$ cd data/tmp/dfs/data/current/BP-1454198558-117.59.224.141-1595323236787/current/finalized/subdir0/subdir0/
[hadoop@hadoop102 subdir0]$ hadoop fs -rm -R /chongqing
[hadoop@hadoop102 subdir0]$ hadoop fs -rm /hadoop-2.7.2.tar.gz
[hadoop@hadoop102 subdir0]$ hadoop fs -rm -R /sanguo
[hadoop@hadoop102 subdir0]$ ll
total 207096
-rw-rw-r--. 1 hadoop hadoop        37 Jul 21 09:10 blk_1073741825
-rw-rw-r--. 1 hadoop hadoop        11 Jul 21 09:10 blk_1073741825_1001.meta
-rw-rw-r--. 1 hadoop hadoop        14 Aug  8 02:57 blk_1073741831
-rw-rw-r--. 1 hadoop hadoop        11 Aug  8 02:57 blk_1073741831_1011.meta
drwxr-xr-x. 9 hadoop hadoop       149 Jan 25  2016 hadoop-2.7.2
-rw-rw-r--. 1 hadoop hadoop 212046774 Jul 21 09:26 tmp.txt
[hadoop@hadoop102 subdir0]$ cat blk_1073741825
xiang
xiang
lin 
lin yuan chen  yuan
[hadoop@hadoop102 subdir0]$ cat blk_1073741831
wo shi xinyue
[hadoop@hadoop102 subdir0]$ 
[hadoop@hadoop102 finalized]$ hadoop fs -setrep 2 /xinyue.txt
