HDFS Basic Operations


1. List an HDFS directory:

Run:

hadoop fs -ls hdfs://192.168.1.100:9000/

 

[hadoop@baolibin ~]$ hadoop fs -ls hdfs://192.168.1.100:9000/
Warning: $HADOOP_HOME is deprecated.

Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr
[hadoop@baolibin ~]$


 

2. List an HDFS directory recursively:

Run:

hadoop fs -lsr hdfs://192.168.1.100:9000/

 

[hadoop@baolibin ~]$ hadoop fs -lsr hdfs://192.168.1.100:9000/
Warning: $HADOOP_HOME is deprecated.

drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr/hadoop
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:05 /usr/hadoop/tmp
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 21:24 /usr/hadoop/tmp/mapred
drwx------   - hadoop supergroup          0 2015-02-15 21:24 /usr/hadoop/tmp/mapred/system
-rw-------   1 hadoop supergroup          4 2015-02-15 21:24 /usr/hadoop/tmp/mapred/system/jobtracker.info
[hadoop@baolibin ~]$


 

3. Create a directory:

Run:

hadoop fs -mkdir input

 

[hadoop@baolibin ~]$ hadoop fs -mkdir input
Warning: $HADOOP_HOME is deprecated.
 
[hadoop@baolibin ~]$


 

 

The HDFS path where the new directory was created:

[hadoop@baolibin ~]$ hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.
 
Found 1 items
drwxr-xr-x   - hadoop supergroup          0 2015-02-15 22:09 /user/hadoop/input
[hadoop@baolibin ~]$
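Note that `hadoop fs -mkdir input` created `/user/hadoop/input`: relative paths are resolved against the current user's HDFS home directory, `/user/<currentUser>`. A minimal sketch of that resolution rule (the function name is invented for illustration, not part of Hadoop):

```shell
# Hypothetical helper mirroring how HDFS resolves paths:
# absolute paths pass through; relative paths land under /user/<user>.
resolve_hdfs_path() {
  user="$1"; path="$2"
  case "$path" in
    /*) printf '%s\n' "$path" ;;
    *)  printf '/user/%s/%s\n' "$user" "$path" ;;
  esac
}

resolve_hdfs_path hadoop input        # prints /user/hadoop/input
resolve_hdfs_path hadoop /tmp/data    # prints /tmp/data
```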


 

4. Upload a file from Linux to HDFS:

Run:

hadoop fs -put /home/hadoop/baozi.txt /user/hadoop/input/

 

[hadoop@baolibin ~]$ hadoop fs -put /home/hadoop/baozi.txt /user/hadoop/input/
Warning: $HADOOP_HOME is deprecated.
 
[hadoop@baolibin ~]$


 

List the input directory:

[hadoop@baolibin ~]$ hadoop fs -ls input
Warning: $HADOOP_HOME is deprecated.
 
Found 1 items
-rw-r--r--   1 hadoop supergroup         46 2015-02-15 22:13 /user/hadoop/input/baozi.txt
[hadoop@baolibin ~]$


 

 

5. Download a file from HDFS to Linux:

Run:

hadoop fs -get /user/hadoop/input/baozi.txt /home/hadoop/xiaobaozi

 

[hadoop@baolibin ~]$ hadoop fs -get /user/hadoop/input/baozi.txt /home/hadoop/xiaobaozi
Warning: $HADOOP_HOME is deprecated.
 
[hadoop@baolibin ~]$


 

 

Check it locally:

[hadoop@baolibin ~]$ ll /home/hadoop/xiaobaozi
total 4
-rw-rw-r--. 1 hadoop hadoop 46 Feb 15 22:17 baozi.txt
[hadoop@baolibin ~]$


 

6. View file contents:

Run:

hadoop fs -text input/baozi.txt

 

[hadoop@baolibin ~]$ hadoop fs -text input/baozi.txt
Warning: $HADOOP_HOME is deprecated.

hadoop hello java baozi hbase hive hadoop java
[hadoop@baolibin ~]$
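Here `-text` behaves like `-cat`, but unlike `-cat` it also decodes certain non-plain formats (such as SequenceFiles and zip-compressed data) before printing. A rough local analogy, no cluster needed, using gzip (the file name is made up for the demo):

```shell
# Compress a small file; gzip -dc decodes it before printing,
# roughly the way -text would, whereas a plain cat would dump
# the raw compressed bytes.
printf 'hadoop hello java\n' | gzip -c > /tmp/baozi_demo.gz
gzip -dc /tmp/baozi_demo.gz    # prints: hadoop hello java
```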


 

7. Delete a file:

Run:

hadoop fs -rm input/baozi.txt

 

[hadoop@baolibin ~]$ hadoop fs -rm input/baozi.txt
Warning: $HADOOP_HOME is deprecated.

Deleted hdfs://192.168.1.100:9000/user/hadoop/input/baozi.txt
[hadoop@baolibin ~]$


 

Verify the deletion:

[hadoop@baolibin ~]$ hadoop fs -ls input/
Warning: $HADOOP_HOME is deprecated.
 
[hadoop@baolibin ~]$


 

8. Delete a directory recursively:

Run:

hadoop fs -rmr input

 

[hadoop@baolibin ~]$ hadoop fs -rmr input
Warning: $HADOOP_HOME is deprecated.
 
Deleted hdfs://192.168.1.100:9000/user/hadoop/input
[hadoop@baolibin ~]$


 

Verify the deletion:

[hadoop@baolibin ~]$ hadoop fs -ls
Warning: $HADOOP_HOME is deprecated.
 
[hadoop@baolibin ~]$


 

 

9. List all commands:

Run:

hadoop fs

 

[hadoop@baolibin ~]$ hadoop fs
Warning: $HADOOP_HOME is deprecated.
 
Usage: java FsShell
          [-ls <path>]
          [-lsr <path>]
          [-du <path>]
          [-dus <path>]
          [-count[-q] <path>]
          [-mv <src> <dst>]
          [-cp <src> <dst>]
          [-rm [-skipTrash] <path>]
          [-rmr [-skipTrash] <path>]
          [-expunge]
           [-put <localsrc> ... <dst>]
          [-copyFromLocal <localsrc> ... <dst>]
          [-moveFromLocal <localsrc> ... <dst>]
          [-get [-ignoreCrc] [-crc] <src> <localdst>]
          [-getmerge <src> <localdst> [addnl]]
          [-cat <src>]
          [-text <src>]
          [-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
          [-moveToLocal [-crc] <src> <localdst>]
          [-mkdir <path>]
          [-setrep [-R] [-w] <rep> <path/file>]
          [-touchz <path>]
          [-test -[ezd] <path>]
          [-stat [format] <path>]
          [-tail [-f] <file>]
          [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
          [-chown [-R] [OWNER][:[GROUP]] PATH...]
          [-chgrp [-R] GROUP PATH...]
          [-help [cmd]]
 
Generic options supported are
-conf <configuration file>     specify an application configuration file
-D <property=value>            use value for given property
-fs <local|namenode:port>      specify a namenode
-jt <local|jobtracker:port>    specify a job tracker
-files <comma separated list of files>    specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>    specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines.

The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]
 
[hadoop@baolibin ~]$


 

10. View usage for a specific command:

Run:

hadoop fs -help ls

 

 

[hadoop@baolibin ~]$ hadoop fs -help ls
Warning: $HADOOP_HOME is deprecated.
 
-ls <path>:     List the contents that match the specified file pattern. If
                path is not specified, the contents of /user/<currentUser>
                will be listed. Directory entries are of the form
                        dirName (full path) <dir>
                and file entries are of the form
                        fileName (full path) <r n> size
                where n is the number of replicas specified for the file
                and size is the size of the file, in bytes.
 
[hadoop@baolibin ~]$
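The help text above documents the entry format. Since the listing output is whitespace-separated, individual fields can be pulled out with awk; the sketch below uses one file entry from the earlier listing (field positions assumed from that sample output):

```shell
# One file entry from a hadoop fs -ls listing; with default whitespace
# splitting: $1=perms $2=replication $3=owner $4=group $5=size $6=date
# $7=time $8=path
line='-rw-r--r--   1 hadoop supergroup         46 2015-02-15 22:13 /user/hadoop/input/baozi.txt'
echo "$line" | awk '{print "owner=" $3, "size=" $5, "path=" $8}'
# prints: owner=hadoop size=46 path=/user/hadoop/input/baozi.txt
```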


 

11. View basic HDFS statistics:

Run:

/usr/hadoop/bin/hadoop dfsadmin -report

 

[hadoop@baolibin current]$ /usr/hadoop/bin/hadoop dfsadmin -report
Warning: $HADOOP_HOME is deprecated.
 
Configured Capacity: 7431069696 (6.92 GB)
Present Capacity: 2737635328 (2.55 GB)
DFS Remaining: 2737594368 (2.55 GB)
DFS Used: 40960 (40 KB)
DFS Used%: 0%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
 
-------------------------------------------------
Datanodes available: 1 (1 total, 0 dead)
 
Name: 192.168.1.100:50010
Decommission Status : Normal
Configured Capacity: 7431069696 (6.92 GB)
DFS Used: 40960 (40 KB)
Non DFS Used: 4693434368 (4.37 GB)
DFS Remaining: 2737594368 (2.55 GB)
DFS Used%: 0%
DFS Remaining%: 36.84%
Last contact: Mon Feb 16 00:19:31 CST 2015
 
[hadoop@baolibin current]$
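The numbers in the report are internally consistent: Present Capacity = Configured Capacity − Non DFS Used, and DFS Remaining = Present Capacity − DFS Used. Checking with shell arithmetic:

```shell
configured=7431069696   # Configured Capacity
non_dfs=4693434368      # Non DFS Used
dfs_used=40960          # DFS Used
present=$((configured - non_dfs))
remaining=$((present - dfs_used))
echo "Present Capacity: $present"    # 2737635328, matching the report
echo "DFS Remaining:    $remaining"  # 2737594368, matching the report
```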



12. Command summary:


hadoop fs

      -help [cmd]      // show help for a command
      -ls(r) <path>     // list files in a directory (recursively with -lsr)
      -du(s) <path>    // show the sizes of files in a directory
      -count [-q] <path>     // count files in a directory
      -mv <src> <dst>      // move files to a destination directory
      -cp <src> <dst> // copy files to a destination directory
      -rm(r)              // delete files (directories with -rmr)
      -put <localsrc> <dst>       // copy a local file to HDFS
      -copyFromLocal       // same as put
      -moveFromLocal      // move a local file to HDFS
      -get [-ignoreCrc] <src> <localdst>  // copy a file to the local filesystem, optionally skipping the CRC check
      -getmerge <src> <localdst>            // merge all files in the source directory, sorted, into one local file
      -cat <src>  // print file contents to the terminal
      -text <src> // print file contents to the terminal, decoding non-plain formats
      -copyToLocal [-ignoreCrc] <src> <localdst>  // copy to the local filesystem
      -moveToLocal <src> <localdst>
      -mkdir <path>  // create a directory
      -touchz <path>  // create an empty file
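The commands above can be strung together into a small script. The sketch below wraps `hadoop fs` behind a `DRY_RUN` switch (an invented convenience for this tutorial, not part of Hadoop) so the sequence can be previewed without a running cluster:

```shell
# run: execute "hadoop fs <args>", or just print the command when DRY_RUN=1.
run() {
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "hadoop fs $*"
  else
    hadoop fs "$@"
  fi
}

DRY_RUN=1   # set to 0 on a real cluster
run -mkdir input
run -put /home/hadoop/baozi.txt input/
run -text input/baozi.txt
run -rmr input
```

With `DRY_RUN=1` this prints the four `hadoop fs` command lines in order instead of executing them.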


I hope this helps beginners.


