Hadoop Command Summary

List files:

$ ./hadoop fs -ls

Found 17 items

-rwxr-xr-x   1 yj70978 retailfi       1259 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-mapred.sh

-rwxr-xr-x   1 yj70978 retailfi       2642 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop-config.sh

-rwxr-xr-x   1 yj70978 retailfi       2810 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/rcc

-rwxr-xr-x   1 yj70978 retailfi      14189 2013-07-22 07:58 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop

-rwxr-xr-x   1 yj70978 retailfi       1329 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop-daemons.sh

-rwxr-xr-x   1 yj70978 retailfi       1145 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-jobhistoryserver.sh

-rwxr-xr-x   1 yj70978 retailfi       2143 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/slaves.sh

-rwxr-xr-x   1 yj70978 retailfi       1116 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-balancer.sh

-rwxr-xr-x   1 yj70978 retailfi       1745 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-dfs.sh

-rwxr-xr-x   1 yj70978 retailfi       1168 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-mapred.sh

-rwxr-xr-x   1 yj70978 retailfi       1246 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-dfs.sh

-rwxr-xr-x   1 yj70978 retailfi       1166 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-all.sh

-rwxr-xr-x   1 yj70978 retailfi       1119 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-all.sh

-rwxr-xr-x   1 yj70978 retailfi      63970 2013-01-30 21:06 /home/yj70978/hadoop/hadoop-1.1.2/bin/task-controller

-rwxr-xr-x   1 yj70978 retailfi       1065 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/start-balancer.sh

-rwxr-xr-x   1 yj70978 retailfi       1131 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/stop-jobhistoryserver.sh

-rwxr-xr-x   1 yj70978 retailfi       4649 2013-01-30 21:05 /home/yj70978/hadoop/hadoop-1.1.2/bin/hadoop-daemon.sh

HDFS has a default working directory of /user/$USER, where $USER is your login user name. This directory isn't automatically created for you, though, so let's create it with the mkdir command.
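For example, the working directory can be created like this; the path assumes the login name used elsewhere in this summary (mz50947), so substitute your own:

```shell
# Create the default HDFS working directory for the current user.
# /user/mz50947 is the user name from the other examples in this post;
# replace it with your own login name.
hadoop fs -mkdir /user/mz50947
```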

List all subdirectories recursively,

hadoop fs -lsr /user

Copy a file from the local filesystem to HDFS,

$ hadoop fs -put 1.txt /user/mz50947

$ hadoop fs -ls /user/mz50947

Found 2 items

-rw-r--r--   3 mz50947 enterpriserisk          0 2013-07-23 01:39 /user/mz50947/1

-rw-r--r--   3 mz50947 enterpriserisk          0 2013-07-23 01:40 /user/mz50947/1.txt

Delete a file in HDFS,

$ hadoop fs -rm /user/mz50947/1.txt

Moved: 'hdfs://bdwar001m01l.nam.nsroot.net:8020/user/mz50947/1.txt' to trash at: hdfs://bdwar001m01l.nam.nsroot.net:8020/user/mz50947/.Trash/Current

See the contents of a file,

hadoop fs -cat /user/mz50947/1.txt

Touch (create an empty file in HDFS),
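In Hadoop 1.x the equivalent of Unix touch is -touchz, which creates a zero-length file; the path below reuses the empty file /user/mz50947/1 seen in the earlier listing:

```shell
# Create a zero-length file in HDFS (analogous to Unix touch).
# The target path is illustrative; substitute your own directory.
hadoop fs -touchz /user/mz50947/1
```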

Get a file from HDFS to the local filesystem,

hadoop fs -get /user/mz50947/1.txt .

Look up help,

hadoop fs -help
