1. List the contents of an HDFS directory
hadoop fs -ls -R /    recursively lists everything under the root (-lsr is the old, deprecated spelling)
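The -R flag mirrors the recursive listing of a local `ls -R`. A minimal local-filesystem sketch (the `lsdemo` tree below is a made-up example, not HDFS):

```shell
# Build a small demo tree, then list it recursively, like `hadoop fs -ls -R` does on HDFS
mkdir -p lsdemo/inner
touch lsdemo/inner/file.txt
ls -R lsdemo
```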
2. Create nested directories on HDFS
hadoop fs -mkdir -p /sanguo/shuguo    -p creates missing parent directories
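The -p flag behaves exactly like POSIX `mkdir -p`: intermediate directories are created as needed and existing ones are not an error. Local sketch (the `mkdemo` prefix is a hypothetical example path):

```shell
# Both levels are created in one call; re-running it would also succeed
mkdir -p mkdemo/sanguo/shuguo
ls mkdemo/sanguo
```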
3. Move (cut) a local file to HDFS
hadoop fs -moveFromLocal ./panjinlian.txt /sanguo/shuguo    ./ means the current directory; the local copy is removed
4. Append a local file to the end of a file already on HDFS
hadoop fs -appendToFile ./liubei.txt /sanguo/shuguo/panjinlian.txt
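The effect of -appendToFile is like a local `>>`: the local file's bytes land after the existing content of the target. A local analog using the same file names (created here for the demo):

```shell
# Create the two demo files locally
printf 'panjinlian\n' > panjinlian.txt
printf 'liubei\n'     > liubei.txt
# Local analog of: hadoop fs -appendToFile ./liubei.txt /sanguo/shuguo/panjinlian.txt
cat liubei.txt >> panjinlian.txt
cat panjinlian.txt
```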
5. Copy a local file to HDFS
hadoop fs -copyFromLocal ./panjinlian.txt /sanguo/shuguo    unlike -moveFromLocal, the local copy is kept
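The difference between steps 3 and 5 is the same as `mv` vs `cp` on a local filesystem: -moveFromLocal deletes the local source, -copyFromLocal keeps it. A local sketch with throwaway demo files:

```shell
printf 'x\n' > moved.txt
printf 'x\n' > copied.txt
mv moved.txt moved.dst     # like -moveFromLocal: source is gone afterwards
cp copied.txt copied.dst   # like -copyFromLocal: source remains
[ -e moved.txt ]  || echo "moved.txt removed"
[ -e copied.txt ] && echo "copied.txt kept"
```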
6. Download files from HDFS
hadoop fs -copyToLocal /sanguo/shuguo/panjinlian.txt ./
hadoop fs -get /sanguo/shuguo/panjinlian.txt ./    -get is equivalent to -copyToLocal
hadoop fs -getmerge /sanguo/shuguo/* ./zaiyiqi.txt    merges all files under shuguo into the single local file zaiyiqi.txt
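What -getmerge does is essentially a concatenation of every matched file into one local file, like `cat dir/* > out` locally. A runnable sketch (the `mergedemo` directory and its contents are invented for the demo):

```shell
mkdir -p mergedemo
printf 'one\n' > mergedemo/a.txt
printf 'two\n' > mergedemo/b.txt
# Local analog of: hadoop fs -getmerge /sanguo/shuguo/* ./zaiyiqi.txt
cat mergedemo/* > zaiyiqi.txt
cat zaiyiqi.txt
```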
7. Upload a file
hadoop fs -put ./panjinlian.txt /sanguo/shuguo    -put is equivalent to -copyFromLocal
8. Starting and stopping the cluster
From the hadoop installation directory:
sbin/start-dfs.sh
sbin/stop-dfs.sh
sbin/start-yarn.sh
sbin/stop-yarn.sh
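The scripts above are run relative to the Hadoop install directory; HDFS is usually started before YARN. A small sketch that only prints the intended start order (a real cluster is needed to actually run the scripts; /opt/hadoop is an assumed example install path, not from these notes):

```shell
# Print the usual start order without invoking anything
HADOOP_HOME=/opt/hadoop   # assumed install path, adjust to your layout
for script in start-dfs.sh start-yarn.sh; do
  echo "$HADOOP_HOME/sbin/$script"
done > start-order.txt
cat start-order.txt
```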