This post covers the main HDFS file operations through the Java API: listing, creating, deleting, uploading, and downloading files. The complete source code is given at the end.
For environment setup, see: https://blog.csdn.net/qq_25948717/article/details/82015131
Maven simplifies dependency management and version matching.
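For reference, the Hadoop client dependency in pom.xml looks roughly like this (the version number below is an assumption; match it to the Hadoop version running on your cluster):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <!-- assumed version; use the one matching your cluster -->
  <version>2.6.0</version>
</dependency>
```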
After setup, the project looks as shown:
Configuration:
In the bottom-right corner you can see the dependencies being downloaded; the first download takes quite a while.
====================================================================================
Development: implement every operation as a unit test. First delete AppTest under the test directory, then create a new package and a Java test class.
Create the file:
package com.yexin.hadoop.hdfs;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import java.net.URI;
/**
* @Company: Huazhong University of science and technology
* @version: V1.0
* @author: YEXIN
* @contact: 1650996069@qq.com or yexin@hust.edu.cn 2018--2020
* @software: IntelliJ IDEA
* @file: HDFSApp
* @time: 8/24/18 2:11 PM
* @Desc:Hadoop HDFS Java API Operation
**/
public class HDFSApp {
public static final String HDFS_PATH = "hdfs://node40:9000";
FileSystem fileSystem = null; // the core class for operating on HDFS files
Configuration configuration = null;
/**
 * Create a directory on HDFS
 */
@Test
public void mkdir() throws Exception{
fileSystem.mkdirs(new Path("/hdfsapi/test")); // example path; substitute any HDFS directory
}
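For the test above to run, fileSystem must be initialized before each test method, otherwise mkdir throws a NullPointerException. A minimal sketch using the @Before/@After hooks imported above, placed inside the same HDFSApp class (the user name "hadoop" is an assumption; use whichever user owns your HDFS directories):

```java
@Before
public void setUp() throws Exception {
    configuration = new Configuration();
    // Connect to the NameNode at HDFS_PATH as user "hadoop" (assumed user name)
    fileSystem = FileSystem.get(new URI(HDFS_PATH), configuration, "hadoop");
}

@After
public void tearDown() throws Exception {
    // Release the connection and clear references after each test
    fileSystem.close();
    fileSystem = null;
    configuration = null;
}
```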