Connecting to HDFS and Creating a Directory
1. Create a URI object pointing at the HDFS NameNode address (IP and port):
URI uri = new URI("hdfs://192.168.109.135:9000");
2. Create the connection configuration and obtain a FileSystem handle:
Configuration configuration = new Configuration();
String user = "hadoop";
FileSystem fs = FileSystem.get(uri, configuration, user);
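Before handing the address to FileSystem.get, it can be sanity-checked with the standard java.net.URI class; a minimal sketch using the same address as above (your NameNode host and port may differ):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsUriCheck {
    public static void main(String[] args) throws URISyntaxException {
        // Same address as in the tutorial; 9000 is a common fs.defaultFS
        // RPC port in Hadoop 2.x setups, but clusters vary.
        URI uri = new URI("hdfs://192.168.109.135:9000");
        System.out.println(uri.getScheme()); // hdfs
        System.out.println(uri.getHost());   // 192.168.109.135
        System.out.println(uri.getPort());   // 9000
    }
}
```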
3. Full code
Maven pom.xml dependencies:
<!-- Hadoop dependencies -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.16</version>
    <scope>compile</scope>
</dependency>
Example method:
public String hello() throws URISyntaxException, IOException, InterruptedException {
    // 1. Get the file system handle.
    // nameNode (e.g. "hdfs://192.168.109.135:9000") and filePath are
    // assumed to be defined elsewhere, e.g. injected from configuration.
    Configuration configuration = new Configuration();
    String user = "hadoop";
    FileSystem fs = FileSystem.get(new URI(nameNode), configuration, user);
    // 2. Perform the operation: create the HDFS directory if it does not exist.
    Path path = new Path(filePath);
    if (!fs.exists(path)) {
        fs.mkdirs(path);
    }
    // 3. Release resources.
    fs.close();
    System.out.println("Done!");
    return "hello";
}
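The exists-then-mkdirs pattern above mirrors the local-filesystem idiom in the standard java.nio.file API; for comparison, a self-contained local sketch (the directory name below is hypothetical, standing in for the HDFS filePath):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LocalMkdirDemo {
    public static void main(String[] args) throws IOException {
        // Hypothetical local path standing in for the HDFS filePath above.
        Path dir = Path.of("demo-dir/test");
        if (!Files.exists(dir)) {
            // Like fs.mkdirs(path), this also creates missing parent directories.
            Files.createDirectories(dir);
        }
        System.out.println(Files.isDirectory(dir)); // prints true
    }
}
```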
A possible error: Could not locate executable null\bin\winutils.exe in the hadoop binaries
Fix: download winutils (download link: winutils),
point the HADOOP_HOME system environment variable at the directory containing bin\winutils.exe, and restart the system so the variable is picked up.
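If restarting to pick up the environment variable is inconvenient, a commonly used in-code workaround is to set the hadoop.home.dir system property before the first Hadoop class is loaded. A minimal sketch (the C:\hadoop path is an assumption; use whatever directory actually contains bin\winutils.exe):

```java
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // Hypothetical path: winutils.exe must sit at <hadoop.home.dir>\bin\winutils.exe.
        // This must run before any Hadoop class is touched.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        System.out.println(System.getProperty("hadoop.home.dir")); // prints C:\hadoop
    }
}
```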