The previous article set up a single-node Hadoop service; in this one we will operate on HDFS from a local IDEA project.
First, create a Maven project.
1. Directory structure
2. pom.xml
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>2.6.0-mr1-cdh5.11.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.3</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.3</version>
</dependency>
3. The HdfsTest class
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;

public class HdfsTest {
    public static void main(String[] args) {
        FileSystem fileSystem = null;
        try {
            // Obtain a FileSystem handle from the shared configuration
            fileSystem = FileSystem.get(Tools.configuration);
            // Create the /hadoop directory on HDFS
            fileSystem.mkdirs(new Path("/hadoop"));
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (fileSystem != null) {
                try {
                    fileSystem.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
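FileSystem.get above reads a shared Hadoop Configuration from a Tools helper that the article does not show. A minimal sketch of what such a helper might look like, assuming the single-node NameNode from the previous article listens at hdfs://localhost:9000 (replace fs.defaultFS with the value from your core-site.xml):

```java
import org.apache.hadoop.conf.Configuration;

// Hypothetical helper holding a shared HDFS client configuration.
// The NameNode address below is an assumption; use the fs.defaultFS
// value configured in your cluster's core-site.xml.
public class Tools {
    public static final Configuration configuration = new Configuration();

    static {
        configuration.set("fs.defaultFS", "hdfs://localhost:9000");
    }
}
```

With this in place, any client class in the project can share one Configuration instance instead of rebuilding it per call.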