1. Setting up the Hadoop cluster — omitted here (if you have not set one up yet, see: setting up Hadoop)
2. Setting up the Hadoop environment on Windows
Copy the Hadoop distribution matching the version of your cluster onto your Win10 machine.
Configure the Hadoop environment variables:
Create a new system variable:
HADOOP_HOME
C:\Users\douyonghou\Desktop\hadoop-2.7.5 (mine sits on the desktop)
Add bin and sbin to the Path variable:
%HADOOP_HOME%\bin
%HADOOP_HOME%\sbin
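If you prefer the command line to the system-properties GUI, HADOOP_HOME can also be set from cmd (a sketch; a new cmd window is needed before the variable becomes visible):

setx HADOOP_HOME "C:\Users\douyonghou\Desktop\hadoop-2.7.5"

Editing Path itself is easier through the GUI, since setx rewrites the whole value and truncates anything over 1024 characters.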
3. Test the Hadoop environment in cmd:
hadoop version
If you see version output like the figure below, the configuration succeeded.
4. Open IDEA and create a Maven project; the original post shows each step in detailed screenshots.
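If the screenshots are unavailable, an equivalent project can also be generated from the command line (a sketch using Maven's standard quickstart archetype; the groupId and artifactId match the pom.xml in the next step):

mvn archetype:generate -DgroupId=com.hsd -DartifactId=hdfs-api-excise -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false

Then open the generated folder in IDEA via File → Open.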
5. Edit the pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.hsd</groupId>
    <artifactId>hdfs-api-excise</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.5</version>
            <scope>provided</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.7.5</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.7.5</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <finalName>${project.artifactId}</finalName>
    </build>
</project>
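Once pom.xml is saved, you can verify that the Hadoop dependencies resolve by building from the project root:

mvn clean package

If Maven downloads the artifacts and reports BUILD SUCCESS, the dependency setup is correct.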
6. Copy core-site.xml and log4j.properties (mine are under C:\Users\douyonghou\Desktop\hadoop-2.7.5\etc\hadoop) into src → main → resources.
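For reference, a minimal core-site.xml only needs fs.defaultFS pointing at your NameNode (a sketch; the address below matches the one used in the program in step 8, so adjust it to your own cluster):

<?xml version="1.0"?>
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.15.164:9000</value>
    </property>
</configuration>

The log4j.properties shipped under etc/hadoop can be copied as-is; it mainly keeps the client from printing "log4j: No appenders could be found" warnings.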
7. (These steps are critical!) Add the Hadoop dependency jars.
(The original post walks through this step with a series of screenshots: in IDEA, open the project's dependency settings, add the jars shipped with your hadoop-2.7.5 distribution as module dependencies, and click Next through each dialog until the jars appear in the project.)
8. Write your Hadoop program:
package com.hadoop;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;

public class HadoopAPI {
    public static void main(String[] args) throws Exception {
        // Address of the NameNode (must match fs.defaultFS in core-site.xml)
        String uri = "hdfs://192.168.15.164:9000/";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        // List all files and directories under the HDFS root directory "/"
        FileStatus[] statuses = fs.listStatus(new Path("/"));
        for (FileStatus status : statuses) {
            System.out.println(status);
        }
        fs.close();
    }
}
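The loop above prints each raw FileStatus object. If you want friendlier output, FileStatus exposes the path, size, and type directly (a sketch of an alternative loop body using standard FileStatus accessors):

for (FileStatus status : statuses) {
    // getPath/getLen/isDirectory are standard FileStatus accessors
    System.out.println(status.getPath() + "\t"
            + status.getLen() + " bytes\t"
            + (status.isDirectory() ? "dir" : "file"));
}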
9. The last step: run it.