- Place the pre-built (Windows-compiled) Hadoop files on your own machine (the path must not contain Chinese characters). Resource: https://download.csdn.net/download/weixin_40281743/12318942
- Configure HADOOP_HOME as an environment variable, just as you would JAVA_HOME.
- Create a Maven project in IDEA.
- Add the HDFS and MapReduce dependencies to pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.7.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>2.7.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-common</artifactId>
        <version>2.7.1</version>
    </dependency>
</dependencies>
- On Windows, edit the hosts file in C:\Windows\System32\drivers\etc so your cluster's hostname resolves; verify from the command line that you can ping your Linux server.
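A hosts entry looks like the following (the hostname `hadoop100` is just an illustrative placeholder; use whatever hostname your Linux server actually has):

```
192.168.116.100 hadoop100
```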
- Write the test code in IDEA:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;

public class HadoopTest {
    public static void main(String[] args) {
        // Force-load the native library (part of the compiled resources); the path must be correct
        System.load("D:\\soft\\hadoop\\hadoop\\bin\\hadoop.dll");
        // Set the HDFS user and the local Hadoop home directory
        System.setProperty("HADOOP_USER_NAME", "root");
        System.setProperty("hadoop.home.dir", "D:\\soft\\hadoop\\hadoop");
        // Configure the connection to HDFS
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.116.100:9000");
        // HDFS operations are much like working with files in plain Java
        FileSystem fs = null;
        FSDataOutputStream out = null;
        FSDataInputStream in = null;
        try {
            fs = FileSystem.get(conf); // obtain the file system
            // Stream operations
            // Output stream (append to an existing file)
            // out = fs.append(new Path("/input/test"));
            // out.write("This is an HDFS file stream!".getBytes());
            // out.flush();
            // System.out.println("success");
            // Input stream
            in = fs.open(new Path("/input/test"));
            byte[] bs = new byte[in.available()];
            in.readFully(0, bs); // a single read() may return fewer bytes than requested
            System.out.println(new String(bs));
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (out != null) {
                try {
                    out.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            if (in != null) {
                try {
                    in.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            if (fs != null) {
                try {
                    fs.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
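One caveat when reading a stream into a byte array: a single `read(byte[])` call is allowed to return fewer bytes than the buffer holds, so filling a buffer reliably needs a loop (this is what Hadoop's `readFully` does internally). A minimal stdlib sketch of that pattern, using a `ByteArrayInputStream` as a stand-in for an HDFS stream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    // Keep reading until the buffer is full or the stream ends;
    // returns the number of bytes actually read
    static int readFully(InputStream in, byte[] buf) throws IOException {
        int total = 0;
        while (total < buf.length) {
            int n = in.read(buf, total, buf.length - total);
            if (n < 0) break; // end of stream reached early
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "This is an HDFS-style file stream!".getBytes();
        InputStream in = new ByteArrayInputStream(data);
        byte[] buf = new byte[data.length];
        int read = readFully(in, buf);
        System.out.println(read + " bytes: " + new String(buf));
    }
}
```

The same loop works for any `InputStream`, including `FSDataInputStream`, which is why relying on one bare `read()` call can silently truncate the data.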
For setting up Hadoop on Linux, see:
Hadoop cluster setup
https://blog.csdn.net/weixin_40281743/article/details/105358852