Server OS: CentOS 7
Hadoop version: 2.7.3
Windows version: 7
JDK version: 8
1. Extract Hadoop on Windows 7
Extract the Hadoop distribution on Windows 7, configure the HADOOP_HOME environment variable, and add %HADOOP_HOME%\bin to PATH. It is best to avoid spaces anywhere in the path. My path is as follows:
2. Maven dependencies
Create a Maven project and let Maven download the Hadoop jars:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>cn.net.bysoft</groupId>
    <artifactId>hadoop-test</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <dependencies>
        <!-- log -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.22</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.22</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.7.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.7.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>2.7.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
            <version>2.7.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-common</artifactId>
            <version>2.7.3</version>
        </dependency>
        <dependency>
            <groupId>jdk.tools</groupId>
            <artifactId>jdk.tools</artifactId>
            <version>1.8</version>
        </dependency>
    </dependencies>
</project>
Note the jdk.tools dependency: referencing it gave me an error, and I had to install tools.jar from the JDK's lib directory into my local Maven repository, e.g. with mvn install:install-file -Dfile="%JAVA_HOME%\lib\tools.jar" -DgroupId=jdk.tools -DartifactId=jdk.tools -Dversion=1.8 -Dpackaging=jar. My Maven repository directory is D:\Applications\apache-maven-3.3.9\.m2\repository\jdk\tools\jdk.tools\1.8, and the installed tools.jar sits in that directory, as shown in the figure:
3. Create a file on the server
On the server I created an input directory and put a hello.txt file inside it:
## Create the directory
## You can write either /input or input here:
## /input creates the directory under the HDFS root,
## while input creates it under /user/<user> (I am running as root, so it ends up in /user/root)
## You can list the HDFS root with: hadoop fs -ls /
hadoop fs -mkdir input
## Create a text file
echo "hello world hello hadoop bye" > hello.txt
## Copy the file into the input directory on HDFS
## It ends up at /user/root/input/hello.txt
hadoop fs -copyFromLocal hello.txt input
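The two CLI steps above can also be driven from the Java FileSystem API. Below is a sketch under this post's assumptions (the NameNode address used in section 4, a local hello.txt in the working directory, and the class name UploadSketch is mine); it needs a live cluster, so take it as illustration rather than a tested program:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class UploadSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://192.168.150.131:8020"), conf);
        // A relative path resolves under /user/<current user>, exactly
        // like "hadoop fs -mkdir input" above.
        fs.mkdirs(new Path("input"));
        // Equivalent of "hadoop fs -copyFromLocal hello.txt input".
        fs.copyFromLocalFile(new Path("hello.txt"), new Path("input"));
        fs.close();
    }
}
```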
4. Java code to read the file
package cn.net.bysoft.test;

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Client {

    public static void main(String[] args) {
        // Point Hadoop at the local installation so it can find winutils.exe.
        System.setProperty("hadoop.home.dir", "D:\\Applications\\hadoop-2.7.3");
        try {
            // Full HDFS path of the file created in section 3.
            String dsf = "hdfs://192.168.150.131:8020/user/root/input/hello.txt";
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create(dsf), conf);
            // Open the file and copy it to stdout in 1 KB chunks.
            FSDataInputStream hdfsInStream = fs.open(new Path(dsf));
            byte[] ioBuffer = new byte[1024];
            int readLen = hdfsInStream.read(ioBuffer);
            while (readLen != -1) {
                System.out.write(ioBuffer, 0, readLen);
                readLen = hdfsInStream.read(ioBuffer);
            }
            hdfsInStream.close();
            fs.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
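The while loop above is the standard buffered-copy idiom: read up to 1024 bytes at a time until read() returns -1. To see the loop in isolation, without a cluster, here is the same pattern run against an in-memory stream (the class and method names are mine, and the sketch assumes single-byte ASCII content, since it converts each chunk to a String separately):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadLoopDemo {

    // Same read loop as the HDFS client above, collecting into a
    // StringBuilder instead of writing to stdout.
    public static String readAll(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        byte[] ioBuffer = new byte[1024];
        int readLen = in.read(ioBuffer);
        while (readLen != -1) {
            sb.append(new String(ioBuffer, 0, readLen));
            readLen = in.read(ioBuffer);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the FSDataInputStream returned by fs.open(...).
        InputStream in = new ByteArrayInputStream(
                "hello world hello hadoop bye".getBytes());
        System.out.println(readAll(in)); // prints: hello world hello hadoop bye
    }
}
```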
I hit several exceptions the first time I ran this code; here is how I resolved them:
The first exception
Could not locate executable D:\Applications\hadoop-2.7.3\bin\winutils.exe in the Hadoop binaries.
This exception means Hadoop expects a winutils.exe in its bin directory. You can download a hadoop-common package from GitHub (builds exist for 2.2 and 2.6, among others), extract it into hadoop/bin, and overwrite existing files when prompted.
The second exception
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method)
at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59)
at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301)
at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:231)
at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:152)
at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:775)
at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:831)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:891)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:934)
at java.io.DataInputStream.read(Unknown Source)
at cn.net.bysoft.test.Client.main(Client.java:26)
I had originally downloaded the 2.2 build, which caused this exception; replacing hadoop.dll with one built for 2.7.3 fixed it.
Permission issues
By default HDFS files belong to the supergroup group, and your Windows user is not a member of it. There are three or four workarounds floating around online; I simply opened up the permissions on the input directory:
## Open up the HDFS directory permissions; works much like chmod on Linux
hadoop fs -chmod 777 input
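Both workarounds can also be expressed in the Java API. The sketch below shows two options: impersonating the owning user via the HADOOP_USER_NAME property (one of the commonly suggested fixes online) and loosening the permissions programmatically, the equivalent of the chmod above. The NameNode URI and paths are the ones from this post, the class name is mine, and a live cluster is required:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class PermissionSketch {
    public static void main(String[] args) throws Exception {
        // Option 1: connect as the user that owns the files.
        System.setProperty("HADOOP_USER_NAME", "root");

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://192.168.150.131:8020"), conf);

        // Option 2: equivalent of "hadoop fs -chmod 777 input".
        fs.setPermission(new Path("/user/root/input"),
                new FsPermission((short) 0777));
        fs.close();
    }
}
```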
5. Run the program
Running the code above produces the following output: