I. Import the JAR packages
1. If you build the project with Maven, add the HDFS dependencies (matching your Hadoop version) to the pom file. The first build may take a long time while the JARs download; be patient.
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.5</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.6.5</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.5</version>
</dependency>
2. If you are not using a Maven project, you need to copy the JARs into your project yourself.
To write a Java application that interacts with HDFS, you generally need to add the following JARs to the Java project:
(1) hadoop-common-2.6.5.jar and hadoop-nfs-2.6.5.jar from the "/usr/local/hadoop/share/hadoop/common" directory;
(2) all JARs in the "/usr/local/hadoop/share/hadoop/common/lib" directory;
(3) hadoop-hdfs-2.6.5.jar and hadoop-hdfs-nfs-2.6.5.jar from the "/usr/local/hadoop/share/hadoop/hdfs" directory;
(4) all JARs in the "/usr/local/hadoop/share/hadoop/hdfs/lib" directory.
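If you compile from the command line instead of an IDE, the four directories above can be combined into a classpath with wildcards. A minimal sketch, assuming Hadoop is installed under /usr/local/hadoop as in this guide:

```shell
# Build the classpath from the four directories listed above.
HADOOP_HOME=/usr/local/hadoop
CLASSPATH=".:$HADOOP_HOME/share/hadoop/common/*"
CLASSPATH="$CLASSPATH:$HADOOP_HOME/share/hadoop/common/lib/*"
CLASSPATH="$CLASSPATH:$HADOOP_HOME/share/hadoop/hdfs/*"
CLASSPATH="$CLASSPATH:$HADOOP_HOME/share/hadoop/hdfs/lib/*"
echo "$CLASSPATH"

# Then compile and run against it, for example:
# javac -cp "$CLASSPATH" HdfsTest.java
# java  -cp "$CLASSPATH" HdfsTest
```

The `*` wildcard in a Java classpath entry matches every JAR in that directory, so you do not have to list each file by name.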
II. Write the Java programs
Two Java programs that operate on HDFS are shown here.
1. Check whether a given file exists in HDFS (note: the HDFS address must be the master node's IP)
package Hadoop.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsTest {
    public static void main(String[] args) {
        try {
            String fileName = "/text/test.txt";
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://192.168.60.1:9000");
            conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
            FileSystem fs = FileSystem.get(conf);
            if (fs.exists(new Path(fileName))) {
                System.out.println("File exists");
            } else {
                System.out.println("File does not exist");
            }
            fs.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
2. Read the content of a file in HDFS
package Hadoop.demo;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class HdfsRead {
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://192.168.60.1:9000");
            conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
            FileSystem fs = FileSystem.get(conf);
            Path file = new Path("/text/test.txt");
            FSDataInputStream getIt = fs.open(file);
            BufferedReader d = new BufferedReader(new InputStreamReader(getIt));
            String content = d.readLine();  // read one line of the file
            System.out.println(content);
            d.close();   // close the file
            fs.close();  // close the HDFS connection
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
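Note that the program above reads only the first line. To read the whole file, loop until readLine() returns null. A minimal sketch of that loop, shown on a local temp file so it is self-contained (the identical BufferedReader loop applies unchanged to the stream returned by fs.open()):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadAllLines {
    // Reads every line from the reader; the same loop works on the
    // BufferedReader wrapped around fs.open(file) in the HDFS example.
    static String readAll(BufferedReader d) throws IOException {
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = d.readLine()) != null) {
            sb.append(line).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Local stand-in for the HDFS file, so the sketch runs anywhere.
        Path tmp = Files.createTempFile("hdfs-demo", ".txt");
        Files.write(tmp, "first line\nsecond line\n".getBytes());
        try (BufferedReader d = Files.newBufferedReader(tmp)) {
            System.out.print(readAll(d));  // prints both lines
        }
        Files.delete(tmp);
    }
}
```

Using try-with-resources, as above, also guarantees the reader is closed even when an exception is thrown mid-read.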
When running the code you may hit the following problem:
Error running 'ExampleForHbase': Cannot start process, the working directory 'J:\HadoopTest\HadoopTest' does not exist
Solution:
This means the run configuration points at a working directory that no longer exists. In the IDE, open Run → Edit Configurations, set the working directory to the project's actual root directory, click OK, and run again.