1. Create a Java project in IDEA;
2. Import the required dependency JARs;
2.1 Dependency JAR locations (make sure Hadoop is already running on the VM: start-all.sh):
The JARs to import are:
(1) hadoop-common-2.7.1.jar and hadoop-nfs-2.7.1.jar under ~/hadoop/common;
(2) all JARs under ~/hadoop/common/lib;
(3) hadoop-hdfs-2.7.1.jar, hadoop-hdfs-nfs-2.7.1.jar and hadoop-hdfs-client-3.2.3.jar under ~/share/hadoop/hdfs;
(4) all JARs under ~/hadoop/hdfs/lib.
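In a stock Hadoop 2.7.1 installation these JARs live under $HADOOP_HOME/share/hadoop/; a quick way to confirm the exact paths before importing them is to list them on the VM (the $HADOOP_HOME location is an assumption for your install, e.g. /usr/local/hadoop):

```shell
# Assumes HADOOP_HOME points at your Hadoop installation directory.
ls "$HADOOP_HOME"/share/hadoop/common/hadoop-common-*.jar
ls "$HADOOP_HOME"/share/hadoop/common/lib/ | head
ls "$HADOOP_HOME"/share/hadoop/hdfs/hadoop-hdfs-*.jar
ls "$HADOOP_HOME"/share/hadoop/hdfs/lib/ | head
```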
2.2 In IDEA: File -> Project Structure -> Libraries, and add the JARs listed above to the project;
3. Create the HDFSFileIfExist class under the src folder and write the code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HDFSFileIfExist {
    public static void main(String[] args) {
        try {
            // Relative HDFS path, resolved against the HDFS home directory.
            String fileName = "test";
            Configuration conf = new Configuration();
            // Point at the HDFS NameNode; adjust IP and port to your cluster.
            conf.set("fs.defaultFS", "hdfs://192.168.18.23:9000");
            conf.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
            FileSystem fs = FileSystem.get(conf);
            if (fs.exists(new Path(fileName))) {
                System.out.println("File exists");
            } else {
                System.out.println("File does not exist");
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Run the program; it prints "File does not exist" (the HDFS path "test" has not been created yet).
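To exercise the other branch, the file can be created in HDFS first and the program rerun; the commands below assume Hadoop's bin directory is on the PATH on the VM:

```shell
# Create an empty file named "test" in the HDFS home directory;
# on the next run the check above should print "File exists".
hdfs dfs -touchz test
hdfs dfs -ls test
```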
4. Build the JavaHDFS.jar file and upload it to the VM;
4.1 In IDEA: File -> Project Structure -> Artifacts, then build the JAR via Build -> Build Artifacts;
4.2 Upload the generated JAR to the VM with Xftp;
5. Run the JAR on the VM; it likewise prints "File does not exist":
./bin/hadoop jar JavaHDFS.jar
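The hadoop jar command also accepts an optional main-class argument; if the JAR's manifest does not declare Main-Class, the class can be named explicitly (JavaHDFS.jar and HDFSFileIfExist are the names used in the steps above):

```shell
# Run from the Hadoop installation directory; the main class is passed
# explicitly in case the JAR manifest lacks a Main-Class entry.
./bin/hadoop jar JavaHDFS.jar HDFSFileIfExist
```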