Before starting, check whether the file to be deleted (/user/hadoop/program_put_input) exists:
package CheckAndDelete;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckAndDelete {

    /**
     * Check whether the file exists; if it does, delete it.
     * Returns true only if the path existed and was deleted successfully.
     */
    static boolean checkAndDelete(final String path, Configuration conf) {
        Path dstPath = new Path(path);
        try {
            FileSystem hdfs = dstPath.getFileSystem(conf);
            // Check whether the file exists
            if (hdfs.exists(dstPath)) {
                // Delete it; the second argument enables recursive deletion
                // in case the path is a directory. Return delete()'s actual
                // result instead of a hard-coded true.
                return hdfs.delete(dstPath, true);
            } else {
                return false;
            }
        } catch (IOException e) {
            e.printStackTrace();
            return false;
        }
    }

    public static void main(String[] args) {
        Configuration conf = new Configuration();
        String path = "/user/hadoop/program_put_input";
        boolean status = checkAndDelete(path, conf);
        System.out.println("deleted? " + status);
    }
}
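The same exists-then-delete pattern can be exercised without a running cluster. A minimal sketch using `java.nio.file` on the local filesystem (not HDFS; the class name `LocalCheckAndDelete` and the temp-file name are illustrative assumptions):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LocalCheckAndDelete {

    // Mirrors checkAndDelete above: true only if the path existed and was removed.
    static boolean checkAndDelete(Path p) {
        try {
            // Files.deleteIfExists returns false when the path was absent,
            // collapsing the exists()/delete() branch of the HDFS version.
            return Files.deleteIfExists(p);
        } catch (IOException e) {
            e.printStackTrace();
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("program_put_input", ".txt");
        System.out.println("first delete:  " + checkAndDelete(p));  // file exists, so true
        System.out.println("second delete: " + checkAndDelete(p));  // already gone, so false
    }
}
```

The first call deletes the freshly created temp file and returns true; the second finds nothing to delete and returns false, matching the two branches of the HDFS method.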
1. Package the Java code above into HelloHadoop.jar (choose CheckAndDelete.java as the program entry point).
2. Copy HelloHadoop.jar from your workspace into /usr/local/hadoop (or wherever your Hadoop installation lives).
3. Start the Hadoop daemons (pseudo-distributed mode): $ bin/start-all.sh
4. Run HelloHadoop.jar: $ bin/hadoop jar HelloHadoop.jar
5. When the job finishes, check again whether the file to be deleted (/user/hadoop/program_put_input) still exists.
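Step 5 can be done from the Hadoop shell; a sketch assuming the jar ran against the default filesystem configured in core-site.xml (both `fs -ls` and `fs -test -e` are standard FileSystem shell commands):

```shell
# List the parent directory; program_put_input should no longer appear
bin/hadoop fs -ls /user/hadoop

# Or test existence explicitly: -test -e exits 0 if the path exists
bin/hadoop fs -test -e /user/hadoop/program_put_input \
  && echo "still exists" \
  || echo "deleted"
```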