Experiment environment:
CentOS (desktop edition) + IntelliJ IDEA 2021.2
The system started as a minimal installation; it only needs to be switched to the desktop installation.
Experiment tasks:
(1) Upload an arbitrary text file from the local filesystem to HDFS. If the specified file already exists in HDFS, overwrite it; the user chooses whether the operation is a copy or a move (cut).
Hint: the FileSystem class provides an exists method to check whether a file exists, and a copyFromLocalFile method to copy files. copyFromLocalFile accepts four arguments: the first indicates whether to delete the source file, the second whether to overwrite the destination, and the last two are the source path and the HDFS path.
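As a minimal sketch of how the hint maps onto code: the user's copy/cut choice becomes copyFromLocalFile's first boolean argument. The class and method names here are illustrative; the Hadoop call itself is shown only in a comment because it needs a running cluster.

```java
// Sketch of the copy-vs-cut decision for task (1). "UploadFlags" and
// "shouldDeleteSource" are illustrative names, not part of the Hadoop API.
public class UploadFlags {
    // "1" means cut: delete the local source after copying; "0" means plain copy.
    public static boolean shouldDeleteSource(String choice) {
        return "1".equals(choice);
    }

    public static void main(String[] args) {
        System.out.println(shouldDeleteSource("1")); // cut
        System.out.println(shouldDeleteSource("0")); // copy
        // With a FileSystem instance fs, the upload would then be:
        // fs.copyFromLocalFile(shouldDeleteSource(choice), // 1st arg: delete source?
        //                      fs.exists(dst),             // 2nd arg: overwrite existing?
        //                      new Path(localFile), new Path(hdfsPath));
    }
}
```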
Local file: text.txt
The file is uploaded to the FileDirectory directory; result as seen in the terminal:
(2) Download a specified file from HDFS; if a local file with the same name already exists, automatically rename the downloaded file (append _0, _1, … to its name).
Hint: the FileSystem class provides a copyToLocalFile method for downloading files.
Requirement: illustrate the effect before and after execution with text and screenshots.
The file text.txt is downloaded into the folder test; the terminal shows the result across three ls runs:
first ls, the folder is empty;
second ls, text.txt has been copied normally; third ls, text.txt already exists, so the new download is stored under a renamed file.
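The renaming rule above (keep appending _0, _1, … until a free name is found) can be sketched as pure Java, independent of HDFS. Here a Set of names stands in for the local directory listing; the class and method names are illustrative.

```java
import java.util.Set;

// Sketch of the rename-on-collision rule for task (2).
public class RenameSketch {
    // Returns fileName unchanged if free, otherwise the first fileName_N
    // (inserted before the extension, if any) that is not taken.
    public static String freeName(String fileName, Set<String> existing) {
        String candidate = fileName;
        int count = 0;
        while (existing.contains(candidate)) {
            int dot = fileName.lastIndexOf('.');
            if (dot != -1) {
                candidate = fileName.substring(0, dot) + "_" + count + fileName.substring(dot);
            } else {
                candidate = fileName + "_" + count;
            }
            count++;
        }
        return candidate;
    }

    public static void main(String[] args) {
        System.out.println(freeName("text.txt", Set.of("text.txt", "text_0.txt")));
    }
}
```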
(3) Read a specified file from HDFS and print its contents; the file must contain more than one line.
Requirement: illustrate the effect before and after execution with text and screenshots.
Terminal comparison:
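The line-by-line read pattern used for this task can be sketched without a cluster: a StringReader stands in for the FSDataInputStream that FileSystem.open would return, but the BufferedReader loop is identical. Names here are illustrative.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

// Sketch of the read loop for task (3): accumulate every line plus a newline.
public class ReadSketch {
    public static String readAll(BufferedReader br) throws IOException {
        StringBuilder builder = new StringBuilder();
        String line;
        while ((line = br.readLine()) != null) {
            builder.append(line).append("\n");
        }
        return builder.toString();
    }

    public static void main(String[] args) throws IOException {
        // Multi-line input, as the task requires.
        String content = readAll(new BufferedReader(new StringReader("line1\nline2")));
        System.out.print(content);
    }
}
```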
(4) Given a directory in HDFS, print the size, permissions, path, and other information of every file under it; if an entry is itself a directory, recursively print the information of all files beneath it.
Hint: the FileSystem class provides a listStatus method that returns FileStatus objects; each object's methods expose the corresponding information.
Requirement: illustrate the effect before and after execution with text and screenshots.
Terminal comparison:
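The recursion pattern behind this task can be sketched over the local filesystem, where it runs without HDFS: java.io.File's isDirectory(), length(), and getPath() play the roles of FileStatus.isDirectory(), getLen(), and getPath(). This is an analogy, not the Hadoop API itself.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.List;

// Local analogy for task (4): recurse into directories, report plain files.
public class ListSketch {
    public static void listFiles(File folder, List<String> out) {
        File[] entries = folder.listFiles();
        if (entries == null) return;            // not a directory or unreadable
        for (File f : entries) {
            if (f.isDirectory()) {
                listFiles(f, out);              // recurse into subdirectories
            } else {
                out.add("size: " + f.length() + ", path: " + f.getPath());
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny tree: tmp/sub/a.txt containing 5 bytes.
        File dir = Files.createTempDirectory("listdemo").toFile();
        File sub = new File(dir, "sub");
        sub.mkdir();
        Files.writeString(new File(sub, "a.txt").toPath(), "hello");
        List<String> out = new ArrayList<>();
        listFiles(dir, out);
        out.forEach(System.out::println);
    }
}
```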
(5) Delete a specified file from HDFS.
Requirement: illustrate the effect before and after execution with text and screenshots.
Terminal result:
Before execution:
After execution:
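As a local analogy for this task: FileSystem.delete(path, false) performs a non-recursive delete and returns whether anything was removed; java.nio.file's Files.deleteIfExists plays that role here so the sketch runs without a cluster.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Local analogy for task (5): delete a single file, report success.
public class DeleteSketch {
    public static boolean deleteFile(Path p) throws IOException {
        return Files.deleteIfExists(p);
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("del", ".txt");
        System.out.println(deleteFile(tmp)); // true: the file was removed
        System.out.println(deleteFile(tmp)); // false: nothing left to delete
    }
}
```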
Complete code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URI;
import java.util.Scanner;
public class HDFS_1 {
    public static FileSystem dfs;

    static {
        try {
            URI uri = new URI("hdfs://localhost:9000");
            Configuration conf = new Configuration();
            dfs = FileSystem.get(uri, conf, "hadoop");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    public static void createFolder(String path) throws IOException {
        Path dir = new Path(path);
        boolean flag = dfs.mkdirs(dir);
        if (flag) {
            System.out.println("create folder success...");
        } else {
            System.out.println("create folder failure...");
        }
    }
    public static void uploadFileToHDFS(String localFile, String hdfsPath) throws IOException {
        int index = localFile.lastIndexOf("/");
        String fileName = localFile.substring(index + 1);
        Path dstFile = new Path(hdfsPath + fileName);
        // Ask the user whether to copy (0) or cut (1)
        Scanner scanner = new Scanner(System.in);
        String copyOrCut = "";
        while (!"0".equals(copyOrCut) && !"1".equals(copyOrCut)) {
            System.out.print("please select copy or cut(0: copy, 1: cut):");
            copyOrCut = scanner.next();
        }
        // 1st arg: delete the source when cutting; 2nd arg: overwrite if the target already exists
        dfs.copyFromLocalFile(!"0".equals(copyOrCut), dfs.exists(dstFile), new Path(localFile), new Path(hdfsPath));
    }
    public static void downloadFileFromHDFS(String hdfsFile, String localPath) throws IOException {
        int i = hdfsFile.lastIndexOf("/");
        String fileName = hdfsFile.substring(i + 1);
        // If the local name is taken, append _0, _1, ... until a free name is found
        String candidate = fileName;
        int count = 0;
        while (new File(localPath + candidate).exists()) {
            int j = fileName.lastIndexOf(".");
            if (j != -1) {
                candidate = fileName.substring(0, j) + "_" + count + fileName.substring(j);
            } else {
                candidate = fileName + "_" + count;
            }
            count++;
        }
        dfs.copyToLocalFile(false, new Path(hdfsFile), new Path(localPath + candidate));
    }
    public static String readHDFSFile(String filePath) throws IOException {
        Path path = new Path(filePath);
        StringBuilder builder = new StringBuilder();
        // try-with-resources closes the stream and reader even if reading fails
        try (FSDataInputStream hfis = dfs.open(path);
             BufferedReader br = new BufferedReader(new InputStreamReader(hfis))) {
            String line;
            while ((line = br.readLine()) != null) {
                builder.append(line).append("\n");
            }
        }
        return builder.toString();
    }
    public static void listFile(String folderPath) throws IOException {
        Path folder = new Path(folderPath);
        FileStatus[] fileStatuses = dfs.listStatus(folder);
        for (FileStatus fileStatus : fileStatuses) {
            if (fileStatus.isDirectory()) {
                listFile(fileStatus.getPath().toString());
            } else {
                System.out.println("p: " + fileStatus.getPermission()
                        + ", size: " + fileStatus.getLen()
                        + ", path: " + fileStatus.getPath().toString());
            }
        }
    }
    public static void deleteHDFSFile(String hdfsFile) throws IOException {
        Path path = new Path(hdfsFile);
        boolean delete = dfs.delete(path, false); // false: do not delete recursively
        if (delete) {
            System.out.println("delete success...");
        } else {
            System.out.println("delete failure...");
        }
    }
    public static void main(String[] args) throws IOException {
        // Uncomment one task at a time:
        //uploadFileToHDFS("/home/hadoop/text.txt", "/FileDirectory/");
        //downloadFileFromHDFS("/FileDirectory/text.txt", "/home/hadoop/test/");
        //String str = readHDFSFile("/FileDirectory/text.txt");
        //System.out.println(str);
        //listFile("/FileDirectory");
        deleteHDFSFile("/FileDirectory/text.txt");
    }
}