After setting up the Hadoop environment, I tried building a small Hadoop-based network disk (cloud storage) project using the Java API; you can choose another approach if you prefer. After some tweaking, the disk page looks as follows.
It implements file upload, download, and deletion, and lets a user log in to their own folder for simple file management.
I. My preparation beforehand:
1) Hadoop 2.7.3
2) The MySQL JDBC driver jar
3) The upload components
4) MySQL installed on the machine
II. For installing and configuring Hadoop 2.7.3, see Hadoop Network Disk Implementation (Part 1).
III. There are plenty of MySQL installation tutorials online, so I won't repeat them here. Instead, this section focuses on how to quickly configure MySQL in Eclipse.
1) Open Eclipse and create a Dynamic Web Project.
2) Copy mysql-connector-java-5.1.7-bin.jar into the WebContent/WEB-INF/lib/ directory.
3) Connect to the database.
Since I'm using Eclipse on macOS, the menu order may differ slightly on other platforms, but it goes roughly like this: open Window –> Perspective –> Open Perspective –> Other.
Select Database Development and click OK.
Then right-click the Database Connections folder and choose New.
Next, select MySQL and click Next.
Fill in the fields to match your own MySQL configuration; when done, click Test Connection. If it reports a successful ping, the database connection is configured.
The database table used here is created as follows.
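The original screenshot of the table definition is not reproduced here. As a rough sketch, a minimal user table sufficient for the login feature might look like the following (the table and column names are my assumptions, not taken from the original project):

```sql
-- Hypothetical schema: a minimal user table for the login feature
CREATE TABLE user (
  id       INT AUTO_INCREMENT PRIMARY KEY,
  username VARCHAR(50)  NOT NULL UNIQUE,  -- login name, also used as the user's folder name
  password VARCHAR(100) NOT NULL          -- password (store a hash in real use)
);
```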
4) Implementing file upload with the fileupload component
(1) First, copy commons-fileupload-1.3.1.jar and commons-io-2.4.jar into the WEB-INF/lib directory.
(2) Create an index.jsp file under WebContent/ to test file upload.
<%@ page language="java" contentType="text/html; charset=UTF-8"
pageEncoding="UTF-8"%>
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>Insert title here</title>
</head>
<body>
<form class="form-inline" method="POST" enctype="multipart/form-data" action="UploadServlet">
<div style="line-height:50px;float:left;">
<input type="submit" name="submit" value="上传文件" />
</div>
<div style="line-height:50px;float:left;">
<input type="file" name="file1" size="30"/>
</div>
</form>
</body>
</html>
Then create an UploadServlet to handle the uploaded file. The code is as follows:
package com.lsy.yunpan.controller;
import java.io.File;
import java.io.IOException;
import java.util.Iterator;
import java.util.List;
import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.mapred.JobConf;
import com.lsy.yunpan.model.HDFSDao;
/**
* Servlet implementation class UploadServlet
*/
@WebServlet("/UploadServlet")
public final class UploadServlet extends HttpServlet {
private static final long serialVersionUID = 1L;
private final Log log = LogFactory.getLog(UploadServlet.class);
private final int MAX_FILE_SIZE = 50 * 1024 * 1024; // 50M
private final int MAX_MEM_SIZE = 50 * 1024 * 1024; // 50M
private String fileUploadPath;
/**
* @see HttpServlet#HttpServlet()
*/
public UploadServlet() {
super();
}
@Override
public void init() throws ServletException {
super.init();
log.debug("init UploadServlet");
ServletContext context = getServletContext();
this.fileUploadPath = context.getInitParameter("file.upload.path");
log.debug("source file path:" + fileUploadPath + "");
}
/**
* @see HttpServlet#doPost(HttpServletRequest request, HttpServletResponse
* response)
*/
protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException,
IOException {
request.setCharacterEncoding("UTF-8");
File file;
// Check that the request is a multipart upload
String contentType = request.getContentType();
if (contentType != null && contentType.indexOf("multipart/form-data") >= 0) {
DiskFileItemFactory factory = new DiskFileItemFactory();
// Maximum size kept in memory before spilling to disk
factory.setSizeThreshold(MAX_MEM_SIZE);
// Files larger than the memory threshold are buffered here
factory.setRepository(new File("/tmp"));
// Create a new file upload handler
ServletFileUpload upload = new ServletFileUpload(factory);
// Maximum allowed upload size
upload.setSizeMax(MAX_FILE_SIZE);
try {
// Parse the multipart request
List<FileItem> fileList = upload.parseRequest(request);
// Process the uploaded items
Iterator<FileItem> iterator = fileList.iterator();
log.debug("begin to upload file to tomcat server");
while (iterator.hasNext()) {
FileItem item = iterator.next();
if (!item.isFormField()) {
// Get the uploaded file's name (IE sends the full client-side path)
String fileName = item.getName();
String fn = fileName.substring(fileName.lastIndexOf("\\") + 1);
log.debug("uploading file: " + fn);
// Write the file into the local upload directory, using the bare
// name fn so that any Windows-style path prefix is stripped
file = new File(fileUploadPath, fn);
item.write(file);
JobConf conf = HDFSDao.getConfig();
HDFSDao hdfs = new HDFSDao(conf);
hdfs.copyFile(fileUploadPath+File.separator+fn, "/swan/"+fn);
}
}
log.debug("upload file to tomcat server success!");
request.setAttribute("flag", "success");
request.setAttribute("msg", "upload file to tomcat server success!<br/>upload file to hadoop hdfs success!");
request.getRequestDispatcher("upload.jsp").forward(request, response);
} catch (Exception ex) {
log.error(ex.getMessage());
request.setAttribute("flag", "danger");
request.setAttribute("msg", ex.getMessage());
request.getRequestDispatcher("upload.jsp").forward(request, response);
}
} else {
log.warn("No file uploaded");
request.setAttribute("flag", "warning");
request.setAttribute("msg", "<p>No file uploaded</p>");
request.getRequestDispatcher("upload.jsp").forward(request, response);
}
}
}
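A note on the file-name handling above: older IE versions submit the full client-side Windows path (e.g. C:\fakepath\report.docx) as the upload name, which is why the servlet cuts at the last backslash. As a sketch, that logic can be pulled into a small helper (FileNameUtil is my own name, not part of the original project) that also copes with forward slashes:

```java
public class FileNameUtil {
    /**
     * Return the bare file name from a client-supplied upload name,
     * stripping any Windows ("\") or Unix ("/") path prefix.
     */
    public static String baseName(String fileName) {
        int cut = Math.max(fileName.lastIndexOf('\\'), fileName.lastIndexOf('/'));
        return fileName.substring(cut + 1);
    }

    public static void main(String[] args) {
        // IE-style full Windows path
        System.out.println(baseName("C:\\Users\\apple\\report.docx")); // prints "report.docx"
        // Modern browsers send only the bare file name
        System.out.println(baseName("report.docx")); // prints "report.docx"
    }
}
```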
Then set the upload path in web.xml:
<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://xmlns.jcp.org/xml/ns/javaee" xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/web-app_3_1.xsd" id="WebApp_ID" version="3.1">
<display-name>hadoopYupan</display-name>
<welcome-file-list>
<welcome-file>index.html</welcome-file>
<welcome-file>index.htm</welcome-file>
<welcome-file>index.jsp</welcome-file>
<welcome-file>default.html</welcome-file>
<welcome-file>default.htm</welcome-file>
<welcome-file>default.jsp</welcome-file>
</welcome-file-list>
<context-param>
<description>Location to store uploaded file</description>
<param-name>file.upload.path</param-name>
<param-value>/Users/apple/tmp</param-value>
</context-param>
</web-app>
Let's test whether uploading works. I'll now upload (4).docx.
As you can see, the upload succeeded.
Additional links
1. How to install Hadoop 2.7.3 on a Mac
Reference: link
Project GitHub repository: https://github.com/swan815/Hadoop_YupanLearn/tree/master/hadoop_yuapn_test