Overview
For reasons outside our control, the JVM heap on this system is capped at 512 MB, yet we need to export millions of rows. How can that be done? A traditional one-shot export (loading everything into memory, then writing) is clearly out of the question.
Optimization
The Excel export is built on Spring Boot and easyexcel.
Dependency:
```xml
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>easyexcel</artifactId>
    <version>2.2.6</version>
</dependency>
```
Export optimization
Rolling (keyset) queries based on the auto-increment id: fetch a bounded batch, write it, advance the cursor, repeat.
```java
@GetMapping("/standard-product-ext/export")
public void productExtExport(HttpServletResponse response, StandardProductExtQuery query) throws Exception {
    String today = DateUtil.today();
    String fileName = "标准商品标签导入模板" + today;
    String fileNameCode = URLEncoder.encode(fileName, "UTF-8");
    response.setContentType("application/force-download");
    response.setHeader("Content-Disposition", "attachment;filename=" + fileNameCode + ".xlsx");
    ExcelWriter writer = new ExcelWriterBuilder()
            .autoCloseStream(true)
            .file(response.getOutputStream())
            .head(DscStandardProductExtVO.class)
            .build();
    Integer lastBatchMaxId = 0;
    query.setLastBatchMaxId(lastBatchMaxId);
    WriteSheet writeSheet = new WriteSheet();
    writeSheet.setSheetName("标准商品标签导入模板" + today);
    List<DscStandardProductExtVO> productExt;
    for (; ; ) {
        productExt = dictionaryService.getProductExt(query);
        if (productExt.isEmpty()) {
            // no more rows: flush buffers and close the writer
            writer.finish();
            break;
        }
        // advance the cursor to this batch's max id, then stream the batch out
        lastBatchMaxId = productExt.stream()
                .map(DscStandardProductExtVO::getId)
                .max(Integer::compareTo)
                .orElse(lastBatchMaxId);
        query.setLastBatchMaxId(lastBatchMaxId);
        writer.write(productExt, writeSheet);
    }
}
```
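One detail worth knowing about the `URLEncoder.encode` call above: it targets `application/x-www-form-urlencoded`, so spaces become `+`, which browsers do not decode inside a `Content-Disposition` filename. A common tweak (a sketch, not from the original code) is to swap `+` for `%20`:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FilenameEncoding {

    // URLEncoder is form-encoding, so ' ' -> '+'; Content-Disposition
    // filenames expect percent-encoding, so rewrite '+' as '%20'.
    public static String encodeForContentDisposition(String fileName) {
        return URLEncoder.encode(fileName, StandardCharsets.UTF_8).replace("+", "%20");
    }

    public static void main(String[] args) {
        System.out.println(encodeForContentDisposition("标准商品标签导入模板 2024-01-01"));
    }
}
```

Non-ASCII characters such as the Chinese template name are still percent-encoded as before; only the space handling changes.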
- The SQL behind dictionaryService.getProductExt(query) (MyBatis XML):
```sql
SELECT *
FROM base_drug
WHERE id > #{lastBatchMaxId}
ORDER BY id
LIMIT 3000
```
The `ORDER BY id` and `LIMIT` are essential: without them the first call would pull every remaining row into memory, defeating the rolling query. The cap of 3000 here is an example matching the write batch size; selecting only the columns the VO needs (instead of `*`) shrinks each batch further.
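The control flow of the rolling export can be seen in isolation with an in-memory stand-in for the DAO (names here are illustrative, not from the project):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class KeysetPagingDemo {
    static final int BATCH_SIZE = 3000;

    // Stand-in for dictionaryService.getProductExt:
    // WHERE id > ? ORDER BY id LIMIT BATCH_SIZE
    public static List<Integer> fetchBatch(List<Integer> table, int lastBatchMaxId) {
        return table.stream()
                .filter(id -> id > lastBatchMaxId)
                .sorted()
                .limit(BATCH_SIZE)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Integer> table = IntStream.rangeClosed(1, 10_000).boxed().collect(Collectors.toList());
        int lastBatchMaxId = 0;
        int totalWritten = 0;
        for (;;) {
            List<Integer> batch = fetchBatch(table, lastBatchMaxId);
            if (batch.isEmpty()) {
                break; // writer.finish() would go here
            }
            totalWritten += batch.size(); // writer.write(batch, sheet) would go here
            lastBatchMaxId = batch.get(batch.size() - 1); // batch is sorted, so this is its max id
        }
        System.out.println(totalWritten); // prints 10000
    }
}
```

Only one batch is ever alive on the heap at a time, which is why the 512 MB cap holds no matter how many rows the table contains.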
Import optimization
- Use an easyexcel read listener for the import.
```java
@PostMapping("/standard-product-ext/import")
public ResponseResult productExtImport(MultipartFile file) throws IOException {
    File localFile = new File(appSettings.getTempdir()
            + RandomStringUtils.randomAlphanumeric(12) + file.getOriginalFilename());
    // copy the upload to localFile; all subsequent reads go against the local file
    FileUtils.multipartFileToFile(file, localFile);
    try {
        ExcelUploadResult excelUploadResult = dictionaryService.updateBatchProductExtByFile(localFile);
        return ResponseUtils.success(excelUploadResult);
    } catch (DaoException e) {
        return ResponseUtils.fail(ErrorCodeConstants.FAIL, "导入标准商品标签失败");
    }
}
```
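`FileUtils.multipartFileToFile` is a project helper, not a library call; assuming it simply streams the upload to disk, a minimal stdlib equivalent looks like this. The point is to avoid buffering the whole payload in memory before EasyExcel reads it:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class UploadToLocalFile {

    // Streams an upload (e.g. MultipartFile.getInputStream()) to a local file.
    // Files.copy works chunk by chunk, so the payload never sits in the heap whole.
    public static long saveToLocal(InputStream upload, Path target) throws IOException {
        return Files.copy(upload, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("upload", ".xlsx");
        Files.write(src, new byte[] {1, 2, 3});
        Path dst = Files.createTempFile("local", ".xlsx");
        try (InputStream in = Files.newInputStream(src)) {
            System.out.println(saveToLocal(in, dst)); // prints 3 (bytes copied)
        }
    }
}
```

In a Spring controller, `file.transferTo(localFile)` achieves the same without a helper.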
The dictionaryService.updateBatchProductExtByFile(localFile) method:
```java
public ExcelUploadResult updateBatchProductExtByFile(File excelFile) throws DaoException {
    StandardProductExtListener listener = new StandardProductExtListener(clearSearchDao,
            dscStandardProductExtDao, cosUtils, appSettings);
    try {
        EasyExcel.read(excelFile, DscStandardProductExtVO.class, listener).sheet().doRead();
    } catch (Exception e) {
        log.error("[标签导入]读取excel出错", e);
        throw new ServiceException("excel导入失败");
    } finally {
        // the temp file is no longer needed once parsing is done
        if (excelFile.exists()) {
            excelFile.delete();
        }
    }
    int successTotal = listener.getSuccessTotal();
    int total = listener.getTotal();
    int error = listener.getError();
    return error > 0
            ? new ExcelUploadResult(total, successTotal, error, listener.getErrorFileUrl())
            : new ExcelUploadResult(total, successTotal, 0, null);
}
```
- StandardProductExtListener
Note that StandardProductExtListener must not be managed by Spring: create it with `new` for each read and pass Spring-managed dependencies in through the constructor. The official documentation states this (the listener holds per-read state such as the row buffer and counters); what exactly breaks if Spring manages it as a singleton has not been tested here.
```java
@Slf4j
public class StandardProductExtListener extends AnalysisEventListener<DscStandardProductExtVO> {

    int successTotal = 0;
    int total = 0;
    int error = 0;
    String errorFileUrl;
    List<DscStandardProductExtVO> list = new ArrayList<>();
    List<DscStandardProductExtVO> errorList = Lists.newArrayList();

    ClearSearchDao clearSearchDao;
    DscStandardProductExtDao dscStandardProductExtDao;
    CosUtils cosUtils;
    AppSettings appSettings;

    public StandardProductExtListener(ClearSearchDao clearSearchDao, DscStandardProductExtDao dscStandardProductExtDao,
                                      CosUtils cosUtils, AppSettings appSettings) {
        this.clearSearchDao = clearSearchDao;
        this.dscStandardProductExtDao = dscStandardProductExtDao;
        this.cosUtils = cosUtils;
        this.appSettings = appSettings;
    }

    /**
     * Called once per parsed row; buffers rows and flushes every 3000.
     */
    @Override
    public void invoke(DscStandardProductExtVO vo, AnalysisContext analysisContext) {
        list.add(vo);
        if (list.size() >= 3000) {
            try {
                doFile(list);
                total += list.size();
                list.clear();
            } catch (DaoException e) {
                log.error("[标签导入]处理批次出错", e);
            }
        }
    }

    /**
     * Called after the whole sheet has been read; flushes the final partial batch.
     */
    @Override
    public void doAfterAllAnalysed(AnalysisContext analysisContext) {
        try {
            doFile(list);
            total += list.size();
            log.info("处理最后数据{}", total);
            errorFileUrl = updateErrorFile();
        } catch (DaoException e) {
            log.error("[标签导入]处理最后一批数据出错", e);
        }
    }

    public void doFile(List<DscStandardProductExtVO> readExcels) throws DaoException {
        // business logic: validate the rows and batch-update them
    }
}
```
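The buffering discipline in invoke/doAfterAllAnalysed is what keeps the import's memory bounded; it can be demonstrated without any EasyExcel types (a standalone sketch, names hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchFlusher<T> {
    private final int batchSize;
    private final List<T> buffer = new ArrayList<>();
    private int total = 0;
    private int flushes = 0;

    public BatchFlusher(int batchSize) {
        this.batchSize = batchSize;
    }

    // Mirrors invoke(): buffer each row, flush when the batch is full.
    public void onRow(T row) {
        buffer.add(row);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Mirrors doAfterAllAnalysed(): flush whatever is left over.
    public void onFinished() {
        if (!buffer.isEmpty()) {
            flush();
        }
    }

    private void flush() {
        total += buffer.size(); // doFile(buffer) would run here
        buffer.clear();
        flushes++;
    }

    public int getTotal() { return total; }

    public int getFlushes() { return flushes; }

    public static void main(String[] args) {
        BatchFlusher<Integer> f = new BatchFlusher<>(3000);
        for (int i = 0; i < 7000; i++) {
            f.onRow(i);
        }
        f.onFinished();
        System.out.println(f.getTotal() + " rows in " + f.getFlushes() + " flushes"); // 7000 rows in 3 flushes
    }
}
```

At most `batchSize` rows are alive at once, so heap usage is independent of the spreadsheet's total row count.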
Summary
Tested against millions of rows, these two optimizations hold up. The import can be somewhat slow, but there is no OOM; the main residual cost is that full GCs may be comparatively heavy while a job runs.