Note: every import needs its own listener class.
1. Write the entity class (the fields to import), mapped to the Excel columns
@Data
@ApiModel("Data import object")
public class MonitPointImport extends BaseEntity implements Serializable {
    // @ExcelProperty values must match the header row of the Excel file
    @ColumnWidth(15)
    @ExcelProperty("Monitoring point ID")
    private String id;
    @ColumnWidth(15)
    @ExcelProperty("Monitoring point name")
    private String pointName;
    @ColumnWidth(15)
    @ExcelProperty("Monitoring point type")
    private String pointType;
    @ColumnWidth(15)
    @ExcelProperty("Monitoring point location")
    private String installAddress;
}
2. Write the controller layer
@PostMapping(value = "importMonitPoint")
@ApiOperation(value = "Import monitoring point data", notes = "POST request; imports monitoring point data")
public ApiResponseBody importMonitPoint(@RequestParam(value = "file") MultipartFile file) {
    return pointArchivesService.importMonitPoint(file);
}
3. Write the service layer (PointImportListener is the listener)
@Override
public ApiResponseBody importMonitPoint(MultipartFile file) {
    // try-with-resources closes the stream even if parsing fails
    try (InputStream inputStream = file.getInputStream()) {
        // hand the stream to EasyExcel; the listener processes each row
        EasyExcel.read(inputStream, MonitPointImport.class, new PointImportListener(monitPointMapper)).sheet().doRead();
        return ApiResponseBody.defaultSuccess();
    } catch (IOException e) {
        e.printStackTrace(); // prefer a proper logger in production
        return ApiResponseBody.error(BizCodeMsgEnum.ERROR);
    }
}
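The stream handling above is worth getting right: try-with-resources guarantees the InputStream is closed even when parsing throws. Here is a minimal, dependency-free sketch of that pattern, with a stand-in parse step in place of EasyExcel.read(...).sheet().doRead(); the class and method names are illustrative, not part of any library.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch of the stream-handling concern in importMonitPoint:
// the try-with-resources block closes the stream on every exit path.
class StreamSafety {
    // Stand-in for row parsing: counts the bytes it reads
    static int parse(InputStream in) throws IOException {
        int count = 0;
        while (in.read() != -1) {
            count++;
        }
        return count;
    }

    static int importFromBytes(byte[] data) throws IOException {
        try (InputStream in = new ByteArrayInputStream(data)) {
            return parse(in);
        } // in.close() runs here regardless of exceptions in parse
    }
}
```

The same shape applies to file.getInputStream() in the service method: declare the stream inside the try parentheses instead of a bare try block.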
4. Write the listener. Notes: MonitPointImport is the entity class, and monitPointMapper.insertMonitPoint(list) persists each batch to the database.
If individual fields need processing, do it in the listener; otherwise just adjust the entity class and use this listener as-is.
public class PointImportListener extends AnalysisEventListener<MonitPointImport> {
    private static final Logger LOGGER = LoggerFactory.getLogger(PointImportListener.class);
    /**
     * Flush to the database every 5 rows; in real use this can be ~3000.
     * The list is recreated after each flush so the old batch can be garbage collected.
     */
    private static final int BATCH_COUNT = 5;
    private List<MonitPointImport> list = ListUtils.newArrayListWithExpectedSize(BATCH_COUNT);
    private MonitPointMapper monitPointMapper;

    /**
     * If you use Spring, use this constructor: pass the Spring-managed
     * mapper in every time you create a listener.
     */
    public PointImportListener(MonitPointMapper monitPointMapper) {
        this.monitPointMapper = monitPointMapper;
    }
    @Override
    public void invoke(MonitPointImport monitPointImport, AnalysisContext analysisContext) {
        LOGGER.info("Parsed one row: {}", JSON.toJSONString(monitPointImport));
        list.add(monitPointImport);
        // Once BATCH_COUNT rows have accumulated, flush to the database
        // so tens of thousands of rows never sit in memory (avoids OOM)
        if (list.size() >= BATCH_COUNT) {
            saveData();
            // Recreate the list after flushing so the old batch can be GC'd
            list = ListUtils.newArrayListWithExpectedSize(BATCH_COUNT);
        }
    }
    /**
     * Called once after all rows have been parsed.
     *
     * @param context analysis context
     */
    @Override
    public void doAfterAllAnalysed(AnalysisContext context) {
        // Flush here too, so the final partial batch is also persisted
        saveData();
        LOGGER.info("All rows parsed!");
    }
    /**
     * Persist the current batch to the database.
     */
    private void saveData() {
        if (list.isEmpty()) {
            return; // nothing to flush (e.g. row count was a multiple of BATCH_COUNT)
        }
        LOGGER.info("Saving {} rows to the database", list.size());
        monitPointMapper.insertMonitPoint(list);
        LOGGER.info("Batch saved successfully!");
    }
}
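The batching logic in the listener can be distilled into a small, dependency-free sketch: rows accumulate until BATCH_COUNT is reached, the batch is flushed to a sink (standing in for monitPointMapper.insertMonitPoint), and a final flush handles the remainder. BatchFlusher and all names below are illustrative, not part of EasyExcel's API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of the accumulate-and-flush pattern used by PointImportListener.
class BatchFlusher<T> {
    private static final int BATCH_COUNT = 5;
    private List<T> buffer = new ArrayList<>(BATCH_COUNT);
    private final Consumer<List<T>> sink; // stand-in for the mapper insert

    BatchFlusher(Consumer<List<T>> sink) {
        this.sink = sink;
    }

    // Called once per parsed row, like invoke(...)
    void add(T row) {
        buffer.add(row);
        if (buffer.size() >= BATCH_COUNT) {
            flush();
        }
    }

    // Called after the last row, like doAfterAllAnalysed(...)
    void finish() {
        if (!buffer.isEmpty()) {
            flush();
        }
    }

    private void flush() {
        sink.accept(buffer);
        // Recreate the buffer so the flushed batch can be garbage collected
        buffer = new ArrayList<>(BATCH_COUNT);
    }
}
```

Feeding 12 rows through this sketch produces flushes of 5, 5, and (after finish) 2 rows, which is exactly the behavior the listener relies on to keep memory bounded.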