Spring Boot + ELK Logging (Part 2): Using Aspects and Analyzers to Locate Logs Quickly


Also in this series:
Spring Boot + ELK Logging (Part 1): Environment setup
Spring Boot + ELK Logging (Part 3): Operating ES logs from Java with RestHighLevelClient

I. Additional logging configuration

1. Letting logback print Hibernate's SQL (and bound parameters):

Configure this in logback-spring.xml:

<!-- Hibernate logging -->
    <logger name="org.hibernate.type.descriptor.sql.BasicBinder"  level="TRACE" />
    <logger name="org.hibernate.type.descriptor.sql.BasicExtractor"  level="DEBUG" />
    <logger name="org.hibernate.SQL" level="DEBUG" />
    <logger name="org.hibernate.type" level="TRACE" />
    <logger name="org.hibernate.engine.QueryParameters" level="DEBUG" />
    <logger name="org.hibernate.engine.query.HQLQueryPlan" level="DEBUG" />

With this in place, Hibernate's SQL statements and bound parameters appear in the console (screenshot omitted).
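For reference, with these loggers enabled the console output looks roughly like the following; the statement and bound value are illustrative, not taken from the project:

```
DEBUG org.hibernate.SQL - select d.id, d.rule_name from deduction_rule d where d.rule_name=?
TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [1] as [VARCHAR] - [规则A]
```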

2. Capturing information about user operations

Approach

  • Use an annotation to record the log metadata
  • Use an AOP aspect to assemble and store the operation-log data
  • Add alias annotations to the DTO fields
  • Compare the old and new DTOs and stash the changes on the HTTP request
  • After the advised method returns, the aspect picks up the field changes

2.1 Entity class design: LogOperation

This class gathers all the log information we want to capture into one object; we then print it with the logger, so that, combined with the configuration from Part 1, Elasticsearch receives the aspect's log entries directly.

package com.sinoccdc.devops;

import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;

import java.util.Date;

@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class LogOperation {

    // user name
    String opUserName;

    // backend method invoked
    String opMethod;

    // request path
    String opUrl;

    // operation parameters
    String opParam;

    // user account
    String opUserAccount;

    // operation category
    String opCategory;

    // operation subcategory
    String opSubCategory;

    // operation type: insert, update, delete, search
    Integer opType;

    // human-readable label for the operation type
    String typeStr;

    // description
    String opDesc;

    // operation result
    String opResult;

    // operation duration (ms)
    Long opCost;

    // request IP
    String opIp;

    // extension field
    String opExtend;

    // creation time
    Date createTime;

    // formatted operation time
    String opCreateTime;

    // toString is used to print the whole entry to the console
    @Override
    public String toString() {
        return "LogOperation{" +
                "opUserName='" + opUserName + '\'' +
                ", opMethod='" + opMethod + '\'' +
                ", opUrl='" + opUrl + '\'' +
                ", opParam='" + opParam + '\'' +
                ", opUserAccount='" + opUserAccount + '\'' +
                ", opCategory='" + opCategory + '\'' +
                ", opSubCategory='" + opSubCategory + '\'' +
                ", opType=" + opType +
                ", typeStr='" + typeStr + '\'' +
                ", opDesc='" + opDesc + '\'' +
                ", opResult='" + opResult + '\'' +
                ", opCost=" + opCost +
                ", opIp='" + opIp + '\'' +
                ", opExtend='" + opExtend + '\'' +
                ", createTime=" + createTime +
                ", opCreateTime='" + opCreateTime + '\'' +
                '}';
    }
}

2.2 Annotation recording operation-log metadata: OperationLog

package com.sinoccdc.devops.annotation;

import com.sinoccdc.devops.OperationLogTypeEnum;

import java.lang.annotation.*;

import static com.sinoccdc.devops.OperationLogTypeEnum.INSERT;

/**
 * @author syl
 * Annotation carrying operation-log metadata.
 */
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface OperationLog {
    String category();
    String subcategory();
    String desc();
    OperationLogTypeEnum type() default INSERT;
}

2.3 Enum class: OperationLogTypeEnum

package com.sinoccdc.devops;

public enum OperationLogTypeEnum {

    INSERT(0,"新增"),
    UPDATE(1,"修改"),
    DELETE(2,"删除"),
    SEARCH(3,"查询");

    // Variables and methods are defined in an enum just as in an ordinary class;
    // the only restriction is that they must come after all the enum constants,
    // otherwise the compiler reports an error.
    private int type;
    private String desc;

    OperationLogTypeEnum(int type, String desc) {
        this.type = type;
        this.desc = desc;
    }

    /**
     * Static helper that maps a type code back to its enum constant;
     * returns null when no constant matches.
     */
    public static OperationLogTypeEnum getValue(int type){
    public static OperationLogTypeEnum getValue(int type){

        for (OperationLogTypeEnum  logTypeEnum: values()) {
            if(logTypeEnum.getType() == type){
                return  logTypeEnum;
            }
        }
        return null;

    }


    public int getType() {
        return type;
    }

    public void setType(int type) {
        this.type = type;
    }

    public String getDesc() {
        return desc;
    }

    public void setDesc(String desc) {
        this.desc = desc;
    }

}
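The code-to-constant lookup can be exercised in isolation. The sketch below is a trimmed, self-contained version of the enum and its getValue(int) scan; EnumLookupSketch and fromType are illustrative names, not part of the project:

```java
public class EnumLookupSketch {

    // Trimmed illustration of OperationLogTypeEnum and its lookup method.
    public enum OpType {
        INSERT(0, "新增"),
        UPDATE(1, "修改");

        private final int type;
        private final String desc;

        OpType(int type, String desc) {
            this.type = type;
            this.desc = desc;
        }

        public int getType() { return type; }
        public String getDesc() { return desc; }

        // Linear scan over values(); returns null when no code matches.
        public static OpType fromType(int type) {
            for (OpType t : values()) {
                if (t.getType() == type) {
                    return t;
                }
            }
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(OpType.fromType(1).getDesc()); // prints "修改"
    }
}
```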

2.4 AOP implementation: OperationLogAspect, recording user, request, and response information
package com.sinoccdc.devops;

import com.alibaba.fastjson.JSON;
import com.sinoccdc.devops.annotation.OperationLog;
import com.sinoccdc.devops.domain.model.security.user.User;
import com.sinoccdc.devops.web.apis.common.SecurityController;
import lombok.extern.slf4j.Slf4j;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.reflect.MethodSignature;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.web.context.request.RequestContextHolder;
import org.springframework.web.context.request.ServletRequestAttributes;
import org.springframework.web.multipart.MultipartFile;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;
import java.io.IOException;
import java.lang.reflect.Method;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

/**
 * @author syl
 * AOP aspect that records the operating user, the request, and the response.
 */
@Aspect
@Component
@Slf4j
public class OperationLogAspect {



    @Around("@annotation(com.sinoccdc.devops.annotation.OperationLog)")
    public Object insertOperationLog(ProceedingJoinPoint joinPoint) throws Throwable {


        HttpServletRequest request = ((ServletRequestAttributes) RequestContextHolder.getRequestAttributes()).getRequest();
        long start = System.currentTimeMillis();
        MethodSignature signature = (MethodSignature) joinPoint.getSignature();
        Method method = signature.getMethod();

        OperationLog opLog = method.getAnnotation(OperationLog.class);
        LogOperation operationLog = new LogOperation();


        Date now = new Date(); // current time
        // target format for the formatted timestamp
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        operationLog.setCreateTime(now);
        operationLog.setOpCreateTime(sdf.format(now));

        // look up the user's account and name
        User loginUser = SecurityController.getLoginUser();
        if (loginUser != null && loginUser.getLoginId() != null) {
            operationLog.setOpUserAccount(loginUser.getLoginId());
            operationLog.setOpUserName(loginUser.getUsername());
        }

        // client IP address (the request was already obtained above)
        String ipAddress = getIpAddress(request);
        operationLog.setOpIp(ipAddress);



        String opCategory = opLog.category();
        String opSubcategory = opLog.subcategory();
        String opDesc = opLog.desc();
        OperationLogTypeEnum opType = opLog.type();
        operationLog.setOpCategory(opCategory);
        operationLog.setOpSubCategory(opSubcategory);
        operationLog.setOpDesc(opDesc);
        operationLog.setOpType(opType.getType());
        operationLog.setTypeStr(opType.getDesc());

        String className = joinPoint.getTarget().getClass().getName();
        String methodName = method.getName();
        methodName = className + "." + methodName;
        operationLog.setOpMethod(methodName);
        List<Object> objList = new ArrayList<>();
        Object[] args = joinPoint.getArgs();
        for (Object arg : args) {
            if (arg instanceof HttpServletRequest
                    || arg instanceof HttpServletResponse
                    || arg instanceof MultipartFile) {
                continue;
            }
            objList.add(arg);
        }
        String params = JSON.toJSONString(objList);
        operationLog.setOpParam(params);
        Object result;
        try {
            result = joinPoint.proceed();
            operationLog.setOpResult(JSON.toJSONString(result));
        } catch (Exception exception) {
            operationLog.setOpResult(exception.getMessage());
            throw exception;
        } finally {
            long end = System.currentTimeMillis();
            long cost = (end - start);
            operationLog.setOpCost(cost);
            // record any field-level data changes captured during the call
            BeanDiff beanDiff = BeanCompareUtils.getBeanDiff();
            if (beanDiff != null) {
                operationLog.setOpType(OperationLogTypeEnum.UPDATE.getType());
                operationLog.setOpExtend(JSON.toJSONString(beanDiff));
            }
            // optionally persist the entry to a database:
            // operationLogService.insert(operationLog);

            // here it is enough to log the entry; logback then ships it to Logstash
            Logger logger = LoggerFactory.getLogger(OperationLogAspect.class);
            logger.info("logTestInfo_测试切面日志" + operationLog);
        }
        return result;
    }


    /**
     * Gets the requesting host's IP address; if the request came through a
     * proxy, reads the real client IP from the forwarding headers.
     */
    public static String getIpAddress(HttpServletRequest request)
            throws IOException {

        String ip = request.getHeader("X-Forwarded-For");

        if (ip == null || ip.length() == 0 || "unknown".equalsIgnoreCase(ip)) {
            ip = request.getHeader("Proxy-Client-IP");
            if (ip == null || ip.length() == 0
                    || "unknown".equalsIgnoreCase(ip)) {
                ip = request.getHeader("WL-Proxy-Client-IP");
            }
            if (ip == null || ip.length() == 0
                    || "unknown".equalsIgnoreCase(ip)) {
                ip = request.getHeader("HTTP_CLIENT_IP");
            }
            if (ip == null || ip.length() == 0
                    || "unknown".equalsIgnoreCase(ip)) {
                ip = request.getHeader("HTTP_X_FORWARDED_FOR");
            }
            if (ip == null || ip.length() == 0
                    || "unknown".equalsIgnoreCase(ip)) {
                ip = request.getRemoteAddr();
            }
        } else if (ip.length() > 15) {
            String[] ips = ip.split(",");
            for (int index = 0; index < ips.length; index++) {
                String strIp = ips[index];
                if (!("unknown".equalsIgnoreCase(strIp))) {
                    ip = strIp;
                    break;
                }
            }
        }
        return ip;
    }

}

2.5 Applying the AOP annotation at the controller layer, non-invasively

Use our @OperationLog annotation to state the endpoint's category, description, and operation type (insert/update/delete/search), so the aspect can pick this metadata up later.

@OperationLog(category = "十八项指标系统", subcategory = "查询指标信息", desc = "查询指标信息", type = OperationLogTypeEnum.SEARCH)
    @ApiOperation(value = "查找指标信息", notes = "查找指标信息")
    @ApiImplicitParam(name = "id", value = "指标信息ID", required = true, dataType = "String")
    @PostMapping("/query")
    public ResponseEntity<ApiResult> queryDeductionRule(@Valid @RequestBody DeductionRuleQueryPayload payload){
        DeductionRuleQueryCommnad command = payload.toCommand();
        Page<DeductionRuleEx> page = this.getDeductionRuleService().queryDeductionRuleByRuleNameAndCreateUser(command);
        return DeductionRuleTableResult.build(page);
    }


2.6 Field alias annotation


package com.sinoccdc.devops;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/**
 * Field alias annotation (author: syl)
 */
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
public @interface FieldAlias {
    String value() default "";
}

2.7 Annotating business fields

Mark the fields that should be recorded; when comparing beans, only annotated fields are considered.


/**
     * Rule name
     */
    @FieldAlias("规则名称")
    @Column(columnDefinition = "varchar(256 char) comment '规则名称'",name = "RULE_NAME", nullable = false, length = 256)
    private String ruleName;


2.8 Comparison utility: BeanCompareUtils

Uses reflection to compare bean fields, looking only at the annotated ones, and builds a FieldDiff per change; the resulting BeanDiff is stored as a request attribute (beanDiff) so the aspect can pick it up.


package com.sinoccdc.devops;

import com.alibaba.fastjson.JSON;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Service;
import org.springframework.web.context.request.RequestContextHolder;
import org.springframework.web.context.request.ServletRequestAttributes;

import javax.servlet.http.HttpServletRequest;
import java.lang.reflect.Field;
import java.util.*;


/**
 * Utility for computing field-level differences between two bean instances.
 * @author test
 */
@Service
@Slf4j
public class BeanCompareUtils{

    private static final String INCLUDE = "INCLUDE";
    private static final String EXCLUDE = "EXCLUDE";
    private static final String FILTER_TYPE = "FILTER_TYPE";
    private static final String FILTER_ARRAY = "FILTER_ARRAY";

    // per-thread storage for the filter type and the array of filtered field names
    private static ThreadLocal<Map<String, Object>> threadLocal = new ThreadLocal<>();


    public static BeanDiff getBeanDiff() {
        HttpServletRequest request = ((ServletRequestAttributes) RequestContextHolder.getRequestAttributes()).getRequest();
        Object beanDiff = request.getAttribute("beanDiff");
        return beanDiff != null ? (BeanDiff) beanDiff : null;
    }

    /**
     * Compares two beans of the same class, considering only @FieldAlias fields.
     */
    public static BeanDiff compare(Object oldBean, Object newBean) {
        BeanDiff beanDiff = new BeanDiff();

        Class<?> oldClass = oldBean.getClass();
        Class<?> newClass = newBean.getClass();

        if (oldClass.equals(newClass)) {
            List<Field> fieldList = new ArrayList<>();
            fieldList = getCompareFieldList(fieldList, newClass);

            Map<String, Object> map = threadLocal.get();
            // clear the per-thread filter so it cannot leak to the next task on a pooled thread
            threadLocal.remove();

            boolean needInclude = false;
            boolean needExclude = false;
            boolean hasArray = false;
            String[] fieldArray = null;

            if(map != null) {
                fieldArray = (String[])map.get(FILTER_ARRAY);
                String type = (String)map.get(FILTER_TYPE);

                if (fieldArray != null && fieldArray.length > 0) {
                    // sort so the binary search below works
                    Arrays.sort(fieldArray);
                    hasArray = true;

                    if (INCLUDE.equals(type)) {
                        needInclude = true;
                    } else if (EXCLUDE.equals(type)) {
                        needExclude = true;
                    }
                }
            }

            for (int i = 0; i < fieldList.size(); i ++) {
                Field field = fieldList.get(i);
                field.setAccessible(true);
                FieldAlias alias = field.getAnnotation(FieldAlias.class);

                try {
                    Object oldValue = field.get(oldBean);
                    Object newValue = field.get(newBean);

                    if (hasArray) {
                        // binary-search whether this field is excluded or included
                        int idx = Arrays.binarySearch(fieldArray, field.getName());

                        // skip fields explicitly excluded, or absent from the include list
                        if ((needExclude && idx > -1) || (needInclude && idx < 0)) {
                            continue;
                        }
                    }

                    if (nullableNotEquals(oldValue, newValue)) {
                        FieldDiff fieldDiff = new FieldDiff(field.getName(), alias.value(), oldValue, newValue);

                        // print the human-readable diff
                        System.out.println(fieldDiff.toString());

                        beanDiff.addFieldDiff(fieldDiff);
                    }

                } catch (IllegalArgumentException | IllegalAccessException e) {
                    e.printStackTrace();
                }
            }
        }

        return beanDiff;
    }

    /**
     * Compares two beans, considering only the listed fields.
     * @param includeFieldArray fields to include
     */
    public static BeanDiff compareInclude(Object oldBean, Object newBean, String[] includeFieldArray) {
        Map<String, Object> map = new HashMap<>();
        map.put(FILTER_TYPE, INCLUDE);
        map.put(FILTER_ARRAY, includeFieldArray);
        threadLocal.set(map);

        return compare(oldBean, newBean);
    }

    /**
     * Compares two beans, skipping the listed fields.
     * @param excludeFieldArray fields to exclude
     */
    public static BeanDiff compareExclude(Object oldBean, Object newBean, String[] excludeFieldArray) {
        Map<String, Object> map = new HashMap<>();
        map.put(FILTER_TYPE, EXCLUDE);
        map.put(FILTER_ARRAY, excludeFieldArray);
        threadLocal.set(map);

        return compare(oldBean, newBean);
    }


    /**
     * Collects the fields to compare (those annotated with @FieldAlias),
     * walking up through the superclasses.
     */
    private static List<Field> getCompareFieldList(List<Field> fieldList, Class clazz) {
        Field[] fieldArray = clazz.getDeclaredFields();

        List<Field> list = Arrays.asList(fieldArray);

        for (int i = 0; i < list.size(); i ++) {
            Field field = list.get(i);
            FieldAlias alias = field.getAnnotation(FieldAlias.class);
            if (alias != null) {
                fieldList.add(field);
            }
        }

        Class superClass = clazz.getSuperclass();
        if (superClass != null) {
            getCompareFieldList(fieldList, superClass);
        }
        return fieldList;
    }


    /**
     * Returns true when the two values differ; null-safe, and treats
     * an empty string and null as equal.
     */
    private static boolean nullableNotEquals(Object oldValue, Object newValue) {

        if (oldValue == null && newValue == null) {
            return false;
        }

        if (oldValue != null && oldValue.equals(newValue)) {
            return false;
        }

        return (!"".equals(oldValue) || newValue != null) && (!"".equals(newValue) || oldValue != null);

    }

}
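The core of BeanCompareUtils.compare() is easier to see in a self-contained sketch: reflect over the declared fields, keep only those carrying the alias annotation, and report each changed value as a human-readable sentence. Everything below (DiffSketch, Alias, Rule, diff) is an illustrative stand-in for the project's classes, not the project's actual code:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

public class DiffSketch {

    // Illustrative stand-in for the project's @FieldAlias annotation.
    @Retention(RetentionPolicy.RUNTIME)
    @interface Alias { String value(); }

    // Illustrative bean: only the annotated field takes part in the diff.
    public static class Rule {
        @Alias("规则名称") String ruleName;
        String internalId; // not annotated, so the diff ignores it
        public Rule(String ruleName, String internalId) {
            this.ruleName = ruleName;
            this.internalId = internalId;
        }
    }

    // Reflect over declared fields, keep only annotated ones,
    // and describe every value change.
    public static List<String> diff(Object oldBean, Object newBean) {
        List<String> changes = new ArrayList<>();
        for (Field f : oldBean.getClass().getDeclaredFields()) {
            Alias alias = f.getAnnotation(Alias.class);
            if (alias == null) {
                continue; // only compare annotated fields
            }
            f.setAccessible(true);
            try {
                Object ov = f.get(oldBean);
                Object nv = f.get(newBean);
                boolean changed = (ov == null) ? nv != null : !ov.equals(nv);
                if (changed) {
                    changes.add("将 " + alias.value() + " 从“" + ov + "” 修改为 “" + nv + "”");
                }
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
        return changes;
    }

    public static void main(String[] args) {
        // internalId changes too, but only ruleName is reported
        System.out.println(diff(new Rule("old-name", "A"), new Rule("new-name", "B")));
    }
}
```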

2.9 Storing the differences between two objects: BeanDiff


package com.sinoccdc.devops;

import java.util.ArrayList;
import java.util.List;

/**
 * The differences between two objects.
 */
public class BeanDiff {
    /**
     * All changed fields.
     */
    private List<FieldDiff> fieldDiffList = new ArrayList<>();

    public void addFieldDiff(FieldDiff fieldDiff) {
        this.fieldDiffList.add(fieldDiff);
    }

    public List<FieldDiff> getFieldDiffList() {
        return fieldDiffList;
    }

    public void setFieldDiffList(List<FieldDiff> fieldDiffList) {
        this.fieldDiffList = fieldDiffList;
    }


}


2.10 A field's old and new value: FieldDiff


package com.sinoccdc.devops;

/**
 * One changed field: its old and new value.
 */
public class FieldDiff {
    /**
     * 字段英文名
     */
    private String fieldENName;

    /**
     * 字段中文名
     */
    private String fieldCNName;

    /**
     * 旧值
     */
    private Object oldValue;

    /**
     * 新值
     */
    private Object newValue;


    public FieldDiff(String fieldENName, String fieldCNName, Object oldValue, Object newValue) {
        this.fieldENName = fieldENName;
        this.fieldCNName = fieldCNName;
        this.oldValue = oldValue;
        this.newValue = newValue;
    }

    // getters and setters omitted

    @Override
    public String toString() {
        String oldVal = this.oldValue == null ? "" : this.oldValue.toString();
        String newVal = this.newValue == null ? "" : this.newValue.toString();
        return "将 " + this.fieldCNName + " 从“" + oldVal + "” 修改为 “" + newVal + "”";
    }

}

With all the files above in place, any endpoint carrying the annotation produces an operation-log entry in the aspect whenever it is called.

II. Using analyzers in Elasticsearch

1. Why analyze (tokenize)?

After the configuration above, the user operation logs arrive in Elasticsearch with all content in the message field; to be able to search that content in Kibana, the message field needs to be analyzed (tokenized).

However, Logstash imports the logs into ES automatically, so the index and its mapping are created on the fly and are hard to change afterwards. We therefore configure an Elasticsearch analyzer template in Logstash ahead of time!

1.1 Download

Visit https://github.com/medcl/elasticsearch-analysis-ik/releases and pick the release matching your ES version.
You can download the source and build it yourself, or download a pre-built archive.

I use 7.10.2 here.


1.2 Unpack and install

Create a new folder named ik under the plugins directory of your ES installation, and extract the contents of the archive into it.

If you are running an ES cluster, don't forget to install the ik analyzer under every ES installation (all three nodes here)!

1.3 Restart Elasticsearch
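After restarting, you can confirm on each node that the plugin was loaded, e.g. from Kibana Dev Tools:

```
GET _cat/plugins
```

Every node should list analysis-ik together with its version.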

The IK analyzer has good support for Chinese; compared with ES's built-in analyzers, it copes far better with the subtleties of the language.
IK provides two analyzers, ik_max_word and ik_smart. What is the difference?

ik_max_word splits text at the finest granularity;
ik_smart splits at the coarsest granularity.

For example, analyzing the same sentence "这是一个对分词器的测试" with different analyzers gives different results:
ik_max_word: 这是/一个/一/个/对分/分词器/分词/词/器/测试
ik_smart: 这是/一个/分词器/测试
standard: 这/是/一/个/对/分/词/器/的/测/试
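Once ik is installed you can reproduce this comparison yourself in Kibana Dev Tools; swap the analyzer name between ik_max_word, ik_smart, and standard to compare outputs:

```
GET /_analyze
{
  "analyzer": "ik_max_word",
  "text": "这是一个对分词器的测试"
}
```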

2. Applying the analyzer

2.1 Define the Logstash template

Create a file named logstash-test-.json.

(Reportedly this template shape is suitable for 7.x.)


{
  "index_patterns": ["logstash-test-*"],
  "order" : 0,
  "version": 1,
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas":0
  },
  "mappings": {
    "date_detection": true,
    "numeric_detection": true,
    "dynamic_templates": [
      {
        "string_fields": {
          "match": "*",
          "match_mapping_type": "string",
          "mapping": {
            "type": "text",
            "norms": false,
            "analyzer": "ik_max_word",
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          }
        }
      }
    ]
  }
}

Put the file somewhere under your Logstash installation; I placed it under config:

C:\install\elk\logstash-7.10.2-windows-x86_64\logstash-7.10.2\config\

Notes:

"index_patterns": ["logstash-test-*"] restricts which indices the template applies to; use ["*"] to match every index.

"analyzer": "ik_max_word" selects the ik analyzer; to use a different analyzer, change it here.

"match": "*" matches field names.

"date_detection": true enables date type detection;
"numeric_detection": true enables numeric type detection.

2.2 Logstash configuration file

input {
  # stdin { }

  tcp {
    # host:port is the "destination" configured in the logback appender;
    # Logstash acts as the server here, listening on port 9250 for logback output.
    # host must be set to this machine's IP, otherwise Logstash will not start.
    host => "127.0.0.1"
    # port number
    port => 9250
    mode => "server"
    tags => ["tags"]
    # receive the log lines as JSON
    codec => json_lines
  }
}

filter {
  grok {
    match => [
      "message","%{NOTSPACE:tag}[T ]%{NOTSPACE:method}[T ]%{NOTSPACE:api}[T ]%{NOTSPACE:params}",
      "message","%{NOTSPACE:tag}[T ]%{NOTSPACE:author}[T ]%{NOTSPACE:msg}"
      ]
  }
}

output {

  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-test-%{+YYYY.MM.dd}"
    action => "index"
    template => "C:/install/elk/logstash-7.10.2-windows-x86_64/logstash-7.10.2/config/logstash-test-.json"
    template_name => "logstash-test-"
    manage_template => true
    template_overwrite => true
  }

  stdout { codec => rubydebug }

}
Notes:

① template_name: the template's name

② template: the template's location, i.e. the JSON file above

③ template_overwrite: overwrite the template if it already exists

④ manage_template: let Logstash manage the template

2.3 Check the result

2.3.1 Inspect the index mapping
GET <index name>/_mappings

Result: comparing the mapping before and after the template is applied, the message field is now analyzed.

2.4 Testing

# delete the index
DELETE /logstash-test-2021.03.31


# inspect the index mapping
GET logstash-test-2021.03.31/_mappings



# term query (matches a single analyzed token)
GET /logstash-test-2021.03.31/_search
{
  "query": {
    "term":{
      "message":"管理员"
    }
  }
}
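Note that a term query matches a single token from the inverted index, which works here because ik has already split message into tokens. To search text that spans several tokens, a match query, which also analyzes the query string, is usually the better choice; the query text below is only an example:

```
GET /logstash-test-2021.03.31/_search
{
  "query": {
    "match": { "message": "管理员登录" }
  }
}
```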

# analyzer test on the message field
GET /logstash-test-2021.03.29/_analyze
{

  "field": "message",
  "text": "aaa_Login" 
}

Once everything above is in place, a query on message finds the relevant logs quickly.

