Preface
This article is based on Elasticsearch 7.3.0.
Overview
edge_ngram and ngram are both available in Elasticsearch as built-in tokenizers and as built-in token filters.
Example
Steps
- Define two custom analyzers, edge_ngram_analyzer and ngram_analyzer
- Run a tokenization test with each
Create the test index
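Before the Elasticsearch example, the difference between the two can be sketched in plain Python (this is an illustration only, not Elasticsearch code; the function names and the min_gram=1 / max_gram=3 defaults are assumptions for the sketch): ngram emits substrings of every length in the configured range starting at every position, while edge_ngram only emits prefixes of the token.

```python
def ngrams(text, min_gram=1, max_gram=3):
    # All substrings with length min_gram..max_gram, from every start position.
    return [text[i:i + n]
            for n in range(min_gram, max_gram + 1)
            for i in range(len(text) - n + 1)]

def edge_ngrams(text, min_gram=1, max_gram=3):
    # Only prefixes of the token, with length min_gram..max_gram.
    return [text[:n] for n in range(min_gram, min(max_gram, len(text)) + 1)]

print(edge_ngrams("quick"))  # ['q', 'qu', 'qui']
print(ngrams("ab"))          # ['a', 'b', 'ab']
```

This prefix-only behavior is why edge_ngram is the usual choice for search-as-you-type matching, while ngram also matches fragments in the middle of a word.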
PUT analyzer_test
{
  "settings": {
    "refresh_interval": "1s",
    "index": {
      "max_ngram_diff": 10
    },
    "analysis": {
      "analyzer": {
        "edge_ngram_analyzer": {
          "type": "custom",
          "char_filter": [],
          "tokenizer": "keyword",
          "filter": [
            "edge_ngram_filter"
          ]
        },
        "ngram_analyzer": {
          "type": "custom",
          "char_filter": [],
          "tokenizer": "keyword",
          "filter": [