A quick note recording how to install the IK analyzer plugin (fuller write-up to come).
- Install (remember to replace the version numbers after `v` and after `ik` with the version that matches your Elasticsearch):
[root@localhost elasticsearch-5.5.2]# ./bin/elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v5.5.2/elasticsearch-analysis-ik-5.5.2.zip
- After installation there are two new directories, which you can inspect with the following commands.
Directory 1: the dictionary files and dictionary configuration
[root@localhost elasticsearch-5.5.2]# ls config/analysis-ik/
Directory 2: the plugin's jar files and plugin configuration
[root@localhost elasticsearch-5.5.2]# ls plugins/analysis-ik/
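The dictionary configuration in the first directory is the file `IKAnalyzer.cfg.xml`, which is where custom word lists are registered. A sketch of adding an extension dictionary; the `custom/mydict.dic` path is a made-up example, and the dictionary file itself should be UTF-8 encoded with one word per line:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>IK Analyzer extension configuration</comment>
    <!-- relative to config/analysis-ik/; path below is illustrative -->
    <entry key="ext_dict">custom/mydict.dic</entry>
    <!-- optional custom stopword file; left empty here -->
    <entry key="ext_stopwords"></entry>
</properties>
```

Elasticsearch needs a restart for newly registered dictionary files to take effect.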
- Restart Elasticsearch, then check the tokenization results in Kibana:
GET _analyze?pretty
{
  "analyzer": "ik_smart",
  "text": "京东金融"
}

GET _analyze?pretty
{
  "analyzer": "ik_max_word",
  "text": "京东金融"
}
Tokenization result:
{
  "tokens": [
    {
      "token": "京东",
      "start_offset": 0,
      "end_offset": 2,
      "type": "CN_WORD",
      "position": 0
    },
    {
      "token": "金融",
      "start_offset": 2,
      "end_offset": 4,
      "type": "CN_WORD",
      "position": 1
    }
  ]
}
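When `_analyze` is called from an application rather than from Kibana, the response above is plain JSON. A minimal Python sketch of pulling the terms out of it (the response body here is copied verbatim from the result above):

```python
import json

# The _analyze response shown above (ik_smart on "京东金融").
raw = """
{
  "tokens": [
    {"token": "京东", "start_offset": 0, "end_offset": 2,
     "type": "CN_WORD", "position": 0},
    {"token": "金融", "start_offset": 2, "end_offset": 4,
     "type": "CN_WORD", "position": 1}
  ]
}
"""
response = json.loads(raw)

# Extract just the terms, ordered by their position in the original text.
terms = [t["token"] for t in sorted(response["tokens"],
                                    key=lambda t: t["position"])]
print(terms)  # → ['京东', '金融']
```

To actually index with IK rather than just test it, the same analyzer names are used in a field mapping, e.g. `"analyzer": "ik_max_word"` on a `text` field.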