Download logstash-7.10.1-linux-x86_64.tar.gz

Unpack it and enter the directory:
tar -zxf logstash-7.10.1-linux-x86_64.tar.gz
cd logstash-7.10.1
Create a new configuration file, logstash_springboot.conf, in the config directory:
input {
  tcp {
    port => 4560
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "springboot-%{+YYYY.MM.dd}"
  }
}
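The json_lines codec reads newline-delimited JSON: each event is one JSON document terminated by a newline. A Spring Boot app normally produces this framing through a library such as logstash-logback-encoder; the shape can be sketched by hand (the field names here are illustrative, not prescribed by Logstash):

```shell
# One event per line -- the framing the json_lines codec expects on port 4560.
EVENT='{"message":"user login","level":"INFO","app":"demo"}'
printf '%s\n' "$EVENT"
```

Once Logstash is running, piping such a line to port 4560 (for example with `nc localhost 4560`) delivers it to the pipeline.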
Start it in the foreground:
./bin/logstash -f config/logstash_springboot.conf
Or start it in the background:
nohup ./bin/logstash -f config/logstash_springboot.conf > /dev/null 2>&1 &
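The nohup/redirect/& combination detaches the process from the terminal and discards its output. The same pattern can be illustrated with a harmless stand-in command:

```shell
# Same backgrounding pattern, with `sleep` standing in for Logstash.
nohup sleep 30 > /dev/null 2>&1 &
BG_PID=$!          # $! holds the PID of the detached process
kill -0 "$BG_PID"  # exit status 0 while it is still running
kill "$BG_PID"     # clean up the stand-in
```

Capturing `$!` right after launching makes it easy to stop the background Logstash later with a plain `kill`.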
Problem 1:
The index cannot be created.
[2021-01-13T10:07:15,161][WARN ][logstash.outputs.elasticsearch][main][e6e2fccab1e25a77be1ca1bf10ed265fc3b7f51eae2d3db7249a7db5eaa74cf2] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"springboot-2021.01.13", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x29233dd5>], :response=>{"index"=>{"_index"=>"springboot-2021.01.13", "_type"=>"_doc", "_id"=>nil, "status"=>400, "error"=>{"type"=>"validation_exception", "reason"=>"Validation Failed: 1: this action would add [2] total shards, but this cluster currently has [1000]/[1000] maximum shards open;"}}}}
Cause:
Since version 7, Elasticsearch allows at most 1000 open shards per node by default (cluster.max_shards_per_node). The cluster has hit that ceiling (1000/1000), so the shards for the new daily index cannot be created.
Fix:
Raise the limit with a transient cluster setting in Kibana's Dev Tools:
PUT /_cluster/settings
{
  "transient": {
    "cluster": {
      "max_shards_per_node": 10000
    }
  }
}
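Where Kibana is not available, the same request can be issued with curl (a cluster on localhost:9200 is assumed):

```shell
# Apply the transient setting over the REST API; assumes ES on localhost:9200.
BODY='{"transient":{"cluster":{"max_shards_per_node":10000}}}'
curl -s -X PUT 'http://localhost:9200/_cluster/settings' \
     -H 'Content-Type: application/json' -d "$BODY" \
  || echo "Elasticsearch is not reachable on localhost:9200"
```

Note that a transient setting is lost when the cluster fully restarts; use "persistent" instead of "transient" to keep it across restarts. Raising the limit treats the symptom only: deleting old springboot-* daily indices, or adding nodes, addresses the underlying shard growth.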