First, make sure the field names in the index mapping are identical to the field names in the JSON you write; otherwise you will run into unexpected behavior. Initially, the field in the JSON string stored in Redis was regular_detail_hour, but when creating the index I had written regular_detail_time. Writing through Logstash to ES reported no error; ES simply created a new, dynamically mapped field named regular_detail_hour.
A JSON string is built from a bean instance and then pushed into a Redis list via RedisTemplate:

ListOperations<String, String> listOps = redisTemplate.opsForList();
listOps.rightPush(key, value);
return true;

This writes the data into Redis as a list.
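As a minimal, dependency-free sketch of the producer side (the class name, method, and rpt_id value are illustrative assumptions; only rept_col_time and the "yyyy-MM-dd HH:mm:ss" pattern come from the pipeline itself), the key point is that every date field must be serialized in exactly the format the ES mapping declares:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class DiPayload {
    // Must match the "format" declared in the ES mapping exactly.
    private static final DateTimeFormatter ES_DATE =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Build the JSON string that gets rightPush-ed into the Redis list.
    // Hand-rolled here to stay self-contained; a real producer would
    // serialize the bean with Jackson or a similar mapper.
    static String build(String rptId, LocalDateTime colTime) {
        return String.format(
                "{\"rpt_id\":\"%s\",\"rept_col_time\":\"%s\"}",
                rptId, colTime.format(ES_DATE));
    }

    public static void main(String[] args) {
        String json = build("r-001", LocalDateTime.of(2020, 1, 2, 3, 4, 5));
        System.out.println(json);
        // {"rpt_id":"r-001","rept_col_time":"2020-01-02 03:04:05"}
    }
}
```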
Logstash consumes the data with message-queue semantics: each entry is deleted from the list as soon as it is read.
input {
  redis {
    host      => "172.19.112.4"
    data_type => "list"
    key       => "di"
    type      => "di"
  }
}
filter {
  if [type] == "di" {
    json   { source => "message" }
    mutate { remove_field => ["message", "tags"] }
    date {
      match    => [ "rept_col_time", "yyyy-MM-dd HH:mm:ss" ]
      target   => "@timestamp"
      locale   => "en"
      timezone => "+00:00"
    }
  }
}
output {
  if [type] == "di" {
    if "_grokparsefailure" not in [tags] and "_groktimeout" not in [tags] {
      elasticsearch {
        hosts => ["10.96.91.208:9200", "10.96.91.209:9200", "10.96.91.210:9200", "10.96.91.211:9200"]
        index => "di"
      }
    }
  }
}
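What the date filter above does can be sketched in plain Java (a rough equivalent for illustration, not Logstash's actual implementation): the string in rept_col_time is parsed with the declared pattern, interpreted in the +00:00 timezone, and the result becomes the event's @timestamp in ISO-8601 form:

```java
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class DateFilterSketch {
    // Same pattern as the Logstash date filter's match setting.
    private static final DateTimeFormatter PATTERN =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Parse rept_col_time in UTC (timezone => "+00:00") and render it
    // the way @timestamp is stored.
    static String toTimestamp(String reptColTime) {
        return LocalDateTime.parse(reptColTime, PATTERN)
                .atOffset(ZoneOffset.UTC)
                .toInstant()
                .toString();
    }

    public static void main(String[] args) {
        System.out.println(toTimestamp("2020-01-02 03:04:05"));
        // 2020-01-02T03:04:05Z
    }
}
```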
On the other side, in Elasticsearch, the date fields must also be handled: the mapping's date format must match the format of the strings in the JSON exactly, or indexing will fail.
PUT di
{
  "mappings": {
    "di": {
      "properties": {
        "@timestamp":          { "type": "date" },
        "@version":            { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
        "rpt_id":              { "type": "keyword" },
        "iiiii":               { "type": "keyword" },
        "data_type":           { "type": "keyword" },
        "rpt_arrive_time":     { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "filename":            { "type": "keyword" },
        "observe_time":        { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "regular_detail_hour": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "bbb":                 { "type": "keyword" },
        "rept_validity":       { "type": "short" },
        "rept_status":         { "type": "short" },
        "rept_col_time":       { "type": "date", "format": "yyyy-MM-dd HH:mm:ss" },
        "data_source":         { "type": "keyword" },
        "data_source_name":    { "type": "keyword" },
        "part_day":            { "type": "short" },
        "data_type_new":       { "type": "keyword" }
      }
    }
  }
}
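A cheap way to catch a format mismatch before the document ever reaches ES is a strict parse check on the producer side (a defensive sketch I am adding here, not part of the original code; the class and method names are assumptions):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

public class DateFormatCheck {
    // The format every date field in the mapping declares.
    private static final DateTimeFormatter ES_FORMAT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // True only if the value matches the mapping's date format exactly.
    static boolean matchesMapping(String value) {
        try {
            LocalDateTime.parse(value, ES_FORMAT);
            return true;
        } catch (DateTimeParseException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(matchesMapping("2020-01-02 03:04:05"));  // true
        System.out.println(matchesMapping("2020-01-02T03:04:05"));  // false: 'T' separator
    }
}
```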
That's all.