Integrating ELK into a Maven Project on Windows (Demo)

1. Preparation

1> A JDK 8 or later runtime, e.g.:

java version "9.0.1"

2> Download Elasticsearch, Logstash, and Kibana:

https://www.elastic.co/downloads

2. Installation and configuration

1> Add the following to elasticsearch-6.1.1\config\elasticsearch.yml:

http.cors.enabled: true
http.cors.allow-origin: "*"

Handy DOS commands:

netstat -ano|findstr "port"   (find the PID listening on a port)
tasklist|findstr "PID"        (look up the process name for that PID)

2> Test:

Press Win+R, type cmd, and change into <your install dir>\logstash-6.1.1\bin

<your install dir>\logstash-6.1.1\bin>logstash -e 'input { stdin { } } output { stdout {} }'

(Note: -e takes an inline config string; -f expects a config file path.)

io/console not supported; tty will not be manipulated
Default settings used: Filter workers: 4
Logstash startup completed
echo  hello world
{
       "message" => "echo  hello world\r",
      "@version" => "1",
    "@timestamp" => "2017-12-24T05:50:53.257Z",
          "host" => "your-hostname"
}

3. Usage

1> Run Elasticsearch from cmd: <your install dir>\elasticsearch-6.1.1\bin>elasticsearch.bat

2> Create logstash-test.conf in Logstash's bin directory:

input {
    tcp {
        host => "127.0.0.1"
        port => 8182
        mode => "server"
        ssl_enable => false
        codec => json_lines
    }
}
output {
    elasticsearch {
        hosts => "127.0.0.1:9200"
        index => "logstash-test"
    }
    stdout { codec => rubydebug }
}

(Elasticsearch index names must be lowercase, so a name like "testIndex" would be rejected; "logstash-test" is used throughout below.)
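The json_lines codec means Logstash expects one JSON object per newline-terminated line on that TCP socket, which is exactly what the logback appender configured in section 4 emits. A minimal sketch of that framing (a throwaway local ServerSocket stands in for Logstash here; everything in the snippet is illustrative, not Logstash code):

```java
import java.io.*;
import java.net.*;

public class JsonLinesDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for Logstash's tcp input: accept one connection, read one line.
        try (ServerSocket server = new ServerSocket(0)) { // ephemeral port
            int port = server.getLocalPort();

            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     PrintWriter out = new PrintWriter(
                             new OutputStreamWriter(s.getOutputStream(), "UTF-8"))) {
                    // One JSON object per line, newline-terminated: the json_lines framing.
                    out.print("{\"message\":\"hello elk\",\"level\":\"INFO\"}\n");
                    out.flush();
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
            sender.start();

            try (Socket client = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(client.getInputStream(), "UTF-8"))) {
                // Each readLine() yields one complete JSON event.
                System.out.println("received: " + in.readLine());
            }
            sender.join();
        }
    }
}
```

Each newline marks an event boundary, which is why a log line that is not valid JSON produces the parse warning mentioned later.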

Run Logstash from cmd: <your install dir>\logstash-6.1.1\bin>logstash -f logstash-test.conf

If it starts without errors and reports the pipeline is running, everything is fine.

3> Run Kibana from cmd: <your install dir>\kibana-6.1.1-windows-x86_64\bin>kibana.bat

Kibana only provides a convenient UI for browsing the data; you can skip starting it.

4. Maven project

1> Add the dependencies:

    <dependencies>
        <dependency>
            <groupId>org.codehaus.janino</groupId>
            <artifactId>janino</artifactId>
            <version>2.7.8</version>
        </dependency>
        <dependency>
            <groupId>net.logstash.logback</groupId>
            <artifactId>logstash-logback-encoder</artifactId>
            <version>4.11</version>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.2.3</version>
        </dependency>
        <dependency>
            <groupId>net.logstash.log4j</groupId>
            <artifactId>jsonevent-layout</artifactId>
            <version>1.7</version>
        </dependency>
    </dependencies>

2> Add to the logging configuration file (logback.xml):

    <!-- send logs to logstash -->
    <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>127.0.0.1:8182</destination>
        <!-- encoder is required -->
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root>
        <level value="INFO" />
        <appender-ref ref="LOGSTASH" />
    </root>

3> In your classes, log via org.slf4j.LoggerFactory, e.g.:

private static final Logger logger = LoggerFactory.getLogger(Test.class);

    @Test
    public void test2() {
        logger.info("test3: {}", "test3");
    }

(Note the {} placeholder: logger.info("test3:", "test3") would silently drop the second argument.)
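SLF4J substitutes arguments only where the pattern contains a {} placeholder. A simplified sketch of that substitution rule (the real logic lives in org.slf4j.helpers.MessageFormatter; this stand-in is illustrative only):

```java
public class PlaceholderDemo {
    // Simplified stand-in for SLF4J's MessageFormatter: replace each {} in order.
    static String format(String pattern, Object... args) {
        StringBuilder sb = new StringBuilder();
        int argIndex = 0, from = 0, at;
        while ((at = pattern.indexOf("{}", from)) >= 0 && argIndex < args.length) {
            sb.append(pattern, from, at).append(args[argIndex++]);
            from = at + 2;
        }
        sb.append(pattern.substring(from));
        return sb.toString(); // leftover args are silently ignored, like SLF4J
    }

    public static void main(String[] args) {
        System.out.println(format("test3: {}", "test3")); // test3: test3
        System.out.println(format("test3:", "test3"));    // test3:   (argument dropped)
    }
}
```

This is why the first call above shows the value in Kibana while the placeholder-less variant does not.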


*********************************

Once the project starts, it sends its logs to 127.0.0.1:8182. Logstash keeps listening on port 8182; whatever arrives is received and inserted into Elasticsearch (127.0.0.1:9200), creating the logstash-test index, and can then be viewed in Kibana.

When the project starts you will see the events appear in the Logstash console; if a JSON parse exception shows up there, the logs are still being sent, Logstash just failed to decode a non-JSON line.






Elasticsearch's DSL (domain-specific language) is a JSON-based query language. You can try it from cmd:

curl "http://localhost:9200/logstash-test/_search?q=*&pretty"



https://www.elastic.co/guide/en/elasticsearch/reference/6.x/query-dsl-query-string-query.html#query-string-syntax


5. Problems encountered and solutions

1> Best to keep the Elasticsearch, Logstash, and Kibana versions identical;

2> No idea what causes this one, and it does no harm; if you know, please tell me.


3> [2017-08-14T16:01:46,482][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"The setting `flush_size` in plugin `elasticsearch` is obsolete and is no longer available. This setting is no longer available as we now try to restrict bulk requests to sane sizes. See the 'Batch Sizes' section of the docs. If you think you still need to restrict payloads based on the number, not size, of events, please open a ticket. If you have any questions about this, you are invited to visit https://discuss.elastic.co/c/logstash and ask."}

Solution: Logstash 6 and later no longer support the `flush_size` setting in the elasticsearch output; delete it from the config. (The Node.js/npm steps below are prerequisites for the elasticsearch-head plugin, not a fix for this error — install Node.js first:)

npm install -g cnpm --registry=https://registry.npm.taobao.org
cnpm install --global gulp
npm install -g grunt-cli
npm install grunt --save-dev
npm install grunt-contrib-clean grunt-contrib-concat grunt-contrib-watch grunt-contrib-connect grunt-contrib-copy grunt-contrib-jasmine

4> Elasticsearch testing

Sample data: https://www.elastic.co/guide/en/kibana/3.0/import-some-data.html

Other datasets are easy to find online.

1> Creating an index

{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse"}],"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}},"status":400}
Solution: under Windows cmd the inner double quotes in the -d body must be escaped (as """ or \"), otherwise curl sends a mangled body that Elasticsearch cannot parse.


{"error":"Content-Type header [application/x-www-form-urlencoded] is not supported","status":406}
curl: (7) Failed to connect to  port 80: Connection refused
Solution: add -H "Content-Type:application/json;charset=UTF-8" (the connection-refused message also shows up when broken quoting mangles the URL).


{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"No handler for type [string] declared on field [speaker]"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping [_default_]: No handler for type [string] declared on field [speaker]","caused_by":{"type":"mapper_parsing_exception","reason":"No handler for type [string] declared on field [speaker]"}},"status":400}
Solution: the string type was removed; see packetbeat https://segmentfault.com/a/1190000008897731
The old string field type was split into two new data types: text for full-text search, and keyword for exact/keyword search.
Types supported by ES include: text, keyword, date, long, double, boolean, ip, object, nested, geo_point, geo_shape, completion.


Working version (keyword replaces the removed string/not_analyzed combination, and the mapping is sent when creating the index itself, not when indexing a document):
curl -H "Content-Type:application/json;charset=UTF-8" -XPUT "http://localhost:9200/shakespeare" -d "{\"mappings\":{\"emp\":{\"properties\":{\"speaker\":{\"type\":\"keyword\"},\"play_name\":{\"type\":\"keyword\"},\"line_id\":{\"type\":\"integer\"},\"speech_number\":{\"type\":\"integer\"}}}}}"


Or:

curl -H "Content-Type:application/json;charset=UTF-8" -XPUT "localhost:9200/mycompany/employee/1" -d "{\"first_name\":\"John\",\"last_name\":\"Smith\",\"age\":25,\"about\":\"Ilovetogorockclimbing\",\"interests\":[\"sports\",\"music\"]}" ;
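The escaping styles above exist only because Windows cmd strips unescaped double quotes from the -d body. A small sketch of producing the \"-escaped form (the JSON document and the command fragment are made up for the demo):

```java
public class CmdEscapeDemo {
    public static void main(String[] args) {
        String json = "{\"first_name\":\"John\",\"age\":25}";
        // For Windows cmd, each inner double quote must be escaped as \"
        String forCmd = json.replace("\"", "\\\"");
        System.out.println(json);                          // what ES should receive
        System.out.println("curl ... -d \"" + forCmd + "\""); // what you type in cmd
    }
}
```

If the quotes are not escaped, cmd swallows them and Elasticsearch sees the garbled body behind the "failed to parse" error shown earlier.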

2> Importing the data

curl -H "Content-Type:application/json;charset=UTF-8"  -XPOST "http://localhost:9200/account/_bulk?pretty" --data-binary @accounts.json  
curl -H "Content-Type:application/json;charset=UTF-8"  -XPOST "http://localhost:9200/shakespeare/emp/_bulk?pretty" --data-binary @shakespeare_6.0.json  
curl -H "Content-Type:application/json;charset=UTF-8"  -XPOST "http://localhost:9200/_bulk?pretty" --data-binary @logs.jsonl  
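The _bulk body those commands POST is newline-delimited JSON: an action line naming the target index/type/id, then the document source, pair after pair, ending with a newline. A sketch of assembling such a payload (index and type names follow the shakespeare example above; the documents themselves are made up):

```java
public class BulkPayloadDemo {
    public static void main(String[] args) {
        StringBuilder body = new StringBuilder();
        String[][] docs = {
            {"1", "HENRY IV"},
            {"2", "FALSTAFF"},
        };
        for (String[] d : docs) {
            // Action line: where the next source line should be indexed.
            body.append("{\"index\":{\"_index\":\"shakespeare\",\"_type\":\"emp\",\"_id\":\"")
                .append(d[0]).append("\"}}\n");
            // Source line: the document itself.
            body.append("{\"speaker\":\"").append(d[1]).append("\"}\n");
        }
        System.out.print(body); // POST this as the --data-binary body of /_bulk
    }
}
```

The downloadable sample files (accounts.json, shakespeare_6.0.json, logs.jsonl) are already in exactly this format, which is why --data-binary is used: it preserves the newlines that -d would strip.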


Check whether the data was imported:  curl "http://localhost:9200/_cat/indices?v"

Internally, Elasticsearch stores all date/time fields in UTC. When querying or displaying them you need to convert; for UTC+8 (China) that means adding 8 hours.
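For example, the @timestamp from the stdin test in section 2 converts like this with java.time (assuming the Asia/Shanghai zone for UTC+8):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class TimestampDemo {
    public static void main(String[] args) {
        // @timestamp values come back from Elasticsearch in UTC...
        Instant utc = Instant.parse("2017-12-24T05:50:53.257Z");
        // ...so shift to Asia/Shanghai (UTC+8) for display.
        ZonedDateTime local = utc.atZone(ZoneId.of("Asia/Shanghai"));
        System.out.println(local); // 2017-12-24T13:50:53.257+08:00[Asia/Shanghai]
    }
}
```

Kibana does this conversion automatically using the browser's timezone; only raw curl queries show the UTC values.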

Check whether the log documents were inserted:  curl http://localhost:9200/shakespeare/_search?pretty


6. Elasticsearch-head

Installation for ES 6 and above: https://www.cnblogs.com/xing901022/p/6030296.html


