1. Purpose:
Import v2ex.txt into Elasticsearch for analysis and visualize the results in Kibana. The Elastic Stack offers many handy tools; this walkthrough uses:
Filebeat (collect the text file) => Logstash (JSON transformation) => Elasticsearch (storage) => Kibana (chart visualization)
2. Pull the images:
Following the official site, this walkthrough uses the latest version, 7.6.2.
3. Create the files and the corresponding host directories to be mounted
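The post does not list the exact commands; a minimal sketch of the host layout implied by the volume mounts in the docker-compose.yml below (directory and file names are taken from those mounts, adjust as needed):

```shell
# Host directories referenced by the compose file's bind mounts.
mkdir -p es_data logstash/conf.d filebeat

# Elasticsearch inside the container runs as uid 1000; its data
# directory on the host must be writable by that user.
chmod 777 es_data   # or: sudo chown -R 1000:1000 es_data

# Placeholder files that the compose file bind-mounts into the containers.
touch logstash/conf.d/logstash.conf filebeat/filebeat.docker.yml filebeat/v2ex.txt
```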
4. Quick single-node orchestration: docker-compose.yml
version: '2'
services:
  es:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.6.2
    container_name: es
    ports:
      - 9200:9200
      - 9300:9300
    expose:
      - 9200
    ulimits:
      nofile:
        soft: 65536
        hard: 65536
    environment:
      - discovery.type=single-node
      - ES_JAVA_OPTS=-Xms256M -Xmx256M
    volumes:
      - ./es_data:/usr/share/elasticsearch/data
  kibana:
    image: docker.elastic.co/kibana/kibana:7.6.2
    container_name: kibana
    ports:
      - 5601:5601
    external_links:
      - es
    environment:
      - ELASTICSEARCH_HOSTS=http://es:9200
  logstash:
    container_name: logstash
    image: docker.elastic.co/logstash/logstash:7.6.2
    ports:
      - 5044:5044
    volumes:
      - ./logstash/:/usr/share/logstash/config/
      - ./logstash/conf.d/logstash.conf:/config_d/logstash.conf
    depends_on:
      - es
      - kibana
    command: bash -c "logstash -f /config_d --config.reload.automatic"
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.6.2
    container_name: filebeat
    external_links:
      - es
      - kibana
    volumes:
      - ./filebeat/filebeat.docker.yml:/usr/share/filebeat/filebeat.yml
      - ./filebeat/v2ex.txt:/v2ex/v2ex.txt
    depends_on:
      - es
      - kibana
      - logstash
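The compose file mounts ./logstash/conf.d/logstash.conf, but the post does not show its contents. A minimal sketch that listens for Beats input on 5044, parses each line as JSON, and writes to ES (the index name v2ex is an assumption; adjust to match your data):

```conf
input {
  beats {
    port => 5044
  }
}

filter {
  # Each line of v2ex.txt is assumed to be one JSON document.
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://es:9200"]
    index => "v2ex"   # assumed index name
  }
}
```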
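Likewise, a minimal filebeat.docker.yml matching the mounts above might look like this; the input path matches the bind mount of v2ex.txt in the compose file, and output goes to the logstash service rather than directly to ES:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /v2ex/v2ex.txt

output.logstash:
  hosts: ["logstash:5044"]
```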
5. Now the data can be sliced and analyzed in Kibana:
(1) First, the Kibana home page shows the V2EX data: more than 20,000 records
(2) Next, a look at the 20 hottest categories
(3) Then the top 5 posts by reply count, to see what made them so popular
(4) What about posts on *[overtime -> 996 -> "blessings" -> changing careers -> starting a business]*? It turns out they are quite popular too
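For reference, a chart like "top 20 categories" corresponds roughly to a terms aggregation, which can be tried in Kibana Dev Tools; the index name v2ex and the field node.keyword are assumptions about the document schema:

```json
GET v2ex/_search
{
  "size": 0,
  "aggs": {
    "hot_nodes": {
      "terms": { "field": "node.keyword", "size": 20 }
    }
  }
}
```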