Installing logstash-2.3.1 and using it with Kafka
Notes:
1. Logstash requires JDK 1.7.0_55 (Java 7u55) or later (see my other post for JDK installation: https://blog.csdn.net/qq_16563637/article/details/81738113)
2. Logstash is written in JRuby and runs on the JVM
3. Logstash guards against data loss and supports many input sources (https://www.elastic.co/guide/en/logstash/2.3/input-plugins.html)
4. Download: https://artifacts.elastic.co/downloads/logstash/logstash-2.3.1.tar.gz
5. If Logstash is installed alongside Elasticsearch, the two versions must match
6. Logstash 5.x no longer supports Kafka 0.8; it supports Kafka 0.9
Install Logstash
Unpack the archive:

tar zxf logstash-2.3.1.tar.gz -C /bigdata/

su bigdata
-----------------Logstash config file jdbc-kafka.conf-------------------------
First install Logstash's jdbc plugin.
Notes:
1. On 2.x you must install the logstash-input-jdbc plugin yourself; 5.x already bundles it, so no extra installation is needed
2. On 2.x, run the install command from Logstash's bin directory
(this post may help with installing the jdbc plugin: https://blog.csdn.net/fishinhouse/article/details/81302105)

    yum install gem
    chown -R bigdata:bigdata /{bigdata,data}
    cd bin
    ./logstash-plugin install logstash-input-jdbc   # on 2.x the script may be named ./plugin instead

3. Start Kafka before starting Logstash
Create a directory under the Logstash installation directory; the name is arbitrary, mine is mysqletc:
mkdir mysqletc
If this fails with mkdir: cannot create directory "mysqletc": Permission denied, run:
su root
chown -R bigdata:bigdata /{bigdata,data}
cd mysqletc
First create a file named mysql.conf:
vi mysql.conf

input {
	jdbc {
		# MySQL connection string; sparkTest is the database name
		jdbc_connection_string => "jdbc:mysql://192.168.1.103:3306/sparkTest"
		# Username and password
		jdbc_user => "root"
		jdbc_password => "123456"
		# JDBC driver jar
		jdbc_driver_library => "/bigdata/logstash-5.5.1/mysqletc/mysql-connector-java-5.1.41.jar"
		# Driver class
		jdbc_driver_class => "com.mysql.jdbc.Driver"
		# Splits the SQL statement into multiple queries that use LIMIT and OFFSET
		# to retrieve the full result set. The page size is set with jdbc_page_size;
		# note that ordering between the queries is not guaranteed.
		jdbc_paging_enabled => "true"
		jdbc_page_size => "50000"
		# Path + name of the SQL file to execute
		# example: statement => "SELECT id, mycolumn1, mycolumn2 FROM my_table WHERE id > :id"
		statement_filepath => "/bigdata/logstash-5.5.1/mysqletc/sqlOne/sumSalary.sql"
		# Polling schedule; the fields (left to right) are minute, hour, day, month, year.
		# All * means run every minute, like a Linux cron entry.
		schedule => "* * * * *"
		# With multiple inputs you can tag each one with a type field and branch on it
		# in the output section (if [type] == "jdbc" then ...)
		# type => "jdbc"  -- not needed here; for incremental loads we track a column instead
		# When true, the value of tracking_column is used as :sql_last_value;
		# when false, :sql_last_value reflects the time the query last ran.
		use_column_value => true
		tracking_column => "salary_id"
		# Whether to persist the run state
		record_last_run => true
		# Path of the file holding the last run's state
		last_run_metadata_path => "/bigdata/logstash-5.5.1/mysqletc/sqlOne/my_info"
	}
}
#### A filter block can be added here if you need to delete, modify, or truncate fields.
output {
	kafka {
	  topic_id => "sumSalary"
	  codec => plain {
		format => "%{message}"
		charset => "UTF-8"
	  }
	  bootstrap_servers => "192.168.1.103:9092"
	}
}

Create the SQL file:

cd /bigdata/logstash-5.5.1/mysqletc
mkdir sqlOne
cd sqlOne 
vi sumSalary.sql
select sa.user_idx_id, sa.base_salary, sa.overtime_salary, sa.performance_salary,
       sa.meal_salary, sa.speech_allowance, sa.dedcut_money, sa.dedcut_social_security, sa.payroll
from sparkTest.salary sa, sparkTest.user us
where sa.user_idx_id = us.user_id
  and sa.salary_year = "2018"
  and sa.salary_id > :sql_last_value

Save.
Create the last_run_metadata_path file:

cd /bigdata/logstash-5.5.1/mysqletc/sqlOne/
vi my_info

Save.
Upload mysql-connector-java-5.1.41.jar to /bigdata/logstash-5.5.1/mysqletc/
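The incremental pieces above (tracking_column, :sql_last_value, last_run_metadata_path) work together: on each scheduled run the persisted tracking value is substituted into the SQL, and the highest salary_id seen is written back. A minimal Python sketch of that loop (hypothetical, for illustration only, not the actual logstash-input-jdbc code):

```python
# Hypothetical sketch of logstash-input-jdbc's incremental polling:
# substitute the persisted tracking value for :sql_last_value, fetch the
# new rows, and advance the high-water mark (as last_run_metadata_path does).

SQL_TEMPLATE = "select * from salary where salary_id > :sql_last_value"

def build_query(template, last_value):
    # :sql_last_value is replaced with the saved tracking_column value
    return template.replace(":sql_last_value", str(last_value))

def poll(rows, last_value):
    """Return rows newer than last_value plus the new high-water mark."""
    fresh = [r for r in rows if r["salary_id"] > last_value]
    if fresh:
        last_value = max(r["salary_id"] for r in fresh)
    return fresh, last_value

# Simulate two schedule ticks against a growing table.
table = [{"salary_id": 1}, {"salary_id": 2}]
first, mark = poll(table, 0)      # first run: both rows are new
table.append({"salary_id": 3})
second, mark = poll(table, mark)  # second run: only salary_id 3 is returned
```

Because record_last_run is true, restarting Logstash picks the saved mark back up from my_info instead of re-reading the whole table.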
Start Logstash:

cd /bigdata/logstash-5.5.1
bin/logstash agent -f mysqletc/mysql.conf

Starting Kafka is not covered here again.
Test consuming the data from Kafka:

su root
cd kafka_2.10-0.8.2.1
bin/kafka-console-consumer.sh --zookeeper 192.168.1.103:2181 --topic sumSalary --from-beginning

-----------------Logstash config file flow-es.conf----------------------------

input {
  file {
    type => "flow"
    path => "/var/nginx_logs/*.log"
    discover_interval => 5
    start_position => "beginning" 
  }
}

output {
  if [type] == "flow" {
    elasticsearch {
      index => "flow-%{+YYYY.MM.dd}"
      hosts => ["172.16.0.14:9200", "172.16.0.15:9200", "172.16.0.16:9200"]
    }
  }  
}

File notes:
1. The file must follow Logstash's JRuby-based config syntax
2. This config ships the flow data to Elasticsearch
3. path is the location of the log files
4. discover_interval => 5 means the watched path is checked for new files every 5 seconds
5. start_position => "beginning" means files are read from the beginning (already-read data is not re-read); the default is "end", which reads only from the current position onward
6. hosts => lists the Elasticsearch cluster, written as "ip:port,ip:port,ip:port"
7. a pattern such as path => "/var/nginx_logs/*/*.log" lets path watch nested subdirectories
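The "already-read data is not re-read" behavior comes from Logstash's sincedb, which remembers how far into each file it has read. A toy Python sketch of the idea (an illustration assuming a simple in-memory offset map, not the real sincedb format):

```python
import io

# Toy sincedb: remember how far into each file we have read, so a later
# poll only yields the lines appended since the previous poll.
offsets = {}

def tail(name, content):
    """Return the lines of `content` added since the last call for `name`."""
    pos = offsets.get(name, 0)    # 0 mimics start_position => "beginning"
    buf = io.StringIO(content)
    buf.seek(pos)
    new_lines = buf.read().splitlines()
    offsets[name] = len(content)  # persist the new offset, like sincedb
    return new_lines
```

The first call for a file returns everything; subsequent calls return only the appended lines, which is why restarting Logstash does not re-ship old log data.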
-----------------Logstash config file flow-kafka.conf----------------------------

input {
  file {
    path => "/var/nginx_logs/*.log"
    discover_interval => 5
    start_position => "beginning" 
  }
}

output {
	kafka {
	  topic_id => "test"
	  codec => plain {
		format => "%{message}"
		charset => "UTF-8"
	  }
	  bootstrap_servers => "192.168.1.103:9092"
	}
}

File notes:
1. The file must follow Logstash's JRuby-based config syntax
2. This config ships the flow data to Kafka
3. path is the location of the log files
4. discover_interval => 5 means the watched path is checked for new files every 5 seconds
5. bootstrap_servers => lists the Kafka cluster, written as "ip:port,ip:port,ip:port"
6. start_position => "beginning" means files are read from the beginning (already-read data is not re-read); the default is "end"
7. plain is a simple string codec; JSON can also be sent
8. format => "%{message}" emits the raw message; %{message} works like an EL-style expression
9. charset => "UTF-8" sets the encoding; values such as "UTF-8" or "GB2312" can be used
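The %{field} references in format use Logstash's event sprintf syntax. A toy Python equivalent of that substitution (hypothetical, for illustration only):

```python
import re

def event_sprintf(fmt, event):
    # Replace each %{field} reference with the matching event field,
    # roughly what the plain codec's `format` option does per event.
    return re.sub(r"%\{(\w+)\}",
                  lambda m: str(event.get(m.group(1), "")),
                  fmt)

event = {"message": "hello kafka", "type": "flow"}
rendered = event_sprintf("%{message}", event)  # -> "hello kafka"
```

With format => "%{message}", only the raw log line reaches Kafka, instead of Logstash's full JSON-serialized event.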
-------------------------------------Start Logstash---------------------------------------------
Kafka must be running before Logstash starts; Kafka installation and startup are not covered here again.
Start Logstash:

cd logstash-2.3.1

// create a directory for the config files

mkdir config
cd config

Put flow-es.conf and flow-kafka.conf into it.

cd ..

Run the startup command:

bin/logstash agent -f config/flow-kafka.conf

Create a topic in Kafka:

bin/kafka-topics.sh --create --zookeeper 192.168.1.103:2181 --replication-factor 1 --partitions 1 --topic test

Put a log file under /var/nginx_logs/, for example track.log:

{"time" : "1461368569.041","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368648.620","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "http://www.17173.com/","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368823.966","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368825.504","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368826.360","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368827.195","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368827.439","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368827.639","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368827.845","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368828.053","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
{"time" : "1461368828.274","client" : "10.0.0.1","domain" : "www.lfg176.com","url" : "http://www.lfg176.com/","title" : "www.lfg176.com","referrer": "","sh" : "768","sw" : "1280","cd" : "24","lang" : "zh-CN","ua" : "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36","trace" : "2885852891655e495730b2ce733a259b","type" : "1"}
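Each sample line is a self-contained JSON object, so a Kafka consumer can parse it with any JSON library; for example, with Python's standard json module (an abridged line is shown for brevity):

```python
import json

# One of the sample tracking-log lines above, abridged for brevity.
line = ('{"time" : "1461368569.041","client" : "10.0.0.1",'
        '"domain" : "www.lfg176.com","type" : "1"}')

record = json.loads(line)
client = record["client"]   # -> "10.0.0.1"
```

Because the plain codec with format => "%{message}" passes the line through unchanged, the consumer sees exactly this JSON text.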

Consume the data from Kafka:

cd kafka_2.10-0.8.2.1
bin/kafka-console-consumer.sh --zookeeper 192.168.1.103:2181 --topic test --from-beginning

The data is output successfully.
