First install Elasticsearch; for a walkthrough see https://blog.csdn.net/letterss/article/details/91361205
For the sync we use the Logstash tool. Download Logstash and extract the archive (the `z` flag assumes the usual gzip-compressed tarball):
tar -zxvf logstash-7.2.0.tar.gz
Copy the MySQL JDBC driver jar into the logstash-7.2.0/config directory; here I use mysql-connector-java-5.1.22-bin.jar.
Next, create a mysql.conf file in the logstash-7.2.0 directory:
cd logstash-7.2.0
vim mysql.conf
and write the incremental-sync pipeline configuration:
-----------------------------------------------------------------------------------------------------------------------------------------------------------
input {
  jdbc {
    jdbc_driver_library => "./config/mysql-connector-java-5.1.22-bin.jar"  # path relative to the logstash-7.2.0 directory we start from
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/itv_basic_v077"
    jdbc_user => "root"
    jdbc_password => "admin"
    record_last_run => true
    use_column_value => true
    tracking_column => "semi_id"                  # primary-key column used for incremental tracking
    last_run_metadata_path => "/ES/log/dtv_semi"  # file where the last-run value is persisted
    clean_run => false
    schedule => "* * * * *"                       # cron-style fields (left to right): minute hour day-of-month month day-of-week; all * means run every minute
    statement => "select * from dtv_semi where semi_id > :sql_last_value"  # note the colon; semi_id matches tracking_column above
    type => "con_vod"
  }
  jdbc {
    jdbc_driver_library => "./config/mysql-connector-java-5.1.22-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/itv_basic_v077"
    jdbc_user => "root"
    jdbc_password => "admin"
    record_last_run => true
    use_column_value => true
    tracking_column => "id"                           # primary-key column used for incremental tracking
    last_run_metadata_path => "/ES/log/dtv_semi_con"  # file where the last-run value is persisted
    clean_run => false
    schedule => "* * * * *"                           # cron-style fields (left to right): minute hour day-of-month month day-of-week; all * means run every minute
    statement => "select * from dtv_semi_con where id > :sql_last_value"  # note the colon; id matches tracking_column above
    type => "semi_con"
  }
}
output {
  stdout {
    codec => json_lines
  }
  elasticsearch {
    hosts => "localhost:9200"   # Elasticsearch address
    index => "house"            # index name
    document_type => "%{type}"  # dynamic reference to the type set on each input (mapping types are deprecated in Elasticsearch 7)
    document_id => "%{id}"      # primary-key field used as the document _id (note: dtv_semi's key is semi_id, so %{id} resolves only for dtv_semi_con rows)
  }
}
----------------------------------------------------------------------------------------------------------------------------------------------
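The incremental mechanism above can be sketched in a few lines of Python. This is a simulation of how the jdbc input behaves, not Logstash internals: on each scheduled run the last seen value of `tracking_column` is substituted for `:sql_last_value` in the statement, and the new maximum is persisted to `last_run_metadata_path`. The file name and row data below are made up for illustration.

```python
# Simulation of the jdbc input's sql_last_value tracking (illustrative
# names; this is not Logstash's actual implementation).
import os

STATE_FILE = "dtv_semi.state"  # stands in for last_run_metadata_path

def load_last_value():
    """Read the persisted tracking value; 0 if this is the first run."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return int(f.read().strip())
    return 0

def save_last_value(value):
    """Persist the tracking value, like last_run_metadata_path does."""
    with open(STATE_FILE, "w") as f:
        f.write(str(value))

def incremental_sync(rows):
    """Return only rows with semi_id > the persisted sql_last_value."""
    last = load_last_value()
    new_rows = [r for r in rows if r["semi_id"] > last]
    if new_rows:
        save_last_value(max(r["semi_id"] for r in new_rows))
    return new_rows

table = [{"semi_id": 1}, {"semi_id": 2}, {"semi_id": 3}]
first = incremental_sync(table)    # first run: everything is new
table.append({"semi_id": 4})       # a row is inserted between runs
second = incremental_sync(table)   # next run: only the new row
print(len(first), len(second))     # prints: 3 1
```

This is also why `clean_run => false` matters: resetting the metadata file (or setting `clean_run => true`) would make the next run re-fetch every row.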
Then start Logstash from the logstash-7.2.0 directory:
bin/logstash -f mysql.conf
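Once the pipeline is running, you can check that documents are arriving with a quick query against the `house` index (this assumes Elasticsearch is reachable at localhost:9200, as configured above, and that `curl` is installed):

```shell
# Count documents synced into the "house" index
curl -s "http://localhost:9200/house/_count?pretty"

# Fetch a few documents to eyeball the synced fields
curl -s "http://localhost:9200/house/_search?size=3&pretty"
```

The count should grow after each scheduled run whenever new rows have been inserted into the source tables.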
If you hit an error like the following:
Sending Logstash logs to /usr/local/logstash/logstash-6.5.0/logs which is now configured via log4j2.properties
[2018-11-20T12:23:45,931][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-11-20T12:23:46,088][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[2018-11-20T12:23:46,130][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
delete the .lock file under the data directory:
rm -rf data/.lock
then start Logstash again.