Linux Logstash startup script: Logstash shutdown stalls when starting Logstash from a bash script on Linux

I wrote a bash script that finds the CSV files in a given folder and pipes them into Logstash with the matching config file. However, when I run the script I get the following error saying that the shutdown process has stalled, which causes an endless loop until I stop it manually with Ctrl-C:

[2018-03-22T08:59:53,833][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}

[2018-03-22T08:59:54,211][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

[2018-03-22T08:59:57,970][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}

[2018-03-22T08:59:58,116][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0xf6851b3 run>"}

[2018-03-22T08:59:58,246][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

[2018-03-22T08:59:58,976][INFO ][logstash.outputs.file ] Opening file {:path=>"/home/kevin/otrs_customer_user"}

[2018-03-22T09:00:06,471][WARN ][logstash.shutdownwatcher ] {"inflight_count"=>0, "stalling_thread_info"=>{["LogStash::Filters::CSV", {"separator"=>";", "columns"=>["IOT", "OID", "SUM", "XID", "change_by", "change_time", "city", "company", "company2", "create_by", "create_time", "customer_id", "email", "fax", "first_name", "id", "inst_city", "inst_first_name", "inst_last_name", "inst_street", "inst_zip", "last_name", "login", "mobile", "phone", "phone2", "street", "title", "valid_id", "varioCustomerId", "zip"], "id"=>"f1c74146d6672ca71f489aac1b4c2a332ae515996657981e1ef44b441a7420c8"}]=>[{"thread_id"=>23, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:90:in `read_batch'"}]}}

[2018-03-22T09:00:06,484][ERROR][logstash.shutdownwatcher ] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.

[2018-03-22T09:00:11,438][WARN ][logstash.shutdownwatcher ] {"inflight_count"=>0, "stalling_thread_info"=>{["LogStash::Filters::CSV", {"separator"=>";", "columns"=>["IOT", "OID", "SUM", "XID", "change_by", "change_time", "city", "company", "company2", "create_by", "create_time", "customer_id", "email", "fax", "first_name", "id", "inst_city", "inst_first_name", "inst_last_name", "inst_street", "inst_zip", "last_name", "login", "mobile", "phone", "phone2", "street", "title", "valid_id", "varioCustomerId", "zip"], "id"=>"f1c74146d6672ca71f489aac1b4c2a332ae515996657981e1ef44b441a7420c8"}]=>[{"thread_id"=>23, "name"=>nil, "current_call"=>"[...]/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:90:in `read_batch'"}]}}

This is the config file I start Logstash with via bash logstash -f xyz.config:

input {
  stdin {
    id => "${LS_FILE}"
  }
}

filter {
  mutate {
    add_field => { "foo_type" => "${FOO_TYPE}" }
    add_field => { "[@metadata][LS_FILE]" => "${LS_FILE}" }
  }

  if [@metadata][LS_FILE] == "contacts.csv" {
    csv {
      separator => ";"
      columns =>
      [
        "IOT",
        "OID",
        "SUM",
        "XID",
        "kundenid"
      ]
    }
    if [kundenid] {
      mutate {
        update => { "kundenid" => "n-%{kundenid}" }
      }
    }
  }
}

output {
  if [@metadata][LS_FILE] == "contacts.csv" {
    file {
      path => "~/contacts_file"
      codec => json_lines
    }
  }
}

Example script:

LOGSTASH="/customer/app/logstash-6.2.3/bin/logstash"

for file in $(find "$TARGETPATH" -name "*.csv")   # loop over each CSV file under the given path
do
  if [[ $file = *"foo"* ]]; then
    echo "Importing $file"
    export LS_FILE=$(basename "$file")            # expose the file name to the Logstash config
    bash "$LOGSTASH" -f "$CFG_FILE" < "$file"     # start Logstash, feeding the CSV on stdin
    echo "file $file imported."
  fi
done

I export environment variables in the bash script and read them as metadata in the Logstash config so that I can apply some conditionals for the different input files. The JSON output to a file is only for testing purposes.
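For reference, a single import can be reproduced by hand with a sketch like the following (contacts.csv and xyz.config are example names matching the config above; the FOO_TYPE value is arbitrary):

  # Manually reproduce one import (assumes contacts.csv is in the current
  # directory and xyz.config is the pipeline config shown above):
  export LS_FILE="contacts.csv"
  export FOO_TYPE="contact"    # example value; only copied into the foo_type field
  bash /customer/app/logstash-6.2.3/bin/logstash -f xyz.config < contacts.csv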

When you try to shut Logstash down, it attempts to perform several steps:

> it stops all input, filter and output plugins,

> it processes all in-flight events,

> it terminates the Logstash process,

and there are various factors that make the shutdown process very unpredictable, for example:

> an input plugin that is receiving data at a slow rate,

> a slow filter, like a Ruby filter executing sleep(10000) or an Elasticsearch filter that is running a very heavy query,

> a disconnected output plugin that is waiting to reconnect in order to flush the in-flight events.

From the Logstash documentation:

Logstash has a stall detection mechanism that analyzes the behavior of the pipeline and plugins during shutdown. This mechanism produces periodic information about the count of inflight events in internal queues and a list of busy worker threads.

You can use the --pipeline.unsafe_shutdown flag when starting Logstash to force the process to terminate when the shutdown stalls. If --pipeline.unsafe_shutdown is not enabled, Logstash keeps running and keeps producing these reports periodically, which is why the problem seems to occur randomly in your case.
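Applied to the example script, that is a one-line change to the Logstash invocation, sketched here with the same variables as above:

  # Force Logstash to exit even if the shutdown stalls (in-flight events may be dropped):
  bash "$LOGSTASH" -f "$CFG_FILE" --pipeline.unsafe_shutdown < "$file"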

Keep in mind that unsafe shutdowns, force-kills of the Logstash process, or crashes of the Logstash process for any other reason may lead to data loss (unless you have enabled Logstash to use persistent queues).
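Persistent queues are enabled in logstash.yml rather than in the pipeline config; a minimal example, assuming the default settings directory of the installation used in the script above:

  # /customer/app/logstash-6.2.3/config/logstash.yml
  queue.type: persisted   # buffer in-flight events on disk instead of in memory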
