In-Depth Understanding of ElasticSearch Keywords (Part 2)

  • real-time pipelining capabilities

    Logstash is an open source data collection engine with real-time pipelining capabilities.

  • Codecs

    Codecs are basically stream filters that can operate as part of the input or output.

    Codecs enable you to easily separate the transport of your messages from the serialization process.

    Popular codecs include json, msgpack, and plain.

    A codec plugin changes the data representation of an event.

  • Beats

    Beats are open source data shippers that you install as agents on your servers to send operational data to Elasticsearch.

  • ILM (Index Lifecycle Management)

    Use the index lifecycle management (ILM) feature in Elasticsearch to manage your Filebeat indices as they age.

    ILM defines four index lifecycle phases:

    • Hot : The index is actively being updated and queried
    • Warm : The index is no longer being updated but is still being queried
    • Cold : The index is no longer being updated and is seldom queried. The information still needs to be searchable, but it’s okay if those queries are slower
    • Delete : The index is no longer needed and can safely be removed

    An index’s lifecycle policy specifies which phases are applicable, what actions are performed in each phase, and when it transitions between phases.
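
    As a concrete sketch of how such a policy looks on the wire, the Go program below PUTs a minimal hot/delete policy to the _ilm/policy endpoint. The cluster URL, the policy name, and the threshold values are assumptions chosen for illustration, not anything mandated by Filebeat or Elasticsearch.

    // Minimal sketch: create an ILM policy with a hot phase (rollover) and a
    // delete phase. Cluster URL, policy name, and thresholds are assumptions.
    package main

    import (
        "fmt"
        "net/http"
        "strings"
    )

    const policy = `{
      "policy": {
        "phases": {
          "hot": {
            "actions": {
              "rollover": { "max_primary_shard_size": "50gb", "max_age": "30d" }
            }
          },
          "delete": {
            "min_age": "90d",
            "actions": { "delete": {} }
          }
        }
      }
    }`

    func main() {
        req, err := http.NewRequest(http.MethodPut,
            "http://localhost:9200/_ilm/policy/filebeat-policy", strings.NewReader(policy))
        if err != nil {
            panic(err)
        }
        req.Header.Set("Content-Type", "application/json")

        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println("ILM policy:", resp.Status)
    }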

  • Data stream

    We recommend using data streams to manage time series data.

    A data stream lets you store append-only time series data across multiple indices while giving you a single named resource for requests.

    A data stream consists of one or more hidden, auto-generated backing indices.

    Each data stream requires a matching index template. The same index template can be used for multiple data streams.
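
    For example, setting up a data stream boils down to two requests: first an index template whose data_stream section marks matching names as data streams, then an append of a document, which auto-creates the stream and its first hidden backing index. The stream name, cluster URL, and field values below are assumptions.

    // Minimal sketch: an index template that turns any index name matching
    // "logs-myapp-*" into a data stream, then one document appended to the
    // stream. Names, URL, and field values are assumptions for the example.
    package main

    import (
        "fmt"
        "net/http"
        "strings"
    )

    const esURL = "http://localhost:9200"

    const template = `{
      "index_patterns": ["logs-myapp-*"],
      "data_stream": {},
      "priority": 200
    }`

    const doc = `{ "@timestamp": "2024-01-01T00:00:00Z", "message": "service started" }`

    // send issues one JSON request and prints the response status.
    func send(method, path, body string) {
        req, err := http.NewRequest(method, esURL+path, strings.NewReader(body))
        if err != nil {
            panic(err)
        }
        req.Header.Set("Content-Type", "application/json")
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println(method, path, "->", resp.Status)
    }

    func main() {
        // The matching template must exist before the first write; that first
        // write then auto-creates the data stream and its backing index.
        send(http.MethodPut, "/_index_template/logs-myapp", template)
        send(http.MethodPost, "/logs-myapp-default/_doc", doc)
    }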

  • Index template

    An index template is a way to tell Elasticsearch how to configure an index when it is created.

    There are two types of templates: index templates and component templates.

  • filebeat -e

    On Linux systems, including inside Docker containers, start Filebeat with: sudo ./filebeat -e. The -e flag logs to stderr and disables syslog/file output.

  • Harvester

    Filebeat consists of two main components: inputs and harvesters.

    A harvester is responsible for reading the content of a single file.

    The harvester reads each file, line by line, and sends the content to the output.

    One harvester is started for each file.
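
    The split is easy to picture in code. The sketch below is not Filebeat's implementation, only a minimal Go illustration of the same model: the "input" side decides which files to watch, and one "harvester" goroutine per file reads it line by line and forwards the content to a shared output.

    // Conceptual sketch of the input/harvester split: one goroutine per file
    // reads line by line and forwards each line to a shared output channel.
    // This illustrates the model only, not Filebeat's actual code.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "sync"
    )

    // harvest reads the content of a single file, line by line, and sends it to out.
    func harvest(path string, out chan<- string, wg *sync.WaitGroup) {
        defer wg.Done()
        f, err := os.Open(path)
        if err != nil {
            fmt.Fprintln(os.Stderr, "skip:", err)
            return
        }
        defer f.Close()
        scanner := bufio.NewScanner(f)
        for scanner.Scan() {
            out <- path + ": " + scanner.Text()
        }
    }

    func main() {
        // The "input" side: pick the files to watch and start one harvester each.
        files := []string{"/var/log/app/one.log", "/var/log/app/two.log"} // example paths
        out := make(chan string)
        var wg sync.WaitGroup
        for _, f := range files {
            wg.Add(1)
            go harvest(f, out, &wg)
        }
        go func() { wg.Wait(); close(out) }()

        // The "output" side: here we only print; Filebeat would ship events instead.
        for line := range out {
            fmt.Println(line)
        }
    }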

  • Go Glob

    func Glob(pattern string) (matches []string, err error)
    

    Glob returns the names of all files matching pattern or nil if there is no matching file.

    The syntax of patterns is the same as in Match.
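
    A short usage example of filepath.Glob (the pattern below is only an example path):

    // Usage example for filepath.Glob: list all .log files in a directory.
    package main

    import (
        "fmt"
        "path/filepath"
    )

    func main() {
        // Glob only returns an error when the pattern itself is malformed.
        matches, err := filepath.Glob("/var/log/*.log")
        if err != nil {
            panic(err)
        }
        if matches == nil {
            fmt.Println("no matching files")
            return
        }
        for _, m := range matches {
            fmt.Println(m)
        }
    }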

  • Log Stream

    A log stream is a sequence of log events that share the same source.

  • file rotation

    When dealing with file rotation, avoid harvesting symlinks.

    In information technology, log rotation is an automated process used in system administration in which log files are compressed, moved (archived), renamed or deleted once they are too old or too big.

    New incoming log data is directed to a fresh file at the same location.
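
    A minimal sketch of rename-based rotation makes the mechanics concrete (the path and size threshold are assumptions): the full file is renamed to an archive name, and an empty file is created at the original location, which is why a log shipper must track files by more than just their path.

    // Minimal sketch of rename-based log rotation: archive the current file and
    // start a fresh one at the same location. Path and size limit are assumptions.
    package main

    import (
        "fmt"
        "os"
    )

    const (
        logPath  = "/var/log/app/app.log"
        maxBytes = 10 << 20 // rotate once the file exceeds 10 MiB
    )

    func rotateIfNeeded() error {
        info, err := os.Stat(logPath)
        if err != nil || info.Size() < maxBytes {
            return err // nothing to rotate (or the file does not exist yet)
        }
        // Move the full file aside; the renamed file keeps the data already written.
        if err := os.Rename(logPath, logPath+".1"); err != nil {
            return err
        }
        // Recreate an empty file at the same location for new incoming log data.
        f, err := os.OpenFile(logPath, os.O_CREATE|os.O_WRONLY, 0o644)
        if err != nil {
            return err
        }
        return f.Close()
    }

    func main() {
        if err := rotateIfNeeded(); err != nil {
            fmt.Fprintln(os.Stderr, "rotation failed:", err)
        }
    }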

  • Seccomp

    Understanding Seccomp

    On Linux 3.17 and later, Filebeat can take advantage of secure computing mode, also known as seccomp.

  • Index mappings, settings, aliases

    Component templates are building blocks for constructing index templates that specify index mappings, settings, and aliases.
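
    For instance (all names and the cluster URL are assumptions), a settings block and a mappings block can live in separate component templates and be assembled through composed_of:

    // Minimal sketch: two component templates (settings and mappings) composed
    // into one index template. Template names and cluster URL are assumptions.
    package main

    import (
        "fmt"
        "net/http"
        "strings"
    )

    const esURL = "http://localhost:9200"

    var requests = []struct{ path, body string }{
        {"/_component_template/logs-settings",
            `{ "template": { "settings": { "number_of_shards": 1 } } }`},
        {"/_component_template/logs-mappings",
            `{ "template": { "mappings": { "properties": { "message": { "type": "text" } } } } }`},
        {"/_index_template/logs-composed",
            `{ "index_patterns": ["logs-app-*"], "composed_of": ["logs-settings", "logs-mappings"] }`},
    }

    func main() {
        for _, r := range requests {
            req, err := http.NewRequest(http.MethodPut, esURL+r.path, strings.NewReader(r.body))
            if err != nil {
                panic(err)
            }
            req.Header.Set("Content-Type", "application/json")
            resp, err := http.DefaultClient.Do(req)
            if err != nil {
                panic(err)
            }
            resp.Body.Close()
            fmt.Println("PUT", r.path, "->", resp.Status)
        }
    }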

  • mapping

    Mapping is the process of defining how a document, and the fields it contains, are stored and indexed.

    A mapping definition has two parts: metadata fields and fields (properties).

    Metadata fields are used to customize how a document’s associated metadata is treated, for example the _index, _id, _source, and _type fields.

  • setting

    Index Modules are modules created per index and control all aspects related to an index.

    Index level settings can be set per-index.

  • aliases

    An index alias is a secondary name used to refer to one or more existing indices.
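
    Pulling the mapping, setting, and alias entries together: a single create-index request can carry all three sections. The index name, alias name, and field choices below are assumptions for illustration.

    // Minimal sketch: create an index whose body defines settings, mappings,
    // and an alias in one request. Index and alias names are assumptions.
    package main

    import (
        "fmt"
        "net/http"
        "strings"
    )

    const body = `{
      "settings": { "number_of_shards": 1, "number_of_replicas": 1 },
      "mappings": {
        "properties": {
          "@timestamp": { "type": "date" },
          "message":    { "type": "text" }
        }
      },
      "aliases": { "app-logs": {} }
    }`

    func main() {
        req, err := http.NewRequest(http.MethodPut,
            "http://localhost:9200/app-logs-000001", strings.NewReader(body))
        if err != nil {
            panic(err)
        }
        req.Header.Set("Content-Type", "application/json")
        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println("create index:", resp.Status)
    }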

  • Processors

    You can define processors in your configuration to process events before they are sent to the configured output.

    The libbeat library provides processors for:

    • reducing the number of exported fields
    • enhancing events with additional metadata
    • performing additional processing and decoding

    Each processor receives an event, applies a defined action to the event, and returns the event.
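
    Conceptually the chain looks like the Go sketch below: a list of functions, each receiving an event, applying one action, and returning the event. These types are an illustration of the model, not libbeat's actual API.

    // Conceptual sketch of a processor chain: each processor receives an event,
    // applies a defined action, and returns the event. Illustration only; not
    // libbeat's real types.
    package main

    import (
        "fmt"
        "strings"
    )

    // Event is a flat map of fields standing in for a Beats event.
    type Event map[string]string

    // Processor receives an event and returns it after applying one action.
    // Returning nil drops the event.
    type Processor func(Event) Event

    // dropFields reduces the number of exported fields.
    func dropFields(names ...string) Processor {
        return func(e Event) Event {
            for _, n := range names {
                delete(e, n)
            }
            return e
        }
    }

    // addMetadata enhances the event with additional metadata.
    func addMetadata(key, value string) Processor {
        return func(e Event) Event {
            e[key] = value
            return e
        }
    }

    // lowercaseMessage performs additional processing on one field.
    func lowercaseMessage(e Event) Event {
        e["message"] = strings.ToLower(e["message"])
        return e
    }

    func main() {
        chain := []Processor{dropFields("internal_id"), addMetadata("env", "prod"), lowercaseMessage}

        event := Event{"message": "Service STARTED", "internal_id": "42"}
        for _, p := range chain {
            if event = p(event); event == nil {
                return // event was dropped
            }
        }
        fmt.Println(event) // map[env:prod message:service started]
    }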

  • Text Analysis

    Text analysis is the process of converting unstructured text, like the body of an email or a product description, into a structured format that’s optimized for search.
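
    As a rough illustration of the idea (Elasticsearch's real analysis chain of character filters, a tokenizer, and token filters is richer than this), the sketch below splits text into tokens, lowercases them, and drops a few stop words:

    // Rough illustration of text analysis: split unstructured text into tokens,
    // lowercase them, and drop stop words. A simplified sketch, not
    // Elasticsearch's actual analysis chain.
    package main

    import (
        "fmt"
        "strings"
    )

    var stopWords = map[string]bool{"the": true, "a": true, "of": true, "is": true}

    // analyze turns free text into normalized tokens suitable for indexing.
    func analyze(text string) []string {
        var tokens []string
        for _, t := range strings.Fields(text) {
            t = strings.ToLower(strings.Trim(t, ".,!?\"'"))
            if t == "" || stopWords[t] {
                continue
            }
            tokens = append(tokens, t)
        }
        return tokens
    }

    func main() {
        fmt.Println(analyze("The Quick Brown Fox is part of the product description."))
        // Output: [quick brown fox part product description]
    }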
