WARNING: Version 5.1 of Filebeat has passed its EOL date.
This documentation is no longer being maintained and may be removed. If you are running this version, we strongly advise you to upgrade. For the latest information, see the current release documentation.
Filtering and Enhancing the Exported Data
Your use case might require only a subset of the data exported by Filebeat, or you might need to enhance the exported data (for example, by adding metadata). Filebeat provides a couple of options for filtering and enhancing exported data. You can:
- Define filters at the prospector level to configure each prospector to include or exclude specific lines or files.
- Define processors to configure global processing across all data exported by Filebeat.
Filtering at the Prospector Level
You can specify filtering options at the prospector level to configure which lines or files are included or excluded in the output. This allows you to specify different filtering criteria for each prospector.
You configure prospector-level filtering in the filebeat.prospectors section of the config file by specifying regular expressions that match the lines you want to include and/or exclude from the output. The supported options are include_lines, exclude_lines, and exclude_files.
For example, you can use the include_lines option to export any lines that start with "ERR" or "WARN":
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/*.log
  include_lines: ["^ERR", "^WARN"]
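The exclude_lines and exclude_files options work the same way in reverse. The following is a minimal sketch, not taken from the example above: the exclude_lines pattern and the exclude_files pattern are illustrative choices that drop debug lines and skip compressed files.
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/*.log
  exclude_lines: ["^DBG"]
  exclude_files: ['\.gz$']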
The disadvantage of this approach is that you need to set up a configuration option for each filtering criterion that you need.
See Filebeat configuration options for more information about each option.
Defining Processors
You can define processors in your configuration to process events before they are sent to the configured output. The libbeat library provides processors for:
- reducing the number of exported fields
- enhancing events with additional metadata
- performing additional processing and decoding
Each processor receives an event, applies a defined action to the event, and returns the event. If you define a list of processors, they are executed in the order they are defined in the Filebeat configuration file.
event -> processor 1 -> event1 -> processor 2 -> event2 ...
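For example, the following sketch chains an include_fields processor with a drop_event processor. It assumes the event contains message, source, and offset fields (the field names here are illustrative): the first processor trims each event down to those fields, and the second drops debug events from what remains.
processors:
- include_fields:
    fields: ["message", "source", "offset"]
- drop_event:
    when:
      regexp:
        message: "^DBG:"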
Drop Event Example
The following configuration drops all the DEBUG messages.
processors:
- drop_event:
    when:
      regexp:
        message: "^DBG:"
To drop all the log messages coming from a certain log file:
processors:
- drop_event:
    when:
      contains:
        source: "test"
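Conditions can also be combined. As a sketch that reuses the two conditions above, the following configuration drops an event when either condition matches by wrapping them in an or clause:
processors:
- drop_event:
    when:
      or:
        - regexp:
            message: "^DBG:"
        - contains:
            source: "test"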
Decode JSON Example
In the following example, the fields exported by Filebeat include a field, inner, whose value is a JSON object encoded as a string:
{ "outer": "value", "inner": "{\"data\": \"value\"}" }
The following configuration decodes the inner JSON object:
filebeat.prospectors:
- paths:
    - input.json
  json.keys_under_root: true

processors:
- decode_json_fields:
    fields: ["inner"]

output.console.pretty: true
The resulting output looks something like this:
{ "@timestamp": "2016-12-06T17:38:11.541Z", "beat": { "hostname": "host.example.com", "name": "host.example.com", "version": "5.1.2" }, "inner": { "data": "value" }, "input_type": "log", "offset": 55, "outer": "value", "source": "input.json", "type": "log" }
See Processors for more information.