Using Logstash with Elastic Integrations
You can take advantage of the extensive, built-in capabilities of Elastic Integrations—such as managing data collection, transformation, and visualization—and then use Logstash for additional data processing and output options. Logstash can further expand capabilities for use cases where you need additional processing, or if you need your data delivered to multiple destinations.
Elastic Integrations: ingesting to visualizing
Elastic Integrations provide quick, end-to-end solutions for:
- ingesting data from a variety of data sources,
- ensuring compliance with the Elastic Common Schema (ECS),
- getting the data into the Elastic Stack, and
- visualizing it with purpose-built dashboards.
Integrations are available for popular services and platforms, such as Nginx, AWS, and MongoDB, as well as many generic input types like log files. Each integration includes pre-packaged assets to help reduce the time between ingest and insights.
To see available integrations, go to the Kibana home page, and click Add Integrations. You can use the query bar to search for integrations you may want to use. When you find an integration for your data source, the UI walks you through adding and configuring it.
Extend Integrations with Logstash
Logstash can run the ingest pipeline component of your Elastic integration when you use the filter-elastic_integration plugin in your Logstash pipeline.
Adding the filter-elastic_integration plugin as the first filter plugin keeps the pipeline's behavior as close as possible to the behavior you'd expect if the events were processed by the integration in Elasticsearch. The more you modify an event before calling the elastic_integration filter, the higher the risk that the modifications will have a meaningful effect on how the event is transformed.
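As a minimal sketch of this ordering (the cloud_id and cloud_auth placeholders follow the full sample below; the trailing mutate filter is a hypothetical example of follow-on processing):

filter {
  # Run the integration's ingest pipeline before any other filters
  elastic_integration {
    cloud_id   => "<cloud id>"
    cloud_auth => "<cloud auth>"
  }

  # Hypothetical follow-on processing; filters placed here see the
  # event as the integration's ingest pipeline left it
  mutate {
    add_tag => ["integration_processed"]
  }
}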
Sample pipeline configuration
input {
  elastic_agent {
    port => 5044
  }
}

filter {
  elastic_integration {
    cloud_id   => "<cloud id>"
    cloud_auth => "<cloud auth>"
  }

  translate {
    source => "[http][host]"
    target => "[@metadata][tenant]"
    dictionary_path => "/etc/conf.d/logstash/tenants.yml"
  }
}

output {
  if [@metadata][tenant] == "tenant01" {
    elasticsearch {
      cloud_id => "<cloud id>"
      api_key  => "<api key>"
    }
  } else if [@metadata][tenant] == "tenant02" {
    elasticsearch {
      cloud_id => "<cloud id>"
      api_key  => "<api key>"
    }
  }
}
- Use elastic_integration as the first filter in your pipeline.
- You can use additional filters, as long as they follow elastic_integration.
- This sample config outputs data to multiple destinations, routing on the [@metadata][tenant] value.
Using filter-elastic_integration with output-elasticsearch

Elastic Integrations are designed to work with data streams and ECS-compatible output. Be sure that these features are enabled in the output-elasticsearch plugin.
- Set data_stream to true. (Check out Data streams for additional data stream settings.)
- Set ecs_compatibility to v1 or v8.
Check out the output-elasticsearch plugin docs for additional settings.
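For example, a minimal output block with both settings enabled might look like the following sketch, reusing the <cloud id> and <api key> placeholders from the sample pipeline above:

output {
  elasticsearch {
    cloud_id => "<cloud id>"
    api_key  => "<api key>"

    # Required for Elastic Integrations: write to data streams
    # and keep field handling ECS-compatible
    data_stream       => true
    ecs_compatibility => "v8"
  }
}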