Load Kibana dashboards

For deeper observability into your infrastructure, you can use the Metrics app and the Logs app in Kibana. For more details, see Metrics monitoring and Log monitoring.
Filebeat comes packaged with example Kibana dashboards, visualizations, and searches for visualizing Filebeat data in Kibana. Before you can use the dashboards, you need to create the index pattern, filebeat-*, and load the dashboards into Kibana.

To do this, you can either run the setup command (as described here) or configure dashboard loading in the filebeat.yml config file. This requires a Kibana endpoint configuration. If you didn't already configure a Kibana endpoint, see Kibana endpoint.
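If you choose the config-file route, dashboard loading is controlled by the setup.dashboards and setup.kibana settings in filebeat.yml. A minimal sketch, with a placeholder host that you should adjust to your environment:

# filebeat.yml -- load the packaged dashboards automatically when Filebeat starts
setup.dashboards.enabled: true

# Kibana endpoint used for dashboard loading (placeholder host)
setup.kibana:
  host: "localhost:5601"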
Load dashboards

Make sure Kibana is running before you perform this step. If you are accessing a secured Kibana instance, make sure you've configured credentials as described in the Quick start: installation and configuration.
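If the secured Kibana instance uses credentials that differ from your Elasticsearch output credentials, they can also be supplied under setup.kibana in filebeat.yml (by default Filebeat reuses the Elasticsearch output credentials). A sketch with placeholder values:

# filebeat.yml -- explicit credentials for a secured Kibana instance (placeholders)
setup.kibana:
  host: "localhost:5601"
  username: "your_kibana_user"
  password: "YOUR_PASSWORD"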
To load the recommended index template for writing to Elasticsearch and deploy the sample dashboards for visualizing the data in Kibana, use the command that works with your system.
DEB and RPM:

filebeat setup --dashboards

MacOS and Linux:

./filebeat setup --dashboards

Docker:

docker run --net="host" docker.elastic.co/beats/filebeat:8.15.4 setup --dashboards

Windows:

Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator).

From the PowerShell prompt, change to the directory where you installed Filebeat, and run:

PS > .\filebeat.exe setup --dashboards
For more options, such as loading customized dashboards, see Importing Existing Beat Dashboards. If you’ve configured the Logstash output, see Load dashboards for Logstash output.
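As one example of customizing dashboard loading (confirm the details in Importing Existing Beat Dashboards), Filebeat can load dashboards from a local directory instead of the packaged ones:

# filebeat.yml -- sketch: load dashboards from a local directory
# (the path is a placeholder)
setup.dashboards.enabled: true
setup.dashboards.directory: "/path/to/your/dashboards"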
Load dashboards for Logstash output

During dashboard loading, Filebeat connects to Elasticsearch to check version information. To load dashboards when the Logstash output is enabled, you need to temporarily disable the Logstash output and enable Elasticsearch. To connect to a secured Elasticsearch cluster, you also need to pass Elasticsearch credentials.
The example shows a hard-coded password, but you should store sensitive values in the secrets keystore.
DEB and RPM:

filebeat setup -e \
  -E output.logstash.enabled=false \
  -E output.elasticsearch.hosts=['localhost:9200'] \
  -E output.elasticsearch.username=filebeat_internal \
  -E output.elasticsearch.password=YOUR_PASSWORD \
  -E setup.kibana.host=localhost:5601

MacOS and Linux:

./filebeat setup -e \
  -E output.logstash.enabled=false \
  -E output.elasticsearch.hosts=['localhost:9200'] \
  -E output.elasticsearch.username=filebeat_internal \
  -E output.elasticsearch.password=YOUR_PASSWORD \
  -E setup.kibana.host=localhost:5601

Docker:

docker run --net="host" docker.elastic.co/beats/filebeat:8.15.4 setup -e \
  -E output.logstash.enabled=false \
  -E output.elasticsearch.hosts=['localhost:9200'] \
  -E output.elasticsearch.username=filebeat_internal \
  -E output.elasticsearch.password=YOUR_PASSWORD \
  -E setup.kibana.host=localhost:5601

Windows:

Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select Run As Administrator).

From the PowerShell prompt, change to the directory where you installed Filebeat, and run:

PS > .\filebeat.exe setup -e `
  -E output.logstash.enabled=false `
  -E output.elasticsearch.hosts=['localhost:9200'] `
  -E output.elasticsearch.username=filebeat_internal `
  -E output.elasticsearch.password=YOUR_PASSWORD `
  -E setup.kibana.host=localhost:5601
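As one way to avoid the hard-coded password shown above, the value can be added to the Filebeat keystore and referenced by name; the key name ES_PWD below is only an example:

# Create the keystore (once) and add a key that holds the Elasticsearch password
filebeat keystore create
filebeat keystore add ES_PWD

The stored value can then replace the literal password, for example with -E output.elasticsearch.password=\${ES_PWD} on the command line (escaping the $ so the shell doesn't expand it), or with output.elasticsearch.password: "${ES_PWD}" in filebeat.yml.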