View web crawler events logs
Web crawler logs are stored in an Elasticsearch data stream named logs-elastic_crawler-default.
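Because the logs live in an ordinary data stream, you can also query them directly with the Elasticsearch search API. The following sketch builds a search body for the most recent events; the data stream name comes from this page, while the client call shown in the comment is an assumption about how you connect to your deployment.

```python
# The data stream that holds web crawler event logs.
DATA_STREAM = "logs-elastic_crawler-default"

def last_events_query(size=10):
    """Build a search body for the most recent crawler events."""
    return {
        "size": size,
        "sort": [{"@timestamp": {"order": "desc"}}],
        "query": {"match_all": {}},
    }

body = last_events_query(5)
# With the official Python client this body would be sent as, e.g.:
#   es.search(index=DATA_STREAM, **body)
```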
Kibana provides two user interfaces to view these logs: Discover and Logs.
For a complete reference of all web crawler events, see Web crawler events logs reference.
View web crawler events logs in Discover
To view crawler logs in Discover, you must first configure a Kibana data view for the logs-elastic_crawler-default data stream.
- Navigate to Stack Management → Data Views via the left sidebar.
- Select Create data view.
- Choose a name and use the index pattern logs-elastic_crawler-default to create the data view.
- Navigate to Discover via the left sidebar.
- Select your newly created data view, and you are ready to begin exploring your crawl events in detail.
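If you prefer to script this step, recent Kibana versions also expose a data views API. The payload below is a sketch of the equivalent request; the display name is your choice, and the endpoint and payload shape should be verified against your Kibana version.

```python
import json

# Sketch: create the same data view via Kibana's data views API.
# The index pattern matches the data stream above; "name" is arbitrary.
payload = {
    "data_view": {
        "title": "logs-elastic_crawler-default",  # index pattern
        "name": "Web crawler events",             # display name (your choice)
        "timeFieldName": "@timestamp",
    }
}

# For example, with curl (KIBANA_URL and auth are placeholders):
#   curl -X POST "$KIBANA_URL/api/data_views/data_view" \
#        -H "kbn-xsrf: true" -H "Content-Type: application/json" \
#        -d '<the JSON printed below>'
print(json.dumps(payload))
```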
You will likely want to set up some custom columns in Discover so you can scan crawl events at a glance.
To add a column, select a field from the list of available fields in the left sidebar.
Handy columns for crawler events include crawler.crawl.id, url.domain, url.path, event.action, and http.response.status_code.
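These same fields are useful for filtering, not just as columns. The sketch below builds an Elasticsearch bool filter over two of them; the domain value is a placeholder, and the equivalent KQL for Discover's query bar is shown in a comment.

```python
# Sketch: filter crawler events for one domain whose HTTP response
# indicates an error. Field names match the columns listed above;
# "example.com" is a placeholder domain.
def failed_fetches_query(domain):
    """Match events for `domain` with an HTTP status of 400 or higher."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"url.domain": domain}},
                    {"range": {"http.response.status_code": {"gte": 400}}},
                ]
            }
        }
    }

# Roughly equivalent KQL in the Discover search bar:
#   url.domain : "example.com" and http.response.status_code >= 400
```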
View web crawler events logs in Logs
To view crawler logs in the Logs UI, you must first configure a log source.
- Navigate to Logs UI via the left sidebar, under Observability.
- Under Settings, specify logs-elastic_crawler-default as the Log indices data view.
- Configure any desired custom columns. A nice place to start is by removing the preset event.dataset and message columns and adding crawler.crawl.id, url.domain, url.path, event.action, and http.response.status_code. Autocomplete will be available on these columns after the initial source configuration has been applied. You can edit these settings at any time.
- Apply the configuration, navigate to Stream, and you are ready to begin exploring your crawl events in detail!
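Once events are streaming in, a common first question is how many events each crawl produced. The sketch below groups events by crawler.crawl.id, mirroring what you might scan for in the Stream view; the sample events are fabricated placeholders shaped like flattened log documents.

```python
from collections import Counter

# Sketch: count log events per crawl. The field name crawler.crawl.id
# comes from the columns recommended above; the sample data is made up.
def events_per_crawl(events):
    """Count log events grouped by crawler.crawl.id."""
    return Counter(e["crawler.crawl.id"] for e in events)

sample = [
    {"crawler.crawl.id": "crawl-1"},
    {"crawler.crawl.id": "crawl-1"},
    {"crawler.crawl.id": "crawl-2"},
]
print(events_per_crawl(sample))  # Counter({'crawl-1': 2, 'crawl-2': 1})
```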