Log file content fields

Contains log file lines.
-
log.file.path
-
The file from which the line was read. This field contains the absolute path to the file. For example: /var/log/system.log.
type: keyword
required: False
-
log.source.address
-
Source address from which the log event was read or sent.
type: keyword
required: False
-
log.offset
-
The file offset the reported line starts at.
type: long
required: False
-
stream
-
Log stream when reading container logs. Can be stdout or stderr.
type: keyword
required: False
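The stream value is recorded when Filebeat reads container log files. As a minimal, hedged sketch, a container input that keeps only stdout lines might look like the following (the path is a typical Docker location, not taken from this page):

  filebeat.inputs:
    - type: container
      paths:
        - /var/lib/docker/containers/*/*.log   # typical Docker JSON log file location (assumed)
      stream: stdout                           # collect only stdout; each event's stream field reflects its origin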
-
input.type
-
The input type from which the event was generated. This field is set to the value specified for the type option in the input section of the Filebeat config file.
required: True
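As a brief illustration (the paths are placeholders, not from this page), a log input configured like this produces events whose input.type is log:

  filebeat.inputs:
    - type: log            # each event gets input.type: log
      paths:
        - /var/log/*.log   # placeholder path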
-
syslog.facility
-
The facility extracted from the priority.
type: long
required: False
-
syslog.priority
-
The priority of the syslog event.
type: long
required: False
-
syslog.severity_label
-
The human-readable severity.
type: keyword
required: False
-
syslog.facility_label
-
The human-readable facility.
type: keyword
required: False
-
process.program
-
The name of the program.
type: keyword
required: False
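The syslog.* fields above and process.program are typically filled in when events arrive through a syslog source. Assuming Filebeat's syslog input is in use, a minimal sketch could be (the listen address is an assumption):

  filebeat.inputs:
    - type: syslog
      protocol.udp:
        host: "localhost:9000"   # assumed UDP listen address; facility and severity are derived from the syslog priority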
-
log.flags
-
This field contains the flags of the event.
-
http.response.content_length
-
type: alias
alias to: http.response.body.bytes
-
user_agent.os.full_name
-
type: keyword
-
fileset.name
-
The Filebeat fileset that generated this event.
type: keyword
-
fileset.module
-
type: alias
alias to: event.module
-
read_timestamp
-
type: alias
alias to: event.created
-
docker.attrs
-
docker.attrs contains labels and environment variables written by Docker's json-file logging driver. These fields are only available when they are configured in the logging driver options.
type: object
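As a hedged sketch of how such attributes end up in the log files at all: the json-file driver only writes labels and environment variables that are listed in its options. In a Compose file that could look like this (service name, label key, and variable name are hypothetical):

  services:
    web:
      image: nginx
      logging:
        driver: json-file
        options:
          labels: "com.example.team"   # label keys written into the log JSON (hypothetical)
          env: "DEPLOY_STAGE"          # environment variable names written into the log JSON (hypothetical)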
-
icmp.code
-
ICMP code.
type: keyword
-
icmp.type
-
ICMP type.
type: keyword
-
igmp.type
-
IGMP type.
type: keyword
-
kafka.topic
-
Kafka topic.
type: keyword
-
kafka.partition
-
Kafka partition number.
type: long
-
kafka.offset
-
Kafka offset of this message.
type: long
-
kafka.key
-
The Kafka message key, corresponding to the value stored in the message.
type: keyword
-
kafka.block_timestamp
-
Kafka outer (compressed) block timestamp.
type: date
-
kafka.headers
-
An array of Kafka header strings for this message, in the form "<key>: <value>".
type: array
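These kafka.* fields carry the message metadata attached by the broker. Assuming the events are collected with Filebeat's kafka input, a minimal sketch could be (broker address, topic, and group ID are placeholders):

  filebeat.inputs:
    - type: kafka
      hosts: ["kafka1:9092"]         # placeholder broker address
      topics: ["application-logs"]   # placeholder topic; recorded on each event as kafka.topic
      group_id: "filebeat"           # consumer group ID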