Platform Observability
Version | 0.1.0 [beta] This functionality is in beta and is subject to change. The design and code are less mature than official GA features and are provided as-is with no warranties. Beta features are not subject to the support SLA of official GA features.
---|---
Compatible Kibana version(s) | 8.3.0 or higher
Supported Serverless project types | Security
Subscription level | Basic
Compatibility
This package works with Kibana 8.3.0 and later.
Kibana logs
The Kibana integration collects logs from the Kibana instance.
Logs
editAudit
The audit data stream collects the Kibana audit logs. Note that audit logging must be enabled in Kibana before these events are produced.
Example
An example event for `kibana_audit` looks as follows:
{ "event": { "action": "http_request", "category": [ "web" ], "outcome": "unknown" }, "http": { "request": { "method": "get" } }, "url": { "domain": "localhost", "path": "/internal/security/session", "port": 5601, "scheme": "http" }, "user": { "name": "elastic", "roles": [ "superuser" ] }, "kibana": { "space_id": "default", "session_id": "ccZ0sbxrmmJwo+/y2Mn1tmGIrKOuZYaF8voUh0SkA/k=" }, "trace": { "id": "1c8c5808-d2d6-41fc-8cb7-998aa8996be9" }, "ecs": { "version": "8.0.0" }, "@timestamp": "2022-06-29T12:05:03.742+00:00", "message": "User is requesting [/internal/security/session] endpoint", "log": { "level": "INFO", "logger": "plugins.security.audit.ecs" }, "process": { "pid": 7 }, "transaction": { "id": "f8863d86567119e6" } }
Exported fields
Field | Description | Type
---|---|---
@timestamp | Event timestamp. | date
data_stream.dataset | Data stream dataset. | constant_keyword
data_stream.namespace | Data stream namespace. | constant_keyword
data_stream.type | Data stream type. | constant_keyword
ecs.version | ECS version this event conforms to. | keyword
event.action | The action captured by the event. This describes the information in the event. It is more specific than `event.category`. | keyword
event.category | This is one of four ECS Categorization Fields, and indicates the second level in the ECS category hierarchy. | keyword
event.dataset | Name of the dataset. If an event source publishes more than one type of log or events (e.g. access log, error log), the dataset is used to specify which one the event comes from. It's recommended but not required to start the dataset name with the module name, followed by a dot, then the dataset name. | keyword
event.ingested | Timestamp when an event arrived in the central data store. This is different from `@timestamp`, which is when the event originally occurred. | date
event.kind | This is one of four ECS Categorization Fields, and indicates the highest level in the ECS category hierarchy. | keyword
event.outcome | This is one of four ECS Categorization Fields, and indicates the lowest level in the ECS category hierarchy. | keyword
http.request.method | HTTP request method. The value should retain its casing from the original event. For example, `GET`, `get`, and `GeT` are all considered different values. | keyword
kibana.add_to_spaces | The set of space ids that a saved object was shared to. | keyword
kibana.authentication_provider | The authentication provider associated with a login event. | keyword
kibana.authentication_realm | The Elasticsearch authentication realm name which fulfilled a login event. | keyword
kibana.authentication_type | The authentication provider type associated with a login event. | keyword
kibana.delete_from_spaces | The set of space ids that a saved object was removed from. | keyword
kibana.lookup_realm | The Elasticsearch lookup realm which fulfilled a login event. | keyword
kibana.saved_object.id | The id of the saved object associated with this event. | keyword
kibana.saved_object.type | The type of the saved object associated with this event. | keyword
kibana.session_id | The ID of the user session associated with this event. Each login attempt results in a unique session id. | keyword
kibana.space_id | The id of the space associated with this event. | keyword
log.level | Original log level of the log event. If the source of the event provides a log level or textual severity, this is the one that goes in `log.level`. | keyword
log.logger | The name of the logger inside an application. This is usually the name of the class which initialized the logger, or can be a custom name. | keyword
message | For log events the message field contains the log message, optimized for viewing in a log viewer. For structured logs without an original message field, other fields can be concatenated to form a human-readable summary of the event. If multiple messages exist, they can be combined into one message. | match_only_text
process.pid | Process id. | long
trace.id | Unique identifier of the trace. A trace groups multiple events like transactions that belong together. For example, a user request handled by multiple inter-connected services. | keyword
transaction.id | Unique identifier of the transaction within the scope of its trace. A transaction is the highest level of work measured within a service, such as a request to a server. | keyword
url.domain | Domain of the url, such as "www.elastic.co". In some cases a URL may refer to an IP and/or port directly, without a domain name. In this case, the IP address would go to the `domain` field. | keyword
url.path | Path of the request, such as "/search". | wildcard
url.port | Port of the request, such as 443. | long
url.query | The query field describes the query string of the request, such as "q=elasticsearch". The `?` is excluded from the query string. | keyword
url.scheme | Scheme of the request, such as "https". Note: the `:` is not part of the scheme. | keyword
user.name | Short name or login of the user. | keyword
user.name.text | Multi-field of `user.name`. | match_only_text
user.roles | Array of user roles at the time of the event. | keyword
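Once the integration is shipping data, the audit fields above can be queried like any other ECS fields. The following sketch uses the Python Elasticsearch client to list recent web-category audit events for a single user; the `logs-kibana.audit-*` index pattern, URL, and credentials are assumptions, so adjust them to the data stream and cluster in your own deployment.

```python
from elasticsearch import Elasticsearch

# Connect to the cluster that receives the integration's data
# (URL and credentials are placeholders).
es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "changeme"))

# Assumed index pattern for the Kibana audit data stream; adjust it to
# the data stream name and namespace used in your deployment.
AUDIT_INDEX = "logs-kibana.audit-*"

resp = es.search(
    index=AUDIT_INDEX,
    size=20,
    sort=[{"@timestamp": "desc"}],
    query={
        "bool": {
            "filter": [
                {"term": {"event.category": "web"}},
                {"term": {"user.name": "elastic"}},
            ]
        }
    },
)

# Print the timestamp, audit action, and requested path for each hit.
for hit in resp["hits"]["hits"]:
    src = hit["_source"]
    print(
        src.get("@timestamp"),
        src.get("event", {}).get("action"),
        src.get("url", {}).get("path", "-"),
    )
```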
Log
The log data stream collects the Kibana server logs.
Example
An example event for `kibana_log` looks as follows:
{ "http": { "request": { "id": "unknownId", "method": "GET" }, "response": { "body": { "bytes": 118 }, "status_code": 200 } }, "url": { "path": "/_nodes", "query": "filter_path=nodes.*.version%2Cnodes.*.http.publish_address%2Cnodes.*.ip" }, "ecs": { "version": "8.0.0" }, "@timestamp": "2022-07-14T10:35:25.366+00:00", "message": "200 - 118.0B\nGET /_nodes?filter_path=nodes.*.version%2Cnodes.*.http.publish_address%2Cnodes.*.ip", "log": { "level": "DEBUG", "logger": "elasticsearch.query.data" }, "process": { "pid": 7 }, "trace": { "id": "0cd8dd5a3483159a43c07e9205432775" }, "transaction": { "id": "6301eca88fba8d99" } }
Exported fields
Field | Description | Type
---|---|---
@timestamp | Event timestamp. | date
data_stream.dataset | Data stream dataset. | constant_keyword
data_stream.namespace | Data stream namespace. | constant_keyword
data_stream.type | Data stream type. | constant_keyword
ecs.version | ECS version this event conforms to. | keyword
event.dataset | Name of the dataset. If an event source publishes more than one type of log or events (e.g. access log, error log), the dataset is used to specify which one the event comes from. It's recommended but not required to start the dataset name with the module name, followed by a dot, then the dataset name. | keyword
event.ingested | Timestamp when an event arrived in the central data store. This is different from `@timestamp`, which is when the event originally occurred. | date
event.kind | This is one of four ECS Categorization Fields, and indicates the highest level in the ECS category hierarchy. | keyword
event.outcome | This is one of four ECS Categorization Fields, and indicates the lowest level in the ECS category hierarchy. | keyword
http.request.id | A unique identifier for each HTTP request to correlate logs between clients and servers in transactions. The id may be contained in a non-standard HTTP header, such as `X-Request-ID` or `X-Correlation-ID`. | keyword
http.request.method | HTTP request method. The value should retain its casing from the original event. For example, `GET`, `get`, and `GeT` are all considered different values. | keyword
http.response.body.bytes | Size in bytes of the response body. | long
http.response.status_code | HTTP response status code. | long
log.level | Original log level of the log event. If the source of the event provides a log level or textual severity, this is the one that goes in `log.level`. | keyword
log.logger | The name of the logger inside an application. This is usually the name of the class which initialized the logger, or can be a custom name. | keyword
message | For log events the message field contains the log message, optimized for viewing in a log viewer. For structured logs without an original message field, other fields can be concatenated to form a human-readable summary of the event. If multiple messages exist, they can be combined into one message. | match_only_text
process.pid | Process id. | long
trace.id | Unique identifier of the trace. A trace groups multiple events like transactions that belong together. For example, a user request handled by multiple inter-connected services. | keyword
transaction.id | Unique identifier of the transaction within the scope of its trace. A transaction is the highest level of work measured within a service, such as a request to a server. | keyword
url.path | Path of the request, such as "/search". | wildcard
url.query | The query field describes the query string of the request, such as "q=elasticsearch". The `?` is excluded from the query string. | keyword
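Because `log.level` and `http.response.status_code` are indexed as keyword and long respectively, simple aggregations give a quick picture of Kibana's own logging volume. The sketch below is a minimal example with the Python Elasticsearch client; the `logs-kibana.log-*` index pattern and connection details are assumptions and should be adapted to your data stream naming and cluster.

```python
from elasticsearch import Elasticsearch

# Placeholder connection details; point this at your own cluster.
es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "changeme"))

# Assumed index pattern for the Kibana log data stream; adjust as needed.
LOG_INDEX = "logs-kibana.log-*"

resp = es.search(
    index=LOG_INDEX,
    size=0,  # only aggregation results are needed
    query={"range": {"@timestamp": {"gte": "now-1h"}}},
    aggs={
        "levels": {"terms": {"field": "log.level"}},
        "status_codes": {"terms": {"field": "http.response.status_code"}},
    },
)

# Break down the last hour of Kibana server logs by severity.
for bucket in resp["aggregations"]["levels"]["buckets"]:
    print(f'{bucket["key"]}: {bucket["doc_count"]} log events in the last hour')
```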
Changelog
Version | Details | Kibana version(s)
---|---|---
0.1.0 | Enhancement (View pull request) | —
0.0.2 | Enhancement (View pull request) | —
0.0.1 | Enhancement (View pull request) | —