Log4j input plugin
- Plugin version: v3.1.3
- Released on: 2018-04-06
- Changelog
For other versions, see the Versioned plugin docs.
Installation
For plugins not bundled by default, it is easy to install by running bin/logstash-plugin install logstash-input-log4j. See Working with plugins for more details.
Getting Help
For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue on GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.
Deprecation Notice
This plugin is deprecated. It is recommended that you use filebeat to collect logs from log4j.
The following section is a guide for migrating from SocketAppender to filebeat.
To migrate away from the log4j SocketAppender to filebeat, you will need to make these changes:
- Configure your log4j.properties (in your app) to write to a local file.
- Install and configure filebeat to collect those logs and ship them to Logstash.
- Configure Logstash to use the beats input.
Configuring log4j for writing to local files
In your log4j.properties file, remove SocketAppender and replace it with RollingFileAppender.
For example, you can use the following log4j.properties configuration to write daily log files.
# Your app's log4j.properties (log4j 1.2 only)
log4j.rootLogger=INFO, daily
log4j.appender.daily=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.daily.RollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.daily.RollingPolicy.FileNamePattern=/var/log/your-app/app.%d.log
log4j.appender.daily.layout=org.apache.log4j.PatternLayout
log4j.appender.daily.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSSZ} %p %c{1}:%L - %m%n
Configuring log4j.properties in more detail is outside the scope of this migration guide.
Configuring filebeat
Next, install filebeat. Based on the above log4j.properties, we can use this filebeat configuration:
# filebeat.yml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/your-app/app.*.log
      input_type: log
output:
  logstash:
    hosts: ["your-logstash-host:5000"]
For more details on configuring filebeat, see Configure Filebeat.
Configuring Logstash to receive from filebeat
Finally, configure Logstash with a beats input:
# logstash configuration
input {
  beats {
    port => 5000
  }
}
It is strongly recommended that you also enable TLS in both filebeat and the Logstash beats input to protect your log data.
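As a rough sketch only (the certificate and key paths are placeholders, and filebeat must be given the matching TLS settings on its side), a TLS-enabled beats input looks roughly like this:
# Sketch: beats input with TLS enabled. The certificate and key paths are
# placeholders; filebeat must be configured to trust this certificate.
input {
  beats {
    port => 5000
    ssl => true
    ssl_certificate => "/etc/logstash/certs/logstash.crt"
    ssl_key => "/etc/logstash/certs/logstash.key"
  }
}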
For more details on configuring the beats input, see the logstash beats input documentation.
Description
Read events over a TCP socket from a Log4j SocketAppender. This plugin works only with log4j version 1.x.
This plugin can either accept connections from clients or connect to a server, depending on mode. Depending on which mode is configured, you need a matching SocketAppender or a SocketHubAppender on the remote side.
One event is created per received log4j LoggingEvent with the following schema:
- timestamp ⇒ the number of milliseconds elapsed from 1/1/1970 until the logging event was created
- path ⇒ the name of the logger
- priority ⇒ the level of this event
- logger_name ⇒ the name of the logger
- thread ⇒ the thread name making the logging request
- class ⇒ the fully qualified class name of the caller making the logging request
- file ⇒ the source file name and line number of the caller making the logging request, in the colon-separated format "fileName:lineNumber"
- method ⇒ the method name of the caller making the logging request
- NDC ⇒ the NDC string
- stack_trace ⇒ the multi-line stack-trace
Also, if the original log4j LoggingEvent contains MDC hash entries, they will be merged into the event as fields.
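As an illustration only (the output choice is arbitrary), a minimal pipeline using this input might look like the following sketch, which accepts SocketAppender connections on the default port and prints the decoded events with the fields listed above:
# Sketch: accept SocketAppender connections on the default port and print
# the decoded events; field names follow the schema described above.
input {
  log4j {
    mode => "server"
    port => 4560
  }
}
output {
  stdout { codec => rubydebug }
}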
Log4j Input Configuration Options
This plugin supports the following configuration options plus the Common Options described later.
Setting | Input type | Required |
---|---|---|
host | string | No |
mode | string, one of ["server", "client"] | No |
port | number | No |
proxy_protocol | boolean | No |
Also see Common Options for a list of options supported by all input plugins.
host
- Value type is string
- Default value is "0.0.0.0"
When mode is server, the address to listen on.
When mode is client, the address to connect to.
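For example, a sketch that binds the listener to a single interface in the default server mode (the address is a placeholder):
# Sketch: listen only on one interface instead of all interfaces.
# "10.0.0.5" is a placeholder address.
input {
  log4j {
    host => "10.0.0.5"
  }
}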
mode
- Value can be any of: server, client
- Default value is "server"
Mode to operate in. server listens for client connections, client connects to a server.
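For example, a sketch of client mode, where the plugin connects out to a SocketHubAppender running on the application host (the hostname is a placeholder):
# Sketch: connect out to a SocketHubAppender on the application host.
# "app-host.example.com" is a placeholder.
input {
  log4j {
    mode => "client"
    host => "app-host.example.com"
  }
}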
port
- Value type is number
- Default value is 4560
When mode is server, the port to listen on.
When mode is client, the port to connect to.
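For example, a sketch that listens on a non-default port in server mode; the Port setting of the SocketAppender on the application side must use the same value:
# Sketch: listen on a non-default port. The SocketAppender's Port setting
# on the application side must match.
input {
  log4j {
    port => 4561
  }
}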
proxy_protocol
- Value type is boolean
- Default value is false
Proxy protocol support. Only v1 is supported at this time: http://www.haproxy.org/download/1.5/doc/proxy-protocol.txt
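For example, a sketch for accepting connections forwarded through a proxy that prepends the PROXY protocol v1 header (such as HAProxy configured with send-proxy):
# Sketch: accept connections that arrive with a PROXY protocol v1 header,
# e.g. from HAProxy configured with "send-proxy".
input {
  log4j {
    proxy_protocol => true
  }
}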
Common Options
The following configuration options are supported by all input plugins:
Details
enable_metric
- Value type is boolean
- Default value is true
Disable or enable metric logging for this specific plugin instance. By default we record all the metrics we can, but you can disable metrics collection for a specific plugin.
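For example, a sketch that turns off metrics collection for this input only:
# Sketch: disable metrics collection for this plugin instance only.
input {
  log4j {
    enable_metric => false
  }
}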
id
- Value type is string
- There is no default value for this setting.
Add a unique ID to the plugin configuration. If no ID is specified, Logstash will generate one.
It is strongly recommended to set this ID in your configuration. This is particularly useful
when you have two or more plugins of the same type, for example, if you have 2 log4j inputs.
Adding a named ID in this case will help in monitoring Logstash when using the monitoring APIs.
input {
  log4j {
    id => "my_plugin_id"
  }
}
Variable substitution in the id field only supports environment variables and does not support the use of values from the secret store.
tags
- Value type is array
- There is no default value for this setting.
Add any number of arbitrary tags to your event.
This can help with processing later.
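For example, a sketch that tags events from this input so later filter or output stages can branch on them (the tag values are arbitrary examples):
# Sketch: tag events from this input; the tag values are arbitrary.
input {
  log4j {
    tags => ["log4j", "legacy-app"]
  }
}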
type
- Value type is string
- There is no default value for this setting.
Add a type field to all events handled by this input.
Types are used mainly for filter activation.
The type is stored as part of the event itself, so you can also use the type to search for it in Kibana.
If you try to set a type on an event that already has one (for example when you send an event from a shipper to an indexer) then a new input will not override the existing type. A type set at the shipper stays with that event for its life even when sent to another Logstash server.
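For example, a sketch that sets a type on incoming events and uses it to activate a filter only for those events (the field added by the filter is an arbitrary example):
# Sketch: set a type on events from this input and use it for filter activation.
input {
  log4j {
    type => "log4j"
  }
}
filter {
  if [type] == "log4j" {
    # Arbitrary example: add a field only to events of this type.
    mutate { add_field => { "log_source" => "socketappender" } }
  }
}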