Avro codec plugin
- Plugin version: v3.2.3
- Released on: 2017-11-07
- Changelog
Installation
For plugins not bundled by default, it is easy to install by running bin/logstash-plugin install logstash-codec-avro. See Working with plugins for more details.
Getting Help
For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.
Description
Reads serialized Avro records as Logstash events.
This plugin is used to serialize Logstash events as Avro datums, as well as to deserialize Avro datums into Logstash events.
Encoding
This codec serializes individual Logstash events as Avro datums, which are Avro binary blobs. It does not encode Logstash events into an Avro file.
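For instance, here is a minimal sketch of the encoding direction, assuming a Kafka output and reusing the schema path from the Usage example below; the topic name is a placeholder:

output {
  kafka {
    # Each event leaving this output is written as a single binary Avro datum,
    # encoded against the schema referenced by schema_uri (not as an Avro file).
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
    }
    topic_id => "logstash_avro"    # placeholder topic
  }
}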
Decoding
This codec is for deserializing individual Avro records. It is not for reading Avro files. Avro files have a unique format that must be handled upon input.
Usage
Example usage with Kafka input:

input {
  kafka {
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
    }
  }
}
filter {
  ...
}
output {
  ...
}
Avro Codec Configuration Options
Setting | Input type | Required
---|---|---
schema_uri | string | Yes
tag_on_failure | boolean | No
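As an illustrative sketch only (the schema path is a placeholder, and the choice to tag rather than fail on undecodable records is an assumption about your pipeline), both settings can be combined on a decoding input:

input {
  kafka {
    codec => avro {
        # Schema used to decode incoming datums; placeholder path.
        schema_uri => "/tmp/schema.avsc"
        # Assumption: tag events that fail to decode instead of raising an error.
        tag_on_failure => true
    }
  }
}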