google_pubsub
- Version: 1.0.0
- Released on: March 28, 2017
- Changelog
Installation
For plugins not bundled by default, it is easy to install by running bin/logstash-plugin install logstash-input-google_pubsub. See Working with plugins for more details.
Getting Help
For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.
Description
The main motivation behind the development of this plugin was to ingest Stackdriver Logging messages via the Exported Logs feature of Stackdriver Logging.
Prerequisites
You must first create a Google Cloud Platform project and enable the Google Pub/Sub API. If you intend to use the plugin to ingest Stackdriver Logging messages, you must also enable the Stackdriver Logging API and configure log exporting to Pub/Sub. There is plentiful information on https://cloud.google.com/ to get started:
- Google Cloud Platform Projects and Overview
- Google Cloud Pub/Sub documentation
- Stackdriver Logging documentation
Cloud Pub/Sub
Currently, this module requires you to create a topic manually and specify it in the logstash config file. You must also specify a subscription, but the plugin will attempt to create the pull-based subscription on its own.
All messages received from Pub/Sub will be converted to a logstash event and added to the processing pipeline queue. All Pub/Sub messages will be acknowledged and removed from the Pub/Sub topic (please see more about Pub/Sub concepts).
It is generally assumed that incoming messages will be in JSON and added to the logstash event as-is. However, if a plain text message is received, the plugin will return the raw text as raw_message in the logstash event.
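For example, here is a minimal sketch (the tag name and the filter logic are illustrative, not part of the plugin) showing how a pipeline could detect events that arrived as plain text rather than JSON:

input {
    google_pubsub {
        project_id   => "my-project-1234"
        topic        => "logstash-input-dev"
        subscription => "logstash-sub"
    }
}
filter {
    # Plain-text messages carry their original payload in raw_message.
    if [raw_message] {
        mutate { add_tag => ["plain_text_payload"] }
    }
}
output {
    stdout { codec => rubydebug }
}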
Authentication
You have two options for authentication depending on where you run Logstash (see the sketch after this list).
- If you are running Logstash outside of Google Cloud Platform, then you will need to create a Google Cloud Platform Service Account and specify the full path to the JSON private key file in your config. You must assign sufficient roles to the Service Account to create a subscription and to pull messages from the subscription. Learn more about GCP Service Accounts and IAM roles in the Google Cloud IAM documentation.
- If you are running Logstash on a Google Compute Engine instance, you may opt to use Application Default Credentials. In this case, you will not need to specify a JSON private key file in your config.
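As an illustration only (the key file path is a placeholder, not a default shipped with the plugin), the two cases might be configured like this:

# Option 1: outside Google Cloud Platform - point the plugin at a Service Account JSON key.
input {
    google_pubsub {
        project_id    => "my-project-1234"
        topic         => "logstash-input-dev"
        subscription  => "logstash-sub"
        json_key_file => "/etc/logstash/keys/logstash-sa.json"   # hypothetical path
    }
}

# Option 2: on a Google Compute Engine instance - omit json_key_file so that
# Application Default Credentials from the metadata service are used.
input {
    google_pubsub {
        project_id   => "my-project-1234"
        topic        => "logstash-input-dev"
        subscription => "logstash-sub"
    }
}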
Stackdriver Logging (optional)
If you intend to use the logstash plugin for Stackdriver Logging message ingestion, you must first manually set up the Export option to Cloud Pub/Sub and then manually create the topic. Please see the more detailed instructions at Exported Logs (https://cloud.google.com/logging/docs/export/using_exported_logs) and ensure that the necessary permissions have also been manually configured.
Logging messages from Stackdriver Logging exported to Pub/Sub are received as JSON and converted to a logstash event as-is in this format.
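As a sketch only (the field names below come from the Stackdriver LogEntry format, and the rename is just one possible way to post-process them), exported entries can be reshaped in a filter once they arrive as events:

filter {
    # Exported Stackdriver entries are LogEntry objects; logName, severity, and
    # textPayload (for text entries) arrive as fields on the logstash event.
    if [logName] {
        mutate {
            add_tag => ["stackdriver"]
            rename  => { "textPayload" => "message" }   # only present for text-payload entries
        }
    }
}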
Sample Configuration
Below is a copy of the included example.conf-tmpl file that shows a basic configuration for this plugin.
input {
    google_pubsub {
        # Your GCP project id (name)
        project_id => "my-project-1234"

        # The topic name below is currently hard-coded in the plugin. You
        # must first create this topic by hand and ensure you are exporting
        # logging to this pubsub topic.
        topic => "logstash-input-dev"

        # The subscription name is customizeable. The plugin will attempt to
        # create the subscription (but use the hard-coded topic name above).
        subscription => "logstash-sub"

        # If you are running logstash within GCE, it will use
        # Application Default Credentials and use GCE's metadata
        # service to fetch tokens. However, if you are running logstash
        # outside of GCE, you will need to specify the service account's
        # JSON key file below.
        #json_key_file => "/home/erjohnso/pkey.json"
    }
}
output { stdout { codec => rubydebug } }
Synopsis
This plugin supports the following configuration options:
Required configuration options:
google_pubsub {
    max_messages => ...
    project_id => ...
    subscription => ...
    topic => ...
}
Available configuration options:
Setting | Input type | Required |
---|---|---|
add_field | hash | No |
codec | codec | No |
enable_metric | boolean | No |
id | string | No |
json_key_file | a valid filesystem path | No |
max_messages | number | Yes |
project_id | string | Yes |
subscription | string | Yes |
tags | array | No |
topic | string | Yes |
type | string | No |
Details
codec
- Value type is codec
- Default value is "plain"
The codec used for input data. Input codecs are a convenient method for decoding your data before it enters the input, without needing a separate filter in your Logstash pipeline.
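For instance (purely illustrative, since "plain" is already the default), the codec can be set explicitly on the input:

input {
    google_pubsub {
        # ... required settings ...
        codec => "plain"
    }
}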
enable_metric
- Value type is boolean
- Default value is true
Disable or enable metric logging for this specific plugin instance. By default we record all the metrics we can, but you can disable metrics collection for a specific plugin.
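A minimal sketch of turning metric collection off for one instance (other settings elided):

input {
    google_pubsub {
        # ... required settings ...
        enable_metric => false
    }
}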
id
- Value type is string
- There is no default value for this setting.
Add a unique ID to the plugin configuration. If no ID is specified, Logstash will generate one. It is strongly recommended to set this ID in your configuration. This is particularly useful when you have two or more plugins of the same type, for example, if you have 2 grok filters. Adding a named ID in this case will help in monitoring Logstash when using the monitoring APIs.
output {
    stdout {
        id => "my_plugin_id"
    }
}
json_key_file
- Value type is path
- There is no default value for this setting.
If logstash is running within Google Compute Engine, the plugin will use GCE’s Application Default Credentials. Outside of GCE, you will need to specify a Service Account JSON key file.
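For example (the path below is a placeholder, not a default):

input {
    google_pubsub {
        # ... required settings ...
        json_key_file => "/etc/logstash/keys/logstash-sa.json"   # hypothetical path
    }
}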
project_id
- This is a required setting.
- Value type is string
- There is no default value for this setting.
Google Cloud Project ID (name, not number)
subscription
- This is a required setting.
- Value type is string
- There is no default value for this setting.
The name of the Cloud Pub/Sub subscription to pull messages from. As noted above, the plugin will attempt to create this pull-based subscription on its own if it does not already exist.
tags
- Value type is array
- There is no default value for this setting.
Add any number of arbitrary tags to your event. This can help with processing later.
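A quick sketch (the tag values are arbitrary):

input {
    google_pubsub {
        # ... required settings ...
        tags => ["pubsub", "gcp"]
    }
}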
topic
- This is a required setting.
- Value type is string
- There is no default value for this setting.
Google Cloud Pub/Sub Topic and Subscription. Note that the topic must be created manually, with the Cloud Logging export to Pub/Sub configured to use the defined topic. The subscription will be created automatically by the plugin.
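For instance (the names are placeholders):

input {
    google_pubsub {
        project_id   => "my-project-1234"
        topic        => "logstash-input-dev"   # must already exist
        subscription => "logstash-sub"         # created by the plugin if it does not exist
    }
}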
type
- Value type is string
- There is no default value for this setting.
Add a type field to all events handled by this input.
Types are used mainly for filter activation.
The type is stored as part of the event itself, so you can also use the type to search for it in Kibana.
If you try to set a type on an event that already has one (for example when you send an event from a shipper to an indexer) then a new input will not override the existing type. A type set at the shipper stays with that event for its life even when sent to another Logstash server.
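As an illustration (the type value and the downstream logic are arbitrary), a type set on this input can drive filter activation:

input {
    google_pubsub {
        # ... required settings ...
        type => "pubsub"
    }
}
filter {
    if [type] == "pubsub" {
        mutate { add_tag => ["from_pubsub"] }
    }
}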