Google_cloud_storage output plugin v3.0.4

  • Plugin version: v3.0.4
  • Released on: 2017-08-16
  • Changelog

For other versions, see the overview list.

To learn more about Logstash, see the Logstash Reference.

Getting help

For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.

Description

Summary: plugin to upload log events to Google Cloud Storage (GCS), rolling files based on the date pattern provided as a configuration setting. Events are written to files locally and, once a file is closed, this plugin uploads it to the configured bucket.

For more info on Google Cloud Storage, please go to: https://cloud.google.com/products/cloud-storage

In order to use this plugin, you must use a Google service account. For more information, please refer to: https://developers.google.com/storage/docs/authentication#service_accounts

Recommendation: experiment with the settings depending on how much log data you generate, so the uploader can keep up with the generated logs. Enabling gzip output is a good option to reduce network traffic when uploading the log files, and it lowers storage costs as well.

USAGE: This is an example of a Logstash config:

output {
   google_cloud_storage {
     bucket => "my_bucket"                                     # (required)
     key_path => "/path/to/privatekey.p12"                     # (required)
     key_password => "notasecret"                              # (optional)
     service_account => "1234@developer.gserviceaccount.com"   # (required)
     temp_directory => "/tmp/logstash-gcs"                     # (optional)
     log_file_prefix => "logstash_gcs"                         # (optional)
     max_file_size_kbytes => 1024                              # (optional)
     output_format => "plain"                                  # (optional)
     date_pattern => "%Y-%m-%dT%H:00"                          # (optional)
     flush_interval_secs => 2                                  # (optional)
     gzip => false                                             # (optional)
     uploader_interval_secs => 60                              # (optional)
   }
}

Improvements TODO list:
  • Support Logstash event variables to determine the filename.
  • Turn the Google API code into a plugin mixin (like AwsConfig).
  • There is no recover method, so if Logstash or the plugin crashes, files may not be uploaded to GCS.
  • Allow the user to configure the file name.
  • Allow parallel uploads for heavier loads (plus connection configuration if exposed by the Ruby API client).

Google_cloud_storage Output Configuration Options

This plugin supports the following configuration options plus the Common options described later.

Also see Common options for a list of options supported by all output plugins.

bucket

  • This is a required setting.
  • Value type is string
  • There is no default value for this setting.

GCS bucket name, without "gs://" or any other prefix.

date_pattern

  • Value type is string
  • Default value is "%Y-%m-%dT%H:00"

Time pattern for the log file, defaults to hourly files. Must use Time.strftime patterns: www.ruby-doc.org/core-2.0/Time.html#method-i-strftime
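
For example, a minimal sketch of a daily-rolling configuration (the bucket, key, and service account values are placeholders):

output {
  google_cloud_storage {
    bucket => "my_bucket"                                     # placeholder
    key_path => "/path/to/privatekey.p12"                     # placeholder
    service_account => "1234@developer.gserviceaccount.com"   # placeholder
    date_pattern => "%Y-%m-%d"                                # roll one file per day instead of per hour
  }
}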

flush_interval_secs

  • Value type is number
  • Default value is 2

Flush interval in seconds for flushing writes to log files. A value of 0 flushes on every message.
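
For example, to flush after every event at the cost of more disk writes (only the relevant setting is shown; the required options from the USAGE example above still apply):

output {
  google_cloud_storage {
    # ... bucket, key_path, service_account as in the USAGE example ...
    flush_interval_secs => 0   # flush the local log file after every event
  }
}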

gzip

  • Value type is boolean
  • Default value is false

Gzip output stream when writing events to log files.
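
For example, to compress the local files before upload, as suggested in the description above (required options omitted for brevity):

output {
  google_cloud_storage {
    # ... bucket, key_path, service_account as in the USAGE example ...
    gzip => true   # write gzip-compressed log files to reduce upload traffic and storage size
  }
}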

key_password

  • Value type is string
  • Default value is "notasecret"

GCS private key password.

key_path

  • This is a required setting.
  • Value type is string
  • There is no default value for this setting.

GCS path to private key file.

log_file_prefix

  • Value type is string
  • Default value is "logstash_gcs"

Log file prefix. Log files follow the format: <prefix>_hostname_date<.part?>.log
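
As an illustration only (the hostname and part suffix below are hypothetical examples, not guaranteed output), a custom prefix might produce local files named like this:

output {
  google_cloud_storage {
    # ... bucket, key_path, service_account as in the USAGE example ...
    log_file_prefix => "mylogs"
  }
}
# Illustrative resulting file name (hostname and part suffix are hypothetical):
#   mylogs_myhost_2017-08-16T14:00.part001.log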

max_file_size_kbytes

  • Value type is number
  • Default value is 10000

Sets the maximum file size in kilobytes. 0 disables the max file size check.
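
For example, to roll files at roughly 512 MB (512 * 1024 = 524288 kbytes; the value is illustrative):

output {
  google_cloud_storage {
    # ... bucket, key_path, service_account as in the USAGE example ...
    max_file_size_kbytes => 524288   # 512 MB; setting 0 would disable size-based rolling
  }
}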

output_format

  • Value can be any of: json, plain
  • Default value is "plain"

The event format you want to store in files. Defaults to plain text.
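
For example, to store events as JSON instead of plain text (required options omitted for brevity):

output {
  google_cloud_storage {
    # ... bucket, key_path, service_account as in the USAGE example ...
    output_format => "json"   # write each event as JSON instead of plain text
  }
}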

service_account

  • This is a required setting.
  • Value type is string
  • There is no default value for this setting.

GCS service account.

temp_directory

  • Value type is string
  • Default value is ""

Directory where temporary files are stored. If not set, defaults to /tmp/logstash-gcs-<random-suffix>.
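
For example, to keep temporary files on a dedicated disk instead of /tmp (the path is a placeholder):

output {
  google_cloud_storage {
    # ... bucket, key_path, service_account as in the USAGE example ...
    temp_directory => "/var/lib/logstash-gcs"   # placeholder path on a disk with enough free space
  }
}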

uploader_interval_secs

  • Value type is number
  • Default value is 60

Uploader interval when uploading new files to GCS. Adjust time based on your time pattern (for example, for hourly files, this interval can be around one hour).
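
For example, pairing daily files with a longer uploader interval (values are illustrative, not a recommendation):

output {
  google_cloud_storage {
    # ... bucket, key_path, service_account as in the USAGE example ...
    date_pattern => "%Y-%m-%d"        # daily files
    uploader_interval_secs => 3600    # look for closed files to upload once per hour
  }
}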

Common options

These configuration options are supported by all output plugins:

Setting         Input type   Required
codec           codec        No
enable_metric   boolean      No
id              string       No

codec

  • Value type is codec
  • Default value is "plain"

The codec used for output data. Output codecs are a convenient method for encoding your data before it leaves the output without needing a separate filter in your Logstash pipeline.

enable_metric

  • Value type is boolean
  • Default value is true

Disable or enable metric logging for this specific plugin instance. By default we record all the metrics we can, but you can disable metrics collection for a specific plugin.

id

  • Value type is string
  • There is no default value for this setting.

Add a unique ID to the plugin configuration. If no ID is specified, Logstash will generate one. It is strongly recommended to set this ID in your configuration. This is particularly useful when you have two or more plugins of the same type, for example, if you have 2 google_cloud_storage outputs. Adding a named ID in this case will help in monitoring Logstash when using the monitoring APIs.

output {
  google_cloud_storage {
    id => "my_plugin_id"
  }
}