Create or update Logstash pipeline API

This API creates or updates a Logstash pipeline used for Logstash Central Management.

Request

PUT _logstash/pipeline/<pipeline_id>

Prerequisites

  • If the Elasticsearch security features are enabled, you must have the manage_logstash_pipelines cluster privilege to use this API.
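For example, a role granting this privilege could be defined with the security role API as follows (the role name logstash_pipeline_manager is illustrative, not part of the API):

POST _security/role/logstash_pipeline_manager
{
  "cluster": [ "manage_logstash_pipelines" ]
}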

Description

Creates a Logstash pipeline. If a pipeline with the specified pipeline_id already exists, it is replaced.

Path parameters

<pipeline_id>
(Required, string) Identifier for the pipeline.

Request body

description
(Optional, string) Description of the pipeline. This description is not used by Elasticsearch or Logstash.
last_modified
(Required, string) Date the pipeline was last updated. Must be in the yyyy-MM-dd'T'HH:mm:ss.SSSZZ strict_date_time format.
pipeline
(Required, string) Configuration for the pipeline. For supported syntax, see the Logstash configuration documentation.
pipeline_metadata
(Required, object) Arbitrary metadata about the pipeline. May have any contents. This metadata is not generated or used by Elasticsearch or Logstash.
pipeline_settings
(Required, object) Settings for the pipeline. Supports only flat keys in dot notation. For supported settings, see the Logstash settings documentation.
username
(Required, string) User who last updated the pipeline.
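As an illustration, a last_modified value in the required strict_date_time format can be produced with the Python standard library (a minimal sketch, independent of any Elasticsearch client):

```python
from datetime import datetime, timezone

# Build a UTC timestamp matching yyyy-MM-dd'T'HH:mm:ss.SSSZZ (strict_date_time).
# strftime's %f yields microseconds, so truncate to milliseconds and append "Z",
# which strict_date_time accepts as the UTC offset.
now = datetime.now(timezone.utc)
last_modified = now.strftime("%Y-%m-%dT%H:%M:%S.") + f"{now.microsecond // 1000:03d}Z"
print(last_modified)
```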

Examples

The following example creates a new pipeline named my_pipeline. The same request is shown with the Python, Ruby, and JavaScript clients and as a console request.

Python:

resp = client.logstash.put_pipeline(
    id="my_pipeline",
    pipeline={
        "description": "Sample pipeline for illustration purposes",
        "last_modified": "2021-01-02T02:50:51.250Z",
        "pipeline_metadata": {
            "type": "logstash_pipeline",
            "version": "1"
        },
        "username": "elastic",
        "pipeline": "input {}\n filter { grok {} }\n output {}",
        "pipeline_settings": {
            "pipeline.workers": 1,
            "pipeline.batch.size": 125,
            "pipeline.batch.delay": 50,
            "queue.type": "memory",
            "queue.max_bytes": "1gb",
            "queue.checkpoint.writes": 1024
        }
    },
)
print(resp)

Ruby:

response = client.logstash.put_pipeline(
  id: 'my_pipeline',
  body: {
    description: 'Sample pipeline for illustration purposes',
    last_modified: '2021-01-02T02:50:51.250Z',
    pipeline_metadata: {
      type: 'logstash_pipeline',
      version: '1'
    },
    username: 'elastic',
    pipeline: "input {}\n filter { grok {} }\n output {}",
    pipeline_settings: {
      'pipeline.workers' => 1,
      'pipeline.batch.size' => 125,
      'pipeline.batch.delay' => 50,
      'queue.type' => 'memory',
      'queue.max_bytes' => '1gb',
      'queue.checkpoint.writes' => 1024
    }
  }
)
puts response

JavaScript:

const response = await client.logstash.putPipeline({
  id: "my_pipeline",
  pipeline: {
    description: "Sample pipeline for illustration purposes",
    last_modified: "2021-01-02T02:50:51.250Z",
    pipeline_metadata: {
      type: "logstash_pipeline",
      version: "1",
    },
    username: "elastic",
    pipeline: "input {}\n filter { grok {} }\n output {}",
    pipeline_settings: {
      "pipeline.workers": 1,
      "pipeline.batch.size": 125,
      "pipeline.batch.delay": 50,
      "queue.type": "memory",
      "queue.max_bytes": "1gb",
      "queue.checkpoint.writes": 1024,
    },
  },
});
console.log(response);

Console:

PUT _logstash/pipeline/my_pipeline
{
  "description": "Sample pipeline for illustration purposes",
  "last_modified": "2021-01-02T02:50:51.250Z",
  "pipeline_metadata": {
    "type": "logstash_pipeline",
    "version": "1"
  },
  "username": "elastic",
  "pipeline": "input {}\n filter { grok {} }\n output {}",
  "pipeline_settings": {
    "pipeline.workers": 1,
    "pipeline.batch.size": 125,
    "pipeline.batch.delay": 50,
    "queue.type": "memory",
    "queue.max_bytes": "1gb",
    "queue.checkpoint.writes": 1024
  }
}

If the request succeeds, you receive an empty response with an appropriate status code.