Stream any log file

[preview] This functionality is in technical preview and may be changed or removed in a future release. Elastic will work to fix any issues, but features in technical preview are not subject to the support SLA of official GA features.

Required role

The Admin role or higher is required to onboard log data. To learn more, refer to Assign user roles and privileges.


This guide shows you how to send a log file to your Observability project using a standalone Elastic Agent, configure the Elastic Agent and your data streams using the elastic-agent.yml file, and query your logs using the data streams you’ve set up.

The quickest way to get started is to:

  1. Open your Observability project. If you don’t have one, create an observability project.
  2. Go to Add Data.
  3. Under Collect and analyze logs, click Stream log files.

This will kick off a set of guided instructions that walk you through configuring the standalone Elastic Agent and sending log data to your project.

To install and configure the Elastic Agent manually, refer to Manually install and configure the standalone Elastic Agent.

Configure inputs and integration

Enter a few configuration details in the guided instructions.

Configure inputs

  • Log file path: The path to your log files. You can also use a pattern like /var/log/your-logs.log*. Click Add row to add more log file paths.

    This will be passed to the paths field in the generated elastic-agent.yml file in a future step.

  • Service name: Provide a service name so that related instances of a distributed service running on multiple hosts can be correlated.

Configure integration

Elastic creates an integration to streamline connecting your log data to Elastic.

  • Integration name: Give your integration a name. This is a unique identifier for your stream of log data that you can later use to filter data in Logs Explorer. The value must be unique within your project, all lowercase, and no more than 100 characters. Special characters will be replaced with _.

    This will be passed to the streams.id field in the generated elastic-agent.yml file in a future step.

    The integration name will be used in Logs Explorer. It will appear in the "All logs" dropdown menu.

  • Dataset name: Give your integration’s dataset a name. This is the name for your dataset data stream; name it anything that signifies the source of the data. The value must be all lowercase and no more than 100 characters. Special characters will be replaced with _.

    This will be passed to the data_stream.dataset field in the generated elastic-agent.yml file in a future step.
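
For reference, the values you enter here land directly in the generated elastic-agent.yml file. A minimal sketch of the resulting input section, using the hypothetical values my_nginx for the integration name and nginx_access for the dataset name:

inputs:
  - id: your-log-id                # identifier for the input itself
    type: filestream
    streams:
      - id: my_nginx               # streams.id: taken from the integration name
        data_stream:
          dataset: nginx_access    # data_stream.dataset: taken from the dataset name
        paths:
          - /var/log/your-logs.log*    # the log file path(s) from the inputs step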

Install the Elastic Agent

After configuring the inputs and integration, you’ll continue in the guided instructions to install and configure the standalone Elastic Agent.

Run the command under Install the Elastic Agent that corresponds with your system to download, extract, and install the Elastic Agent. Turning on Automatically download the agent’s config includes your updated Elastic Agent configuration file in the download.

If you do not want to automatically download the configuration, click Download config file to download it manually and add it to /opt/Elastic/Agent/elastic-agent.yml on the host where you installed the Elastic Agent. The values you provided in Configure inputs and integration will be prepopulated in the generated configuration file.
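
For example, if the downloaded file ended up in your downloads folder (a hypothetical location), copying it into place on the host might look like this:

# Hypothetical source path; adjust it to wherever you saved the downloaded config file.
sudo cp ~/Downloads/elastic-agent.yml /opt/Elastic/Agent/elastic-agent.yml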

Manually install and configure the standalone Elastic Agent

If you’re not using the guided instructions, follow these steps to manually install and configure the Elastic Agent.

Step 1: Download and extract the Elastic Agent installation package

On your host, download and extract the installation package that corresponds with your system:

curl -L -O https://artifacts.elastic.co/downloads/beats/elastic-agent/elastic-agent-9.0.0-beta1-darwin-x86_64.tar.gz
tar xzvf elastic-agent-9.0.0-beta1-darwin-x86_64.tar.gz
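
The commands above are for macOS (darwin). On a Linux x86_64 host, assuming the same 9.0.0-beta1 version and Elastic’s standard artifact naming, the equivalent would be:

curl -L -O https://artifacts.elastic.co/downloads/beats/elastic-agent/elastic-agent-9.0.0-beta1-linux-x86_64.tar.gz
tar xzvf elastic-agent-9.0.0-beta1-linux-x86_64.tar.gz
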
Step 2: Install and start the Elastic Agent

After downloading and extracting the installation package, you’re ready to install the Elastic Agent. From the agent directory, run the install command that corresponds with your system:

On macOS, Linux (tar package), and Windows, run the install command to install and start the Elastic Agent as a managed service. The DEB and RPM packages include a service unit for Linux systems with systemd; for these systems, you must enable and start the service.

You must run this command as the root user because some integrations require root privileges to collect sensitive data.

sudo ./elastic-agent install

During installation, you’ll be prompted with some questions:

  1. When asked if you want to install the agent as a service, enter Y.
  2. When asked if you want to enroll the agent in Fleet, enter n.
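
If you’re scripting the installation, the prompts can usually be skipped. This is a sketch only, and it assumes your Elastic Agent version supports the --non-interactive flag for unattended installs (check sudo ./elastic-agent install --help to confirm):

# Assumed flag: installs as a service without prompting; with no enrollment options the agent stays standalone.
sudo ./elastic-agent install --non-interactive
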
Step 3: Configure the Elastic Agent

After your agent is installed, configure it by updating the elastic-agent.yml file.

Locate your configuration file

You’ll find the elastic-agent.yml file in a location that depends on your system. On macOS, the main Elastic Agent configuration file is:

/Library/Elastic/Agent/elastic-agent.yml

On Linux, the installed agent reads its configuration from /opt/Elastic/Agent/elastic-agent.yml, as noted earlier.

Update your configuration file

Update the default configuration in the elastic-agent.yml file manually. It should look something like this:

outputs:
  default:
    type: elasticsearch
    hosts: '<your-elasticsearch-endpoint>:<port>'
    api_key: 'your-api-key'
inputs:
  - id: your-log-id
    type: filestream
    streams:
      - id: your-log-stream-id
        data_stream:
          dataset: example
        paths:
          - /var/log/your-logs.log
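
For example, to collect several files from the same source, list multiple paths, including glob patterns, under one stream; this mirrors the Add row option in the guided instructions (the second path below is hypothetical):

inputs:
  - id: your-log-id
    type: filestream
    streams:
      - id: your-log-stream-id
        data_stream:
          dataset: example
        paths:
          - /var/log/your-logs.log
          - /var/log/your-other-logs.log*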

You need to set the values for the following fields:

hosts

Copy the Elasticsearch endpoint from your project’s page and add the port (the default port is 443). For example, https://my-deployment.es.us-central1.gcp.cloud.es.io:443.

If you’re following the guided instructions in your project, the Elasticsearch endpoint will be prepopulated in the configuration file.

If you need to find your project’s Elasticsearch endpoint outside the guided instructions:

  1. Go to the Projects page that lists all your projects.
  2. Click Manage next to the project you want to connect to.
  3. Click View next to Endpoints.
  4. Copy the Elasticsearch endpoint.
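
Filled in with the example endpoint above, the outputs section would look something like this (the deployment name and key are placeholders):

outputs:
  default:
    type: elasticsearch
    hosts: 'https://my-deployment.es.us-central1.gcp.cloud.es.io:443'
    api_key: '<id>:<key>'   # see the api_key field below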

api_key

Use an API key to grant the agent access to your project. The API key format should be <id>:<key>.

If you’re following the guided instructions in your project, an API key will be autogenerated and will be prepopulated in the downloadable configuration file.

If configuring the Elastic Agent manually, create an API key:

  1. Navigate to Project settings → Management → API keys and click Create API key.
  2. Select Restrict privileges and add the following JSON to give privileges for ingesting logs.

    {
      "standalone_agent": {
        "cluster": [
          "monitor"
        ],
        "indices": [
          {
            "names": [
              "logs-*-*"
            ],
            "privileges": [
              "auto_configure", "create_doc"
            ]
          }
        ]
      }
    }
  3. You must set the API key format to Beats. Immediately after the API key is generated and while it is still displayed, click the Encoded button next to the API key and select Beats from the list in the tooltip. Base64 encoded API keys are not currently supported in this configuration.

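If you prefer to create the key programmatically instead of through the UI, the Elasticsearch create API key endpoint produces the same result. This is a sketch that assumes ES_URL points at your project’s Elasticsearch endpoint and ES_AUTH holds credentials allowed to create API keys:

# Create an API key with the same restricted privileges shown above.
curl -X POST "$ES_URL/_security/api_key" \
  -u "$ES_AUTH" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "standalone_agent",
    "role_descriptors": {
      "standalone_agent": {
        "cluster": ["monitor"],
        "indices": [
          {
            "names": ["logs-*-*"],
            "privileges": ["auto_configure", "create_doc"]
          }
        ]
      }
    }
  }'

The response includes id and api_key fields; combine them as <id>:<api_key> for the api_key setting in elastic-agent.yml.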

inputs.id

A unique identifier for your input.

type

The type of input. For collecting logs, set this to filestream.

streams.id

A unique identifier for your stream of log data.

If you’re following the guided instructions in your project, this will be prepopulated with the value you specified in Configure inputs and integration.

data_stream.dataset

The name for your dataset data stream. Name this data stream anything that signifies the source of the data. In this configuration, the dataset is set to example. The default value is generic.

If you’re following the guided instructions in your project, this will be prepopulated with the value you specified in Configure inputs and integration.

paths

The path to your log files. You can also use a pattern like /var/log/your-logs.log*.

If you’re following the guided instructions in your project, this will be prepopulated with the value you specified in Configure inputs and integration.

Restart the Elastic Agent

After updating your configuration file, you need to restart the Elastic Agent.

First, stop the Elastic Agent and its related executables using the command that works with your system:

sudo launchctl unload /Library/LaunchDaemons/co.elastic.elastic-agent.plist

Elastic Agent will restart automatically if the system is rebooted.

Next, restart the Elastic Agent using the command that works with your system:

sudo launchctl load /Library/LaunchDaemons/co.elastic.elastic-agent.plist
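
The launchctl commands above apply to macOS. On a Linux host running the agent under systemd, the equivalent would be the following, assuming the service is registered under the elastic-agent unit name:

# Stop the agent before editing elastic-agent.yml, then start it again afterwards.
sudo systemctl stop elastic-agent
sudo systemctl start elastic-agent
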
Troubleshoot your Elastic Agent configuration

If you’re not seeing your log files in your project, verify the following in the elastic-agent.yml file:

  • The path to your log files under paths is correct.
  • Your API key is in <id>:<key> format. If not, your API key may be in an unsupported format, and you’ll need to create an API key in Beats format.

If you’re still running into issues, refer to Elastic Agent troubleshooting and Configure standalone Elastic Agents.
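
It can also help to check the agent on the host itself. For example, the elastic-agent binary has a status subcommand that reports the health of the agent and its components, and an inspect subcommand that prints the configuration the agent is actually running; run them as root on the host where the agent is installed (from the install directory if the binary isn’t on your PATH):

sudo elastic-agent status    # agent and component health
sudo elastic-agent inspect   # the full configuration currently in effect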

Next steps

After you have your agent configured and are streaming log data to your project:

  • Refer to the Parse and organize logs documentation for information on extracting structured fields from your log data, rerouting your logs to different data streams, and filtering and aggregating your log data.
  • Refer to the Filter and aggregate logs documentation for information on filtering and aggregating your log data to find specific information, gain insight, and monitor your systems more efficiently.