It is time to say goodbye: This version of Elastic Cloud Enterprise has reached end-of-life (EOL) and is no longer supported.
The documentation for this version is no longer being maintained. If you are running this version, we strongly advise you to upgrade. For the latest information, see the current release documentation.
Indexing data into Elasticsearch
By now you’ve probably spun up a deployment and might be wondering what’s next. Congratulations on completing that first big step! Now let’s help you do something with it. You likely have data that you want to add to Elasticsearch, a process known as ingesting or indexing, so let’s explore some options.
Migrating data
If you want to move your existing Elasticsearch data into your new infrastructure, check out the migration options. You’ll find instructions to guide you through:
- Migrating data from its original source
- Reindexing data from a remote Elasticsearch cluster
- Restoring data from a snapshot
- Migrating internal Elasticsearch indices
Ingestion methods
When it comes to delivering your data into the Elastic Stack, a variety of options are available. All of the documentation and tutorials listed here rely on one of four ingestion methods: Elastic Agent, Beats, Logstash, or a direct connection from a client application. You can use these options individually or in combination.
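To give a feel for the direct-connection option, here is a minimal sketch using the official Python client for Elasticsearch, assuming a 7.x deployment. The Cloud ID, credentials, and index name are placeholders, not real values.

```python
# A sketch of indexing a document from a client application,
# assuming the official elasticsearch Python client (7.x).
from elasticsearch import Elasticsearch

# The Cloud ID bundles the deployment's Elasticsearch endpoint.
# Basic authentication is shown; an API key works as well.
es = Elasticsearch(
    cloud_id="my-deployment:<base64-encoded-endpoint>",  # placeholder
    http_auth=("elastic", "<password>"),                 # placeholder
)

# Index a single document into a hypothetical "web-logs" index.
es.index(index="web-logs", body={"status": 200, "message": "GET /"})
```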
Elastic Agent and Fleet
Elastic Agent offers a single, unified way to ship monitoring data from multiple hosts or containers into the Elastic Stack. Elastic Agent serves as a convenient front end that uses Beats shippers or Elastic Endpoint under the covers. See the Elastic Agent product documentation to learn more.
Fleet provides a web-based UI in Kibana to add and manage integrations for popular services and platforms, as well as manage a fleet of Elastic Agents. To learn more about how these work, see the Fleet and Elastic Agent overview.
Beats
These lightweight shippers are installed as agents on your endpoints, and the data they collect is pushed back to the Elasticsearch cluster. Each Beat is created with a specific purpose, making it possible to target and aggregate common data from a multitude of endpoints. Some Beats, like Filebeat, have modules that specialize them even further. Configure the Beat to use the Cloud ID to simplify sending data back to your deployment. TLS, basic authentication, and API key authentication are available to ensure that your data is shipped securely.
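As a sketch of what that configuration looks like, here is a minimal filebeat.yml using the Cloud ID settings. The Cloud ID, credentials, and log path are placeholders for your own values.

```yaml
# filebeat.yml (sketch) -- Cloud ID and credentials are placeholders.
# cloud.id points Filebeat at the deployment; cloud.auth supplies
# basic authentication as user:password. An API key can be configured
# on the Elasticsearch output instead.
cloud.id: "my-deployment:<base64-encoded-endpoint>"
cloud.auth: "elastic:<password>"

# A minimal log input that tails files from a hypothetical path.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log
```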
Here’s a short and incomplete list of the types of data that Beats can handle: log files, metrics, audit data, network traffic, uptime monitoring, and more. Learn about the possibilities from the Beats documentation.
Logstash
Logstash is an open source data collection engine with real-time pipelining capabilities. It can accept data that is pushed to it as well as pull data from external sources. Logstash has the added benefit of being able to persist information in queues if the cluster temporarily cannot accept data, smoothing out ingestion spikes. However, Logstash is not available as part of the Elastic Cloud deployment and requires separate installation and maintenance. Use the Cloud ID to configure Logstash to work with a deployment. And, as with Beats, you have the option of using basic authentication or an API key for secure data transmission.
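A minimal pipeline configuration along those lines might look like the following sketch, which accepts events pushed from Beats and forwards them to the deployment. The Cloud ID and credentials are placeholders.

```
# pipeline.conf (sketch) -- Cloud ID and credentials are placeholders.
input {
  # Accept events pushed from Beats shippers on the default port.
  beats {
    port => 5044
  }
}

output {
  # cloud_id and cloud_auth point the elasticsearch output at the
  # deployment; an api_key option is available as an alternative.
  elasticsearch {
    cloud_id   => "my-deployment:<base64-encoded-endpoint>"
    cloud_auth => "elastic:<password>"
  }
}
```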
Learn more about the possible inputs, filters, and outputs from the Logstash documentation.
Try out sample data
There are a number of ways for you to get a sample data set ingested into Elasticsearch. This gives you a convenient way to test drive the broad set of Kibana tools and visualizations before ingesting your own data. Several data packages are available with a one-click installation, as well as a makelogs script and a simple CSV upload method for more ingest options.
To learn more, see Installing sample data.
Ingest data with Elastic solutions
When you use one of the Elastic solutions, it’s already tailored to your use case, such as application or workplace search, observability, or security. For these, you can choose from one of the following guides to find the most suitable data ingestion steps and examples for your needs.
Enterprise Search
- Getting started with App Search
- The App Search getting started documentation and video can help you to index an initial set of documents to create a custom search experience in your applications.
- Workplace Search Content Sources Overview
- Learn how to integrate Workplace Search with a variety of third-party content sources such as GitHub, Google Drive, or Dropbox. You can also build your own connectors using custom API sources, allowing you to search unique content repositories and ingest that data into Workplace Search.
- Enterprise Search web crawler
- The Elastic Enterprise Search web crawler, currently a beta feature, discovers, extracts, and indexes your web content into your App Search engines.
Observability
- Send data to Elasticsearch
- These steps get you started with the ingestion aspects of Elastic Observability, detailing how to configure Elasticsearch to store and search your data, and Kibana to visualize and manage it. You can also set up APM Server as part of an Elastic Cloud Enterprise deployment and then configure APM agents to send data into the deployment.
- Tutorials
- Try out these tutorials to guide you through specific observability scenarios, including monitoring data from AWS, GCP, or Azure, from a Java application, or from Kubernetes.
Security
- Ingest data to Elastic Security
- This guide presents options for ingesting data into Elastic Security, including using Elastic Agent with the Elastic Endpoint Integration, using Beats shippers installed on the systems that you want to monitor, using Elastic Agent with Splunk, and using third-party collectors that ship ECS-compliant data.
- Ingest data into SIEM
- Learn how to ingest data into the Elastic SIEM app (now part of the Elastic Security solution), including using Beats shippers installed on the systems that you want to monitor, using Elastic Endpoint Security to ship data directly to Elasticsearch, and using third-party collectors that ship ECS-compliant data.