Connect Agents to Elasticsearch with Model Context Protocol

Let’s use a Model Context Protocol server to chat with your data in Elasticsearch.

What if interacting with your data was as effortless as chatting with a colleague? Imagine simply asking, "Show me all orders over $500 from last month" or "Which products received the most 5-star reviews?" and getting instant, accurate answers, no querying required.

Model Context Protocol (MCP) makes this possible. It seamlessly connects conversational AI with your databases and external APIs, transforming complex requests into natural conversations. While modern LLMs are great at understanding language, their true potential is unlocked when integrated with real-world systems. MCP bridges the gap between them, making data interaction more intuitive and efficient.

In this post, we’ll explore:

  • What the Model Context Protocol is and how its architecture works
  • How an MCP server paired with Elasticsearch enables real-time retrieval
  • A hands-on setup for chatting with your Elasticsearch data through the Claude Desktop App

Exciting times ahead! MCP's integration with your Elastic Stack transforms how you interact with information, making complex queries as intuitive as everyday conversation.

Model Context Protocol

Model Context Protocol (MCP), developed by Anthropic, is an open standard that connects AI models to external data sources through secure, bidirectional channels. It solves a major AI limitation: real-time access to external systems while preserving conversation context.

MCP architecture

Model Context Protocol architecture consists of two key components:

  • MCP Clients – AI assistants and chatbots that request information or execute tasks on behalf of users.
  • MCP Servers – Data repositories, search engines, and APIs that retrieve relevant information or perform requested actions (e.g., calling external APIs).

MCP Servers expose several capabilities to Clients:

  • Resources - Structured data, documents, and content that can be retrieved and used as context for LLM interactions. This allows AI assistants to access relevant information from databases, search indexes, or other sources.
  • Tools - Executable functions that enable LLMs to interact with external systems, perform computations, or take real-world actions. These tools extend AI capabilities beyond text generation, allowing assistants to trigger workflows, call APIs, or manipulate data dynamically.
  • Prompts - Reusable prompt templates and workflows to standardize and share common LLM interactions.
  • Sampling - Request LLM completions through the client to enable sophisticated agentic behaviors while maintaining security and privacy.

MCP server + Elasticsearch

Traditional Retrieval-Augmented Generation (RAG) systems retrieve documents based on user queries, but MCP takes it a step further: it enables AI agents to dynamically construct and execute tasks in real time. This allows users to ask natural language questions like:

  • "Show me all orders over $500 from last month."
  • "Which products received the most 5-star reviews?"

And get instant, precise answers, without writing a single query.

MCP achieves this through:

  • Dynamic tool selection – Agents intelligently choose the right tools exposed via MCP servers based on user intent. “Smarter” LLMs are generally better at selecting the right tools with the appropriate arguments based on context.
  • Bidirectional communication – Agents and data sources exchange information fluidly, refining queries as needed (e.g. lookup index mapping first, only then construct the ES query).
  • Multi-tool orchestration – Workflows can leverage tools from multiple MCP servers simultaneously.
  • Persistent context – Agents remember previous interactions, maintaining continuity across conversations.

An MCP server connected to Elasticsearch unlocks a powerful real-time retrieval architecture. AI agents can explore, query, and analyze Elasticsearch data on demand. Your data becomes searchable through a simple chat interface.

Beyond just retrieving data, MCP enables action. It integrates with other tools to trigger workflows, automate processes, and feed insights into analytics systems. By separating search from execution, MCP keeps AI-powered applications flexible, up-to-date, and seamlessly integrated into agentic workflows.

Hands on: MCP server to chat with your Elasticsearch data

To interact with Elasticsearch via an MCP server, we need at least functions to:

  • Retrieve indices
  • Obtain mappings
  • Perform searches using Elasticsearch’s Query DSL

Our server is written in TypeScript, and we will be using the official MCP TypeScript SDK. For setup, we recommend installing the Claude Desktop App (the free version is sufficient) since it includes a built-in MCP Client. Our MCP server essentially exposes the official JavaScript Elasticsearch client through MCP tools.

Let’s start by defining the Elasticsearch client and MCP server:

import { Client } from "@elastic/elasticsearch";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";

const esClient = new Client({
  node: url,
  auth: {
    apiKey: apiKey,
  },
});

const server = new McpServer({
  name: "elasticsearch-mcp-server",
  version: "0.1.0",
});

We will use the following MCP server tools that can interact with Elasticsearch:

  • List Indices (list_indices): This tool retrieves all available Elasticsearch indices, providing details such as index name, health status, and document count.
  • Get Mappings (get_mappings): This tool fetches the field mappings for a specified Elasticsearch index, helping users understand the structure and data types of stored documents.
  • Search (search): This tool executes an Elasticsearch search using a provided Query DSL. It automatically enables highlights for text fields, making it easier to identify relevant search results.

The full Elasticsearch MCP server implementation is available in the elastic/mcp-server-elasticsearch repo.

Chat with your index

Let's explore how to set up the Elasticsearch MCP server so you can ask natural language questions about your data, such as "Find all orders over $500 from last month."

Configure your Claude Desktop App

  • Open the Claude Desktop App
  • Navigate to Settings > Developer > MCP Servers
  • Click "Edit Config" and add this configuration to your claude_desktop_config.json:
{
  "mcpServers": {
    "Elasticsearch MCP Server": {
      "command": "npx",
      "args": [
        "-y",
        "@elastic/mcp-server-elasticsearch"
      ],
      "env": {
        "ES_URL": "",
        "ES_API_KEY": ""
      }
    }
  }
}

Note: This setup utilizes the @elastic/mcp-server-elasticsearch npm package published by Elastic. If you want to develop locally, you can find more details on spinning up the Elasticsearch MCP server here.

Populate your Elasticsearch index

  • You can use our example data to populate the "orders" index for this demo
  • This will allow you to try queries like "Find all orders over $500 from last month"
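If you'd rather index a few documents by hand, the sketch below builds a bulk request body for a hypothetical orders index. The field names (customer, total, order_date) are illustrative assumptions, not the demo dataset's exact schema:

```typescript
// Illustrative order documents -- the demo dataset's real schema may differ.
interface Order {
  customer: string;
  total: number;      // order value in dollars
  order_date: string; // ISO 8601 date
}

const sampleOrders: Order[] = [
  { customer: "Alice", total: 742.5, order_date: "2025-02-14" },
  { customer: "Bob", total: 89.99, order_date: "2025-02-20" },
  { customer: "Carol", total: 1203.0, order_date: "2025-03-01" },
];

// Build the alternating action/document rows that the _bulk API expects.
function buildBulkBody(index: string, docs: Order[]): object[] {
  return docs.flatMap((doc) => [{ index: { _index: index } }, doc]);
}

const bulkBody = buildBulkBody("orders", sampleOrders);
// With the official JavaScript client, this would be indexed via:
//   await esClient.bulk({ operations: bulkBody, refresh: true });
```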

Start using it

  • Open a new conversation in the Claude Desktop App
  • The MCP server will connect automatically
  • Start asking questions about your Elasticsearch data!

Check out this demo to see how easy it is to query your Elasticsearch data using natural language.

How does it work?

When asked "Find all orders over $500 from last month," the LLM recognizes the intent: search an Elasticsearch index under the specified constraints. To perform an effective search, the agent needs to:

  • Figure out the index name: orders
  • Understand the mappings of the orders index
  • Build a Query DSL query compatible with those mappings, and finally execute the search request

This interaction can be represented as a chain of tool calls: list_indices → get_mappings → search.
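For the example question, the final search step might produce a Query DSL body like the one below. The total and order_date field names are assumptions about the index mapping; in practice the agent derives the real names from the get_mappings step:

```typescript
// A Query DSL body an agent might generate for
// "Find all orders over $500 from last month".
function buildOrdersQuery(minTotal: number): object {
  return {
    query: {
      bool: {
        must: [
          // numeric constraint on the (assumed) `total` field
          { range: { total: { gt: minTotal } } },
          // "last month", expressed with Elasticsearch date math:
          // from the start of the previous month up to the start of this month
          { range: { order_date: { gte: "now-1M/M", lt: "now/M" } } },
        ],
      },
    },
  };
}

const queryBody = buildOrdersQuery(500) as any;
// This body would be passed to the `search` tool for the "orders" index.
```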

Conclusion

Model Context Protocol enhances how you interact with Elasticsearch data, enabling natural language conversations instead of complex queries. By bridging AI capabilities with your data, MCP creates a more intuitive and efficient workflow that maintains context throughout your interactions.

The Elasticsearch MCP server is available as a public npm package (@elastic/mcp-server-elasticsearch), making integration straightforward for developers. With minimal setup, your team can start exploring data, triggering workflows, and gaining insights through simple conversations.

Ready to experience it for yourself? Try out the Elasticsearch MCP server today and start chatting with your data.

Want to get Elastic certified? Find out when the next Elasticsearch Engineer training is running!

Elasticsearch is packed with new features to help you build the best search solutions for your use case. Dive into our sample notebooks to learn more, start a free cloud trial, or try Elastic on your local machine now.


Ready to build state-of-the-art search experiences?

Sufficiently advanced search isn’t achieved with the efforts of one. Elasticsearch is powered by data scientists, ML ops, engineers, and many more who are just as passionate about search as you are. Let’s connect and work together to build the magical search experience that will get you the results you want.

Try it yourself