Usage
Once a RestClient instance has been created as shown in Initialization, a Sniffer can be associated with it. The Sniffer will make use of the provided RestClient to periodically (every 5 minutes by default) fetch the list of current nodes from the cluster and update them by calling RestClient#setNodes.
RestClient restClient = RestClient.builder(
        new HttpHost("localhost", 9200, "http"))
    .build();
Sniffer sniffer = Sniffer.builder(restClient).build();
It is important to close the Sniffer so that its background thread gets properly shut down and all of its resources are released. The Sniffer object should have the same lifecycle as the RestClient and be closed right before the client:
sniffer.close();
restClient.close();
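Since both RestClient and Sniffer implement Closeable, the same ordering can also be obtained with try-with-resources, which closes resources in reverse declaration order. The following is just a minimal sketch, assuming both objects are created and used within a single scope:

try (RestClient restClient = RestClient.builder(
            new HttpHost("localhost", 9200, "http"))
        .build();
     Sniffer sniffer = Sniffer.builder(restClient).build()) {
    // Use restClient to send requests; the sniffer keeps its node list up to date.
    // Resources are closed in reverse order, so the sniffer is closed before the client.
}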
The Sniffer updates the nodes by default every 5 minutes. This interval can be customized by providing it (in milliseconds) as follows:
RestClient restClient = RestClient.builder(
        new HttpHost("localhost", 9200, "http"))
    .build();
Sniffer sniffer = Sniffer.builder(restClient)
    .setSniffIntervalMillis(60000)
    .build();
It is also possible to enable sniffing on failure, meaning that after each failure the nodes list gets updated straight away rather than at the following ordinary sniffing round. In this case a SniffOnFailureListener needs to be created first and provided at RestClient creation. Once the Sniffer is later created, it needs to be associated with that same SniffOnFailureListener instance, which will be notified at each failure and use the Sniffer to perform the additional sniffing round as described.
SniffOnFailureListener sniffOnFailureListener = new SniffOnFailureListener();
RestClient restClient = RestClient.builder(
        new HttpHost("localhost", 9200))
    .setFailureListener(sniffOnFailureListener)
    .build();
Sniffer sniffer = Sniffer.builder(restClient)
    .setSniffAfterFailureDelayMillis(30000)
    .build();
sniffOnFailureListener.setSniffer(sniffer);
In the example above, the failure listener is set on the RestClient at creation time, and the Sniffer is then set on the failure listener so that it gets notified at each failure. When sniffing on failure, not only do the nodes get updated after each failure, but an additional sniffing round is also scheduled sooner than usual, by default one minute after the failure, assuming that things will go back to normal and we want to detect that as soon as possible. This interval can be customized at Sniffer creation through setSniffAfterFailureDelayMillis, as shown above; note that it has no effect unless sniffing on failure is enabled.
The Elasticsearch Nodes Info API doesn't return the protocol to use when connecting to the nodes but only their host:port pair, hence http is used by default. In case https should be used instead, the ElasticsearchNodesSniffer instance has to be manually created and provided as follows:
RestClient restClient = RestClient.builder(
        new HttpHost("localhost", 9200, "http"))
    .build();
NodesSniffer nodesSniffer = new ElasticsearchNodesSniffer(
    restClient,
    ElasticsearchNodesSniffer.DEFAULT_SNIFF_REQUEST_TIMEOUT,
    ElasticsearchNodesSniffer.Scheme.HTTPS);
Sniffer sniffer = Sniffer.builder(restClient)
    .setNodesSniffer(nodesSniffer)
    .build();
In the same way it is also possible to customize the sniffRequestTimeout, which defaults to one second. That is the timeout parameter provided as a query string parameter when calling the Nodes Info API, so that when the timeout expires on the server side a valid response is still returned, although it may contain only a subset of the nodes that are part of the cluster, namely the ones that have responded up to that point.
RestClient restClient = RestClient.builder(
        new HttpHost("localhost", 9200, "http"))
    .build();
NodesSniffer nodesSniffer = new ElasticsearchNodesSniffer(
    restClient,
    TimeUnit.SECONDS.toMillis(5),
    ElasticsearchNodesSniffer.Scheme.HTTP);
Sniffer sniffer = Sniffer.builder(restClient)
    .setNodesSniffer(nodesSniffer)
    .build();
Also, a custom NodesSniffer implementation can be provided for advanced use cases that may require fetching the Node instances from external sources rather than from Elasticsearch (a concrete sketch follows the stub below):
RestClient restClient = RestClient.builder(
        new HttpHost("localhost", 9200, "http"))
    .build();
NodesSniffer nodesSniffer = new NodesSniffer() {
    @Override
    public List<Node> sniff() throws IOException {
        return null;
    }
};
Sniffer sniffer = Sniffer.builder(restClient)
    .setNodesSniffer(nodesSniffer)
    .build();
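As a minimal sketch of such an implementation, the anonymous class below returns a fixed list of nodes; the addresses used here are purely illustrative placeholders, and in a real application they would come from whatever external source is being consulted (a configuration file, a service registry, and so on):

NodesSniffer externalNodesSniffer = new NodesSniffer() {
    @Override
    public List<Node> sniff() throws IOException {
        // Illustrative placeholder addresses; replace with a lookup against
        // the actual external source.
        List<Node> nodes = new ArrayList<>();
        for (String address : Arrays.asList("node-1:9200", "node-2:9200")) {
            nodes.add(new Node(HttpHost.create("http://" + address)));
        }
        return nodes;
    }
};

The resulting NodesSniffer can then be set on the Sniffer builder exactly as in the previous examples.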