Keyword Tokenizer
The keyword tokenizer is a “noop” tokenizer that accepts whatever text it is given and outputs the exact same text as a single term. It can be combined with token filters to normalise output, e.g. lower-casing email addresses.
Example output
POST _analyze
{
  "tokenizer": "keyword",
  "text": "New York"
}
The above sentence would produce the following term:
[ New York ]
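To illustrate the token-filter combination mentioned above, here is a sketch (not part of the original page) that pairs the keyword tokenizer with the lowercase token filter to normalise an email address; the email address used is just an illustrative value:

POST _analyze
{
  "tokenizer": "keyword",
  "filter": [ "lowercase" ],
  "text": "John.SMITH@example.COM"
}

This request would produce the single term:

[ john.smith@example.com ]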
Configuration
The keyword tokenizer accepts the following parameters:
buffer_size | The number of characters read into the term buffer in a single pass. Defaults to 256.
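As a sketch of how a configured version of this tokenizer could be wired into index settings (the index name my_index, analyzer name my_analyzer, and tokenizer name my_tokenizer are placeholders; buffer_size is the parameter described above):

PUT my_index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "tokenizer": "my_tokenizer"
        }
      },
      "tokenizer": {
        "my_tokenizer": {
          "type": "keyword",
          "buffer_size": 256
        }
      }
    }
  }
}

Requests analyzed with my_analyzer would then emit the full input text as a single term, as in the example output above.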