WARNING: Version 5.0 of Elasticsearch has passed its EOL date.
This documentation is no longer being maintained and may be removed. If you are running this version, we strongly advise you to upgrade. For the latest information, see the current release documentation.
Keyword Repeat Token Filter
The keyword_repeat
token filter emits each incoming token twice: once
as a keyword and once as a non-keyword. This allows an unstemmed version of a
term to be indexed side by side with the stemmed version of the term.
Given the nature of this filter, any token that is not transformed by a
subsequent stemmer will be indexed twice: for example, "jumping" yields the
keyword "jumping" alongside the stem "jump", but "jump" itself is simply
emitted twice at the same position. Therefore, consider adding a
unique
filter with only_on_same_position
set to true
to drop these
unnecessary duplicates.
Here is an example:
index :
    analysis :
        analyzer :
            myAnalyzer :
                type : custom
                tokenizer : standard
                filter : [lowercase, keyword_repeat, porter_stem, unique_stem]
        filter :
            unique_stem :
                type : unique
                only_on_same_position : true
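The same settings can be expressed in JSON form as the body of an index-creation request (a sketch; the index name is not part of the example, and the analyzer and filter names follow the configuration above):

```json
{
  "settings": {
    "analysis": {
      "analyzer": {
        "myAnalyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": ["lowercase", "keyword_repeat", "porter_stem", "unique_stem"]
        }
      },
      "filter": {
        "unique_stem": {
          "type": "unique",
          "only_on_same_position": true
        }
      }
    }
  }
}
```

Once the index exists, the _analyze API with `"analyzer": "myAnalyzer"` can be used to inspect the token stream: stemmed and keyword tokens appear at the same position, and identical duplicates are removed by the unique_stem filter.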