Normalization Token Filter

There are several token filters available that normalize special characters for a given language.

You can currently choose between arabic_normalization and persian_normalization in your token filter configuration. For more information, check the ArabicNormalizer or PersianNormalizer documentation.

Note: These filters are available since version 0.90.2.
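
As a rough sketch, the index settings below show one way such a filter could be wired into a custom analyzer. The index name my_index and the filter/analyzer names my_arabic_normalizer and my_arabic_analyzer are illustrative placeholders, not part of the original documentation; only the filter type arabic_normalization comes from the text above.

```json
PUT /my_index
{
  "settings": {
    "analysis": {
      "filter": {
        "my_arabic_normalizer": {
          "type": "arabic_normalization"
        }
      },
      "analyzer": {
        "my_arabic_analyzer": {
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "my_arabic_normalizer"
          ]
        }
      }
    }
  }
}
```

The custom filter definition simply references the built-in arabic_normalization type; it is then chained after lowercase in a custom analyzer so that tokens are normalized during both indexing and search. The same pattern applies to persian_normalization.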
