Keyword Tokenizer


A tokenizer of type keyword that emits the entire input as a single token, without splitting on whitespace or punctuation.
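As a minimal sketch of this behavior, the _analyze API can be pointed at the keyword tokenizer (the exact request format may vary by Elasticsearch version, and the sample text is illustrative):

```
POST _analyze
{
  "tokenizer": "keyword",
  "text": "New York City"
}
```

The response should contain a single token, "New York City", rather than one token per word.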

The following settings can be configured for a keyword tokenizer:

Setting       Description
buffer_size   The term buffer size. Defaults to 256.
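For illustration, a custom tokenizer of type keyword with an explicit buffer_size could be registered in the index analysis settings and referenced from a custom analyzer; the index, tokenizer, and analyzer names below are hypothetical:

```
PUT my_index
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "my_keyword_tokenizer": {
          "type": "keyword",
          "buffer_size": 256
        }
      },
      "analyzer": {
        "my_keyword_analyzer": {
          "type": "custom",
          "tokenizer": "my_keyword_tokenizer"
        }
      }
    }
  }
}
```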
