Look up tokenization or tokenisation in Wiktionary, the free dictionary.
Tokenization may refer to:
- Tokenization (lexical analysis) in language processing
- Tokenization in large language models
- Tokenization in search engine indexing
- Tokenization (data security)
- Asset tokenization in finance
- Word segmentation
- A preprocessing step in the Transformer deep learning architecture
See also
- Tokenism of minorities