Words and Morphology

- Tokenization: the process of converting running text (i.e., a sequence of characters) into a sequence of tokens; a minimal example is sketched below.
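
A minimal sketch of what such a step might look like, assuming a simple regex-based approach (the pattern and the function name `tokenize` are illustrative choices, not from the text; practical tokenizers, e.g. rule-based or subword ones, are considerably more elaborate):

```python
import re

def tokenize(text: str) -> list[str]:
    # Split running text (a sequence of characters) into word-like tokens
    # and standalone punctuation marks.
    # Illustrative pattern only: \w+ grabs runs of word characters,
    # [^\w\s] grabs any single non-word, non-space character.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Don't panic, it's fine."))
# ['Don', "'", 't', 'panic', ',', 'it', "'", 's', 'fine', '.']
```

Note that even this toy example forces a design decision: the apostrophe in "Don't" is split off as its own token, whereas other tokenizers might keep "Don't" whole or split it as "Do" + "n't".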