tokenizer

English

Etymology

tokenize + -er

Noun

tokenizer (plural tokenizers)

  1. (computing) A system that splits an input stream into its component tokens; a lexer.
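
A minimal sketch of such a tokenizer in Python, shown for illustration only; the token names and the small expression grammar here are assumptions, not part of the definition:

```python
import re

# Illustrative token grammar: integers, identifiers, and a few operators.
# Each named group corresponds to one token kind.
TOKEN_RE = re.compile(
    r"\s*(?:(?P<NUMBER>\d+)|(?P<NAME>[A-Za-z_]\w*)|(?P<OP>[+\-*/()=]))"
)

def tokenize(text):
    """Split the input string into a list of (kind, value) tokens."""
    pos = 0
    tokens = []
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if not m:
            raise ValueError(f"unexpected character at {pos}: {text[pos]!r}")
        kind = m.lastgroup          # name of the group that matched
        tokens.append((kind, m.group(kind)))
        pos = m.end()
    return tokens
```

For example, `tokenize("x = 2 + 40")` produces the token stream `[("NAME", "x"), ("OP", "="), ("NUMBER", "2"), ("OP", "+"), ("NUMBER", "40")]`, which a parser could then consume.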
This article is issued from Wiktionary. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.