Noun
tokenizer (plural tokenizers)

1. (computing) A system that parses an input stream into its component tokens.
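As an illustration of the sense above (a sketch added here, not part of the dictionary entry itself), a minimal regex-based tokenizer in Python might split an input stream into word, number, and punctuation tokens:

```python
import re

# Hypothetical example: one pattern per token class, tried left to right.
TOKEN_PATTERN = re.compile(r"\d+|\w+|[^\w\s]")

def tokenize(stream: str) -> list[str]:
    """Return the component tokens of the input stream."""
    return TOKEN_PATTERN.findall(stream)

print(tokenize("Parse this, tokenizer: 42 tokens!"))
# ['Parse', 'this', ',', 'tokenizer', ':', '42', 'tokens', '!']
```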