A custom tokenizer for a grammar.

Parameters:
- A string with the code this grammar needs to tokenize.
- The grammar with the custom tokenizer.

Returns: A token stream representing the matched code.

See the tokenize symbol for more info.
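For illustration, here is a minimal, self-contained TypeScript sketch of what such a custom tokenizer could look like. The Token, TokenStream, Grammar, and CustomTokenizer names and the tokenize symbol below are hypothetical stand-ins modeled on this page's descriptions, not the library's actual exports.

```ts
// Hypothetical stand-ins for the documented types; the real exports may differ.
interface Token {
  type: string;
  content: string;
}
type TokenStream = (string | Token)[];

// A symbol under which a grammar can carry its custom tokenizer,
// mirroring the "tokenize symbol" referenced above.
const tokenize = Symbol('tokenize');

interface Grammar {
  [tokenize]?: CustomTokenizer;
}

// The documented shape: takes the code this grammar needs to tokenize
// and the grammar carrying the custom tokenizer, and returns a token
// stream representing the matched code.
type CustomTokenizer = (code: string, grammar: Grammar) => TokenStream;

// Example: a trivial tokenizer that wraps every whitespace-separated
// word in a "word" token and keeps the separators as plain strings.
const wordTokenizer: CustomTokenizer = (code, _grammar) =>
  code
    .split(/(\s+)/)
    .filter(Boolean)
    .map((part) => (/^\s+$/.test(part) ? part : { type: 'word', content: part }));

const grammar: Grammar = { [tokenize]: wordTokenizer };
console.log(grammar[tokenize]?.('hello world', grammar));
// => [ { type: 'word', content: 'hello' }, ' ', { type: 'word', content: 'world' } ]
```

In this sketch, attaching the function under the tokenize symbol is what would let a highlighter delegate tokenization of this grammar's code to the custom function instead of its default matching.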