org.allenai.nlpstack.core
A tokenizer takes a sentence string as input and splits it into words (tokens) along token boundaries.
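As an illustrative sketch only (this is not the nlpstack API), a minimal tokenizer can split a sentence on word and punctuation boundaries with a regular expression, recording each token's character offset in the original string:

```scala
// Hypothetical example, not part of org.allenai.nlpstack.core:
// a naive regex tokenizer that yields (token, startOffset) pairs.
object SimpleTokenizer {
  // Matches runs of word characters, or single punctuation marks.
  private val tokenPattern = """\w+|[^\w\s]""".r

  /** Splits a sentence into tokens with their character offsets. */
  def tokenize(sentence: String): Seq[(String, Int)] =
    tokenPattern.findAllMatchIn(sentence).map(m => (m.matched, m.start)).toSeq
}

// SimpleTokenizer.tokenize("Dogs bark.")
// yields Seq(("Dogs", 0), ("bark", 5), (".", 9))
```

Keeping the offsets alongside the token strings makes it possible to map tokens back to the original sentence, which real tokenizer implementations typically support as well.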