ValidTokenizer is a fair bit slower than FastTokenizer, but... you get full JSON validation. You also don't need to worry about estimating element capacity, because it uses dynamic array buffers internally. ValidTokenizer will give you high-quality information when the JSON is wrong, so it's ideal for JSON you aren't confident about.
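To illustrate why no capacity guess is needed, here's a minimal sketch of the growable-buffer approach. The names here are illustrative stand-ins, not ScalaJack's actual internals:

```scala
import scala.collection.mutable.ArrayBuffer

object GrowableIndex {
  // Hypothetical token record -- ValidTokenizer's real bookkeeping differs.
  final case class Token(start: Int, len: Int)

  // An ArrayBuffer grows on demand as tokens arrive, so the caller
  // never has to guess a capacity up front (the trade-off is some
  // reallocation cost versus a pre-sized Array).
  def collect(tokenCount: Int): ArrayBuffer[Token] = {
    val buf = ArrayBuffer.empty[Token] // no capacity parameter needed
    var i = 0
    while (i < tokenCount) {
      buf += Token(i, 1) // buffer resizes itself as needed
      i += 1
    }
    buf
  }
}
```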
If your code is _really_ sophisticated, it can try FastTokenizer first and catch failures in a try block. One way to resolve such a failure is to re-parse the troublesome JSON with ValidTokenizer, which yields a decent error message.
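The fallback pattern might look like the sketch below. The two tokenize functions are hypothetical stand-ins (ScalaJack's real classes and signatures differ); the point is the try/catch shape:

```scala
object TokenizerFallback {
  // Stand-in for FastTokenizer: fails with an unhelpful exception.
  def fastTokenize(json: String): Int =
    if (json.startsWith("{") || json.startsWith("[")) json.length
    else throw new IllegalArgumentException("parse blew up")

  // Stand-in for ValidTokenizer: slower, but reports what went wrong.
  def validTokenize(json: String): Int =
    if (json.startsWith("{") || json.startsWith("[")) json.length
    else throw new IllegalArgumentException(
      s"Expected '{' or '[' at position 0 but found '${json.headOption.getOrElse(' ')}'")

  // The pattern: take the fast path, and only pay for validation
  // when something breaks, in exchange for a real error message.
  def tokenize(json: String): Either[String, Int] =
    try Right(fastTokenize(json))
    catch {
      case _: Exception =>
        try Right(validTokenize(json))
        catch { case e: Exception => Left(e.getMessage) }
    }
}
```

Note that parsing twice on failure is only worthwhile if failures are rare; if most of your input is suspect, just use ValidTokenizer directly.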
As with all the tokenizers, this is *NOT* thread-safe! Don't share these across threads!
FastTokenizer is the fastest way to tokenize JSON in ScalaJack. It gets its speed, in part, by assuming the given JSON is correct, so there's almost nothing in the way of checking. If it does fail, it blows up with exceptions, not all of them helpful.
Note there's a lot of "primitive" code here, e.g. avoiding data structures/classes in favor of basic Arrays. This is for speed.
FastTokenizer is best used when you know your JSON is reliable and correct.
The capacity parameter requires some guess about the JSON data you have. It is the size of the arrays created to hold all the index (element) data. It should be safely larger than the maximum number of token elements you reasonably expect to see in a JSON string. For example `[1,2,3]` has 7 elements, while `{"one":1239,"two":false,"three":"hey"}` has 8. If you have a weak stomach, capacity can equal the length of your largest JSON string, but that's probably wild overkill.
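To make the capacity parameter concrete, here's a minimal sketch of the kind of pre-sized index arrays this style of tokenizer allocates up front. The class and field names are illustrative, not ScalaJack's actual internals:

```scala
object IndexArrays {
  // Capacity-sized arrays holding per-token index data. If more than
  // `capacity` tokens arrive, add() below blows past the end with an
  // ArrayIndexOutOfBoundsException -- mirroring FastTokenizer's
  // "assume the input is right" philosophy of skipping checks.
  final class Tokens(capacity: Int) {
    val tokenType  = new Array[Byte](capacity) // what kind of token
    val tokenStart = new Array[Int](capacity)  // offset into the JSON string
    val tokenLen   = new Array[Int](capacity)  // length of the token text
    var count      = 0

    def add(tt: Byte, start: Int, len: Int): Unit = {
      // Deliberately no bounds check, for speed.
      tokenType(count) = tt
      tokenStart(count) = start
      tokenLen(count) = len
      count += 1
    }
  }
}
```

Flat primitive arrays like these avoid per-token object allocation entirely, which is the "primitive code for speed" trade-off described above.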
As with all the tokenizers, this is *NOT* thread-safe! Don't share these across threads!