Class NGramTokenFilter

java.lang.Object
  org.apache.lucene.util.AttributeSource
    org.apache.lucene.analysis.TokenStream
      org.apache.lucene.analysis.TokenFilter
        org.apache.lucene.analysis.ngram.NGramTokenFilter

All Implemented Interfaces:
java.io.Closeable, java.lang.AutoCloseable
public final class NGramTokenFilter extends TokenFilter

Tokenizes the input into n-grams of the given size(s). You must specify the required Version compatibility when creating an NGramTokenFilter. As of Lucene 4.4, this token filter:
- handles supplementary characters correctly,
- emits all n-grams for the same token at the same position,
- does not modify offsets,
- sorts n-grams by their offset in the original token first, then by increasing length (meaning that "abc" will give "a", "ab", "abc", "b", "bc", "c").

You can make this filter use the old behavior by providing a version < Version.LUCENE_44 in the constructor, but this is not recommended, as it will lead to broken TokenStreams that cause highlighting bugs.

If you were using this TokenFilter to perform partial highlighting, that will no longer work, since this filter does not update offsets. You should modify your analysis chain to use NGramTokenizer, and potentially override NGramTokenizer.isTokenChar(int) to perform pre-tokenization.
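The sort order described above (offset first, then increasing length) can be reproduced in plain Java, independent of Lucene. This is a sketch of the enumeration order only, not of the filter's actual implementation; in particular, it works on chars, whereas the real filter handles supplementary characters as full code points.

```java
import java.util.ArrayList;
import java.util.List;

public class NGramOrder {
    // Enumerates the n-grams of `term` between minGram and maxGram,
    // ordered by start offset first and then by increasing length --
    // the order NGramTokenFilter uses as of Lucene 4.4.
    static List<String> ngrams(String term, int minGram, int maxGram) {
        List<String> out = new ArrayList<>();
        for (int start = 0; start < term.length(); start++) {   // offset first
            for (int len = minGram; len <= maxGram; len++) {    // then length
                if (start + len <= term.length()) {
                    out.add(term.substring(start, start + len));
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // "abc" with minGram=1, maxGram=3 yields: a, ab, abc, b, bc, c
        System.out.println(ngrams("abc", 1, 3));
    }
}
```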
Nested Class Summary

Nested classes/interfaces inherited from class org.apache.lucene.util.AttributeSource:
AttributeSource.AttributeFactory, AttributeSource.State
Field Summary

Modifier and Type  Field
static int         DEFAULT_MAX_NGRAM_SIZE
static int         DEFAULT_MIN_NGRAM_SIZE
Constructor Summary

NGramTokenFilter(Version version, TokenStream input)
  Creates an NGramTokenFilter with the default min and max n-grams.
NGramTokenFilter(Version version, TokenStream input, int minGram, int maxGram)
  Creates an NGramTokenFilter with the given min and max n-grams.
Method Summary

boolean incrementToken()
  Advances to the next token in the stream; returns false at end of stream.
void reset()
  This method is called by a consumer before it begins consumption using TokenStream.incrementToken().
Methods inherited from class org.apache.lucene.analysis.TokenFilter
close, end
-
Methods inherited from class org.apache.lucene.util.AttributeSource
addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, copyTo, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, reflectAsString, reflectWith, restoreState, toString
Field Detail

DEFAULT_MIN_NGRAM_SIZE

public static final int DEFAULT_MIN_NGRAM_SIZE

See Also: Constant Field Values

DEFAULT_MAX_NGRAM_SIZE

public static final int DEFAULT_MAX_NGRAM_SIZE

See Also: Constant Field Values
Constructor Detail

NGramTokenFilter

public NGramTokenFilter(Version version, TokenStream input, int minGram, int maxGram)

Creates an NGramTokenFilter with the given min and max n-grams.

Parameters:
  version - Lucene version to enable correct position increments. See above for details.
  input - TokenStream holding the input to be tokenized
  minGram - the smallest n-gram to generate
  maxGram - the largest n-gram to generate

NGramTokenFilter

public NGramTokenFilter(Version version, TokenStream input)

Creates an NGramTokenFilter with the default min and max n-grams.

Parameters:
  version - Lucene version to enable correct position increments. See above for details.
  input - TokenStream holding the input to be tokenized
Method Detail

incrementToken

public final boolean incrementToken() throws java.io.IOException

Advances to the next token in the stream.

Specified by: incrementToken in class TokenStream
Returns: false for end of stream; true otherwise
Throws: java.io.IOException
reset

public void reset() throws java.io.IOException

Description copied from class: TokenFilter

This method is called by a consumer before it begins consumption using TokenStream.incrementToken(). It resets this stream to a clean state; stateful implementations must implement this method so that they can be reused, just as if they had been created fresh.

If you override this method, always call super.reset(), otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw an IllegalStateException on further usage).

NOTE: The default implementation chains the call to the input TokenStream, so be sure to call super.reset() when overriding this method.

Overrides: reset in class TokenFilter
Throws: java.io.IOException
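The warning about chaining super.reset() can be illustrated with a minimal stand-in. These classes (Upstream, CountingFilter) are invented for this sketch and are not Lucene's TokenStream/TokenFilter; they only model the rule that an override must delegate upward so upstream state is cleared too.

```java
public class ResetContract {
    // Stand-in for the wrapped stream: holds state that must be cleared on reset().
    static class Upstream {
        int position = 5;                 // pretend some consumption happened
        public void reset() { position = 0; }
    }

    // Stand-in for a filter with its own state on top of the upstream state.
    static class CountingFilter extends Upstream {
        int resets = 0;                   // filter-local state

        @Override
        public void reset() {
            super.reset();                // forgetting this would leave `position` stale
            resets++;
        }
    }

    public static void main(String[] args) {
        CountingFilter f = new CountingFilter();
        f.reset();
        // Both the upstream state and the filter's own state are now reset.
        System.out.println(f.position + " " + f.resets);   // prints "0 1"
    }
}
```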