org.allenai.nlpstack.parse.poly.decisiontree
All features used in the decision tree.
Classifies a feature vector.
feature vector to classify
predicted outcome
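As a hedged sketch of the final prediction step: once a feature vector reaches a node, the predicted outcome is the one with the highest frequency in that node's outcome histogram. The function name below is illustrative, not the library's API; only the histogram shape (outcome to count) comes from the field descriptions.

```scala
// Predict the outcome with the highest count in a node's histogram
// (outcome -> frequency). Hypothetical helper name.
def predictFromHistogram(histogram: Map[Int, Int]): Int =
  histogram.maxBy(_._2)._1
```

For example, `predictFromHistogram(Map(0 -> 3, 1 -> 7))` picks outcome 1.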
The probability distribution over outcomes for each node of the decision tree.
If this tree was trained with DecisionTreeTrainer, then the distribution is Laplacian-smoothed assuming one count for each label in the training data.
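The smoothing described above can be sketched as follows, assuming one pseudocount per possible label; the function and parameter names are illustrative, not the library's API.

```scala
// Laplacian smoothing: add one count for each possible outcome to the
// node's histogram, then normalize to a probability distribution.
def smoothedDistribution(
  histogram: Map[Int, Int],
  allOutcomes: Seq[Int]
): Map[Int, Double] = {
  val counts = allOutcomes.map(o => o -> (histogram.getOrElse(o, 0) + 1.0)).toMap
  val total = counts.values.sum
  counts.map { case (o, c) => o -> c / total }
}
```

For instance, a node that saw outcome 0 three times and never saw outcome 1 yields `Map(0 -> 0.8, 1 -> 0.2)` rather than assigning outcome 1 zero probability.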
Finds the "decision point" of the specified feature vector. This is the node for which no child covers the feature vector.
feature vector to classify
the decision tree node that the feature vector is classified into
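The traversal described above might look like the following sketch, assuming the parallel-sequence layout described elsewhere in this file and a root at node 0; names and the default feature value are assumptions, not the library's API.

```scala
// Follow child links from the root: at each node, look up the child keyed
// by the feature vector's value for that node's splitting feature, and stop
// at the first node with no matching child (the "decision point").
@scala.annotation.tailrec
def findDecisionPoint(
  splittingFeature: IndexedSeq[Option[Int]], // None for leaf nodes
  child: IndexedSeq[Map[Int, Int]],          // feature value -> child node id
  featureVector: Map[Int, Int],
  nodeId: Int = 0
): Int = {
  val next = for {
    feature <- splittingFeature(nodeId)
    childId <- child(nodeId).get(featureVector.getOrElse(feature, 0))
  } yield childId
  next match {
    case Some(c) => findDecisionPoint(splittingFeature, child, featureVector, c)
    case None => nodeId
  }
}
```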
Gets a probability distribution over possible outcomes.
feature vector to compute the distribution for
probability distribution of outcomes according to training data
Prints the decision tree to stdout.
From a particular node, chooses the correct child according to the feature vector and the node's splitting feature (if there is one).
the id of the node
the feature vector
the node id of the correct child (if there is one)
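A one-step version of that child selection can be sketched as below, assuming the parallel-sequence layout described elsewhere in this file; the names and the default feature value are assumptions. It returns `None` when the node is a leaf or no child covers the feature value.

```scala
// Select the child of nodeId keyed by the feature vector's value for the
// node's splitting feature; None if the node is a leaf or no child matches.
def selectChild(
  splittingFeature: IndexedSeq[Option[Int]], // None for leaf nodes
  child: IndexedSeq[Map[Int, Int]],          // feature value -> child node id
  nodeId: Int,
  featureVector: Map[Int, Int]
): Option[Int] =
  for {
    feature <- splittingFeature(nodeId)
    childId <- child(nodeId).get(featureVector.getOrElse(feature, 0))
  } yield childId
```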
Immutable decision tree for integer-valued features and outcomes.
Each data structure is an indexed sequence of properties. The ith element of each sequence is the property of node i of the decision tree.
all possible outcomes for the decision tree
stores the children of each node (as a map from feature values to node ids)
stores the feature that each node splits on; can be None for leaf nodes
for each node, stores a map of outcomes to their frequency of appearance at that node (i.e. how many times a training vector with that outcome makes it to this node during classification)
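A minimal sketch of that layout, with the constructor parameter names assumed from the descriptions above rather than taken from the library:

```scala
// Parallel indexed sequences: the ith element of each sequence describes
// node i of the tree.
case class DecisionTreeSketch(
  outcomes: Iterable[Int],                     // all possible outcomes
  child: IndexedSeq[Map[Int, Int]],            // feature value -> child node id
  splittingFeature: IndexedSeq[Option[Int]],   // None marks a leaf
  outcomeHistograms: IndexedSeq[Map[Int, Int]] // outcome -> frequency at node
)

// A two-node tree: the root (node 0) splits on feature 0, and feature
// value 1 leads to the leaf at node 1.
val tree = DecisionTreeSketch(
  outcomes = Seq(0, 1),
  child = IndexedSeq(Map(1 -> 1), Map.empty),
  splittingFeature = IndexedSeq(Some(0), None),
  outcomeHistograms = IndexedSeq(Map(0 -> 5, 1 -> 5), Map(1 -> 5))
)
```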