org.allenai.nlpstack.parse.poly.decisiontree

DecisionTree

Related Docs: object DecisionTree | package decisiontree

case class DecisionTree(outcomes: Iterable[Int], child: IndexedSeq[Map[Int, Int]], splittingFeature: IndexedSeq[Option[Int]], outcomeHistograms: IndexedSeq[Map[Int, Int]]) extends ProbabilisticClassifier with Product with Serializable

Immutable decision tree for integer-valued features and outcomes.

Aside from outcomes, each constructor argument is an indexed sequence of per-node properties: the ith element of each sequence is the property of node i of the decision tree.

outcomes

all possible outcomes for the decision tree

child

stores the children of each node (as a map from feature values to node ids)

splittingFeature

stores the feature that each node splits on; can be None for leaf nodes

outcomeHistograms

for each node, stores a map of outcomes to their frequency of appearance at that node (i.e. how many times a training vector with that outcome makes it to this node during classification)
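
Example (a minimal sketch, not taken from the library's own documentation): the three-node tree below has a root (node 0) that splits on feature 7, sending feature value 0 to leaf node 1 and feature value 1 to leaf node 2. The feature id, outcome ids, and histogram counts are invented for illustration.

  import org.allenai.nlpstack.parse.poly.decisiontree.DecisionTree

  val tree = DecisionTree(
    // all outcomes that appear anywhere in the tree
    outcomes = Seq(0, 1),
    // node 0 maps feature value 0 to node 1 and feature value 1 to node 2; leaves have no children
    child = IndexedSeq(Map(0 -> 1, 1 -> 2), Map.empty[Int, Int], Map.empty[Int, Int]),
    // node 0 splits on feature 7; leaf nodes do not split
    splittingFeature = IndexedSeq(Some(7), None, None),
    // per-node counts of the training outcomes that reached each node
    outcomeHistograms = IndexedSeq(
      Map(0 -> 6, 1 -> 4), // root: all ten training vectors
      Map(0 -> 5, 1 -> 1), // node 1: vectors with feature 7 == 0
      Map(0 -> 1, 1 -> 3)  // node 2: vectors with feature 7 == 1
    )
  )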

Linear Supertypes
Serializable, Serializable, Product, Equals, ProbabilisticClassifier, AnyRef, Any

Instance Constructors

  1. new DecisionTree(outcomes: Iterable[Int], child: IndexedSeq[Map[Int, Int]], splittingFeature: IndexedSeq[Option[Int]], outcomeHistograms: IndexedSeq[Map[Int, Int]])

    outcomes

    all possible outcomes for the decision tree

    child

    stores the children of each node (as a map from feature values to node ids)

    splittingFeature

    stores the feature that each node splits on; can be None for leaf nodes

    outcomeHistograms

    for each node, stores a map of outcomes to their frequency of appearance at that node (i.e. how many times a training vector with that outcome makes it to this node during classification)

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def allFeatures: Set[Int]

    All features used in the decision tree.

    Definition Classes
    DecisionTree → ProbabilisticClassifier
  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. val child: IndexedSeq[Map[Int, Int]]

    stores the children of each node (as a map from feature values to node ids)

  7. def classify(featureVector: FeatureVector): (Int, Option[Justification])

    Classifies a feature vector and optionally returns a "justification" for the classification decision.

    featureVector

    feature vector to classify

    returns

    (predicted outcome, optional justification for the prediction)

    Definition Classes
    ProbabilisticClassifier
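
    Example (a sketch, assuming tree is the DecisionTree built in the class-level example and featureVector is some existing FeatureVector instance; neither is defined by this page):

      // returns the most likely outcome at the decision point reached by
      // featureVector, together with an optional justification for that choice
      val (predictedOutcome, justification) = tree.classify(featureVector)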
  8. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  9. lazy val decisionPaths: IndexedSeq[DTDecisionPath]

    The kth element of this sequence is node k's "decision path", i.e. the sequence of decisions that lead from the root to node k.
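
    Example (a sketch, using the three-node tree from the class-level example):

      // the sequence of decisions that leads from the root to node 2
      // (for the example tree: take feature 7 with value 1 at the root)
      val pathToNode2 = tree.decisionPaths(2)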

  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def findDecisionPoint(featureVector: FeatureVector, nodeId: Int = 0): Int

    Finds the "decision point" of the specified feature vector.

    Finds the "decision point" of the specified feature vector. This is the node for which no child covers the feature vector.

    featureVector

    feature vector to classify

    returns

    the decision tree node that the feature vector is classified into

    Attributes
    protected
    Annotations
    @tailrec()
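
    Because findDecisionPoint is protected, the sketch below only mirrors its descent using the public child and splittingFeature members; featureValue is a hypothetical function from feature index to feature value, standing in for the FeatureVector lookup.

      @annotation.tailrec
      def descend(tree: DecisionTree, featureValue: Int => Int, nodeId: Int = 0): Int = {
        val next = for {
          feature <- tree.splittingFeature(nodeId) // None at leaf nodes
          childNode <- tree.child(nodeId).get(featureValue(feature))
        } yield childNode
        next match {
          case Some(childNode) => descend(tree, featureValue, childNode)
          case None => nodeId // no child covers the feature vector
        }
      }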
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def getNodeDivergenceScore(node: Int): Double

  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. def outcomeDistribution(featureVector: FeatureVector): (OutcomeDistribution, Option[Justification])

    Gets a probability distribution over possible outcomes.

    featureVector

    feature vector to compute the distribution for

    returns

    probability distribution of outcomes according to training data

    Definition Classes
    DecisionTree → ProbabilisticClassifier
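
    Example (a sketch, with tree and featureVector as in the classify example above):

      // distribution holds outcome probabilities estimated from the training data
      // at the decision point reached by featureVector
      val (distribution, justification) = tree.outcomeDistribution(featureVector)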
  20. val outcomeDistributionRoot: Map[Int, Float]

  21. def outcomeHistogram(featureVector: FeatureVector): Map[Int, Int]

  22. val outcomeHistograms: IndexedSeq[Map[Int, Int]]

    for each node, stores a map of outcomes to their frequency of appearance at that node (i.e. how many times a training vector with that outcome makes it to this node during classification)

  23. val outcomes: Iterable[Int]

    all possible outcomes for the decision tree

  24. def print(featureNames: Vector[String], outcomeNames: Vector[String], nodeId: Int = 0, tabbing: String = ""): Unit

    Prints the decision tree to stdout.
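
    Example (a sketch; the name vectors are hypothetical and must be indexable by every feature id and outcome id used in the tree):

      val featureNames = Vector.tabulate(8)(i => s"feature$i") // covers feature 7 from the example tree
      val outcomeNames = Vector("outcome0", "outcome1")
      tree.print(featureNames, outcomeNames)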

  25. def selectChild(nodeId: Int, featureVector: FeatureVector): Option[Int]

    From a particular node, chooses the correct child according to the feature vector and the node's splitting feature (if there is one).

    nodeId

    the id of the node

    featureVector

    the feature vector

    returns

    the node id of the correct child (if there is one)

    Attributes
    protected
  26. val splittingFeature: IndexedSeq[Option[Int]]

    stores the feature that each node splits on; can be None for leaf nodes

  27. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  28. lazy val topologicalOrder: Seq[Int]

    A topological order of the decision tree nodes (where the root is the first node).
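
    Example (a sketch, using the three-node tree from the class-level example):

      // visit nodes so that every node appears after its parent; for the example
      // tree this yields the root (node 0) followed by its two leaves
      tree.topologicalOrder foreach { nodeId =>
        println(s"node $nodeId splits on ${tree.splittingFeature(nodeId)}")
      }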

  29. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
