public class FT extends Classifier implements OptionHandler, AdditionalMeasureProducer, Drawable, TechnicalInformationHandler
@article{Gama2004,
  author  = {Joao Gama},
  journal = {Machine Learning},
  volume  = {55},
  number  = {3},
  pages   = {219-250},
  title   = {Functional Trees},
  year    = {2004}
}

@article{Landwehr2005,
  author  = {Niels Landwehr and Mark Hall and Eibe Frank},
  journal = {Machine Learning},
  volume  = {95},
  number  = {1-2},
  pages   = {161-205},
  title   = {Logistic Model Trees},
  year    = {2005}
}

Valid options are (see the usage sketch after this list):
-B Binary splits (convert nominal attributes to binary ones)
-P Use error on probabilities instead of misclassification error for stopping criterion of LogitBoost.
-I <numIterations> Set fixed number of iterations for LogitBoost (instead of using cross-validation)
-F <modelType> Set Functional Tree type to be generated: 0 for FT, 1 for FTLeaves and 2 for FTInner
-M <numInstances> Set minimum number of instances at which a node can be split (default 15)
-W <beta> Set beta for weight trimming for LogitBoost. Set to 0 (default) for no weight trimming.
-A The AIC is used to choose the best iteration.
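The same settings can be applied programmatically. The sketch below assumes a local ARFF file (the path is a placeholder) and uses the -B, -I and -F flags listed above; it is an illustration, not the only way to configure FT.

```java
import weka.classifiers.trees.FT;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class FTOptionsDemo {
    public static void main(String[] args) throws Exception {
        // Load a dataset; "iris.arff" is a placeholder path.
        Instances data = DataSource.read("iris.arff");
        data.setClassIndex(data.numAttributes() - 1);

        FT ft = new FT();
        // Binary splits, a fixed 15 LogitBoost iterations, FTLeaves as the tree type.
        ft.setOptions(new String[] {"-B", "-I", "15", "-F", "1"});
        ft.buildClassifier(data);
        System.out.println(ft);
    }
}
```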
Modifier and Type | Field and Description
---|---
static int | MODEL_FT - model types
static int | MODEL_FTInner
static int | MODEL_FTLeaves
static Tag[] | TAGS_MODEL - possible model types.
Fields inherited from interface weka.core.Drawable: BayesNet, Newick, NOT_DRAWABLE, TREE
Constructor and Description
---
FT() - Creates an instance of FT with standard options.
Modifier and Type | Method and Description
---|---
String | binSplitTipText() - Returns the tip text for this property.
void | buildClassifier(Instances data) - Builds the classifier.
double | classifyInstance(Instance instance) - Classifies an instance.
double[] | distributionForInstance(Instance instance) - Returns class probabilities for an instance.
Enumeration | enumerateMeasures() - Returns an enumeration of the additional measure names.
String | errorOnProbabilitiesTipText() - Returns the tip text for this property.
boolean | getBinSplit() - Get the value of binarySplits.
Capabilities | getCapabilities() - Returns default capabilities of the classifier.
boolean | getErrorOnProbabilities() - Get the value of errorOnProbabilities.
double | getMeasure(String additionalMeasureName) - Returns the value of the named measure.
int | getMinNumInstances() - Get the value of minNumInstances.
SelectedTag | getModelType() - Get the type of functional tree model being used.
int | getNumBoostingIterations() - Get the value of numBoostingIterations.
String[] | getOptions() - Gets the current settings of the Classifier.
String | getRevision() - Returns the revision string.
TechnicalInformation | getTechnicalInformation() - Returns an instance of a TechnicalInformation object, containing detailed information about the technical background of this class, e.g., paper reference or book this class is based on.
boolean | getUseAIC() - Get the value of useAIC.
double | getWeightTrimBeta() - Get the value of weightTrimBeta.
String | globalInfo() - Returns a string describing the classifier.
String | graph() - Returns a graph describing the tree.
int | graphType() - Returns the type of graph this classifier represents.
Enumeration | listOptions() - Returns an enumeration describing the available options.
static void | main(String[] argv) - Main method for testing this class.
int | measureNumLeaves() - Returns the number of leaves in the tree.
int | measureTreeSize() - Returns the size of the tree.
String | minNumInstancesTipText() - Returns the tip text for this property.
String | modelTypeTipText() - Returns the tip text for this property.
String | numBoostingIterationsTipText() - Returns the tip text for this property.
void | setBinSplit(boolean c) - Set the value of binarySplits.
void | setErrorOnProbabilities(boolean c) - Set the value of errorOnProbabilities.
void | setMinNumInstances(int c) - Set the value of minNumInstances.
void | setModelType(SelectedTag newMethod) - Set the Functional Tree type.
void | setNumBoostingIterations(int c) - Set the value of numBoostingIterations.
void | setOptions(String[] options) - Parses a given list of options.
void | setUseAIC(boolean c) - Set the value of useAIC.
void | setWeightTrimBeta(double n) - Set the value of weightTrimBeta.
String | toString() - Returns a description of the classifier.
String | useAICTipText() - Returns the tip text for this property.
String | weightTrimBetaTipText() - Returns the tip text for this property.
Methods inherited from class weka.classifiers.Classifier: debugTipText, forName, getDebug, makeCopies, makeCopy, setDebug
public static final int MODEL_FT
public static final int MODEL_FTLeaves
public static final int MODEL_FTInner
public static final Tag[] TAGS_MODEL
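As a sketch of how these constants are typically used, the tree type can be selected by wrapping one of them in a SelectedTag together with TAGS_MODEL:

```java
import weka.classifiers.trees.FT;
import weka.core.SelectedTag;

// Restrict the tree to functional models at the leaves only.
FT ft = new FT();
ft.setModelType(new SelectedTag(FT.MODEL_FTLeaves, FT.TAGS_MODEL));
```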
public Capabilities getCapabilities()
Returns default capabilities of the classifier.
Specified by: getCapabilities in interface CapabilitiesHandler
Overrides: getCapabilities in class Classifier
Returns: the Capabilities of this classifier
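A quick way to check which attribute and class types FT accepts is to print the returned Capabilities object; this minimal sketch reuses the hypothetical ft instance from the earlier examples:

```java
// Lists the attribute types, class types and other constraints FT can handle.
System.out.println(ft.getCapabilities());
```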
public void buildClassifier(Instances data) throws Exception
Builds the classifier.
Specified by: buildClassifier in class Classifier
Parameters: data - the data to train with
Throws: Exception - if the classifier can't be built successfully

public double[] distributionForInstance(Instance instance) throws Exception
Returns class probabilities for an instance.
Overrides: distributionForInstance in class Classifier
Parameters: instance - the instance to compute the distribution for
Throws: Exception - if the distribution can't be computed successfully

public double classifyInstance(Instance instance) throws Exception
Classifies an instance.
Overrides: classifyInstance in class Classifier
Parameters: instance - the instance to classify
Throws: Exception - if the instance can't be classified successfully

public String toString()
Returns a description of the classifier.
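Assuming a trained FT model ft and a test set test (both hypothetical names carried over from the sketches above), prediction and class-probability estimation look like this:

```java
import weka.core.Instance;

Instance first = test.instance(0);

// Index of the predicted class value.
double predicted = ft.classifyInstance(first);

// Per-class probability estimates for the same instance.
double[] dist = ft.distributionForInstance(first);

System.out.println("Predicted: " + test.classAttribute().value((int) predicted));
System.out.println("Distribution: " + java.util.Arrays.toString(dist));
```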
public Enumeration listOptions()
Returns an enumeration describing the available options.
Specified by: listOptions in interface OptionHandler
Overrides: listOptions in class Classifier
public void setOptions(String[] options) throws Exception
Parses a given list of options. Valid options are:

-B Binary splits (convert nominal attributes to binary ones)
-P Use error on probabilities instead of misclassification error for stopping criterion of LogitBoost.
-I <numIterations> Set fixed number of iterations for LogitBoost (instead of using cross-validation)
-F <modelType> Set Functional Tree type to be generated: 0 for FT, 1 for FTLeaves and 2 for FTInner
-M <numInstances> Set minimum number of instances at which a node can be split (default 15)
-W <beta> Set beta for weight trimming for LogitBoost. Set to 0 (default) for no weight trimming.
-A The AIC is used to choose the best iteration.

Specified by: setOptions in interface OptionHandler
Overrides: setOptions in class Classifier
Parameters: options - the list of options as an array of strings
Throws: Exception - if an option is not supported

public String[] getOptions()
Gets the current settings of the Classifier.
Specified by: getOptions in interface OptionHandler
Overrides: getOptions in class Classifier
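getOptions returns the configuration in the same flag format that setOptions accepts, which makes it easy to log or copy a setup. A small sketch, again using the hypothetical ft instance:

```java
// Print the current settings as a single command-line style string.
System.out.println(String.join(" ", ft.getOptions()));
```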
public double getWeightTrimBeta()
Get the value of weightTrimBeta.

public boolean getUseAIC()
Get the value of useAIC.

public void setWeightTrimBeta(double n)
Set the value of weightTrimBeta.

public void setUseAIC(boolean c)
Set the value of useAIC.
Parameters: c - Value to assign to useAIC.

public boolean getBinSplit()
Get the value of binarySplits.

public boolean getErrorOnProbabilities()
Get the value of errorOnProbabilities.

public int getNumBoostingIterations()
Get the value of numBoostingIterations.

public SelectedTag getModelType()
Get the type of functional tree model being used.

public void setModelType(SelectedTag newMethod)
Set the Functional Tree type.
Parameters: newMethod - Value corresponding to tree type.

public int getMinNumInstances()
Get the value of minNumInstances.

public void setBinSplit(boolean c)
Set the value of binarySplits.
Parameters: c - Value to assign to binarySplits.

public void setErrorOnProbabilities(boolean c)
Set the value of errorOnProbabilities.
Parameters: c - Value to assign to errorOnProbabilities.

public void setNumBoostingIterations(int c)
Set the value of numBoostingIterations.
Parameters: c - Value to assign to numBoostingIterations.

public void setMinNumInstances(int c)
Set the value of minNumInstances.
Parameters: c - Value to assign to minNumInstances.

public int graphType()
Returns the type of graph this classifier represents.
Specified by: graphType in interface Drawable

public int measureTreeSize()
Returns the size of the tree.

public int measureNumLeaves()
Returns the number of leaves in the tree.
public Enumeration enumerateMeasures()
Returns an enumeration of the additional measure names.
Specified by: enumerateMeasures in interface AdditionalMeasureProducer

public double getMeasure(String additionalMeasureName)
Returns the value of the named measure.
Specified by: getMeasure in interface AdditionalMeasureProducer
Parameters: additionalMeasureName - the name of the measure to query for its value
Throws: IllegalArgumentException - if the named measure is not supported

public String globalInfo()
Returns a string describing the classifier.
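The additional measures can be queried by name through getMeasure. In the sketch below the measure names mirror the method names, which is the usual WEKA convention, but treat the exact strings as an assumption:

```java
// Tree size and number of leaves of the trained functional tree.
System.out.println("Tree size:  " + ft.getMeasure("measureTreeSize"));
System.out.println("Num leaves: " + ft.getMeasure("measureNumLeaves"));
```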
public TechnicalInformation getTechnicalInformation()
Returns an instance of a TechnicalInformation object, containing detailed information about the technical background of this class, e.g., paper reference or book this class is based on.
Specified by: getTechnicalInformation in interface TechnicalInformationHandler
public String modelTypeTipText()
public String binSplitTipText()
public String errorOnProbabilitiesTipText()
public String numBoostingIterationsTipText()
public String minNumInstancesTipText()
public String weightTrimBetaTipText()
public String useAICTipText()
public String getRevision()
Returns the revision string.
Specified by: getRevision in interface RevisionHandler
Overrides: getRevision in class Classifier
public static void main(String[] argv)
Main method for testing this class.
Parameters: argv - the commandline options
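FT can also be run from the command line through this main method. A typical invocation using WEKA's standard evaluation flags (the ARFF file names are placeholders) might look like:

```
java weka.classifiers.trees.FT -t train.arff -T test.arff -F 2
```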