infotheo (R Documentation)
Information-theoretic meta-features are particularly appropriate for describing discrete (categorical) attributes; they can also be applied to continuous attributes, but a discretization step is then required.
infotheo(...)

## Default S3 method:
infotheo(x, y, features = "all", summary = c("mean", "sd"),
         transform = TRUE, ...)

## S3 method for class 'formula':
infotheo(formula, data, features = "all", summary = c("mean", "sd"),
         transform = TRUE, ...)
...        Further arguments passed to the summarization functions.

x          A data.frame containing only the input attributes.

y          A factor response vector with one label for each row/component of x.

features   A list of feature names or "all" to include all of them. The
           details section describes the valid values for this group.
           (Default: "all")

summary    A list of summarization functions, or empty to return all values.
           See the post.processing method for more information.
           (Default: c("mean", "sd"))

transform  A logical value indicating whether the numeric attributes should
           be transformed (discretized). If FALSE, they are ignored and only
           the categorical attributes are used. (Default: TRUE)

formula    A formula to define the class column.

data       A data.frame containing the input attributes and the class.
The following features are allowed for this method:

- Attributes concentration ("attrConc"): the Goodman and Kruskal's tau measure, also known as the concentration coefficient, computed for each pair of attributes (multi-valued).
- Attributes entropy: a measure of the randomness of each attribute in the dataset (multi-valued).
- Class concentration: similar to "attrConc", but computed between each attribute and the class (multi-valued).
- Class entropy ("classEnt"): how much information is necessary to specify the class in the dataset.
- Equivalent number of attributes: the number of attributes needed to optimally solve the classification task using the dataset.
- Joint entropy ("jointEnt"): the total entropy of each attribute and the class (multi-valued).
- Mutual information: the common information shared between each attribute and the class in the dataset (multi-valued).
- Noise ratio: the amount of irrelevant information contained in the dataset.
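Several of the measures above derive from Shannon entropy. A minimal Python sketch (illustrative only; the package itself is written in R, and the helper and variable names here are hypothetical) of class entropy, joint entropy, mutual information, and the derived measures for a single attribute, following the common definitions from the meta-learning literature (with one attribute, the averages over attributes drop out):

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy (in bits) of a sequence of discrete values."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

# Toy categorical dataset: one attribute x and a class y.
x = ["a", "a", "b", "b", "c", "c"]
y = ["pos", "pos", "neg", "neg", "pos", "neg"]

class_ent = entropy(y)                 # class entropy H(Y)
attr_ent  = entropy(x)                 # attribute entropy H(X)
joint_ent = entropy(list(zip(x, y)))   # joint entropy H(X, Y)

# Mutual information I(X; Y) = H(X) + H(Y) - H(X, Y)
mut_inf = attr_ent + class_ent - joint_ent

# With a single attribute, the derived measures reduce to:
eq_num_attr = class_ent / mut_inf             # equivalent number of attributes
noise_ratio = (attr_ent - mut_inf) / mut_inf  # noise ratio
```

For this toy data, H(Y) = 1 bit (two balanced classes) and I(X; Y) = 2/3 bit, so roughly 1.5 attributes of this quality would be needed to fully determine the class.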
This method uses the unsupervised data discretization procedure provided by the discretize function; its default values are used when transform=TRUE.
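As an illustration of unsupervised discretization, equal-frequency binning is one common scheme and can be sketched as follows (a hypothetical helper for illustration; the exact procedure used by the discretize function may differ):

```python
import math

def equal_freq_bins(values, n_bins=3):
    """Unsupervised equal-frequency discretization: sort the values and
    cut them into n_bins groups of (roughly) equal size, returning an
    integer bin label for each original value."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    size = math.ceil(len(values) / n_bins)
    labels = [0] * len(values)
    for rank, idx in enumerate(order):
        labels[idx] = rank // size
    return labels

# Six continuous values cut into three bins of two values each.
bins = equal_freq_bins([0.2, 1.5, 0.7, 3.1, 2.2, 0.9], n_bins=3)
```

After such a step, every numeric column becomes categorical, so the entropy- and concentration-based measures above apply uniformly to all attributes.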
A list named by the requested meta-features.
Donald Michie, David J. Spiegelhalter, Charles C. Taylor, and John Campbell. Machine Learning, Neural and Statistical Classification, volume 37. Ellis Horwood, Upper Saddle River, 1994.
Alexandros Kalousis and Melanie Hilario. Model selection via meta-learning: a comparative study. International Journal on Artificial Intelligence Tools, volume 10, pages 525 - 554, 2001.
Ciro Castiello, Giovanna Castellano, and Anna Maria Fanelli. Meta-data: Characterization of input features for meta-learning. In 2nd International Conference on Modeling Decisions for Artificial Intelligence (MDAI), pages 457 - 468, 2005.
Other meta-features: clustering(), complexity(), concept(), general(), itemset(), landmarking(), model.based(), relative(), statistical()
## Extract all meta-features
infotheo(Species ~ ., iris)

## Extract some meta-features
infotheo(iris[1:4], iris[5], c("classEnt", "jointEnt"))

## Extract all meta-features without summarizing the results
infotheo(Species ~ ., iris, summary=c())

## Use other summarization functions
infotheo(Species ~ ., iris, summary=c("min", "median", "max"))

## Do not transform the data (use only the categorical attributes)
infotheo(Species ~ ., iris, transform=FALSE)