This function implements a greedy heuristic algorithm for computing decision reducts (or approximate decision reducts) based on RST.
decision.table 
an object of a "DecisionTable" class representing a decision table. See SF.asDecisionTable. 
attrDescriptions 
a list containing possible values of attributes (columns) in the decision table. 
decisionIdx 
an integer value representing an index of the decision attribute. 
qualityF 
a function used for computation of the quality of attribute subsets. Currently, the following functions are included: X.entropy, X.gini and X.nOfConflicts.
nAttrs 
an integer between 1 and the number of conditional attributes. It indicates
the attribute sample size for the Monte Carlo selection of candidate attributes.
If set to NULL (the default), all conditional attributes are considered in each iteration. 
epsilon 
a numeric value in the interval [0, 1) representing the approximation threshold. It indicates whether to compute standard or approximate reducts. If it equals 0 (the default), a standard decision reduct is computed. 
inconsistentDecisionTable 
logical indicating whether the decision table is suspected
to be inconsistent or not. 
In this implementation, we provide several attribute subset quality measures which can be
passed to the algorithm via the parameter qualityF. These measures
guide the computations in the search for a decision/approximate reduct. They are used to
assess the amount of information gained after the addition of an attribute. For example,
X.entropy corresponds to the information gain measure.
Additionally, this function can use the value of the epsilon
parameter in order to compute
ε-approximate reducts. An ε-approximate reduct can be defined as an
irreducible subset of attributes B, such that:
Quality_{\mathcal{A}}(B) ≥ (1 - ε) Quality_{\mathcal{A}}(A),
where Quality_{\mathcal{A}}(B) is the value of a quality measure (see possible values
of the parameter qualityF) for an attribute subset B in the decision table \mathcal{A},
and ε is a numeric value between 0 and 1 expressing the approximation threshold.
Many monographs provide comprehensive explanations of these topics, for example
(Janusz and Stawicki, 2011; Ślęzak, 2002; Wróblewski, 2001), which are used as the references for this function.
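For illustration, the effect of the epsilon parameter can be sketched as follows (a hedged example assuming the RoughSets package and its bundled hiring.dt data; the 0.05 threshold is an arbitrary choice):

```r
library(RoughSets)

data(RoughSetData)
decision.table <- RoughSetData$hiring.dt

## a standard decision reduct (epsilon = 0)
red.standard <- FS.greedy.heuristic.reduct.RST(decision.table,
                                               qualityF = X.entropy,
                                               epsilon = 0.0)

## a 0.05-approximate reduct: the quality of the selected subset may be
## up to 5% lower than the quality of the full attribute set, so the
## greedy search can typically stop earlier and keep fewer attributes
red.approx <- FS.greedy.heuristic.reduct.RST(decision.table,
                                             qualityF = X.entropy,
                                             epsilon = 0.05)

length(red.approx$reduct)
length(red.standard$reduct)
```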
Finally, this implementation makes it possible to restrain the computational complexity of the greedy
search for decision reducts by setting the value of the parameter nAttrs. If this
parameter is set to a positive integer, the Monte Carlo method of selecting candidate
attributes will be used in each iteration of the algorithm.
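The Monte Carlo variant can be sketched as follows (a hedged example; nAttrs = 2 is an arbitrary sample size for the small hiring.dt data, and the seed is set only to make the randomized attribute sampling repeatable):

```r
library(RoughSets)

data(RoughSetData)
decision.table <- RoughSetData$hiring.dt

## consider only 2 randomly sampled candidate attributes
## in each iteration of the greedy search
set.seed(1)
red.mc <- FS.greedy.heuristic.reduct.RST(decision.table,
                                         qualityF = X.entropy,
                                         nAttrs = 2)
red.mc$reduct
```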
An object of a class "FeatureSubset"
that contains the following components:
reduct
: a list representing a single reduct. In this case, it could be a superreduct or just a subset of features.
type.method
: a string representing the type of method which is "greedy.heuristic"
.
type.task
: a string showing the type of task which is "feature selection"
.
model
: a string representing the type of model. In this case, it is "RST"
which means rough set theory.
epsilon
: the approximation threshold.
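The components listed above can be accessed directly from the returned object, e.g. (a sketch using the hiring.dt data from the package examples):

```r
library(RoughSets)

data(RoughSetData)
decision.table <- RoughSetData$hiring.dt
res <- FS.greedy.heuristic.reduct.RST(decision.table, qualityF = X.entropy)

res$reduct        ## the selected attributes
res$type.method   ## "greedy.heuristic"
res$type.task     ## "feature selection"
res$model         ## "RST"
res$epsilon       ## the approximation threshold (0 by default)
```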
Andrzej Janusz
A. Janusz and S. Stawicki, "Applications of Approximate Reducts to the Feature Selection Problem", Proceedings of the International Conference on Rough Sets and Knowledge Technology (RSKT), vol. 6954, pp. 45-50 (2011).
D. Ślęzak, "Approximate Entropy Reducts", Fundamenta Informaticae, vol. 53, no. 3-4, pp. 365-390 (2002).
J. Wróblewski, "Ensembles of Classifiers Based on Approximate Reducts", Fundamenta Informaticae, vol. 47, no. 3-4, pp. 351-360 (2001).
FS.DAAR.heuristic.RST
and FS.reduct.computation
.
###################################################
## Example 1: Evaluate reduct and generate
## new decision table
###################################################
data(RoughSetData)
decision.table <- RoughSetData$hiring.dt
## evaluate a single reduct
res.1 <- FS.greedy.heuristic.reduct.RST(decision.table, qualityF = X.entropy,
                                        epsilon = 0.0)
## generate a new decision table corresponding to the reduct
new.decTable <- SF.applyDecTable(decision.table, res.1)
