decision_tree: Decision Tree
Usage

decision_tree(object, ...)

## S3 method for class 'formula'
decision_tree(formula, data, maxdepth = 100L, ...)

## Default S3 method:
decision_tree(x, y, maxdepth = 100L, ...)

treeheight(node)

treedepth(node)

is.decisiontree(object)
Arguments

object: An R object.
...: Optional arguments.
formula: A model formula.
data: A data frame containing the variables in the model.
maxdepth: The maximum depth of the resulting tree. The default is 100L; once this depth is reached, the tree stops growing.
x: A matrix or data frame with feature values.
y: A factor variable with the categorical values (class labels) for the observations in x.
Details

A decision tree is a model that assigns a feature from x to each internal node of the tree, called a split node, on the basis of a splitting criterion (e.g. Gini impurity or information gain), and uses a computed split value of that feature at each node to separate the observations further into left and right subnodes. At the bottom of the tree are the leaf nodes, each of which holds a resulting level of y.
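The splitting criterion mentioned above can be made concrete with a short, self-contained sketch. The helper functions gini_impurity() and split_gini() below are illustrative only and are not part of this package.

# Gini impurity of a set of class labels: 1 - sum(p_k^2)
gini_impurity <- function(labels) {
  p <- prop.table(table(labels))
  1 - sum(p^2)
}

# Weighted Gini impurity of splitting the labels y by a logical condition on one feature
split_gini <- function(y, condition) {
  (sum(condition)  / length(y)) * gini_impurity(y[condition]) +
  (sum(!condition) / length(y)) * gini_impurity(y[!condition])
}

# A perfectly separating split has weighted impurity 0
split_gini(factor(c("No", "No", "Yes", "Yes", "Yes")), c(TRUE, TRUE, FALSE, FALSE, FALSE))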
treeheight() computes the height of a tree, that is, the number of nodes on the path from the root node to its deepest leaf node.

treedepth() computes the depth of a tree, that is, the number of edges (arcs) on the path from the root node to its deepest leaf node.
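By these definitions the height of a tree is always one more than its depth, since a path with n nodes has n - 1 edges. The toy recursion below, written over plain nested lists rather than the internal decisiontree structure, makes that distinction explicit; the field names left and right are illustrative only.

# A toy binary tree: a root node with one child that is a leaf
toy <- list(left = list(left = NULL, right = NULL), right = NULL)

# Height: count nodes on the longest root-to-leaf path
toy_height <- function(node) {
  if (is.null(node)) return(0L)
  1L + max(toy_height(node$left), toy_height(node$right))
}

# Depth: count edges on that path (one fewer than the nodes)
toy_depth <- function(node) toy_height(node) - 1L

toy_height(toy)  # 2
toy_depth(toy)   # 1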
Value

A list of class decisiontree with split nodes and leaf nodes.
See Also

Other Machine Learning: cross_validation_split(), k_nearest_neighbors(), moving_average(), naive_bayes(), naive_forecast(), predict.decisiontree(), predict.kmeans(), predict.naivebayes()
Examples

df <- data.frame(
  Outlook     = factor(c("Sunny", "Sunny", "Overcast", "Rain", "Rain", "Rain", "Overcast",
                         "Sunny", "Sunny", "Rain", "Sunny", "Overcast", "Overcast", "Rain")),
  Temperature = factor(c("Hot", "Hot", "Hot", "Mild", "Cool", "Cool", "Cool",
                         "Mild", "Cool", "Mild", "Mild", "Mild", "Hot", "Mild")),
  Humidity    = factor(c("High", "High", "High", "High", "Normal", "Normal", "Normal",
                         "High", "Normal", "Normal", "Normal", "High", "Normal", "High")),
  Wind        = factor(c("Weak", "Strong", "Weak", "Weak", "Weak", "Strong", "Strong",
                         "Weak", "Weak", "Weak", "Strong", "Strong", "Weak", "Strong")),
  PlayTennis  = factor(c("No", "No", "Yes", "Yes", "Yes", "No", "Yes",
                         "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"))
)
x <- df[, -5L]
y <- df[[5L]]

# Build up the decision tree
tree <- decision_tree(as.formula(PlayTennis ~ .), data = df)

# Compute height and depth of the tree
treeheight(tree)
treedepth(tree)

# Predict labels of the features
yhat <- predict(tree, x)
accuracy(y, yhat)
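# The default method (see Usage) accepts the feature data and the class labels directly;
# the maxdepth value used here is only illustrative.
tree2 <- decision_tree(x, y, maxdepth = 3L)
is.decisiontree(tree2)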