classification_report {MLCOPULA}	R Documentation

Calculates classification performance metrics.

Description

Computes the confusion matrix and several classification performance metrics (precision, recall, F1-score, accuracy, and mutual information) from vectors of true and predicted labels.

Usage

classification_report(
y_true,
y_pred
)

Arguments

y_true

A vector with the true labels.

y_pred

A vector with the predicted labels.

Value

Returns a list with the following entries:

metrics

A table with the precision, recall, and F1-score for each class.

confusion_matrix

The confusion matrix.

accuracy

The overall accuracy, i.e., the proportion of correctly classified observations.

mutual_information

The mutual information between the true and the predicted labels, a measure of how much the predictions reveal about the true classes.

Examples

# Example 1
X <- iris[, 1:4]
y <- iris$Species
model <- copulaClassifier(X = X, y = y, copula = "frank",
                          distribution = "kernel", graph_model = "tree")
y_pred <- copulaPredict(X = X, model = model)
classification_report(y_true = y, y_pred = y_pred$class)

# Example 2
X <- iris[, 1:4]
y <- iris$Species
model <- copulaClassifier(X = X, y = y, copula = c("frank", "clayton"),
                          distribution = "kernel", graph_model = "chain")
y_pred <- copulaPredict(X = X, model = model)
classification_report(y_true = y, y_pred = y_pred$class)
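The entries of the returned list can be accessed by name. The following sketch (assuming the MLCOPULA package is installed, and mirroring Example 1 above) shows how each component listed under Value is extracted:

```r
library(MLCOPULA)

# Fit a copula-based classifier on iris, as in Example 1
X <- iris[, 1:4]
y <- iris$Species
model <- copulaClassifier(X = X, y = y, copula = "frank",
                          distribution = "kernel", graph_model = "tree")
y_pred <- copulaPredict(X = X, model = model)

# Compute the report and inspect its components by name
report <- classification_report(y_true = y, y_pred = y_pred$class)
report$metrics             # per-class precision, recall, and F1-score
report$confusion_matrix    # true classes vs. predicted classes
report$accuracy            # proportion of correctly classified observations
report$mutual_information  # mutual information between true and predicted labels
```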


[Package MLCOPULA version 1.0.1 Index]