objective_function_SL    R Documentation

View source: R/objective_function_SL.R
Description

This function calculates the objective function for a structural learning task. It computes several components: the total variation (TV) distance between the original and generated datasets (X vs. X_prime and Y vs. Y_prime), the changes in the regression coefficients (beta0 and beta1), the R² difference for each category in Z, and the inter-cluster centroid distances. The loss combines these components using the penalty parameters lambda1, lambda2, lambda3, and lambda4.
Usage

objective_function_SL(
X_prime,
Y_prime,
X,
Y,
Z,
p,
beta0_orig,
beta1_orig,
lambda1,
lambda2,
lambda3,
lambda4,
R2_orig,
printc = FALSE
)
Arguments

X_prime
    A numeric vector representing the generated values of X.
Y_prime
    A numeric vector representing the generated values of Y.
X
    A numeric vector representing the original values of X.
Y
    A numeric vector representing the original values of Y.
Z
    A categorical vector representing the category of each observation.
p
    A numeric vector representing the target correlation values for each category in Z.
beta0_orig
    The original intercept value of the regression model.
beta1_orig
    The original slope value of the regression model.
lambda1
    Penalty parameter controlling the relative weight of one of the loss components.
lambda2
    Penalty parameter controlling the relative weight of one of the loss components.
lambda3
    Penalty parameter controlling the relative weight of one of the loss components.
lambda4
    Penalty parameter controlling the relative weight of one of the loss components.
R2_orig
    The original R² value of the model (not used directly in the calculation; kept for reference).
printc
    A boolean flag controlling the printing of intermediate values for debugging.
Value

A numeric value representing the total loss calculated by the objective function.
Examples

# Test Case 1: Simple random data with a normal distribution
set.seed(123)
X <- rnorm(100)
Y <- rnorm(100)
Z <- sample(1:3, 100, replace = TRUE)
X_prime <- rnorm(100)
Y_prime <- rnorm(100)
p <- c(0.5, 0.7, 0.9)
beta0_orig <- 0
beta1_orig <- 1
lambda1 <- lambda2 <- lambda3 <- lambda4 <- 1
R2_orig <- 0.9
loss <- objective_function_SL(X_prime, Y_prime, X, Y, Z, p, beta0_orig, beta1_orig,
                              lambda1, lambda2, lambda3, lambda4, R2_orig)
print(loss)
# Test Case 2: Skewed data with four categories and a larger lambda4 penalty
X <- rexp(100)
Y <- rpois(100, lambda = 2)
Z <- sample(1:4, 100, replace = TRUE)
X_prime <- rnorm(100)
Y_prime <- rnorm(100)
p <- c(0.3, 0.5, 0.8, 0.6)
beta0_orig <- 0.5
beta1_orig <- 1.5
lambda1 <- lambda2 <- lambda3 <- 0.5
lambda4 <- 2
R2_orig <- 0.85
loss <- objective_function_SL(X_prime, Y_prime, X, Y, Z, p, beta0_orig, beta1_orig,
                              lambda1, lambda2, lambda3, lambda4, R2_orig)
print(loss)
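# In practice (an assumption, not part of the documented interface), the
# original coefficients and R² passed to objective_function_SL() would
# typically come from a regression fitted to the original data, e.g.:
fit_orig   <- lm(Y ~ X)
beta0_orig <- unname(coef(fit_orig)[1])
beta1_orig <- unname(coef(fit_orig)[2])
R2_orig    <- summary(fit_orig)$r.squared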