The function proceeds in three steps: 1) standardize the data to force mean zero and variance unity; 2) kernel regress x on y, together with a matrix of control variables, using the option 'gradients = TRUE'; and finally 3) compute the absolute values of the gradients.
abs_stdapdC(x, y, ctrl)
x     vector of data on the dependent variable
y     data on the regressors, which can be a matrix
ctrl  data matrix on the control variable(s), kept beyond the causal path issues
The first argument is assumed to be the dependent variable. If
abs_stdapdC(x, y, ctrl) is used, you are regressing x on y (not the usual y
on x). The regressors can be a matrix with two or more columns. Missing values
are suitably ignored by the standardization.
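The three steps above can be sketched as follows. This is a minimal sketch, not the package's exact internals: it assumes the np package supplies the kernel regression (npregbw/npreg do accept a gradients option), the names x, y, ctrl follow the usage line, and the bandwidth choice is illustrative.

```r
## Minimal sketch, assuming the np package; illustrative, not the exact implementation.
library(np)

sketch_abs_stdapdC <- function(x, y, ctrl) {
  stx <- scale(x)                            # step 1: mean zero, variance unity
  sty <- apply(as.matrix(y), 2, scale)
  stc <- apply(as.matrix(ctrl), 2, scale)
  xdat <- data.frame(sty, stc)               # regressors: y plus control variables
  bw  <- npregbw(xdat = xdat, ydat = as.vector(stx))
  mod <- npreg(bws = bw, gradients = TRUE)   # step 2: kernel regression of x on (y, ctrl)
  ## step 3: absolute gradients; columns for y come first in xdat
  abs(gradients(mod)[, seq_len(ncol(as.matrix(y))), drop = FALSE])
}
```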
Absolute values of the kernel regression gradients are returned after standardizing the data on both sides, so that the magnitudes of the amorphous partial derivatives (apd's) are comparable between the regression of x on y, on the one hand, and the regression of y on x, on the other.
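A hypothetical call with simulated data (the seed and sample ranges are illustrative only; the function itself comes from the package providing abs_stdapdC):

```r
## Illustrative usage with simulated data.
set.seed(330)
x <- sample(20:50)       # dependent variable
y <- sample(20:50)       # regressor
z <- sample(21:51)       # control variable
abs_stdapdC(x, y, ctrl = z)   # absolute apd's of x regressed on y, controlling for z
```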
Prof. H. D. Vinod, Economics Dept., Fordham University, NY