Description Usage Arguments Details Value References Examples
General function that groups the inference for the Kappa statistic. It computes: the Kappa statistic, a coefficient of agreement for nominal scales that measures the proportion of beyond-chance agreement relative to the expected disagreement; the standard deviation, which is the square root of the Kappa variance computed with the Delta method; and the confidence interval for the Kappa statistic, at the 95% confidence level by default.
GKappa(object, alpha)
object    a coMa object (confusion matrix object)
alpha     significance level
Assuming a multinomial sampling model, Cohen's Kappa is the maximum likelihood estimate of Kappa. The standard deviation is calculated from the variance of Kappa according to Congalton.
The variance of Kappa is
\frac{1}{n}\left( \frac{\theta_1(1-\theta_1)}{(1-\theta_2)^2} + \frac{2(1-\theta_1)(2\theta_1\theta_2-\theta_3)}{(1-\theta_2)^3} + \frac{(1-\theta_1)^2(\theta_4-4\theta_2^2)}{(1-\theta_2)^4} \right)
where n is the sample size,
\theta_1 = \frac{1}{n}\sum_{i=1}^{k} n_{ii},
\theta_2 = \frac{1}{n^2}\sum_{i=1}^{k} n_{i+}n_{+i},
\theta_3 = \frac{1}{n^2}\sum_{i=1}^{k} n_{ii}(n_{i+} + n_{+i}),
and
\theta_4 = \frac{1}{n^3}\sum_{i=1}^{k} \sum_{j=1}^{k} n_{ij}(n_{j+} + n_{+i})^2.
Confidence intervals can be computed using the approximate large-sample variance and the fact that the statistic is asymptotically normally distributed.
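As an illustration of these formulas only (not the package's internal code), the following base R sketch computes Kappa, its standard deviation and the normal-approximation confidence interval directly from a plain k x k confusion matrix; the function name kappa_inference and the example matrix are hypothetical.

## Minimal sketch (base R): Kappa, its large-sample variance from the formula
## above, and the normal-approximation confidence interval.
kappa_inference <- function(cm, alpha = 0.05) {
  n       <- sum(cm)                       # sample size
  row_tot <- rowSums(cm)                   # n_{i+}
  col_tot <- colSums(cm)                   # n_{+i}
  th1 <- sum(diag(cm)) / n                 # theta_1: observed agreement
  th2 <- sum(row_tot * col_tot) / n^2      # theta_2: chance agreement
  th3 <- sum(diag(cm) * (row_tot + col_tot)) / n^2
  th4 <- sum(cm * outer(col_tot, row_tot, "+")^2) / n^3
  kappa <- (th1 - th2) / (1 - th2)         # maximum likelihood estimate of Kappa
  var_k <- (th1 * (1 - th1) / (1 - th2)^2 +
            2 * (1 - th1) * (2 * th1 * th2 - th3) / (1 - th2)^3 +
            (1 - th1)^2 * (th4 - 4 * th2^2) / (1 - th2)^4) / n
  z <- qnorm(1 - alpha / 2)                # normal quantile for the CI
  list(kappa = kappa,
       sd    = sqrt(var_k),
       ci    = kappa + c(-1, 1) * z * sqrt(var_k))
}

## Example with a hypothetical 3-class confusion matrix and a 95% confidence interval
cm <- matrix(c(65,  4,  2,
                6, 81,  5,
                3,  7, 90), nrow = 3, byrow = TRUE)
kappa_inference(cm, alpha = 0.05)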
GKappa returns a list with the following elements: the Kappa statistic, its standard deviation, and the confidence interval.
Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37-46.
Rosenfield, G. H., & Fitzpatrick-Lins, K. (1986). A coefficient of agreement as a measure of thematic classification accuracy. Photogrammetric Engineering and Remote Sensing, 52, 223-227.
Congalton, R. G., & Green, K. (2009). Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. CRC Press, Taylor & Francis Group.