# Test Linear Hypothesis

### Description

Tests a linear hypothesis on the coefficients of a system of equations by an F test or Wald test.

### Usage

```
## S3 method for class 'systemfit'
linearHypothesis( model, hypothesis.matrix, rhs = NULL,
                  test = c( "FT", "F", "Chisq" ), vcov. = NULL, ... )
```

### Arguments

  * `model`: a fitted object of class `systemfit`.
  * `hypothesis.matrix`: matrix (or vector) giving linear combinations of coefficients by rows, or a character vector giving the hypothesis in symbolic form (see the documentation of `linearHypothesis` in package "car" for details).
  * `rhs`: optional right-hand-side vector for the hypothesis, with as many entries as rows in the hypothesis matrix; if omitted, it defaults to a vector of zeros.
  * `test`: character string, `"FT"`, `"F"`, or `"Chisq"`, specifying whether to compute Theil's finite-sample F test (with approximate F distribution), the finite-sample Wald test (with approximate F distribution), or the large-sample Wald test (with asymptotic chi-squared distribution).
  * `vcov.`: a function for estimating the covariance matrix of the regression coefficients, or an estimated covariance matrix (function `vcov` is used by default).
  * `...`: further arguments passed to `linearHypothesis.default` (package "car").
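To make the `hypothesis.matrix`/`rhs` convention concrete, here is a small numeric sketch (in Python/numpy, with an entirely hypothetical coefficient vector) of how a single symbolic restriction such as "the price coefficients of two equations are equal" maps to one row of the hypothesis matrix R and a zero right-hand side:

```python
import numpy as np

# Hypothetical stacked coefficient vector for a two-equation system:
# b = (intercept_1, price_1, intercept_2, price_2)
b_hat = np.array([2.0, -0.5, 1.0, -0.5])

# The restriction "price_1 - price_2 = 0" as one row of the
# hypothesis matrix R, with right-hand side q = 0 (the default rhs):
R = np.array([[0.0, 1.0, 0.0, -1.0]])
q = np.array([0.0])

# R b - q is the discrepancy vector from which all three test
# statistics below are built.
discrepancy = R @ b_hat - q
```

Here the two price coefficients are equal by construction, so the discrepancy is exactly zero and any of the test statistics would be zero as well.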

### Details

Theil's F statistic for systems of equations is

F = \frac{ ( R \hat{b} - q )' ( R ( X' ( Σ \otimes I )^{-1} X )^{-1} R' )^{-1} ( R \hat{b} - q ) / j }{ \hat{e}' ( Σ \otimes I )^{-1} \hat{e} / ( M \cdot T - K ) }

where j is the number of restrictions, M is the number of equations, T is the number of observations per equation, K is the total number of estimated coefficients, and Σ is the estimated residual covariance matrix. Under the null hypothesis, F has an approximate F distribution with j and M \cdot T - K degrees of freedom (Theil, 1971, p. 314).

The F statistic for a Wald test is

F = \frac{ ( R \hat{b} - q )' ( R \, \widehat{Cov} [ \hat{b} ] R' )^{-1} ( R \hat{b} - q ) }{ j }

Under the null hypothesis, F has an approximate F distribution with j and M \cdot T - K degrees of freedom (Greene, 2003, p. 346).

The χ^2 statistic for a Wald test is

W = ( R \hat{b} - q )' ( R \widehat{Cov} [ \hat{b} ] R' )^{-1} ( R \hat{b} - q )

Asymptotically, W has a χ^2 distribution with j degrees of freedom under the null hypothesis (Greene, 2003, p. 347).

### Value

An object of class `anova`, which contains the residual degrees of freedom of the model, the difference in degrees of freedom, the test statistic (F or Wald/chi-squared), and the corresponding p value. See the documentation of `linearHypothesis` in package "car".

### References

Greene, W. H. (2003) Econometric Analysis, Fifth Edition, Prentice Hall.

Theil, Henri (1971) Principles of Econometrics, John Wiley & Sons, New York.