knitr::opts_chunk$set(echo = FALSE)

Analysis of Variance

Likelihood

$$L(\theta) = f(y | \theta)$$

Maximizing the Likelihood

$$\hat{\theta}_{ML} = \text{argmax}_{\theta} \ L(\theta)$$
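
As a small numerical illustration (not part of the original slides; the simulated data and starting values below are assumptions), the maximization can be carried out with optim() for an i.i.d. normal sample:

# numerical maximization of a likelihood (illustrative sketch, not from the original slides)
set.seed(321)
y_sim <- rnorm(100, mean = 5, sd = 2)     # simulated data
neg_log_lik <- function(theta) {
  mu <- theta[1]
  sigma <- exp(theta[2])                  # log-parameterization keeps sigma positive
  -sum(dnorm(y_sim, mean = mu, sd = sigma, log = TRUE))
}
opt <- optim(par = c(0, 0), fn = neg_log_lik)
c(mu_hat = opt$par[1], sigma_hat = exp(opt$par[2]))  # close to mean(y_sim) and the ML standard deviation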

Example

$$y = Xb + e$$

$$f_Y(y | b, \sigma^2) = (2\pi \sigma^2)^{-n/2} \exp\left(-{1\over 2\sigma^2} (y - Xb)^T\ (y - Xb)\right)$$
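
This density can be checked numerically. In the short sketch below (all values are illustrative assumptions, not from the original slides), evaluating the formula directly agrees with the product of the univariate normal densities of the individual observations.

# numerical check of the density formula (illustrative values)
set.seed(42)
n <- 10
X <- cbind(1, rnorm(n))                       # design matrix: intercept and one covariate
b <- c(2, 0.5)
sigma2 <- 1.5
y <- X %*% b + rnorm(n, sd = sqrt(sigma2))
r <- y - X %*% b                              # residuals y - Xb
f_formula <- (2 * pi * sigma2)^(-n / 2) * exp(-crossprod(r) / (2 * sigma2))
f_check   <- prod(dnorm(y, mean = X %*% b, sd = sqrt(sigma2)))
c(f_formula, f_check)                         # both give the same density value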

Multivariate Normal Distribution

# the following code is copied from http://www.ejwagenmakers.com/misc/Plotting_3d_in_R.pdf
mu1<-0 # setting the expected value of x1
mu2<-0 # setting the expected value of x2
s11<-10 # setting the variance of x1
s12<-15 # setting the covariance between x1 and x2 (not used in the density function below)
s22<-10 # setting the variance of x2
rho<-0.5 # setting the correlation coefficient between x1 and x2
x1<-seq(-10,10,length=41) # generating the vector series x1
x2<-x1 # copying x1 to x2

f<-function(x1,x2)
{
  term1<-1/(2*pi*sqrt(s11*s22*(1-rho^2)))
  term2<--1/(2*(1-rho^2))
  term3<-(x1-mu1)^2/s11
  term4<-(x2-mu2)^2/s22
  term5<-2*rho*((x1-mu1)*(x2-mu2))/(sqrt(s11)*sqrt(s22)) # cross term of the quadratic form
  term1*exp(term2*(term3+term4-term5)) # bivariate normal density with correlation rho
} # setting up the function of the multivariate normal density
#
z<-outer(x1,x2,f) # calculating the density values
#
persp(x1, x2, z,
      main="Two dimensional Normal Distribution",
#      sub=expression(italic(f)~(bold(x)) ==
#            frac(1, 2~pi~sqrt(sigma[11]~sigma[22]~(1-rho^2))) ~
#            phantom(0)^bold(.)~exp~bgroup("{",
#              list(-frac(1, 2(1-rho^2)),
#                   bgroup("[", frac((x[1]~-~mu[1])^2, sigma[11]) ~-~
#                          2~rho~frac(x[1]~-~mu[1], sqrt(sigma[11]))~frac(x[2]~-~mu[2], sqrt(sigma[22])) ~+~
#                          frac((x[2]~-~mu[2])^2, sigma[22]), "]")), "}")),
      col="lightgreen",
      theta=30, phi=20,
      r=50,
      d=0.1,
      expand=0.5,
      ltheta=90, lphi=180,
      shade=0.75,
      ticktype="detailed",
      nticks=5) # produces the 3-D plot
#
mtext(expression(list(mu[1]==0,
                      mu[2]==0,
                      sigma[11]==10,
                      sigma[22]==10,
                      sigma[12]==15,
                      rho==0.5)),
      side=3) # adding a text line to the graph

Parameters

$$l(\theta) = \log(L(\theta))$$

ML Estimation of $b$

$$l(b, \sigma^2) = \log(L(b, \sigma^2))$$ $$= -{n\over 2}\log(2\pi) - {n\over 2}\log(\sigma^2) - {1\over 2\sigma^2} (y - Xb)^T\ (y - Xb)$$

\begin{eqnarray} \frac{\partial l(b, \sigma^2)}{\partial b} &=& - {1\over 2\sigma^2} (-(y^TX)^T - X^Ty + 2X^TXb) \nonumber \\ &=& - {1\over 2\sigma^2} (-2X^Ty + 2X^TXb) \label{eq:PartialLogLWrtB} \end{eqnarray}

Determining $\hat{b}_{ML}$

$$- {1\over 2\sigma^2} (-2X^Ty + 2X^TXb) = 0$$

$$X^Ty = X^TX\hat{b}$$

$$\hat{b} = (X^TX)^{-1}X^Ty$$
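
As a quick sketch (simulated data, purely for illustration and not part of the original slides), solving the normal equations reproduces the coefficients returned by lm():

# estimating b via the normal equations (illustrative simulated data)
set.seed(123)
n <- 50
X <- cbind(1, runif(n), runif(n))
y <- X %*% c(1, 2, -0.5) + rnorm(n)
b_hat <- solve(crossprod(X), crossprod(X, y)) # (X^T X)^{-1} X^T y
b_hat
coef(lm(y ~ X - 1))                           # lm() returns the same estimates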

ML Estimation of $\sigma^2$

Taking the partial derivative of $l(b, \sigma^2)$ with respect to $\sigma^2$, setting it to zero and multiplying by $2\hat{\sigma}^2$ yields

$${1\over \hat{\sigma}^2} (y - Xb)^T\ (y - Xb) - n = 0$$

\begin{equation} \hat{\sigma}^2 = {1\over n} (y - Xb)^T\ (y - Xb) \label{eq:MlEstSigma2} \end{equation}

Determining $\hat{\sigma}^2$

\begin{equation} \hat{\sigma}^2 = {1\over n} \sum_{i=1}^n (y_i - x_i^T\hat{b})^2 \label{eq:MlEstSigma2SumResult} \end{equation}
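
Continuing the same kind of simulated example (values assumed for illustration), the ML estimate divides the residual sum of squares by $n$, whereas the unbiased estimator reported by lm() divides by $n - p$:

# ML estimate of sigma^2 versus the unbiased estimate (illustrative simulated data)
set.seed(123)
n <- 50
X <- cbind(1, runif(n), runif(n))
y <- X %*% c(1, 2, -0.5) + rnorm(n)
b_hat <- solve(crossprod(X), crossprod(X, y))
rss <- sum((y - X %*% b_hat)^2)               # residual sum of squares
p <- ncol(X)
c(ml = rss / n,                               # ML estimator from the equation above
  unbiased = rss / (n - p),                   # unbiased estimator
  lm = summary(lm(y ~ X - 1))$sigma^2)        # lm() uses the unbiased version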

Linear Mixed Model

$$var(e) = R = I \sigma_e^2$$ $$var(u) = G$$ $$E\left[e\right] = 0 \text{ and } E\left[u\right] = 0$$ $$E\left[y\right] = Xb \text{ and } var(y) = V$$ $$y \sim \mathcal{N}(Xb, V)$$
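
The slides only state $var(u) = G$ and $var(y) = V$. Assuming the usual mixed-model form $y = Xb + Zu + e$ with a random-effects design matrix $Z$ (an assumption, since $Z$ is not shown above), the covariance matrix of the observations is $V = ZGZ^T + R$. A toy construction:

# toy construction of V = Z G Z^T + R, assuming y = Xb + Zu + e
# (Z, G and sigma_e^2 are made-up illustration values, not from the slides)
n <- 6
Z <- kronecker(diag(2), matrix(1, nrow = 3, ncol = 1)) # 6 records in 2 groups
G <- diag(2) * 4                                       # var(u)
sigma2_e <- 2
R <- diag(n) * sigma2_e                                # var(e) = I * sigma_e^2
V <- Z %*% G %*% t(Z) + R                              # var(y)
V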

ML Estimation

Restricted (Residual) Maximum Likelihood (REML)
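
As a brief illustration of the practical difference between ML and REML, one could fit a mixed model with the lme4 package and toggle its REML argument (lme4 and its sleepstudy example data are assumptions used here for illustration only; they are not referenced in the slides):

# ML versus REML fits of a mixed model (lme4 and sleepstudy are illustrative assumptions)
library(lme4)
fit_ml   <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy, REML = FALSE)
fit_reml <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy, REML = TRUE)
VarCorr(fit_ml)    # ML variance components
VarCorr(fit_reml)  # REML variance components, which account for estimating the fixed effects b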


