Tools:
The models are:

    Y_{i} = beta_{0} + beta_{1} X_{i1} + ... + beta_{q} X_{iq} + epsilon_{i},   i = 1, ..., n,

where the epsilon_{i} are independent N(0, sigma^2) errors. Be sure to write down your answers. Now use the rotating 3-D plot to view the data. Does this change your guesses?
As we did for the SLR model, we use least squares to fit the MLR model. This means finding estimators of the model parameters beta_{0}, beta_{1}, ..., beta_{q} and sigma^2. The LSEs of the beta_{j}s are those values of b_{0}, b_{1}, ..., b_{q}, denoted beta-hat_{0}, beta-hat_{1}, ..., beta-hat_{q}, which minimize

    Q(b_{0}, ..., b_{q}) = sum_{i=1}^{n} (Y_{i} - b_{0} - b_{1} X_{i1} - ... - b_{q} X_{iq})^2.
The fitted values are Y-hat_{i} = beta-hat_{0} + beta-hat_{1} X_{i1} + ... + beta-hat_{q} X_{iq}, and the residuals are e_{i} = Y_{i} - Y-hat_{i}.

Let's see what happens when we fit models to sasdata.eg10_2a and sasdata.eg10_2c.
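The least-squares fit above can be sketched numerically. A minimal example, using synthetic data in place of the SAS data sets (which are not available here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a SAS data set: n = 30 observations, q = 2 predictors.
n, q = 30, 2
X = rng.normal(size=(n, q))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n)

# Design matrix with a leading column of 1s for the intercept.
Xd = np.column_stack([np.ones(n), X])

# The least-squares estimates b minimize sum((y - Xd @ b)**2).
b, *_ = np.linalg.lstsq(Xd, y, rcond=None)

fitted = Xd @ b       # Y-hat_i
resid = y - fitted    # e_i = Y_i - Y-hat_i
print(b)              # estimates of beta_0, beta_1, beta_2
```

A useful check: the residual vector is orthogonal to every column of the design matrix, which is exactly the first-order condition for minimizing Q.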
Residuals and studentized residuals are the primary tools to analyze model fit. We look for outliers and other deviations from model assumptions. Let's look at the residuals from some fits to sasdata.eg10_2c.
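A sketch of how studentized residuals can be computed, again on synthetic data standing in for sasdata.eg10_2c:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data (sasdata.eg10_2c is not available here).
n, q = 40, 2
X = rng.normal(size=(n, q))
y = 2.0 + X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n)
Xd = np.column_stack([np.ones(n), X])

b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ b
mse = resid @ resid / (n - q - 1)

# Hat matrix diagonal h_ii; the (internally) studentized residual divides
# e_i by its estimated standard deviation sqrt(MSE * (1 - h_ii)).
H = Xd @ np.linalg.inv(Xd.T @ Xd) @ Xd.T
h = np.diag(H)
stud = resid / np.sqrt(mse * (1 - h))

# Observations beyond about +/-2 or +/-3 flag candidate outliers.
print(np.where(np.abs(stud) > 2)[0])
```

Studentizing puts all residuals on a common scale, so the same cutoff applies to every observation regardless of its leverage h_ii.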
The intercept beta_{0} has the interpretation ``expected response when the X_{ij} all equal 0''. The coefficient beta_{j} is interpreted as the change in expected response per unit change in X_{ij} when the other Xs are held fixed (if that is possible).
Otherwise, we can interpret the model using multivariate calculus: the change in expected response per unit change in Z_{i} (with the other predictors held fixed) is the partial derivative

    dE(Y)/dZ_{i}.
So, for example, if the fitted model is quadratic in X, say Y-hat = beta-hat_{0} + beta-hat_{1} X + beta-hat_{2} X^2, then the change in expected response per unit change in X is beta-hat_{1} + 2 beta-hat_{2} X, which depends on the current value of X.

Two ways of building models:
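For a model with a quadratic term, the calculus interpretation can be made concrete. A small sketch with illustrative coefficients (not from any data set in these notes), assuming a fit of the form Y-hat = b0 + b1*X + b2*X^2:

```python
# Hypothetical quadratic fit Y-hat = b0 + b1*X + b2*X**2
# (illustrative coefficients only).
b0, b1, b2 = 1.0, 2.0, -0.3

def marginal_effect(x):
    # dE(Y)/dX = b1 + 2*b2*x: the change in expected response per unit
    # change in X depends on where X currently is.
    return b1 + 2 * b2 * x

print(marginal_effect(0.0))   # 2.0 at X = 0
print(marginal_effect(5.0))   # -1.0 at X = 5
```

The point is that with curvature in the model, no single number summarizes ``the effect of X''; the slope must be reported at particular X values.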
Idea:
The error sum of squares, SSE = sum_{i=1}^{n} e_{i}^2, is the sum of the squared residuals. It measures the variation of the response unaccounted for by the fitted model, or the uncertainty of predicting the response using the fitted model.
The degrees of freedom for a SS is the number of independent pieces of data making up the SS. For SSTO, SSE and SSR the degrees of freedom are n-1, n-q-1 and q, respectively. These add just as the SSs do. A SS divided by its degrees of freedom is called a Mean Square.
The analysis of variance (ANOVA) table summarizes the SSs, degrees of freedom, and mean squares.
Analysis of Variance

    Source   | DF    | SS   | MS  | F Stat      | Prob > F
    ---------|-------|------|-----|-------------|---------
    Model    | q     | SSR  | MSR | F = MSR/MSE | p-value
    Error    | n-q-1 | SSE  | MSE |             |
    C Total  | n-1   | SSTO |     |             |
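The decomposition in the table can be verified directly: SSTO = SSE + SSR, and F = MSR/MSE. A sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, q = 50, 3
X = rng.normal(size=(n, q))
y = 1.0 + X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)
Xd = np.column_stack([np.ones(n), X])

b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
fitted = Xd @ b

ssto = np.sum((y - y.mean()) ** 2)       # total SS, df = n-1
sse = np.sum((y - fitted) ** 2)          # error SS, df = n-q-1
ssr = np.sum((fitted - y.mean()) ** 2)   # regression SS, df = q

msr, mse = ssr / q, sse / (n - q - 1)
f_stat = msr / mse
print(ssto, sse + ssr, f_stat)           # SSTO equals SSE + SSR
```

The degrees of freedom add the same way: (n-1) = (n-q-1) + q, so the mean squares in the table are well defined.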
The F statistic in the ANOVA table tests the overall significance of the model:

    H_{0}: beta_{1} = beta_{2} = ... = beta_{q} = 0
    H_{a}: Not H_{0}

The t test for an individual coefficient tests:

    H_{0}: beta_{j} = 0
    H_{a}: beta_{j} != 0
A level L prediction interval for a new response at predictor values x_{new} = (1, X_{new,1}, ..., X_{new,q})' is

    Y-hat_{new} +/- t(1 - (1-L)/2; n-q-1) * s(pred),

where Y-hat_{new} = x_{new}' beta-hat and s(pred) = sqrt(MSE * (1 + x_{new}' (X'X)^{-1} x_{new})).

Multicollinearity is correlation among the predictors.
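A prediction interval of this form can be computed directly. A sketch on synthetic data, with a hypothetical new point x_new chosen only for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, q = 40, 2
X = rng.normal(size=(n, q))
y = 1.0 + 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.4, size=n)
Xd = np.column_stack([np.ones(n), X])

b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ b
mse = resid @ resid / (n - q - 1)

# Hypothetical new predictor values; leading 1 for the intercept.
x_new = np.array([1.0, 0.5, -1.0])
y_hat = x_new @ b

# Level L = 0.95 prediction interval:
# y_hat +/- t * sqrt(MSE * (1 + x_new' (X'X)^{-1} x_new))
L = 0.95
t_crit = stats.t.ppf(1 - (1 - L) / 2, df=n - q - 1)
se_pred = np.sqrt(mse * (1 + x_new @ np.linalg.inv(Xd.T @ Xd) @ x_new))
lo, hi = y_hat - t_crit * se_pred, y_hat + t_crit * se_pred
print(lo, hi)
```

Note the extra 1 inside the square root: a prediction interval must cover the noise in the new observation as well as the uncertainty in the fitted mean, so it is always wider than the corresponding confidence interval for the mean response.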
Selection of variables in empirical model building is an important task. We consider only one of many possible methods: backward elimination, which consists of starting with all candidate X_{i} in the model and eliminating the non-significant ones one at a time, until we are satisfied with the remaining model.
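The backward-elimination idea can be sketched as follows. This is an illustration of the procedure on synthetic data, not SAS's implementation; the cutoff alpha = 0.10 and the data are assumptions made for the example:

```python
import numpy as np
from scipy import stats

def backward_elimination(X, y, alpha=0.10):
    """Repeatedly drop the predictor with the largest t-test p-value
    until every remaining predictor has p-value <= alpha."""
    cols = list(range(X.shape[1]))
    while cols:
        n = len(y)
        Xd = np.column_stack([np.ones(n), X[:, cols]])
        b, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        resid = y - Xd @ b
        mse = resid @ resid / (n - Xd.shape[1])
        se = np.sqrt(mse * np.diag(np.linalg.inv(Xd.T @ Xd)))
        # Two-sided p-values for the predictors (skip the intercept).
        pvals = 2 * stats.t.sf(np.abs(b[1:] / se[1:]), df=n - Xd.shape[1])
        worst = int(np.argmax(pvals))
        if pvals[worst] <= alpha:
            break                 # every remaining predictor is significant
        cols.pop(worst)           # eliminate the least significant one
    return cols

rng = np.random.default_rng(4)
n = 100
X = rng.normal(size=(n, 4))
# Only the first two predictors actually matter in this example.
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)
print(backward_elimination(X, y))
```

Each pass refits the model, since dropping one predictor changes the estimates and p-values of all the others.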