
 

Multiple Regression Analysis

The three-variable linear model:

     Multiple regression analysis is used for testing hypotheses about the relationship between a dependent variable, Y, and two or more independent variables, the X's, and for prediction. The three-variable linear regression model can be written as

                                     Yi = b0 + b1X1i + b2X2i + ui

For the case of k independent or explanatory variables, we have

Yi = b0 + b1X1i + b2X2i + ... + bkXki + ui

where X2i represents, for example, the ith observation on independent variable X2.

 The additional assumption (to those of the simple regression model) is that there is no exact linear relationship between the X's.
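
A minimal sketch of how the three-variable model above can be estimated by OLS, using simulated data; the sample size, true coefficient values, and use of numpy are assumptions made purely for illustration.

```python
# Minimal sketch: estimating Yi = b0 + b1*X1i + b2*X2i + ui by OLS on simulated data.
# All numbers below (n, true coefficients) are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 100
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
u = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + u              # true b0 = 1, b1 = 2, b2 = -0.5

X = np.column_stack([np.ones(n), X1, X2])      # design matrix with a constant column
b_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)  # least-squares solution of the normal equations
print(b_hat)                                   # estimates of b0, b1, b2
```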

 

The first five assumptions of the multiple regression linear model are exactly the same as those of the simple OLS regression model. That is, the first three assumptions can be summarized as ui ~ N(0, σu²). The fourth assumption is E(uiuj) = 0 for i ≠ j, and the fifth assumption is E(Xiui) = 0. The only additional assumption required for the multiple OLS regression linear model is that there is no exact linear relationship between the X's. If two or more explanatory variables are perfectly linearly correlated, it is impossible to calculate OLS estimates of the parameters because the system of normal equations will contain two or more equations that are not independent. If two or more explanatory variables are highly but not perfectly linearly correlated, then OLS parameter estimates can be calculated, but the effect of each of the highly correlated variables on the dependent variable cannot be isolated.
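
To make the perfect-collinearity problem concrete, the sketch below (hypothetical data, numpy assumed) builds a design matrix in which X2 is exactly twice X1; the X'X matrix of the normal equations is then rank deficient, so the parameters cannot be solved for uniquely.

```python
# Sketch of perfect collinearity: X2 = 2*X1 exactly, so X'X is rank deficient
# and the OLS normal equations do not pin down a unique solution.
import numpy as np

rng = np.random.default_rng(1)
n = 50
X1 = rng.normal(size=n)
X2 = 2.0 * X1                                  # exact linear relationship between the X's
X = np.column_stack([np.ones(n), X1, X2])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))              # 2, not 3: one normal equation is redundant
print(np.linalg.cond(XtX))                     # enormous condition number, effectively singular
```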

  

In the case of multiple regression analysis with two independent or explanatory variables:

Parameter b0 is the constant term or intercept of the regression and gives the estimated value of Yi when X1i = X2i = 0.

Parameter b1 measures the change in Y for each one-unit change in X1 while holding X2 constant. Slope parameter b1 is a partial regression coefficient because it corresponds to the partial derivative of Y with respect to X1, or ∂Y/∂X1.

Parameter b2 measures the change in Y for each one-unit change in X2 while holding X1 constant. Slope parameter b2 is the second partial regression coefficient because it corresponds to the partial derivative of Y with respect to X2, or ∂Y/∂X2.
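
A tiny numerical check of this interpretation, using made-up coefficient values: raising X1 by one unit while X2 is held fixed changes the fitted value by exactly b1.

```python
# Hypothetical fitted coefficients, chosen only to illustrate the partial-derivative reading.
b0, b1, b2 = 1.0, 2.0, -0.5

def y_hat(x1, x2):
    return b0 + b1 * x1 + b2 * x2

# Increase X1 from 3 to 4 while holding X2 fixed at 7: the change in the fitted value equals b1.
print(y_hat(4.0, 7.0) - y_hat(3.0, 7.0))       # 2.0 = b1
```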

Since b̂0, b̂1, and b̂2 are obtained by the OLS method, they are also best linear unbiased estimators. That is, E(b̂0) = b0, E(b̂1) = b1, and E(b̂2) = b2, and the variances Var(b̂0), Var(b̂1), and Var(b̂2) are lower than those of any other unbiased linear estimator.
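
A rough Monte Carlo sketch of the unbiasedness claim, under assumed true parameter values: averaging the OLS estimates over many simulated samples from the same model should recover b0, b1, and b2 approximately.

```python
# Monte Carlo sketch: the average of the OLS estimates across many simulated samples
# should be close to the true parameters, illustrating E(b̂j) = bj.
import numpy as np

rng = np.random.default_rng(2)
true_b = np.array([1.0, 2.0, -0.5])            # assumed true b0, b1, b2
n, reps = 100, 2000
estimates = np.empty((reps, 3))

for r in range(reps):
    X1 = rng.normal(size=n)
    X2 = rng.normal(size=n)
    X = np.column_stack([np.ones(n), X1, X2])
    Y = X @ true_b + rng.normal(size=n)
    estimates[r], *_ = np.linalg.lstsq(X, Y, rcond=None)

print(estimates.mean(axis=0))                  # approximately [1.0, 2.0, -0.5]
```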

 

Copyright © 2002 Evgenia Vogiatzi
