1. In multiple regression analysis, R² = 0.02, n = 2,000, k = 5, and F = 11.2. We can conclude that:
   A) multicollinearity is present
   B) none of the five variables is statistically significant
   C) this regression is excellent for prediction purposes
   D) there is some evidence of a linear relationship between y and at least some of the x-variables, but the regression is extremely weak and useless for prediction purposes
   E) there is not enough information to make any conclusions
2. In a multiple regression analysis, MSE = 20, n = 54, k = 3, and SST (total) = 2,000. What is the R² of the regression?
   A) 0.00
   B) 0.01
   C) 0.99
   D) 1.00
   E) 0.50
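Arithmetic of this kind can be verified numerically from the ANOVA identities SSE = MSE · (n − (k + 1)) and R² = 1 − SSE/SST; a minimal sketch in Python, using the numbers given above:

```python
# Verify R² from the ANOVA identities:
#   SSE = MSE * df_error,  R² = 1 - SSE / SST
MSE, n, k, SST = 20, 54, 3, 2000

df_error = n - (k + 1)   # degrees of freedom for error
SSE = MSE * df_error     # error sum of squares
R2 = 1 - SSE / SST       # coefficient of multiple determination

print(df_error, SSE, R2)  # → 50 1000 0.5
```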
3. Suppose that in a multiple regression the F is significant, but none of the t-ratios are significant. This means that:
   A) multicollinearity may be present
   B) autocorrelation may be present
   C) the regression is good
   D) a nonlinear model would be a better fit
   E) none of the above
4. How many degrees of freedom for error are associated with a multiple regression model with k independent variables?
   A) n - (k + 1)
   B) n - k
   C) n - 1
   D) n - k + 1
   E) none of the above
5. The F ratio used to test for the existence of a linear relationship between the dependent variable and any independent variable is:
   A) MSE/(n - (k + 1))
   B) MSR/MSE
   C) MSR/MST
   D) MSE/MSR
   E) none of the above
6. Consider the following multiple regression model, with n = 25: y = 5 + 10x₁ + 20x₂, with R² = 0.90, sb₁ = 3.2, and sb₂ = 5.5. Calculate the t-test statistic to test whether x₁ contributes information to the prediction of y.
   A) 0.32
   B) 3.636
   C) 3.125
   D) 2.8125
   E) 11.11
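The t statistic for an individual coefficient is the coefficient estimate divided by its standard error, t = b₁/sb₁; a quick check of the arithmetic in Python:

```python
# t statistic for coefficient b1: estimate divided by its standard error.
b1, sb1 = 10, 3.2
t = b1 / sb1
print(round(t, 4))  # → 3.125
```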
7. Correlation of the values of variables with values of the same variables lagged one or more time periods back is called:
   A) multicollinearity
   B) a transformation
   C) autocorrelation
   D) variance inflation
   E) interaction
8. Dummy variables are used when:
   A) qualitative variables are involved in the model
   B) quantitative variables are involved in the model
   C) doing residual analysis
   D) making transformations of quantitative variables
   E) none of the above
9. What does a correlation matrix show?
   A) residuals
   B) regression coefficients
   C) the correlation coefficients between all of the independent variables
   D) both A and B above
10. When independent variables are correlated, you have:
    A) multicollinearity
    B) homoscedasticity
    C) autocorrelation
    D) residuals
11. When successive residuals are correlated, you have:
    A) multicollinearity
    B) homoscedasticity
    C) autocorrelation
    D) residuals
12. As more independent variables are added to a multiple regression model, ___________ will increase; this is not always so with ___________, which will only increase if the additional variables add substantial explanatory power to the model.
    A) R²; adjusted R²
    B) adjusted R²; R²
    C) R²; the coefficient of partial determination
    D) adjusted R²; the coefficient of multiple determination
    E) the standard error of the estimate; R²
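The contrast in this question rests on the adjustment formula, adjusted R² = 1 − (1 − R²)(n − 1)/(n − (k + 1)). A minimal illustration in Python, using made-up R² values, showing that adjusted R² can fall even while R² rises:

```python
# Adjusted R² penalizes extra predictors. The R² values below are
# made up for illustration: R² creeps upward as k grows (as it must
# when variables are added), yet adjusted R² declines because the
# tiny gain does not justify the lost error degrees of freedom.
def adj_r2(r2, n, k):
    return 1 - (1 - r2) * (n - 1) / (n - (k + 1))

n = 30
for k, r2 in [(2, 0.500), (3, 0.505), (4, 0.508)]:
    print(k, r2, round(adj_r2(r2, n, k), 4))
```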