
1. In order to test the significance of the multiple regression model, we use
A) the partial F test.
B) the t test.
C) the overall F test.
D) adjusted R².

2. In order to test the significance of a single independent variable, we use
A) the partial F test.
B) the t test.
C) the overall F test.
D) adjusted R².

3. For the same point estimate of the dependent variable and the same level of alpha, the confidence interval is always wider than the corresponding prediction interval.
A) True
B) False

4. The C statistic in regression analysis indicates that a model is desirable if the statistic is substantially greater than k + 1.
A) True
B) False

5. An application of the multiple regression model generated the following results for the F test of the overall regression model: p-value = .0012, R² = .67, s = .076. Thus, at the .05 level of significance, the null hypothesis, which states that none of the independent variables are significantly related to the dependent variable, should be rejected.
A) True
B) False
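
The decision rule tested in this question can be sketched in a few lines of Python, using only the p-value and significance level reported in the question:

```python
# Sketch of the overall-F decision rule from the question above.
# p_value and alpha come from the question; nothing else is assumed.
p_value = 0.0012   # reported p-value of the overall F test
alpha = 0.05       # stated level of significance

# H0: beta_1 = beta_2 = ... = beta_k = 0 (no independent variable
# is significantly related to the dependent variable).
# Reject H0 when the p-value falls below alpha.
reject_h0 = p_value < alpha
print(reject_h0)  # True, since .0012 < .05
```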

6. β₀, the intercept of the model, is defined when at least one of the independent variables is equal to 0.
A) True
B) False

7. Which of the following is not an assumption of a multiple regression model?
A) Positive autocorrelation of error terms
B) Normality of error terms
C) Independence of error terms
D) Constant variance of error terms

8. The range of feasible values for the multiple coefficient of determination is from
A) 0 to ∞
B) -1 to 0
C) -1 to 1
D) 0 to 1
E) -∞ to 0
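
The bound tested here follows from the definition R² = 1 − SSE/SST, since 0 ≤ SSE ≤ SST for a model fit with an intercept. A minimal sketch, using small made-up data:

```python
# Sketch: R^2 = 1 - SSE/SST lies between 0 and 1.
# y and y_hat are hypothetical observed and fitted values.
y     = [3.0, 5.0, 7.0, 9.0]
y_hat = [3.5, 4.5, 7.5, 8.5]

mean_y = sum(y) / len(y)
sst = sum((yi - mean_y) ** 2 for yi in y)            # total sum of squares
sse = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))  # error sum of squares
r_squared = 1 - sse / sst
print(0 <= r_squared <= 1)  # True
```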

9. β₁ in a quadratic regression model is known as the
A) y-intercept.
B) shift parameter.
C) rate of curvature.
D) slope of the line between x and y.

10. βⱼ is the change in the mean value of the dependent variable that is associated with a one-unit increase in xⱼ when the other independent variables in the model do not change.
A) True
B) False
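
This interpretation can be verified numerically: with the other predictors held fixed, raising one predictor by exactly one unit moves the predicted mean by exactly its coefficient. A sketch with hypothetical fitted coefficients:

```python
# Hypothetical fitted coefficients for y = b0 + b1*x1 + b2*x2.
b0, b1, b2 = 10.0, 2.5, -1.0

def y_hat(x1, x2):
    """Predicted mean of the dependent variable."""
    return b0 + b1 * x1 + b2 * x2

# Increase x1 by one unit while holding x2 fixed at 3.0:
# the prediction changes by exactly b1 = 2.5.
print(y_hat(4.0, 3.0) - y_hat(3.0, 3.0))  # 2.5
```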

11. For a multiple regression model, the computer output shows that the simple correlation coefficient between the dependent variable and one of the independent variables is .99. This result indicates that most likely the problem of multicollinearity exists in this model.
A) True
B) False

12. An interaction variable in a multiple regression model is calculated by multiplying two independent variables together.
A) True
B) False
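
The construction described in this question is simply an elementwise product of the two predictor columns. A sketch with hypothetical predictors x1 and x2:

```python
# Hypothetical predictor columns.
x1 = [1.0, 2.0, 3.0]
x2 = [4.0, 5.0, 6.0]

# The interaction variable x1*x2 is the elementwise product,
# added to the model as an additional column.
x1_x2 = [a * b for a, b in zip(x1, x2)]
print(x1_x2)  # [4.0, 10.0, 18.0]
```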

13. Multicollinearity is said to exist between two independent variables in a multiple regression model if they are dependent on each other.
A) True
B) False

14. In multiple regression models, the number of degrees of freedom associated with SSE is
A) n - 2
B) n - 1
C) n - (k + 1)
D) n
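
The error degrees of freedom count the observations left over after estimating the k slope coefficients plus the intercept. A one-line sketch with illustrative values for n and k:

```python
def sse_degrees_of_freedom(n, k):
    """Error degrees of freedom in multiple regression:
    n observations minus (k + 1) estimated coefficients
    (k slopes plus the intercept)."""
    return n - (k + 1)

# Illustrative values: 30 observations, 4 independent variables.
print(sse_degrees_of_freedom(30, 4))  # 25
```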

15. The effects of the different levels of a qualitative independent variable can be modeled in multiple regression by using
A) SSE.
B) adjusted R².
C) dummy variables.
D) a confidence interval.