Regression with Two Independent Variables

Questions

- Write a raw score regression equation with 2 IVs in it.
- What is the difference in interpretation of b weights in simple regression vs. multiple regression?
- Describe R-square in two different ways, that is, using two distinct formulas.
- What happens to b weights if we add new variables to the regression equation that are highly correlated with ones already in the equation?
- Why do we report beta weights (standardized b weights)?
- Write a regression equation with beta weights in it.
- What are the three factors that influence the standard error of the b weight?
- How is it possible to have a significant R-square and non-significant b weights?

With one independent variable, we may write the regression equation as:

Y = a + bX + e

where Y is an observed score on the dependent variable, a is the intercept, b is the slope, X is the observed score on the independent variable, and e is an error or residual. We can extend this to any number of independent variables:

Y = a + b1X1 + b2X2 + ... + bkXk + e

Note that we have k independent variables and a slope for each. We still have one error and one intercept. Again we want to choose the estimates of a and b so as to minimize the sum of squared errors of prediction. The prediction equation is:

Y' = a + b1X1 + b2X2 + ... + bkXk

Finding the values of b is tricky for k > 2 independent variables, and will be developed after some matrix algebra. It's simpler for k = 2 IVs, which we will discuss here.

For the one variable case, the calculation of b and a was:

b = Σ(X − Mx)(Y − My) / Σ(X − Mx)²    and    a = My − b·Mx

For two independent variables, using deviation scores x1 = X1 − M1, x2 = X2 − M2, and y = Y − My, the slopes are:

b1 = (Σx1y · Σx2² − Σx2y · Σx1x2) / (Σx1² · Σx2² − (Σx1x2)²)
b2 = (Σx2y · Σx1² − Σx1y · Σx1x2) / (Σx1² · Σx2² − (Σx1x2)²)

At this point, you should notice that all the terms from the one variable case appear in the two variable case. In the two variable case, the other X variable also appears in the equation; for example, X2 appears in the equation for b1. Note that terms corresponding to the variance of both X variables occur in the slopes. Also note that a term corresponding to the covariance of X1 and X2 (the sum of deviation cross-products) appears in the formula for the slopes. The equation for a with two independent variables is:

a = My − b1·M1 − b2·M2

This equation is a straightforward generalization of the case for one independent variable.

Suppose we want to predict the job performance of Chevy mechanics based on mechanical aptitude test scores and scores from a personality test that measures conscientiousness. We can collect the data into a matrix like this:

[table of raw scores]

The numbers in the table above correspond to the following sums of squares, cross-products, and correlations:

[table of sums of squares, cross-products, and correlations]

We can now compute the regression coefficients:

Job Perf' = -4.10 + .09MechApt + .09Conscientiousness

We have 3 variables, so we have 3 scatterplots that show their relations. Because we have computed the regression equation, we can also view a plot of Y' vs. Y. We can (sort of) view the plot in 3D space, where the two predictors are the X and Y axes, and the Z axis is the criterion, thus:

[3D scatterplot]

This graph doesn't show it very well, but the regression problem can be thought of as a sort of response surface problem: what is the expected height (Z) at each value of X and Y? The linear regression solution to this problem in this dimensionality is a plane.

Just as in simple regression, the dependent variable is thought of as a linear part and an error. In multiple regression, the linear part has more than one X variable associated with it. When we do multiple regression, we can compute the proportion of variance due to regression. We use a capital R to show that it's a multiple R instead of a single variable r. We can also compute the correlation between Y and Y' and square that.

The mean of Y is 3.25, and so is the mean of Y'. The variance of Y' is 1.05, and the variance of the residuals is .52. Together, the variance of regression (Y') and the variance of error (e) add up to the variance of Y (1.57 = 1.05 + .52). If we compute the correlation between Y and Y', we find that R = .82, which when squared is also an R-square of .67. R-square is the proportion of variance in Y due to the multiple regression. You have already seen this once, but here it is again in a new context:

F = (R²/k) / ((1 − R²)/(N − k − 1))

which is distributed as F with k and (N − k − 1) degrees of freedom when the null is true.
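The k = 2 slope and intercept formulas can be sketched in a few lines of Python. The raw-score table in the original appeared as an image and is not recoverable, so the scores below are hypothetical stand-ins; only the formulas themselves mirror the text.

```python
# A sketch of the k = 2 slope formulas from the text, computed on made-up
# numbers. The scores below are hypothetical (the article's data table was
# an image); only the formulas mirror the text.
x1_raw = [40, 45, 38, 50, 48, 55, 53, 55, 58, 40]  # mechanical aptitude (hypothetical)
x2_raw = [25, 20, 30, 30, 28, 30, 34, 36, 32, 34]  # conscientiousness (hypothetical)
y_raw = [1, 2, 1, 3, 2, 3, 3, 4, 4, 3]             # job performance (hypothetical)

n = len(y_raw)
m1, m2, my = sum(x1_raw) / n, sum(x2_raw) / n, sum(y_raw) / n

# Deviation scores: x1 = X1 - M1, x2 = X2 - M2, y = Y - My.
x1 = [v - m1 for v in x1_raw]
x2 = [v - m2 for v in x2_raw]
y = [v - my for v in y_raw]

sx1x1 = sum(v * v for v in x1)              # sum of squares for X1
sx2x2 = sum(v * v for v in x2)              # sum of squares for X2
sx1x2 = sum(p * q for p, q in zip(x1, x2))  # cross-products of X1 and X2
sx1y = sum(p * q for p, q in zip(x1, y))
sx2y = sum(p * q for p, q in zip(x2, y))

# The common denominator carries the variances of both Xs and their covariance.
denom = sx1x1 * sx2x2 - sx1x2 ** 2
b1 = (sx1y * sx2x2 - sx2y * sx1x2) / denom
b2 = (sx2y * sx1x1 - sx1y * sx1x2) / denom
a = my - b1 * m1 - b2 * m2                  # generalizes a = My - b*Mx

print(f"Y' = {a:.2f} + {b1:.3f}*X1 + {b2:.3f}*X2")
```

A quick sanity check on any such fit: the residuals should sum to zero and be uncorrelated with each predictor, which is exactly what minimizing the sum of squared errors guarantees.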
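The two routes to R-square described in the text, the variance of Y' over the variance of Y, and the squared correlation between Y and Y', can be checked numerically. The data here are made up for illustration; the identity holds for any least-squares fit that includes an intercept.

```python
# R-square two ways: (1) variance of predicted scores over variance of Y,
# (2) the squared correlation between Y and Y'. Toy one-predictor data
# (made up); the identity holds for any least-squares fit with an intercept.
X = [1, 2, 3, 4, 5, 6]
Y = [2, 1, 4, 3, 6, 5]

n = len(Y)
mx, my = sum(X) / n, sum(Y) / n
b = sum((x - mx) * (y - my) for x, y in zip(X, Y)) / sum((x - mx) ** 2 for x in X)
a = my - b * mx
Yp = [a + b * x for x in X]  # predicted scores Y'

def var(v):
    m = sum(v) / len(v)
    return sum((u - m) ** 2 for u in v) / len(v)

def corr(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((p - mu) * (q - mv) for p, q in zip(u, v))
    return cov / (sum((p - mu) ** 2 for p in u)
                  * sum((q - mv) ** 2 for q in v)) ** 0.5

r2_from_variance = var(Yp) / var(Y)  # route (1): variance due to regression
r2_from_corr = corr(Y, Yp) ** 2      # route (2): squared multiple correlation
print(r2_from_variance, r2_from_corr)
```

The two printed values agree, which is the point of the article's 1.05/1.57 and .82² example: both are the proportion of variance in Y due to the regression.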
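The closing F test can be computed directly from R-square. R² = .67 comes from the example; the sample size is not stated in the source, so N = 20 below is an assumption made purely for illustration.

```python
# The F test for R-squared: F = (R^2 / k) / ((1 - R^2) / (N - k - 1)),
# distributed as F with k and N - k - 1 df when the null is true.
# R^2 = .67 is from the example; N = 20 is an ASSUMED sample size,
# since the source does not state N.
r2 = 0.67
k = 2    # number of predictors
N = 20   # assumed for illustration

F = (r2 / k) / ((1 - r2) / (N - k - 1))
df1, df2 = k, N - k - 1
print(f"F({df1}, {df2}) = {F:.2f}")
```

The resulting F would be compared against the F distribution with k and N − k − 1 degrees of freedom to decide whether R-square is significantly greater than zero.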