# Introduction to Multiple Regression (3 of 3)

Just as in the case of one-variable regression, the sum of squares in a multiple regression analysis can be partitioned into the sum of squares predicted and the sum of squares error.
```
Sum of squares total:     55.57
Sum of squares predicted: 22.21
Sum of squares error:     33.36
```
Again, as in one-variable regression, R² is the ratio of the sum of squares predicted to the sum of squares total. In this example, R² = 22.21/55.57 = 0.40.

Sometimes multiple regression analysis is performed on standardized variables. When this is done, the regression coefficients are referred to as beta (β) weights. The Y intercept (A) is always zero when standardized variables are used. Therefore, the regression equation for standardized variables is:
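The partition and the R² calculation above can be checked with a few lines of arithmetic (the numbers are the ones from this example):

```python
# Sum-of-squares values from the example above.
ss_total = 55.57
ss_predicted = 22.21
ss_error = 33.36

# The partition should hold: SS total = SS predicted + SS error.
assert abs(ss_total - (ss_predicted + ss_error)) < 0.01

# R-squared is the ratio of SS predicted to SS total.
r_squared = ss_predicted / ss_total
print(f"{r_squared:.2f}")  # 0.40
```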

Y' = β1z1 + β2z2 + ... + βkzk

where Y' is the predicted standard score on the criterion, β1 is the standardized regression coefficient for the first predictor variable, β2 is the coefficient for the second predictor variable, z1 is the standard score on the first predictor variable, z2 is the standard score on the second predictor variable, and so on.
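A minimal sketch of a standardized regression, using hypothetical data and NumPy, illustrates why the intercept vanishes: once every variable is converted to z-scores, all means are zero, so the fitted intercept is zero up to rounding error.

```python
import numpy as np

# Hypothetical data: two predictor variables and a criterion.
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = 0.5 * x1 + rng.normal(size=50)
y = 2.0 * x1 - 1.0 * x2 + rng.normal(size=50)

def standardize(v):
    # Convert to z-scores: mean 0, standard deviation 1.
    return (v - v.mean()) / v.std()

z1, z2, zy = standardize(x1), standardize(x2), standardize(y)

# Least-squares fit with an explicit intercept column.
X = np.column_stack([np.ones_like(z1), z1, z2])
coef, *_ = np.linalg.lstsq(X, zy, rcond=None)
intercept, beta1, beta2 = coef

# With standardized variables, the intercept is zero (numerically).
print(abs(intercept) < 1e-10)  # True
```

Here `beta1` and `beta2` are the beta weights of the equation above; dropping the intercept column entirely would give the same coefficients.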