A few basic concepts (Linear Regression) that you need to know
--R^2 indicates how much of the variance of the dependent variable can be explained by (or is associated with) the regression equation composed of one or more predictors.
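A minimal Python sketch (hypothetical data and variable names, made up for illustration) of what R^2 measures, computed as 1 - SS_residual / SS_total:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                              # two predictors
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=100)   # dependent variable

X_design = np.column_stack([np.ones(len(y)), X])           # add intercept column
b, *_ = np.linalg.lstsq(X_design, y, rcond=None)           # fit the regression
y_hat = X_design @ b

ss_res = np.sum((y - y_hat) ** 2)                          # unexplained variation
ss_tot = np.sum((y - y.mean()) ** 2)                       # total variation
print("R^2 =", round(1 - ss_res / ss_tot, 3))              # share of variance explained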
--In simple linear regression, r^2 is the square of the Pearson correlation coefficient; also note that in simple linear regression the slope of the regression line computed on z-scores always coincides with the Pearson coefficient.
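A minimal Python sketch (simulated data) checking both facts at once: r^2 is the square of Pearson's r, and the slope fitted on z-scores equals r:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.6 * x + rng.normal(size=200)

r, _ = stats.pearsonr(x, y)                         # Pearson correlation

# regression slope computed on standardized (z-score) variables
zx = (x - x.mean()) / x.std(ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)
slope_z = np.polyfit(zx, zy, 1)[0]

print("Pearson r         =", round(r, 3))
print("r^2               =", round(r ** 2, 3))      # equals R^2 of the simple regression
print("slope on z-scores =", round(slope_z, 3))     # equals r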
--The regression equation provides the coefficients in unstandardized scores (b) and in standardized scores (beta). If you want to find out which predictors are the best/worst, you need to check the standardized coefficients (these typically fall between -1 and +1, and they give you an idea of the relationship of each predictor with the dependent variable; note that standardized scores don't have any units, so you can compare different predictors).
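A minimal Python sketch (made-up predictors "age" and "income", used only for illustration) contrasting unstandardized b with standardized beta, assuming statsmodels is available:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
age = rng.normal(40, 10, n)                             # measured in years
income = rng.normal(30000, 5000, n)                     # measured in dollars
y = 0.05 * age + 0.0002 * income + rng.normal(size=n)

def z(a):
    return (a - a.mean()) / a.std(ddof=1)               # convert to z-scores

b_fit = sm.OLS(y, sm.add_constant(np.column_stack([age, income]))).fit()
beta_fit = sm.OLS(z(y), np.column_stack([z(age), z(income)])).fit()

print("b (raw units):  ", b_fit.params[1:])             # depend on each predictor's units
print("beta (z-scores):", beta_fit.params)              # unit-free, comparable across predictors

The b for income looks tiny only because income is measured in dollars; the betas put both predictors on the same footing, which is why they are the ones to compare.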
--If there are several predictors (independent variables, in SPSS), then the ideal situation is that these predictors are not related to one another. If there is a high relation between predictors, we may suffer from collinearity problems; you may want to check for collinearity in SPSS (e.g., by looking at the Variance Inflation Factor [VIF]: is it above 10 or not?).
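A minimal Python sketch (simulated predictors) of the same VIF check outside SPSS, assuming statsmodels is available; x2 is built to be nearly a copy of x1, so its VIF blows up:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)                 # almost a copy of x1 -> collinear
x3 = rng.normal(size=n)                                 # unrelated predictor

X = sm.add_constant(np.column_stack([x1, x2, x3]))
for i, name in enumerate(["x1", "x2", "x3"], start=1):  # index 0 is the constant
    print(name, "VIF =", round(variance_inflation_factor(X, i), 1))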
--By default, we enter all predictors (independent variables, in SPSS) in the regression equation. But there are other options. For instance, in the "stepwise" procedure, only reliable predictors enter the regression equation (after all, what is the point of including predictors that basically don't help predict the dependent variable?).
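A minimal Python sketch of a forward, p-value-based selection in the same spirit as SPSS's stepwise method (simulated data; real stepwise also re-tests already-entered predictors for removal, which this sketch omits):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
X = rng.normal(size=(n, 4))
y = 1.2 * X[:, 0] - 0.7 * X[:, 2] + rng.normal(size=n)    # only predictors 0 and 2 matter

remaining, selected = list(range(X.shape[1])), []
while remaining:
    # p-value each candidate would get if it were added to the current model
    pvals = {}
    for j in remaining:
        fit = sm.OLS(y, sm.add_constant(X[:, selected + [j]])).fit()
        pvals[j] = fit.pvalues[-1]                         # p-value of the candidate just added
    best = min(pvals, key=pvals.get)
    if pvals[best] < 0.05:                                 # entry criterion
        selected.append(best)
        remaining.remove(best)
    else:
        break                                              # no remaining predictor is reliable

print("Predictors entered:", selected)                     # expected: [0, 2]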