 # Regression Analysis

## What is it?

Regression is a statistical tool for investigating relationships between variables (Sykes, 1993). It explains the dependent variable you are interested in as a combination of a constant amount (the intercept), the effects of one or more independent variables (the coefficients), and noise (the residual, or random error).
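As an illustration of this decomposition, here is a minimal sketch (not part of the original text) that simulates data with a known intercept and coefficient, fits an ordinary least squares line with NumPy, and recovers the three components. The data, true parameter values, and variable names are invented for the example.

```python
import numpy as np

# Simulated data: y = intercept + coefficient * x + noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
noise = rng.normal(0, 1, size=100)          # random error term
y = 2.0 + 0.5 * x + noise                   # true intercept 2.0, true slope 0.5

# Ordinary least squares via a design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta
residuals = y - X @ beta                    # the part the line does not explain

print(f"intercept ≈ {intercept:.2f}, slope ≈ {slope:.2f}")
```

With 100 observations, the estimates land close to the true values of 2.0 and 0.5, and the residuals capture the noise left over after the linear part is removed.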

Most intermediate statistical techniques are rooted in regression analysis. Once you are familiar with this technique, you can branch out into related areas such as factor analysis and structural equation modeling. The analysis is challenging for those new to statistics, but valuable for anyone who wants to become a quantitative researcher.

## Basic Logic

• The relationship between the independent variable(s) and the dependent variable is assumed to be linear.
• Theory guides which variable is dependent and which is independent. Although statistical and numerical techniques exist to support causal inference (what is the cause? what is the effect?), you should develop, find, or adapt theory to justify your model selection. Of course, exploratory models are also possible.
• Remember that many of the model's assumptions concern the noise (residuals).

## What should I know?

• An introductory level of knowledge about statistical analysis (EPSY 530).
• Knowing or remembering calculus will help you understand the proofs underlying regression, but don't be afraid of the mathematical basics; understanding the assumptions and the analysis process is more crucial.
• If you want to go further, studying matrix algebra is recommended.

## Where can I learn it?

• EPSY 530 Statistics 1 covers the principles of regression analysis.
• EPSY 724 Regression is dedicated to regression issues, including hypothesis testing and mediation.

Written by YangHyun Kim (ykim39@albany.edu) 