Regression is a statistical method that attempts to determine the strength and behaviour of the relationship between one dependent variable (usually denoted by Y) and a set of one or more other variables (known as independent variables). Ordinary least squares (OLS) regression is a statistical method of analysis that estimates the relationship between the variables by minimizing the sum of squared differences between the observed and predicted values of the dependent variable.
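To make the "minimize the sum of squared differences" idea concrete, here is a minimal sketch (not part of the original text) using Python and NumPy on made-up data. It fits an OLS line and then shows that the sum of squared errors (SSE) at the fitted coefficients is no larger than at a nearby, slightly perturbed line; the data and the perturbation are illustrative assumptions.

```python
import numpy as np

# Synthetic observations: y depends roughly linearly on x, plus noise (illustrative only).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=x.size)

# Design matrix with a column of ones so the model includes an intercept.
X = np.column_stack([np.ones_like(x), x])

# Ordinary least squares: choose coefficients that minimize ||y - X @ beta||^2.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = beta

def sse(a, b):
    """Sum of squared differences between observed and predicted values of y."""
    return np.sum((y - (a + b * x)) ** 2)

print("estimated intercept:", intercept)
print("estimated slope:    ", slope)
print("SSE at the OLS fit:  ", sse(intercept, slope))
print("SSE at a nearby line:", sse(intercept + 0.3, slope - 0.1))  # always >= the OLS SSE
```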
If your data show a linear relationship between the X and Y variables, it is useful to find the line that best fits that relationship. The least squares regression line is the line that makes the vertical distances from the data points to the line as small as possible. It is called a "least squares" line because the best-fitting line is the one that minimizes the sum of the squared errors (the residuals). Another name for it is the linear regression equation, because the fit is described by a linear equation: ŷ = a + bx, where a denotes the intercept, b is the slope, x is the independent variable, and ŷ is the predicted value of the dependent variable. R² measures how well this regression line fits the data. Once the intercept and slope have been estimated using least squares, various indices are studied to determine the reliability of these estimates. One of the most popular of these reliability indices is R².
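As a sketch of the textbook formulas behind ŷ = a + bx and R² (the dataset below is a made-up illustration, not from the original text), the slope can be computed as b = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)², the intercept as a = ȳ − b·x̄, and R² as 1 − SS_res / SS_tot:

```python
import numpy as np

# Small made-up dataset (illustrative only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.6, 4.4, 5.2])

x_bar, y_bar = x.mean(), y.mean()

# Slope b = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2), intercept a = y_bar - b * x_bar.
b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
a = y_bar - b * x_bar

# Predicted values from the fitted line y_hat = a + b * x.
y_hat = a + b * x

# R^2 = 1 - SS_res / SS_tot: the proportion of the variation in y explained by the line.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y_bar) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"intercept a = {a:.3f}, slope b = {b:.3f}, R² = {r_squared:.4f}")
```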