Assuming you're talking about a simple regression model

$$Y_i = \alpha + \beta X_i + \varepsilon_i$$

estimated by least squares, we know (see, e.g., Wikipedia) that

$$\hat{\beta} = \mathrm{cor}(Y_i, X_i) \cdot \frac{SD(Y_i)}{SD(X_i)}$$

Therefore the two only coincide when $SD(Y_i) = SD(X_i)$. That is, they only coincide when the two variables are on the same scale, in some sense. The most common way of achieving this is through standardization, as indicated by @gung.
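A quick numerical check of this identity (a sketch in NumPy; the simulated data and variable names are my own, not from the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(size=200)

# OLS slope (np.polyfit with degree 1 returns [slope, intercept])
beta_hat = np.polyfit(x, y, 1)[0]

# The identity: slope = correlation * SD(y) / SD(x)
r = np.corrcoef(x, y)[0, 1]
identity = r * y.std() / x.std()
assert np.isclose(beta_hat, identity)

# After standardizing both variables, the slope equals the correlation
zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()
beta_std = np.polyfit(zx, zy, 1)[0]
assert np.isclose(beta_std, r)
```

Note that the population (`ddof=0`) standard deviations are used in both numerator and denominator, so the choice of `ddof` cancels in the ratio.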
The two, in some sense, give you the same information: they each tell you the strength of the linear relationship between $X_i$ and $Y_i$. But they do each give you distinct information (except, of course, when they are exactly the same):
- The correlation gives you a bounded measurement that can be interpreted independently of the scale of the two variables. The closer the estimated correlation is to $\pm 1$, the closer the two are to a perfect linear relationship. The regression slope, in isolation, does not tell you that piece of information.
- The regression slope gives a useful quantity, interpreted as the estimated change in the expected value of $Y_i$ for a given value of $X_i$. Specifically, $\hat{\beta}$ tells you the change in the expected value of $Y_i$ corresponding to a 1-unit increase in $X_i$. This information cannot be deduced from the correlation coefficient alone.
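To make the distinction concrete, here is a small simulation (my own illustration, with hypothetical units) showing that rescaling $X_i$ changes the slope but leaves the correlation untouched:

```python
import numpy as np

rng = np.random.default_rng(1)
x_m = rng.uniform(1.0, 2.0, size=100)   # predictor in metres (hypothetical)
y = 5.0 * x_m + rng.normal(size=100)

x_cm = x_m * 100                        # same variable, now in centimetres

slope_m = np.polyfit(x_m, y, 1)[0]
slope_cm = np.polyfit(x_cm, y, 1)[0]

r_m = np.corrcoef(x_m, y)[0, 1]
r_cm = np.corrcoef(x_cm, y)[0, 1]

# the slope shrinks by the scale factor; the correlation is unchanged
assert np.isclose(slope_m, slope_cm * 100)
assert np.isclose(r_m, r_cm)
```

This is exactly why the slope, and not the correlation, answers "how much does $Y_i$ change per unit of $X_i$?", and why the correlation, and not the slope, is comparable across variables measured on different scales.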