I have an $n$-dimensional hyperplane $w'x + b = 0$ and a point $x_0$. The shortest distance from this point to the hyperplane is $d = \frac{|w \cdot x_0 + b|}{\|w\|}$. I have no problem proving this for 2- and 3-dimensional space using algebraic manipulations, but I fail to do so for $n$-dimensional space. Can someone show a nice explanation for it?
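As a numerical sanity check of the formula (not a proof), the sketch below picks an arbitrary hyperplane and point in $n = 7$ dimensions, computes the closest point on the plane by stepping from $x_0$ along the unit normal, and confirms its distance matches $\frac{|w \cdot x_0 + b|}{\|w\|}$; all specific values are arbitrary examples:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 7                        # any dimension works
w = rng.normal(size=n)       # hyperplane normal (arbitrary example)
b = rng.normal()
x0 = rng.normal(size=n)      # the point

# Formula from the question: d = |w . x0 + b| / ||w||
d = abs(w @ x0 + b) / np.linalg.norm(w)

# Closest point on the plane: move x0 along the normal direction w
x_star = x0 - (w @ x0 + b) / (w @ w) * w

assert np.isclose(w @ x_star + b, 0.0)              # x_star lies on the plane
assert np.isclose(np.linalg.norm(x_star - x0), d)   # its distance equals d
```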
Another proof:
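One way this proof can go, using only the fact that $w$ is normal to the hyperplane (so it works in any dimension):

```latex
Let $x^*$ be the orthogonal projection of $x_0$ onto the hyperplane
$w'x + b = 0$. Since $w$ is normal to the hyperplane, $x^* = x_0 - t\,w$
for some scalar $t$. Substituting into the plane equation gives
\[
  w'(x_0 - t\,w) + b = 0
  \quad\Longrightarrow\quad
  t = \frac{w \cdot x_0 + b}{\|w\|^2}.
\]
The shortest distance is then
\[
  d = \|x_0 - x^*\| = |t|\,\|w\| = \frac{|w \cdot x_0 + b|}{\|w\|}.
\]
```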