The term cosine distance is often used for the complement of cosine similarity in positive space, that is: {\displaystyle D_{C}(A,B)=1-S_{C}(A,B),} where {\displaystyle D_{C}} is the cosine distance and {\displaystyle S_{C}} is the cosine similarity. It is important to note, however, that the cosine distance is not a proper distance metric: it does not have the triangle inequality property (or, more formally, the Schwarz inequality) and it violates the coincidence axiom. To repair the triangle inequality property while maintaining the same ordering, it is necessary to convert to angular distance (see below).
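As a minimal sketch in Python, the following functions compute the cosine similarity and the cosine distance defined above; the vectors A, B, and C are chosen here purely for illustration, and they give a concrete case where the triangle inequality fails for the cosine distance.

import numpy as np

def cosine_similarity(a, b):
    # S_C(A, B) = (A · B) / (||A|| ||B||)
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def cosine_distance(a, b):
    # D_C(A, B) = 1 - S_C(A, B); not a proper distance metric
    return 1.0 - cosine_similarity(a, b)

# Illustrative vectors (chosen for this sketch): the direct distance
# from A to C exceeds the sum of the distances via B, so the triangle
# inequality D_C(A, C) <= D_C(A, B) + D_C(B, C) is violated.
A = np.array([1.0, 0.0])
B = np.array([1.0, 1.0])
C = np.array([0.0, 1.0])
print(cosine_distance(A, C))                          # 1.0
print(cosine_distance(A, B) + cosine_distance(B, C))  # about 0.586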
One advantage of cosine similarity is its low complexity, especially for sparse vectors: only the non-zero dimensions need to be considered.
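A sketch of why sparsity helps, assuming vectors are stored as {dimension: value} dictionaries (the representation and function name are illustrative): the dot product only visits dimensions that are non-zero in both vectors, and each norm only touches that vector's own non-zero entries.

import math

def sparse_cosine_similarity(a, b):
    # Iterate over the vector with fewer non-zero entries when
    # accumulating the dot product; cosine similarity is symmetric,
    # so swapping the arguments does not change the result.
    if len(a) > len(b):
        a, b = b, a
    dot = sum(v * b[i] for i, v in a.items() if i in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

# Example: two sparse term-frequency vectors over a large vocabulary,
# each storing only its handful of non-zero dimensions.
doc1 = {0: 3.0, 5: 1.0, 42: 2.0}
doc2 = {5: 2.0, 42: 1.0, 100: 4.0}
print(sparse_cosine_similarity(doc1, doc2))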
Other names for cosine similarity include Orchini similarity and the Tucker coefficient of congruence; the Ochiai similarity (see below) is cosine similarity applied to binary data.