If we are given a set of instances S, a feature A, and a partition boundary T, the class information entropy of the partition induced by T, denoted E(A,T,S), is given by

E(A,T,S) = (|S1|/|S|) Ent(S1) + (|S2|/|S|) Ent(S2)

where S1 and S2 are the subsets of S whose values of A fall below and above T, respectively, and Ent(.) denotes the class entropy of a set of instances.
For a given feature A, the boundary Tmin that minimizes the entropy function over all candidate partition boundaries is selected as the binary discretization boundary. The method can then be applied recursively to each of the two partitions induced by Tmin until some stopping criterion is met, thus creating multiple intervals on the feature A.
Discretization based on entropy is a supervised method because it requires the class labels of the instances.
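The binary split selection described above can be sketched in Python as follows. This is a minimal illustration, not an optimized implementation: candidate boundaries are taken as midpoints between adjacent distinct sorted values, and the function names (entropy, best_boundary) are chosen here for clarity rather than taken from any library.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon class entropy Ent(S) of a collection of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_boundary(values, labels):
    """Return (Tmin, E(A, Tmin, S)): the boundary minimizing the
    class information entropy of the induced binary partition."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best_t, best_e = None, float("inf")
    for i in range(1, n):
        if pairs[i][0] == pairs[i - 1][0]:
            continue  # no boundary can separate equal feature values
        t = (pairs[i][0] + pairs[i - 1][0]) / 2
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        # E(A,T,S) = |S1|/|S| * Ent(S1) + |S2|/|S| * Ent(S2)
        e = len(left) / n * entropy(left) + len(right) / n * entropy(right)
        if e < best_e:
            best_t, best_e = t, e
    return best_t, best_e
```

For example, with values [1, 2, 3, 10, 11, 12] and labels ['a', 'a', 'a', 'b', 'b', 'b'], the boundary 6.5 yields two pure partitions and an entropy of 0, so it is selected.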
The equal frequency intervals method divides a continuous variable into k bins where (given m instances) each bin contains m/k (possibly duplicated) adjacent values.
Both the equal frequency intervals method and the equal width interval binning method are unsupervised discretization methods, since neither uses class labels.
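The two unsupervised methods can be contrasted with a short sketch. This is an illustrative implementation under simple assumptions (m divisible by k for the equal-frequency case); the function names are invented for this example.

```python
def equal_width_edges(values, k):
    """Bin edges for k bins of equal width spanning [min, max]."""
    lo, hi = min(values), max(values)
    w = (hi - lo) / k
    return [lo + i * w for i in range(k + 1)]

def equal_frequency_bins(values, k):
    """Partition the m sorted values into k bins of ~m/k adjacent values."""
    s = sorted(values)
    m = len(s)
    return [s[i * m // k:(i + 1) * m // k] for i in range(k)]
```

For values [1, 2, 3, 4, 5, 6, 7, 8] and k = 4, equal-frequency binning yields the bins [1, 2], [3, 4], [5, 6], [7, 8], while equal-width binning yields the edges 1, 2.75, 4.5, 6.25, 8. On skewed data the two methods diverge sharply: equal-width bins may be nearly empty, while equal-frequency bins stay balanced by construction.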