Partition a set of numbers into two subsets such that the difference between their sums is minimized and both subsets have the same number of elements.
For example: {1, 4, 9, 16} is partitioned as {1,16} and {4,9} with diff = 17-13=4.
Does greedy work here? First sort, then pick the smallest and largest to go into set 1, and the 2nd smallest and 2nd largest to go into set 2 (sketched below).
I was asked to prove whether it works, which I failed to do :(
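For concreteness, here is a minimal sketch of the greedy described above, generalized to keep alternating the sorted pairs between the two sets; the function name greedyPairingDiff and its structure are my own illustration, not part of the original question.

// A minimal sketch (illustration only) of the greedy described above:
// sort, give the (smallest, largest) pair to set 1, the (2nd smallest,
// 2nd largest) pair to set 2, and keep alternating. Assumes an even
// number of elements; returns |sum(set1) - sum(set2)|.
#include <algorithm>
#include <vector>

long long greedyPairingDiff(std::vector<int> v)
{
    std::sort(v.begin(), v.end());
    long long sum1 = 0, sum2 = 0;
    int lo = 0, hi = static_cast<int>(v.size()) - 1;
    bool toSet1 = true;
    while (lo < hi) {
        if (toSet1) sum1 += v[lo] + v[hi];   // this pair goes to set 1
        else        sum2 += v[lo] + v[hi];   // this pair goes to set 2
        toSet1 = !toSet1;
        ++lo; --hi;
    }
    return sum1 > sum2 ? sum1 - sum2 : sum2 - sum1;
}

On {1, 4, 9, 16} this reproduces the {1,16} / {4,9} split with difference 4, but on {1, 1, 1, 1, 2, 2} it gives a difference of 2 while {1, 1, 2} / {1, 1, 2} achieves 0, so this greedy is not optimal in general.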
----------------------------------------------------------------------------------------------------
The problem is similar to the knapsack problem; however, the number of knapsacks is constant in this problem, so the standard subset-sum DP only needs an extra dimension to track the subset size.
Observing the insanity of the above posts, which claim a greedy O(n log n) solution for this variation of the partition problem, I was compelled to code the program. Since the problem is NP-hard, the solution is a pseudo-polynomial-time algorithm with complexity O(n^2 * W), where n = number of elements and W = sum of the elements.
// constraints: n is even, items are non-negative
#include <iostream>
#include <numeric>
#include <vector>
using namespace std;

void fun (int items[], int n)
{
    int sum  = accumulate(items, items + n, 0);
    int half = sum / 2;          // the smaller of the two halves sums to at most sum/2
    int maxSubsetSz = n / 2;

    // T[i][j] is set iff some subset of exactly i items sums to j
    vector< vector<int> > T (maxSubsetSz + 1, vector<int> (half + 1, 0));
    T[0][0] = 1;

    for (int i = 0; i < n; i++)                     // consider taking the i-th item
        for (int k = maxSubsetSz - 1; k >= 0; k--)  // loop k downwards so each item is used at most once
            for (int j = 0; j + items[i] <= half; j++)
                if (T[k][j])
                    T[k+1][j + items[i]] = 1;

    // largest sum <= sum/2 reachable with exactly n/2 items;
    // the minimum difference is then sum - 2*j
    for (int j = half; j >= 0; j--)
        if (T[maxSubsetSz][j]) {
            cout << "sum : " << j << endl;
            cout << "min difference : " << sum - 2*j << endl;
            break;
        }
}
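A small driver for the DP above, using the {1, 4, 9, 16} example from the question (the main below is just for illustration):

int main()
{
    int items[] = {1, 4, 9, 16};
    fun(items, 4);   // prints sum : 13 and min difference : 4, i.e. the {4,9} half
    return 0;
}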