Arthur and Table
Arthur has brought a beautiful big table into his new flat. When he came home, Arthur noticed that the new table is unstable.
In total the table Arthur bought has n legs, the length of the i-th leg is li.
Arthur decided to make the table stable by removing some legs. For each leg Arthur determined a number di — the amount of energy he spends to remove the i-th leg.
A table with k legs is considered stable if more than half of its legs have the maximum length. For example, to make a table with 5 legs stable, you need at least three (out of these five) legs of the maximum length. Also, a table with one leg is always stable, and a table with two legs is stable if and only if they have the same length.
Your task is to help Arthur and count the minimum number of energy units Arthur should spend on making the table stable.
The first line of the input contains integer n (1 ≤ n ≤ 10^5) — the initial number of legs in the table Arthur bought.
The second line of the input contains a sequence of n integers li (1 ≤ li ≤ 10^5), where li is equal to the length of the i-th leg of the table.
The third line of the input contains a sequence of n integers di (1 ≤ di ≤ 200), where di is the number of energy units that Arthur spends on removing the i-th leg off the table.
Print a single integer — the minimum number of energy units that Arthur needs to spend in order to make the table stable.
Sample 1
Input:
2
1 5
3 2
Output:
2

Sample 2
Input:
3
2 4 4
1 1 1
Output:
0

Sample 3
Input:
6
2 2 1 1 3 3
4 3 5 5 2 1
Output:
8
Problem summary: there are n table legs, each with a removal cost. The table reaches a stable state when the number of remaining legs of maximum length is more than half of the total number of remaining legs. Find the minimum total removal cost needed to make the table stable.
This took a long time to get right. My initial idea was to sort all legs together with their costs by cost, ascending (using a struct), then iterate the candidate maximum length downward from the longest leg, computing the total cost for each candidate. That total has two parts: the cheapest removals among legs shorter than the current maximum, plus the cost of removing every leg longer than it. The longer-leg part is easy: during preprocessing I record, in an array, the total removal cost of the legs at each length, so when moving to the next candidate I only need to add in the cost of the previous maximum's legs. But summing the cheapest shorter legs timed out: sweeping all legs in ascending cost order, legs longer than the current maximum can only be skipped, so a huge amount of time was wasted on the useless work of checking those longer legs.
I then switched algorithms: sort all legs (with their costs) by length, ascending. To sum the removal costs of the legs shorter than the current maximum, just sweep from small to large lengths, this time using an array to count how many legs there are of each cost (exactly the opposite of before). Since the cost values are bounded (at most 200), this part is O(n)-level overall, whereas the earlier brute-force sweep was O(n^2) — no wonder it timed out.
Here is the code:
#include <algorithm>
#include <iostream>
#include <sstream>
#include <cstring>
#include <cstdlib>
#include <string>
#include <vector>
#include <cstdio>
#include <stack>
#include <cmath>
#include <queue>
#include <map>
#include <set>
using namespace std;
#define N 100005
#define M 30
#define INF 0x3f3f3f3f

struct table {
    int l, d;     // leg length and removal cost
} a[N];

int b[N];         // b[len] = total removal cost of all legs of length len
int c[N];         // c[len] = number of legs of length len
int num[205];     // num[j] = count of already-seen shorter legs with cost j

bool cmp(table a, table b) {
    return a.l < b.l; // sort legs by length, ascending
}

int main() {
    int n, del, maxx, minx;
    while (cin >> n) {
        del  = 0;
        maxx = 0;
        minx = INF;
        memset(b, 0, sizeof(b));
        memset(c, 0, sizeof(c));
        memset(num, 0, sizeof(num));
        for (int i = 0; i < n; i++) {
            scanf("%d", &a[i].l);
            c[a[i].l]++;
            if (a[i].l > maxx) maxx = a[i].l;
            if (a[i].l < minx) minx = a[i].l;
        }
        for (int i = 0; i < n; i++) {
            scanf("%d", &a[i].d);
            b[a[i].l] += a[i].d;
        }
        sort(a, a + n, cmp);
        int minn = INF;
        int dtemp = 0, sum = 0, sum1 = 0, cnt = 0;
        // del / sum1: count / total cost of legs strictly longer than
        // the current candidate maximum length i
        for (int i = minx + 1; i <= maxx; i++) {
            del  += c[i];
            sum1 += b[i];
        }
        for (int i = minx; i <= maxx; i++) {  // candidate maximum length i
            if (c[i] == 0) {                  // no leg of this length
                del  -= c[i + 1];
                sum1 -= b[i + 1];
                continue;
            }
            // keep at most 2*c[i]-1 legs, so at least n+1-2*c[i] must go
            dtemp = n + 1 - c[i] * 2;
            sum   = sum1;                     // longer legs must all be removed
            if (dtemp <= del) minn = min(minn, sum);
            else {
                // add all legs shorter than i into the cost buckets
                while (cnt < n && a[cnt].l < i)
                    num[a[cnt++].d]++;
                // remove the cheapest dtemp-del of the shorter legs
                for (int j = 1; j <= 200; j++) {
                    if (dtemp - del > num[j]) {
                        sum   += num[j] * j;
                        dtemp -= num[j];
                    } else {
                        sum  += (dtemp - del) * j;
                        dtemp = del;
                        break;
                    }
                }
                minn = min(minn, sum);
            }
            del  -= c[i + 1];
            sum1 -= b[i + 1];
        }
        cout << minn << endl;
    }
    return 0;
}