Milking Time
Time Limit: 1000MS | Memory Limit: 65536K
Total Submissions: 12910 | Accepted: 5453
Description
Bessie is such a hard-working cow. In fact, she is so focused on maximizing her productivity that she decides to schedule her next N (1 ≤ N ≤ 1,000,000) hours (conveniently labeled 0..N-1) so that she produces as much milk as possible.
Farmer John has a list of M (1 ≤ M ≤ 1,000) possibly overlapping intervals in which he is available for milking. Each interval i has a starting hour (0 ≤ starting_hour_i ≤ N), an ending hour (starting_hour_i < ending_hour_i ≤ N), and a corresponding efficiency (1 ≤ efficiency_i ≤ 1,000,000) which indicates how many gallons of milk he can get out of Bessie in that interval. Farmer John starts and stops milking at the beginning of the starting hour and ending hour, respectively. When being milked, Bessie must be milked through an entire interval.
Even Bessie has her limitations, though. After being milked during any interval, she must rest R (1 ≤ R ≤ N) hours before she can start milking again. Given Farmer John's list of intervals, determine the maximum amount of milk that Bessie can produce in the N hours.
Input
* Line 1: Three space-separated integers: N, M, and R
* Lines 2..M+1: Line i+1 describes FJ's ith milking interval with three space-separated integers: starting_hour_i, ending_hour_i, and efficiency_i
Output
* Line 1: The maximum number of gallons of milk that Bessie can produce in the N hours
Sample Input
12 4 2
1 2 8
10 12 19
3 6 24
7 10 31
Sample Output
43
Source
Vb
Analysis:
This is a variant of the longest increasing subsequence DP. First sort the intervals by starting hour so they are in increasing order. Let dp[i] be the maximum amount of milk obtainable if interval i is the last interval milked; then dp[i] = efficiency_i + max{dp[j] : j < i and ending_hour_j + R ≤ starting_hour_i}, which is exactly the O(M²) LIS-style recurrence. The answer is the maximum dp[i]. In the sample, milking (3,6,24), resting 2 hours, then milking (10,12,19) yields 24 + 19 = 43.
Implementation:
#include <cstdio>
#include <algorithm>
using namespace std;
#define N 1005
struct node
{
int l,r,e;
}a[N];
int dp[N]; // dp[i]: max milk if interval i is milked last
bool cmp(node b,node c)
{
return b.l<c.l;
}
int main()
{
int n,m,r;
while(scanf("%d%d%d",&n,&m,&r)!=EOF)
{
        for(int i=0;i<m;i++)
        {
            scanf("%d%d%d",&a[i].l,&a[i].r,&a[i].e);
            dp[i]=0;
        }
        sort(a,a+m,cmp); // sort intervals by starting hour
int ans=0;
for(int i=0;i<m;i++)
{
int maxx=0;
for(int j=0;j<i;j++)
                if(a[j].r+r<=a[i].l&&dp[j]>maxx) // interval j must end at least r hours before interval i starts
maxx=dp[j];
dp[i]=maxx+a[i].e;
ans=max(ans,dp[i]);
}
printf("%d\n",ans);
}
return 0;
}