Sum
Time Limit: 2000/1000 MS (Java/Others) Memory Limit: 65536/65536 K (Java/Others)
Total Submission(s): 756 Accepted Submission(s): 405
Problem Description
There is a number sequence A1, A2, ..., An. You can select an interval [l, r] (or select nothing); every number Ai (l ≤ i ≤ r) in it becomes f(Ai), where f(x) = (1890x + 143) mod 10007. After that, the sum of the n numbers should be as large as possible. What is the maximum sum?
Input
There are multiple test cases.
First line of each case contains a single integer n (1 ≤ n ≤ 10^5).
Next line contains n integers A1, A2, ..., An (0 ≤ Ai ≤ 10^4).
It's guaranteed that ∑n ≤ 10^6.
Output
For each test case,output the answer in a line.
Sample Input
2
10000 9999
5
1 9999 1 9999 1
Sample Output
19999
22033
Source
Problem summary: Given a number sequence A1 ~ An, you may choose one contiguous interval [l, r] and change every Ai in it to f(Ai) = (1890 * Ai + 143) % 10007, or choose to do nothing at all. Find the maximum possible sum of all the numbers afterwards.
Analysis: maximum contiguous subarray sum. To maximize the final total, we want the interval [l, r] that maximizes the total increment, i.e. that maximizes Σ (f(Ai) - Ai) over l ≤ i ≤ r. Finding that interval is exactly the classic maximum contiguous subarray sum problem over the gains f(Ai) - Ai; an empty interval (gain 0) corresponds to selecting nothing.
Problem link: http://acm.hdu.edu.cn/showproblem.php?pid=5586
Code listing:
/*******************************************************************************
*** problem ID : HDU_5586.cpp
*** create time : Tue Dec 08 14:32:36 2015
*** author name : nndxy
*** author blog : http://blog.csdn.net/jhgkjhg_ugtdk77
*** author motto: never loose enthusiasm for life, life is to keep on fighting!
*******************************************************************************/
#include <bits/stdc++.h>
using namespace std;
#define exit() return 0
#define setIn(name) freopen(name".in", "r", stdin)
#define setOut(name) freopen(name".out", "w", stdout)
#define debug(x) cout << #x << " = " << x << endl
typedef long long ll;
typedef unsigned int uint;
typedef unsigned long long ull;
const int maxn = 100000 + 5;
const int INF = 100000000 + 5;
int n;
int A[maxn];
int B[maxn];
int ans;
void input() {
    ans = 0;
    for(int i = 1; i <= n; i++) {
        scanf("%d", &A[i]);
        ans += A[i];                                // ans = sum of the original numbers
        B[i] = (1890 * A[i] + 143) % 10007 - A[i];  // gain from applying f to A[i]
    }
}
void solve() {
    int sum = 0, ret = 0;          // ret = best (possibly empty) subarray gain
    for(int i = 1; i <= n; i++) {
        if(sum + B[i] > 0) {       // Kadane: extend the current segment
            sum += B[i];
            ret = max(ret, sum);
        }
        else sum = 0;              // a non-positive running sum never helps; restart
    }
    printf("%d\n", ans + ret);
}
int main() {
    while(scanf("%d", &n) != EOF) {
        input();
        solve();
    }
    exit();
}