Problem Description
A number sequence is defined as follows:
f(1) = 1, f(2) = 1, f(n) = (A * f(n - 1) + B * f(n - 2)) mod 7.
Given A, B, and n, you are to calculate the value of f(n).
Input
The input consists of multiple test cases. Each test case contains 3 integers A, B and n on a single line (1 <= A, B <= 1000, 1 <= n <= 100,000,000). Three zeros signal the end of input and this test case is not to be processed.
Output
For each test case, print the value of f(n) on a single line.
Sample Input
1 1 3
1 2 10
0 0 0
Sample Output
2
5
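Since n can be as large as 100,000,000, computing the sequence term by term is too slow. But every value is taken mod 7, so a consecutive pair (f(i-1), f(i)) has at most 7 * 7 = 49 possible states, and by the pigeonhole principle the sequence of pairs must repeat. The sketch below (a hypothetical helper named fib_mod7, not part of the original post) records the first index at which each pair appears and reduces n into the detected cycle; unlike waiting for the pair (1, 1) specifically, this also handles the edge case where A and B are both multiples of 7 and the start pair never recurs:

```cpp
// Hypothetical helper: computes f(n) for f(1) = f(2) = 1,
// f(n) = (A * f(n-1) + B * f(n-2)) mod 7, using pair-cycle detection.
int fib_mod7(int A, int B, long long n) {
    if (n <= 2) return 1;
    A %= 7;
    B %= 7;
    int f[120];
    f[1] = f[2] = 1;
    int seen[7][7] = {};            // first index at which each pair appeared (0 = unseen)
    seen[1][1] = 2;                 // pair (f(1), f(2)) first seen at index 2
    for (int i = 3; i < 120; ++i) {
        f[i] = (A * f[i - 1] + B * f[i - 2]) % 7;
        int &s = seen[f[i - 1]][f[i]];
        if (s) {                    // pair repeats: cycle starts at s, period is i - s
            long long m = n;
            if (m >= i) m = s + (m - s) % (i - s);
            return f[m];
        }
        s = i;
    }
    return f[0];                    // unreachable: a pair repeats within 51 steps
}
```

On the sample data, fib_mod7(1, 1, 3) and fib_mod7(1, 2, 10) give 2 and 5, matching the expected output.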
The code is as follows:
#include <iostream>
using namespace std;

int num[10000];

int main()
{
    int A, B, n;
    num[1] = num[2] = 1;
    while (cin >> A >> B >> n, A != 0 || B != 0 || n != 0)
    {
        // Every value is taken mod 7, so the pair (num[i-1], num[i]) has at
        // most 49 states; once the starting pair (1, 1) reappears, the
        // sequence repeats with period i - 2.
        int i;
        for (i = 3; i < 10000; i++)
        {
            num[i] = (A * num[i - 1] + B * num[i - 2]) % 7;
            if (num[i] == 1 && num[i - 1] == 1)
                break;
        }
        n = n % (i - 2);
        num[0] = num[i - 2];   // covers n being an exact multiple of the period
        cout << num[n] << endl;
    }
    return 0;
}
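An alternative that avoids cycle detection entirely is fast matrix exponentiation: writing [f(n), f(n-1)] as M^(n-2) applied to [f(2), f(1)] with M = [[A, B], [1, 0]], all arithmetic mod 7, gives f(n) in O(log n) multiplications. This is a sketch of that alternative (the names Mat, mul, and fib_mod7_matrix are my own, not from the post):

```cpp
// Alternative approach (assumed, not the post's method): 2x2 matrix fast power.
// [f(n), f(n-1)]^T = M^(n-2) * [f(2), f(1)]^T with M = [[A, B], [1, 0]], mod 7.
struct Mat { int a, b, c, d; };   // 2x2 matrix [[a, b], [c, d]]

Mat mul(Mat x, Mat y) {
    return { (x.a * y.a + x.b * y.c) % 7, (x.a * y.b + x.b * y.d) % 7,
             (x.c * y.a + x.d * y.c) % 7, (x.c * y.b + x.d * y.d) % 7 };
}

int fib_mod7_matrix(int A, int B, long long n) {
    if (n <= 2) return 1;
    Mat r = {1, 0, 0, 1};             // identity
    Mat m = {A % 7, B % 7, 1, 0};
    for (long long e = n - 2; e > 0; e >>= 1) {   // binary exponentiation
        if (e & 1) r = mul(r, m);
        m = mul(m, m);
    }
    return (r.a + r.b) % 7;           // f(n) = r.a * f(2) + r.b * f(1)
}
```

For a single query the cycle method is already fast enough, but the matrix form needs no table and works unchanged for any modulus.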
Result:
The code was adapted from a solution found online; every version I wrote myself got RUNTIME ERROR. Frustrating, I can't even do Fibonacci anymore.