HDU 1005

Problem Description
A number sequence is defined as follows:
f(1) = 1, f(2) = 1, f(n) = (A * f(n - 1) + B * f(n - 2)) mod 7.
Given A, B, and n, you are to calculate the value of f(n).

Input
The input consists of multiple test cases. Each test case contains 3 integers A, B and n on a single line (1 <= A, B <= 1000, 1 <= n <= 100,000,000). Three zeros signal the end of input and this test case is not to be processed.

Output
For each test case, print the value of f(n) on a single line.

Sample Input
1 1 3
1 2 10
0 0 0

Sample Output
2
5
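As a sanity check, the recurrence can be evaluated directly from its definition. The sketch below (with `f_naive` as a name chosen here, not from the problem) runs in O(n) per query, which is fine for the samples but far too slow for n up to 100,000,000 — hence the need for the periodicity trick used in the submission:

```cpp
#include <cassert>

// Direct evaluation of the definition:
// f(1) = f(2) = 1, f(n) = (A*f(n-1) + B*f(n-2)) mod 7.
// Iterative, O(n) time, O(1) space.
int f_naive(int A, int B, int n) {
    int f1 = 1, f2 = 1; // f(1), f(2)
    for (int i = 3; i <= n; ++i) {
        int f3 = (A * f2 + B * f1) % 7;
        f1 = f2;
        f2 = f3;
    }
    return f2; // for n == 1 or 2, f1 == f2 == 1 already
}
```

Both sample cases check out: f_naive(1, 1, 3) yields 2 and f_naive(1, 2, 10) yields 5, matching the expected output.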

Submission


#include <cstdio>

int main() {
	int A, B, n;
	while (scanf("%d%d%d", &A, &B, &n) == 3 && (A || B || n)) {
		// The pair (f(k), f(k+1)) can take at most 7 * 7 = 49
		// distinct values mod 7, so by pigeonhole the sequence
		// repeats with period at most 49; reduce n to an
		// equivalent index in [1, 49] and iterate from there.
		n %= 49;
		if (n == 0) n = 49;
		int f1 = 1, f2 = 1; // f(1), f(2)
		for (int i = 3; i <= n; i++) {
			int f3 = (A * f2 + B * f1) % 7;
			f1 = f2;
			f2 = f3;
		}
		printf("%d\n", f2);
	}
	return 0;
}
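An alternative that avoids reasoning about the period entirely: the recurrence is linear, so f(n) can be obtained by raising a 2x2 companion matrix to the (n-2)th power mod 7 with fast exponentiation, O(log n) per query. This is a sketch of that standard technique, not part of the original submission; `Mat` and `f_matpow` are names chosen here:

```cpp
#include <cassert>

// Companion-matrix form of the recurrence:
// [f(n)  ]   [A B]^(n-2) [f(2)]
// [f(n-1)] = [1 0]        [f(1)]
struct Mat { long long a, b, c, d; }; // 2x2 matrix, entries mod 7

Mat mul(Mat x, Mat y) {
    return { (x.a * y.a + x.b * y.c) % 7, (x.a * y.b + x.b * y.d) % 7,
             (x.c * y.a + x.d * y.c) % 7, (x.c * y.b + x.d * y.d) % 7 };
}

int f_matpow(int A, int B, long long n) {
    if (n <= 2) return 1;
    Mat r = {1, 0, 0, 1};           // identity
    Mat m = {A % 7, B % 7, 1, 0};   // companion matrix
    // Binary exponentiation: compute m^(n-2).
    for (long long e = n - 2; e > 0; e >>= 1) {
        if (e & 1) r = mul(r, m);
        m = mul(m, m);
    }
    // Apply to the start vector [f(2), f(1)] = [1, 1].
    return (int)((r.a + r.b) % 7);
}
```

This handles the full range n <= 100,000,000 in about 27 matrix squarings per query, without needing the period-49 observation at all.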
