Given a Roman numeral, convert it to an integer.
The input is guaranteed to be within the range from 1 to 3999.
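The only wrinkle is subtractive notation: a smaller symbol placed before a larger one is subtracted rather than added. For example, MCMXCIV = M + CM + XC + IV = 1000 + 900 + 90 + 4 = 1994.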
JAVA:
The Java solution maps each character to its value, then scans left to right: a value smaller than the one that follows it is subtracted (as in IV); otherwise it is added.
public class Solution {
    public int romanToInt(String s) {
        if (s == null || s.length() == 0) return 0;
        // Map each character to its numeric value.
        int[] num = new int[s.length()];
        for (int i = 0; i < s.length(); i++) {
            switch (s.charAt(i)) {
                case 'M': num[i] = 1000; break;
                case 'D': num[i] = 500;  break;
                case 'C': num[i] = 100;  break;
                case 'L': num[i] = 50;   break;
                case 'X': num[i] = 10;   break;
                case 'V': num[i] = 5;    break;
                case 'I': num[i] = 1;    break;
                default:  return 0;      // invalid character
            }
        }
        // A value smaller than its right neighbor is subtractive (the I in IV).
        int sum = 0;
        for (int i = 0; i < num.length - 1; i++) {
            if (num[i] < num[i + 1]) sum -= num[i];
            else sum += num[i];
        }
        sum += num[num.length - 1]; // the last value is always added
        return sum;
    }
}
GO:
The Go solution takes a more concise, recursive approach: find the first occurrence of the largest symbol in the given numeral; the result is that symbol's value, minus the converted value of everything to its left, plus the converted value of everything to its right.
import (
	"bytes"
)

// Symbols in descending order of value.
var order = [7]byte{'M', 'D', 'C', 'L', 'X', 'V', 'I'}
var value = [7]int{1000, 500, 100, 50, 10, 5, 1}

func romanToInt(s string) int {
	if s == "" {
		return 0
	}
	return convert([]byte(s))
}

func convert(num []byte) int {
	if len(num) == 0 { // len of a nil slice is 0, so this covers nil too
		return 0
	}
	// Find the first occurrence of the largest symbol present.
	for i, c := range order {
		idx := bytes.IndexByte(num, c)
		if idx == -1 {
			continue
		}
		if idx > 0 {
			// Anything left of the largest symbol is subtractive.
			return value[i] - convert(num[:idx]) + convert(num[idx+1:])
		}
		return value[i] + convert(num[idx+1:])
	}
	return 0
}
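As a quick sanity check, here is a minimal usage sketch; it assumes the code above lives in package main and that "fmt" is added to the import block:

func main() {
	// Trace for "MCMXCIV": the first 'M' is at index 0, giving
	// 1000 + convert("CMXCIV"); in "CMXCIV" the 'M' is at index 1,
	// giving 1000 - convert("C") + convert("XCIV"); and so on,
	// for a total of 1994.
	for _, s := range []string{"III", "LVIII", "MCMXCIV"} {
		fmt.Printf("%s = %d\n", s, romanToInt(s)) // expected: 3, 58, 1994
	}
}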