I'm trying to multiply the digits of a number. It should be pretty simple, but for some reason I'm getting some huge numbers and I can't figure out what I'm doing wrong.
I enter a number, it gets split into an array, and a loop runs through the digits and multiplies them:
var iArray = i.toString().toCharArray()
var iCount = iArray.count().toString()
var x = 0
var sum: Long = 1
while (x < iCount.toInt()) {
    Log.i(iArray[x].toString(), "array");
    sum *= iArray[x].toLong()
    x++
    Log.i(sum.toString(), "sum");
}
In the logcat I can see the correct numbers in the array. As an example, if I try 357, this is what I get as a result:

I/3: array
I/51: sum
I/5: array
I/2703: sum
I/7: array
I/148665: sum
But if I just calculate 3*5*7 it works fine. What am I missing?
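For context, the logged values suggest what is happening: 51, 53, and 55 are the UTF-16 character codes of '3', '5', and '7', so `Char.toLong()` appears to be converting each character to its code rather than its digit value (1 * 51 = 51, 51 * 53 = 2703, 2703 * 55 = 148665). A minimal sketch illustrating that hypothesis, using `digitToInt()` (available since Kotlin 1.5; `Character.getNumericValue()` or `ch - '0'` are older alternatives):

```kotlin
fun main() {
    val c = '3'
    // Converting a Char numerically yields its character code, not the digit
    println(c.code)          // 51
    // digitToInt() returns the digit the character represents
    println(c.digitToInt())  // 3

    // Multiplying the digits of 357 with digitToInt()
    var product: Long = 1
    for (ch in 357.toString()) {
        product *= ch.digitToInt()
    }
    println(product)         // 105, i.e. 3 * 5 * 7
}
```

Swapping `iArray[x].toLong()` for `iArray[x].digitToInt().toLong()` in the loop above would follow the same idea.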