Quick question (integers and accuracy)

Hey, why doesn't this work? The rest of the program doesn't really affect how this behaves, so I'm only going to paste the bit of code I'm curious about.

Code :

public static double Average(int total)
{
    double average = (double)(total / 10);
    return average;
}

When I leave the code as it is, it displays the answer as a whole number, even though the answer should have a decimal point.

I can fix it by removing the brackets around *total/10*, and it then produces the answer with the decimal point. I was just wondering: why do I have to remove the brackets? Surely they're the same thing :O

Re: Quick question (integers and accuracy)

It's the order of operations.

(double) (total/10) evaluates like this:

1. Take the integer total and divide it by the integer 10. This is integer division, so the fractional part is discarded.

2. Cast that result to a double. An integer cast to a double still holds a whole-number value, so it displays without a decimal part (see the sketch below).
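
A minimal sketch of the truncation (Java assumed here, and the value 47 is just a hypothetical total, not from your program):

public class CastDemo
{
    public static void main(String[] args)
    {
        int total = 47; // hypothetical value for illustration
        // Step 1: integer division happens first, 47 / 10 == 4; the .7 is thrown away
        // Step 2: the integer 4 is then cast to a double, giving 4.0
        double average = (double) (total / 10);
        System.out.println(average); // prints 4.0, not 4.7
    }
}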

Now if you have it the other way around,

(double)total / 10:

1. Cast total to a double.

2. Divide the double value of total by the integer 10. The 10 is implicitly cast to a double before the division, so the result keeps its decimal part.

This second form is equivalent to ((double)total)/10, because the cast binds more tightly than the division.
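
The same hypothetical value with the cast applied first:

public class CastDemo2
{
    public static void main(String[] args)
    {
        int total = 47; // same hypothetical value as before
        // Step 1: total is cast to a double, giving 47.0
        // Step 2: 10 is promoted to 10.0, so the division is 47.0 / 10.0
        double average = (double) total / 10;
        System.out.println(average); // prints 4.7
    }
}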