Why does Decimal.Divide(int, int) work, but not (int / int)?

int is an integer type; dividing two ints performs integer division, i.e. the fractional part is truncated because it cannot be stored in the result type (also int). decimal, by contrast, can represent a fractional part. By calling Decimal.Divide, your int arguments are implicitly converted to decimal before the division takes place.
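
A minimal sketch of the difference (the variable names are just for illustration):

int a = 1;
int b = 2;

// Integer division: both operands are int, so the fractional part is discarded.
int truncated = a / b;                 // 0

// Decimal.Divide takes decimal parameters; the int arguments are implicitly
// converted to decimal, so the fractional part is kept.
decimal exact = Decimal.Divide(a, b);  // 0.5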

You can force non-integer division on int operands by explicitly casting at least one of them to a floating-point type, e.g.:

int a = 42;
int b = 23;
double result = (double)a / b;   // b is promoted to double, so the fractional part is kept
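
Casting to decimal works the same way if you want a decimal result instead of a double; a small sketch reusing a and b from above:

decimal decimalResult = (decimal)a / b;   // b is implicitly converted to decimal; result is 1.8260869565...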

In the first case (int / int), you're doing integer division, so the result is truncated (the fractional part is discarded) and an int is returned.

In the second case (Decimal.Divide), the ints are converted to decimals first, and the result is a decimal. Hence the result is not truncated and you get the exact value.


The following code:

int a = 1, b = 2;
object result = a / b;   // integer division: result boxes the int value 0

...will be performed using integer arithmetic. Decimal.Divide, on the other hand, takes two parameters of type decimal, so the division is performed on decimal values rather than integer values. That is equivalent to this:

int a = 1, b = 2;
object result = (Decimal)a / (Decimal)b;   // decimal division: result boxes the decimal value 0.5

To verify this, you can add the following lines after each of the examples above:

Console.WriteLine(result.ToString());
Console.WriteLine(result.GetType().ToString());

The output in the first case will be:

0
System.Int32

...and in the second case:

0,5
System.Decimal
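
The comma in "0,5" simply reflects the current culture's decimal separator, since ToString() is culture-sensitive. A small sketch, assuming the result variable from the second example, that forces the invariant "." separator:

Console.WriteLine(((decimal)result).ToString(System.Globalization.CultureInfo.InvariantCulture));   // prints 0.5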

Tags: C#, Int, Math, Divide