Count number of decimal places in a Float (or Decimal) in Swift

Doing this with Decimal is fairly straightforward, provided you correctly create your Decimal. Decimals are stored as significand * 10^exponent. significand is normalized to the smallest integer possible. So for 1230, the significand is 123 and the exponent is 1. For 1.23 the significand is also 123 and the exponent is -2. That leads us to:

extension Decimal {
    /// The number of digits after the decimal point: the negated exponent,
    /// clamped to zero for whole values.
    var significantFractionalDecimalDigits: Int {
        return max(-exponent, 0)
    }
}
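
A quick sanity check of the examples above (creating the values from strings so no binary conversion gets in the way, as discussed below):

let whole = Decimal(string: "1230")!
let fractional = Decimal(string: "1.23")!

whole.significantFractionalDecimalDigits       // 0 (the exponent is not negative)
fractional.significantFractionalDecimalDigits  // 2 (significand 123, exponent -2)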

However, you must be very careful constructing your Decimal. If you construct it from a Double, you will already have applied binary rounding errors. So for example:

let n = Decimal(0.111) // 0.11100000000000002048 because you passed a Double
n.significantFractionalDecimalDigits // 20

vs.

let n = Decimal(string: "0.111")!
n.significantFractionalDecimalDigits // 3 what you meant

Keep in mind of course that Decimal has a maximum number of significant digits, so it may still apply rounding.

let n = Decimal(string: "12345678901235678901234567890.1234567890123456789")!
n.significantFractionalDecimalDigits // 9 ("should" be 19)

And if you're walking down this road at all, you really must read the Floating Point Guide and the canonical StackOverflow question: Is floating point math broken?


This is actually really hard, because the binary floating-point value that gets stored rarely matches the decimal literal you wrote. For example, the nearest 64-bit IEEE 754 floating-point value to 5.98 is

5.980000000000000426325641456060111522674560546875

Presumably in this case you want the answer to be 2.
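
A quick way to see this for yourself, as a sketch assuming Foundation's printf-style formatting and a 64-bit Double:

import Foundation

// Asking for the full fractional expansion reveals the exact stored value.
print(String(format: "%.48f", 5.98))
// 5.980000000000000426325641456060111522674560546875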

The easiest thing to do is to convert to a string with your favourite formatter, using 15 significant figures (for a double-precision type), and inspect the output. It's not particularly fast, but it is reliable. For a 32-bit floating-point type, use 7 significant figures.
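
For instance, here is a minimal sketch of that idea using printf-style formatting (the helper name is mine, not part of the answer):

import Foundation

// Format to 15 significant figures ("%.15g"), which drops the binary noise
// and any trailing zeros, then count the digits after the decimal point.
// Caveat: %g switches to exponent notation for very large or very small
// magnitudes, which this sketch does not handle.
func significantFractionDigits(_ value: Double) -> Int {
    let formatted = String(format: "%.15g", value)   // "5.98", not "5.9800000000000004..."
    guard let dot = formatted.firstIndex(of: ".") else { return 0 }
    return formatted.distance(from: formatted.index(after: dot), to: formatted.endIndex)
}

significantFractionDigits(5.98)   // 2
significantFractionDigits(0.111)  // 3
significantFractionDigits(42.0)   // 0 ("%g" prints "42", with no fractional part)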

That said, if you can use a decimal type from the get-go then do that.


For me, there is a fairly easy solution that works in any region and on any device, because the standard Double-to-String conversion already hides the binary rounding error for you:

extension Double {
    func decimalCount() -> Int {
        // Whole numbers have no decimal places. Using rounded(.towardZero)
        // avoids trapping on values outside Int's range.
        if self == self.rounded(.towardZero) {
            return 0
        }

        // Work with the magnitude so the minus sign doesn't skew the count
        // for values between -1 and 0.
        let value = self.magnitude
        let integerString = String(Int(value))
        let doubleString = String(value)

        // Everything after the integer digits and the "." is a decimal place.
        // Note: this assumes the value doesn't print in exponent notation
        // (e.g. 1e-10).
        return doubleString.count - integerString.count - 1
    }
}
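
Usage, under the assumption noted in the comments (the value prints without an exponent):

(3.0).decimalCount()    // 0
(0.111).decimalCount()  // 3
(-2.25).decimalCount()  // 2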

Edit: it should work the same for Double or Float.


This is terrible 😆, but you can actually count the length of the string after the dot.

extension Double {

    var decimalPlaces: Int {
        // Swift prints whole Doubles as "x.0", hence the special case below.
        // Caution: this crashes for values that print in exponent notation
        // (e.g. 1e+16 or 1e-10), because there is no "." to split on.
        let decimals = String(self).split(separator: ".")[1]
        return decimals == "0" ? 0 : decimals.count
    }
}
print(3.decimalPlaces) // 0
print(3.1.decimalPlaces) // 1
print(3.14.decimalPlaces) // 2
print(3.1415.decimalPlaces) // 4
print(3.141592.decimalPlaces) // 6
print(3.14159265.decimalPlaces) // 8

Tags: Swift