How to convert Unicode Character to Int in Swift

With Swift 5, Unicode.Scalar has a value property. value has the following declaration:

var value: UInt32 { get }

A numeric representation of the Unicode scalar.
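
As a quick sketch (using the same U+0D85 character as the examples below), you can read value directly from a Unicode.Scalar:

let scalar: Unicode.Scalar = "\u{0D85}"
print(scalar.value)                         // UInt32: 3461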

The following Playground sample code shows how to iterate over the unicodeScalars property of a Character or a String and print the UInt32 value of each Unicode scalar:

let char: Character = "\u{0D85}"

for scalar in char.unicodeScalars {
    print(scalar.value)
}

/*
 prints: 3461
 */
let str: String = "\u{0D85}"

for scalar in str.unicodeScalars {
    print(scalar.value)
}

/*
 prints: 3461
 */
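
The same loop also handles a Character built from several scalars. As a sketch (the flag emoji is only an illustrative assumption), a flag is a single Character backed by two regional indicator scalars:

let flag: Character = "\u{1F1F1}\u{1F1F0}"  // 🇱🇰 (one Character, two scalars)

for scalar in flag.unicodeScalars {
    print(scalar.value)
}

/*
 prints: 127473
 127472
 */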

Hex to Int

If you are starting with \u{0D85} (without quotes) and you know the hex value of the Unicode character, then you might as well write it as a hexadecimal integer literal, because that is already an Int.

let myInt = 0x0D85                          // Int: 3461
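
As a quick sanity check (a sketch reusing myInt from above), this hex literal matches the scalar value printed earlier:

let checkScalar: Unicode.Scalar = "\u{0D85}"
print(myInt == Int(checkScalar.value))      // true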

String to Int

I assume, though, that you have "\u{0D85}" (in quotes), which makes it a String by default. And since it is a String, in the general case you can't be certain that it contains only a single Unicode scalar value.

let myString = "\u{0D85}"

for element in myString.unicodeScalars {
    let myInt = element.value               // UInt32: 3461
}
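
If you need all of the scalar values rather than just the last one the loop assigns, mapping over unicodeScalars is one option (a sketch; the emoji string is only an illustrative assumption):

let values = myString.unicodeScalars.map { Int($0.value) }     // [3461]

let waving = "\u{1F44B}\u{1F3FD}"                              // 👋🏽 (one visible character)
let wavingValues = waving.unicodeScalars.map { Int($0.value) } // [128075, 127997]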

I could have also used myString.utf16 and let myInt = Int(element), but I find it easier to deal with Unicode scalar values (UTF-32) when there is a possibility of things like emoji. (See this answer for more details.)
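
For reference, that utf16 alternative might look like this sketch; with an emoji outside the Basic Multilingual Plane (an illustrative assumption), utf16 yields a surrogate pair instead of one value per character:

for element in myString.utf16 {
    let myInt = Int(element)                 // Int: 3461
}

let emoji = "\u{1F600}"                      // 😀
print(Array(emoji.utf16))                    // [55357, 56832] (surrogate pair)
print(emoji.unicodeScalars.map { $0.value }) // [128512]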

Character to Int

A Swift Character represents an extended grapheme cluster. With Swift 5 it does have a unicodeScalars property (used in the first example above), but it has no utf16 view. So if you want UTF-16 code units, or you are on an older Swift version where Character lacks unicodeScalars, convert the Character to a String first and then follow the directions in the String to Int section above.

let myChar: Character = "\u{0D85}"
let myString = String(myChar)
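
From there the String to Int approach applies; a brief sketch completing the example:

for scalar in myString.unicodeScalars {
    let myInt = scalar.value                // UInt32: 3461
}

// Or, with Swift 5, directly from the Character:
for scalar in myChar.unicodeScalars {
    let myInt = scalar.value                // UInt32: 3461
}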