My friend @rayfix kindly gave me awesome feedback on the previous post about Converting Character to Unicode scalar value in Swift.

⚠️ I tried to use jekyll-twitter-plugin to embed tweets into this post, but it seems custom Jekyll plugins like that can’t be used with GitHub Pages 😢 So I just quote the tweets below.

He gave me two other approaches.

Approach 1: Simpler and shorter

@keitaitok Nice post. Made me think. How about just Int(UnicodeScalar("A")!.value) to avoid the loops? (The link to this tweet)

The UnicodeScalar struct’s init?(_ description: String) is used here (Doc Ref). It is a failable initializer, so the return value is Optional. Out of curiosity, why is this initializer failable? I checked the Swift repo on GitHub to see its implementation (I ❤️ open source). I am not sure if I am allowed to put the code here, so I just put the link instead. Please check it out yourself.

The initializer is defined in the UnicodeScalar.swift file under stdlib/public/core. Surprisingly, the implementation is quite simple. It just guard-checks that the passed string contains exactly one Unicode scalar; otherwise it returns nil. The following is simple sample code. Now you know why the latter value is nil 😉

let singleCharUniScalar = UnicodeScalar("A")
print(singleCharUniScalar) // Optional("A")
let threeCharsUniScalar = UnicodeScalar("ABC")
print(threeCharsUniScalar) // nil
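The guard described above can be paraphrased as a re-implementation of my own (the initializer name `singleScalarString` is made up for illustration; it is not the actual stdlib source):

```swift
// Rough paraphrase of the stdlib's failable initializer:
// succeed only when the string holds exactly one Unicode scalar.
extension UnicodeScalar {
    init?(singleScalarString s: String) {
        guard let first = s.unicodeScalars.first,
              s.unicodeScalars.count == 1 else { return nil }
        self = first
    }
}

print(UnicodeScalar(singleScalarString: "A") as Any)   // Optional("A")
print(UnicodeScalar(singleScalarString: "ABC") as Any) // nil
```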

Let’s get back to the converting code. UnicodeScalar’s value property is of type UInt32, so the Int initializer used here is init(_ value: UInt32) (Doc Ref). It doesn’t take an Optional, which is why you need to force-unwrap the UnicodeScalar. Cool, it’s much simpler than mine!
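Putting Approach 1 together end to end:

```swift
// String literal -> UnicodeScalar? -> UInt32 -> Int.
// Force-unwrapping is safe here because "A" is a single scalar.
let scalar = UnicodeScalar("A")!   // init?(_ description: String) returns Optional
let asciiValue = Int(scalar.value) // .value is UInt32; Int(_:) accepts it
print(asciiValue) // 65
```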

Approach 2: Much more!

@keitaitok An even shorter solution is UInt8(ascii: "A") (The link to the tweet)

UInt8 has a very specific initializer! I didn’t know this existed at all. This initializer takes only ASCII values, i.e., 0..<128. You should check out its implementation too. My original problem was how to get an ASCII value, so this approach totally works. If you want a Unicode scalar value rather than an ASCII value, use the first approach.

let asciiA = UInt8(ascii: "A")
print(asciiA) // 65
let ramen = UInt8(ascii: "🍜") // Runtime error: traps because 🍜 is not ASCII

I noticed one thing when I used this initializer. The parameter ascii’s type is UnicodeScalar, yet I can pass "A" directly here. This works because "A" is a literal, not a value of a fixed type: Swift infers a literal’s type from context, and UnicodeScalar conforms to ExpressibleByUnicodeScalarLiteral, so in this position the literal becomes a UnicodeScalar rather than a String. So strictly speaking it is a unicode scalar literal here, not a string literal 😛
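Since Swift picks a literal’s type from context (UnicodeScalar conforms to ExpressibleByUnicodeScalarLiteral), the very same "A" literal can become different types depending on where it appears:

```swift
let asString: String = "A"         // literal inferred as String
let asScalar: UnicodeScalar = "A"  // same literal inferred as UnicodeScalar
print(asScalar.value) // 65
```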

One small caveat: be aware of types! Approach 1’s final type is Int, while Approach 2’s is UInt8. You need to convert between them if you want to mix the two in operations, e.g., additions or greater/less-than comparisons.

let asciiA = UInt8(ascii: "A")
let number = 100 // inferred as Int
if asciiA < number { } // Compile error: can't compare UInt8 and Int
if Int(asciiA) < number { } // OK: convert UInt8 to Int first


  • There are several ways to get a Unicode/ASCII scalar value.
  • Browsing the Swift standard library is fun to explore. There are soooooo many APIs.
  • We can check implementations too via the Swift repo on GitHub. It helps to understand an API’s intentions.