Print unicode character from variable (swift)

I have a problem I couldn't find a solution to. I have a string variable holding the Unicode code point "1f44d" and I want to convert it to the corresponding Unicode character 👍.

Usually one would do something like this:

println("\u{1f44d}") // 👍

Here is what I mean:

let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working – \u{...} only accepts a literal hex value, not an interpolation

I have tried several other ways, but somehow the workings behind this magic remain hidden from me.

One should imagine the value of charAsString coming from an API call or from another object.


Solution 1:

One possible solution (explanations "inline"):

let charAsString = "1f44d"

// Convert hex string to numeric value first:
var charCode : UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {

    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}

Slightly simpler with Swift 2:

let charAsString = "1f44d"

// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}

Note also that not all code points are valid Unicode scalars; compare Validate Unicode code point in Swift.


Update for Swift 3:

public init?(_ v: UInt32)

is now a failable initializer of UnicodeScalar and checks if the given numeric input is a valid Unicode scalar value:

let charAsString = "1f44d"

// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
    let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
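To see that validity check in action: a UTF-16 surrogate code point such as d800 parses fine as a hex number in step one, but the failable UnicodeScalar initializer rejects it (a minimal sketch using the same pattern as above):

```swift
let invalid = "d800" // a UTF-16 surrogate, not a valid Unicode scalar

if let charCode = UInt32(invalid, radix: 16),
    let unicode = UnicodeScalar(charCode) {
    // Not reached: 0xD800 is in the surrogate range
    print(String(unicode))
} else {
    print("invalid input") // this branch is taken for "d800"
}
```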

Solution 2:

This can be done in two steps:

  1. convert charAsString to Int code
  2. convert code to unicode character

The second step can be done, e.g., like this:

let code: UInt32 = 0x1f44d
let scalar = UnicodeScalar(code)! // failable initializer in Swift 3
let string = "\(scalar)"

As for the first step, see here how to convert a String in hex representation to an Int.
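Putting both steps together (using the `UInt32(_:radix:)` initializer for the first step, as a sketch):

```swift
let charAsString = "1f44d"

// Step 1: convert the hex string to a numeric code
if let code = UInt32(charAsString, radix: 16),
    // Step 2: convert the code to a Unicode scalar (failable in Swift 3)
    let scalar = UnicodeScalar(code) {
    let string = "\(scalar)"
    print(string) // 👍
}
```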

Solution 3:

As of Swift 2.0, every integer type has an initializer that takes a String as input. You can then easily generate a corresponding UnicodeScalar and print it afterwards, without having to change your representation of the character as a string ;).

UPDATED: Swift 3.0 changed the UnicodeScalar initializer

print("\u{1f44d}") // 👍

let charAsString = "1f44d" // code in variable

let charAsInt = UInt32(charAsString, radix: 16)! // As indicated by @MartinR, the radix is required; the default (10) won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either force unwrapping or optional binding

print("\(uScalar)")

Solution 4:

Here are a couple of ways to do it:

let string = "1f44d"

Option 1:

"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)

Option 2:

"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
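Both transforms return an optional String (nil if the transform fails), so in practice you'd unwrap the result. A minimal sketch:

```swift
import Foundation

let string = "1f44d"

// Option 1: reverse the "to XML hex entity" transform on "&#x1f44d;"
if let thumbsUp = "&#x\(string);".applyingTransform(.toXMLHex, reverse: true) {
    print(thumbsUp) // 👍
}

// Option 2: reverse the ICU "Hex/Unicode" transform on "U+1f44d"
if let alsoThumbsUp = "U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true) {
    print(alsoThumbsUp)
}
```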