How to convert a Rust char to an integer so that '1' becomes 1?

The method you need is char::to_digit. It converts a char to the number it represents in the given radix.
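For reference, to_digit returns an Option<u32>: Some when the character is a valid digit in the given radix, None otherwise — which is why code using it has to unwrap or otherwise handle the None case:

```rust
fn main() {
    // Some(n) for a valid digit in the radix...
    assert_eq!('1'.to_digit(10), Some(1));
    assert_eq!('f'.to_digit(16), Some(15));
    // ...and None for anything else.
    assert_eq!('p'.to_digit(10), None);
    println!("ok");
}
```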

You can also use Iterator::sum to conveniently compute the sum of a sequence:

fn main() {
    const RADIX: u32 = 10;
    let x = "134";
    println!("{}", x.chars().map(|c| c.to_digit(RADIX).unwrap()).sum::<u32>());
}

my_char as u32 - '0' as u32

Now, there's a lot more to unpack about this answer.

It works because the ASCII (and thus UTF-8) encodings place the Arabic numerals 0-9 in ascending, contiguous order, so you can take the scalar values and subtract them.

However, what should it do for values outside this range? If you provide 'p', it returns 64. What about '.'? The subtraction underflows, which panics in debug builds (and wraps around in release). And '♥' returns 9781.

Strings are not just bags of bytes. They are UTF-8 encoded and you cannot just ignore that fact. Every char can hold any Unicode scalar value.

That's why strings are the wrong abstraction for the problem.
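One way to keep the subtraction while avoiding these failure modes is to guard on char::is_ascii_digit first; a minimal sketch (the function name is illustrative):

```rust
// Only ASCII '0'..='9' produce a value; everything else yields None.
fn digit_value(c: char) -> Option<u32> {
    if c.is_ascii_digit() {
        Some(c as u32 - '0' as u32)
    } else {
        None
    }
}

fn main() {
    assert_eq!(digit_value('7'), Some(7));
    assert_eq!(digit_value('♥'), None);
    println!("ok");
}
```

This is essentially what to_digit(10) already does for you.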

From an efficiency perspective, allocating a string just to sum digits is wasteful. Rosetta Code has an example of an iterator that only performs numeric operations:

/// Yields the digits of the number in field 0, in the radix in field 1,
/// least-significant digit first.
struct DigitIter(usize, usize);

impl Iterator for DigitIter {
    type Item = usize;
    fn next(&mut self) -> Option<Self::Item> {
        if self.0 == 0 {
            None // all digits have been consumed
        } else {
            let ret = self.0 % self.1; // current least-significant digit
            self.0 /= self.1;          // drop that digit from the number
            Some(ret)
        }
    }
}

fn main() {
    println!("{}", DigitIter(1234, 10).sum::<usize>());
}
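The same allocation-free idea can also be sketched with std::iter::from_fn instead of a named struct (the digits name here is illustrative):

```rust
use std::iter;

// Yields the base-`radix` digits of `n`, least-significant first.
fn digits(mut n: usize, radix: usize) -> impl Iterator<Item = usize> {
    iter::from_fn(move || {
        if n == 0 {
            None
        } else {
            let d = n % radix;
            n /= radix;
            Some(d)
        }
    })
}

fn main() {
    assert_eq!(digits(1234, 10).sum::<usize>(), 10);
    println!("ok");
}
```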

Another way is to iterate over the characters of your string, converting and adding them with fold.

fn sum_of_string(s: &str) -> u32 {
    s.chars().fold(0, |acc, c| acc + c.to_digit(10).unwrap_or(0))
}

fn main() {
    let x = "123";
    println!("{}", sum_of_string(x));
}
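Note that unwrap_or(0) silently treats any non-digit as 0. If you would rather skip non-digits explicitly, filter_map expresses that directly; a small variant with the same result:

```rust
fn sum_of_string(s: &str) -> u32 {
    // filter_map drops the Nones, keeping only actual digits.
    s.chars().filter_map(|c| c.to_digit(10)).sum()
}

fn main() {
    assert_eq!(sum_of_string("12a3"), 6);
    println!("ok");
}
```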

If c is your character, you can just write:

c as i32 - 0x30;

Test with:

let c: char = '2';
let n: i32 = c as i32 - 0x30;
println!("{}", n);

output:

2

NB: 0x30 is '0' in ASCII table, easy enough to remember!