How to convert a virtual-key code to a character according to the current keyboard layout?

Solution 1:

The correct solution is the ToUnicode WinAPI function:

[DllImport("user32.dll")]
public static extern int ToUnicode(uint virtualKeyCode, uint scanCode,
    byte[] keyboardState,
    [Out, MarshalAs(UnmanagedType.LPWStr, SizeConst = 64)]
    StringBuilder receivingBuffer,
    int bufferSize, uint flags);

One way to wrap this into a sensible, convenient method would be:

static string GetCharsFromKeys(Keys keys, bool shift, bool altGr)
{
    var buf = new StringBuilder(256);
    // 256-byte array indexed by virtual-key code; 0xff marks a key as held down
    var keyboardState = new byte[256];
    if (shift)
        keyboardState[(int) Keys.ShiftKey] = 0xff;
    if (altGr)
    {
        // AltGr is reported to the layout as Ctrl + Alt
        keyboardState[(int) Keys.ControlKey] = 0xff;
        keyboardState[(int) Keys.Menu] = 0xff;
    }
    // Translate the virtual key using the active keyboard layout and the state above
    WinAPI.ToUnicode((uint) keys, 0, keyboardState, buf, 256, 0);
    return buf.ToString();
}

Now we can retrieve characters and actually get the expected results:

Console.WriteLine(GetCharsFromKeys(Keys.E, false, false));    // prints e
Console.WriteLine(GetCharsFromKeys(Keys.E, true, false));     // prints E

// Assuming British keyboard layout:
Console.WriteLine(GetCharsFromKeys(Keys.E, false, true));     // prints é
Console.WriteLine(GetCharsFromKeys(Keys.E, true, true));      // prints É

It is also possible to use ToUnicodeEx to retrieve the characters for a keyboard layout other than the currently active one. The signature is the same except for one extra parameter, the input locale identifier (an HKL), which can be obtained with the LoadKeyboardLayout function.
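
For illustration, here is a sketch of how that variant might look, reusing the marshalling from above. The GetCharsFromKeysEx name and the use of the German KLID "00000407" are illustrative assumptions, not part of the original answer:

// Requires: using System.Text; using System.Runtime.InteropServices;
[DllImport("user32.dll")]
public static extern int ToUnicodeEx(uint virtualKeyCode, uint scanCode,
    byte[] keyboardState,
    [Out, MarshalAs(UnmanagedType.LPWStr, SizeConst = 64)]
    StringBuilder receivingBuffer,
    int bufferSize, uint flags, IntPtr keyboardLayout);

[DllImport("user32.dll")]
public static extern IntPtr LoadKeyboardLayout(string keyboardLayoutId, uint flags);

static string GetCharsFromKeysEx(Keys keys, bool shift, bool altGr, IntPtr layout)
{
    var buf = new StringBuilder(256);
    var keyboardState = new byte[256];
    if (shift)
        keyboardState[(int) Keys.ShiftKey] = 0xff;
    if (altGr)
    {
        keyboardState[(int) Keys.ControlKey] = 0xff;
        keyboardState[(int) Keys.Menu] = 0xff;
    }
    // Same as before, but the translation uses the supplied layout handle
    ToUnicodeEx((uint) keys, 0, keyboardState, buf, 256, 0, layout);
    return buf.ToString();
}

// Hypothetical usage: "00000407" is the KLID of the German (Germany) layout
IntPtr german = LoadKeyboardLayout("00000407", 0);
Console.WriteLine(GetCharsFromKeysEx(Keys.E, false, true, german));   // should print € on the German layout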

Solution 2:

This can also be achieved with the older ToAscii function; note that, unlike ToUnicode, it returns an ANSI result, so it cannot represent characters outside the active code page:

[DllImportAttribute("User32.dll")]
public static extern int ToAscii(int uVirtKey, int uScanCode, byte[] lpbKeyState,
        byte[] lpChar, int uFlags);

Example usage can be found here: http://www.pcreview.co.uk/forums/toascii-function-t1706394.html
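
In case that link goes stale, here is a minimal usage sketch. It assumes the declaration above and fills lpbKeyState with the current key state via GetKeyboardState; the GetCharFromKey helper name is just for illustration:

[DllImport("user32.dll")]
static extern bool GetKeyboardState(byte[] lpKeyState);

static char? GetCharFromKey(int virtualKey)
{
    var keyboardState = new byte[256];
    GetKeyboardState(keyboardState);   // snapshot of the current key/modifier state

    var result = new byte[2];          // ToAscii writes up to two bytes here
    int count = ToAscii(virtualKey, 0, keyboardState, result, 0);

    // 0: no translation; 1: one character; 2: a dead key could not be combined
    return count > 0 ? (char?) (char) result[0] : null;
}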