Why both UNICODE and _UNICODE?
I've been looking at the command line generated by Visual Studio, and for one of my projects it defines two symbols: _UNICODE and UNICODE. Now, if I understand this rather old document correctly, the _UNICODE symbol is a VC++ thing that causes certain standard functions to use wchar_t instead of char in their interfaces.
But what does UNICODE, without the underscore, mean?
Solution 1:
Raymond Chen explains it here: TEXT vs. _TEXT vs. _T, and UNICODE vs. _UNICODE:
The plain versions without the underscore affect the character set the Windows header files treat as default. So if you define UNICODE, then GetWindowText will map to GetWindowTextW instead of GetWindowTextA, for example. Similarly, the TEXT macro will map to L"..." instead of "...".

The versions with the underscore affect the character set the C runtime header files treat as default. So if you define _UNICODE, then _tcslen will map to wcslen instead of strlen, for example. Similarly, the _TEXT macro will map to L"..." instead of "...".
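
To make the two mappings concrete, here is a minimal sketch (Windows-only, assuming MSVC with user32.lib linked in; the variable names are just for illustration) that defines both symbols before including the headers:

#define UNICODE      // Windows headers: GetWindowText -> GetWindowTextW, TEXT("...") -> L"..."
#define _UNICODE     // CRT headers:     _tcslen -> wcslen, _TEXT("...") -> L"..."

#include <windows.h>
#include <tchar.h>
#include <stdio.h>

int main(void)
{
    // TEXT comes from the Windows headers and follows UNICODE;
    // _T/_TEXT come from <tchar.h> and follow _UNICODE.
    // With both symbols defined, they all expand to wide (L"...") literals.
    const TCHAR message[] = TEXT("hello");

    // _tcslen follows _UNICODE, so here it compiles as wcslen.
    printf("%zu\n", _tcslen(message));

    // GetWindowText follows UNICODE, so here it compiles as GetWindowTextW
    // and expects a wide (wchar_t) buffer.
    TCHAR title[256];
    GetWindowText(GetForegroundWindow(), title, 256);

    return 0;
}

If only one of the two symbols is defined, the Windows-header macros and the CRT generic-text macros can disagree about whether TCHAR-style code is narrow or wide, which is why projects normally define both.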
Looking into the Windows SDK, you will find things like this:
#ifdef _UNICODE
#ifndef UNICODE
#define UNICODE
#endif
#endif
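
The headers don't necessarily forward the definition in both directions, so a small compile-time check in your own code (a sketch, not anything taken from the SDK) can catch a mismatched configuration early:

// Sketch: fail the build if only one of the two symbols is defined,
// so narrow/wide mismatches surface at compile time.
#if defined(UNICODE) && !defined(_UNICODE)
#error "UNICODE is defined but _UNICODE is not; define both or neither."
#elif defined(_UNICODE) && !defined(UNICODE)
#error "_UNICODE is defined but UNICODE is not; define both or neither."
#endif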
Solution 2:
In a nutshell, UNICODE is used by the Windows headers, whereas _UNICODE is used by the C-runtime/MFC headers.
Solution 3:
Compiler vendors have to prefix the identifiers in their header files with an underscore to prevent them from colliding with your identifiers. So <tchar.h>, a compiler header file, uses _UNICODE. The Windows SDK header files are compiler-agnostic, and stone-cold old; they use UNICODE without the underscore. You'll have to define both.
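
In practice that just means passing both definitions to the compiler (the Character Set project setting in Visual Studio does this for you). On the MSVC command line it might look like the following, where main.cpp is a placeholder file name:

cl /DUNICODE /D_UNICODE main.cpp user32.lib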