Why does everybody typedef over standard C types?

If you want to use Qt, you have to embrace quint8, quint16 and so forth.

If you want to use GLib, you have to welcome guint8, guint16 and so forth.

In the Linux kernel there are u32, s16 and so forth.

uC/OS defines SINT32, UINT16 and so forth.

And if you have to use some combination of those things, you had better be prepared for trouble: on your machine u32 may be typedef'd over long while quint32 is typedef'd over int, and the compiler will complain when you mix them.
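
A minimal sketch of the kind of clash meant here, assuming a target where int and long are both 32 bits; the headers, names and function are stand-ins, not any library's actual declarations:

```c
/* Stand-ins for what two different library headers might declare
 * on a target where int and long are both 32 bits (assumption). */
typedef unsigned long u32;     /* e.g. a kernel-style header */
typedef unsigned int  quint32; /* e.g. a Qt-style header */

void store_id(quint32 *out); /* hypothetical library function */

void example(void)
{
    u32 id;
    store_id(&id); /* warning: incompatible pointer types --
                    * u32 and quint32 are distinct types even
                    * though both are 32 bits wide here */
}
```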

Why does everybody do this when <stdint.h> exists? Is this some kind of tradition among libraries?


Solution 1:

stdint.h didn't exist back when these libraries were being developed. So each library made its own typedefs.
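
For illustration, a sketch of the kind of portability header such libraries shipped before C99; the macro names and per-platform choices are assumptions for the example, not any particular library's actual header:

```c
/* my_types.h -- illustrative pre-C99 portability header. */
#ifndef MY_TYPES_H
#define MY_TYPES_H

#if defined(_MSC_VER)
typedef __int32          my_int32;  /* MSVC's non-standard exact-width type */
typedef unsigned __int32 my_uint32;
#elif defined(_LP64) || defined(__LP64__)
typedef int              my_int32;  /* long is 64 bits here, int is 32 */
typedef unsigned int     my_uint32;
#else
typedef long             my_int32;  /* assume a 32-bit long elsewhere */
typedef unsigned long    my_uint32;
#endif

#endif /* MY_TYPES_H */
```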

Solution 2:

For the older libraries, this is needed because the header in question (stdint.h) didn't exist yet when they were developed.

There's still a problem today, however: the exact-width types (uint64_t and the others) are an optional feature of the standard. A conforming implementation might not ship with them, which forces libraries to keep providing their own even now.
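
A sketch of how a library can guard against that, relying on the fact that <stdint.h> defines UINT64_MAX exactly when uint64_t exists; the mylib_u64 name and the fallback choice are assumptions for the example:

```c
#include <stdint.h>

#ifdef UINT64_MAX
typedef uint64_t mylib_u64;           /* the optional standard type exists */
#else
typedef unsigned long long mylib_u64; /* assumed fallback: at least 64 bits,
                                       * though possibly wider */
#endif
```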

Solution 3:

stdint.h has been standardised since 1999. More likely, many applications define (effectively alias) their own types to maintain partial independence from the underlying machine architecture.

They give developers confidence that the types used in their application match their project-specific assumptions about behaviour, assumptions which may not match either the language standard or a particular compiler implementation.
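
For instance, such aliases can be paired with compile-time checks so those assumptions are verified rather than hoped for; the app_* names here are hypothetical, and the check uses C11's _Static_assert:

```c
#include <limits.h>

typedef unsigned int app_u32; /* project-wide 32-bit unsigned (assumption) */
typedef int          app_s32; /* project-wide 32-bit signed (assumption) */

/* Fail the build if a compiler or architecture breaks the assumption. */
_Static_assert(sizeof(app_u32) * CHAR_BIT == 32, "app_u32 must be 32 bits");
_Static_assert(sizeof(app_s32) * CHAR_BIT == 32, "app_s32 must be 32 bits");
```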

The practice is mirrored in the object-oriented Façade design pattern, and is much abused by developers who invariably write wrapper classes for every imported library.

When compilers were much less standardised and machine architectures varied from 16-bit and 18-bit machines through 36-bit word-length mainframes, this was much more of a consideration. The practice is far less relevant now, in a world converging on 32-bit ARM embedded systems, but it remains a concern for low-end microcontrollers with odd memory maps.