Difference between uint and unsigned int?

Is there any difference between uint and unsigned int?

I searched this site, but all the questions I found refer to C# or C++. I'd like an answer about the C language.

If it is relevant, note that I'm using GCC under Linux.


Solution 1:

uint isn't a standard type - unsigned int is.
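
A quick way to see the difference (a minimal sketch; the exact diagnostic wording depends on your compiler):

/* unsigned int is a built-in type and always compiles; uint only
   exists if a typedef or some header declares it. With neither,
   GCC rejects the second line with "unknown type name 'uint'". */
unsigned int a = 1u;   /* always valid C      */
uint b = 1u;           /* error: no such type */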

Solution 2:

Some systems may define uint as a typedef.

typedef unsigned int uint;

On those systems the two are the same. But uint is not a standard type, so not every system will support it, and thus it is not portable.
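
Since the question mentions GCC under Linux: glibc's <sys/types.h> is one header that ships such a typedef (as a non-standard BSD-compatibility name), so a sketch like this typically compiles there, but don't rely on it in portable code:

#include <stdio.h>
#include <sys/types.h>   /* on glibc, provides: typedef unsigned int uint; */

int main(void)
{
    uint x = 42u;          /* identical to unsigned int on this system */
    unsigned int y = x;    /* no conversion needed: same type          */
    printf("%u %u\n", x, y);
    return 0;
}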

Solution 3:

I am extending the answers by Erik, Teoman Soygul, and taskinoor a bit.

uint is not a standard type.

Hence, defining your own shorthand like this is discouraged:

typedef unsigned int uint;
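
One sketch of why it can backfire: some system headers (glibc's <sys/types.h> among them) already declare uint, and two typedefs for the same name must not conflict. Redefining a typedef at all is a constraint violation in C99; C11 permits it only when the types are identical:

#include <sys/types.h>        /* may already provide uint on glibc     */

typedef unsigned int uint;    /* possibly a redefinition already       */
typedef unsigned long uint;   /* definitely an error: conflicting type */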

If you are after platform specificity instead (e.g. you need to specify the number of bits your integer occupies), including stdint.h:

#include <stdint.h>

will expose the following standard categories of integers (a sketch with one example of each follows the list):

  • Integer types having certain exact widths

  • Integer types having at least certain specified widths

  • Fastest integer types having at least certain specified widths

  • Integer types wide enough to hold pointers to objects

  • Integer types having greatest width
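
A minimal sketch with one typedef from each of those categories (note that the exact-width intN_t/uintN_t and the pointer-holding intptr_t types are optional in the standard, while the least/fast/max variants are required):

#include <stdint.h>
#include <inttypes.h>   /* PRI* macros for printing these types */
#include <stdio.h>

int main(void)
{
    int32_t       exact = -5;               /* exactly 32 bits           */
    int_least16_t least = 100;              /* at least 16 bits          */
    int_fast32_t  fast  = 7;                /* fastest with >= 32 bits   */
    intptr_t      pval  = (intptr_t)&exact; /* wide enough for a pointer */
    intmax_t      most  = INTMAX_MAX;       /* greatest width            */

    printf("%" PRId32 " %" PRIdLEAST16 " %" PRIdFAST32 "\n", exact, least, fast);
    printf("%" PRIdPTR " %" PRIdMAX "\n", pval, most);
    return 0;
}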

For instance,

Exact-width integer types

The typedef name intN_t designates a signed integer type with width N, no padding bits, and a two's complement representation. Thus, int8_t denotes a signed integer type with a width of exactly 8 bits.

The typedef name uintN_t designates an unsigned integer type with width N. Thus, uint24_t denotes an unsigned integer type with a width of exactly 24 bits.

Accordingly, <stdint.h> defines types such as:

int8_t
int16_t
int32_t
uint8_t
uint16_t
uint32_t
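
So, as a portable replacement for a hand-rolled uint, a sketch like this works on any platform that provides the (optional, but practically universal) exact-width types, GCC/Linux included:

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint32_t count = 4294967295u;    /* exactly 32 unsigned bits everywhere */
    printf("count = %" PRIu32 "\n", count);
    return 0;
}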