Why is memset() incorrectly initializing int?
memset sets each byte of the destination buffer to the specified value. On your system, an int is four bytes, each of which is 5 after the call to memset. Thus, grid[0] has the value 0x05050505 (hexadecimal), which is 84215045 in decimal.
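For example, here's a minimal sketch of what that looks like (assuming, as in your code, a 100-element int array named grid):

#include <cstdio>
#include <cstring>

int main() {
    int grid[100];
    std::memset(grid, 5, sizeof grid);   // sets every *byte* of the array to 0x05
    std::printf("%d\n", grid[0]);        // prints 84215045
    std::printf("%#x\n", grid[0]);       // prints 0x5050505
}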
Some platforms provide alternative APIs to memset that write wider patterns to the destination buffer; for example, on OS X or iOS, you could use:
int pattern = 5;
memset_pattern4(grid, &pattern, sizeof grid);
to get the behavior that you seem to expect. What platform are you targeting?
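For reference, a self-contained sketch of that approach; as far as I know memset_pattern4 is declared in <string.h> on Apple platforms, but treat the details as an assumption to check against your SDK:

#include <cstdio>
#include <string.h>   // memset_pattern4 (Apple-specific extension)

int main() {
    int grid[100];
    int pattern = 5;
    memset_pattern4(grid, &pattern, sizeof grid);  // copies the 4-byte pattern across the buffer
    std::printf("%d\n", grid[0]);                  // prints 5
}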
In C++, you should just use std::fill_n:
std::fill_n(grid, 100, 5);
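std::fill_n lives in <algorithm>; a complete sketch (again assuming a 100-element grid) would look like this:

#include <algorithm>
#include <cstdio>

int main() {
    int grid[100];
    std::fill_n(grid, 100, 5);       // sets each *element* (not each byte) to 5
    std::printf("%d\n", grid[0]);    // prints 5
}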
memset(grid, 5, 100 * sizeof(int));
You are setting 400 bytes, starting at (char*)grid and ending at (char*)grid + (100 * sizeof(int)), to the value 5 (the casts are necessary here because memset deals in bytes, whereas pointer arithmetic deals in objects).
84215045 in hex is 0x05050505; since int (on your platform/compiler/etc.) is represented by four bytes, when you print it, you get "four fives."
memset is about setting bytes, not values. One of the many ways to set array values in C++ is std::fill_n:
std::fill_n(grid, 100, 5);
Don't use memset.
You set each byte of the memory to the value 5. Each int is 4 bytes long, so its bytes end up as [5][5][5][5], which is read back as 5*256*256*256 + 5*256*256 + 5*256 + 5 = 84215045 (since every byte is the same, endianness doesn't change the result). Instead, use a for loop, which also doesn't require sizeof(). In general, reaching for sizeof() means you're doing something the hard way.
for (int i = 0; i < 100; ++i)
    grid[i] = 5;
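If C++11 is available, a range-based for loop avoids spelling out the count altogether; a sketch, assuming grid is a plain array still in scope (it won't work on a pointer the array has decayed to):

for (int &cell : grid)
    cell = 5;   // assigns 5 to each element, no byte-level tricks involved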