How to map the indexes of a matrix to a 1-dimensional array (C++)?

The way most languages store multi-dimensional arrays is by doing a conversion like the following:

If the matrix has size n (rows) by m (columns), and we're using "row-major ordering" (where we count along the rows first), then:

matrix[ i ][ j ] = array[ i*m + j ].

Here i goes from 0 to (n-1) and j from 0 to (m-1).

So it's just like a number system of base 'm'. Note that the size of the first dimension (here the number of rows, n) doesn't matter; it never appears in the formula.
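
As a minimal C++ sketch (the helper name flatIndex is made up here, and the sizes are just for illustration), the mapping could look like:

#include <cstddef>
#include <vector>

// Row-major: element (i, j) of an n-by-m matrix lives at offset i*m + j.
// Only the column count m is needed; the row count n never appears.
inline std::size_t flatIndex(std::size_t i, std::size_t j, std::size_t m) {
    return i * m + j;
}

int main() {
    const std::size_t n = 3, m = 5;   // illustrative matrix size
    std::vector<int> a(n * m);        // flat storage for the whole matrix
    a[flatIndex(1, 3, m)] = 42;       // same element as matrix[1][3]
}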


For a conceptual understanding, think of a 3x5 matrix with 'i' as the row number and 'j' as the column number, starting the numbering from (i, j) = (0, 0), which maps to array index 0. For 'row-major' ordering (like this), the layout looks like:

           |-------- 5 ---------|
  Row      ______________________   _ _
   0      |0    1    2    3    4 |   |
   1      |5    6    7    8    9 |   3
   2      |10   11   12   13   14|  _|_
          |______________________|
Column     0    1    2    3    4 

As you move along a row (i.e. increase the column number), you just count up, so the array indices are 0, 1, 2, .... When you get to the second row, you already have 5 entries, so you continue with indices 1*5 + 0, 1, 2, .... On the third row, you already have 2*5 entries, thus the indices are 2*5 + 0, 1, 2, ....
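
To tie this back to the diagram, here is a quick throwaway C++ loop (nothing about the names is canonical) that prints the flat index for every (row, column) pair of the 3x5 example:

#include <cstdio>

int main() {
    const int rows = 3, cols = 5;
    for (int i = 0; i < rows; ++i) {         // row number
        for (int j = 0; j < cols; ++j)       // column number
            std::printf("%4d", i * cols + j);
        std::printf("\n");
    }
}

The output reproduces the layout in the diagram above, row by row.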

For higher dimensions, this idea generalizes, e.g. for a 3D matrix of size L by N by M:

matrix[ i ][ j ][ k ] = array[ i*(N*M) + j*M + k ]

and so on.
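
For the 3D case, a minimal sketch (the helper name flatIndex3 is invented for illustration):

#include <cstddef>

// 3D row-major mapping for an L-by-N-by-M array: the size of the first
// dimension (L) never appears, just as n doesn't in the 2D case.
inline std::size_t flatIndex3(std::size_t i, std::size_t j, std::size_t k,
                              std::size_t N, std::size_t M) {
    return i * (N * M) + j * M + k;
}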


For a really good explanation, see: http://www.cplusplus.com/doc/tutorial/arrays/; or for some more technical aspects: http://en.wikipedia.org/wiki/Row-major_order


For row-major ordering, I believe the statement matrix[ i ][ j ] = array[ i*n + j ] is wrong.

The offset should be offset = (row * NUMCOLS) + column.

Your statement evaluates to row * NUMROWS + column, which is wrong.

The links you provided give a correct explanation.
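
A quick concrete check (the 3x5 size is picked arbitrarily) shows why using the row count leads to collisions:

#include <cassert>

int main() {
    const int NUMROWS = 3, NUMCOLS = 5;  // arbitrary 3x5 example
    // Correct: element (1, 0) maps to the start of the second row.
    assert(1 * NUMCOLS + 0 == 5);
    // Wrong: the row-count formula lands on index 3, which is
    // already occupied by element (0, 3).
    assert(1 * NUMROWS + 0 == 3);
}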


Something like this?

//columns = number of columns, x = column index, y = row index
var calculateIndex = function(columns, x, y){
    return y * columns + x;
};

The example below converts an index back to x and y coordinates.

//index = array index, columns = number of columns, rows = number of rows
var calculateCoordinates = function(index, columns, rows){
    //for each row
    for(var i = 0; i < rows; i++){
        //check if the index falls within this row
        if(index >= columns * i && index < columns * (i + 1)){
            //return [x, y]
            return [index - columns * i, i];
        }
    }
    return null;
};
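
As a side note, the loop isn't strictly necessary: the inverse mapping falls straight out of integer division and modulo. A minimal C++ sketch (the struct and function names are made up for illustration):

#include <cstddef>

struct Coordinates {
    std::size_t x;  // column
    std::size_t y;  // row
};

// Inverse of index = y * columns + x; no loop required.
inline Coordinates toCoordinates(std::size_t index, std::size_t columns) {
    return { index % columns, index / columns };
}

For example, toCoordinates(7, 5) yields (x, y) = (2, 1), matching 1 * 5 + 2 = 7.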