Is Google's Android OpenGL tutorial teaching incorrect linear algebra?
Solution 1:
As the guy who wrote that OpenGL tutorial, I can confirm that the example code is incorrect. Specifically, the order of the factors in the shader code should be reversed:
" gl_Position = uMVPMatrix * vPosition;"
As to the application of the rotation matrix, the order of the factors should also be reversed so that the rotation is the last factor. The rule of thumb is that matrices are applied in right-to-left order, and the rotation is applied first (it's the "M" part of "MVP"), so it needs to be the rightmost operand. Furthermore, you should use a scratch matrix for this calculation, as recommended by Ian Ni-Lewis (see his more complete answer below):
float[] scratch = new float[16];
// Combine the rotation matrix with the projection and camera view
Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);
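To make the right-to-left rule concrete, here is the whole chain in one place (a minimal sketch; mProjMatrix, mVMatrix and mRotationMatrix are the renderer fields used in the code further down):
// The factor closest to the vertex is applied first:
// gl_Position = (P * V * M) * vPosition  ->  model (rotation), then view, then projection.
Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);      // P * V
Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);   // (P * V) * M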
Thanks for calling attention to this problem. I'll get the training class and sample code fixed as soon as I can.
Edit: This issue has now been corrected in the downloadable sample code and the OpenGL ES training class, including comments on the correct order of the factors. Thanks for the feedback, folks!
Solution 2:
The tutorial is incorrect, but many of the mistakes either cancel each other out or are not obvious in this very limited context (fixed camera centered at (0,0), rotation around Z only). The rotation is backwards, but otherwise it kind of looks right. (To see why it's wrong, try a less trivial camera: set the eye and lookAt to y=1, for instance.)
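If you want to try that, a quick sketch of the less trivial camera (same setLookAtM call as the tutorial's renderer, with the eye and look-at point moved to y = 1):
Matrix.setLookAtM(mVMatrix, 0,
        0f, 1f, -3f,   // eye moved up to y = 1
        0f, 1f, 0f,    // look-at point also at y = 1
        0f, 1f, 0f);   // up vector
// With the original (reversed) shader math, the scene now visibly breaks.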
One of the things that made this very hard to debug is that the Matrix methods don't do any alias detection on their inputs. The tutorial code makes it seem like you can call Matrix.multiplyMM with the same matrix used as both an input and the result. This isn't true. But because the implementation multiplies a column at a time, it's far less obvious that something is wrong if the right hand side is reused (as in the current code, where mMVPMatrix is the rhs and the result) than if the left hand side is reused. Each column on the left is read before the corresponding column in the result is written, so the output will be correct even if the LHS is overwritten. But if the right-hand side is the same as the result, then its first column will be overwritten before it's finished being read.
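In other words, never pass the same array as both an operand and the result. Roughly, using the matrices from the tutorial:
// Original tutorial style: mMVPMatrix is used as both an input and the result.
// multiplyMM does no alias detection, so the output is not guaranteed to be correct.
Matrix.multiplyMM(mMVPMatrix, 0, mRotationMatrix, 0, mMVPMatrix, 0);

// Safe version: write into a separate scratch array
// (also note the corrected factor order, with the rotation on the right).
float[] scratch = new float[16];
Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);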
So the tutorial code is at a sort of local maximum: it seems like it works, and if you change any one thing, it breaks spectacularly. Which leads one to believe that wrong as it looks, it might just be correct. ;-)
Anyway, here's some replacement code that gets what I think is the intended result.
Java code:
@Override
public void onDrawFrame(GL10 unused) {
    float[] scratch = new float[16];

    // Draw background color
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    // Set the camera position (View matrix)
    Matrix.setLookAtM(mVMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);

    // Calculate the projection and view transformation
    Matrix.multiplyMM(mMVPMatrix, 0, mProjMatrix, 0, mVMatrix, 0);

    // Draw square
    mSquare.draw(mMVPMatrix);

    // Create a rotation for the triangle
    Matrix.setRotateM(mRotationMatrix, 0, mAngle, 0, 0, 1.0f);

    // Combine the rotation matrix with the projection and camera view
    Matrix.multiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);

    // Draw triangle
    mTriangle.draw(scratch);
}
Shader code:
gl_Position = uMVPMatrix * vPosition;
NB: these fixes make the projection correct, but they also reverse the direction of rotation. That's because the original code applied the transformations in the wrong order. Think of it this way: instead of rotating the object clockwise, it was rotating the camera counterclockwise. When you fix the order of operations so that the rotation is applied to the object instead of the camera, then the object starts going counterclockwise. It's not the matrix that's wrong; it's the angle that was used to create the matrix.
So to get the 'correct' result, you also need to flip the sign of mAngle.
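In code, that is just a sign change when you build the rotation matrix (one way to do it, keeping the same mAngle field):
// Negate the angle so the triangle visually spins the same way as before.
Matrix.setRotateM(mRotationMatrix, 0, -mAngle, 0, 0, 1.0f);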
Solution 3:
I solved this problem as follows:
@Override
public void onDrawFrame(GL10 unused) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -1f, 0f, 0f, 0f, 0f, 1.0f, 0.0f);
    Matrix.setRotateM(mModelMatrix, 0, mAngle, 0, 0, 1.0f);
    Matrix.translateM(mModelMatrix, 0, 0.4f, 0.0f, 0);

    mSquare.draw(mProjMatrix, mViewMatrix, mModelMatrix);
}
@Override
public void onSurfaceChanged(GL10 unused, int width, int height) {
    ...
    Matrix.frustumM(mProjMatrix, 0, -ratio, ratio, -1, 1, 1, 99);
}
class Square {
    private final String vertexShaderCode =
        "uniform mat4 uPMatrix;    \n" +
        "uniform mat4 uVMatrix;    \n" +
        "uniform mat4 uMMatrix;    \n" +
        "attribute vec4 vPosition; \n" +
        "void main() {             \n" +
        "    gl_Position = uPMatrix * uVMatrix * uMMatrix * vPosition; \n" +
        "}                         \n";
    ...
    public void draw(float[] mpMatrix, float[] mvMatrix, float[] mmMatrix) {
        ...
        mPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uPMatrix");
        mVMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uVMatrix");
        mMMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMMatrix");

        GLES20.glUniformMatrix4fv(mPMatrixHandle, 1, false, mpMatrix, 0);
        GLES20.glUniformMatrix4fv(mVMatrixHandle, 1, false, mvMatrix, 0);
        GLES20.glUniformMatrix4fv(mMMatrixHandle, 1, false, mmMatrix, 0);
        ...
    }
}
Solution 4:
I’m working on the same issue and this is what I found:
I believe that Joe’s sample is CORRECT, including the order of the factors in the shader code:
gl_Position = vPosition * uMVPMatrix;
To verify this, just try to rotate the triangle with the factors in reversed order; it will stretch the triangle toward the vanishing point at 90 degrees.
The real problem seems to be in setLookAtM function.
In Joe’s sample, the parameters are:
Matrix.setLookAtM(mVMatrix, 0, 0f, 0f, -3f, 0f, 0f, 0f, 0f, 1f, 0f);
which is perfectly logical as well.
However, the resulting view matrix looks weird to me:
-1 0 0 0
0 1 0 0
0 0 -1 0
0 0 -3 1
As we can see, this matrix will invert the X coordinate, since the first element is -1, which will lead to a left/right flip on the screen. It will also reverse the Z order, but let's focus on the X coordinate here.
I think the setLookAtM function is also working correctly. However, since the Matrix class is NOT part of OpenGL, it may use some other coordinate system, for example regular screen coordinates with the Y axis pointing down. This is just a guess; I didn't really verify it.
Possible solutions:
We can build the desired view matrix manually; the code is:
Matrix.setIdentityM(mVMatrix,0);
mVMatrix[14] = -3f;
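If you prefer not to poke array indices directly, an equivalent sketch using translateM (it produces the same matrix, a translation of -3 along Z):
Matrix.setIdentityM(mVMatrix, 0);
Matrix.translateM(mVMatrix, 0, 0f, 0f, -3f);  // ends up with mVMatrix[14] == -3f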
OR
we can try to trick the setLookAtM function by giving it reversed camera coordinates: 0, 0, +3 (instead of -3).
Matrix.setLookAtM(mVMatrix, 0, 0f, 0f, 3f, 0f, 0f, 0f, 0f, 1f, 0f);
The resulting view matrix will be:
1 0 0 0
0 1 0 0
0 0 1 0
0 0 -3 1
That’s exactly what we need.
Now the camera behaves as expected and the sample works correctly.
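If you want to check the resulting view matrix on your own device, a quick sketch (assuming only android.util.Log and java.util.Arrays are imported) is:
// Log the 16 floats of the view matrix (column-major order).
Matrix.setLookAtM(mVMatrix, 0, 0f, 0f, 3f, 0f, 0f, 0f, 0f, 1f, 0f);
Log.d("ViewMatrix", Arrays.toString(mVMatrix));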