Pixel-aligned rendering in OpenGL, and direct state access
Two main topics here: the first is how to render pixel-aligned objects in OpenGL, the second is a great OpenGL extension.
Pixel-perfect rendering
If you want to render a 2D overlay, you usually want the vertices snapped exactly to pixels, so that a line from 3,4 to 3,6 is exactly one pixel wide and two pixels tall. With OpenGL, this requires you to place your vertices exactly at half-pixel offsets. Of course, you can also translate the whole scene by 0.5 towards the bottom right instead.
The other thing you have to set up is a projection matrix which maps the vertices directly onto pixels. In the easiest case, you set the projection to an orthographic projection, and the modelview matrix to identity. For the orthographic projection there is a neat trick to move the origin to the top-left corner, which is done by setting the matrices like this:
glMatrixLoadIdentityEXT(GL_PROJECTION);
// Passing screenHeight as bottom and 0 as top flips the Y axis, so the origin
// ends up in the top-left corner; -1 and 1 are the near and far planes.
glMatrixOrthoEXT(GL_PROJECTION, 0, screenWidth, screenHeight, 0, -1, 1);
glMatrixLoadIdentityEXT(GL_MODELVIEW);
In case the calls look a bit unfamiliar to you, welcome to the second topic! We're actually finished with the first one, because after these few lines, you are ready to work in 2D. Just remember to position your vertices at half-pixel offsets, and everything will match up properly. You'll probably also want to disable depth testing at this stage, and rely on drawing things in the proper order instead.
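To make the half-pixel rule concrete, here is a minimal sketch (plain immediate mode for brevity, assuming the projection and modelview setup from above so window coordinates map 1:1 onto pixels) that draws the short vertical line from the example:

glDisable(GL_DEPTH_TEST);
glBegin(GL_LINES);
// x = 3.5 keeps the line solidly in pixel column 3 instead of sitting on the
// boundary between columns 3 and 4, where rounding could push it either way.
glVertex2f(3.5f, 4.5f);
glVertex2f(3.5f, 6.5f);
glEnd();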
Direct state access
This is the most practical OpenGL extension of the last few years: EXT_direct_state_access. It allows you to set a variable without having to query the state, change it, and set it back to what it was before. A real time saver, and if you look at the extension's author list, you know why it is as slim as it is. Right now, it is supported by the latest nVidia drivers, and works just as expected.
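As a small illustration of what the extension buys you (a sketch, with myTexture standing in for some texture object you created earlier), compare setting a texture parameter the classic way with the direct state access version:

// Classic approach: save the binding, bind, change, and restore it afterwards.
GLint previous;
glGetIntegerv(GL_TEXTURE_BINDING_2D, &previous);
glBindTexture(GL_TEXTURE_2D, myTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glBindTexture(GL_TEXTURE_2D, (GLuint)previous);

// With EXT_direct_state_access: one call, no binding, no state to save and restore.
glTextureParameteriEXT(myTexture, GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);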
Small rant, can’t resist. Call me a fanboy, but in my opinion, we should all be happy for the support OpenGL is getting from nvidia. If it were AMD only, I believe OpenGL would be even more dead than it is. I guess it’s the SGI heritage and the strong workstation market that keep nvidia dedicated to OpenGL, and I hope they won’t give up. Even though I’m not using OpenGL right now, I consider it important that there is always a viable alternative. Who knows, I might end up doing more OpenGL again in the future, and then I’ll be glad if I don’t have to endure too much pain and suffering.