FreeType Fonts in OpenGL


The Tutorial
In the summer of '03 I wrote a tutorial explaining how to use FreeType fonts in OpenGL. It still survives on the web as NeHe Lesson 43. The tutorial code on NeHe's site has known bugs; I've included fixes below.

If you found the tutorial helpful, you might also be interested in this hacked version of the tutorial code that uses glDrawPixels instead of textured polygons to render the character glyphs -- it's simpler, and may be more robust (because it's immune to any problems related to texture memory or matrix stacks).
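I haven't reproduced that code here, but the core of it is just a few calls per glyph. Here's a sketch of the idea (not the linked code itself); pen_x, pen_y, width, height and expanded_data are stand-ins for whatever your glyph loop already computes, with expanded_data being the LUMINANCE_ALPHA buffer from the tutorial:

// Sketch only: draw one glyph bitmap with glDrawPixels instead of a textured quad.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);               // rows are tightly packed bytes
glRasterPos2f(pen_x, pen_y);                         // where this glyph goes, in window coords
glPixelZoom(1.0f, -1.0f);                            // flip vertically: FreeType stores rows top-to-bottom
glDrawPixels(width, height, GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, expanded_data);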

Here's some code that uses wchar_t's instead of chars, and which may be helpful if you need to display text in a language other than English (it'll give you things like "öáæé").
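The main change is that the string type and the display list indices become wide. A print routine in the spirit of the tutorial ends up looking roughly like this (font_data and list_base follow the tutorial's names; the rest is a sketch):

// Sketch only: call one display list per wide character. Needs <cwchar> for wcslen.
// On Windows wchar_t is 16 bits, so GL_UNSIGNED_SHORT matches; on platforms with
// a 32-bit wchar_t you'd pass GL_UNSIGNED_INT to glCallLists instead.
void print(const font_data &ft_font, float x, float y, const wchar_t *text)
{
    glListBase(ft_font.list_base);
    glPushMatrix();
    glTranslatef(x, y, 0);
    glCallLists((GLsizei)wcslen(text), GL_UNSIGNED_SHORT, text);
    glPopMatrix();
}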

If you want an even wider range of Unicode glyphs, you can try working with a more complete TrueType font, such as Bitstream Cyberbit. Using a non-Roman glyph set (like Kanji) in a display list-based OpenGL font renderer is a little tricky to do efficiently -- you'll want to limit the number of glyphs you're loading to the particular languages you're working with. That said, here's a wildly inefficient version of the code, which loads a good chunk of all the Unicode characters in the Cyberbit font. You should be able to use it as a starting point for more finely tuned language-specific code.
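If you do go the tuned route, the usual trick is to only call make_dlist() for the Unicode blocks you actually need, rather than for every code point in the font. Something like this (the block table is just an example, and the wide make_dlist signature is an assumption carried over from the wchar_t code above):

// Example only: build display lists for a few Unicode blocks instead of the
// whole font. face, list_base and textures are the same objects the tutorial's
// font_data init code already sets up.
struct block { wchar_t first, last; };
const block blocks[] = {
    { 0x0020, 0x007E },   // basic Latin
    { 0x3040, 0x309F },   // Hiragana
    { 0x30A0, 0x30FF },   // Katakana
};
for(unsigned b = 0; b < sizeof(blocks)/sizeof(blocks[0]); b++)
    for(wchar_t ch = blocks[b].first; ch <= blocks[b].last; ch++)
        make_dlist(face, ch, list_base, textures);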


Memory leak
As noticed by Bruce Wallace, there's a memory leak in make_dlist. It can be fixed by adding
FT_Done_Glyph(glyph);
to the end of make_dlist().

(Bruce also suggests switching the ch parameter to an unsigned char, as this allows extended character sets to be used with the original code.)
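For reference, with both of those changes applied the shape of make_dlist() ends up like this (everything between the signature and the last two lines is unchanged from the tutorial; this is a sketch, not the full function):

void make_dlist(FT_Face face, unsigned char ch,   // was: char ch
                GLuint list_base, GLuint *tex_base)
{
    FT_Glyph glyph;
    // ... load the glyph with FT_Get_Glyph(), build the texture and the
    // display list exactly as before ...
    glEndList();
    FT_Done_Glyph(glyph);   // new: release the glyph that FT_Get_Glyph() handed us
}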


Blending bug
The basic problem is that the code, as written in the old tutorial, makes the edge pixels tend towards black wherever they are supposed to be alpha-blended. Among other things, this means that light text will appear to have a dark outline around it. An easy fix is to make the color channel pure white and keep the glyph coverage in the alpha channel only. Thanks to Thorsten Jordan for this bug report.

Here's a patched version of the msvc6 code.

And here are the particular lines that are broken (for those of you working with a ported version of the code):

Take out this loop,
for(int j=0; j < height; j++) for(int i=0; i < width; i++)
    expanded_data[2*(i+j*width)] =
    expanded_data[2*(i+j*width)+1] =
        (i >= bitmap.width || j >= bitmap.rows) ?
        0 : bitmap.buffer[i + bitmap.width*j];
And replace it with:
for(int j=0; j < height; j++) for(int i=0; i < width; i++) {
    expanded_data[2*(i+j*width)] = 255;
    expanded_data[2*(i+j*width)+1] =
        (i >= bitmap.width || j >= bitmap.rows) ?
        0 : bitmap.buffer[i + bitmap.width*j];
}
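With the coverage moved entirely into the alpha channel, the glyph color now comes from glColor, and the fade-out at the edges is handled by the usual blend setup (which the tutorial's print routine already enables before drawing):

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);   // text color * coverage over the background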


Narrow font bug
Alex (aka Majin) has been experimenting with small font bitmaps and noticed this bug:

glTexImage2D needs power-of-two dimensions of at least 2 (2, 4, 8, ...), but the tutorial code won't pad a font with width 1, because next_p2 considers 1 to be a power of 2.

To fix this, go to next_p2() and change:
	int rval=1;
to
	int rval=2;
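For reference, here's the whole helper with the change applied (next_p2 returns the first power of two that is >= a, and now never returns anything smaller than 2):

inline int next_p2(int a)
{
    int rval = 2;            // was 1; starting at 2 means a width-1 bitmap gets padded to 2
    while(rval < a) rval <<= 1;
    return rval;
}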


Another issue noticed by Alex
If a small font's bitmaps are not aligned with the screen pixels, bilinear interpolation will cause artifacts. An easy fix for this is to change the texture filter from GL_LINEAR to GL_NEAREST, though this may lead to additional artifacts when the text is rotated. (If rotating text is important to you, you may want to try telling OpenGL to supersample from larger font bitmaps.)
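Concretely, that means changing the two texture-filter lines where the glyph texture is created (in the tutorial these live in make_dlist, right after glBindTexture):

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);   // was GL_LINEAR
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);   // was GL_LINEAR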