Getting Started with OpenGL ES 3+ Programming by Hans de Ruiter

Tutorial 3: Texture Mapping

Most OpenGL tutorial series would now have a tutorial that adds per-vertex colours. However, that’s rather boring, and per-vertex colouring isn’t all that useful. So let’s jump straight to texture mapping instead. This will mean more work, but the results will be worth it.

Texture mapping is basically wrapping a 3D model with an image. This is an easy way to add more detail (a.k.a., texture) to an object. We’ll be using it to draw a wooden box.

HINT: For more, look at the “Texture Mapping” section in the “Modern Graphics Programming Primer” (https://keasigmadelta.com/graphics-primer)

1. Getting Started

Start by creating a new project called GLTutorial3, using the same method as in the previous tutorials. Next, copy GLTutorial2’s source files (Main.cpp, Shader.cpp, and Shader.h) and add them to the project (right-click on GLTutorial3, and select Add => Existing Item...). We’ll be building on those.

Neither SDL2 nor OpenGL comes with functions to load images, so we’re going to use a new library called SDL_image.

1.1. Installing SDL_image on Windows

Select Tools => NuGet Package Manager => Package Manager Console from the menus (Figure 8).

Figure 8: Opening the NuGet Package Manager Console.

The console will open at the bottom of the window. Now type the following, and press Enter (Figure 9):

Install-Package sdl2_image.v140

SDL_image is now installed into the project and ready to go.

Figure 9: Installing SDL_image via NuGet.

1.2. Installing SDL_image on Other Platforms

If you’re using another OS, then you’ll need to look up how to install SDL_image.
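For example, on Debian/Ubuntu-based Linux distributions the development package is typically named libsdl2-image-dev, and on macOS it’s available through Homebrew (exact package names may vary on your system):

sudo apt-get install libsdl2-image-dev
brew install sdl2_image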

2. Texture Mapping Shaders

New shaders are needed for texture mapping. For starters, each vertex needs a texture coordinate, which indicates what part of the texture appears at that point. Next, the fragment shader has to actually read the texture. Here’s how it’s done...

2.1. The Vertex Shader

Create a new file called Texture.vert, and enter the following code:

#version 300 es

layout(location = 0) in vec2 vertPos;
layout(location = 1) in vec2 vertTexCoord;

out vec2 texCoord;

void main() {
    texCoord = vertTexCoord;
    gl_Position = vec4(vertPos, 0.0, 1.0);
}

The code is still very basic; it just passes the vertex coordinates straight through. There are a few things to pay attention to, though.

First, there are now two inputs (a.k.a., vertex attributes): the vertex position (vertPos), and the texture coordinates (vertTexCoord). This means we now have two vertex attribute arrays to set up. We’ll do that in the Vertex Texture Coordinates section below.

Second, the inputs have been given explicit locations using layout(location = n). This very handy feature was introduced in OpenGL ES 3, and allows us to specify which vertex attribute each input is mapped to. Previously, we had to query or set the mapping between vertex attributes and shader variables manually in the main code. That was rather tedious, so being able to set the locations in the shader code is awesome.
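For reference, here’s a rough sketch of how those two attribute arrays could be set up in Main.cpp (the interleaved vertex layout below is an assumption for illustration; the real setup is covered in the Vertex Texture Coordinates section):

    // Example layout: each vertex is 2 position floats followed by 2 texture-coordinate
    // floats, stored interleaved in a bound vertex buffer object (VBO)
    GLsizei stride = 4 * sizeof(float);

    // Attribute 0 (vertPos): 2 floats at the start of each vertex
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, stride, (const GLvoid*)0);
    glEnableVertexAttribArray(0);

    // Attribute 1 (vertTexCoord): 2 floats immediately after the position
    glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, stride, (const GLvoid*)(2 * sizeof(float)));
    glEnableVertexAttribArray(1);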

2.2. The Fragment Shader

Here’s the code to enter into Texture.frag:

#version 300 es

#ifdef GL_ES
precision highp float;
#endif

in vec2 texCoord;

out vec4 fragColour;

uniform sampler2D texSampler;

void main() {
    fragColour = texture(texSampler, texCoord);
}

This is the simplest shader capable of reading textures. It has the three essentials: a texture sampler (texSampler), the texture coordinate to read (texCoord) and a texture() function call. The texture’s colour at texCoord is written straight to the output: fragColour.

2.3. Binding the Sampler2D to a Texture Unit

I had really hoped to skip this step via an OpenGL ES 3.1 feature that allows you to set the texture unit in the GLSL shader code. Alas, the Angle library doesn’t support this yet, so we’re stuck doing it the version 3.0 way. I’ll update this tutorial once this issue has been fixed.
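For the record, the OpenGL ES 3.1 approach sets the texture unit directly in the shader with a binding layout qualifier. It would look something like this (shown for reference only; we won’t use it here):

layout(binding = 0) uniform sampler2D texSampler;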

In Main.cpp, add the following code below the shaderProgLoad() section:

    // Bind texSampler to unit 0
    GLint texSamplerUniformLoc = glGetUniformLocation(shaderProg, "texSampler");
    if (texSamplerUniformLoc < 0) {
        SDL_Log("ERROR: Couldn't get texSampler's location.");
        return EXIT_FAILURE;
    }
    glUniform1i(texSamplerUniformLoc, 0);

This code looks up the location of the fragment shader’s texSampler uniform, and sets it to 0, i.e., texture unit 0.
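Note that this only tells the sampler which texture unit to read from; the texture itself still has to be bound to unit 0 before rendering. As a rough sketch (assuming the texture’s name ends up in a variable called texture, as in the loader below):

    glActiveTexture(GL_TEXTURE0);          // Select texture unit 0
    glBindTexture(GL_TEXTURE_2D, texture); // Bind our texture to that unit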

While you’re working on this code, you may as well update the shaderProgLoad() call to load our new shaders:

    GLuint shaderProg = shaderProgLoad("Texture.vert", "Texture.frag");

3. Texture Loading

Okay, the shaders are ready. Now we have to load the actual texture into video memory. Actually, first we need a texture. There are plenty of textures available online (both free and not). You can use whatever you wish. I’ll be using a wooden crate texture that’s available here: http://opengameart.org/content/3-crate-textures-w-bump-normal

Once you’ve downloaded and unzipped the file, copy crate1/crate1_diffuse.png to GLTutorial3/GLTutorial3/. Don’t worry about the other files; we won’t be using them.

Now, on to loading the texture. Create two files, Texture.cpp and its header called Texture.h. We’re going to write a function to load textures (texLoad()), and one to free the texture at the end (texDestroy()). Here’s the resulting header file (Texture.h):

// Texture.h

#ifndef __TEXTURE_H__
#define __TEXTURE_H__

#include <GLES3/gl3.h>

/** Loads a 2D texture from file.
 *
 * @param filename name of the image file to load
 *
 * @return GLuint the texture's name, or 0 if failed
 */
GLuint texLoad(const char *filename);

/** Deallocates a texture.
 *
 * @param texName the name of the texture to deallocate
 */
void texDestroy(GLuint texName);

#endif

Now switch to Texture.cpp. Start by including the header files we’ll need:

// Texture.cpp
//
// See header file for details

#include "Texture.h"

#include <SDL.h>
#include <SDL_image.h>
#include <SDL_opengles2.h>

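While we’re here, texDestroy() is the easy one. A minimal sketch (assuming the texture needs no cleanup beyond freeing the GL object) could be:

void texDestroy(GLuint texName) {
    glDeleteTextures(1, &texName);  // Frees the texture from video memory
}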
3.1. Swizzling

The texture loader will need one support function, called sdlToGLSwizzle(). This converts SDL colour channel masks to OpenGL “swizzles.” Swizzling maps an image’s colour channels onto the texture’s channels. For example, an image may have its colours stored in ARGB order, which need to be mapped to the texture unit’s RGBA channels. Here’s how it’s done:

/** Sets the swizzling for a texture colour channel from an SDL colour mask.
 *
 * @param channel the texture channel to set (e.g., GL_TEXTURE_SWIZZLE_R)
 * @param mask the SDL colour channel mask (e.g., texSurf->format->Rmask)
 *
 * @return true on success, or false if the mask wasn't recognized
 */
bool sdlToGLSwizzle(GLenum channel, Uint32 mask) {
    GLint swizzle;
    switch (mask) {
    case 0x000000FF:
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
        swizzle = GL_ALPHA;
#else
        swizzle = GL_RED;
#endif
        break;
    case 0x0000FF00:
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
        swizzle = GL_BLUE;
#else
        swizzle = GL_GREEN;
#endif
        break;
    case 0x00FF0000:
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
        swizzle = GL_GREEN;
#else
        swizzle = GL_BLUE;
#endif
        break;
    case 0xFF000000:
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
        swizzle = GL_RED;
#else
        swizzle = GL_ALPHA;
#endif
        break;
    default:
        SDL_Log("Unrecognized colour channel mask 0x%08X", mask);
        return false;
    }

    glTexParameteri(GL_TEXTURE_2D, channel, swizzle);
    return true;
}

The code is designed to support both big and little-endian processors. If you don’t know what that means, don’t worry about it for now (or look it up online). What’s important is that glTexParameteri() is given the correct channel to read from.

Confusingly, OpenGL uses GL_RED, GL_GREEN, GL_BLUE, & GL_ALPHA instead of channel numbers. So you can end up with the “red” channel being read from GL_ALPHA, etc. I’ve been caught out by this, so my advice is to make sure you keep backups once you have code that works.
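For reference, here’s roughly how sdlToGLSwizzle() gets used once the texture has been created and bound (a sketch; texSurf is the SDL_Surface loaded in the code below, and a 24-bit image has no alpha mask, so only a 32-bit image would also swizzle the alpha channel):

    // Map the surface's colour masks onto the texture's R, G, and B channels
    sdlToGLSwizzle(GL_TEXTURE_SWIZZLE_R, texSurf->format->Rmask);
    sdlToGLSwizzle(GL_TEXTURE_SWIZZLE_G, texSurf->format->Gmask);
    sdlToGLSwizzle(GL_TEXTURE_SWIZZLE_B, texSurf->format->Bmask);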

3.2. Loading the Texture

Let’s move on to the actual texture loading. SDL_image is used to load the image, and then glTexImage2D() is used to turn it into an OpenGL texture:

GLuint texLoad(const char *filename) {
    // Make sure the JPEG and PNG image loaders are present (don't know what file type we'll get).
    int flags = IMG_INIT_JPG | IMG_INIT_PNG;
    if ((IMG_Init(flags) & flags) == 0) {
        // Failed :-(
        SDL_Log("ERROR: Texture loading failed. Couldn't get JPEG and PNG loaders.\n");
        return 0;
    }

    // Load the image
    SDL_Surface *texSurf = IMG_Load(filename);
    if (!texSurf) {
        SDL_Log("Loading image %s failed with error: %s", filename, IMG_GetError());
        return 0;
    }

    // Determine the format
    // NOTE: Only supporting 24 and 32-bit images
    GLenum format;
    GLenum type = GL_UNSIGNED_BYTE;
    switch (texSurf->format->BytesPerPixel) {
    case 3:
        format = GL_RGB;
        break;
    case 4:
        format = GL_RGBA;
        break;
    default:
        SDL_Log("Can't load image %s; it isn't a 24/32-bit image\n", filename);
        SDL_FreeSurface(texSurf);
        texSurf = NULL;
        return 0;
    }

    // Create the texture
    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexImage2D(GL_TEXTURE_2D, 0, format, texSurf->w,
        texSurf->h, 0, format, type, texSurf->pixels);
    GLenum err =