Lab 3: Introduction to Textures

Assigned: 03/27

Due: 04/08


Introduction

This lab is designed to introduce you to using textures read from image files in your shader.

Shaders

In this lab, you will implement shaders corresponding to three of the texture mapping applications covered in lecture:

As with previous assignments, two ZIP archives have been provided for you:

You will be using several texture images located in /usr/local/pub/wrc/graphics/textures/images, in BMP, JPG, and TIFF formats: a very boring one (boring.bmp, boring.jpg, boring.tif); a more exciting one (earth.bmp, etc.); a panorama image (landscape.bmp, etc.); and a simple black-and-white star (star.bmp, etc.). You will need to convert the ones you will use for the RenderMan portions of the assignment.

What to Submit

As with previous assignments, please place your solutions in the lab3results folder, as follows:

Create a ZIP archive of your results folder by using the "Compress" entry of the right-click menu for the folder. Submit this ZIP archive on one of the Ubuntu systems with the command

try  wrc-grd  shading-3  lab3results.zip

on a CS system that supports 'try' (e.g., one of the compute servers, or one of the machines in ICLs 1-3).


Tasks

Part 1 - Basic

Part 1 is designed to introduce you to the mechanism by which textures are read in from a file and sampled in a shader. For this shader, you will read in a texture from an image file and apply the values in the shader to a number of parameters in the shading algorithm. Part 1 consists of 3 sub-parts. You can either combine all sub-parts into a single "mega-shader", or create individual smaller shaders for each sub-part.

You may use any of the texture images found in the folder /usr/local/pub/wrc/graphics/textures/images. You are also free to use your own favorite image as the texture. Just be aware that textures for RenderMan should be in TIFF format and, for this lab, textures for GLSL can be in almost any format. If you use your own image, remember to submit a copy of the texture file along with your solution.

You can find many freely-usable textures (and models) online; here are some links:

You will be submitting more images for this task than in previous assignments: one image for each subpart for each renderer, a total of six images. Please also submit a README file that indicates the parameters and textures used to generate each image.

A: RenderMan

For RenderMan you can use the old standby scene from Lab 1 (included in the ZIP distribution as rit3.rib for convenience). For subparts (A) and (B), apply the shaders to one or both of the polygons in the scene; for subpart (C), apply the shader to the sphere (so that the transparency is apparent). (Be aware that your mapping may be "interesting" when the texture is applied to the sphere unless you use a texture that is designed for spherical mapping.) Be sure to convert your texture to prman format by using txmake before running your shader.

One note about RenderMan texturing: RenderMan automatically sets up texture coordinates using a set of default rules based on the type of primitive being shaded. Though this is handy, it can be confusing if you are not aware of the rules. For example, observe the difference between applying your shader to the wall of rit3.rib (which defines absolute values for the vertex coordinates) and to the wall of rit3b.rib (which applies scaling to the wall polygon). You may find it handy to use the "-mode periodic" option to txmake.
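
For reference, converting one of the provided TIFF images might look like the following (the file names here are just placeholders; substitute whatever texture you actually use):

txmake -mode periodic boring.tif boring.tx

The "-mode periodic" option makes the resulting texture wrap, which is useful when the default texture coordinates fall outside the [0,1] range.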

B: GLSL

For GLSL, texture access is typically done in the fragment shader. However, you must ensure that all of the necessary values (e.g., texture coordinates) are attached to the vertices and passed to the fragment shader via a vertex shader. The texture coordinate can be passed via the predefined varying variable gl_TexCoord. This variable is an array; for this lab, you will be using the first element of the array. Texture coordinates from OpenGL are available to the vertex program in the variables gl_MultiTexCoordn, where n is the texture coordinate set of interest (for this lab, n will be 0). Thus, adding the following line to your vertex shader will ensure that the texture coordinate is indeed attached to each vertex and will be properly interpolated and available when the fragment shader is run:

gl_TexCoord[0] = gl_MultiTexCoord0;
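
On the fragment side, a minimal sketch of a shader that simply looks up and displays the texture might look like this (the sampler name tex is an assumption; you must set the corresponding uniform from your application, as described below):

// minimal fragment shader sketch: just display the texture
uniform sampler2D tex;

void main( void )
{
    // sample the texture at the interpolated coordinate
    gl_FragColor = texture2D( tex, gl_TexCoord[0].st );
}

For Part 1, of course, you will use the sampled value to modify parameters of your shading calculation rather than simply writing it out.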

Setting up textures in OpenGL is a bit complex, even when not passing them to a shader. To assist you in passing a texture from your OpenGL application to your shader, please use the file lab3.cpp as a guide. This file attempts to recreate the basic scene used in the RenderMan exercises. Be sure to read the comments included in this file. Also included is lab3-trackball.cpp, which enables a trackball-like interaction for moving the scene. Feel free to use either.

As you may remember, OpenGL does not natively support any image file format for texturing. There are many freely available libraries for reading in image files in a variety of formats. For this lab, we will use the Simple OpenGL Image Library (SOIL), an open-source project used by many OpenGL programmers. To further simplify using textures in OpenGL, a support module called InitTextures is being supplied to you; this module provides a function similar to the ShaderSetup() function, but one which creates an OpenGL texture object. It has this prototype:

GLuint InitTextures( const char *filename );

where filename is the pathname of the file containing the image you wish to load. This function will use SOIL to load and convert the image; if this fails, the function prints an error message and exits. If the texture object creation succeeds, the function returns the texture object ID to your calling routine.
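
A rough sketch of how this might be used in your application follows (the sampler name tex is an assumption, and program is assumed to hold the shader program ID returned by your ShaderSetup() call; adjust the names to match your own code):

// load the texture and attach it to the shader's sampler
GLuint texID = InitTextures( "boring.jpg" );   // exits on failure

glActiveTexture( GL_TEXTURE0 );                // work with texture unit 0
glBindTexture( GL_TEXTURE_2D, texID );

glUseProgram( program );
glUniform1i( glGetUniformLocation(program, "tex"), 0 );   // sampler reads unit 0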

You may have trouble getting Xcode to use this library, so you may need to fall back on running the compiler by hand for this lab:

g++ -g -o lab3 -I/home/fac/wrc/SOIL/include \
   lab3.cpp InitTextures.cpp ShaderSetup.cpp \
   -L/home/fac/wrc/SOIL/lib  -lSOIL \
   -framework GLUT -framework OpenGL -framework CoreFoundation

Note the inclusion of another framework, CoreFoundation; this is used by the SOIL functions. For the trackball version:

g++ -g -o lab3track -I/home/fac/wrc/SOIL/include \
   lab3-trackball.cpp InitTextures.cpp ShaderSetup.cpp TrackBall.cpp \
   -L/home/fac/wrc/SOIL/lib  -lSOIL \
   -framework GLUT -framework OpenGL -framework CoreFoundation

For your convenience, a shellscript named COMPILE has been provided to simplify compilation of this code. When invoked without a command-line argument (e.g., as "./COMPILE"), it compiles and links the basic lab3 binary; when invoked with any command-line argument (e.g., "./COMPILE t"), it compiles and links the lab3track binary.


Part 2 - Intermediate

For this part, you will be using a simple gradient texture to implement toon (cel) shading.

In the folder /usr/local/pub/wrc/graphics/textures/images/gradients you will find several simple gradient textures, each 400x10 pixels:

File Name Prefix        Appearance
blue-0000bb-9999ff      basic blue gradient
green-006600-88ee88     basic green gradient
red-bb0000-ff6666       basic red gradient
blue-stepped            sample blue gradient from notes

All four are available in BMP, JPG, PNG, and TIF variants. The fourth one is just the simple stepped blue gradient image shown in the lecture notes. The first three are all smooth (i.e., not "stepped") gradient images, and were created in about five minutes using the gradient texture generator at www.grsites.com; their file names include the starting and ending RGB colors (in hex notation) used to create the gradient.

As with Part 1, you are free to use any of these, or locate or create an image of your own. These are all very simple gradient files; you may find that more complex images provide more interesting results. If you do use an image of your own, remember to submit it along with your solution.
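
As a starting point, a toon-shading fragment shader can use the diffuse term to index horizontally into the gradient image. A minimal sketch follows (the sampler name gradient and the varyings N and L are assumptions; supply them from your own vertex shader):

// toon-shading sketch: use N.L to pick a column of the gradient texture
uniform sampler2D gradient;
varying vec3 N;    // surface normal (eye space)
varying vec3 L;    // direction toward the light (eye space)

void main( void )
{
    // diffuse term in [0,1] selects a column of the 400x10 gradient
    float d = max( dot( normalize(N), normalize(L) ), 0.0 );
    gl_FragColor = texture2D( gradient, vec2( d, 0.5 ) );
}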


Part 3 - Advanced

For the final part, you will implement environment mapping using a spherical or cylindrical projection. Recall that rather than using the shading point directly, the perfectly reflected direction is used to index the texture map. Equations for these mappings can be found in many CG textbooks, and in the Texture 1 slides in the lecture notes. (You may want to experiment with the mapping applets described in the lecture notes to solidify your understanding of how these mapping techniques work.)
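
As one possible starting point, here is a sketch of a spherical (latitude/longitude) lookup in a GLSL fragment shader; check the exact equations against the lecture notes, and note that the varying names N and V (eye-space normal and position) are assumptions to be supplied by your vertex shader:

// spherical environment-mapping sketch: index the map with the
// reflected direction rather than the shading point itself
uniform sampler2D envMap;
varying vec3 N;    // eye-space normal
varying vec3 V;    // eye-space position of the shading point

void main( void )
{
    const float PI = 3.14159265;
    vec3 r = normalize( reflect( normalize(V), normalize(N) ) );
    float s = 0.5 + atan( r.x, r.z ) / (2.0 * PI);   // longitude
    float t = 0.5 + asin( r.y ) / PI;                // latitude
    gl_FragColor = texture2D( envMap, vec2( s, t ) );
}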

If you choose to implement spherical mapping, select the Earth image (earth.jpg, etc.) found in the textures folder and map it onto the sphere. (This image courtesy of James Hastings-Trew.)

If you choose to implement cylindrical mapping, use the panorama image (landscape.jpg, etc.) found in the textures folder. (This image comes from Marlin Studios). I strongly recommend replacing the sphere in the image with an actual cylinder. In RenderMan, this is done using the Cylinder parametric quadric; see page 96 in the Advanced RenderMan text, or the Quadrics section of the RenderMan Pro Server documentation (online at the RenderMan download page on the CS web server, linked from the course Resources page) for a description of this primitive.

In OpenGL, the gluSphere() call should be replaced with a gluCylinder() call; you can find descriptions of the parameters to this function in Chapter 11 of the OpenGL Programming Guide (a.k.a. the Red book) or through an online description such as the gluCylinder() manual page at www.opengl.org.
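
For reference, a replacement along these lines might look like the following sketch (the radius, height, and tessellation values are placeholders; adjust them to fit your scene):

// hypothetical cylinder in place of the sphere; gluQuadricTexture()
// asks GLU to generate texture coordinates for the quadric
GLUquadric *quad = gluNewQuadric();
gluQuadricTexture( quad, GL_TRUE );
gluCylinder( quad, 1.0, 1.0, 2.0, 32, 32 );   // base r, top r, height, slices, stacks
gluDeleteQuadric( quad );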


RenderMan® is a registered trademark of Pixar.