Lab 3: Introduction to Textures

Assigned: 03/31

Due: 04/13


Introduction

This lab is designed to introduce you to using textures read from image files in your shader.

Shaders

In this lab, you will implement shaders corresponding to the three texture mapping applications covered in lecture:

As with the previous assignments, two ZIP archives have been provided for you:

What to Submit

As with the previous assignment, please place your solutions in the lab3results folder, as follows:

Create a ZIP archive of your results folder by using the "Compress" entry of the right-click menu for the folder; to submit it, run the command

try  wrc-grd  shading-3  lab3results.zip

on a CS system that supports 'try' (e.g., one of the compute servers, or one of the machines in ICLs 1-4).


Part 1 - Basic

Part 1 is designed to introduce you to the mechanism by which textures are read in from a file and sampled in a shader. For this part, you will read a texture in from an image file and use the sampled values to control several parameters of the shading algorithm. Part 1 consists of three sub-parts. You can either combine all sub-parts into a single "mega-shader", or create individual smaller shaders for each sub-part.

Several texture maps can be found in the textures folder of the distribution ZIP file, in both BMP and TIFF formats: a very boring one (boring.bmp, boring.tif); a more exciting one (earth.bmp, earth.tif); and a panorama image (landscape.bmp, landscape.tif) intended to be used for the Intermediate portion of the assignment.

You are also free to make use of your own favorite image for the texture. Just be aware that textures for RenderMan® should be in TIFF format and, for this lab, textures for GLSL should be in BMP format. If you do this, please submit a copy of the texture file along with your solution.

You can find many freely-usable textures (and models) online; here are some links:

Note that you will be submitting more images for this task than in previous assignments: one image for each sub-part for each renderer (a total of six images). Also, please submit a README file indicating the parameters and textures used to generate each image.

A: RenderMan®

For RenderMan you can use the old standby scene from Lab 1 (included in the ZIP distribution for convenience). Apply the shaders above to any of the three objects in the scene. Be aware that you may get an interesting-looking mapping if the texture is applied to the sphere (unless your texture is designed for spherical mapping). Be sure to convert your texture to prman texture format with txmake before running your shader.

One note about RenderMan texturing: RenderMan automatically sets up texture coordinates using a set of default rules based on the type of primitive being shaded. Though this is handy, it can be confusing if you're not aware of the rules. For example, observe the difference between applying your shader to the wall of rit.rib (which defines absolute values for the vertex coordinates) and to that of rit2.rib (which applies scaling to the wall polygon). You may find the -mode periodic option to txmake handy here.
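
For example, converting one of the supplied textures from the textures folder might look like this (the output file name here is just a suggestion; pick whatever name you like):

txmake -mode periodic earth.tif earth.tx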

B: GLSL

For GLSL, texture access is typically done in the fragment shader. However, you must ensure that all of the necessary values (e.g., texture coordinates) are attached to the vertices and passed to the fragment shader via the vertex shader. They can be passed via the predefined varying variable gl_TexCoord. Note that this variable is an array; for this lab, you will be using only the first element. Texture coordinates from OpenGL are available to the vertex program in the variables gl_MultiTexCoordn, where n is the texture unit of interest (for this lab, n will be 0). Thus, adding the following line to your vertex shader will ensure that the texture coordinate is attached to each vertex and will be properly interpolated and available when the fragment shader runs:

gl_TexCoord[0] = gl_MultiTexCoord0;
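
Putting this together, a minimal vertex shader for the basic shader might look like the sketch below. The varying name N is just a placeholder, and you will want to add whatever other values (light direction, view vector, and so on) your own fragment shader needs:

// Minimal vertex shader sketch for Part 1 (names are placeholders).
varying vec3 N;    // eye-space normal, for use in the fragment shader

void main( void )
{
    N = normalize( gl_NormalMatrix * gl_Normal );
    gl_TexCoord[0] = gl_MultiTexCoord0;    // attach the texture coordinate
    gl_Position = ftransform();
}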

Setting up textures in OpenGL is a bit complex, even when not passing them to a shader. To assist you in passing a texture from the OpenGL application to your shader, please use the file lab3.cpp as a guide. This file attempts to recreate the basic scene used in the RenderMan exercises. Be sure to read the comments included in this file. Also included is lab3-trackball.cpp, which enables trackball-like interaction for moving the scene. Feel free to use either.

As you may remember, OpenGL does not natively support any image file format for texturing. There are many freely available libraries for reading in image files in a variety of formats. For this lab, we will use the DevIL library, an open-source community project used by many OpenGL programmers. Note that DevIL relies on other free packages to support some file formats (such as JPEG, TIFF, and PNG). We currently do not have these other libraries installed on the iMacs (which is why the use of BMP files is suggested).
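
Once the application has loaded the image and bound it to texture unit 0 (lab3.cpp shows one way to do this), the shader side reduces to declaring a sampler2D uniform and sampling it at the interpolated coordinate. The sketch below is only an illustration of the mechanism; the uniform name colorMap and the hard-wired light direction are assumptions, and your Part 1 shaders should feed the sampled value into your full shading model:

// Minimal fragment shader sketch for Part 1 (names are placeholders).
uniform sampler2D colorMap;    // bound to texture unit 0 by the application
varying vec3 N;                // eye-space normal from the vertex shader

void main( void )
{
    // sample the texture at the interpolated texture coordinate
    vec4 texel = texture2D( colorMap, gl_TexCoord[0].st );

    // use the texel to drive a shading parameter -- here, the diffuse
    // reflectance in a simple Lambertian term with an assumed light direction
    vec3  L       = normalize( vec3( 0.0, 0.0, 1.0 ) );
    float diffuse = max( dot( normalize( N ), L ), 0.0 );

    gl_FragColor = vec4( texel.rgb * diffuse, 1.0 );
}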

You may have trouble getting Xcode to use these libraries, so you may want to fall back on invoking the compiler by hand for this lab:

g++ -g -o foo lab3.cpp ShaderSetup.cpp -framework GLUT \
   -framework OpenGL -I/usr/local/include \
   -L/usr/local/lib  -lILUT -lILU -lIL

If that doesn't work for you, I will be installing a backup version in my own account:

g++ -g -o foo lab3.cpp ShaderSetup.cpp -framework GLUT \
   -framework OpenGL -I/home/fac/wrc/ILdist/include \
   -L/home/fac/wrc/ILdist/lib  -lILUT -lILU -lIL

If you receive complaints about a function named ilutGLLoadImage() being undefined, add the directive

#define ILUT_USE_OPENGL

immediately before the include of IL/ilu.h in lab3.cpp.


Part 2 - Intermediate

In this part, you will implement environment mapping using a cylindrical projection. Recall that, rather than using the texture coordinates of the shading point directly, the perfectly reflected (mirror) direction at the point is used to index the texture map. Please refer to the Texture 1 slides for the equations of the cylindrical projection. Use the panorama image (landscape.bmp, landscape.tif) found in the textures folder for this part. (This image comes from Marlin Studios.)
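
As a starting point, the fragment-shader sketch below shows one common formulation: the eye-space reflection vector is computed, its angle around the vertical axis becomes the s coordinate, and its elevation becomes the t coordinate. The variable names are placeholders (the vertex shader must supply the eye-space normal and view vector), and you should make the mapping agree with the exact equations given in the Texture 1 slides:

// Cylindrical environment-mapping sketch (names are placeholders).
uniform sampler2D envMap;    // the panorama image
varying vec3 N;              // eye-space normal from the vertex shader
varying vec3 V;              // eye-space direction from the point toward the eye

const float PI = 3.14159265;

void main( void )
{
    vec3 normal = normalize( N );
    vec3 toEye  = normalize( V );

    // reflect the viewing direction about the surface normal
    vec3 R = reflect( -toEye, normal );

    // angle around the vertical (y) axis -> s in [0,1]
    float s = ( atan( R.x, R.z ) + PI ) / ( 2.0 * PI );

    // elevation of the reflected direction -> t in [0,1] (one possible choice)
    float t = 0.5 + 0.5 * R.y;

    gl_FragColor = texture2D( envMap, vec2( s, t ) );
}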


Part 3 - Advanced

For this final part, you will implement bump mapping to create a textured map of the world by applying the earth image (earth.bmp, earth.tif) as both a color map and a bump map. Recall that with bump mapping, your underlying illumination model must somehow depend on the normal; a modified Phong model will do the trick. Initially, try mapping the texture onto the wall of the scene. If you feel daring, try applying it to the sphere (especially challenging in GLSL, where you must compute the spherical texture coordinates in the shader).

For RenderMan, recall that there is a special shader type just for this purpose: the displacement shader. Thus, to create this effect you will need both a displacement shader and a surface shader.

With GLSL, everything must be done in the fragment shader. Remember to pass the correct vectors (in the correct space, namely tangent space) from the vertex shader to the fragment shader via varying variables. To convert your vectors into tangent space, you will need the tangent vector at the point being shaded. A GLSL function that computes this vector is used in the supplied file bumpShader.vert.
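
As a rough illustration of the fragment-shader side, the sketch below assumes the vertex shader has already transformed the light direction into tangent space, treats one channel of the texture as a height field, and builds a perturbed tangent-space normal from finite differences of neighboring texels. The names, texel spacing, and bump-strength factor are all placeholders to tune, and only a simple diffuse term is shown where your full (modified Phong) model belongs:

// Tangent-space bump-mapping sketch (names and constants are placeholders).
uniform sampler2D earthMap;   // used both as color map and as height field
varying vec3 L;               // light direction in tangent space (from vertex shader)

const float delta    = 1.0 / 512.0;   // assumed texel spacing in the height field
const float strength = 4.0;           // how strongly the bumps perturb the normal

void main( void )
{
    vec2 st = gl_TexCoord[0].st;

    // treat one channel of the texture as a height field and take
    // finite differences to estimate its gradient
    float h  = texture2D( earthMap, st ).r;
    float hS = texture2D( earthMap, st + vec2( delta, 0.0 ) ).r;
    float hT = texture2D( earthMap, st + vec2( 0.0, delta ) ).r;

    // perturbed normal in tangent space (the unperturbed normal is (0,0,1))
    vec3 normal = normalize( vec3( strength * ( h - hS ),
                                   strength * ( h - hT ),
                                   1.0 ) );

    // simple diffuse term using the perturbed normal; a full solution
    // would evaluate a complete (modified) Phong model here
    float diffuse = max( dot( normal, normalize( L ) ), 0.0 );

    vec4 texel = texture2D( earthMap, st );   // color-map lookup
    gl_FragColor = vec4( texel.rgb * diffuse, 1.0 );
}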


RenderMan is a registered trademark of Pixar.