Lab 3: Introduction to Textures

Assigned: 04/11

Due: 04/22


Introduction

This lab is designed to introduce you to using textures read from image files in your shader.

Shaders

In this lab, you will implement shaders corresponding to the three texture mapping applications covered in lecture:

As with previous assignments, two ZIP archives have been provided for you:

You will be using several texture images located in /usr/local/pub/wrc/graphics/textures/images, in BMP, JPG, and TIFF formats: a very boring one (boring.bmp, boring.jpg, boring.tif); a more exciting one (earth.bmp, etc.); a panorama image (landscape.bmp, etc.); and a simple black-and-white star (star.bmp, etc.). You will need to convert the ones you will use for the RenderMan portions of the assignment.

What to Submit

As with the previous assignment, please place your solutions in the lab3results folder, as follows:

Create a ZIP archive of your results folder by using the "Compress" entry of the right-click menu for the folder. Submit this ZIP archive with the command

try  wrc-grd  shading-3  lab3results.zip

on a CS system that supports 'try' (e.g., one of the compute servers, or one of the machines in ICLs 1-4).


Tasks

Part 1 - Basic

Part 1 is designed to introduce you to the mechanism by which textures are read from a file and sampled in a shader. For this part, you will read a texture from an image file and use its values to control a number of parameters in the shading algorithm. Part 1 consists of three sub-parts; you can either combine all of them into a single "mega-shader" or create an individual, smaller shader for each sub-part.

You may use any of the texture images found in the folder /usr/local/pub/wrc/graphics/textures/images. You are also free to make use of your own favorite image for the texture. Just be aware that textures for RenderMan should be in TIFF format and, for this lab, textures for GLSL can be in almost any format. If you do this, please submit a copy of the texture file along with your solution.

You can find many freely-usable textures (and models) online; here are some links:

Note that you will be submitting more images for this task than in previous assignments; you'll need to submit one image for each sub-part for each renderer (a total of six images). Please also submit a README file indicating the parameters and textures used in generating each image.

A: RenderMan

For RenderMan, you can use the old standby scene from Lab 1 (included in the ZIP distribution for convenience). Apply the shaders above to any of the three objects in the scene. Be aware that you may get an interesting-looking mapping if the texture is applied to the sphere (unless your texture is designed for spherical mapping). Be sure to convert your texture to prman's texture format with txmake before running your shader.
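As a starting point, a minimal RSL surface shader that samples a texture might look like the following. This is only a sketch: the shader name, the parameter names, and the default map name are placeholders, and the map must already have been converted with txmake.

```
surface texmap(
    float  Ka = 1, Kd = 0.8;
    string mapname = "boring.tx" )    /* placeholder default */
{
    normal Nf = faceforward( normalize(N), I );
    color  Ct = color texture( mapname );   /* uses the default s,t */
    Oi = Os;
    Ci = Os * Ct * ( Ka * ambient() + Kd * diffuse(Nf) );
}
```

Here the texture() call is left without explicit coordinates, so it picks up RenderMan's default s,t values as described below.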

One note about RenderMan texturing: RenderMan will automatically set up texture coordinates using a set of default rules based on the type of primitive being shaded. Though this is handy, it can be confusing if you're not aware of the rules. For example, observe the difference between applying your shader to the wall of rit.rib (which defines absolute values for the vertex coordinates) and that of rit2.rib (which applies scaling to the wall polygon). You may find it handy to use the -mode periodic option to txmake.

B: GLSL

For GLSL, texture access is typically done in the fragment shader. However, you must ensure that all of the necessary values (e.g., texture coordinates) are attached to the vertices and passed to the fragment shader via the vertex shader. This can be done via the predefined varying variable gl_TexCoord. This variable is an array; for this lab, you will use the first element of the array. Texture coordinates from OpenGL are available to the vertex shader in the variables gl_MultiTexCoordn, where n is the texture unit of interest (for this lab, n will be 0). Thus, adding the following line to your vertex shader ensures that the texture coordinate is indeed attached to each vertex and will be properly interpolated and available when the fragment shader runs:

gl_TexCoord[0] = gl_MultiTexCoord0;
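On the fragment side, the interpolated coordinate can then be used to sample the texture. A minimal sketch follows; the sampler name "tex" is a placeholder that must match the uniform name set by your application code:

```
uniform sampler2D tex;

void main( void )
{
    // sample the texture at the interpolated coordinate
    vec4 texColor = texture2D( tex, gl_TexCoord[0].st );

    gl_FragColor = texColor;   // or use it to modulate a lighting result
}
```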

Setting up textures in OpenGL is a bit complex, even when not passing them to a shader. To assist you in passing a texture from the OpenGL application to your shader, please use the file lab3.cpp as a guide. This file attempts to recreate the basic scene used in the RenderMan exercises. Be sure to read the comments included in this file. Also included is lab3-trackball.cpp, which enables a trackball-like interaction for moving the scene. Feel free to use either.

As you may remember, OpenGL does not natively support any image file format for texturing. There are many freely available libraries for reading image files in a variety of formats. For this lab, we will use the Simple OpenGL Image Library (SOIL), an open source project used by many OpenGL programmers.
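For reference, the core of a SOIL-based texture setup looks roughly like the following sketch. This is illustrative only, not the lab's actual code: the file name and the uniform name "tex" are placeholders, and lab3.cpp / InitTextures.cpp show what is really used in this lab.

```cpp
#ifdef __APPLE__
#include <OpenGL/gl.h>
#else
#include <GL/gl.h>
#endif
#include <SOIL/SOIL.h>

// Sketch: load an image into an OpenGL texture object and bind it to
// texture unit 0 for the given (already-linked) shader program.
GLuint loadAndBindTexture( GLuint program )
{
    GLuint texID = SOIL_load_OGL_texture(
        "earth.jpg",            // placeholder file name
        SOIL_LOAD_AUTO,         // keep the image's own channel count
        SOIL_CREATE_NEW_ID,     // let SOIL generate the texture id
        SOIL_FLAG_MIPMAPS | SOIL_FLAG_INVERT_Y );

    glActiveTexture( GL_TEXTURE0 );
    glBindTexture( GL_TEXTURE_2D, texID );

    // tell the shader which texture unit to sample from
    glUniform1i( glGetUniformLocation(program, "tex"), 0 );

    return texID;
}
```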

You may have trouble getting Xcode to use this library, so you may need to fall back on invoking the compiler by hand for this lab:

g++ -g -o lab3 -I/home/fac/wrc/SOIL/include \
   lab3.cpp InitTextures.cpp ShaderSetup.cpp \
   -L/home/fac/wrc/SOIL/lib  -lSOIL \
   -framework GLUT -framework OpenGL -framework CoreFoundation

Note the inclusion of another framework, CoreFoundation; this is used by the SOIL functions. For the trackball version:

g++ -g -o lab3track -I/home/fac/wrc/SOIL/include \
   lab3-trackball.cpp InitTextures.cpp ShaderSetup.cpp TrackBall.cpp \
   -L/home/fac/wrc/SOIL/lib  -lSOIL \
   -framework GLUT -framework OpenGL -framework CoreFoundation

For your convenience, a shellscript named COMPILE has been provided to simplify compilation of this code. When invoked without a command-line argument, it compiles and links the basic lab3 binary; when invoked with any command-line argument, it compiles and links the lab3track binary.


Part 2 - Intermediate

In this part, you will implement environment mapping using a spherical or cylindrical projection. Recall that rather than using the shading point directly, the perfectly reflected direction is used to index the texture map. Equations for these mappings can be found in many CG textbooks; two slides showing how to calculate the texture coordinates from the location of the point being shaded have been added to the Texture 1 slides in the lecture notes; see slides #68 and #69 in the updated PDF files.

For cylindrical mapping, use the panorama image (landscape.jpg, etc.) found in the textures folder for this part. (This image comes from Marlin Studios). For spherical mapping, use the Earth image (earth.jpg, etc.) found there. (This image courtesy of James Hastings-Trew.)
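The mapping equations themselves are just a little trigonometry. The following C++ sketch shows one common formulation for each mapping; the names are hypothetical, and the authoritative equations are the ones on slides #68 and #69 of the lecture notes.

```cpp
#include <cmath>

// Texture-coordinate pair in [0,1] x [0,1].
struct UV { double u, v; };

// Spherical (lat-long) mapping of a unit direction (e.g., the reflected
// view vector): longitude indexes u, latitude indexes v.
inline UV sphericalUV( double x, double y, double z )
{
    const double PI = 3.14159265358979323846;
    return { 0.5 + std::atan2(z, x) / (2.0 * PI),   // longitude -> [0,1]
             0.5 - std::asin(y) / PI };             // latitude  -> [0,1]
}

// Cylindrical mapping: longitude indexes the panorama horizontally;
// height along the cylinder's axis indexes it vertically.
inline UV cylindricalUV( double x, double y, double z,
                         double hMin, double hMax )
{
    const double PI = 3.14159265358979323846;
    return { 0.5 + std::atan2(z, x) / (2.0 * PI),
             (y - hMin) / (hMax - hMin) };
}
```

In your shaders, the same arithmetic is applied per shading point (RenderMan) or per fragment (GLSL), with the reflected direction substituted for the raw position.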

If you choose to do the cylindrical mapping, you may want to replace the sphere in the scene with a cylinder. If you do, also submit the RIB file and the modified OpenGL source code.


Part 3 - Advanced

For this final part, you will implement bump mapping to create a textured map of the world. Use the Earth image (earth.jpg, etc.) as a color map. Recall that with bump mapping, your underlying illumination model must somehow depend upon the normal; a modified Phong model will do the trick.

For the bump map, you can use any of the images in the texture directory, or find one of your own. The star image (star.jpg, etc.) may be easiest to use, as it is a black-and-white image; you can calculate gradient vectors from differences in adjacent pixel values and apply those to the surface normal to vary its orientation.
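The gradient calculation itself can be sketched as follows (C++, with hypothetical names): given a grayscale bump image stored row-major with values in [0,1], take central differences of neighboring texels. In your shader you would do the equivalent with adjacent texture samples.

```cpp
#include <algorithm>
#include <vector>

// Gradient of the height field at a texel, in texture space.
struct Grad { double du, dv; };

Grad heightGradient( const std::vector<double>& img, int w, int h,
                     int x, int y )
{
    // clamp lookups at the image borders so edge texels are safe
    auto at = [&]( int i, int j ) {
        i = std::max(0, std::min(w - 1, i));
        j = std::max(0, std::min(h - 1, j));
        return img[j * w + i];
    };
    return { (at(x + 1, y) - at(x - 1, y)) * 0.5,    // d(height)/ds
             (at(x, y + 1) - at(x, y - 1)) * 0.5 };  // d(height)/dt
}
```

The resulting (du, dv) pair is then used to tilt the surface normal before lighting.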

For a more interesting challenge, consider using the Earth image as both a color map and a bump map. Use the color differences between adjacent pixels to calculate gradients and apply these to create a "relief map" appearance. (See Chapter 22 of Computer Graphics with OpenGL, fourth edition, by Hearn, Baker, and Carithers, for an example of how to do this.)

Initially, you should try mapping the texture onto the wall of the scene. If you feel daring, try applying it to the sphere (especially challenging in GLSL, where you must calculate the spherical coordinates in the shader).

For RenderMan, recall that there is a special shader type just for bump mapping, the displacement shader. Thus, to create this effect you will need both a displacement shader and a surface shader.
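The displacement half of that pair might look like the following sketch (the shader name, parameter names, and default values are placeholders); the companion surface shader then does its usual lighting with the updated normal:

```
displacement bumpy(
    float  amp = 0.1;                 /* placeholder amplitude */
    string bumpname = "star.tx" )
{
    float hump = float texture( bumpname );   /* grayscale height */

    P += amp * hump * normalize(N);   /* displace along the normal */
    N  = calculatenormal(P);          /* recompute N from the new P */
}
```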

With GLSL, everything must be done in the fragment shader. Remember to pass the correct vectors (in the correct space, namely tangent space) from the vertex shader to the fragment shader via varying variables. In order to convert your vectors into tangent space, you will need the tangent vector at each point. A GLSL function that computes this vector is used in the supplied file bumpShader.vert.

Don't forget that there are good online tutorials for doing bump mapping in GLSL; feel free to consult them:


RenderMan® is a registered trademark of Pixar.