This lab is designed to introduce you to using textures read from image files in your shader.
In this lab, you will implement shaders corresponding to the three texture mapping applications covered in lecture:
BASIC: Read texture map from file
INTERMEDIATE: Environment mapping
ADVANCED: Bump mapping
As with the previous assignments, two ZIP archives have been provided for you:
lab3files.zip
expands into a folder named lab3 which contains "starter"
code for your GLSL and RenderMan®
implementations; in addition, there is a textures subfolder
which contains several different texture images.
lab3results.zip
expands into a folder named lab3results which is a framework
into which you should place your solutions and resulting images.
As with the previous assignment, please place your solutions in the
lab3results folder, as follows:
Place your source code (not object files or binaries) in the appropriate
subfolder (e.g., your GLSL basic source files in the
GLSL/Basic folder, your RenderMan intermediate source
files in the RenderMan/Intermediate folder, etc.).
Please do not submit copies of the texture files I have provided; I
already have copies of those, and including them will add about 13MB to
your submission.
If you use your own image (or images), though, please do submit
it (or them).
Place your result images in the Results subfolder, with
names of the form
lang-task.suffix.
For lang, use glsl or
rman,
as appropriate; for task, use basic,
intermediate, or advanced;
suffix should match the type of the image file
you created (typically tiff, png,
gif,
etc.).
For example, the result image for the Basic RenderMan task might
be named rman-basic.tiff, and the Advanced GLSL task result
image might be named glsl-advanced.png.
Create a ZIP archive of your results folder by using the "Compress" entry of the right-click menu for the folder; to submit it, run the command
try wrc-grd shading-3 lab3results.zip
on a CS system that supports 'try' (e.g., one of the compute servers, or one of the machines in ICLs 1-4).
Part 1 is designed to introduce you to the mechanism by which textures are read in from a file and sampled in a shader. For this shader, you will read in a texture from an image file and apply the values in the shader to a number of parameters in the shading algorithm. Part 1 consists of 3 sub-parts. You can either combine all sub-parts into a single "mega-shader", or create individual smaller shaders for each sub-part.
A) Use the texture to define the color of an object.
B) Use individual channels of the texture to define the opacity of an object; supply which channel to use (0,1,2) as a parameter to the shader.
C) Repeat subpart (A), but swap the axes of the texture within the shader (i.e., make the horizontal direction vertical, and vice versa). Feel free to experiment with other remappings of how the texture coordinates are applied.
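The three sub-parts above can be combined into a single "mega-shader". A sketch of what that fragment shader might look like in GLSL 1.x style is below; the uniform names (tex, channel, swapAxes) are illustrative, and you would set them from your application however you prefer:

```glsl
// Hypothetical Part 1 "mega-shader" fragment program (sketch).
uniform sampler2D tex;     // texture bound to texture unit 0
uniform int channel;       // 0, 1, or 2 -- which channel drives opacity (sub-part B)
uniform bool swapAxes;     // true: swap s and t (sub-part C)

void main( void )
{
    vec2 st = gl_TexCoord[0].st;
    if( swapAxes )
        st = st.ts;                    // horizontal becomes vertical, and vice versa

    vec4 texel = texture2D( tex, st );

    // Sub-part A: the texture defines the color.
    vec3 color = texel.rgb;

    // Sub-part B: one channel of the texture defines the opacity.
    float alpha = texel[channel];

    gl_FragColor = vec4( color, alpha );
}
```

If you prefer separate, smaller shaders per sub-part, each one is just a subset of the above.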
Several texture maps can be found in the textures
folder of the distribution ZIP file, in both BMP and TIFF formats:
a very boring one
(boring.bmp, boring.tif);
a more exciting one
(earth.bmp, earth.tif);
and a panorama image
(landscape.bmp, landscape.tif)
intended to be used for the Intermediate portion of the assignment.
You are also free to make use of your own favorite image for the texture. Just be aware that textures for RenderMan® should be in TIFF format and, for this lab, textures for GLSL should be in BMP format. If you do this, please submit a copy of the texture file along with your solution.
You can find many freely-usable textures (and models) online.
Note that you will be submitting more images for this task than in previous assignments; you'll need to submit an image for each sub-part for each renderer (a total of six images). Please also submit a README file which indicates the parameters and textures used in generating each image.
For RenderMan you can use the old standby scene from Lab 1 (included in
the ZIP distribution for convenience).
Apply the shaders above to any of the three objects in the scene.
Please be aware that you may get an interesting-looking mapping if the
texture is applied to the sphere (unless your texture is designed for
spherical mapping).
Be sure to convert your texture to PRMan's texture format by running
txmake before rendering with your shader.
One note about RenderMan texturing.
RenderMan will automatically set up texture coordinates using a set of
default rules based on the type of primitive being shaded.
Though this is handy, it might be confusing if you're not aware of the
rules.
For example, observe the difference between applying your shader to the
wall of rit.rib (which defines absolute values for the
vertex coordinates) and that of rit2.rib (which applies scaling
to the wall polygon).
You may find it handy to use the -mode periodic option
to txmake (e.g., txmake -mode periodic earth.tif earth.tx).
For GLSL, texture access is typically done in the fragment shader.
However, you must ensure that all of the necessary values (e.g., texture
coordinates) are attached to the vertices and passed to the fragment
shader via a vertex shader.
This can be passed via the predefined varying variable
gl_TexCoord.
Note that this variable is an array; for this lab, you will be using the
first element of this array.
Texture coordinates from OpenGL are available to the vertex program in the
variables gl_MultiTexCoordn where
n is the texture value of interest (for
this lab, n will be 0).
Thus, adding the following line to your vertex
shader will ensure that the texture coordinate is indeed attached to a
vertex and will be properly interpolated and available when the fragment
shader is run:
gl_TexCoord[0] = gl_MultiTexCoord0;
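Putting it together, a minimal vertex shader for the Basic task could look like the following sketch; it does no lighting, only transforms the vertex and forwards the texture coordinate:

```glsl
// Minimal pass-through vertex shader for the Basic task (sketch).
void main( void )
{
    gl_TexCoord[0] = gl_MultiTexCoord0;                     // forward texture coords
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; // standard transform
}
```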
Setting up textures in OpenGL is a bit complex, even when not passing to
a shader.
To see how textures are loaded and passed from the OpenGL application
to your shader, please use the file lab3.cpp as a guide.
This file attempts to recreate the basic scene used in the RenderMan
exercises.
Be sure to read the comments included in this file.
Also included is lab3-trackball.cpp which enables a
trackball-like interaction for moving the scene.
Feel free to use either.
As you may remember, OpenGL does not natively support any image file format for texturing. There are many freely available libraries for reading image files in a variety of formats. For this lab, we will use the DevIL library, an open-source community project used by many OpenGL programmers. Note that the DevIL library relies on other free packages to support some file formats (like JPEG, TIFF, and PNG). We currently do not have these other libraries installed on the iMacs (which is why the use of BMP files is suggested).
You may have trouble getting XCode to use these libraries, so you may want to fall back on using the compiler by hand for this lab:
g++ -g -o foo lab3.cpp ShaderSetup.cpp -framework GLUT -framework OpenGL -I/usr/local/include -L/usr/local/lib -lILUT -lILU -lIL
If that doesn't work for you, I will be installing a backup version in my own account:
g++ -g -o foo lab3.cpp ShaderSetup.cpp -framework GLUT -framework OpenGL -I/home/fac/wrc/ILdist/include -L/home/fac/wrc/ILdist/lib -lILUT -lILU -lIL
If you receive complaints about a function named
ilutGLLoadImage() being undefined, add the statement
#define ILUT_USE_OPENGL
immediately before the include of IL/ilu.h in
lab3.cpp.
In this part, you will implement environment mapping using a cylindrical
projection.
Recall that rather than using the shading point directly, the perfectly
reflected direction is used to index the texture map.
Please refer to the Texture 1 slides for the equations of
cylindrical projection.
Use the panorama image
(landscape.bmp, landscape.tif)
found in the textures folder for this part.
(This image comes from
Marlin Studios).
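As a starting point, a fragment-shader sketch of the cylindrical environment mapping is shown below. The varying names (ecPos, ecNorm) are assumptions, standing for the eye-space position and normal your vertex shader would pass along; check the Texture 1 slides for the exact form of the projection equations, since this is just one common formulation:

```glsl
// Environment-mapping fragment shader sketch (GLSL 1.x style).
uniform sampler2D envMap;  // the panorama image

varying vec3 ecPos;        // eye-space position (assumed, from vertex shader)
varying vec3 ecNorm;       // eye-space normal (assumed, from vertex shader)

const float PI = 3.14159265358979;

void main( void )
{
    vec3 I = normalize( ecPos );                // view direction (eye at origin)
    vec3 R = reflect( I, normalize( ecNorm ) ); // perfectly reflected direction

    // Cylindrical projection of R: the angle around the vertical axis
    // gives s, the height along the axis gives t.
    float s = atan( R.x, -R.z ) / (2.0 * PI) + 0.5;
    float t = R.y * 0.5 + 0.5;

    gl_FragColor = texture2D( envMap, vec2( s, t ) );
}
```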
For this final part, you will be implementing bump mapping to create a
textured map of the world by applying the earth image
(earth.bmp, earth.tif)
as both a color map and a bump map.
Recall that with bump mapping, your underlying illumination model must
somehow depend upon the normal.
A modified Phong would do the trick.
Initially, you should try mapping the texture on the wall of the scene.
If you feel daring, try applying it to the sphere (especially challenging
in GLSL, where you must calculate the spherical coordinates
in the shader).
For RenderMan, recall that there is a special shader just for bump mapping called a Displacement Shader. Thus, to create this effect you will need both a displacement and a surface shader.
With GLSL, all must be done in the fragment shader.
Remember to pass the correct vectors (in the correct space, namely
tangent space) from the vertex shader to the fragment shader via
varying variables.
In order to convert your vectors into tangent space, you will need the
tangent vector at a point.
A GLSL function to find this vector is used in the supplied
file bumpShader.vert.
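One way the GLSL fragment shader might perturb the normal is sketched below. It treats the red channel of the earth image as a height field and estimates its slope with finite differences; the varying lightVec is assumed to be the tangent-space light direction set up by your vertex shader, and bumpScale and texSize are illustrative uniforms you would set from the application:

```glsl
// Bump-mapping fragment shader sketch (GLSL 1.x style).
uniform sampler2D earthTex;
uniform float bumpScale;   // strength of the bumps (assumed uniform)
uniform float texSize;     // texture width/height in texels, e.g. 512.0

varying vec3 lightVec;     // tangent-space light direction (assumed varying)

void main( void )
{
    vec2 st = gl_TexCoord[0].st;
    float d = 1.0 / texSize;

    // Use one channel of the color map as a height field, and take
    // finite differences to estimate its slope.
    float h  = texture2D( earthTex, st ).r;
    float hs = texture2D( earthTex, st + vec2( d, 0.0 ) ).r;
    float ht = texture2D( earthTex, st + vec2( 0.0, d ) ).r;

    // Perturbed tangent-space normal (the unperturbed normal is (0,0,1)).
    vec3 N = normalize( vec3( bumpScale * (h - hs),
                              bumpScale * (h - ht),
                              1.0 ) );

    // Simple diffuse ("modified Phong") term using the perturbed normal.
    float diffuse = max( dot( N, normalize( lightVec ) ), 0.0 );

    vec3 base = texture2D( earthTex, st ).rgb;  // color map
    gl_FragColor = vec4( base * diffuse, 1.0 );
}
```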