Assignment 4 Frequently Asked Questions (FAQ)

Some general comments:

As far as this assignment is concerned, all of your shaders will probably look something like this:

set intensity of light seen by viewer to zero
for every light
   determine incident light intensity on current vertex from current light
   determine how much of that light reflects to the viewer
   add this to the total amount of light seen by the viewer

Note that, in general, not all shaders have to loop over the lights as shown above. For example, if you take a look at the reflection map example shader, you will see that the color at each vertex depends only on the color in a texture map. The lights are not taken into consideration.
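
If it helps to see that loop as actual C, here is a minimal sketch. It is not the assignment's real interface: Color, shadeVertex, incident[], and kd are hypothetical stand-ins, with incident[i] playing the role of a light shader's output for light i and kd standing in for a simple diffuse surface response.

  typedef struct { float r, g, b; } Color;

  /* A stripped-down version of the generic loop: start with no light seen
     by the viewer, then accumulate each light's reflected contribution. */
  Color shadeVertex(const Color incident[], int nLights, Color kd)
  {
      Color seen = { 0.0f, 0.0f, 0.0f };   /* light seen by the viewer */
      int i;

      for (i = 0; i < nLights; i++) {
          seen.r += kd.r * incident[i].r;  /* reflect part of the incident light */
          seen.g += kd.g * incident[i].g;
          seen.b += kd.b * incident[i].b;
      }
      return seen;
  }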


Q: A comment at the end of shader.c says "there are many ways to add the light shaders." What different ways are there?

A: There are at least two ways of doing this.

If you like, you can create a global pointer to a light shader, and call that from within each of your surface shaders to figure out how much light is incident on the surface from each of the lights. Alternatively, you can duplicate the code from one of your surface shaders, and insert your light shader code into that. The former is more general, since then it's possible to combine any of your light shaders with any of your surface shaders; however, the latter is fine too. We will accept code written either way.

If you're feeling really ambitious, you can try creating a light shader pointer for every light, so that each light can be of a different type. This is not required, but if you're interested in doing this, consider looking at how doConfig() in viewer.c uses the configuration file routines to gather parameters for multiple lights. You will probably want to do something similar in your shaderSetup() function.
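
For reference, here is a minimal sketch of the function-pointer approach, using hypothetical names (Color, LightFunc, simplePointLight, currentLightShader); the real interface is whatever you define in shader.c, and the pointer would be set in your shaderSetup() based on the configuration file.

  typedef struct { float r, g, b; } Color;

  /* A light shader returns the light incident on the current vertex
     from one light. */
  typedef Color (*LightFunc)(int whichLight);

  /* One possible light shader: constant white light, ignoring falloff. */
  static Color simplePointLight(int whichLight)
  {
      Color c = { 1.0f, 1.0f, 1.0f };
      (void)whichLight;                /* real code would use lights[whichLight] */
      return c;
  }

  /* Surface shaders call through this pointer; shaderSetup() decides
     which light shader it points to. */
  static LightFunc currentLightShader = simplePointLight;

Inside a surface shader, each light's contribution is then fetched with currentLightShader(i). For the one-shader-per-light variant, replace the single pointer with an array of LightFunc, one entry per light.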


Q: How do I specify which light shader to use?

A: You need to add code to your shaderSetup() function to get the name of the light shader from the configuration file. Depending on how you add the light shaders to your shader.c, you will want to specify light shaders either by using a new parameter called "whichLightShader" or by using the existing parameter called "whichShader."


Q: Why doesn't the light's intensity in the sample light shader fall off as 1/r^2, as light does in real life?

A: This is a classic graphics hack, done either for performance or for expedience. There is no deep reason the intensity doesn't fall off; constant intensity is often easier to control when you are trying to achieve a particular effect. Your shaders don't need to implement 1/r^2 falloff, though you're free to do so.
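
If you do decide to add falloff, a minimal sketch of an inverse-square attenuation helper looks like the following; attenuate and rmin are hypothetical names, and r is the distance from the light to the vertex.

  /* Scale a light's intensity by 1/r^2, clamping very small distances so
     the result doesn't blow up when the light sits on the surface. */
  float attenuate(float intensity, float r)
  {
      float rmin = 0.01f;               /* hypothetical minimum distance */
      float d = (r > rmin) ? r : rmin;
      return intensity / (d * d);
  }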


Q: What are uscale and vscale in the LightInfo struct used for?

A: These two parameters are useful for projected textures. For instance, you can use them, along with your projected texture's width and height, to adjust how to map texture coordinates to (or from) the coordinate space defined by the light's up and right vectors.

Projected textures are discussed in more detail below.


Q: I can't get the noise function FBm to work correctly. What's going wrong?

A: Maybe you need to #include "mnoise.h"? You might also find the comments at the top of that file to be helpful.


Q: OpenGL lights have a specular component. Should we implement specular reflections by adding a specular component to our lights?

A: No. Real lights do not emit different amounts of light for different types of reflections. Instead of adding specular components to your lights, you should add a specular component to your surface.
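
As a concrete example, here is a sketch of a Phong-style specular term that belongs to the surface rather than the light. The names are hypothetical; n, l, and v are unit vectors (surface normal, direction to the light, direction to the viewer), and ks and shininess are surface parameters.

  #include <math.h>

  static float dot3(const float a[3], const float b[3])
  {
      return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
  }

  /* Phong specular term: reflect the light direction about the normal and
     compare it with the view direction. */
  float specularTerm(const float n[3], const float l[3], const float v[3],
                     float ks, float shininess)
  {
      float r[3];
      float ndotl = dot3(n, l);
      float rdotv;
      int i;

      if (ndotl <= 0.0f)                /* light is behind the surface */
          return 0.0f;

      for (i = 0; i < 3; i++)           /* r = 2(n.l)n - l */
          r[i] = 2.0f * ndotl * n[i] - l[i];

      rdotv = dot3(r, v);
      if (rdotv <= 0.0f)                /* highlight faces away from the viewer */
          return 0.0f;

      return ks * (float)pow(rdotv, shininess);
  }

Multiply the result by the light's single intensity and add it to your diffuse term; the point is that ks and shininess are properties of the surface.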


Q: For the 3D checkerboard shader, what do you mean by "use the object-space coordinates of each vertex to obtain 3D texture coordinates"?

A: We want you to write a shader that uses the object-space coordinates of the vertices, which are stored in rawSurfacePosition, to determine which checkerboard color to use. To do this, you will probably want to shift and scale the rawSurfacePosition coordinates to be in a well-defined, easy-to-use range, such as 0 to 1 or 0 to the number of checks in a given direction. You will almost certainly find that the object-space bounding box is needed to perform this step. After the shifting and scaling, you have what we call the 3D texture coordinates, which can then be used to directly compute which color to select.
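
As a sketch of the shift-and-scale step only, with hypothetical names (bboxMin, bboxMax, nChecks); the real bounding box values come from wherever the assignment provides them.

  /* Map one object-space coordinate into the range [0, nChecks), given the
     object-space bounding box extents along that axis. */
  float checkerCoord(float p, float bboxMin, float bboxMax, int nChecks)
  {
      float t = (p - bboxMin) / (bboxMax - bboxMin);  /* shift and scale to [0,1] */
      return t * (float)nChecks;                      /* then to [0, nChecks) */
  }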

We do not want you to write a shader that maps a 2D checkerboard pattern onto the surface of an object. You will lose a significant number of points if you do this.


Q: I'm still stuck on the checkerboard shader. How do I directly compute which color to select?

A: This is an important part of the problem, and you have to figure it out on your own. But here is a hint to get you started:

Think of a 2D checkerboard. Pretend that the lower left square is light, and call that square 0,0. The rest of the squares are numbered according to their location on the board: 0,0 0,1 0,2 are the first three squares in the first row, 0,0 1,0 2,0 are the first three squares in the first column. Given that sort of parameterization of the checkerboard, you can determine the color of a square given its coordinates: 0,0 is light; 0,1 is dark; 0,2 is light; etc.

Solve the 2D problem first, by finding a function that decides whether x,y is light or dark. Then extend your function to work for x,y,z in 3D.


Q: I think I implemented my checkerboard shader correctly, but when I shade the teapot, it looks distorted! What am I doing wrong?

A: Perhaps nothing.

Think of taking a 3D cookie cutter to an infinite stack of alternating gray and brown cubes. If your object is curved, you might see what look like rings. And if you lop off the corners of cubes, you might see triangles instead of squares.

If you reduce the number of cubes and shade the sphere, you might be able to visualize what things are supposed to look like. This might help you understand whether or not what you are seeing is correct.


Q: Can you give me some hints on how to make a nice looking wood shader?

A: Wood has layers of dark and light bands because trees grow a new ring each year. The layers are not perfectly flat, and they also vary in thickness.

Find a way to model the layers, then find a way to use the noise function (FBm) to make the layers wavy and bumpy. Use the noise function again to vary the thickness of the layers.

There are certainly other ways to model wood, so if you've used something different and are satisfied with the results, then don't go reimplementing your wood shader.
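
As a rough sketch of the ring-based approach described above (woodTone is a hypothetical helper; x and y are coordinates in a plane perpendicular to the trunk axis, ringFreq is a tuning parameter, and noise is whatever value you get back from FBm; see mnoise.h for its actual interface):

  #include <math.h>

  /* Concentric rings around the trunk axis, made wavy by a noise term.
     Returns a brightness to blend between a light and a dark wood color. */
  float woodTone(float x, float y, float ringFreq, float noise)
  {
      float r = (float)sqrt(x * x + y * y);  /* distance from the trunk axis */
      float ring = r * ringFreq + noise;     /* noise makes the rings wavy */
      float t = ring - (float)floor(ring);   /* position within one ring, 0..1 */
      return (t < 0.7f) ? 1.0f : 0.3f;       /* wide light band, narrow dark band */
  }

Add a second noise term if you want the layer thickness itself to vary.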


Q: For the wood shader, do we have to have specular reflections?

A: It's your choice. Some types of wood have specular reflections, some types of wood do not.

Note that, if you call your diffuse plus specular shader (i.e. your red plastic shader) from inside your wood shader, you can get both diffuse and specular reflections without writing any new code. Just have your wood shader set the diffuse color (and possibly also the specular color) before calling your diffuse plus specular shader to finish off the lighting computation.


Q: For the bump-mapped surface, the assignment says to use a specular shading model. Does this mean we shouldn't have any ambient or diffuse reflection? It sure looks bad if I don't.

A: For the bump-mapped shader, we want you to implement diffuse reflection as well as specular reflection. Ambient reflection is optional. You can reuse your code from the red plastic shader if you like, as long as you configure it for a gray surface and not a red one.


Q: For the bump mapped shader, where do I get the texture coordinates u and v? Where do I get Blinn's Pu and Pv?

A: You did read the online guide, didn't you? In there, we point you to shader.h. Please read the comments in that file.


Q: For the bump-mapped shader, where do I get Blinn's F(u,v) (the function to perturb the normal) from?

A: You have two choices. Either you can read in an image using the image functions described in image.h, or you can use a mathematical function of your choosing. In the former case, you can use images/stencilblur.ppm or your own image. If you choose to use your own image, or if you choose to use your own function, remember that we want to see your bump mapping shader in action. Do not choose a function that does not show off your bump mapping shader!
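
If you go the mathematical route, one possible (purely hypothetical) choice for F(u,v) is a regular grid of ripples; amplitude and frequency are parameters you would tune so the bumps show up clearly.

  #include <math.h>

  #define BUMP_PI 3.14159265f

  /* A simple height function for bump mapping: a grid of ripples over (u,v). */
  float bumpHeight(float u, float v, float amplitude, float frequency)
  {
      return amplitude * (float)(sin(2.0 * BUMP_PI * frequency * u) *
                                 sin(2.0 * BUMP_PI * frequency * v));
  }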


Q: For the spotlight shader, should we use the configuration file parameter lightNSpotAngle to set the angle of our spotlights?

A: You can, if you like.

Alternatively, you can ignore lightNSpotAngle and use a single global for all of your spotlights.
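
Either way, the cone test itself is the standard one. Here is a sketch with hypothetical names: lightPos and lightDir describe the light (lightDir unit length, pointing the way the light is aimed), point is the vertex being shaded, and cutoffDegrees is the cone's half-angle.

  #include <math.h>

  /* Return 1 if the point lies inside the spotlight's cone, 0 otherwise. */
  int insideSpotCone(const float lightPos[3], const float lightDir[3],
                     const float point[3], float cutoffDegrees)
  {
      float toPoint[3];
      float len, cosAngle, cosCutoff;
      int i;

      for (i = 0; i < 3; i++)
          toPoint[i] = point[i] - lightPos[i];

      len = (float)sqrt(toPoint[0]*toPoint[0] + toPoint[1]*toPoint[1] +
                        toPoint[2]*toPoint[2]);
      if (len == 0.0f)
          return 1;                     /* point coincides with the light */

      cosAngle = (toPoint[0]*lightDir[0] + toPoint[1]*lightDir[1] +
                  toPoint[2]*lightDir[2]) / len;
      cosCutoff = (float)cos(cutoffDegrees * 3.14159265f / 180.0f);

      return cosAngle >= cosCutoff;
  }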


Q: The projected shader does a projection along the z-axis. What do we have to do that's different from this?

A: You should be able to handle the general case of an arbitrarily positioned light, with orientation defined by the light direction, up, and right vectors, plus texture scaling using uscale and vscale. Assume that the vectors describing the orientation of the light are all of unit length and at right angles to one another, and assume that the plane containing the texture is perpendicular to the light direction.

The exact distance from the light to the texture is up to you, as long as you specify scaling factors in your configuration file to project a nicely-positioned image onto the object of your choice. We do not want to have to tweak your configuration files to make things look nice.

Note that you have to specify the light direction, up, and right vectors in your configuration file. It's easiest to start by placing the light on the z axis, then choosing the light direction to be along the -z axis (0 0 -1), the up vector to be along the y axis (0 1 0), and the right vector to be along the x axis (1 0 0). This way, you are sure to have all of the vectors with unit length and at right angles to one another. Don't worry about the fact that this is a left-handed coordinate system and not a right-handed one; things will still work fine.

Once you've set up your configuration file in this way, make sure you test things by using the "Move Lights" option on the Shader Menu, which is accessible with the right mouse button.

Finally, modify your configuration file and test the case where your light does not necessarily start out on the z axis.


Q: I'm stuck on the projected light shader. How do I start?

A: One way to implement the projected light shader is to extend the sample projected light shader, which assumes the light is sitting somewhere on the z axis and is pointing toward -z. To handle a light at any position and orientation, you might think about how to translate and rotate such a light to place it in the position and orientation required by the sample projected light shader.

You will probably find that the translation is the easy part and that the rotation is the hard part.

Hint on the rotation: you have light direction, up, and right vectors supplied to you; these vectors are all of unit length and are all at right angles to one another. You can do the rotation simply by forming a 3x3 matrix out of the three direction vectors and doing a matrix multiply. The details are for you to figure out.

Be warned that, although you specify the light direction, up, and right vectors as constants in the configuration files, these values will change as you move the light with the "Move Lights" menu option. Please do not assume that these values will remain constant.


Q: What are the light's up and right vectors supposed to do?

A: The up and right vectors are needed to orient the projected light shader's texture. Specifically, if your texture is centered on the ray going from the light along the light's direction, you should further align the texture so that its horizontal u axis is aligned with the right vector and its vertical v axis is aligned with the up vector.

You can assume that the up, right, and direction vectors are all of unit length and all at right angles to one another; however, you must set this up in your configuration file for this to really be the case. Please see the other questions on the projected light shader for more info on this.


Q: Could you please elaborate on what you're looking for in the disco ball shader?

A: Sure. We are looking for a light shader that radiates a collection of at least 50-100 spots from a point light source. The spots can be circular, rectangular, or otherwise simply shaped. One important criterion, which we did not explicitly mention in the handout but which falls under the "reasonably nice" category, is that all of the spots should be about the same size and shape, and they should be distributed relatively uniformly over the sphere surrounding the point light source; spots that narrow or converge considerably near the poles will not receive full credit.

A real disco ball light makes use of a spotlight shining on a faceted, mirrored ball. This is unlike our light shader, which is equivalent to placing a point light source at the center of a spherical mask with holes in it. If you like, you are welcome to model the spotlight and faceted ball combination; just make sure you mention this in your README. Also make doubly sure that you make it clear how to modify your configuration file to reorient the spotlight and the ball, since the geometry will be much too tricky for us to figure out by guessing.

Don't sweat too much over the uniform distribution of spots. If you want to skip that part, only a few points will be deducted.
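
If you do want roughly uniform spots, one standard trick (sketched below with hypothetical names) is to place the spot directions on a Fibonacci spiral over the sphere: directions evenly spaced in z and rotated by the golden angle end up roughly evenly spaced in area, with no crowding at the poles.

  #include <math.h>

  /* Compute the k-th of n roughly uniformly distributed unit directions. */
  void spotDirection(int k, int n, float dir[3])
  {
      const float golden = 2.39996323f;         /* golden angle in radians */
      float z = 1.0f - 2.0f * (k + 0.5f) / n;   /* even spacing in z ...   */
      float r = (float)sqrt(1.0f - z * z);      /* ... gives even spacing in area */
      float phi = golden * k;

      dir[0] = r * (float)cos(phi);
      dir[1] = r * (float)sin(phi);
      dir[2] = z;
  }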


Q: I see strange artifacts at the poles of my sphere and teapot, and I think this is because every so often I get a surfaceNormal that contains invalid floating point values (NaNs). Am I doing something wrong?

A: Probably not. The normal-computing code in the NURBS dicer doesn't handle degenerate cases gracefully. If you see flicker-type errors at the poles of the teapot or the sphere as a result of bad normals, ignore them.


Q: Aaaaaaagh! How do I use the noise function?

A: Turn to the whole page full of documentation about noise and you will (Matt hopes) be enlightened.


Q: Which way does lights[i].direction point? From the light to the object or from the object to the light?

A: Neither. It points from the light in the direction the light is aimed; it does not necessarily point at the object.

The light direction is only used for directional lights. It is not used for omnidirectional lights (lights that shine equally in all directions).


Q: Regarding the images I'm supposed to put up on my web page, can I make those images have a really high dicing factor?

A: We want the images on your web page to be made using the configuration files you turn in, the dicing factor included. If you want to make extra images that have a really high dicing factor, feel free to do so. Just be sure to note which images are which.

Related to this, remember to set the dicing in your configuration files to values that cause your images to be rendered in a few seconds or less. If we have to wait forever for your images to render, you will make us very unhappy.


Q: What exactly is going to get picked up by the submit script?

A: The following files will be picked up by the submit script:

README      Your README file
shader.c    Your shader code
*.cfg       Your configuration files
*.ppm       Any textures that go along with your shaders

The submission will fail if it is larger than 512K compressed. Please limit the size of your textures.

If you must submit large images, please contact cs248tas@graphics.stanford.edu long before you want to submit to make special arrangements.

Please do not submit the images you are placing on your web page.


Q: Any tips for how to debug my disco ball shader?

A: Yep. Try placing it at (0,0,0) and using the sphere object. This is an easy way to visualize the distribution of spots that you've generated. (Note that you may need to adjust your surface shading model to shade even when edotn is less than zero; otherwise the sphere won't be shaded, because the light is behind the surface according to the surface normal.)
CS248: Introduction to Computer Graphics, Pat Hanrahan, Fall 1998