CS 445: Exam 2 Review, Fall 2014
Exam 2 will be Tuesday, Nov. 18, 2014.
Review class notes, online notes, labs, and relevant sections in textbook. The exam will be closed
book, no calculators.
Topics
See the midterm topics. This exam will primarily focus on the topics since the midterm.
However, everything is cumulative. One can't completely understand the new material without understanding the old material.
Transformation Pipeline
- Pipeline: Understand what the different stages of the geometry pipeline do
(also see the pipeline notes).
- View Matrix, V:
- Calculating the view matrix: V = viewRotation * Translate( -eye )
- "eye" is the location of the camera in the WCS
- Calculating u, v, n for the camera. Use these to create the viewRotation matrix.
Also see Coordinate Transform Notes
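The u, v, n construction above can be sketched in a few lines. This is a minimal Python sketch of the math only (the helper names are mine, not mat.h's): n points from the look-at point back toward the eye, u = up × n is the camera's "right", and the fourth column of V holds the rotated translation R·(-eye).

```python
import math

def sub(a, b):   return [x - y for x, y in zip(a, b)]
def dot(a, b):   return sum(x * y for x, y in zip(a, b))
def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0]]
def normalize(a):
    m = math.sqrt(dot(a, a))
    return [x / m for x in a]

def look_at(eye, at, up):
    n = normalize(sub(eye, at))   # camera looks down -n
    u = normalize(cross(up, n))   # camera "right"
    v = cross(n, u)               # camera "up"
    # Rows of viewRotation are u, v, n; the fourth column is
    # viewRotation * (-eye), i.e. -dot(axis, eye) per row.
    return [
        [u[0], u[1], u[2], -dot(u, eye)],
        [v[0], v[1], v[2], -dot(v, eye)],
        [n[0], n[1], n[2], -dot(n, eye)],
        [0.0,  0.0,  0.0,  1.0],
    ]

def apply(M, p):                  # M * [p, 1]
    return [sum(M[i][j] * q for j, q in enumerate(p + [1.0])) for i in range(4)]

V = look_at(eye=[0, 0, 5], at=[0, 0, 0], up=[0, 1, 0])
print(apply(V, [0, 0, 5]))   # the eye maps to the CCS origin
print(apply(V, [0, 0, 0]))   # the look-at point lands on the -n axis at z = -5
```

A quick sanity check: transforming the eye itself must give the origin of the camera coordinate system.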
- Modelview Matrix: Coordinate transformations of a point: Object Coordinates (OCS) to World Coordinates (WCS) to
Camera Coordinates (CCS):
- The Modelview matrix is the product of the view matrix, V, and the model transforms M (e.g. obtained from the scene graph).
See picture
- Also see Coordinate Transform Notes
- How do normals transform? If points transform as P' = M P, then normals transform as
N' = (M^-1)^T N. Why?
For which transforms can one substitute
M for (M^-1)^T?
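The "why" can be checked numerically: transforming a normal by M itself breaks perpendicularity under a non-uniform scale, while (M^-1)^T preserves it. A minimal sketch (my own example numbers; M is diagonal so its inverse transpose is easy to write down):

```python
def dot(a, b): return sum(x * y for x, y in zip(a, b))

M_diag = [2.0, 1.0, 1.0]                  # M = Scale(2, 1, 1)
T = [1.0, -1.0, 0.0]                      # tangent to the plane x + y = 0
N = [1.0, 1.0, 0.0]                       # its normal; dot(N, T) == 0

T2 = [m * t for m, t in zip(M_diag, T)]   # tangents transform by M
wrong = [m * n for m, n in zip(M_diag, N)]  # WRONG: normal transformed by M
right = [n / m for m, n in zip(M_diag, N)]  # (M^-1)^T = diag(1/2, 1, 1) here

print(dot(wrong, T2))  # 3.0 -- no longer perpendicular to the surface
print(dot(right, T2))  # 0.0 -- perpendicularity preserved
```

For pure rotations, M^-1 = M^T, so (M^-1)^T = M and the substitution is safe; it is the (non-uniform) scales and shears that force the inverse transpose.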
- Projection and Viewport Transformations:
- What is the view volume (frustum)?
Orthographic vs. perspective.
- What is the purpose of the projection transformation? How does it change the
frustum? See picture.
- What is the purpose of the viewport transformation? See picture.
- See notes and the methods in mat.h:
Ortho(), Frustum(), Perspective().
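As a concrete instance of "how the projection changes the view volume," here is a sketch of the matrix Ortho(left, right, bottom, top, near, far) produces: a scale plus translate mapping the orthographic box onto the canonical cube [-1, 1]^3 (NDC). This mirrors the standard formula behind mat.h's Ortho(); the helper code is mine.

```python
def ortho(l, r, b, t, n, f):
    return [
        [2/(r-l), 0.0,      0.0,      -(r+l)/(r-l)],
        [0.0,     2/(t-b),  0.0,      -(t+b)/(t-b)],
        [0.0,     0.0,     -2/(f-n),  -(f+n)/(f-n)],
        [0.0,     0.0,      0.0,       1.0],
    ]

def apply(M, p):                  # M * [p, 1]
    return [sum(M[i][j] * q for j, q in enumerate(p + [1.0])) for i in range(4)]

P = ortho(-10, 10, -5, 5, 1, 100)
# Corners of the view volume land on corners of the NDC cube
# (note near/far are distances, so z_eye = -near and z_eye = -far):
print(apply(P, [-10, -5, -1]))    # ~ (-1, -1, -1, 1)
print(apply(P, [10, 5, -100]))    # ~ (1, 1, 1, 1)
```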
- What is the Perspective Division? Why is it separated out from the projection transformation?
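A sketch of the division itself, with made-up clip coordinates: the perspective projection matrix stores -z_eye in the w component, and dividing by w afterwards performs the foreshortening. It is separated out because division is nonlinear, so no single 4x4 matrix can do it.

```python
def perspective_divide(clip):            # clip = (x, y, z, w)
    x, y, z, w = clip
    return [x / w, y / w, z / w]

# Two points with the same x, y extent, at different depths (w = -z_eye):
near_pt = [1.0, 1.0, 0.0, 2.0]           # w = 2  (closer to the camera)
far_pt  = [1.0, 1.0, 0.5, 10.0]          # w = 10 (farther away)
print(perspective_divide(near_pt))       # [0.5, 0.5, 0.0]
print(perspective_divide(far_pt))        # [0.1, 0.1, 0.05] -- appears smaller
```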
- What is rasterization?
Controls for Navigation through a scene
- Fly through transformations. See your code from Lab 4.
- Tumble, track (pan), dolly (zoom) - For tumble, see Coordinate Transform Notes
(Note: in lab we did not differentiate between zoom and dolly, but zoom is actually a little different. With zoom,
you don't move the camera but rather change the focal length so objects appear closer or farther. With dolly, you
keep the camera focal length fixed but physically move the camera closer or farther. The lab had you dolly the camera.)
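The zoom/dolly distinction fits in one pinhole-camera formula: an object of height h at distance d projects to roughly f·h/d (f = focal length). Zoom changes f; dolly changes d. A sketch with made-up numbers, showing that zoom magnifies everything uniformly while dolly changes the relative sizes of near and far objects (i.e. the perspective):

```python
def projected_height(f, h, d):
    return f * h / d             # pinhole approximation

h, d_near, d_far = 1.0, 5.0, 10.0   # two equally tall objects, different depths
f = 1.0

ratio_before = projected_height(f, h, d_near) / projected_height(f, h, d_far)
# Zoom in (double f): both grow by the same factor, ratio unchanged.
ratio_zoom = projected_height(2*f, h, d_near) / projected_height(2*f, h, d_far)
# Dolly in (move the camera 2.5 units forward): both distances shrink by the
# same amount, so the near object grows faster and the ratio changes.
ratio_dolly = projected_height(f, h, d_near - 2.5) / projected_height(f, h, d_far - 2.5)
print(ratio_before, ratio_zoom, ratio_dolly)
```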
Shaders
- What is a vertex shader and what is a fragment shader?
- Vertex-Fragment Shader Pipeline: Where do vertex and fragment shaders fit in the OpenGL graphics pipeline?
- What gets implemented in the application program (e.g. setting vertex buffers for the attribute variables, setting uniform variables)? What is implemented in the shaders? What does OpenGL automatically do for you
(e.g. perspective division, rasterization)?
- Communication between OpenGL and GLSL: uniform and attribute variables. What is the difference? Which shaders have access to
uniform variables? To attribute variables?
- Implementing the Phong Lighting Model in the vertex shader (Gouraud Shading) vs the fragment shader (Phong Shading).
What are advantages or disadvantages of each?
Also see Shading Models
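The key difference shows up along a single interpolated edge: Gouraud lights at the vertices and interpolates the resulting colors, while Phong interpolates the normal and lights at each fragment. A minimal sketch (diffuse term only, max(N·L, 0), with my own example normals tilted ±60° and the light facing the surface) shows Gouraud missing a bright spot that Phong catches:

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def normalize(a):
    m = math.sqrt(dot(a, a))
    return [x / m for x in a]
def lerp(a, b, t): return [x + t * (y - x) for x, y in zip(a, b)]

L = [0.0, 0.0, 1.0]                        # light direction, toward the surface
s, c = math.sin(math.radians(60)), math.cos(math.radians(60))
n0 = [-s, 0.0, c]                          # vertex normal tilted -60 degrees
n1 = [ s, 0.0, c]                          # vertex normal tilted +60 degrees

def shade(n): return max(dot(normalize(n), L), 0.0)   # diffuse term only

# Gouraud: light at the vertices, then interpolate the colors to the midpoint.
gouraud_mid = (shade(n0) + shade(n1)) / 2   # ~ 0.5 -- dim
# Phong: interpolate the normal to the midpoint, then light the fragment.
phong_mid = shade(lerp(n0, n1, 0.5))        # ~ 1.0 -- the midpoint faces the light
print(gouraud_mid, phong_mid)
```

This is exactly why specular highlights in the middle of large triangles wash out under Gouraud shading; the tradeoff is that Phong shading runs the lighting model per fragment, which costs more.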
- Implementing a color map in the shader.
- Advantages of using shaders over the fixed pipeline.
Blending and transparency with the alpha component.
- How does a z-buffer work?
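The z-buffer fits in a few lines: per pixel, keep the smallest depth seen so far, and let a fragment write its color only if it is closer than the current occupant. A minimal Python sketch (my own buffer layout, not OpenGL's):

```python
def make_buffers(w, h, far=1.0):
    depth = [[far] * w for _ in range(h)]          # initialized to the far value
    color = [[(0, 0, 0)] * w for _ in range(h)]    # background color
    return depth, color

def draw_fragment(depth, color, x, y, z, rgb):
    if z < depth[y][x]:          # closer than what's stored: keep this fragment
        depth[y][x] = z
        color[y][x] = rgb

depth, color = make_buffers(4, 4)
draw_fragment(depth, color, 1, 1, 0.8, (255, 0, 0))  # red, farther
draw_fragment(depth, color, 1, 1, 0.3, (0, 0, 255))  # blue, nearer: wins
draw_fragment(depth, color, 1, 1, 0.5, (0, 255, 0))  # green, behind blue: rejected
print(color[1][1])   # (0, 0, 255) -- correct regardless of draw order
```

Note the result is independent of the order the opaque fragments arrive in, which is exactly the property that breaks down for transparency.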
- Why is it difficult to do transparency correctly? See blending.
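The difficulty can be demonstrated directly: the standard "over" blend C = a_s·C_s + (1 - a_s)·C_d is not commutative, so drawing the same two transparent surfaces in a different order gives a different pixel. A sketch with made-up colors:

```python
def blend_over(src, a_src, dst):
    # C = a_s * C_s + (1 - a_s) * C_d, per channel
    return tuple(a_src * s + (1 - a_src) * d for s, d in zip(src, dst))

white = (1.0, 1.0, 1.0)                    # background
red_glass  = ((1.0, 0.0, 0.0), 0.5)        # 50% transparent surfaces
blue_glass = ((0.0, 0.0, 1.0), 0.5)

# Red drawn first (behind), then blue over it:
p1 = blend_over(*blue_glass, blend_over(*red_glass, white))
# Blue drawn first, then red over it:
p2 = blend_over(*red_glass, blend_over(*blue_glass, white))
print(p1)  # (0.5, 0.25, 0.75) -- bluish
print(p2)  # (0.75, 0.25, 0.5) -- reddish
```

This is why correct transparency requires sorting transparent geometry back to front (after drawing the opaque geometry), while the z-buffer alone handles opaque surfaces in any order.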
- How does the GLSL "discard" statement work in a programmable fragment shader, and why is it not possible to get the same effect in the fixed pipeline?
- Implementing shadows
Textures
- Texels
- Mapping coordinate systems: texture (s,t),
parameterized (u,v), world (x,y,z), screen (x_s, y_s).
- Texture coordinates in OpenGL. How does the choice of texture coordinates determine how the
texture is applied to geometry, e.g. a quad? (See the Nate Robins tutorials.)
- Texture parameters: wrap type - repeat vs clamp
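The two wrap modes come down to simple arithmetic on each coordinate: repeat keeps the fractional part of s, clamp pins s into [0, 1]. A sketch of just that arithmetic (GL applies it per coordinate before sampling):

```python
import math

def wrap_repeat(s):
    return s - math.floor(s)       # keep the fractional part: 1.25 -> 0.25

def wrap_clamp(s):
    return min(max(s, 0.0), 1.0)   # pin to [0, 1]: 1.25 -> 1.0

print(wrap_repeat(1.25), wrap_repeat(-0.25))  # 0.25 0.75 -- texture tiles
print(wrap_clamp(1.25), wrap_clamp(-0.25))    # 1.0 0.0  -- edge texels smear
```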
- Combining lighting and texture in shader.
- Generating textures: procedural vs image
- Bump mapping (also see Wikipedia).
- What is mipmapping
(multum in parvo, "many things in a small place") and why is it used? E.g.,
minification and magnification.
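A sketch of how a mipmap chain is built: each level halves the resolution by averaging 2x2 texel blocks, trading about one-third extra memory for cheap, alias-free minification (the filtering scheme here is the usual box filter; real drivers may differ).

```python
def next_level(img):                 # img: square 2^n x 2^n grid of gray values
    n = len(img) // 2
    return [[(img[2*i][2*j] + img[2*i][2*j+1] +
              img[2*i+1][2*j] + img[2*i+1][2*j+1]) / 4
             for j in range(n)] for i in range(n)]

def build_mipmaps(img):
    levels = [img]
    while len(levels[-1]) > 1:       # halve until we reach a single texel
        levels.append(next_level(levels[-1]))
    return levels

checker = [[255 if (i + j) % 2 == 0 else 0 for j in range(4)] for i in range(4)]
levels = build_mipmaps(checker)
print([len(l) for l in levels])   # [4, 2, 1] -- resolutions 4x4, 2x2, 1x1
print(levels[-1][0][0])           # 127.5 -- a distant checkerboard averages to gray
```

That last value is the point: when the texture is heavily minified, sampling the top level returns the correct average gray instead of the flickering black/white aliasing you get from point-sampling the full-resolution checkerboard.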