Computer Graphics
- Discrete Techniques -
Hanyang University
Jong-Il Park
Division of Electrical and Computer Engineering, Hanyang University
Objectives
Buffers and pixel operations
Mapping methods
  Texture mapping
  Environment (reflection) mapping
    Variant of texture mapping
  Bump mapping
    Solves the flatness problem of texture mapping
Blending
Anti-aliasing
Buffer
Define a buffer by its spatial resolution (n x m) and its depth (or precision) k, the number of bits/pixel
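The memory a buffer requires follows directly from these three numbers; a minimal sketch (function names are illustrative, not an OpenGL API):

```c
#include <stdio.h>

/* Total storage for a buffer of spatial resolution n x m
 * with k bits per pixel. */
unsigned long buffer_bits(unsigned long n, unsigned long m, unsigned long k) {
    return n * m * k;
}

unsigned long buffer_bytes(unsigned long n, unsigned long m, unsigned long k) {
    return buffer_bits(n, m, k) / 8;   /* 8 bits per byte */
}
```

For example, a 1920 x 1080 color buffer at 32 bits/pixel occupies 8,294,400 bytes, a little under 8 MB.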
OpenGL Frame Buffer
OpenGL Buffers
Color buffers: can be displayed
  Front
  Back
  Auxiliary
  Overlay
Depth
Accumulation
  High-resolution buffer
Stencil
  Holds masks
Writing in Buffers
Conceptually, we can consider all of memory as a large two-dimensional array of pixels
We read and write rectangular blocks of pixels
  Bit-block transfer (bitblt) operations
The frame buffer is part of this memory
[Figure: writing a rectangular block from source memory into the frame buffer (destination)]
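A bitblt is just a row-by-row block copy between two pixel arrays; a minimal sketch, with illustrative parameter names (not an OpenGL call):

```c
#include <string.h>

/* Copy a w x h block of 32-bit pixels from (sx, sy) in the source array
 * to (dx, dy) in the destination. src_pitch and dst_pitch are the full
 * row widths (in pixels) of the two arrays. */
void bitblt(const unsigned *src, int src_pitch, int sx, int sy,
            unsigned *dst, int dst_pitch, int dx, int dy,
            int w, int h) {
    for (int row = 0; row < h; row++) {
        memcpy(&dst[(dy + row) * dst_pitch + dx],
               &src[(sy + row) * src_pitch + sx],
               (size_t)w * sizeof(unsigned));
    }
}
```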
The Limits of Geometric Modeling
Although graphics cards can render over 10 million polygons per second, that number is insufficient for many phenomena:
  Clouds
  Grass
  Terrain
  Skin
Modeling an Orange
Consider the problem of modeling an orange (the fruit)
Start with an orange-colored sphere
  Too simple
Replace the sphere with a more complex shape
  Does not capture surface characteristics (small dimples)
  Takes too many polygons to model all the dimples
Modeling an Orange (2)
Take a picture of a real orange, scan it, and "paste" it onto a simple geometric model
  This process is known as texture mapping
Still might not be sufficient because the resulting surface will be smooth
  Need to change the local shape
  Bump mapping
Three Types of Mapping
Texture mapping
  Uses images to fill the inside of polygons
Environment (reflection) mapping
  Uses a picture of the environment for texture maps
  Allows simulation of highly specular surfaces
Bump mapping
  Emulates altering normal vectors during the rendering process
Texture Mapping
[Figure: geometric model (left) and the same model texture mapped (right)]
Environment Mapping
Bump Mapping
Where does mapping take place?
Mapping techniques are implemented at the end of the rendering pipeline
  Very efficient because few polygons make it past the clipper
Is it simple?
Although the idea is simple---map an image to a surface---there are 3 or 4 coordinate systems involved
[Figure: a 2D image mapped onto a 3D surface]
Coordinate Systems
Parametric coordinates
  May be used to model curves and surfaces
Texture coordinates
  Used to identify points in the image to be mapped
Object or world coordinates
  Conceptually, where the mapping takes place
Window coordinates
  Where the final image is really produced
Texture Mapping
[Figure: parametric coordinates → texture coordinates → world coordinates → window coordinates]
Mapping Functions
Basic problem is how to find the maps
Consider mapping from texture coordinates to a point on a surface
Appear to need three functions:
  x = x(s,t)
  y = y(s,t)
  z = z(s,t)
But we really want to go the other way
[Figure: (s,t) in texture space mapped to a point (x,y,z) on the surface]
Backward Mapping
We really want to go backwards:
  Given a pixel, we want to know the point on an object to which it corresponds
  Given a point on an object, we want to know the point in the texture to which it corresponds
Need a map of the form
  s = s(x,y,z)
  t = t(x,y,z)
Such functions are difficult to find in general
Two-Part Mapping
One solution to the mapping problem is to first map the texture to a simple intermediate surface
Example: map to a cylinder
Box Mapping
Easy to use with simple orthographic projection
Also used in environment maps
Second Mapping
Map from the intermediate object to the actual object:
  Normals from intermediate to actual
  Normals from actual to intermediate
  Vectors from the center of the intermediate
[Figure: intermediate and actual objects]
Aliasing
Point sampling of the texture can lead to aliasing errors
[Figure: point samples in u,v (or x,y,z) space become point samples in texture space that miss the blue stripes]
Area Averaging
A better but slower option is to use area averaging
Note that the preimage of a pixel is curved
[Figure: a square pixel and its curved preimage in texture space]
Basic Strategy
Three steps to applying a texture:
1. Specify the texture
   Read or generate an image, assign it to a texture, enable texturing
2. Assign texture coordinates to vertices
   The proper mapping function is left to the application
3. Specify texture parameters
   Wrapping, filtering
Texture Mapping
[Figure: image in (s,t) space mapped onto geometry in (x,y,z) space, then to the display]
Mapping a Texture
Based on parametric texture coordinates
glTexCoord*() specified at each vertex
[Figure: texture space with corners (0,0), (1,0), (0,1), (1,1); points a, b, c at (s,t) = (0.2, 0.8), (0.4, 0.2), (0.8, 0.4) map to triangle vertices A, B, C in object space]
Typical Code

  glBegin(GL_POLYGON);
    glColor3f(r0, g0, b0);   // if no shading used
    glNormal3f(u0, v0, w0);  // if shading used
    glTexCoord2f(s0, t0);
    glVertex3f(x0, y0, z0);
    glColor3f(r1, g1, b1);
    glNormal3f(u1, v1, w1);
    glTexCoord2f(s1, t1);
    glVertex3f(x1, y1, z1);
    .
    .
  glEnd();
Magnification and Minification
More than one texel can cover a pixel (minification) or more than one pixel can cover a texel (magnification)
Can use point sampling (nearest texel) or linear filtering (2 x 2 filter) to obtain texture values
[Figure: texture-to-polygon correspondence under magnification and minification]
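The 2 x 2 linear filter is bilinear interpolation of the four texels surrounding the sample point; a minimal sketch using the fractional offsets fx and fy of the sample within the texel grid:

```c
/* Bilinearly blend four neighboring texel values. t00..t11 are the
 * texels at the corners; fx, fy in [0,1] are the sample's fractional
 * position between them. */
float bilinear(float t00, float t10, float t01, float t11,
               float fx, float fy) {
    float top    = t00 + fx * (t10 - t00);   /* blend along s */
    float bottom = t01 + fx * (t11 - t01);
    return top + fy * (bottom - top);        /* blend along t */
}
```

Point sampling would instead pick whichever of the four texels is nearest, which is faster but aliases more.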
Environment Mapping
Environment mapping is a way to create the appearance of highly reflective surfaces without ray tracing, which requires global calculations
Examples: The Abyss, Terminator 2
It is a form of texture mapping
Supported by OpenGL and Cg
Example
Reflecting the Environment
[Figure: view vector V, surface normal N, and reflected ray R]
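The reflected ray in the figure follows the standard mirror formula R = 2(N·V)N − V, assuming N is unit length and V points from the surface toward the viewer (a sketch of the usual convention, not code from the lecture):

```c
/* Reflect the view vector v about the unit normal n: r = 2(n.v)n - v. */
void reflect_about_normal(const double n[3], const double v[3], double r[3]) {
    double ndotv = n[0]*v[0] + n[1]*v[1] + n[2]*v[2];
    for (int i = 0; i < 3; i++)
        r[i] = 2.0 * ndotv * n[i] - v[i];
}
```

R is then used to index the environment map (sphere or cube) to find the color seen in the reflection.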
Mapping to a Sphere
[Figure: V, N, and R for sphere (reflection) mapping]
Cube Map
Issues
Must assume the environment is very far from the object (equivalent to the difference between near and distant lights)
The object cannot be concave (no self-reflections possible)
No reflections between objects
Need a reflection map for each object
Need a new map if the viewer moves
Bump Mapping
Modeling an Orange
Consider modeling an orange
Texture-map a photo of an orange onto a surface
  Captures dimples
  Will not be correct if we move the viewer or the light
  We have shades of dimples rather than their correct orientation
Ideally we need to perturb the normal across the surface of the object and compute a new color at each interior point
Bump Mapping (Blinn)
Consider a smooth surface
[Figure: normal n at a point p on a smooth surface]
Rougher Version
[Figure: perturbed normal n' at the displaced point p']
Displacement Function
p’ = p + d(u,v) n
d(u,v) is the bump or displacement function
|d(u,v)| << 1
Approximating the Normal
n' = p'u × p'v ≈ n + (∂d/∂u) n × pv + (∂d/∂v) n × pu
The vectors n × pv and n × pu lie in the tangent plane
Hence the normal is displaced in the tangent plane
Must precompute the arrays ∂d/∂u and ∂d/∂v
Finally, we perturb the normal during shading
Image Processing
Suppose that we start with a function d(u,v)
We can sample it to form an array D = [dij]
Then ∂d/∂u ≈ dij − di−1,j and ∂d/∂v ≈ dij − di,j−1
Embossing: multipass approach using the accumulation buffer
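The finite differences above can be precomputed in one pass over the sampled displacement array; a minimal sketch (array names and the border convention are illustrative):

```c
#define N 4

/* Backward-difference gradients of the sampled bump function d:
 * du[i][j] ~ d[i][j] - d[i-1][j], dv[i][j] ~ d[i][j] - d[i][j-1].
 * Border entries, which have no left/upper neighbor, are left at zero. */
void bump_gradients(const float d[N][N], float du[N][N], float dv[N][N]) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            du[i][j] = (i > 0) ? d[i][j] - d[i-1][j] : 0.0f;
            dv[i][j] = (j > 0) ? d[i][j] - d[i][j-1] : 0.0f;
        }
}
```

The two arrays are then used to perturb the normal at each fragment as in the previous slide.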
Example: Bump Mapping
Opacity and Transparency
Opaque surfaces permit no light to pass through
Transparent surfaces permit all light to pass
Translucent surfaces pass some light
  translucency = 1 − opacity (α)
[Figure: opaque surface, α = 1]
Physical Models
Dealing with translucency in a physically correct manner is difficult due to the complexity of the internal interactions of light and matter
Using a pipeline renderer
Writing Model
Use the A component of RGBA (or RGBα) color to store opacity
During rendering we can expand our writing model to use RGBA values
[Figure: the source component scaled by the source blending factor and the destination component scaled by the destination blending factor are blended into the color buffer]
Blending Equation
We can define source and destination blending factors for each RGBA component:
  s = [sr, sg, sb, sα]
  d = [dr, dg, db, dα]
Suppose that the source and destination colors are
  b = [br, bg, bb, bα]
  c = [cr, cg, cb, cα]
Blend as
  c' = [br sr + cr dr, bg sg + cg dg, bb sb + cb db, bα sα + cα dα]
Fog
We can composite with a fixed color and have the blending factors depend on depth
  Simulates a fog effect
Blend the source color Cs and fog color Cf by
  Cs' = f Cs + (1 − f) Cf
f is the fog factor
  Exponential
  Gaussian
  Linear (depth cueing)
Fog Functions
[Figure: exponential, Gaussian, and linear fog-factor curves]
OpenGL Fog Functions

  GLfloat fcolor[4] = {……};
  glEnable(GL_FOG);
  glFogi(GL_FOG_MODE, GL_EXP);
  glFogf(GL_FOG_DENSITY, 0.5);
  glFogfv(GL_FOG_COLOR, fcolor);
Line Aliasing
Ideal raster line is one pixel wide
All line segments, other than vertical and horizontal segments, partially cover pixels
Simple algorithms color only whole pixels
  Leads to the "jaggies" or aliasing
Similar issue for polygons
Anti-Aliasing
Area Averaging
Use the average area α1 + α2 − α1α2 as the blending factor
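This factor is the combined coverage of two fragments that each cover a fraction of the pixel, under the assumption that the covered regions overlap independently; a one-line sketch:

```c
/* Combined coverage of two fragments with fractional areas a1 and a2,
 * assuming their covered regions are uncorrelated: a1 + a2 - a1*a2. */
float combined_coverage(float a1, float a2) {
    return a1 + a2 - a1 * a2;
}
```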
OpenGL Antialiasing
Can enable separately for points, lines, or polygons

  glEnable(GL_POINT_SMOOTH);
  glEnable(GL_LINE_SMOOTH);
  glEnable(GL_POLYGON_SMOOTH);

  glEnable(GL_BLEND);
  glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Accumulation Buffer
Compositing and blending are limited by the resolution of the frame buffer
  Typically 8 bits per color component
The accumulation buffer is a high-resolution buffer (16 or more bits per component) that avoids this problem
Write into it or read from it with a scale factor
Slower than direct compositing into the frame buffer
Applications: compositing, image filtering (convolution), whole-scene anti-aliasing, motion effects