CS361 Week 2 - Friday


Dec 31, 2015

Lambert McCoy
Transcript
Page 1: Week 2 - Friday.  What did we talk about last time?  Graphics rendering pipeline  Geometry Stage.

CS361 Week 2 - Friday

Page 2

Last time

What did we talk about last time? Graphics rendering pipeline

Geometry Stage

Page 3

Questions?

Page 4

Project 1

Page 5

Assignment 1

Page 6

Let's see those matrices in SharpDX again

Page 7

Backface culling

I did not properly describe an important optimization done in the Geometry Stage: backface culling

Backface culling removes all polygons that are not facing toward the screen

A simple dot product is all that is needed. This step is done in hardware in SharpDX and OpenGL; you just have to turn it on. Beware: if you screw up your normals, polygons could vanish.
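To make the dot product concrete, here is a minimal Python sketch of the test (a software stand-in for what the hardware does once culling is enabled), assuming a camera looking down -z and counterclockwise winding for front faces; the function names are invented for this example:

```python
# Hypothetical sketch of the backface test. Assumes the view direction
# points from the camera into the scene (here, down -z) and that
# front faces wind counterclockwise.
def face_normal(v0, v1, v2):
    """Normal of a triangle from its three vertices (right-hand rule)."""
    e1 = [v1[i] - v0[i] for i in range(3)]
    e2 = [v2[i] - v0[i] for i in range(3)]
    return [e1[1] * e2[2] - e1[2] * e2[1],
            e1[2] * e2[0] - e1[0] * e2[2],
            e1[0] * e2[1] - e1[1] * e2[0]]

def is_backfacing(v0, v1, v2, view_dir=(0.0, 0.0, -1.0)):
    """A polygon faces away from the camera when its normal points
    in roughly the same direction as the view vector (dot >= 0)."""
    n = face_normal(v0, v1, v2)
    dot = sum(n[i] * view_dir[i] for i in range(3))
    return dot >= 0.0
```

In SharpDX or OpenGL you never write this yourself; you only enable culling and keep your winding order and normals consistent.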

Page 8

Graphics rendering pipeline

For API design, practical top-down problem solving, hardware design, and efficiency, rendering is described as a pipeline

This pipeline contains three conceptual stages:

Application: produces material to be rendered

Geometry: decides what, how, and where to render

Rasterizer: renders the final image

Page 9

Student Lecture: Rasterizer Stage

Page 10

Rasterizer Stage

Page 11

Rasterizer Stage

The goal of the Rasterizer Stage is to take all the transformed geometric data and set colors for all the pixels in the screen space

Doing so is called rasterization or scan conversion

Note that the word pixel is actually a portmanteau of "picture element"

Page 12

More pipelines

As you should expect, the Rasterization Stage is also divided into a pipeline of several functional stages:

Triangle Setup → Triangle Traversal → Pixel Shading → Merging

Page 13

Triangle Setup

Data for each triangle is computed. This could include normals. This is boring anyway because fixed-operation (non-customizable) hardware does all the work.

Page 14

Triangle Traversal

Each pixel whose center is overlapped by a triangle must have a fragment generated for the part of the triangle that overlaps the pixel

The properties of this fragment are created by interpolating data from the vertices

Again, boring, fixed-operation hardware does this
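To make the idea concrete, here is a minimal Python sketch of triangle traversal with barycentric interpolation: every pixel whose center falls inside the triangle gets a fragment whose attributes are weighted blends of the vertex data. The helper names `edge` and `traverse` are invented for this example; the hardware does this far more efficiently:

```python
# Hypothetical sketch of triangle traversal. Vertices are 2D screen
# positions; attrs is a triple of per-vertex attribute tuples.
def edge(a, b, p):
    """Twice the signed area of (a, b, p); its sign says which side of
    edge a->b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def traverse(v0, v1, v2, attrs, width, height):
    area = edge(v0, v1, v2)
    fragments = []
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)            # pixel center
            w0 = edge(v1, v2, p) / area       # barycentric weights
            w1 = edge(v2, v0, p) / area
            w2 = edge(v0, v1, p) / area
            if w0 >= 0 and w1 >= 0 and w2 >= 0:   # center inside triangle
                # interpolate each per-vertex attribute from the vertices
                value = tuple(w0 * a0 + w1 * a1 + w2 * a2
                              for a0, a1, a2 in zip(*attrs))
                fragments.append(((x, y), value))
    return fragments
```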

Page 15

Pixel Shading

This is where the magic happens. Given the data from the other stages, per-pixel shading (coloring) happens here.

This stage is programmable, allowing for many different shading effects to be applied

Perhaps the most important effect is texturing or texture mapping
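As a tiny example of the kind of per-fragment computation this programmable stage allows, here is a hypothetical Lambertian diffuse shading function sketched in Python (real pixel shaders run on the GPU in a shading language such as HLSL; the function name is invented for this example):

```python
# Hypothetical sketch of a per-pixel diffuse shading computation:
# the fragment's color is the surface color scaled by how directly
# the surface faces the light.
def shade_diffuse(normal, light_dir, base_color):
    """normal and light_dir are unit 3-vectors; base_color is RGB in [0, 1]."""
    # cosine of the angle between surface normal and light direction,
    # clamped so surfaces facing away from the light go black
    intensity = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * intensity for c in base_color)
```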

Page 16

Texturing

Texturing is gluing a (usually) 2D image onto a polygon. To do so, we map texture coordinates onto polygon coordinates. Pixels in a texture are called texels. This is fully supported in hardware. Multiple textures can be applied in some cases.
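A minimal Python sketch of what a nearest-neighbor texel lookup does under the hood, assuming (u, v) texture coordinates in [0, 1] and a texture stored as a 2D list (the function name is invented for this example; real hardware also supports filtering, wrapping modes, and mipmaps):

```python
# Hypothetical sketch of nearest-neighbor texture sampling.
def sample_nearest(texture, u, v):
    """texture is a 2D list of colors, indexed [row][column]."""
    height = len(texture)
    width = len(texture[0])
    # clamp coordinates to [0, 1], then scale to texel indices
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]
```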

Page 17

Merging

The final screen data containing the colors for each pixel is stored in the color buffer

The merging stage is responsible for merging the colors from each of the fragments from the pixel shading stage into a final color for a pixel

Deeply linked with merging is visibility: The final color of the pixel should be the one corresponding to a visible polygon (and not one behind it)

Page 18

Z-buffer

To deal with the question of visibility, most modern systems use a Z-buffer or depth buffer

The Z-buffer keeps track of the z-values for each pixel on the screen

As a fragment is rendered, its color is put into the color buffer only if its z value is closer than the current value in the z-buffer (which is then updated)

This is called a depth test
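The depth test can be sketched in a few lines of Python (a software stand-in for what the hardware does per fragment, assuming smaller z means closer; the function name is invented for this example):

```python
# Hypothetical sketch of the depth test. zbuffer and colorbuffer are
# 2D lists indexed [y][x]; smaller z is closer to the camera.
def depth_test(zbuffer, colorbuffer, x, y, z, color):
    if z < zbuffer[y][x]:            # closer than what is stored?
        zbuffer[y][x] = z            # update the depth buffer
        colorbuffer[y][x] = color    # and the color buffer
        return True                  # fragment passed the test
    return False                     # fragment discarded
```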

Page 19

Pros and cons of the Z-buffer

Pros:

Polygons can usually be rendered in any order

Universal hardware support is available

Cons:

Partially transparent objects must be rendered in back-to-front order (painter's algorithm)

Completely transparent values can mess up the z-buffer unless they are checked

Z-fighting can occur when two polygons have the same (or nearly the same) z values

Page 20

More buffers

A stencil buffer can be used to record a rendered polygon. This stores the part of the screen covered by the polygon and can be used for special effects.

Frame buffer is a general term for the set of all buffers.

Different images can be rendered to an accumulation buffer and then averaged together to achieve special effects like blurring or antialiasing.

A back buffer allows us to render off screen to avoid popping and tearing.
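The accumulation idea can be sketched in Python: render several images, sum them, and divide by the count (shown here for grayscale images stored as 2D lists; the function name is invented for this example):

```python
# Hypothetical sketch of an accumulation buffer: summing several
# rendered images and averaging them, e.g. to approximate motion
# blur or antialiasing.
def accumulate(images):
    """Average a list of equally sized grayscale images (2D lists)."""
    h, w = len(images[0]), len(images[0][0])
    acc = [[0.0] * w for _ in range(h)]
    for img in images:                  # sum every image into the buffer
        for y in range(h):
            for x in range(w):
                acc[y][x] += img[y][x]
    n = len(images)
    return [[acc[y][x] / n for x in range(w)] for y in range(h)]
```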

Page 21

Final words on the pipeline

This pipeline is focused on interactive graphics

Micropolygon pipelines are usually used for film production

Predictive rendering applications usually use ray tracing renderers

The old model was the fixed-function pipeline which gave little control over the application of shading functions

The book focuses on programmable GPUs which allow all kinds of tricks to be done in hardware

Page 22

Upcoming

Page 23

Next time…

GPU architecture

Programmable shading

Page 24

Reminders

Read Chapter 3

Start on Assignment 1, due next Friday, January 30 by 11:59

Keep working on Project 1, due Friday, February 6 by 11:59