CS361 Week 9 - Monday
Feb 22, 2016
PauloS
Transcript
Page 1: CS361

CS361
Week 9 - Monday

Page 2: CS361

Last time

What did we talk about last time?
- BRDFs
- Texture mapping and bump mapping in shaders

Page 3: CS361

Questions?

Page 4: CS361

Project 3

Page 5: CS361

Choosing BRDFs

Page 6: CS361

Fresnel reflectance

- Fresnel reflectance is an ideal mathematical description of how perfectly smooth materials reflect light
- The angle of reflection is the same as the angle of incidence, and the reflection vector can be computed:

  $\mathbf{r}_i = 2(\mathbf{n} \cdot \mathbf{l})\mathbf{n} - \mathbf{l}$

- The transmitted (visible) radiance $L_t$ is based on the Fresnel reflectance $R_F(\theta_i)$ and the angle of refraction of light into the material:

  $L_t = (1 - R_F(\theta_i)) \, \frac{\sin^2 \theta_i}{\sin^2 \theta_t} \, L_i$

Page 7: CS361

Snell's Law

- The angle of refraction into the material is related to the angle of incidence and the refractive indices of the materials below and above the interface:

  $n_1 \sin(\theta_i) = n_2 \sin(\theta_t)$

- We can combine this identity with the previous equation:

  $L_t = (1 - R_F(\theta_i)) \, \frac{n_2^2}{n_1^2} \, L_i$
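Snell's Law can be solved directly for the refraction angle; a small sketch (the function name and the `None`-for-no-transmission convention are choices made here):

```python
import math

def refraction_angle(theta_i, n1, n2):
    """Solve n1 sin(theta_i) = n2 sin(theta_t) for theta_t (radians).

    Returns None when total internal reflection occurs, which is
    only possible going from a denser to a less dense medium (n1 > n2).
    """
    s = (n1 / n2) * math.sin(theta_i)
    if s > 1.0:
        return None  # total internal reflection: no transmitted ray
    return math.asin(s)
```

For example, light entering glass (n ≈ 1.5) from air (n ≈ 1.0) at 30° bends to about 19.5°; going the other way at 60° it is totally internally reflected, which is the case mentioned on the internal-reflection slide.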

Page 8: CS361

Fresnel example

Page 9: CS361

External reflection

- Reflectance is obviously dependent on angle
- Perpendicular (0°) gives essentially the specular color of the material
- Higher angles become more reflective
- The function $R_F(\theta_i)$ is also dependent on material (and the light color)

Page 10: CS361

Approximating reflection

Because it's non-linear, Schlick gives an approximation that works for most substances:

$R_F(\theta_i) \approx R_F(0°) + (1 - R_F(0°))(1 - \cos \theta_i)^5$

We can use a table of $R_F(0°)$ values
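The Schlick approximation is a one-liner; here is a sketch (the 0.04 value used in the test is an assumed, typical dielectric $R_F(0°)$, not from the slides):

```python
def schlick(r0, cos_theta_i):
    """Schlick's approximation:
    R_F(theta) ~= R_F(0) + (1 - R_F(0)) * (1 - cos theta)^5.

    r0 is the tabulated R_F(0 degrees) value for the material;
    cos_theta_i comes from the dot product n . l.
    """
    return r0 + (1.0 - r0) * (1.0 - cos_theta_i) ** 5
```

At 0° it returns the table value unchanged, and at glancing angles (cosine near 0) it rises toward 1, matching the behavior described on the external-reflection slide.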

Page 11: CS361

Internal reflection

- External reflection needs to be modeled more often than internal reflection
- Modeling internal reflection is the same, except that the higher optical density can cause total internal reflection

Page 12: CS361

Diffuse light

- Usually not as complex as specular light
- We can measure a value ρ that gives the ratio between light escaping a surface and light entering a surface
- ρ is called the scattering albedo
- Because of conservation of energy, the more light that is reflected through Fresnel reflection, the less there is to be reflected diffusely
- Thus, a simple approximation for diffuse light is:

  $f_{\text{diff}}(\mathbf{l}, \mathbf{v}) = (1 - R_F(\theta_i)) \, \frac{\rho}{\pi}$
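The diffuse term translates directly into code; a sketch (the function and parameter names are choices made here):

```python
import math

def f_diff(albedo, fresnel):
    """Diffuse BRDF term f_diff = (1 - R_F(theta_i)) * rho / pi.

    albedo is the scattering albedo rho; fresnel is R_F(theta_i),
    e.g. computed with the Schlick approximation. Energy reflected
    specularly via Fresnel is unavailable for diffuse scattering.
    """
    return (1.0 - fresnel) * albedo / math.pi
```

Note how a larger Fresnel reflectance shrinks the diffuse contribution, which is the conservation-of-energy point above.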

Page 13: CS361

Microgeometry

Page 14: CS361

Microgeometry

- The cause of many lighting effects is microgeometry
- The smoother the surface, the tighter (and brighter) the reflections are

Page 15: CS361

Weird effects

- Glancing angles can minimize the impact of surface roughness, making rough surfaces reflective at very high angles
- Most surfaces are isotropic (symmetrical) in the way they are rough
- Anisotropic surfaces like brushed metal have directional blurring

Page 16: CS361

Implementing BRDFs

Page 17: CS361

Where do BRDFs come from?

- The book gives a number of BRDF equations
- It is also possible to sample materials (from every angle, at every color of light) to measure a BRDF of your own
- Once you've got such a model, how do you implement it?

Page 18: CS361

Implementation

The shader will use the following equation:

$L_o(\mathbf{v}) = \sum_{k=1}^{n} f(\mathbf{l}_k, \mathbf{v}) \otimes E_{L_k} \cos \theta_{i_k}$

- The cosine term is found with the dot product
- Most BRDFs contain a 1/π term
- Many systems pre-divide $E_L$ by π
  - Make sure you don't double divide (or double multiply)
- If some value is computed repeatedly, consider putting it in a texture for lookup
  - Mipmapping may not work for non-linear BRDFs
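A scalar (single-channel) CPU sketch of that sum over light sources, assuming the BRDF and irradiance values are scalars rather than RGB vectors (all names here are choices made for illustration):

```python
import math

def shade(brdf, lights, v, n):
    """Evaluate L_o(v) = sum_k f(l_k, v) * E_Lk * cos(theta_ik).

    brdf(l, v) returns a scalar reflectance; lights is a list of
    (unit light direction, irradiance E_L) pairs; n is the unit normal.
    The cosine term is the dot product n . l, clamped to zero so
    lights behind the surface contribute nothing.
    """
    total = 0.0
    for l_dir, e_l in lights:
        cos_theta = max(0.0, sum(ni * li for ni, li in zip(n, l_dir)))
        total += brdf(l_dir, v) * e_l * cos_theta
    return total
```

With a Lambertian BRDF of 1/π, the 1/π term and an irradiance of π cancel exactly; if the system had pre-divided $E_L$ by π and the BRDF still contained its own 1/π, the result would be wrong by a factor of π, which is the double-divide warning above.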

Page 19: CS361

Optimizations

- It may be expensive to compute the shading based on all the light sources
- Also, many APIs (and various graphics cards) limit the number of light sources
- Some lights must be averaged into each other for performance reasons

Page 20: CS361

Deferred shading

- Shading is usually done while z-buffer testing is done
- It's possible to do all the z-buffer testing and then go back and shade only those fragments that contribute to the final scene
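A CPU-side sketch of that idea, not a real GPU pipeline (the fragment tuples and the `shade` callback are invented for illustration): resolve depth first, then run the expensive shading only on the surviving fragment at each pixel.

```python
def deferred_shade(fragments, shade):
    """Two-pass sketch of deferred shading.

    fragments: list of (pixel, depth, data) tuples; shade(data) is
    the expensive shading callback. Pass 1 runs only the z-buffer
    test; pass 2 shades just the nearest fragment per pixel, so
    occluded fragments are never shaded at all.
    """
    nearest = {}
    for pixel, depth, data in fragments:  # pass 1: depth test only
        if pixel not in nearest or depth < nearest[pixel][0]:
            nearest[pixel] = (depth, data)
    # pass 2: shade only fragments that contribute to the final image
    return {pixel: shade(data) for pixel, (depth, data) in nearest.items()}
```

The savings grow with depth complexity: a scene where each pixel is covered by many fragments shades each pixel exactly once instead of once per fragment.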

Page 21: CS361

The Other Side of the Fence

Page 22: CS361

Image based rendering

- A great deal of graphics research deals with rendering real scenes
- Don't cameras do that?
- Sure, but these graphics guys couldn't publish papers if the stuff wasn't hard for some reason:
  - Reconstructing novel viewpoints
  - Walkthroughs with user-controlled paths
  - Introducing synthetic objects into real scenes
  - Re-lighting real scenes with new light sources
- I would be remiss if I didn't mention these topics even though they usually have nothing to do with video games and often cannot be rendered in real time

Page 23: CS361

Plenoptic function

- A central idea in image-based rendering is the plenoptic function, sometimes called a light field
- The basic plenoptic function is $P(x, y, z, \theta, \phi)$ and its result is a color
- In other words, it tells you the color you would see if you were at $(x, y, z)$ and looked in the direction given by angles $\theta$ and $\phi$
- There are also more complicated plenoptic functions that take into account time, wavelength, and more

Page 24: CS361

Sea of Images

- Although the research is old now, Daniel Aliaga et al. produced an impressive system for recreating real scenes in real time in which a user can control the path he or she takes
- A robot records thousands and thousands of omnidirectional images and its location when it takes them
- Then, images are merged together to create a novel view for the current location and orientation

Page 25: CS361

Sea of Images issues

- Rendering the images in real time isn't hard
- Knowing the robot's position for all images is surprisingly difficult
- Storing and loading the next images that will be needed in reconstruction is a huge caching and compression problem
- Getting the robot to walk around and scan a scene automatically ended up being too hard
- Some of these ideas were used for Google Street View, which is neither real time nor allows for arbitrary locations

Page 26: CS361

Sea of Images Video

Page 27: CS361

Image based lighting

- Synthetic objects can be rendered using a BRDF based on measurements of real-world materials
- Alternatively, we could sample a real-world object from many different directions and get enough information to re-light it
- You can also capture lighting from the real world using a mirrored ball
- Then you can re-light:
  - A real image with a different set of real lights
  - Synthetic objects with realistic real light

Page 28: CS361

Image Based Lighting Video

Page 29: CS361

Upcoming

Page 30: CS361

Next time…

- Area lighting
- Environment mapping

Page 31: CS361

Reminders

- Work on Assignment 4
  - Due this Friday, March 20
- Start working on Project 3
  - Due April 2
- Keep reading Chapter 8