CS361 Week 8 - Friday
Feb 24, 2016
Last time
What did we talk about last time?
- Radiometry
- Photometry
- Colorimetry
- Lighting with shader code
  - Ambient
  - Directional (diffuse and specular)
  - Point
Questions?
Project 3
Implementing Point Lights
Back to specular lighting
Adding a specular component to the diffuse shader requires incorporating the view vector
It will be included in the shader file and be set as a parameter in the C# code
Specular light declarations
The camera location is added to the declarations, as are specular colors and a shininess parameter.
float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float3 DiffuseLightDirection = float3(1, 1, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;
Specular light structures
The output adds a normal so that the reflection vector can be computed in the pixel shader. A world position lets us compute the view vector to the camera.
struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
};
Specular vertex shader
The same computations as the diffuse shader, but we store the normal and the transformed world position in the output.
VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    return output;
}
Specular pixel shader
Here we finally have a real computation, because we need to use the pixel normal (interpolated from the vertices) in combination with the view vector. The technique is the same.

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize((float3)input.WorldPosition - Camera);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);
    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}
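As an aside, the (8 + Shininess) / (8 * PI) factor matches the energy-normalization constant usually quoted for a Blinn-Phong lobe built on the half vector; the reflection-vector (Phong) lobe used above has a slightly different exact constant. These are standard textbook results, not something stated in the slides:

$f_{\text{spec}} \approx \frac{s+8}{8\pi}(\mathbf{n} \cdot \mathbf{h})^{s}$ (normalized Blinn-Phong) versus $\frac{s+2}{2\pi}(\mathbf{r} \cdot \mathbf{v})^{s}$ (normalized Phong), where $s$ is the Shininess exponent.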
Point lights in SharpDX
Point lights model omni lights at a specific position
- They generally attenuate (get dimmer) over distance and have a maximum range
- Classic DirectX offers constant, linear, and quadratic attenuation terms; in a shader you can choose whatever attenuation model you like (a sketch of the classic model follows below)
- They are more computationally expensive than directional lights because a light vector has to be computed for every pixel
- It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used
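For reference, here is a minimal sketch of that classic three-term attenuation; the names (ConstantAttenuation, LinearAttenuation, QuadraticAttenuation, Attenuate) are assumed for the illustration and are not part of the project shader, which uses a simpler radius-based falloff instead:

float ConstantAttenuation = 1.0f;   // a0
float LinearAttenuation = 0.1f;     // a1
float QuadraticAttenuation = 0.01f; // a2

// Fixed-function-style falloff: 1 / (a0 + a1*d + a2*d^2)
float Attenuate(float d)
{
    return 1.0f / (ConstantAttenuation
                 + LinearAttenuation * d
                 + QuadraticAttenuation * d * d);
}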
Point light declarations
We add a light position.

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 LightPosition;
float3 Camera;

static const float PI = 3.14159265f;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1f;
float LightRadius = 50;

float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;

float Shininess;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5f;
Point light structures
We no longer need color in the output. We do need the vector to the camera, so we keep the world position of the fragment.

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 WorldPosition : POSITIONT;
    float3 Normal : NORMAL;
};
Point light vertex shader
We compute the normal and the world position.

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;
    return output;
}
Point light pixel shader
Lots of junk in here.

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 normal = normalize(input.Normal);
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;
    // Attenuation: falls off quadratically, reaching zero at LightRadius
    float intensity = pow(1.0f - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection); // normalize after taking the length
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float lightAmount = dot(normal, lightDirection);
    float4 diffuse = DiffuseColor * DiffuseIntensity * lightAmount;
    float3 reflect = normalize(2 * lightAmount * normal - lightDirection);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(diffuse);
    // Diffuse and specular are attenuated; ambient is not
    return saturate(intensity * (diffuse + specular) +
        AmbientColor * AmbientIntensity);
}
Student Lecture: BRDFs
BRDFs
BRDF theory
The bidirectional reflectance distribution function (BRDF) describes the ratio of outgoing radiance to incoming irradiance.
This function changes based on:
- Wavelength
- Angle of light to surface
- Angle of viewer from surface
For point or directional lights, we do not need differentials and can write the lighting equation with the BRDF:

$L_o(\mathbf{v}) = f(\mathbf{l}, \mathbf{v}) \otimes E_L \cos\theta_i$
How is this different?
We've been talking about lighting models: Lambertian, specular, etc.
A BRDF is an attempt to model physics slightly better
A big difference is that different wavelengths are absorbed and reflected differently by different materials
Rendering models in real time with (more) accurate BRDFs is still an open research problem
Spheres with different BRDFs
[Image: spheres rendered with different BRDFs, with global lighting (shadows and reflections); taken from www.kevinbeason.com]
Revenge of the BRDF
The BRDF is supposed to account for all the light interactions we discussed in Chapter 5 (reflection and refraction)
We can see the similarity to the lighting equation from Chapter 5, now with a BRDF and a sum over n lights:

$L_o(\mathbf{v}) = \sum_{k=1}^{n} f(\mathbf{l}_k, \mathbf{v}) \otimes E_{L_k} \cos\theta_{i_k}$
When a BRDF isn't enough…
- If the subsurface scattering effects are significant, the size of the pixel may matter; then a bidirectional surface scattering reflectance distribution function (BSSRDF) is needed
- If the surface characteristics change in different areas, you need a spatially varying BRDF
- And so on…
Constraints on BRDFs
- Helmholtz reciprocity: f(l,v) = f(v,l)
- Conservation of energy: outgoing energy cannot be greater than incoming energy
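Written out (this is the standard statement, filled in here rather than taken from the slides): for every incoming direction l, the BRDF times the outgoing cosine, integrated over the hemisphere, is at most one:

$\int_{\Omega} f(\mathbf{l}, \mathbf{v}) \cos\theta_o \, d\omega_o \le 1$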
The simplest BRDF is Lambertian shading
- We assume that energy is scattered equally in all directions
- Integrating over the hemisphere gives a factor of π
- Dividing by π gives us exactly what we saw before:

$L_o(\mathbf{v}) = \frac{\mathbf{c}_{\text{diff}}}{\pi} \otimes \sum_{k=1}^{n} E_{L_k} \cos\theta_{i_k}$
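The factor of π is the standard cosine-weighted hemisphere integral, a missing step worth writing out:

$\int_{\Omega} \cos\theta \, d\omega = \int_{0}^{2\pi}\!\int_{0}^{\pi/2} \cos\theta \sin\theta \, d\theta \, d\phi = \pi$

so a constant BRDF that conserves energy must be $f(\mathbf{l}, \mathbf{v}) = \mathbf{c}_{\text{diff}}/\pi$.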
Texture Mapping and Bump Mapping in Shaders
Texture mapping in a shader
We'll start with our specular shader for directional light and add textures to it
Texture
[Image: the texture used for the ship model]
Texturing additions
We add a texture variable called ModelTexture. We also add a SamplerState structure that specifies how to filter the texture.

Texture2D ModelTexture;
SamplerState ModelTextureSampler
{
    Filter = MIN_MAG_MIP_LINEAR;
    AddressU = Clamp;
    AddressV = Clamp;
};
Texturing structures
We add a texture coordinate to the input and the output of the vertex shader.

struct VertexShaderInput
{
    float4 Position : SV_POSITION;
    float3 Normal : NORMAL;
    float2 Texture : TEXCOORD;
};

struct VertexShaderOutput
{
    float4 Position : SV_POSITION;
    float4 Color : COLOR;
    float3 Normal : NORMAL;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD;
};
Texturing vertex shader
Almost nothing changes here except that we copy the input texture coordinate into the output.

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    output.Texture = input.Texture;
    return output;
}
Texturing pixel shader
We have to pull the color from the texture and set its alpha to 1, then scale the components of the color by the texture color.

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize((float3)input.WorldPosition - Camera);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * max(pow(dotProduct, Shininess), 0) * length(input.Color);
    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;
    return saturate(textureColor * input.Color +
        AmbientColor * AmbientIntensity + specular);
}
Updates to SharpDX
To use a texture, we naturally have to load a texture
We have to set the texture, but as a resource, not as a value

shipTexture = Content.Load<Texture2D>("ShipTexture");
effect.Parameters["ModelTexture"].SetResource<Texture2D>(shipTexture);
Bump mapping in shaders
It's easiest to do bump mapping in SharpDX using a normal map
Of course, a normal map is hard to create by hand
What's more common is to create a height map and then use a tool to generate a normal map from it
xNormal is a free utility to do this: http://www.xnormal.net/Downloads.aspx
Height map to normal map
The conversion from a grayscale height map to a normal map looks like this:
[Image: a grayscale height map and the normal map generated from it]
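To give a feel for what such a tool computes, here is a minimal sketch using central differences; every name here (HeightMap, HeightSampler, TexelSize, Strength, HeightToNormal) is assumed for the illustration, not taken from the lecture or from xNormal:

Texture2D HeightMap;
SamplerState HeightSampler;
float2 TexelSize; // 1 / texture dimensions
float Strength;   // larger values exaggerate the slopes

float3 HeightToNormal(float2 uv)
{
    // Sample the four neighboring heights (central differences)
    float hL = HeightMap.Sample(HeightSampler, uv - float2(TexelSize.x, 0)).r;
    float hR = HeightMap.Sample(HeightSampler, uv + float2(TexelSize.x, 0)).r;
    float hD = HeightMap.Sample(HeightSampler, uv - float2(0, TexelSize.y)).r;
    float hU = HeightMap.Sample(HeightSampler, uv + float2(0, TexelSize.y)).r;

    // The gradient gives the slope; the normal leans away from uphill
    float3 normal = normalize(float3(hL - hR, hD - hU, 2.0f / Strength));

    // Pack from [-1, 1] into [0, 1] for storage in a texture
    return 0.5f * normal + 0.5f;
}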
How does bump mapping work?
- We have a normal to a surface, but there are also tangent directions
- We call these the tangent and the binormal (apparently serious mathematicians think it should be called the bitangent)
- The binormal is tangent to the surface and orthogonal to the other tangent
- We distort the normal with weighted sums of the tangent and binormal (stored in our normal map); see the sketch after the diagram below
[Diagram: a surface point showing the normal, tangent, and binormal vectors]
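A minimal pixel-shader sketch of that distortion, assuming the vertex data carries Tangent and Binormal vectors and that NormalMap, NormalSampler, and BumpAmount have been declared (these names are assumptions, not from the lecture):

// Unpack the stored normal from the [0, 1] texture range back to [-1, 1]
float3 bump = 2.0f * NormalMap.Sample(NormalSampler, input.Texture).rgb - 1.0f;

// Distort the surface normal by weighted sums of the tangent and binormal
float3 normal = normalize(input.Normal
                        + BumpAmount * (bump.x * input.Tangent
                                      + bump.y * input.Binormal));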
Upcoming
Next time…
- Choosing BRDFs
- Implementing BRDFs
- Image-based approaches to sampling BRDFs
Reminders
Finish reading Chapter 7
Finish Project 2
Due tonight by midnight