Computer Graphics Recitation 6

Mar 19, 2016
Transcript
Page 1: Computer Graphics

Computer Graphics

Recitation 6

Page 2: Computer Graphics

Last week - eigendecomposition

We want to learn how the transformation A works:

[figure: the transformation A applied to a vector]

Page 3: Computer Graphics

Last week - eigendecomposition

If we look at arbitrary vectors, it doesn’t tell us much.

[figure: A applied to arbitrary vectors]

Page 4: Computer Graphics

Spectra and diagonalization

Moreover, if A is symmetric, the eigenvectors are orthogonal (and there’s always an eigenbasis):

A = U Λ U^T,  Λ = diag(λ_1, λ_2, …, λ_n)

where the columns u_i of U satisfy A u_i = λ_i u_i.
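The decomposition above can be checked numerically. A minimal sketch using NumPy (an assumption of this note, not part of the slides): `np.linalg.eigh` handles exactly the symmetric case described here, returning orthonormal eigenvectors.

```python
import numpy as np

# A small symmetric matrix as a concrete example (not from the slides).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is specialized for symmetric matrices: eigenvalues in ascending
# order, eigenvectors as orthonormal columns of U.
lam, U = np.linalg.eigh(A)

# A = U diag(lam) U^T
assert np.allclose(U @ np.diag(lam) @ U.T, A)

# A u_i = lam_i u_i for each eigenpair
for i in range(2):
    assert np.allclose(A @ U[:, i], lam[i] * U[:, i])

# Eigenvectors of a symmetric matrix are orthogonal: U^T U = I
assert np.allclose(U.T @ U, np.eye(2))
```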

Page 5: Computer Graphics

In real life…

Matrices that have an eigenbasis are rare – general transformations involve rotations as well, not only scalings.

We want to understand how general transformations behave.

We need a generalization of eigendecomposition, the SVD: A = U Σ V^T

Before we learn SVD, we’ll see Principal Component Analysis – the use of spectral analysis to analyze the shape and dimensionality of scattered data.
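To make the "rotation plus scaling" point concrete, here is a hedged sketch (the matrix and NumPy usage are illustrative assumptions): a rotation composed with a scaling has no orthogonal eigenbasis over the reals, yet its SVD always exists.

```python
import numpy as np

# Rotation by 30 degrees composed with scaling by (3, 1):
# a general transformation with no real orthogonal eigenbasis.
theta = np.pi / 6
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
A = R @ np.diag([3.0, 1.0])

# The SVD always exists: A = U Sigma V^T with U, V orthogonal.
U, sigma, Vt = np.linalg.svd(A)

assert np.allclose(U @ np.diag(sigma) @ Vt, A)
# The singular values recover the scaling factors.
assert np.allclose(sigma, [3.0, 1.0])
```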

Page 6: Computer Graphics

The plan for today

First we’ll see some applications of PCA, then look at the theory.

Page 7: Computer Graphics

PCA – the general idea

PCA finds an orthogonal basis that best represents a given data set.

The sum of squared distances from the x’ axis is minimized.

[figure: 2D point set with original axes x, y and rotated axes x’, y’]

Page 8: Computer Graphics

PCA – the general idea

PCA finds an orthogonal basis that best represents a given data set.

PCA finds a best-approximating plane (again, in terms of squared distances).

[figure: 3D point set in the standard basis x, y, z]

Page 9: Computer Graphics

PCA – the general idea

PCA finds an orthogonal basis that best represents a given data set.

PCA finds a best-approximating plane (again, in terms of squared distances).

[figure: the best-approximating plane for the 3D point set]

Page 10: Computer Graphics

Application: finding tight bounding box

An axis-aligned bounding box: agrees with the axes.

[figure: 2D point set with an axis-aligned box from (minX, minY) to (maxX, maxY)]
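An axis-aligned bounding box is just a per-coordinate min/max. A minimal sketch (the random point set and helper function are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.normal(size=(100, 2))   # 2D point set, one point per row

# Axis-aligned bounding box: per-coordinate min and max.
mins = points.min(axis=0)            # [minX, minY]
maxs = points.max(axis=0)            # [maxX, maxY]

def aabb_contains(p, mins, maxs):
    """True if point p lies inside the box [mins, maxs]."""
    return bool(np.all(p >= mins) and np.all(p <= maxs))

# Every input point lies inside its own bounding box.
assert all(aabb_contains(p, mins, maxs) for p in points)
```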

Page 11: Computer Graphics

Usage of bounding boxes (bounding volumes)

Bounding boxes serve as a very simple “approximation” of the object:
- fast collision detection and visibility queries;
- whenever we need to know the dimensions (size) of the object.

The models consist of thousands of polygons. To quickly test that they don’t intersect, the bounding boxes are tested first.

Sometimes a hierarchy of BBs is used.

The tighter the BB, the fewer “false alarms” we have.

Page 12: Computer Graphics

Application: finding tight bounding box

Oriented bounding box: we find better axes!

[figure: the same point set with an oriented box aligned to axes x’, y’]

Page 13: Computer Graphics

Application: finding tight bounding box

This is not the optimal bounding box.

[figure: 3D model in an axis-aligned box, axes x, y, z]

Page 14: Computer Graphics

Application: finding tight bounding box

Oriented bounding box: we find better axes!

[figure: the same model in an oriented bounding box]

Page 15: Computer Graphics

Notations

Denote our data points by x_1, x_2, …, x_n ∈ R^d, where

x_i = (x_1i, x_2i, …, x_di)^T,  i = 1, …, n.

Page 16: Computer Graphics

The origin of the new axes

The origin is the zero-order approximation of our data set (a point).

It will be the center of mass:

m = (1/n) Σ_{i=1}^{n} x_i

It can be shown that:

m = argmin_x Σ_{i=1}^{n} ||x_i − x||^2
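The argmin property can be sanity-checked numerically. A small sketch under illustrative assumptions (random data, a few random perturbations rather than a proof): any point other than the mean gives a larger sum of squared distances.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))        # n = 200 points in R^3, one per row

m = X.mean(axis=0)                   # center of mass m = (1/n) sum x_i

def sum_sq_dist(x):
    """Sum of squared distances from all data points to x."""
    return float(np.sum(np.linalg.norm(X - x, axis=1) ** 2))

# m should minimize the sum of squared distances:
# every perturbed candidate does at least as badly.
for _ in range(10):
    candidate = m + rng.normal(scale=0.1, size=3)
    assert sum_sq_dist(m) <= sum_sq_dist(candidate)
```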

Page 17: Computer Graphics

Scatter matrix

Denote y_i = x_i − m, i = 1, 2, …, n, where y_k = (y_1k, y_2k, …, y_dk)^T.

The scatter matrix is the d×d matrix

S = Σ_{k=1}^{n} y_k y_k^T

whose (i, j) entry is S_ij = Σ_{k=1}^{n} y_ik y_jk.
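Both forms of S above (the outer-product sum and the entrywise sum) are easy to compute and compare. A minimal sketch with illustrative random data:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(3, 500))        # d x n data matrix: columns are x_i

m = X.mean(axis=1, keepdims=True)    # center of mass
Y = X - m                            # columns y_i = x_i - m

# Scatter matrix S = sum_k y_k y_k^T, a d x d symmetric matrix.
# Stacking the y_k as columns of Y makes the sum a single product Y Y^T.
S = Y @ Y.T
assert S.shape == (3, 3)
assert np.allclose(S, S.T)           # S is symmetric

# Same matrix written as the explicit sum of outer products.
S_loop = sum(np.outer(Y[:, k], Y[:, k]) for k in range(Y.shape[1]))
assert np.allclose(S, S_loop)
```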

Page 18: Computer Graphics

Scatter matrix - eigendecomposition

S is symmetric, so S has an eigendecomposition:

S = V Λ V^T = [v_1 v_2 … v_d] diag(λ_1, λ_2, …, λ_d) [v_1 v_2 … v_d]^T

The eigenvectors v_1, v_2, …, v_d form an orthogonal basis.

Page 19: Computer Graphics

Principal components

S measures the “scatterness” of the data.

Eigenvectors that correspond to big eigenvalues are the directions in which the data has strong components.

If the eigenvalues are all more or less the same, there is no preferable direction.

Page 20: Computer Graphics

Principal components

If there is no preferable direction, S = V Λ V^T looks like this:

λ_1 ≈ λ_2, so S ≈ λI and any vector is an eigenvector.

If there is a clear preferable direction, S looks like this:

λ_2 is close to zero, much smaller than λ_1.

Page 21: Computer Graphics

How to use what we got

For finding the oriented bounding box, we simply compute the bounding box with respect to the axes defined by the eigenvectors. The origin is at the mean point m.

[figure: point set with oriented bounding box along v_1, v_2, v_3]
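The whole OBB recipe from these slides (mean, scatter matrix, eigenvectors, box in the new axes) can be sketched in a few lines. The function name and the test cloud are illustrative assumptions:

```python
import numpy as np

def oriented_bounding_box(points):
    """OBB of an (n, d) point set via PCA: axes are eigenvectors of the
    scatter matrix, origin is the mean point m."""
    m = points.mean(axis=0)
    Y = points - m
    S = Y.T @ Y                      # d x d scatter matrix
    _, V = np.linalg.eigh(S)         # columns of V: orthonormal eigenvectors
    coords = Y @ V                   # coordinates in the eigenvector basis
    return m, V, coords.min(axis=0), coords.max(axis=0)

# Usage: an elongated 2D cloud, rotated by 45 degrees.
rng = np.random.default_rng(3)
pts = rng.normal(size=(300, 2)) * [5.0, 0.5]
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
pts = pts @ R.T

m, V, lo, hi = oriented_bounding_box(pts)
extents = hi - lo
# The oriented box is much tighter along one axis than the other.
assert extents.max() > 3 * extents.min()
```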

Page 22: Computer Graphics

For approximation

[figure: 2D point set with principal axes v_1, v_2]

This line segment approximates the original data set.

The projected data set approximates the original data set.

Page 23: Computer Graphics

For approximation

In general dimension d, the eigenvalues are sorted in descending order:

λ_1 ≥ λ_2 ≥ … ≥ λ_d

The eigenvectors are sorted accordingly. To get an approximation of dimension d’ < d, we take the first d’ eigenvectors and look at the subspace they span (d’ = 1 is a line, d’ = 2 is a plane, …).

Page 24: Computer Graphics

For approximation

To get an approximating set, we project the original data points onto the chosen subspace. Writing each point in the eigenvector basis:

x_i = m + α_1 v_1 + α_2 v_2 + … + α_{d’} v_{d’} + … + α_d v_d

Projection:

x_i’ = m + α_1 v_1 + α_2 v_2 + … + α_{d’} v_{d’} + 0·v_{d’+1} + … + 0·v_d
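The projection above is a couple of matrix products once the eigenvectors are sorted. A hedged sketch on illustrative, nearly-planar 3D data (the variances and threshold are assumptions, chosen so the third principal direction is negligible):

```python
import numpy as np

rng = np.random.default_rng(4)
# Nearly-planar 3D data: tiny spread along the third coordinate.
X = rng.normal(size=(400, 3)) * [4.0, 2.0, 0.05]

m = X.mean(axis=0)
Y = X - m
S = Y.T @ Y
lam, V = np.linalg.eigh(S)           # eigh returns ascending eigenvalues...
order = np.argsort(lam)[::-1]        # ...so re-sort into descending order
lam, V = lam[order], V[:, order]

d_prime = 2                          # keep the top d' = 2 directions
alpha = Y @ V[:, :d_prime]           # coefficients alpha_1 .. alpha_{d'}
X_proj = m + alpha @ V[:, :d_prime].T

# The projection stays close to the data because lambda_3 is tiny.
err = np.linalg.norm(X - X_proj, axis=1).max()
assert err < 1.0
```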

Page 25: Computer Graphics

Optimality of approximation

The approximation is optimal in the least-squares sense: it minimizes

Σ_{k=1}^{n} ||x_k − x_k’||^2

The projected points have maximal variance.

[figure: original set; projection on an arbitrary line; projection on the v_1 axis]

Page 26: Computer Graphics

Technical remarks:

λ_i ≥ 0, i = 1, …, d (such matrices are called positive semi-definite), so we can indeed sort by the magnitude of λ_i.

Theorem: λ_i ≥ 0 for all i  ⟺  <Sv, v> ≥ 0 for all v

Proof: substitute S = V Λ V^T and let u = V^T v:

<Sv, v> = <V Λ V^T v, v> = <Λ V^T v, V^T v> = <Λu, u> = λ_1 u_1^2 + λ_2 u_2^2 + … + λ_d u_d^2

Therefore, λ_i ≥ 0 for all i  ⟺  <Sv, v> ≥ 0 for all v.

Page 27: Computer Graphics

Technical remarks:

In our case, indeed <Sv, v> ≥ 0 for all v. This is because S can be represented as S = X X^T, where X is a d×n matrix.

So:

<Sv, v> = <X X^T v, v> = <X^T v, X^T v> = ||X^T v||^2 ≥ 0
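Both sides of the identity above are easy to verify numerically. A minimal sketch with illustrative random data (the dimensions and tolerance are assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
Xm = rng.normal(size=(4, 50))        # a d x n matrix X of centered points
S = Xm @ Xm.T                        # S = X X^T

# All eigenvalues of S are >= 0 (positive semi-definite),
# up to floating-point round-off.
lam = np.linalg.eigvalsh(S)
assert np.all(lam >= -1e-9)

# Equivalently, <Sv, v> = ||X^T v||^2 >= 0 for every v.
for _ in range(20):
    v = rng.normal(size=4)
    assert np.isclose(v @ S @ v, np.linalg.norm(Xm.T @ v) ** 2)
    assert v @ S @ v >= 0
```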

Page 28: Computer Graphics

Technical remarks:

S = X X^T, where X is the d×n matrix whose columns are y_1, y_2, …, y_n:

X = [y_1 y_2 … y_n]

Indeed, the (i, j) entry of X X^T is Σ_{k=1}^{n} y_ik y_jk, which is exactly S_ij from the definition S = Σ_{k=1}^{n} y_k y_k^T.

Page 29: Computer Graphics

See you next time