Signal Processing and Representation Theory Lecture 1
Transcript
Page 1: Signal Processing and Representation Theory Lecture 1.

Signal Processing and

Representation Theory

Lecture 1

Page 2: Signal Processing and Representation Theory Lecture 1.

Outline:
• Algebra Review
  – Numbers
  – Groups
  – Vector Spaces
  – Inner Product Spaces
  – Orthogonal / Unitary Operators
• Representation Theory

Page 3: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Numbers (Reals)
Real numbers, ℝ, are the set of numbers that we express in decimal notation, possibly with infinite, non-repeating, precision.

Page 4: Signal Processing and Representation Theory Lecture 1.

Algebra Review
Numbers (Reals)
Example: π = 3.141592653589793238462643383279502884197…

Completeness: If a sequence of real numbers gets progressively "tighter" (i.e. is Cauchy), then it must converge to a real number.

Size: The size of a real number a∈ℝ is the square root of its square norm: |a| = √(a·a)

Page 5: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Numbers (Complexes)
Complex numbers, ℂ, are the set of numbers that we express as a+ib, where a,b∈ℝ and i=√(-1).

Example: e^(iθ) = cos θ + i sin θ

Page 6: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Numbers (Complexes)
Let p(x) = x^n + a_{n-1}x^{n-1} + … + a_1x + a_0 be a polynomial with a_i∈ℂ.

Algebraic Closure:

p(x) must have a root x_0 in ℂ:

p(x_0) = 0.

Page 7: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Numbers (Complexes)
Conjugate: The conjugate of a complex number a+ib is:

conj(a+ib) = a - ib

Size: The size of a complex number a+ib∈ℂ is the square root of its square norm:

|a+ib| = √((a+ib)·conj(a+ib)) = √((a+ib)(a-ib)) = √(a²+b²)
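
As a quick numerical check (a sketch, not part of the slides; it only uses Python's built-in complex type and the cmath module):

import cmath

z = 3 + 4j
print(abs(z))                          # 5.0: the size |3+4i|
print(cmath.sqrt(z * z.conjugate()))   # (5+0j): |z| = sqrt(z * conj(z)) = sqrt(a^2 + b^2)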

Page 8: Signal Processing and Representation Theory Lecture 1.

Algebra Review
Groups
A group G is a set with a composition rule + that takes two elements of the set and returns another element, satisfying:

– Associativity: (a+b)+c = a+(b+c) for all a,b,c∈G.
– Identity: There exists an identity element 0∈G such that 0+a = a+0 = a for all a∈G.
– Inverse: For every a∈G there exists an element -a∈G such that a+(-a) = 0.

If the group satisfies a+b = b+a for all a,b∈G, then the group is called commutative or abelian.

Page 9: Signal Processing and Representation Theory Lecture 1.

Algebra Review
Groups
Examples:

– The integers, under addition, are a commutative group.
– The positive real numbers, under multiplication, are a commutative group.
– The set of complex numbers without 0, under multiplication, are a commutative group.
– Real/complex invertible matrices, under multiplication, are a non-commutative group.
– The rotation matrices, under multiplication, are a non-commutative group. (Except in 2D, when they are commutative.)

Page 10: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Real) Vector Spaces
A real vector space is a set of objects that can be added together and scaled by real numbers.

Formally:

A real vector space V is a commutative group with a scaling operator:

(a,v) → av,

for a∈ℝ, v∈V, such that:

1. 1v = v for all v∈V.
2. a(v+w) = av+aw for all a∈ℝ, v,w∈V.
3. (a+b)v = av+bv for all a,b∈ℝ, v∈V.
4. (ab)v = a(bv) for all a,b∈ℝ, v∈V.

Page 11: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Real) Vector Spaces
Examples:
• The set of n-dimensional arrays with real coefficients is a vector space.
• The set of m×n matrices with real entries is a vector space.
• The sets of real-valued functions defined in 1D, 2D, 3D,… are all vector spaces.
• The sets of real-valued functions defined on the circle, disk, sphere, ball,… are all vector spaces.
• Etc.

Page 12: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Complex) Vector Spaces
A complex vector space is a set of objects that can be added together and scaled by complex numbers.

Formally:

A complex vector space V is a commutative group with a scaling operator:

(a,v) → av,

for a∈ℂ, v∈V, such that:

1. 1v = v for all v∈V.
2. a(v+w) = av+aw for all a∈ℂ, v,w∈V.
3. (a+b)v = av+bv for all a,b∈ℂ, v∈V.
4. (ab)v = a(bv) for all a,b∈ℂ, v∈V.

Page 13: Signal Processing and Representation Theory Lecture 1.

Algebra Review
(Complex) Vector Spaces
Examples:
• The set of n-dimensional arrays with complex coefficients is a vector space.
• The set of m×n matrices with complex entries is a vector space.
• The sets of complex-valued functions defined in 1D, 2D, 3D,… are all vector spaces.
• The sets of complex-valued functions defined on the circle, disk, sphere, ball,… are all vector spaces.
• Etc.

Page 14: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Real) Inner Product Spaces
A real inner product space is a real vector space V with a mapping ⟨·,·⟩: V×V → ℝ that takes a pair of vectors and returns a real number, satisfying:

– ⟨u, v+w⟩ = ⟨u,v⟩ + ⟨u,w⟩ for all u,v,w∈V.
– ⟨αu, v⟩ = α⟨u,v⟩ for all u,v∈V and all α∈ℝ.
– ⟨u,v⟩ = ⟨v,u⟩ for all u,v∈V.
– ⟨v,v⟩ ≥ 0 for all v∈V, and ⟨v,v⟩ = 0 if and only if v = 0.

Page 15: Signal Processing and Representation Theory Lecture 1.

Algebra Review
(Real) Inner Product Spaces
Examples:

– The space of n-dimensional arrays with real coefficients is an inner product space. If v=(v1,…,vn) and w=(w1,…,wn) then:

⟨v,w⟩ = v1w1 + … + vnwn

– If M is a symmetric matrix (M=M^t) whose eigenvalues are all positive, then the space of n-dimensional arrays with real coefficients is an inner product space. If v=(v1,…,vn) and w=(w1,…,wn) then:

⟨v,w⟩_M = vMw^t
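
A minimal numerical sketch of both examples (assuming NumPy; not part of the slides). The matrix M below is constructed to be symmetric with all eigenvalues positive:

import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# <v,w> = v1*w1 + ... + vn*wn
print(np.dot(v, w))                       # 32.0

# <v,w>_M = v M w^t, with M symmetric and positive-definite
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 2.0]])
M = A @ A.T + np.eye(3)                   # symmetric, all eigenvalues positive
print(v @ M @ w)                          # the M-weighted inner product
print(np.isclose(v @ M @ w, w @ M @ v))   # True: <v,w>_M = <w,v>_M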

Page 16: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Real) Inner Product Spaces
Examples:

– The space of m×n matrices with real coefficients is an inner product space. If M and N are two m×n matrices then:

⟨M,N⟩ = Trace(M^t N)
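
A quick check of this formula (a NumPy sketch, not from the slides): the trace form agrees with summing the entrywise products of the two matrices.

import numpy as np

M = np.arange(6.0).reshape(2, 3)
N = np.ones((2, 3))
print(np.trace(M.T @ N))   # <M,N> = Trace(M^t N) = 15.0
print(np.sum(M * N))       # same value: the entrywise sum of products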

Page 17: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Real) Inner Product Spaces
Examples:

– The spaces of real-valued functions defined in 1D, 2D, 3D,… are real inner product spaces. If f and g are two functions in 1D, then:

⟨f,g⟩ = ∫ f(x)g(x) dx

– The spaces of real-valued functions defined on the circle, disk, sphere, ball,… are real inner product spaces. If f and g are two functions defined on the circle, then:

⟨f,g⟩ = ∫_0^{2π} f(θ)g(θ) dθ
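
As a sketch of the circle case (an assumed discretization, not from the slides), the integral can be approximated by a Riemann sum over equally spaced angles:

import numpy as np

theta = np.linspace(0.0, 2.0*np.pi, 1000, endpoint=False)
dtheta = theta[1] - theta[0]
f = np.cos(theta)
g = np.cos(theta)
# <f,g> = integral over [0, 2*pi) of f(theta)*g(theta) dtheta
print(np.sum(f * g) * dtheta)   # ~3.1416, since the integral of cos^2 over the circle is pi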

Page 18: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Complex) Inner Product Spaces
A complex inner product space is a complex vector space V with a mapping ⟨·,·⟩: V×V → ℂ that takes a pair of vectors and returns a complex number, satisfying:

– ⟨u, v+w⟩ = ⟨u,v⟩ + ⟨u,w⟩ for all u,v,w∈V.
– ⟨αu, v⟩ = α⟨u,v⟩ for all u,v∈V and all α∈ℂ.
– ⟨u,v⟩ is the complex conjugate of ⟨v,u⟩ for all u,v∈V.
– ⟨v,v⟩ ≥ 0 for all v∈V, and ⟨v,v⟩ = 0 if and only if v = 0.

Page 19: Signal Processing and Representation Theory Lecture 1.

Algebra Review
(Complex) Inner Product Spaces
Examples:

– The space of n-dimensional arrays with complex coefficients is an inner product space. If v=(v1,…,vn) and w=(w1,…,wn) then:

⟨v,w⟩ = v1·conj(w1) + … + vn·conj(wn)

– If M is a conjugate symmetric matrix (M = conj(M)^t) whose eigenvalues are all positive, then the space of n-dimensional arrays with complex coefficients is an inner product space. If v=(v1,…,vn) and w=(w1,…,wn) then:

⟨v,w⟩_M = vM·conj(w)^t
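
A NumPy sketch of the first example (not from the slides). Note that np.vdot conjugates its first argument, so ⟨v,w⟩ = v1·conj(w1) + … + vn·conj(wn) corresponds to np.vdot(w, v):

import numpy as np

v = np.array([1+2j, 3-1j])
w = np.array([2-1j, 1+1j])

inner_vw = np.vdot(w, v)       # <v,w> = v1*conj(w1) + v2*conj(w2)
inner_wv = np.vdot(v, w)       # <w,v>
print(inner_vw)
print(np.isclose(inner_vw, np.conj(inner_wv)))   # True: <v,w> = conj(<w,v>)
print(np.vdot(v, v).real >= 0)                   # True: <v,v> is real and non-negative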

Page 20: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Complex) Inner Product Spaces
Examples:

– The space of m×n matrices with complex coefficients is an inner product space. If M and N are two m×n matrices then:

⟨M,N⟩ = Trace(M^t·conj(N))

Page 21: Signal Processing and Representation Theory Lecture 1.

Algebra Review

(Complex) Inner Product Spaces
Examples:

– The spaces of complex-valued functions defined in 1D, 2D, 3D,… are complex inner product spaces. If f and g are two functions in 1D, then:

⟨f,g⟩ = ∫ f(x)·conj(g(x)) dx

– The spaces of complex-valued functions defined on the circle, disk, sphere, ball,… are complex inner product spaces. If f and g are two functions defined on the circle, then:

⟨f,g⟩ = ∫_0^{2π} f(θ)·conj(g(θ)) dθ

Page 22: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Inner Product Spaces
If V1,V2⊂V, then V is the direct sum of the subspaces V1 and V2, written V = V1⊕V2, if:

– Every vector v∈V can be written uniquely as:

v = v1 + v2

for some vectors v1∈V1 and v2∈V2.

Page 23: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Inner Product Spaces
Example:

If V is the vector space of 4-dimensional arrays, then V is the direct sum of the vector spaces V1,V2⊂V where:

– V1 = (x1,x2,0,0)
– V2 = (0,0,x3,x4)

Page 24: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Orthogonal / Unitary Operators
If V is a real / complex inner product space, then a linear map A:V→V is orthogonal / unitary if it preserves the inner product:

⟨v,w⟩ = ⟨Av,Aw⟩ for all v,w∈V.

Page 25: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Orthogonal / Unitary Operators
Examples:

– If V is the space of real, two-dimensional vectors and A is any rotation or reflection, then A is orthogonal.

[Figure: a rotation A maps the vectors v1, v2 to A(v1), A(v2)]
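
A numerical sketch of this example (assuming NumPy; not from the slides): a 2D rotation preserves inner products, and hence lengths and angles.

import numpy as np

t = 0.7
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])    # rotation by the angle t

v = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])
print(np.isclose(np.dot(v, w), np.dot(A @ v, A @ w)))         # True: <v,w> = <Av,Aw>
print(np.isclose(np.linalg.norm(v), np.linalg.norm(A @ v)))   # True: lengths are preserved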

Page 26: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Orthogonal / Unitary Operators
Examples:

– If V is the space of real, three-dimensional vectors and A is any rotation or reflection, then A is orthogonal.

Page 27: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Orthogonal / Unitary Operators
Examples:

– If V is the space of functions defined in 1D and A is any translation, then A is orthogonal.

Page 28: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Orthogonal / Unitary Operators
Examples:

– If V is the space of functions defined on a circle and A is any rotation or reflection, then A is orthogonal.

Page 29: Signal Processing and Representation Theory Lecture 1.

Algebra Review

Orthogonal / Unitary Operators
Examples:

– If V is the space of functions defined on a sphere and A is any rotation or reflection, then A is orthogonal.

Page 30: Signal Processing and Representation Theory Lecture 1.

Outline:
• Algebra Review

• Representation Theory
  – Orthogonal / Unitary Representations
  – Irreducible Representations
  – Why Do We Care?

Page 31: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Orthogonal / Unitary Representation
An orthogonal / unitary representation of a group G onto an inner product space V is a map ρ that sends every element of G to an orthogonal / unitary transformation, subject to the conditions:

1. ρ(0)v = v for all v∈V, where 0 is the identity element of G.

2. ρ(gh)v = ρ(g)ρ(h)v for all g,h∈G and all v∈V.

Page 32: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Orthogonal / Unitary Representation
Examples:

– If G is any group and V is any vector space, then:

ρ(g)v = v

is an orthogonal / unitary representation.

– If G is the group of rotations and reflections and V is the vector space on which they act, then:

ρ(g)v = det(g)·(gv)

is an orthogonal / unitary representation.
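
A sketch of the second example in 2D (assuming NumPy; not from the slides), checking that ρ(g)v = det(g)·(gv) preserves inner products and respects composition, ρ(gh) = ρ(g)ρ(h):

import numpy as np

def rho(g, v):
    # rho(g)v = det(g) * (g v), for g a rotation or reflection
    return np.linalg.det(g) * (g @ v)

R = np.array([[0.0, -1.0], [1.0, 0.0]])   # rotation by 90 degrees
F = np.array([[1.0, 0.0], [0.0, -1.0]])   # reflection across the x-axis
v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])

print(np.isclose(np.dot(v, w), np.dot(rho(R, v), rho(R, w))))   # True: rho(R) is orthogonal
print(np.allclose(rho(R @ F, v), rho(R, rho(F, v))))            # True: rho(gh)v = rho(g)rho(h)v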

Page 33: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Orthogonal / Unitary Representation
Examples:

– If G is the group of n×n orthogonal / unitary matrices, and V is the space of n-dimensional arrays, then:

ρ(g)v = gv

is an orthogonal / unitary representation.

Page 34: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Orthogonal / Unitary Representation
Examples:

– If G is the group of 2×2 rotation matrices, and V is the vector space of 4-dimensional real / complex arrays, then:

ρ(g)(x1,x2,x3,x4) = (g(x1,x2), g(x3,x4))

is an orthogonal / unitary representation.
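
A sketch of this representation (assuming NumPy; not from the slides): g acts separately on the first pair and the last pair of coordinates, and the resulting map on 4-dimensional arrays is orthogonal.

import numpy as np

def rho(g, x):
    # rho(g)(x1,x2,x3,x4) = (g(x1,x2), g(x3,x4))
    return np.concatenate([g @ x[:2], g @ x[2:]])

t = 0.4
g = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0.5, -1.0, 2.0, 0.0])
print(np.isclose(np.dot(x, y), np.dot(rho(g, x), rho(g, y))))   # True: the representation is orthogonal
print(rho(g, np.array([1.0, 2.0, 0.0, 0.0])))                   # last two entries stay 0: (x1,x2,0,0) maps into itself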

Page 35: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Irreducible Representations
A representation ρ of a group G onto a vector space V is irreducible if V cannot be broken up into smaller representation spaces.

That is, if there exists a subspace W⊂V such that:

ρ(G)W ⊂ W

then either W=V or W={0}.

Page 36: Signal Processing and Representation Theory Lecture 1.

Representation Theory
Irreducible Representations
If W⊂V is a sub-representation of G, and W⊥ is the space of vectors perpendicular to W:

⟨v,w⟩ = 0 for all v∈W and w∈W⊥,

then V = W⊕W⊥ and W⊥ is also a sub-representation of V. For any g∈G, v∈W, and w∈W⊥, we have:

0 = ⟨ρ(g⁻¹)v, w⟩ = ⟨ρ(g)ρ(g⁻¹)v, ρ(g)w⟩ = ⟨v, ρ(g)w⟩

so ρ(g)w remains perpendicular to W. Thus, if a representation is reducible, it can be broken up into the direct sum of two sub-representations.

Page 37: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Irreducible Representations
Examples:

– If G is any group and V is any vector space with dimension larger than one, then:

ρ(g)v = v

is not an irreducible representation.

Page 38: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Irreducible Representations
Examples:

– If G is the group of 2×2 rotation matrices, and V is the vector space of 4-dimensional real / complex arrays, then:

ρ(g)(x1,x2,x3,x4) = (g(x1,x2), g(x3,x4))

is not an irreducible representation, since it maps the space W=(x1,x2,0,0) back into itself.

Page 39: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Why do we care?

Page 40: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Why We Care
In shape matching we have to deal with the fact that rotations do not change the shape of a model.

[Figure: a model and a rotated copy of the same model are equal as shapes]

Page 41: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Exhaustive Search
If vM is a spherical function representing the model M and vN is a spherical function representing the model N, we want to find the rotation T minimizing:

D(vM, T(vN)) = ‖vM - T(vN)‖² = ⟨vM,vM⟩ + ⟨vN,vN⟩ - 2⟨vM, T(vN)⟩

(using the fact that T preserves the norm of vN). Since the first two terms do not depend on T, minimizing the distance is the same as maximizing the correlation ⟨vM, T(vN)⟩.
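
A quick numerical check of this expansion (a NumPy sketch with a random orthogonal matrix standing in for the rotation T; not from the slides):

import numpy as np

rng = np.random.default_rng(0)
vM = rng.standard_normal(5)
vN = rng.standard_normal(5)
T, _ = np.linalg.qr(rng.standard_normal((5, 5)))    # orthogonal stand-in for a rotation

lhs = np.sum((vM - T @ vN)**2)                      # ||vM - T(vN)||^2
rhs = np.dot(vM, vM) + np.dot(vN, vN) - 2*np.dot(vM, T @ vN)
print(np.isclose(lhs, rhs))                         # True, since T preserves the norm of vN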

Page 42: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Exhaustive Search
If V is the space of spherical functions, then we can consider the representation of the group of rotations on this space.

By decomposing V into a direct sum of its irreducible representations, we get a better framework for finding the best rotation.

Page 43: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Exhaustive Search (Brute Force)
Suppose that {v1,…,vn} is some orthogonal basis for V. Then we can express the shape descriptors in terms of this basis:

vM = a1v1 + … + anvn
vN = b1v1 + … + bnvn

Page 44: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Exhaustive Search (Brute Force)
Then the dot-product of M and N at a rotation T is equal to:

⟨vM, T(vN)⟩ = ⟨Σi aivi, T(Σj bjvj)⟩ = ⟨Σi aivi, Σj bjT(vj)⟩ = Σi,j aibj⟨vi, T(vj)⟩

with the sums running over i,j = 1,…,n.

Page 45: Signal Processing and Representation Theory Lecture 1.

⟨vM, T(vN)⟩ = Σi,j aibj⟨vi, T(vj)⟩

Representation Theory
Exhaustive Search (Brute Force)
So n×n cross-multiplications ⟨vi, T(vj)⟩ are needed:

[Figure: vM expanded in the basis v1, v2, …, vn, and T(vN) expanded in the basis T(v1), T(v2), …, T(vn); every pair of components contributes a cross term]
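
A toy NumPy sketch of the brute-force evaluation (not from the slides; ordinary vectors in R^n stand in for spherical functions, and a random orthogonal matrix stands in for the rotation T). The point is that every pair ⟨vi, T(vj)⟩ shows up, so each candidate rotation costs n² cross terms:

import numpy as np

n = 8
rng = np.random.default_rng(1)
B, _ = np.linalg.qr(rng.standard_normal((n, n)))    # rows of B: an orthonormal basis v_1,...,v_n
a = rng.standard_normal(n)                          # vM = sum_i a_i v_i
b = rng.standard_normal(n)                          # vN = sum_j b_j v_j
T, _ = np.linalg.qr(rng.standard_normal((n, n)))    # orthogonal stand-in for the rotation

cross = B @ T @ B.T                                 # cross[i, j] = <v_i, T(v_j)>: n^2 terms
brute = sum(a[i] * b[j] * cross[i, j] for i in range(n) for j in range(n))
direct = np.dot(B.T @ a, T @ (B.T @ b))             # <vM, T(vN)> computed directly
print(np.isclose(brute, direct))                    # True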

Page 46: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Exhaustive Search (w/ Rep. Theory)
Now suppose that we can decompose V into a collection of one-dimensional representations.

That is, there exists an orthogonal basis {w1,…,wn} of functions such that T(wi)∈ℂ·wi for all rotations T, and hence:

⟨wi, T(wj)⟩ = 0 for all i≠j.

Page 47: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Exhaustive Search (w/ Rep. Theory)
Then we can express the shape descriptors in terms of this basis:

vM = α1w1 + … + αnwn
vN = β1w1 + … + βnwn

Page 48: Signal Processing and Representation Theory Lecture 1.

Representation Theory

Exhaustive Search (w/ Rep. Theory)
And the dot-product of M and N at a rotation T is equal to:

⟨vM, T(vN)⟩ = ⟨Σi αiwi, T(Σj βjwj)⟩ = ⟨Σi αiwi, Σj βjT(wj)⟩ = Σi,j αi·conj(βj)·⟨wi, T(wj)⟩ = Σi αi·conj(βi)·⟨wi, T(wi)⟩

since ⟨wi, T(wj)⟩ = 0 whenever i≠j.

Page 49: Signal Processing and Representation Theory Lecture 1.

⟨vM, T(vN)⟩ = Σi αi·conj(βi)·⟨wi, T(wi)⟩

Representation Theory
Exhaustive Search (w/ Rep. Theory)
So only n multiplications are needed:

[Figure: vM expanded in the basis w1, w2, …, wn, and T(vN) expanded in the basis T(w1), T(w2), …, T(wn); only matching components contribute]
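
A concrete sketch of the payoff on the circle (an assumed setup using NumPy's FFT, not the spherical case from the slides): the Fourier basis functions wk(θ) = e^(ikθ) span one-dimensional sub-representations of the rotation group, since rotating by an angle only multiplies wk by a unit complex number. Once the coefficients αk and βk are known, each candidate rotation costs only n multiplications:

import numpy as np

N = 256
rng = np.random.default_rng(2)
vM = rng.standard_normal(N)             # samples of a function on the circle
vN = rng.standard_normal(N)

alpha = np.fft.fft(vM)                  # coefficients of vM in the Fourier basis
beta = np.fft.fft(vN)                   # coefficients of vN in the Fourier basis

shift = 37                              # rotate vN by 37 samples
vN_rot = np.roll(vN, shift)

direct = np.dot(vM, vN_rot)             # brute-force <vM, T(vN)>
k = np.arange(N)
per_rotation = alpha * np.conj(beta) * np.exp(2j*np.pi*k*shift/N)   # n multiplications per rotation
print(np.isclose(direct, np.sum(per_rotation).real / N))            # True

Evaluating this sum for every rotation at once amounts to a single inverse FFT of alpha·conj(beta), which is where the computational savings come from.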