Lecture 3: Inner Product Spaces

Ioana Luca
UPB – Dept. Metode si Modele Matematice
2011 – 2012
Contents

1. Inner product spaces
2. Orthonormal bases
3. The orthogonal complement

1. Inner product spaces

Definition
Let V be a real vector space. A function f : V × V → ℝ is called an inner product on V if it has the following properties (the axioms of the inner product):

a) f(u + v, w) = f(u, w) + f(v, w)
b) f(λu, v) = λ f(u, v)
c) f(u, v) = f(v, u)
d) f(v, v) ≥ 0;  f(v, v) = 0  ⟺  v = 0

for all u, v, w ∈ V and for all λ ∈ ℝ. A vector space equipped with an inner product is an inner product space; a finite-dimensional inner product space is a Euclidean vector space.

(For a complex vector space one takes f : V × V → ℂ, λ ∈ ℂ, and replaces axiom c) by Hermitian symmetry: f(u, v) equals the complex conjugate of f(v, u). The complex examples below use this convention, with the conjugate on the second argument.)

Examples

ℝⁿ:  for x ≡ (x1, ..., xn), y ≡ (y1, ..., yn),  x ⋅ y ≡ x1 y1 + ... + xn yn

ℂⁿ:  for x ≡ (x1, ..., xn), y ≡ (y1, ..., yn),  x ⋅ y ≡ x1 ȳ1 + ... + xn ȳn

Mm,n(ℝ):  A ⋅ B ≡ tr(ABᵀ) = Σ_{i=1}^m Σ_{j=1}^n Aij Bij  ⇒  Mm,n(ℝ) ≡ ℝ^{mn}

In Mm,1(ℝ):  u ⋅ v ≡ tr(uvᵀ) = Σ_{i=1}^m ui vi ;  Mm,1(ℝ) ≡ ℝᵐ

Mm,n(ℂ):  A ⋅ B ≡ tr(A B̄ᵀ) = Σ_{i,j=1} Aij B̄ij

C([a, b]), the continuous real-valued functions on [a, b]:
⟨f, g⟩ ≡ ∫_a^b f(x) g(x) dx

C([a, b]), the continuous complex-valued functions on [a, b]:
⟨f, g⟩ ≡ ∫_a^b f(x) ḡ(x) dx
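The trace identity behind the matrix inner product is easy to check numerically. The sketch below (pure Python, illustrative matrices of my own choosing, not from the lecture) verifies tr(ABᵀ) = Σ_{i,j} Aij Bij for one pair of 2 × 3 matrices.

```python
def matmul_transpose(A, B):
    """Compute C = A Bᵀ for A, B given as lists of rows of the same m × n shape."""
    m, n = len(A), len(A[0])
    return [[sum(A[i][k] * B[j][k] for k in range(n)) for j in range(m)]
            for i in range(m)]

def trace(C):
    return sum(C[i][i] for i in range(len(C)))

def entrywise(A, B):
    # The double-sum form of the inner product: Σ_{i,j} A_ij B_ij
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

A = [[1.0, 2.0, 0.0], [3.0, -1.0, 4.0]]
B = [[2.0, 0.0, 1.0], [1.0, 5.0, -2.0]]
print(trace(matmul_transpose(A, B)), entrywise(A, B))  # both give -8.0
```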

ℝn[X]:  for P ≡ p0 + p1 X + ... + pn Xⁿ and Q ≡ q0 + q1 X + ... + qn Xⁿ,
P ⋅ Q ≡ p0 q0 + p1 q1 + ... + pn qn

ℝ[X]:  P ⋅ Q ≡ ∫_{−1}^{1} P(x) Q(x) dx

V3, the space of free vectors:
u ⋅ v ≡ ∥u∥ ∥v∥ cos θ  if u ≠ 0 and v ≠ 0;   u ⋅ v ≡ 0  if u = 0 or v = 0,
where θ ∈ [0, π] is the angle between u and v.

ℓ²(ℂ), the space of square-summable complex sequences:
ℓ²(ℂ) ≡ { (xn)_{n∈ℕ} | xn ∈ ℂ,  Σ_{n=0}^∞ |xn|² < ∞ },
x ≡ (xn)_{n∈ℕ},  y ≡ (yn)_{n∈ℕ}   →   x ⋅ y ≡ Σ_{n=0}^∞ xn ȳn

Proposition (Properties of an inner product)
1) u ⋅ (λv) = λ (u ⋅ v),   0 ⋅ v = 0
2) u ⋅ v = 0 for all v ∈ V  ⇒  u = 0
3) if {e1, ..., en} is a basis of V, then
   u ⋅ ei = 0 for i = 1, ..., n  ⇒  u = 0

Definition
1) ∥v∥ ≡ √(v ⋅ v) is the norm (induced by the inner product) of v.
2) If ∥v∥ = 1, v is called a versor or unit vector.

Proposition
1) ∥v∥ ≥ 0, and ∥v∥ = 0  ⟺  v = 0
2) ∥λv∥ = |λ| ∥v∥

Examples

ℝⁿ:  ∥x∥ = √(x1² + ... + xn²)

ℂⁿ:  ∥x∥ = √(|x1|² + ... + |xn|²)

Mm,n(ℝ):  ∥A∥ = √(tr(AAᵀ)) = √( Σ_{i=1}^m Σ_{j=1}^n Aij² )

Mm,n(ℂ):  ∥A∥ = √(tr(A Āᵀ)) = √( Σ_{i,j=1} |Aij|² )

C([a, b]), the continuous real-valued functions on [a, b]:
∥f∥ = ( ∫_a^b f(x)² dx )^{1/2}

C([a, b]), the continuous complex-valued functions on [a, b]:
∥f∥ = ( ∫_a^b |f(x)|² dx )^{1/2}

V3:  ∥v∥ = |v|;  i, j, k are unit vectors.

If v ≠ 0, then (1/∥v∥) v is a unit vector.

Definition
1) The angle θ between the non-zero vectors u, v:
   θ ∈ [0, π],   cos θ = (u ⋅ v) / (∥u∥ ∥v∥).
2) If u ⋅ v = 0, the vectors u, v are orthogonal.

Examples
In ℝⁿ the vectors of the standard basis are mutually orthogonal: ei ⋅ ej = 0, i ≠ j.
In V3:  i ⋅ j = 0,  i ⋅ k = 0,  j ⋅ k = 0.

Proposition
1) Triangle inequality:  ∥u + v∥ ≤ ∥u∥ + ∥v∥
2) Pythagoras' theorem:  u ⋅ v = 0  ⇒  ∥u ± v∥² = ∥u∥² + ∥v∥²
3) Parallelogram law:  ∥u + v∥² + ∥u − v∥² = 2(∥u∥² + ∥v∥²)
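These identities can be checked numerically with the standard inner product on ℝⁿ. A small sketch (vectors chosen for illustration, not from the lecture) verifying the parallelogram law:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    return math.sqrt(dot(v, v))

u = [1.0, -2.0, 2.0]
v = [3.0, 0.0, -1.0]

# ∥u+v∥² + ∥u−v∥²  versus  2(∥u∥² + ∥v∥²)
lhs = norm([a + b for a, b in zip(u, v)]) ** 2 + norm([a - b for a, b in zip(u, v)]) ** 2
rhs = 2 * (norm(u) ** 2 + norm(v) ** 2)
print(lhs, rhs)  # the two sides agree up to rounding
```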

Proposition
Let v1, ..., vn be given in the real inner product space V and denote vij ≡ vi ⋅ vj. Then:
1) the Gram matrix (vij) is symmetric: vij = vji
2) {v1, ..., vn} is a linearly independent set  ⟺  det(vij) ≠ 0.

Example  If v1, ..., vn are mutually orthogonal and v1, ..., vn ≠ 0, then v1, ..., vn are linearly independent: the Gram matrix (vi ⋅ vj) is

| v1⋅v1    0     ...    0    |
|   0    v2⋅v2   ...    0    |
|   ⋮      ⋮             ⋮   |
|   0      0     ...  vn⋅vn  |,

with v1⋅v1 ≠ 0, ..., vn⋅vn ≠ 0.

Exercise  Note that AᵀA is the Gram matrix corresponding to the columns of A ∈ Mm,n(ℝ), and deduce that, if the columns of A are linearly independent, then AᵀA is invertible.
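As a numerical illustration of the exercise (a sketch, not its proof), the snippet below forms AᵀA for a 3 × 2 matrix with independent columns and checks that its determinant is non-zero.

```python
def gram_of_columns(A):
    """AᵀA for A given as a list of rows; entry (i, j) is column_i ⋅ column_j."""
    m, n = len(A), len(A[0])
    cols = [[A[r][c] for r in range(m)] for c in range(n)]
    return [[sum(x * y for x, y in zip(cols[i], cols[j])) for j in range(n)]
            for i in range(n)]

def det2(G):
    # determinant of a 2 × 2 matrix, enough for this example
    return G[0][0] * G[1][1] - G[0][1] * G[1][0]

A = [[1.0, 2.0], [0.0, 1.0], [1.0, 4.0]]   # columns (1, 0, 1) and (2, 1, 4)
G = gram_of_columns(A)
print(G, det2(G))  # [[2.0, 6.0], [6.0, 21.0]] with determinant 6.0 ≠ 0
```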

2. Orthonormal bases

Definition
Let S be a non-empty subset of an inner product space V.
1) S is orthogonal if u ⋅ v = 0 for all u, v ∈ S, u ≠ v.
2) S is orthonormal if it is orthogonal and ∥v∥ = 1 for any v ∈ S.
3) An orthonormal basis of V is a basis which is an orthonormal set.

Examples
The trigonometric system
1, sin x, cos x, sin 2x, cos 2x, ..., sin nx, cos nx, ...
is an orthogonal set in C([0, 2π]), and
1/√(2π), (sin x)/√π, (cos x)/√π, ..., (sin nx)/√π, (cos nx)/√π, ...
is orthonormal.
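Orthogonality of the trigonometric system can be checked numerically. A sketch (my own integration routine, step count chosen for accuracy) approximating ⟨f, g⟩ = ∫_0^{2π} f(x) g(x) dx by a midpoint Riemann sum:

```python
import math

def inner(f, g, a=0.0, b=2 * math.pi, steps=20000):
    """Midpoint-rule approximation of ∫_a^b f(x) g(x) dx."""
    h = (b - a) / steps
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(steps)) * h

print(inner(math.sin, math.cos))   # ≈ 0:  sin x and cos x are orthogonal
print(inner(math.sin, math.sin))   # ≈ π:  hence (sin x)/√π is a unit vector
```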

The canonical basis of ℝⁿ is an orthonormal basis.
The basis {i, j, k} of V3 is orthonormal.
If {e1, ..., en, ...} is an orthonormal set, then ei ⋅ ej = δij.

Exercise  Show that Q ∈ Mn(ℝ) is orthogonal, i.e., Q⁻¹ = Qᵀ (Q ∈ Mn(ℂ) is unitary, i.e., Q⁻¹ = Q̄ᵀ), if and only if its columns/rows are orthonormal.

Proposition
1) If S is orthogonal and 0 ∉ S, then S is linearly independent.
2) Any orthonormal set is linearly independent.
3) If {e1, ..., en} is an orthonormal basis of V (over ℝ or ℂ) and fj = Σ_{i=1}^n Cij ei, j = 1, ..., n, then
   {f1, ..., fn} is an orthonormal basis  ⟺  the matrix (Cij) is orthogonal (unitary).

Proposition
If {e1, ..., en} is an orthonormal basis of the real inner product space V, and u, v ∈ V have the representations
u = Σ_{i=1}^n ui ei ,   v = Σ_{i=1}^n vi ei ,
then the components of v with respect to this basis, the inner product u ⋅ v, and the norm of v are given by
vi = v ⋅ ei ,   u ⋅ v = Σ_{i=1}^n ui vi ,   ∥v∥² = Σ_{i=1}^n vi² ,
respectively.

Exercise  Restate and prove the preceding proposition for the case of a complex V.

Example  The components of v = (1, 2, 1) ∈ ℝ³ with respect to the orthonormal basis
e1 = (1/3, −2/3, 2/3),   e2 = (−2/3, −2/3, −1/3),   e3 = (2/3, −1/3, −2/3)
are as follows:
v1 = v ⋅ e1 = −1/3,   v2 = v ⋅ e2 = −7/3,   v3 = v ⋅ e3 = −2/3.

Proposition (Gram-Schmidt orthonormalization process)
If {f1, ..., fn, ...} ⊂ V is a linearly independent set, there exists an orthonormal set {e1, ..., en, ...} ⊂ V such that
Sp[e1, ..., en] = Sp[f1, ..., fn]
for each n = 1, 2, .... It is deduced as follows:

Step 1  One determines the orthogonal set {d1, d2, ..., dn, ...} of non-zero vectors:
d1 ≡ f1
d2 ≡ f2 + ξ d1,   where ξ satisfies d2 ⋅ d1 = 0
d3 ≡ f3 + ξ1 d1 + ξ2 d2,   where ξ1, ξ2 satisfy d3 ⋅ d1 = 0 and d3 ⋅ d2 = 0
⋮
⇒  Sp[d1, ..., dn] = Sp[f1, ..., fn]

Step 2  One normalizes the vectors d1, d2, ..., dn, ...:
e1 ≡ (1/∥d1∥) d1,   e2 ≡ (1/∥d2∥) d2, ...,   en ≡ (1/∥dn∥) dn, ...
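The two steps above translate directly into code. A pure-Python sketch (illustrative, for vectors in ℝⁿ), applied to the lecture's example basis f1 = (1, −2, 2), f2 = (−1, 0, −1), f3 = (5, −3, −7):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(fs):
    """Steps 1 and 2 above: orthogonalize, then normalize."""
    ds = []                                        # Step 1: d1, d2, ...
    for f in fs:
        d = list(f)
        for prev in ds:
            xi = -dot(f, prev) / dot(prev, prev)   # enforces d ⋅ prev = 0
            d = [a + xi * b for a, b in zip(d, prev)]
        ds.append(d)
    # Step 2: e_i = d_i / ∥d_i∥
    return [[a / math.sqrt(dot(d, d)) for a in d] for d in ds]

es = gram_schmidt([(1.0, -2.0, 2.0), (-1.0, 0.0, -1.0), (5.0, -3.0, -7.0)])
print(es)
```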

Example  In ℝ³, starting from the basis {f1, f2, f3}, where
f1 ≡ (1, −2, 2),   f2 ≡ (−1, 0, −1),   f3 ≡ (5, −3, −7),
we find an orthonormal basis by using the Gram-Schmidt algorithm.

Step 1   d1 ≡ f1,   d2 ≡ f2 + ξ d1,   d3 ≡ f3 + ξ1 d1 + ξ2 d2,
where ξ, ξ1, ξ2 are such that
d2 ⋅ d1 = 0,   d3 ⋅ d1 = 0,   d3 ⋅ d2 = 0
⇒  ξ = −(f2 ⋅ d1)/(d1 ⋅ d1) = 1/3,   ξ1 = −(f3 ⋅ d1)/(d1 ⋅ d1) = 1/3,   ξ2 = −(f3 ⋅ d2)/(d2 ⋅ d2) = −1.
Therefore,
d2 = f2 + (1/3) d1 = (−2/3, −2/3, −1/3),   d3 ≡ f3 + (1/3) d1 − d2 = (6, −3, −6).

Step 2  The orthonormal basis consists of the vectors
e1 ≡ d1/∥d1∥ = (1/3, −2/3, 2/3),   e2 ≡ d2/∥d2∥ = (−2/3, −2/3, −1/3),   e3 ≡ d3/∥d3∥ = (2/3, −1/3, −2/3).

Proposition (The QR decomposition)
Suppose the columns of A ∈ Mm,n(ℝ) are linearly independent. Then A = QR, where the columns of Q ∈ Mm,n(ℝ) are orthonormal, and R ∈ Mn(ℝ) is an invertible upper triangular matrix.

Proof  In Mm,1(ℝ) ≡ ℝᵐ consider the columns of A:
v1 ≡ (A11, A21, ..., Am1),   v2 ≡ (A12, A22, ..., Am2), ...,   vn ≡ (A1n, A2n, ..., Amn).
Since v1, ..., vn are linearly independent, by using the Gram-Schmidt algorithm one obtains an orthonormal set {e1, ..., en} such that
Sp[e1, ..., ek] = Sp[v1, ..., vk],   k = 1, ..., n.
In particular this shows that e_{k+1}, ..., en are orthogonal to v1, ..., vk. Thus, v1, ..., vn ∈ Sp[e1, ..., en] have the following representations with respect to the orthonormal basis {e1, ..., en} of Sp[e1, ..., en]:

v1 = Σ_{i=1}^n (v1 ⋅ ei) ei = (v1 ⋅ e1) e1,
v2 = Σ_{i=1}^n (v2 ⋅ ei) ei = (v2 ⋅ e1) e1 + (v2 ⋅ e2) e2,
⋮
vn = Σ_{i=1}^n (vn ⋅ ei) ei = (vn ⋅ e1) e1 + ... + (vn ⋅ en) en.

This implies A = QR, where Q is the m × n matrix whose columns are e1, ..., en, and

R = | v1⋅e1   v2⋅e1   ...   vn⋅e1 |
    |   0     v2⋅e2   ...   vn⋅e2 |
    |   ⋮       ⋮             ⋮   |
    |   0       0     ...   vn⋅en |.

Moreover,

v1 ⋅ e1 = d1 ⋅ (d1/∥d1∥) = ∥d1∥ ≠ 0,   v2 ⋅ e2 = (d2 − ξ d1) ⋅ (d2/∥d2∥) = ∥d2∥ ≠ 0, ...,
proving that the matrix R is invertible.

Example

A ≡ | 1  2 |
    | 0  1 |
    | 1  4 |

The columns v1 ≡ (1, 0, 1), v2 ≡ (2, 1, 4) are linearly independent, and hence the factorization A = QR does exist. We start from {v1, v2} and use the Gram-Schmidt algorithm:
d1 ≡ v1,   d2 ≡ v2 + ξ d1,   d2 ⋅ d1 = 0.

This gives
ξ = −(v2 ⋅ d1)/(d1 ⋅ d1) = −3   ⇒   d1 = (1, 0, 1),   d2 = (−1, 1, 1),
implying
e1 = (1/∥d1∥) d1 = (1/√2)(1, 0, 1),   e2 = (1/∥d2∥) d2 = (1/√3)(−1, 1, 1).
Thus, the matrices Q and R are given by

Q = | 1/√2   −1/√3 |
    |  0      1/√3 |
    | 1/√2    1/√3 |,      R = | v1⋅e1   v2⋅e1 | = | √2   3√2 |
                               |   0     v2⋅e2 |   |  0    √3 |.
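The construction from the proof can be sketched in pure Python (illustrative code, checked here on the lecture's 3 × 2 example):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def qr_gram_schmidt(A):
    """A as a list of rows with independent columns; returns (Q, R), Q as rows."""
    m, n = len(A), len(A[0])
    cols = [[A[r][c] for r in range(m)] for c in range(n)]
    es = []
    for v in cols:
        d = list(v)
        for e in es:                       # subtract the components along e1, e2, ...
            c = dot(v, e)
            d = [a - c * b for a, b in zip(d, e)]
        nd = math.sqrt(dot(d, d))
        es.append([a / nd for a in d])
    # R is upper triangular with entries R_ij = v_j ⋅ e_i
    R = [[dot(cols[j], es[i]) if i <= j else 0.0 for j in range(n)]
         for i in range(n)]
    Q = [[es[c][r] for c in range(n)] for r in range(m)]
    return Q, R

Q, R = qr_gram_schmidt([[1.0, 2.0], [0.0, 1.0], [1.0, 4.0]])
print(R)   # ≈ [[√2, 3√2], [0, √3]], as computed above
```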

3. The orthogonal complement

Definition
Let U be a vector subspace of an inner product space V. The subset U⊥ of V consisting of all vectors which are orthogonal to any vector in U,
U⊥ ≡ {v ∈ V | v ⋅ u = 0, ∀ u ∈ U},
is called the orthogonal complement of the subspace U.

Example  In ℝ² we consider the vector subspace
U ≡ Sp[(2, 1)] = {(2α, α) | α ∈ ℝ}.
The orthogonal complement of U is the set
U⊥ = {(x, y) ∈ ℝ² | (x, y) ⋅ (2α, α) = 0, ∀ α ∈ ℝ}
   = {(x, y) ∈ ℝ² | y = −2x}
   = {x(1, −2) | x ∈ ℝ} = Sp[(−1, 2)].

[Figure: the line U spanned by (2, 1) and the perpendicular line U⊥ spanned by (−1, 2), meeting at 90°.]

Proposition
1) U⊥ is a vector subspace of V.
2) U ∩ U⊥ = {0}.
3) If U is finite dimensional and {e1, ..., ek} is a basis of U, then
   v ∈ U⊥  ⟺  v ⋅ ei = 0, ∀ i = 1, ..., k.
4) If U is finite dimensional, then V = U ⊕ U⊥.

Remark  Consequence of 4): any vector v ∈ V has the unique decomposition
v = u + u⊥,   u ∈ U,   u⊥ ∈ U⊥;
u is called the orthogonal projection of v onto U.

Remark  Rule for finding the orthogonal projection u:
If {e1, ..., ek} is a basis of U, then u is known if its coordinates u1 to uk with respect to this basis are known. We have
v = u1 e1 + ... + uk ek + u⊥,
and taking the inner product of v with e1, ..., ek we obtain
(e1⋅e1) u1 + (e2⋅e1) u2 + ... + (ek⋅e1) uk = v ⋅ e1,
⋮                                                      (1)
(e1⋅ek) u1 + (e2⋅ek) u2 + ... + (ek⋅ek) uk = v ⋅ ek.
The Gram determinant corresponding to {e1, ..., ek} is non-zero ⇒ unique solution u1, ..., uk.
If {e1, ..., ek} is an orthonormal basis of U, (1) gives
u = Σ_{i=1}^k (v ⋅ ei) ei.                             (2)

Example  In ℝ²:

[Figure: v decomposed as u ∈ U plus u⊥ ∈ U⊥, the lines U and U⊥ meeting at 90°.]

If e.g. U = Sp[(2, 1)] and v = (5, 6), an orthonormal basis of U is {e}, e ≡ (2/√5, 1/√5). Thus, using (2), the orthogonal projection u of v onto U emerges as
u = (v ⋅ e) e = (32/5, 16/5).
Since u⊥ = v − u, the decomposition of v along U and U⊥ is
v = u + u⊥ = (32/5, 16/5) + (−7/5, 14/5).
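Formula (2) is short enough to code directly. A sketch (pure Python, reproducing the example just computed):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(v, onb):
    """Orthogonal projection of v onto Sp[onb]; onb is assumed orthonormal."""
    u = [0.0] * len(v)
    for e in onb:
        c = dot(v, e)                  # formula (2): u = Σ (v ⋅ e_i) e_i
        u = [a + c * b for a, b in zip(u, e)]
    return u

s = math.sqrt(5.0)
e = [2.0 / s, 1.0 / s]                 # orthonormal basis of U = Sp[(2, 1)]
v = [5.0, 6.0]
u = project(v, [e])
u_perp = [a - b for a, b in zip(v, u)]
print(u, u_perp)                       # ≈ (32/5, 16/5) and (−7/5, 14/5)
```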

Proposition
If u is the orthogonal projection of v ∈ V onto U, then
∥v − u∥ ≤ ∥v − u′∥,   ∀ u′ ∈ U;
moreover, if for some u′ ∈ U we have ∥v − u′∥ = ∥v − u∥, then u′ = u. In other words,
min_{u′∈U} ∥v − u′∥ = ∥v − u∥,   (3)
and this minimum is attained only by the orthogonal projection of v onto U.

Remark  ∥v − u∥ = d(v, u). Reformulation of (3): among all the elements of U, the "closest" one to v ∈ V is the orthogonal projection u of v onto U. Moreover, u is the unique element of U closest to v; it is also called the element in U of best approximation for v.

Example  In the vector space C([0, 2π]) with the inner product
⟨f, g⟩ ≡ ∫_0^{2π} f(x) g(x) dx
we consider the subspace
U ≡ Sp[e0, e1, e2, ..., e_{2n−1}, e_{2n}],
where
e0 ≡ 1/√(2π),   e1 ≡ (sin x)/√π,   e2 ≡ (cos x)/√π, ...,   e_{2n−1} ≡ (sin nx)/√π,   e_{2n} ≡ (cos nx)/√π.
Since {e0, e1, e2, ..., e_{2n−1}, e_{2n}} is an orthonormal basis of U, the element in U of best approximation for f ∈ C([0, 2π]) can be deduced according to (2). One obtains the trigonometric polynomial
P(x) = c0 + c1 sin x + c2 cos x + ... + c_{2n−1} sin nx + c_{2n} cos nx,   (4)
where

c0 = (1/2π) ∫_0^{2π} f(x) dx,   c_{2k−1} = (1/π) ∫_0^{2π} f(x) sin kx dx,
c_{2k} = (1/π) ∫_0^{2π} f(x) cos kx dx,   k = 1, ..., n;
c0, ..., c_{2n} are the Fourier coefficients of f. In C([a, b]) the quantity
∥f − g∥² = ∫_a^b (f(x) − g(x))² dx
is called the quadratic error. So, we have obtained that, among all trigonometric polynomials of degree at most n, P(x) given by (4) is the trigonometric polynomial whose quadratic error with respect to f is minimal.

Exercise  Find min_{a,b ∈ ℝ} ∫_0^1 (eˣ − (a + bx))² dx.
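One way to approach this exercise numerically (a sketch, not the lecture's intended pencil-and-paper solution; the Simpson integrator and step count are my choices) is to project eˣ onto U = Sp[1, x] in C([0, 1]) by solving the normal system (1) for the basis {1, x}:

```python
import math

def integrate(f, a, b, steps=2000):
    """Composite Simpson's rule (steps must be even)."""
    h = (b - a) / steps
    s = f(a) + f(b)
    for k in range(1, steps):
        s += (4 if k % 2 == 1 else 2) * f(a + k * h)
    return s * h / 3

def inner(f, g):
    return integrate(lambda x: f(x) * g(x), 0.0, 1.0)

# Normal system (1) for the basis {1, x} of U and f(x) = e^x:
g11 = inner(lambda x: 1.0, lambda x: 1.0)          # = 1
g12 = inner(lambda x: 1.0, lambda x: x)            # = 1/2
g22 = inner(lambda x: x, lambda x: x)              # = 1/3
r1 = inner(lambda x: 1.0, math.exp)                # = e − 1
r2 = inner(lambda x: x, math.exp)                  # = 1
det = g11 * g22 - g12 * g12
a_opt = (r1 * g22 - r2 * g12) / det                # → 4e − 10 ≈ 0.873
b_opt = (g11 * r2 - g12 * r1) / det                # → 18 − 6e ≈ 1.690
err = integrate(lambda x: (math.exp(x) - (a_opt + b_opt * x)) ** 2, 0.0, 1.0)
print(a_opt, b_opt, err)                           # the minimum is err ≈ 0.0039
```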

Example  (The least squares method)  Let y be a physical quantity with the following dependence on the variables x1, ..., xk:
y = c1 x1 + ... + ck xk.
The parameters c1, ..., ck must be adjusted to best fit the data set from the next table:

 y  | x1   x2   ...  xk
----+-------------------
 y1 | x11  x12  ...  x1k
 y2 | x21  x22  ...  x2k
 ⋮  |  ⋮    ⋮         ⋮
 yn | xn1  xn2  ...  xnk

Usually, n > k, and the overdetermined system
y1 = c1 x11 + ... + ck x1k
⋮                                  (5)
yn = c1 xn1 + ... + ck xnk
is inconsistent (incompatible). Thus, the goal is to find c1, ..., ck

which render as small as possible the sum of squares of the "errors" between the left- and right-hand sides of these equations:
Σ_{i=1}^n (yi − (c1 xi1 + ... + ck xik))²;
(c1, ..., ck) is called a pseudo-solution (in the sense of least squares) of the linear system (5). Defining the vectors y, e1, ..., ek of ℝⁿ by
y ≡ (y1, ..., yn),   e1 ≡ (x11, ..., xn1), ...,   ek ≡ (x1k, ..., xnk),
the least squares requirement can be restated as: find c1, ..., ck which minimize
∥y − (c1 e1 + ... + ck ek)∥.
Equivalently: find u ∈ Sp[e1, ..., ek] ≡ U such that
∥y − u∥ = min_{u′∈U} ∥y − u′∥.
Clearly, u is the orthogonal projection of y onto U. If e1, ..., ek are linearly independent (which is most likely to happen, due to

measurement errors), {e1, ..., ek} is a basis of U, and hence (c1, ..., ck) represents the (unique) solution of the system
(e1⋅e1) c1 + (e2⋅e1) c2 + ... + (ek⋅e1) ck = y ⋅ e1,
⋮                                                      (6)
(e1⋅ek) c1 + (e2⋅ek) c2 + ... + (ek⋅ek) ck = y ⋅ ek.
For the case k = 1, i.e., when the model function is y = cx (a straight line), one registers the data (x1, y1), ..., (xn, yn). With e ≡ (x1, ..., xn), the preceding system reduces to (e ⋅ e) c = y ⋅ e, so that
c = ( Σ_{i=1}^n xi yi ) / ( Σ_{i=1}^n xi² ).

[Figure: the fitted line y = cx through the scattered data points.]
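The k = 1 closed form takes only a few lines; the data values below are made up for illustration (roughly y = 2x with noise):

```python
def fit_line_through_origin(xs, ys):
    """Least-squares slope for the model y = cx:  c = Σ x_i y_i / Σ x_i²."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
c = fit_line_through_origin(xs, ys)
print(c)   # ≈ 1.99
```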

Remark  In matrix form the system (5) emerges as
A c = y,   (5′)
while (6) reads as
AᵀA c = Aᵀ y.   (6′)
We have shown that, if the columns of A are linearly independent, there is a unique pseudo-solution of (5′), which is the (unique) solution of (6′). The system (6′) is called the normal system associated to (5′).

Exercise  Find the pseudo-solution of the linear system
c1 − c2 + 3c3 = 1
2c1 + 3c2 = 3
c2 + c3 = 4
−2c1 + 3c2 + 2c3 = −2
−c1 + 4c2 + 3c3 = 1