
Principal Component Analysis. Presented by: Sabbir Ahmed, Roll: FH-227.

Jan 11, 2016

Transcript
Page 1: Principal Component Analysis

Presented by: Sabbir Ahmed
Roll: FH-227

Page 2: Overview

Variance and Covariance
Eigenvector and Eigenvalue
Principal Component Analysis
Application of PCA in Image Processing

Page 3: Variance and Covariance (1/2)

The variance is a measure of how far a set of numbers is spread out. The equation of variance is

$$\operatorname{var}(x) = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(x_i - \bar{x})}{n-1}$$
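A minimal sketch of this formula in plain Python (the helper function is mine, not from the slides; the sample values are the X column of the example dataset on page 16):

```python
def variance(x):
    """Sample variance: sum of (x_i - mean)^2 over i = 1..n, divided by (n - 1)."""
    n = len(x)
    mean = sum(x) / n
    return sum((xi - mean) * (xi - mean) for xi in x) / (n - 1)

print(variance([2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1]))  # ~0.6166
```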

Page 4: Variance and Covariance (2/2)

Covariance is a measure of how much two random variables change together. The equation of covariance is

$$\operatorname{cov}(x, y) = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{n-1}$$
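The covariance formula translates the same way; again a sketch with a helper of my own, applied to the X and Y columns of the example dataset on page 16:

```python
def covariance(x, y):
    """Sample covariance: sum of (x_i - x_mean) * (y_i - y_mean), divided by (n - 1)."""
    n = len(x)
    x_mean, y_mean = sum(x) / n, sum(y) / n
    return sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y)) / (n - 1)

x = [2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1]
y = [2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9]
print(covariance(x, y))  # ~0.6154
```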

Page 5: Covariance Matrix

The covariance matrix is an n×n matrix where each element is defined as

$$M_{ij} = \operatorname{cov}(i, j)$$

A covariance matrix over a 2-dimensional dataset is

$$M = \begin{pmatrix} \operatorname{cov}(x, x) & \operatorname{cov}(x, y) \\ \operatorname{cov}(y, x) & \operatorname{cov}(y, y) \end{pmatrix}$$
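In practice the whole matrix can be computed at once; a sketch using NumPy (numpy.cov with rowvar=False treats columns as variables and uses the same n-1 denominator), on the example dataset from page 16:

```python
import numpy as np

data = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
                 [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9]])

M = np.cov(data, rowvar=False)  # [[cov(x,x), cov(x,y)], [cov(y,x), cov(y,y)]]
print(M)                        # ~[[0.6166, 0.6154], [0.6154, 0.7166]]
```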

Page 6: Eigenvector

The eigenvectors of a square matrix A are the non-zero vectors x that, after being multiplied by the matrix, remain parallel to the original vector.

Page 7: Eigenvalue

For each eigenvector, the corresponding eigenvalue is the factor by which the eigenvector is scaled when multiplied by the matrix.

Page 8: Eigenvector and Eigenvalue (1/2)

The vector x is an eigenvector of the matrix A with eigenvalue λ (lambda) if the following equation holds:

$$Ax = \lambda x, \quad \text{or} \quad Ax - \lambda x = 0, \quad \text{or} \quad (A - \lambda I)x = 0$$

Page 9: Eigenvector and Eigenvalue (2/2)

Calculating eigenvalues:

$$\lvert A - \lambda I \rvert = 0$$

Calculating eigenvectors:

$$(A - \lambda I)x = 0$$
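NumPy solves both of these at once; a minimal sketch (the 2×2 matrix is my own illustration, not from the slides), where numpy.linalg.eig returns the eigenvalues and the unit-length eigenvectors as columns:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # illustrative square matrix

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                    # e.g. [3. 1.] (order is not guaranteed)
print(eigenvectors)                   # columns are the normalized eigenvectors

# Check A x = lambda x for the first eigenpair
x, lam = eigenvectors[:, 0], eigenvalues[0]
print(np.allclose(A @ x, lam * x))    # True
```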

Page 10: Principal Component Analysis (1/3)

PCA (Principal Component Analysis) is defined as an orthogonal linear transformation that transforms the data to a new coordinate system such that the greatest variance comes to lie on the first coordinate, the second greatest variance on the second coordinate, and so on.
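For comparison only, the same transformation is available off the shelf; a minimal sketch assuming scikit-learn is installed (the slides do not use it), applied to the example dataset introduced on page 16:

```python
import numpy as np
from sklearn.decomposition import PCA  # assumption: scikit-learn is available

data = np.array([[2.5, 2.4], [0.5, 0.7], [2.2, 2.9], [1.9, 2.2], [3.1, 3.0],
                 [2.3, 2.7], [2.0, 1.6], [1.0, 1.1], [1.5, 1.6], [1.1, 0.9]])

pca = PCA(n_components=2)
scores = pca.fit_transform(data)   # the data is mean-centred internally
print(pca.explained_variance_)     # variances along the new coordinates (the eigenvalues)
print(scores[:3])                  # first rows of the transformed data (signs may be flipped)
```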

Page 11: Principal Component Analysis (2/3)

Page 12: Principal Component Analysis (3/3)

Page 13: Principal Component

Each coordinate in Principal Component Analysis is called a principal component.

Ci = bi1 (x1) + bi2 (x2) + … + bin (xn)

where Ci is the ith principal component, bij is the regression coefficient for observed variable j for principal component i, and x1, …, xn are the observed variables/dimensions.
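As a concrete instance of this formula, using numbers that appear later in the deck (the first eigenvector from page 18 as the coefficients, and the first observation and column means of the example dataset from page 16), applied after mean adjustment:

```python
import numpy as np

b_1 = np.array([-0.677873399, -0.735178656])   # loadings b_11, b_12 of the first principal component
x = np.array([2.5, 2.4])                        # first observation of the example dataset
x_adjusted = x - np.array([1.81, 1.91])         # subtract the column means (step 1 of the method)

C_1 = b_1[0] * x_adjusted[0] + b_1[1] * x_adjusted[1]   # C_i = b_i1*x1 + b_i2*x2
print(C_1)  # ~ -0.828, the first entry of the final data on page 20
```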

Page 14: Eigenvector and Principal Component

It turns out that the eigenvectors of the covariance matrix of the data set are the principal components of the data set.

The eigenvector with the highest eigenvalue is the first principal component, the eigenvector with the second-highest eigenvalue is the second principal component, and so on.

Page 15: Steps to find Principal Components

1. Adjust the dataset to a zero-mean dataset.
2. Find the covariance matrix M.
3. Calculate the normalized eigenvectors and eigenvalues of M.
4. Sort the eigenvectors according to their eigenvalues, from highest to lowest.
5. Form the feature vector F using the transpose of the eigenvectors.
6. Multiply the transposed dataset with F.
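A sketch of these six steps with NumPy (the function and variable names are mine; the slides give no code):

```python
import numpy as np

def pca(dataset):
    # 1. Adjust the dataset to a zero-mean dataset
    mean = dataset.mean(axis=0)
    adjusted = dataset - mean
    # 2. Find the covariance matrix M
    M = np.cov(adjusted, rowvar=False)
    # 3. Calculate the normalized eigenvectors and eigenvalues of M
    eigenvalues, eigenvectors = np.linalg.eig(M)
    # 4. Sort the eigenvectors according to eigenvalues, highest to lowest
    order = np.argsort(eigenvalues)[::-1]
    eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]
    # 5. Form the feature vector F using the transpose of the eigenvectors
    F = eigenvectors.T
    # 6. Multiply the transposed dataset with F
    final_data = F @ adjusted.T
    return final_data, F, mean
```

Applied to the example dataset on the next page, this reproduces the final data of page 20, possibly with some rows sign-flipped, since eigenvectors are only determined up to sign.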

Page 16: Example

Original Data      Adjusted Dataset
  X     Y             X      Y
 2.5   2.4           0.69   0.49
 0.5   0.7          -1.31  -1.21
 2.2   2.9           0.39   0.99
 1.9   2.2           0.09   0.29
 3.1   3.0           1.29   1.09
 2.3   2.7           0.49   0.79
 2.0   1.6           0.19  -0.31
 1.0   1.1          -0.81  -0.81
 1.5   1.6          -0.31  -0.31
 1.1   0.9          -0.71  -1.01

AdjustedDataSet = OriginalDataSet - Mean
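The figures on the following pages can be checked directly; a short sketch that recomputes the covariance matrix and its eigendecomposition for this adjusted dataset:

```python
import numpy as np

adjusted = np.array([[ 0.69,  0.49], [-1.31, -1.21], [ 0.39,  0.99], [ 0.09,  0.29],
                     [ 1.29,  1.09], [ 0.49,  0.79], [ 0.19, -0.31], [-0.81, -0.81],
                     [-0.31, -0.31], [-0.71, -1.01]])

M = np.cov(adjusted, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eig(M)
print(M)             # ~[[0.6166, 0.6154], [0.6154, 0.7166]]  (page 17)
print(eigenvalues)   # ~0.0491 and ~1.2840, in some order      (page 18)
print(eigenvectors)  # columns ~ the eigenvectors on page 18 (possibly with opposite signs)
```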

Page 17: Covariance Matrix

$$M = \begin{pmatrix} 0.616555556 & 0.615444444 \\ 0.615444444 & 0.716555556 \end{pmatrix}$$

Page 18: Eigenvalues and Eigenvectors

The eigenvalues of the matrix M are

$$\text{eigenvalues} = \begin{pmatrix} 0.0490833989 \\ 1.28402771 \end{pmatrix}$$

The normalized eigenvectors with the corresponding eigenvalues (one eigenvector per column, in the same order) are

$$\text{eigenvectors} = \begin{pmatrix} -0.735178656 & -0.677873399 \\ 0.677873399 & -0.735178656 \end{pmatrix}$$

Page 19: Feature Vector

Sorted eigenvectors (the eigenvector with the highest eigenvalue first):

$$\begin{pmatrix} -0.677873399 & -0.735178656 \\ -0.735178656 & 0.677873399 \end{pmatrix}$$

Feature vector F (the transpose of the sorted eigenvectors) and its transpose F^T, which are identical here because the matrix is symmetric:

$$F = F^{T} = \begin{pmatrix} -0.677873399 & -0.735178656 \\ -0.735178656 & 0.677873399 \end{pmatrix}$$

Page 20: Final Data (1/2)

       X               Y
-0.827970186    -0.175115307
 1.77758033      0.142857227
-0.992197494     0.384374989
-0.274210416     0.130417207
-1.67580142     -0.209498461
-0.912949103     0.175282444
 0.0991094375   -0.349824698
 1.14457216      0.0464172582
 0.438046137     0.0177646297
 1.22382056     -0.162675287

FinalData = F x AdjustedDataSetTransposed


Page 21: Final Data (2/2)

Keeping only the first principal component (the feature vector F now contains only the first eigenvector), the final data reduces to a single dimension:

FinalData = F x AdjustedDataSetTransposed

       X
-0.827970186
 1.77758033
-0.992197494
-0.274210416
-1.67580142
-0.912949103
 0.0991094375
 1.14457216
 0.438046137
 1.22382056

Page 22: Retrieving Original Data (1/2)

FinalData = F x AdjustedDataSetTransposed

AdjustedDataSetTransposed = F^-1 x FinalData, but F^-1 = F^T (the eigenvectors are orthonormal)

So, AdjustedDataSetTransposed = F^T x FinalData

and, OriginalDataSet = AdjustedDataSet + Mean
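The same recovery written out with NumPy, continuing from the pca sketch given after page 15 (a sketch under those assumptions, not the author's code):

```python
import numpy as np

def reconstruct(final_data, F, mean):
    # AdjustedDataSetTransposed = F^T x FinalData  (F^-1 = F^T, since the eigenvectors are orthonormal)
    adjusted_transposed = F.T @ final_data
    # OriginalDataSet = AdjustedDataSet + Mean
    return adjusted_transposed.T + mean
```

If F kept all the eigenvectors, the original data is recovered exactly; if some components were dropped, the result is only an approximation of the original dataset.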

Page 23: Retrieving Original Data (2/2)

Page 24: Application of PCA in Image Processing

Pattern Recognition
Image Compression
Determination of Object Orientation and Rotation

Page 25: Question

?

Page 26: References

Principal Component Analysis, Wikipedia. http://en.wikipedia.org/wiki/Principal_component_analysis

A Tutorial on Principal Components Analysis, Lindsay I Smith. http://www.sccg.sk/~haladova/principal_components.pdf

Principal Component Analysis in Image Processing, M. Mudrová, A. Procházka. http://dsp.vscht.cz/konference_matlab/matlab05/prispevky/mudrova/mudrova.pdf