Initialization enhancer for non-negative matrix factorization
Zhonglong Zheng, Jie Yang, Yitan Zhu
Engineering Applications of Artificial Intelligence 20 (2007) 101–110
Presenter: Chia-Cheng Chen
Outline
- Introduction
- Non-negative matrix factorization algorithm
- Initializing NMF with different techniques
- Experimental results
- Conclusion
Background (1/2)
Background (2/2)
Introduction
- NMF has been applied to many areas, such as dimensionality reduction, image classification, and image compression.
- However, particular attention must be paid to the initialization of NMF because of its local convergence, although this is often ignored in the literature.
Non-negative matrix factorization algorithm (1/4)
- Non-negative matrix factorization (NMF): given a non-negative m x N matrix V, find non-negative factors W and H such that
    V ≈ W H,  where W is m x r and H is r x N
- Dimensionality reduction is achieved when r < N
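The shapes above can be sketched in NumPy. The 64x64 image size and 400 images echo the SJTU dataset used later in the slides; the rank r = 49 and the variable names are illustrative assumptions:

```python
import numpy as np

# Illustrative sizes (assumptions): m pixels per vectorized image, N images, rank r < N.
m, N, r = 64 * 64, 400, 49

rng = np.random.default_rng(0)
V = rng.random((m, N))   # non-negative data matrix, one image per column
W = rng.random((m, r))   # non-negative basis matrix (m x r)
H = rng.random((r, N))   # non-negative encoding matrix (r x N)

# V has m*N entries, but W and H together store only m*r + r*N.
compression = V.size / (W.size + H.size)
```

With r well below N, the two factors store far fewer numbers than V itself, which is the dimensionality-reduction effect the slide refers to.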
Non-negative matrix factorization algorithm (2/4)
- Euclidean distance objective:
    ||V − WH||² = Σ_ij (V_ij − (WH)_ij)²
- Update rules (multiplicative):
    H_aj ← H_aj (WᵀV)_aj / (WᵀW H)_aj
    W_ia ← W_ia (V Hᵀ)_ia / (W H Hᵀ)_ia
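These are the standard Lee-Seung multiplicative updates for the Euclidean objective; a minimal NumPy sketch (the small `eps` guard against division by zero and the toy matrix sizes are implementation assumptions, not from the paper):

```python
import numpy as np

def nmf_euclidean(V, r, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative updates minimizing ||V - WH||_F^2 (Lee-Seung form)."""
    rng = np.random.default_rng(seed)
    m, N = V.shape
    W = rng.random((m, r))
    H = rng.random((r, N))
    for _ in range(n_iter):
        # H_aj <- H_aj * (W^T V)_aj / (W^T W H)_aj
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W_ia <- W_ia * (V H^T)_ia / (W H H^T)_ia
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy demonstration on a random non-negative matrix (sizes are arbitrary).
V = np.random.default_rng(1).random((20, 30))
W, H = nmf_euclidean(V, r=5)
err = np.linalg.norm(V - W @ H)
```

Because the updates are purely multiplicative, W and H stay non-negative whenever the initialization is non-negative, which is exactly why the choice of starting point matters.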
Non-negative matrix factorization algorithm (3/4)
- KL divergence objective:
    D(V ‖ WH) = Σ_ij (V_ij log(V_ij / (WH)_ij) − V_ij + (WH)_ij)
- Update rules:
    H_aj ← H_aj (Σ_i W_ia V_ij / (WH)_ij) / Σ_k W_ka
    W_ia ← W_ia (Σ_j H_aj V_ij / (WH)_ij) / Σ_k H_ak
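The KL-divergence variant of the multiplicative updates can be sketched the same way (again, the `eps` guards and toy sizes are assumptions):

```python
import numpy as np

def nmf_kl(V, r, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative updates minimizing the generalized KL divergence D(V || WH)."""
    rng = np.random.default_rng(seed)
    m, N = V.shape
    W = rng.random((m, r))
    H = rng.random((r, N))
    for _ in range(n_iter):
        # H_aj <- H_aj * (sum_i W_ia V_ij / (WH)_ij) / sum_k W_ka
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        # W_ia <- W_ia * (sum_j H_aj V_ij / (WH)_ij) / sum_k H_ak
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H

def kl_div(V, A, eps=1e-12):
    """Generalized KL divergence sum_ij (V log(V/A) - V + A)."""
    return float(np.sum(V * np.log((V + eps) / (A + eps)) - V + A))

# Toy demonstration on strictly positive data (sizes are arbitrary).
V = np.random.default_rng(1).random((20, 30)) + 0.1
W, H = nmf_kl(V, r=5)
```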
Non-negative matrix factorization algorithm (4/4)
- SJTU face database
  - 400 images
  - Size: 64 x 64
Initializing NMF with different techniques (1/5)
- Three techniques:
  - PCA-based initialization
  - Clustering-based initialization
  - Gabor-based initialization
Initializing NMF with different techniques (2/5)
- PCA-based initialization
  - Start from the m x N matrix X
  - Use SVD to compute the eigenvectors and eigenvalues
Initializing NMF with different techniques (3/5)
- PCA-based initialization (continued)
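The slides do not spell out how the SVD factors are made non-negative; one common sketch rectifies the leading singular vectors by taking absolute values. That rectification step is an assumption here, and the paper's exact scheme may differ:

```python
import numpy as np

def pca_init(X, r):
    """Sketch of a PCA/SVD-based NMF initialization.

    The leading r singular vectors of X capture most of its variance; since
    NMF factors must be non-negative, this sketch rectifies them by taking
    absolute values (an assumption, not necessarily the paper's scheme).
    """
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    W0 = np.abs(U[:, :r]) * S[:r]   # m x r initial basis, scaled by singular values
    H0 = np.abs(Vt[:r, :])          # r x N initial encoding
    return W0, H0
```

The resulting W0, H0 are non-negative and already capture the dominant structure of X, so the multiplicative updates start much closer to a good solution than a random draw would.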
Initializing NMF with different techniques (4/5)
- Clustering-based initialization (fuzzy c-means)
  - Membership matrix U = [u_ij], with Σ_i u_ij = 1 for each sample j
  - Objective function: J_m = Σ_i Σ_j u_ij^m ||x_j − c_i||²
  - Update rules:
      c_i = Σ_j u_ij^m x_j / Σ_j u_ij^m
      u_ij = 1 / Σ_k (||x_j − c_i|| / ||x_j − c_k||)^(2/(m−1))
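The fuzzy c-means iteration above can be sketched as follows (parameter choices such as the fuzzifier m = 2 and the iteration count are conventional assumptions):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, eps=1e-9, seed=0):
    """Fuzzy c-means on X (n_samples x n_features) with c clusters, fuzzifier m > 1."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, X.shape[0]))
    U /= U.sum(axis=0)                                   # each membership column sums to 1
    for _ in range(n_iter):
        Um = U ** m
        # c_i = sum_j u_ij^m x_j / sum_j u_ij^m
        C = (Um @ X) / (Um.sum(axis=1)[:, None] + eps)
        # d_ij = ||x_j - c_i||
        d = np.linalg.norm(X[None, :, :] - C[:, None, :], axis=2) + eps
        # u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
        U = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)), axis=1)
    return U, C
```

The cluster centres can then seed the NMF basis matrix W, while the membership matrix can seed H, so the factorization starts from a data-driven partition rather than noise.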
Initializing NMF with different techniques (5/5)
- Gabor-based initialization
  - Gabor kernels:
      ψ_{μ,ν}(z) = (||k_{μ,ν}||² / σ²) exp(−||k_{μ,ν}||² ||z||² / (2σ²)) [exp(i k_{μ,ν}·z) − exp(−σ²/2)]
    where k_{μ,ν} = k_ν e^{iφ_μ}, k_ν = k_max / f^ν, φ_μ = πμ/8
  - Gabor features: convolutions of the image with the Gabor kernels over the scales ν and orientations μ
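A sketch of generating one Gabor kernel. The defaults (k_max = π/2, f = √2, σ = 2π, eight orientations) follow common face-recognition practice and are assumptions, not values stated on the slide:

```python
import numpy as np

def gabor_kernel(mu, nu, size=31, kmax=np.pi / 2, f=np.sqrt(2), sigma=2 * np.pi):
    """One Gabor kernel psi_{mu,nu} sampled on a size x size grid.

    k_{mu,nu} = (kmax / f**nu) * exp(i * pi * mu / 8)   # orientation phi = pi*mu/8
    psi(z) = (|k|^2 / sigma^2) * exp(-|k|^2 |z|^2 / (2 sigma^2))
             * (exp(i k . z) - exp(-sigma^2 / 2))
    Parameter defaults are conventional choices (assumptions).
    """
    k = (kmax / f ** nu) * np.exp(1j * np.pi * mu / 8)
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    z2 = x ** 2 + y ** 2
    kn2 = np.abs(k) ** 2
    # Oscillatory carrier minus the DC term exp(-sigma^2/2), which removes the mean response.
    wave = np.exp(1j * (k.real * x + k.imag * y)) - np.exp(-sigma ** 2 / 2)
    return (kn2 / sigma ** 2) * np.exp(-kn2 * z2 / (2 * sigma ** 2)) * wave
```

Convolving an image with a bank of such kernels (several scales ν and orientations μ) yields the Gabor features the slide mentions, which can then be used to seed the NMF factors.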
Experimental results (1/2)
Experimental results (2/2)
Conclusion
- Non-negative matrix factorization is a useful tool for analyzing a diverse range of data.
- Researchers often resort to random initialization when applying NMF.
- In fact, random initialization can make experiments unrepeatable because of NMF's local-minima property, a sensitivity similar to that of neural networks.