Slides from RecSys 2010 presentation. Context has been recognized as an important factor to consider in personalized Recommender Systems. However, most model-based Collaborative Filtering approaches, such as Matrix Factorization, do not provide a straightforward way of integrating context information into the model. In this work, we introduce a Collaborative Filtering method based on Tensor Factorization, a generalization of Matrix Factorization that allows for a flexible and generic integration of contextual information by modeling the data as a User-Item-Context N-dimensional tensor instead of the traditional 2D User-Item matrix. In the proposed model, called Multiverse Recommendation, different types of context are considered as additional dimensions in the representation of the data as a tensor.
Multiverse Recommendation: N-dimensional Tensor Factorization for Context-aware Collaborative Filtering
We then iteratively update the parameter matrices and core tensor using the following update rules:

\[
\begin{aligned}
U^{t+1}_{i*} &= U^{t}_{i*} - \eta\,\partial_U\, l(F_{ijk}, Y_{ijk}) - \eta \lambda_U U_{i*} \\
M^{t+1}_{j*} &= M^{t}_{j*} - \eta\,\partial_M\, l(F_{ijk}, Y_{ijk}) - \eta \lambda_M M_{j*} \\
C^{t+1}_{k*} &= C^{t}_{k*} - \eta\,\partial_C\, l(F_{ijk}, Y_{ijk}) - \eta \lambda_C C_{k*} \\
S^{t+1} &= S^{t} - \eta\,\partial_S\, l(F_{ijk}, Y_{ijk}) - \eta \lambda_S S
\end{aligned}
\]

where η is the learning rate.
Optimization - Stochastic Gradient Descent for TF

[Figure: animated slide sequence illustrating SGD updates on the Users × Movies × Context rating tensor and the factors U, M, C and core tensor S]
Experimental evaluation
We evaluate our model on contextual rating data, computing the Mean Absolute Error (MAE) using 5-fold cross validation, defined as follows:

\[
\mathrm{MAE} = \frac{1}{K} \sum_{ijk}^{n,m,c} D_{ijk}\, \lvert Y_{ijk} - F_{ijk} \rvert
\]
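The metric can be computed directly from the rating, prediction, and indicator tensors. In the sketch below, D_ijk marks the held-out test ratings and K counts them; the tensor sizes and values are invented for illustration:

```python
import numpy as np

# Toy rating tensor Y, prediction tensor F, and indicator tensor D over a
# 2-user x 2-item x 2-context grid; all values invented for illustration
Y = np.zeros((2, 2, 2))
F = np.zeros((2, 2, 2))
D = np.zeros((2, 2, 2))
Y[0, 1, 0], F[0, 1, 0], D[0, 1, 0] = 4.0, 3.5, 1   # off by 0.5
Y[1, 0, 1], F[1, 0, 1], D[1, 0, 1] = 2.0, 3.0, 1   # off by 1.0

K = int(D.sum())                         # number of held-out test ratings
mae = (D * np.abs(Y - F)).sum() / K      # (1/K) * sum_ijk D_ijk |Y_ijk - F_ijk|
# mae == 0.75
```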
Data
Data set   Users   Movies   Context Dim.   Ratings   Scale
Yahoo!     7642    11915    2              221K      1-5
Adom.      84      192      5              1464      1-13
Food       212     20       2              6360      1-5

Table: Data set statistics
Context Aware Methods
Pre-filtering based approach (G. Adomavicius et al.): computes recommendations using only the ratings made in the same context as the target one.
Item splitting method (L. Baltrunas, F. Ricci): identifies items which have significant differences in their ratings under different context situations.
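The pre-filtering idea can be shown in a minimal sketch: restrict the training set to ratings made in the target context, then recommend as usual. The rating tuples and context labels below are invented for illustration:

```python
# Each rating is a (user, item, context, rating) tuple; values invented
ratings = [
    (0, 0, "weekend", 5.0),
    (0, 1, "weekday", 3.0),
    (1, 0, "weekend", 4.0),
]

def prefilter(ratings, target_context):
    """Exact pre-filtering: keep only ratings made in the target context."""
    return [r for r in ratings if r[2] == target_context]

# Recommending for a weekend context uses only the weekend ratings
weekend_only = prefilter(ratings, "weekend")
# weekend_only == [(0, 0, "weekend", 5.0), (1, 0, "weekend", 4.0)]
```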
Results: Context vs. No Context

[Figure: MAE bar plots comparing No Context vs. Tensor Factorization; (a) Adom data (MAE range 1.9-2.5), (b) Food data (MAE range 0.80-0.95)]
Figure: Comparison of matrix (no context) and tensor (context) factorization on the Adom and Food data.
Yahoo Artificial Data

[Figure: MAE panels for α=0.1, α=0.5, and α=0.9; methods: No Context, Reduction, Item-Split, Tensor Factorization]
Figure: Comparison of context-aware methods on the Yahoo! artificial data.
Figure: Evolution of MAE values for different methods with increasing influence of the context variable on the Yahoo! data.
Tensor Factorization

[Figure: MAE bar plot (range 1.9-2.5) for Reduction, Item-Split, and Tensor Factorization]
Figure: Comparison of context-aware methods on the Adom data.
Tensor Factorization

[Figure: MAE bar plot (range 0.80-0.95) for No Context, Reduction, and Tensor Factorization]
Figure: Comparison of context-aware methods on the Food data.
Conclusions
- Tensor Factorization methods seem to be promising for CARS
- Many different TF methods exist
- Future work: extend to implicit taste data
- Tensor representation of context data seems promising