ADVANCED ANALYTICS INSTITUTE
UNIVERSITY OF TECHNOLOGY SYDNEY
feit.uts.edu.au
dsmi.tech
Knowledge Graph Embedding and Applications
Zili Zhou, Qian Li, Guandong Xu, Shaowu Liu
School of Software & Advanced Analytics Institute
2 Knowledge Graph Embedding and Applications 15/4/2019
Agenda
Concepts of Knowledge Graph (KG)
Knowledge Graph Completion
Knowledge Graph Embedding Methods
Application of Knowledge Graph
Conclusion and open research questions
Knowledge Graph
• A knowledge graph (KG) is a large-scale semantic network consisting of entities/concepts and the semantic relationships among them; it can be considered a concise version of the Semantic Web.
KG advantage: Semantically dense
• Quantitatively sparse item A: a GitHub repository selected by only a few users. It is hard to make recommendations for it based on such a limited number of historical user-item selection records.
• However, item A is linked to the entity “TensorFlow”, and “TensorFlow” carries dense semantic information in the KG.
• Through this KG connection, another item B can be linked to item A, so we may also recommend item A to a user who selected item B in the historical user-item selection records.
KG advantage: High quality
• Structured data
• Unified format
Unstructured → Semi-structured → Structured (RDF triples)
Subject | Predicate | Object
TensorFlow | Developer | Google Brain
TensorFlow | WrittenIn | Python
Google Brain | Location | Mountain View, California
Python | Developer | Python Software Foundation
…
KG challenges: multi-relational network
• Multiple types of edges
• Large number of edge types
KG challenges: Semantic Inference
(a, r1, b) => (c, r1, d)
KG-related models need to support such knowledge semantic inference.
Knowledge Graph Completion
Knowledge graph completion (KGC)
• Given a particular relation 𝑟, for any entity pair (ℎ, 𝑡) such that (ℎ, 𝑟, 𝑡) ∉ 𝑂, judge whether ℎ and 𝑡 should be linked by 𝑟 or not.
• Infer missing facts from existing facts and external information.
• Knowledge graph embedding is proposed to encode the KG into a low-dimensional vector space.
• Translational distance models exploit distance-based scoring functions that measure the plausibility of a fact as the distance between the two entities.
Knowledge Graph Embedding
Wang, Quan, et al. TKDE 2017
TransE
• Entities and relations are embedded into the same vector space.
• Relation r is considered a translation from entity h to entity t.
• Learning assumption: h + r = t
Bordes, Antoine, et al. "Translating embeddings for modeling multi-relational data." Advances in neural information processing systems. 2013.
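The translation assumption above corresponds directly to a distance-based scoring function. A minimal numpy sketch (the dimension and vectors are illustrative, not from the slides):

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE plausibility score: negative distance ||h + r - t||.
    Scores closer to 0 mean the triple (h, r, t) is more plausible."""
    return -np.linalg.norm(h + r - t, ord=norm)

# A perfect translation h + r = t gets the best possible score, 0.
h = np.array([0.1, 0.2, 0.3])
r = np.array([0.4, 0.1, -0.2])
assert transe_score(h, r, h + r) == 0.0
assert transe_score(h, r, np.zeros(3)) < 0.0   # implausible triple scores worse
```

During training, embeddings are adjusted so observed triples score higher (closer to 0) than corrupted ones.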
TransH: From original space to hyperplane
• TransH enables different roles of an entity in different relations.
• Entities h and t are projected onto the relation-specific hyperplane of r.
• New links are then predicted based on translation on the hyperplane.
Wang, Zhen, et al. "Knowledge graph embedding by translating on hyperplanes." Twenty-Eighth AAAI conference on artificial intelligence. 2014.
TransR
• TransR follows a similar idea to TransH.
• Entities h and t are projected into the relation-specific subspace of r.
• New links are predicted based on translation in that subspace.
Lin, Yankai, et al. "Learning entity and relation embeddings for knowledge graph completion." Twenty-ninth AAAI conference on artificial intelligence. 2015.
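The two projection steps in TransH and TransR can be sketched side by side; in this hedged numpy sketch, `w` stands for the hyperplane normal and `M` for the relation-specific projection matrix (names are illustrative):

```python
import numpy as np

def transh_project(e, w):
    """TransH: project entity e onto the hyperplane with normal vector w."""
    w = w / np.linalg.norm(w)
    return e - (w @ e) * w

def transr_project(e, M):
    """TransR: map entity e into the relation-specific space via matrix M_r."""
    return M @ e

def translation_score(h_p, r, t_p):
    """Same translational score as TransE, applied to the projected entities."""
    return -np.linalg.norm(h_p + r - t_p, ord=1)

# The projected entity lies in the hyperplane (orthogonal to its normal).
e = np.array([1.0, 2.0, 3.0])
w = np.array([0.0, 0.0, 1.0])
assert np.allclose(transh_project(e, w), [1.0, 2.0, 0.0])
```

The difference is only in the projection: a rank-one deflation for TransH versus a full linear map for TransR.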
Semantic Matching Model:
• Exploits similarity-based scoring functions that measure the plausibility of facts by matching the latent semantics of entities and relations embodied in their vector representations.
• Two types:
  • Matrix-operation based: represent a relation as a matrix that models pairwise interactions between latent factors, and produce the score function by operations on that matrix. (RESCAL, DistMult, HolE)
  • Neural-network based: conduct semantic matching using neural network architectures and output the score from the final layer. (SME, NTN, MLP, NAM)
Semantic Matching Model:
• Matrix operation based (RESCAL, DistMult, HolE)
Wang, Quan, et al. TKDE 2017
RESCAL
• Vectors for entities, matrices for relations.
• The semantic matching score is computed as the product of the h, r, t embeddings (h^T M_r t).
• New links are predicted based on semantic matching.
Nickel, Maximilian, Volker Tresp, and Hans-Peter Kriegel. "A Three-Way Model for Collective Learning on Multi-Relational Data." ICML. Vol. 11. 2011.
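The bilinear score h^T M_r t can be sketched in a couple of lines; the 2-dimensional example below is illustrative:

```python
import numpy as np

def rescal_score(h, Mr, t):
    """RESCAL bilinear score h^T M_r t: a full d x d matrix M_r per relation
    models pairwise interactions between the latent factors of h and t."""
    return float(h @ Mr @ t)

h = np.array([1.0, 0.0])
t = np.array([0.0, 1.0])
Mr = np.array([[0.0, 1.0],
               [0.0, 0.0]])   # interaction only from factor 1 of h to factor 2 of t
assert rescal_score(h, Mr, t) == 1.0
```

Because M_r is a full matrix, the score is generally asymmetric in h and t, which lets RESCAL model directed relations.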
DistMult
• Vectors for entities and relations.
• Simplifies RESCAL by using a diagonal matrix diag(r) instead of M_r.
• New links are predicted based on semantic matching.
Yang, Bishan, et al. "Embedding entities and relations for learning and inference in knowledge bases." arXiv preprint arXiv:1412.6575 (2014).
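The diagonal simplification can be sketched as follows (vectors are illustrative):

```python
import numpy as np

def distmult_score(h, r, t):
    """DistMult: RESCAL with M_r restricted to diag(r),
    so the score reduces to sum_i h_i * r_i * t_i."""
    return float(np.sum(h * r * t))

h = np.array([1.0, 2.0])
r = np.array([0.5, 0.5])
t = np.array([2.0, 1.0])
# The diagonal restriction makes the score symmetric in h and t,
# so DistMult cannot distinguish (h, r, t) from (t, r, h).
assert distmult_score(h, r, t) == distmult_score(t, r, h)
```

This symmetry is the well-known trade-off for DistMult's efficiency, and one motivation for HolE below.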
HolE
• Vectors for entities and relations.
• Combines vectors h and t with a circular correlation operation, then takes the product with r to calculate the semantic matching score.
• The expressive power of RESCAL with the efficiency of DistMult.
Nickel, Maximilian, Lorenzo Rosasco, and Tomaso Poggio. "Holographic embeddings of knowledge graphs." Thirtieth AAAI Conference on Artificial Intelligence. 2016.
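Assuming the usual FFT formulation of circular correlation, the HolE score can be sketched as:

```python
import numpy as np

def circular_correlation(h, t):
    """[h * t]_k = sum_i h_i * t_{(i + k) mod d}, computed via FFT in O(d log d)."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(h)) * np.fft.fft(t)))

def hole_score(h, r, t):
    """HolE: match the compressed pair representation h * t against r."""
    return float(r @ circular_correlation(h, t))

# Cross-check the FFT shortcut against the naive O(d^2) definition.
rng = np.random.default_rng(0)
h, t = rng.normal(size=8), rng.normal(size=8)
naive = np.array([sum(h[i] * t[(i + k) % 8] for i in range(8)) for k in range(8)])
assert np.allclose(circular_correlation(h, t), naive)
```

Circular correlation compresses the d × d outer product of h and t into a d-dimensional vector, which is how HolE keeps DistMult-like cost while remaining asymmetric like RESCAL.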
Semantic Matching Model:
• Neural network based (SME, NTN, MLP, NAM)
Wang, Quan, et al. TKDE 2017
SME
• Semantic matching via a neural network.
• In the middle layer, SME combines the representations of the (h, r) pair and the (t, r) pair respectively.
• The two middle-layer representations are multiplied to produce the final output.
Bordes, Antoine, et al. "A semantic matching energy function for learning with multi-relational data." Machine Learning 94.2 (2014): 233-259.
NTN
• Vectors for entities h and t form the input layer; relation r is represented as the parameters of the middle layer.
• In the middle layer, entities h and t are combined by a relation-specific tensor for r.
• Two bias matrices for relation r are also added to the middle layer.
Socher, Richard, et al. "Reasoning with neural tensor networks for knowledge base completion." Advances in neural information processing systems. 2013.
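Following the cited paper's formulation, the NTN score u^T tanh(h^T W[1:k] t + V [h; t] + b) can be sketched as (shapes are illustrative):

```python
import numpy as np

def ntn_score(h, t, W, V, b, u):
    """Neural Tensor Network score u^T tanh(h^T W[1:k] t + V [h; t] + b).
    W: (k, d, d) relation-specific tensor, V: (k, 2d) bias matrix,
    b: (k,) bias vector, u: (k,) output weights."""
    bilinear = np.einsum('i,kij,j->k', h, W, t)   # one h^T W_slice t per tensor slice
    linear = V @ np.concatenate([h, t]) + b
    return float(u @ np.tanh(bilinear + linear))

d, k = 4, 3
rng = np.random.default_rng(1)
s = ntn_score(rng.normal(size=d), rng.normal(size=d),
              rng.normal(size=(k, d, d)), rng.normal(size=(k, 2 * d)),
              rng.normal(size=k), rng.normal(size=k))
assert np.isfinite(s)
```

Each tensor slice contributes one bilinear interaction term; the linear term V [h; t] + b corresponds to the bias matrices mentioned above.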
MLP
• Three input vectors h, r, t; three parameter matrices project the input vectors into a non-linear middle layer.
• The network output is then computed by a linear output layer.
Dong, Xin, et al. "Knowledge vault: A web-scale approach to probabilistic knowledge fusion." Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2014.
NAM
• A deep neural network architecture.
• Vectors of entity h and relation r form the input layer; ReLU layers are used to combine h and r.
• The interaction of the network output with the embedding of entity t gives the semantic matching score.
Liu, Quan, et al. "Probabilistic reasoning via deep learning: Neural association models." arXiv preprint arXiv:1603.07704 (2016).
Entity Embedding
• Heavy reliance on entity embeddings
• Almost all current embedding-based KG completion models need to learn an embedding for every entity.
Sparse KG
• Poor performance on sparse KGs
• Because of this reliance on entity embeddings, current KG completion models do not perform well on sparse KGs.
Computing Efficiency
• Low computational efficiency
• In most current KGs the number of entities is far larger than the number of relations, so entity embeddings can make parameter optimisation inefficient.
[Figure: a huge number of entity embeddings, spanning frequent to infrequent entities.]
Challenges in KG completion
On Completing Sparse Knowledge Graph with Transitive Relation Embedding (AAAI2019)
Triangle patterns observed in the KG:
• (Barack Obama, Born in, Honolulu), (Honolulu, City of, USA), (Barack Obama, Citizen of, USA)
• (Yao Ming, Born in, Shanghai), (Shanghai, City of, China), (Yao Ming, Citizen of, China)
• Generalised: (ent1, Born in, ent2), (ent2, City of, ent3) ⇒ (ent1, Citizen of, ent3)

Observe triangle patterns in the KG to represent transitive relation inference, and conclude transitive relation inference rules based on those triangle patterns.
Motivation: Knowledge Graph completion
• Previous embedding approaches look at the global structure of the entire KB; they are biased towards frequent entities and fail for infrequent ones.
• Local structure can be used to alleviate the sparsity problem.
• Analogous to matrix factorisation vs. k-nearest-neighbour: global vs. local structure.
• Limitation of prior work: favours frequent entities.
• Solution: learn embeddings of relations and predict missing relations.
• Difference: no need to learn embeddings of entities.
• Advantage: better interpretability; efficient learning.
• Idea: triangle patterns.
TRE model
Each triangle pattern consists of a relation pair $(r_p, r_q)$ and a closing relation $r_o$, which may close the triangle in either direction ($r_o^+$ or $r_o^-$). They are encoded as

$$V_{r_p,r_q} = M_1 r_p + M_2 r_q, \qquad U_{r_o^+} = M_3^+ r_o, \qquad U_{r_o^-} = M_3^- r_o$$

Applying a softmax over a specific relation $r_p$ (or $r_q$) against any relation $r_k \in R$:

$$P(r_p \mid r_o^+, r_q) = \frac{\exp(U_{r_o^+}^T V_{r_p,r_q})}{\sum_{r_k \in R} \exp(U_{r_o^+}^T V_{r_k,r_q})}, \qquad P(r_p \mid r_o^-, r_q) = \frac{\exp(U_{r_o^-}^T V_{r_p,r_q})}{\sum_{r_k \in R} \exp(U_{r_o^-}^T V_{r_k,r_q})}$$

$$P(r_q \mid r_o^+, r_p) = \frac{\exp(U_{r_o^+}^T V_{r_p,r_q})}{\sum_{r_k \in R} \exp(U_{r_o^+}^T V_{r_p,r_k})}, \qquad P(r_q \mid r_o^-, r_p) = \frac{\exp(U_{r_o^-}^T V_{r_p,r_q})}{\sum_{r_k \in R} \exp(U_{r_o^-}^T V_{r_p,r_k})}$$

and over the closing relation against both directions:

$$P(r_o^+ \mid r_p, r_q) = \frac{\exp(U_{r_o^+}^T V_{r_p,r_q})}{\sum_{r_k \in R} \exp(U_{r_k^+}^T V_{r_p,r_q}) + \sum_{r_k \in R} \exp(U_{r_k^-}^T V_{r_p,r_q})}, \qquad P(r_o^- \mid r_p, r_q) = \frac{\exp(U_{r_o^-}^T V_{r_p,r_q})}{\sum_{r_k \in R} \exp(U_{r_k^+}^T V_{r_p,r_q}) + \sum_{r_k \in R} \exp(U_{r_k^-}^T V_{r_p,r_q})}$$
TRE model
The empirical confidences are triangle frequency ratios:

$$\mathrm{Confidence}(r_o^\pm \mid r_p, r_q) = \frac{\mathrm{Frequency}(r_p, r_q, r_o^\pm)}{\mathrm{Frequency}(r_p, r_q)}, \quad \mathrm{Confidence}(r_p \mid r_o^\pm, r_q) = \frac{\mathrm{Frequency}(r_p, r_q, r_o^\pm)}{\mathrm{Frequency}(r_o^\pm, r_q)}, \quad \mathrm{Confidence}(r_q \mid r_o^\pm, r_p) = \frac{\mathrm{Frequency}(r_p, r_q, r_o^\pm)}{\mathrm{Frequency}(r_o^\pm, r_p)}$$

Minimising the KL-divergence between this empirical distribution and the model's conditional distribution yields the training objective

$$L = \sum_{(r_p,r_q,r_o^+) \in S^+} \Big[ \mathrm{Confidence}(r_o^+ \mid r_p, r_q) \log P(r_o^+ \mid r_p, r_q) + \mathrm{Confidence}(r_p \mid r_o^+, r_q) \log P(r_p \mid r_o^+, r_q) + \mathrm{Confidence}(r_q \mid r_o^+, r_p) \log P(r_q \mid r_o^+, r_p) \Big]$$

$$+ \sum_{(r_p,r_q,r_o^-) \in S^-} \Big[ \mathrm{Confidence}(r_o^- \mid r_p, r_q) \log P(r_o^- \mid r_p, r_q) + \mathrm{Confidence}(r_p \mid r_o^-, r_q) \log P(r_p \mid r_o^-, r_q) + \mathrm{Confidence}(r_q \mid r_o^-, r_p) \log P(r_q \mid r_o^-, r_p) \Big]$$
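The frequency-based confidences can be computed directly from observed triangles. A small sketch with illustrative relation names:

```python
from collections import Counter

def triangle_confidences(triangles):
    """Confidence(ro | rp, rq) = Frequency(rp, rq, ro) / Frequency(rp, rq),
    estimated from a list of observed triangle patterns (rp, rq, ro)."""
    freq3 = Counter(triangles)                                # Frequency(rp, rq, ro)
    freq2 = Counter((rp, rq) for rp, rq, _ in triangles)      # Frequency(rp, rq)
    return {tri: n / freq2[tri[:2]] for tri, n in freq3.items()}

# Illustrative data: "born_in" + "city_of" closes as "citizen_of" 9 times out of 10.
tris = [("born_in", "city_of", "citizen_of")] * 9 + \
       [("born_in", "city_of", "lives_in")]
conf = triangle_confidences(tris)
assert conf[("born_in", "city_of", "citizen_of")] == 0.9
```

The other conditional confidences follow the same pattern with the denominator counted over $(r_o, r_q)$ or $(r_o, r_p)$ pairs instead.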
Joint Prediction Strategy
• Limitation of the TRE model: if there is no potential transitive path A-r1-B-r2-C between two entities A and C, the TRE model cannot accurately predict a relation A-r3-C between A and C.
• Solution: we predict new relations of the KG jointly, using TRE together with a baseline KG completion model.
• If a potential transitive path A-r1-B-r2-C exists between A and C, we use the TRE prediction result; otherwise we use the baseline KG completion model's prediction result.
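The fallback rule above can be sketched assuming a simple adjacency-map representation of the KG (entity and relation names are illustrative, and `tre_predict`/`baseline_predict` stand in for the two models):

```python
def has_transitive_path(a, c, kg):
    """True if some entity b gives a path a -r1-> b -r2-> c,
    where kg maps an entity to its outgoing (relation, entity) pairs."""
    return any(c in (e2 for _, e2 in kg.get(b, []))
               for _, b in kg.get(a, []))

def joint_predict(a, c, kg, tre_predict, baseline_predict):
    """Use TRE when a transitive path exists between a and c,
    otherwise fall back to the baseline KG completion model."""
    return tre_predict(a, c) if has_transitive_path(a, c, kg) else baseline_predict(a, c)

kg = {"Obama": [("born_in", "Honolulu")],
      "Honolulu": [("city_of", "USA")]}
assert has_transitive_path("Obama", "USA", kg)
assert not has_transitive_path("Obama", "Honolulu", kg)
```

The dispatch keeps TRE's strength on triangle-closing cases while retaining baseline coverage everywhere else.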
Experimental Evaluation
KG datasets
• FB15K: a subset of the Freebase database, with a large number of relations. Most entities are dense, making it a typical dense KG dataset; we use it to test performance on a common dense KG.
• WN18: a subset of the WordNet database, with a limited number of relations (only 18), which poses its own challenges; we use it to test performance on a KG with few relations.
• DBP: a subset of the DBpedia database. It is extremely sparse: most entities occur only once; we use it to test performance on an extremely sparse KG.
Experimental Evaluation
Baselines
• TransE: Bordes, Antoine, et al. "Translating embeddings for modeling multi-relational data." Advances in Neural Information Processing Systems. 2013.
• TransH: Wang, Zhen, et al. "Knowledge graph embedding by translating on hyperplanes." Twenty-Eighth AAAI Conference on Artificial Intelligence. 2014.
• TransR: Lin, Yankai, et al. "Learning entity and relation embeddings for knowledge graph completion." Twenty-Ninth AAAI Conference on Artificial Intelligence. 2015.
• TransD: Ji, Guoliang, et al. "Knowledge graph embedding via dynamic mapping matrix." Proceedings of ACL-IJCNLP (Volume 1: Long Papers). 2015.
Experimental Evaluation
Baselines
• RESCAL: Nickel, Maximilian, Volker Tresp, and Hans-Peter Kriegel. "A Three-Way Model for Collective Learning on Multi-Relational Data." ICML. 2011.
• DistMult: Yang, Bishan, et al. "Embedding entities and relations for learning and inference in knowledge bases." arXiv preprint arXiv:1412.6575 (2014).
• ComplEx: Trouillon, Théo, et al. "Complex embeddings for simple link prediction." International Conference on Machine Learning. 2016.
Experimental Evaluation
Entity Link Prediction on FB15K
• The TRE model performs well on FB15K entity link prediction, better than the baselines.
• TRE works well for entity link prediction on a common dense KG dataset.
Experimental Evaluation
Entity Link Prediction on WN18
• The TRE model performs better than the baselines in entity link prediction.
• Although WN18 has a limited number of relations, TRE still performs well.
Experimental Evaluation
Relation Link Prediction
• TRE performs well for relation link prediction on the common dense KG FB15K.
• TRE performs slightly worse than some baselines on WN18, because with so few relations it is hard to extract useful triangle patterns.
• TRE performs much better than the baselines on DBP: the baselines do not perform well on an extremely sparse KG, while TRE can still predict relation links effectively.
Knowledge Graph Application
• In-KG Applications
  • Link Prediction
  • Triple Classification
  • Entity Classification
  • Entity Resolution
• Out-of-KG Applications
  • Relation Extraction
  • Question Answering
  • Recommender Systems
  • Domain Search Engine
In-KG Application
• Link Prediction: given two elements of an RDF triple (h, r, t), predict the missing one.
Example: from the sentence "…Washington, D.C. is the capital of the United States…", given entity h (Washington, D.C.) and entity t (the United States), predict the relation "?".
Relation Extraction With Knowledge Graph
Collaborative learning for Relation Extraction and TransE KG embedding.
Weston, Jason et al. “Connecting Language and Knowledge Bases with Embedding Models for Relation
Extraction.” EMNLP (2013).
Relation Extraction With Knowledge Graph
Each row of the matrix stands for a pair of entities; each column is a textual mention or a KG relation.
Riedel, Sebastian, et al. "Relation extraction with matrix factorization and universal schemas." Proceedings of the
2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language
Technologies. 2013.
[Figure: binary matrix whose rows are entity pairs (h, t)1, (h, t)2, (h, t)3, … and whose columns are textual mentions m1, m2, … followed by KG relations r1, r2, …; a 1 marks an observed co-occurrence.]
(h, t)i: i-th pair of entities h, t; mj: j-th textual mention; rk: k-th relation in the KG.
Relation Extraction With Knowledge Graph
A similar idea to "Relation extraction with matrix factorization and universal schemas", but using a tensor instead of a matrix.
• Two dimensions of the tensor represent the entities h and t; one dimension represents the textual mention / KG relation.
• The Typed-RESCAL method considers entity types during tensor factorization: the two entity dimensions of Xk are restricted to different types for different relations, e.g. a "person" type and a "location" type for the relation "born_in".
Chang, Kai-Wei, et al. "Typed tensor decomposition of knowledge bases for relation extraction." Proceedings of the
2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2014.
Question Answering With Knowledge Graph
Given a natural language question, find the answer(s) based on structured RDF KG triples; the question answering problem is treated as a link prediction problem in the KG.
Bordes, Antoine, Jason Weston, and Nicolas Usunier. "Open question answering with weakly supervised embedding
models." Joint European conference on machine learning and knowledge discovery in databases. Springer, Berlin,
Heidelberg, 2014.
[Figure: a vector of word occurrences in the question is mapped through embeddings of words, entities and relations, and matched against a vector of entity and relation occurrences in the answer.]
• The answer to a question can be an entity or a relation.
• The key idea is to learn the similarity between a question and a candidate answer entity/relation.
• The right answer should have a higher similarity to the question, based on their embeddings.
Recommender System Challenges
[Figure: sparse user-item interaction matrix; the column for a new item and the row for a new user contain no interactions.]
Challenge: Sparsity and Cold Start
• Sparsity and cold-start problems are common in RS.
• The RS is trained on a given user-item interaction dataset.
• For a new item or a new user, recommendation accuracy may be largely compromised.
RS Challenge: Quantitatively sparse
• Some users have selected only a few items.
• Some items have been selected by only a few users.
• Some items have never been selected at all.
Zili Zhou, Shaowu Liu, Guandong Xu, Xing Xie, Jun Yin, Yidong Li, Wu Zhang. "Knowledge-Based Recommendation with Hierarchical Collaborative Embedding." Pacific-Asia Conference on Knowledge Discovery and Data Mining. 2018.
Solution: Side information
Side/auxiliary information: attributes, text, links, images.
• Unstructured, mixed format
• High volume, low information gain
Tensor model
• $a_{u,t,r} = \begin{cases} 1, & (u, t, r) \in Y \\ 0, & \text{otherwise} \end{cases}$  (Symeonidis et al. 2008; Rendle et al. 2009)
• If user i annotates a tag on item j, the entry is "+"; otherwise "−"; items with no annotation are "?".
Bayesian Personalised Ranking (BPR)
Rendle, Steffen, et al. "BPR: Bayesian personalized ranking from implicit feedback." Proceedings of the twenty-fifth conference on uncertainty in artificial intelligence. AUAI Press, 2009.
Symeonidis, Panagiotis, et al. "Tag recommendations based on tensor dimensionality reduction." Proceedings of the 2008 ACM conference on Recommender systems. ACM, 2008.
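The BPR objective for a single (user, positive item, negative item) sample can be sketched as follows, following the cited formulation (vectors and the regularisation weight are illustrative):

```python
import numpy as np

def bpr_loss(u_vec, i_vec, j_vec, reg=0.01):
    """BPR pairwise loss for one (user, positive item i, negative item j) sample:
    -log sigmoid(x_ui - x_uj) plus L2 regularisation, so that observed
    items are ranked above unobserved ones."""
    x_uij = u_vec @ i_vec - u_vec @ j_vec
    nll = np.log1p(np.exp(-x_uij))            # -log sigmoid(x_uij), computed stably
    l2 = reg * (u_vec @ u_vec + i_vec @ i_vec + j_vec @ j_vec)
    return float(nll + l2)

u = np.array([1.0, 0.0])
good = np.array([1.0, 0.0])    # item aligned with the user's factors
bad = np.array([-1.0, 0.0])    # item opposed to them
# Ranking the positive item far above the negative one yields a smaller loss.
assert bpr_loss(u, good, bad) < bpr_loss(u, bad, good)
```

In training, such samples are drawn per user and the loss is minimised by stochastic gradient descent over the user and item factors.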
Social Regularization
A similar scenario: use the social network graph to enhance recommendation, but focus on adding a social regularisation term to the MF loss function.
Ma, Hao, et al. "Recommender systems with social regularization." Proceedings of the fourth ACM international conference on Web search and data mining. ACM, 2011.
Social Regularization
Intuition: sometimes, in order to make a decision, we consult many of our friends for valuable suggestions.
Average-based Regularization
Individual-based Regularization
Factorization Machine
Incorporate user and item attributes X (LibFM)
Rendle, Steffen. "Factorization machines." 2010 IEEE International Conference on Data Mining. IEEE, 2010.
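The second-order FM model equation can be sketched with the standard O(kn) identity for the pairwise term (shapes and values are illustrative):

```python
import numpy as np

def fm_score(x, w0, w, V):
    """Second-order factorization machine:
    y(x) = w0 + sum_i w_i x_i + sum_{i<j} <v_i, v_j> x_i x_j,
    with the pairwise term computed in O(k n) via
    0.5 * sum_f [(sum_i V_if x_i)^2 - sum_i V_if^2 x_i^2]."""
    linear = w0 + w @ x
    s = V.T @ x                                            # (k,) factor sums
    pairwise = 0.5 * float(s @ s - np.sum((V ** 2).T @ (x ** 2)))
    return float(linear + pairwise)

# Cross-check the fast pairwise term against the explicit double sum.
rng = np.random.default_rng(2)
n, k = 5, 3
x, w0, w, V = rng.normal(size=n), 0.1, rng.normal(size=n), rng.normal(size=(n, k))
explicit = w0 + w @ x + sum(V[i] @ V[j] * x[i] * x[j]
                            for i in range(n) for j in range(i + 1, n))
assert np.isclose(fm_score(x, w0, w, V), explicit)
```

Encoding user and item attributes as features of `x` is what lets LibFM incorporate side information into the factorisation.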
Other side information
• SocialMF [Jamali Recsys2010]
• Circle-based recommendation [Yang kdd2012]
• POI recommendation [Ye Sigir2011]
• Hybrid of social and geo-location [Cheng AAAI2012]
Integrate KG to RS
• Linked elements: some RS items have connections to KG entities.
• Large-scale semantic enhancement
• Dense semantic relations in the KG
• The KG is high-quality, structured side information with a unified format.
Embedding in RS
• Represent users and items in the same low-dimensional vector space.
• Use vector-space distance as a similarity measure (similar users/items are close to each other) to match users with items of interest.
[Figure: an embedding model f(x) maps users and items to vectors.]
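The matching step above can be sketched minimally: score every item by its inner product with the user vector in the shared space and return the top-ranked ones (vectors are illustrative):

```python
import numpy as np

def recommend(user_vec, item_vecs, k=2):
    """Rank items for a user by inner-product similarity in the shared
    low-dimensional space and return the indices of the top-k items."""
    scores = item_vecs @ user_vec
    return np.argsort(-scores)[:k].tolist()

user = np.array([1.0, 0.0])
items = np.array([[0.9, 0.1],     # item 0: close to the user
                  [-1.0, 0.0],    # item 1: far away
                  [0.5, 0.5]])    # item 2: somewhat close
assert recommend(user, items) == [0, 2]
```

Real systems replace the inner product with whatever similarity the embedding model f(x) was trained against, but the retrieval pattern is the same.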
Motivation
Collaborative Embedding: factorise the user-item interactions as U^T V, jointly with knowledge base embedding.
Zhang, Fuzheng, et al. "Collaborative knowledge base embedding for recommender systems." Proceedings of the 22nd ACM SIGKDD international conference on knowledge discovery and data mining. ACM, 2016.
Motivation
[Figure: user-item matrix over GitHub repositories such as ansible and pyzmq, with unknown entries "?" to be predicted.]
Hierarchical Collaborative Embedding learning
• End-to-end learning instead of a stacked model.
• Links users, items and entities in one network.
• Optimises parameters collaboratively.