Deep Multi-Task Learning with Shared Memory
Pengfei Liu, Xipeng Qiu, Xuanjing Huang (EMNLP 2016)
Reading group presenter: Ryosuke Miyazaki

Apr 12, 2017

marujirou
Transcript
Page 1: Deep Multi-Task Learning with Shared Memory

Deep Multi-Task Learning with Shared Memory. Pengfei Liu, Xipeng Qiu, Xuanjing Huang (EMNLP 2016). Reading group presenter: Ryosuke Miyazaki.

Page 2: Deep Multi-Task Learning with Shared Memory

Abstract
Due to their large number of parameters, neural models need a large-scale corpus.
→ Unsupervised pre-training is effective.
Multi-task learning also improves the final performance.
This paper proposes an LSTM with external shared memory for multi-task learning.

Page 3: Deep Multi-Task Learning with Shared Memory

Model: ME-LSTM

The LSTM emits three vectors to interact with the memory: a key vector (for addressing), an erase vector, and an add vector (for updating).

Page 4: Deep Multi-Task Learning with Shared Memory

Model: ME-LSTM

Reading operation

The memory consists of K segments, with M dimensions per segment.
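As a rough sketch (not the paper's exact equations), a content-based read over a K × M memory can be written as a softmax attention over the segments, keyed by a vector the LSTM emits at each step:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_read(memory, key):
    """Soft read from an external memory of K segments (M dims each).

    memory: (K, M) array; key: (M,) key vector from the LSTM.
    Returns the read vector (M,) and the attention weights (K,).
    """
    scores = memory @ key        # similarity of the key to each segment
    alpha = softmax(scores)      # soft addressing weights over K segments
    r = alpha @ memory           # weighted sum of segments -> read vector
    return r, alpha

rng = np.random.default_rng(0)
memory = rng.normal(size=(4, 3))
key = rng.normal(size=3)
r, alpha = memory_read(memory, key)
```

The weights `alpha` sum to one, so the read vector stays in the span of the stored segments; the dot-product scoring here is an assumption standing in for whatever similarity the slide's omitted equations use.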

Page 5: Deep Multi-Task Learning with Shared Memory

Model: ME-LSTM

Deep fusion strategy
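A minimal sketch of such a fusion gate, assuming the hidden state and the memory read have the same dimensionality (the parameter names `W_h`, `W_r`, `b` are illustrative, not the paper's notation):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def deep_fusion(h, r, W_h, W_r, b):
    """Gate g_t mixes the LSTM hidden state h with the memory read r.

    h, r: (H,) vectors; W_h, W_r: (H, H); b: (H,).
    Returns the fused state and the gate values.
    """
    g = sigmoid(W_h @ h + W_r @ r + b)   # per-dimension gate in (0, 1)
    fused = g * h + (1.0 - g) * r        # convex mix of state and memory
    return fused, g
```

Because the gate is elementwise, each dimension of the fused state lies between the corresponding dimensions of `h` and `r`, which is what makes the gate activations interpretable in the visualization slides later.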

Page 6: Deep Multi-Task Learning with Shared Memory

Model: ME-LSTM

Writing operation
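Writing with erase and add vectors typically follows the Neural Turing Machine pattern; a sketch under that assumption (the slide's own equations are omitted in the transcript):

```python
import numpy as np

def memory_write(memory, w, erase, add):
    """Erase-then-add update of the external memory.

    memory: (K, M); w: (K,) write weights in [0, 1];
    erase, add: (M,) erase and add vectors emitted by the LSTM.
    """
    memory = memory * (1.0 - np.outer(w, erase))  # partially erase each segment
    return memory + np.outer(w, add)              # then blend in new content
```

A segment with write weight 1 and an all-ones erase vector is completely overwritten by the add vector; a segment with weight 0 is untouched.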

Page 7: Deep Multi-Task Learning with Shared Memory

Two architectures

ARC-1 and ARC-2

Page 8: Deep Multi-Task Learning with Shared Memory

Training
Each task has a task-specific output layer.

The overall cost is a linear combination of the per-task cost functions:

L = Σ_m λ_m L_m

where λ_m is the weight for each task m.
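The combined objective is just a weighted sum, as in this sketch:

```python
import numpy as np

def joint_loss(task_losses, lambdas):
    """Overall training objective: sum over tasks m of lambda_m * L_m."""
    return float(np.dot(lambdas, task_losses))

# e.g. two tasks weighted equally
total = joint_loss([0.4, 0.6], [0.5, 0.5])  # -> 0.5
```

In practice the λ_m act as hyperparameters trading off how much each task's gradient influences the shared parameters.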

Page 9: Deep Multi-Task Learning with Shared Memory

Experiment: text classification

Page 10: Deep Multi-Task Learning with Shared Memory

Result: Movie

Page 11: Deep Multi-Task Learning with Shared Memory

Result: Product

Page 12: Deep Multi-Task Learning with Shared Memory

Analysis: Visualize the deep fusion gate

[Figure: heat map of the dimensions of the deep fusion gate g_t against the sentiment score; activated dimensions shown in black.]

Page 13: Deep Multi-Task Learning with Shared Memory

Analysis: Visualize the deep fusion gate

Page 14: Deep Multi-Task Learning with Shared Memory

Conclusion
・This paper proposes two deep architectures for multi-task learning.
・An external memory is designed to store knowledge shared across related tasks.
・The deep fusion strategy enables the model to incorporate the shared information.