The Neural Basis of Thought and Language Final Review Session

Dec 30, 2015

Transcript
Page 1: The Neural Basis of Thought and Language

The Neural Basis ofThought and Language

Final

Review Session

Page 2: The Neural Basis of Thought and Language

Administrivia

• Final in class next Tuesday, May 9th

• Be there on time!

• Format:

– closed books, closed notes

– short answers, no blue books

• And then you’re done with the course!

Page 3: The Neural Basis of Thought and Language

The Second Half

Cognition and Language

Computation

Structured Connectionism

Computational Neurobiology

Biology

[Diagram: the levels above arranged along an abstraction axis, with midterm and final topics placed along it: Motor Control, Metaphor, Grammar, Bailey Model, KARMA, ECG, SHRUTI, Bayesian Model of HSP, Bayes Nets]

Page 4: The Neural Basis of Thought and Language

Overview

• Bailey Model

– feature structures

– Bayesian model merging

– recruitment learning

• KARMA

– X-schema, frames

– aspect

– event-structure metaphor

– inference

• Grammar Learning

– parsing

– construction grammar

– learning algorithm

• SHRUTI

• FrameNet

• Bayesian Model of Human Sentence Processing

Page 5: The Neural Basis of Thought and Language

Full Circle

• Neural System & Development

• Motor Control & Visual System

• Spatial Relation

• Psycholinguistics Experiments

• Metaphor

• Grammar

• Verbs & Spatial Relation

• Embodied Representation

• Structured Connectionism

• Probabilistic algorithms

• Converging Constraints

Page 6: The Neural Basis of Thought and Language

Q & A

Page 7: The Neural Basis of Thought and Language

How can we capture the difference between

“Harry walked into the cafe.”

“Harry is walking into the cafe.”

“Harry walked into the wall.”

Page 8: The Neural Basis of Thought and Language

Analysis Process

Utterance —(Constructions)→ Semantic Specification —(Belief State, General Knowledge)→ Simulation

“Harry walked into the café.”

Page 9: The Neural Basis of Thought and Language

The INTO construction

construction INTO
  subcase of Spatial-Relation
  form
    self_f.orth ← “into”
  meaning: Trajector-Landmark
    evokes Container as cont
    evokes Source-Path-Goal as spg
    trajector ↔ spg.trajector
    landmark ↔ cont
    cont.interior ↔ spg.goal
    cont.exterior ↔ spg.source

Page 10: The Neural Basis of Thought and Language

The Spatial-Phrase construction

construction SPATIAL-PHRASE
  constructional
    constituents
      sr : Spatial-Relation
      lm : Ref-Expr
  form
    sr_f before lm_f
  meaning
    sr_m.landmark ↔ lm_m

Page 11: The Neural Basis of Thought and Language

The Directed-Motion construction

construction DIRECTED-MOTION
  constructional
    constituents
      a : Ref-Expr
      m : Motion-Verb
      p : Spatial-Phrase
  form
    a_f before m_f
    m_f before p_f
  meaning
    evokes Directed-Motion as dm
    self_m.scene ↔ dm
    dm.agent ↔ a_m
    dm.motion ↔ m_m
    dm.path ↔ p_m

schema Directed-Motion
  roles
    agent : Entity
    motion : Motion
    path : SPG
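To make the binding arrows concrete, here is a minimal Python sketch of how constructional analysis could fill a semantic specification; the data structures and function names are illustrative, not the actual ECG analyzer.

```python
# Hedged sketch: how constructional analysis might fill a semantic
# specification. Data structures and names are illustrative, not the
# actual ECG analyzer.

def analyze_directed_motion(a_meaning, m_meaning, p_meaning):
    """Apply the DIRECTED-MOTION bindings: dm.agent <-> a_m,
    dm.motion <-> m_m, dm.path <-> p_m."""
    return {"schema": "Directed-Motion",
            "agent": a_meaning,
            "motion": m_meaning,
            "path": p_meaning}

# "Harry walked into the cafe": INTO contributes an SPG whose goal is
# the container's interior; SPATIAL-PHRASE binds the landmark (the cafe).
spg = {"trajector": "Harry",
       "source": "cafe.exterior",
       "goal": "cafe.interior"}
semspec = analyze_directed_motion("Harry", "WALK", spg)
print(semspec["path"]["goal"])  # -> cafe.interior
```

The semantic specification is then handed off to simulation, which binds these fillers to x-schema parameters.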

Page 12: The Neural Basis of Thought and Language

What exactly is simulation?

• Belief update plus X-schema execution

[Diagram: belief-net nodes (hungry, meeting, cafe, time of day) linked to a WALK x-schema whose controller runs ready → start → ongoing → finish → done, with an iterate loop until the walker is at the goal.]
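The controller in the diagram can be sketched as a tiny state machine; this is an illustrative Python sketch (names are invented, not the actual KARMA simulation engine).

```python
# Illustrative sketch of x-schema controller execution (not the actual
# KARMA simulator): ready -> start -> ongoing -> (iterate)* -> finish -> done.

def run_x_schema(step, at_goal, max_iterations=100):
    """Run one x-schema: `step` performs one iteration of the action,
    `at_goal` tests the completion condition."""
    trace = ["ready", "ongoing"]     # the start transition fires
    for _ in range(max_iterations):
        step()                       # one iteration of the motor action
        if at_goal():
            break
        trace.append("iterate")      # not at goal yet: loop again
    trace.append("done")             # the finish transition fires
    return trace

# WALK with walker=Harry, goal=cafe: step until Harry reaches the cafe
position = {"x": 0}
trace = run_x_schema(step=lambda: position.update(x=position["x"] + 1),
                     at_goal=lambda: position["x"] >= 3)
print(trace)  # -> ['ready', 'ongoing', 'iterate', 'iterate', 'done']
```

Belief update supplies the parameter bindings (walker, goal); execution then traverses the controller states.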

Page 13: The Neural Basis of Thought and Language

“Harry walked into the café.”

[Diagram: WALK x-schema with walker=Harry, goal=cafe; the controller runs ready → walk → done.]

Page 14: The Neural Basis of Thought and Language

Analysis Process

Utterance —(Constructions)→ Semantic Specification —(Belief State, General Knowledge)→ Simulation

“Harry is walking to the café.”

Page 15: The Neural Basis of Thought and Language

“Harry is walking to the café.”

[Diagram: WALK x-schema with walker=Harry, goal=cafe; controller states ready → start → ongoing → finish → done, with iterate, abort → cancelled, and interrupt/resume ↔ suspended transitions.]

Page 16: The Neural Basis of Thought and Language

Analysis Process

Utterance —(Constructions)→ Semantic Specification —(Belief State, General Knowledge)→ Simulation

“Harry has walked into the wall.”

Page 17: The Neural Basis of Thought and Language

Perhaps a different sense of INTO?

construction INTO subcase of spatial-prep
  form
    self_f.orth ← “into”
  meaning
    evokes Trajector-Landmark as tl
    evokes Container as cont
    evokes Source-Path-Goal as spg
    tl.trajector ↔ spg.trajector
    tl.landmark ↔ cont
    cont.interior ↔ spg.goal
    cont.exterior ↔ spg.source

construction INTO subcase of spatial-prep
  form
    self_f.orth ← “into”
  meaning
    evokes Trajector-Landmark as tl
    evokes Impact as im
    evokes Source-Path-Goal as spg
    tl.trajector ↔ spg.trajector
    tl.landmark ↔ spg.goal
    im.obj1 ↔ tl.trajector
    im.obj2 ↔ tl.landmark

Page 18: The Neural Basis of Thought and Language

“Harry has walked into the wall.”

[Diagram: WALK x-schema with walker=Harry, goal=wall; controller states ready → start → ongoing → finish → done, with iterate, abort → cancelled, and interrupt/resume ↔ suspended transitions.]

Page 19: The Neural Basis of Thought and Language

Map down to timeline

[Diagram: the controller states ready → start → ongoing → finish → done mapped onto a timeline marked with E (event time), R (reference time), and S (speech time), with the consequence interval following the event.]

Page 20: The Neural Basis of Thought and Language

further questions?

Page 21: The Neural Basis of Thought and Language

What about…

“Harry walked into trouble”

or for stronger emphasis,

“Harry walked into trouble, eyes wide open.”

Page 22: The Neural Basis of Thought and Language

Metaphors

• metaphors are mappings from a source domain to a target domain

• metaphor maps specify the correlation between source domain entities / relation and target domain entities / relation

• they also allow inference to transfer from source domain to target domain (possibly, but less frequently, vice versa)

<TARGET> is <SOURCE>

Page 23: The Neural Basis of Thought and Language

Event Structure Metaphor

• Target Domain: event structure

• Source Domain: physical space• States are Locations

• Changes are Movements

• Causes are Forces

• Causation is Forced Movement

• Actions are Self-propelled Movements

• Purposes are Destinations

• Means are Paths

• Difficulties are Impediments to Motion

• External Events are Large, Moving Objects

• Long-term Purposeful Activities are Journeys

Page 24: The Neural Basis of Thought and Language

KARMA

• DBN to represent target domain knowledge

• Metaphor maps link target and source domain

• X-schema to represent source domain knowledge

Page 25: The Neural Basis of Thought and Language

Metaphor Maps

1. map entities and objects between embodied and abstract domains

2. invariantly map the aspect of the embodied domain event onto the target domain

by setting the evidence for the status variable based on controller state (event structure metaphor)

3. project x-schema parameters onto the target domain
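A metaphor map can be pictured as a role-to-role dictionary; the sketch below is illustrative (the role names and the `project` function are invented, not KARMA's actual representation).

```python
# Illustrative sketch of a metaphor map (not KARMA's actual representation):
# source-domain roles map onto target-domain roles, so inferences made in
# the embodied source domain transfer to the abstract target domain.

EVENT_STRUCTURE_MAP = {
    "location": "state",          # States are Locations
    "movement": "change",         # Changes are Movements
    "force": "cause",             # Causes are Forces
    "destination": "purpose",     # Purposes are Destinations
    "path": "means",              # Means are Paths
    "impediment": "difficulty",   # Difficulties are Impediments to Motion
}

def project(source_inference):
    """Transfer a source-domain inference into the target domain."""
    return {EVENT_STRUCTURE_MAP.get(role, role): filler
            for role, filler in source_inference.items()}

# "Harry walked into trouble": motion to a location becomes change of state
print(project({"movement": "walk-into", "location": "trouble"}))
# -> {'change': 'walk-into', 'state': 'trouble'}
```

In KARMA the same idea is realized by setting evidence in the target-domain DBN based on source-domain x-schema state.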

Page 26: The Neural Basis of Thought and Language

further questions?

Page 27: The Neural Basis of Thought and Language

How do you learn…

the meanings of spatial relations,

the meanings of verbs,

the metaphors, and

the constructions?

Page 28: The Neural Basis of Thought and Language

How do you learn…

the meanings of spatial relations,

the meanings of verbs,

the metaphors, and

the constructions?

That’s the Regier model (first half of the semester).

Page 29: The Neural Basis of Thought and Language

How do you learn…

the meanings of spatial relations,

the meanings of verbs,

the metaphors, and

the constructions?

VerbLearn

Page 30: The Neural Basis of Thought and Language
Page 31: The Neural Basis of Thought and Language

data #1: schema=slide, elbow jnt=extend, posture=palm, accel=6
data #2: schema=slide, elbow jnt=extend, posture=palm, accel=8
data #3: schema=depress, elbow jnt=fixed, posture=index, accel=2
data #4: schema=slide, elbow jnt=extend, posture=grasp, accel=2

data #1 alone: schema slide 0.9 | elbow jnt extend 0.9 | posture palm 0.9 | accel [6]

merge #1 + #2: schema slide 0.9 | elbow jnt extend 0.9 | posture palm 0.9 | accel [6–8]

merge in #4: schema slide 0.9 | elbow jnt extend 0.9 | posture palm 0.7, grasp 0.3

#3 remains a separate sense: schema depress 0.9 | elbow jnt fixed 0.9 | posture index 0.9 | accel [2]

Page 32: The Neural Basis of Thought and Language

Computational Details

• complexity of model + ability to explain data

• maximum a posteriori (MAP) hypothesis — wants the best model given the data:

  m* = argmax_m P(m | D) = argmax_m P(D | m) · P(m)   (by Bayes’ rule)

  – P(D | m): how likely is the data given this model?

  – P(m): prior over models; penalizes complex models – those with too many word senses
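The trade-off can be sketched numerically; the scoring functions and penalty below are invented for illustration, not VerbLearn's actual code.

```python
# Illustrative MAP scoring for model merging (not VerbLearn's actual code):
# argmax_m P(D|m)P(m), with a prior that penalizes extra word senses.
import math

def log_prior(model, penalty=2.0):
    return -penalty * len(model["senses"])   # more senses -> lower prior

def log_likelihood(data, model):
    total = 0.0
    for datum in data:
        # probability of the datum under its best-matching sense
        best = max(sense.get(datum, 1e-6) for sense in model["senses"])
        total += math.log(best)
    return total

def map_score(data, model):
    return log_likelihood(data, model) + log_prior(model)

# Two near-identical senses vs. one merged sense: merging wins,
# because the fit is the same but the model is simpler.
split  = {"senses": [{"slide-palm": 0.9}, {"slide-palm": 0.9}]}
merged = {"senses": [{"slide-palm": 0.9}]}
data = ["slide-palm", "slide-palm"]
print(map_score(data, merged) > map_score(data, split))  # -> True
```

Merging stops when collapsing further senses starts losing more likelihood than the prior gains.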

Page 33: The Neural Basis of Thought and Language

How do you learn…

the meanings of spatial relations,

the meanings of verbs,

the metaphors, and

the constructions?

conflation hypothesis(primary metaphors)

Page 34: The Neural Basis of Thought and Language

How do you learn…

the meanings of spatial relations,

the meanings of verbs,

the metaphors, and

the constructions?

construction learning

Page 35: The Neural Basis of Thought and Language

Acquisition

Reorganize

Hypothesize

Production

Utterance

(Comm. Intent, Situation)

Generate

Constructions

(Utterance, Situation)

Analysis

Comprehension

Analyze

Partial

Usage-based Language Learning

Page 36: The Neural Basis of Thought and Language

Main Learning Loop

while <utterance, situation> available and cost > stoppingCriterion
    analysis = analyzeAndResolve(utterance, situation, currentGrammar);
    newCxns = hypothesize(analysis);
    if cost(currentGrammar + newCxns) < cost(currentGrammar)
        addNewCxns(newCxns);
    if (reorganize == true)   // frequency depends on learning parameter
        reorganizeCxns();

Page 37: The Neural Basis of Thought and Language

Three ways to get new constructions

• Relational mapping

  – throw the ball → THROW < BALL

• Merging

  – throw the block

  – throwing the ball → THROW < OBJECT

• Composing

  – throw the ball

  – ball off

  – you throw the ball off → THROW < BALL < OFF

Page 38: The Neural Basis of Thought and Language

Minimum Description Length

• Choose grammar G to minimize cost(G|D):

  – cost(G|D) = α · size(G) + β · complexity(D|G)

  – approximates Bayesian learning: cost(G|D) ≈ posterior probability P(G|D)

• Size of grammar: size(G) ≈ 1/prior P(G)

  – favors fewer/smaller constructions and roles; isomorphic mappings

• Complexity of data given grammar: complexity(D|G) ≈ 1/likelihood P(D|G)

  – favors simpler analyses (fewer, more likely constructions)

  – based on derivation length + score of derivation
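The cost function can be sketched directly; the weights and size measures below are invented for the example, not the learner's actual values.

```python
# Illustrative sketch of the MDL cost (weights and size measures invented
# for the example): cost(G|D) = alpha * size(G) + beta * complexity(D|G).

def size(grammar):
    # one unit per construction plus one per constituent/role
    return sum(1 + len(cxn["constituents"]) for cxn in grammar)

def complexity(analyses):
    # total derivation length over the analyses of the data
    return sum(len(derivation) for derivation in analyses)

def cost(grammar, analyses, alpha=1.0, beta=1.0):
    return alpha * size(grammar) + beta * complexity(analyses)

# Composing THROW < BALL < OFF enlarges the grammar but shortens every
# derivation of "you throw the ball off", so with enough data it pays off.
flat = [{"constituents": ["throw"]}, {"constituents": ["ball"]},
        {"constituents": ["off"]}]
composed = flat + [{"constituents": ["throw", "ball", "off"]}]
data_flat     = [["THROW", "BALL", "OFF"]] * 5   # five utterances
data_composed = [["THROW-BALL-OFF"]] * 5
print(cost(composed, data_composed) < cost(flat, data_flat))  # -> True
```

This is why new constructions are only kept when they lower the overall cost in the main learning loop.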

Page 39: The Neural Basis of Thought and Language

further questions?

Page 40: The Neural Basis of Thought and Language

Connectionist Representation

How can entities and relations be represented at the structured connectionist level?

or

How can we represent “Harry walked to the café” in a connectionist model?

Page 41: The Neural Basis of Thought and Language

SHRUTI

• entity, type, and predicate focal clusters

• An “entity” is a phase in the rhythmic activity.

• Bindings are synchronous firings of role and entity cells

• Rules are interconnection patterns mediated by coincidence detector circuits that allow selective propagation of activity

• An episode of reflexive processing is a transient propagation of rhythmic activity

Page 42: The Neural Basis of Thought and Language

• asserting that walk(Harry, café)

• Harry fires in phase with the agent role

• cafe fires in phase with the goal role

[Diagram: the walk predicate cluster (+ − ? agt goal), entity clusters for Harry and cafe (+e +v ?e ?v), and the type, entity, and predicate focal clusters, wired for “Harry walked to the café.”]
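Temporal binding can be sketched in a few lines; this is an illustrative simplification (names invented), not the actual SHRUTI model.

```python
# Illustrative sketch of SHRUTI-style temporal binding (not the actual
# model): each entity occupies one phase of the rhythmic cycle, and a
# role-entity binding is the role cell firing in that entity's phase.

entity_phase = {"Harry": 1, "cafe": 2}        # entities own distinct phases
role_phase = {"walk.agt": 1, "walk.goal": 2}  # roles fire in some phase

def bound_entity(role):
    """The binding of a role is the entity it fires in synchrony with."""
    for entity, phase in entity_phase.items():
        if role_phase[role] == phase:
            return entity
    return None

# asserting walk(Harry, cafe):
print(bound_entity("walk.agt"))   # -> Harry
print(bound_entity("walk.goal"))  # -> cafe
```

The number of distinct phases per cycle bounds how many entities can be kept apart at once, one source of reflexive-processing limits.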


Page 44: The Neural Basis of Thought and Language

Activation Trace for walk(Harry, café)

[Diagram: firing traces over phases 1–4 for +: Harry, +: walk, +e: cafe, walk-agt, and walk-goal; Harry fires synchronously with walk-agt, cafe with walk-goal.]

Page 45: The Neural Basis of Thought and Language

further questions?

Page 46: The Neural Basis of Thought and Language

Human Sentence Processing

Can we use any of the mechanisms we just discussed

to predict reaction time / behavior

when human subjects read sentences?

Page 47: The Neural Basis of Thought and Language

Good and Bad News

• Bad news:

  – No, not as they stand.

  – ECG and its analysis and simulation processes are specified at a higher computational level of abstraction than human sentence processing (no timing information, no constraints on cognitive capacity, etc.)

• Good news:

  – we can construct a Bayesian model of human sentence processing that borrows the same insights

Page 48: The Neural Basis of Thought and Language

Bayesian Model of Sentence Processing

• Do you wait for sentence boundaries to interpret the meaning of a sentence? No!

• As words come in, we construct

– partial meaning representation

– some candidate interpretations if ambiguous

– expectation for the next words

• Model

– Probability of each interpretation given words seen

– Stochastic CFGs, N-Grams, Lexical valence probabilities
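The word-by-word ranking of interpretations can be sketched as follows; the probabilities are invented for illustration, not taken from the actual model.

```python
# Illustrative sketch of incremental interpretation ranking (probabilities
# invented for the example): each candidate interpretation is scored by a
# structural prior (SCFG) times how well it predicts the incoming words.

def normalize(scores):
    total = sum(scores.values())
    return {k: v / total for k, v in scores.items()}

# After "The cop arrested ...": Main Verb (MV) vs. Reduced Relative (RR)
prior = {"MV": 0.8, "RR": 0.2}        # MV structures are more frequent

next_word_prob = {                    # likelihood of the next word
    "the": {"MV": 0.5, "RR": 0.1},    # "arrested the ..." favors MV
    "by":  {"MV": 0.05, "RR": 0.6},   # "arrested by ..." favors RR
}

def posterior(word):
    return normalize({i: prior[i] * next_word_prob[word][i] for i in prior})

print(posterior("by")["RR"] > posterior("by")["MV"])    # -> True
print(posterior("the")["MV"] > posterior("the")["RR"])  # -> True
```

The next word shifts probability mass between interpretations, which is exactly the disambiguation behavior the model predicts.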

Page 49: The Neural Basis of Thought and Language

SCFG + N-gram

[Diagram: two parse trees for “The cop arrested …”. Main Verb reading: S → NP VP, continuing “the cop arrested the detective”. Reduced Relative reading: the first NP contains a reduced relative clause, continuing “the cop arrested by …”. The stochastic CFG scores the tree structures, the N-gram scores the word sequences, and the two trees correspond to different interpretations.]

Page 52: The Neural Basis of Thought and Language

Predicting effects on reading time

• Probability predicts human disambiguation

• Increase in reading time because of...

– Limited Parallelism

• Memory limitations cause correct interpretation to be pruned

• “The horse raced past the barn fell”

– Attention

• Demotion of interpretation in attentional focus

– Expectation

• Unexpected words
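The "Expectation" effect is commonly quantified as surprisal; the bigram probabilities below are invented for illustration.

```python
# Illustrative surprisal sketch (bigram probabilities invented): the
# "Expectation" effect models longer reading times at words with low
# probability given their context, i.e. surprisal = -log2 P(w | context).
import math

bigram = {("barn", "fell"): 0.0001,   # rare continuation
          ("barn", "."):    0.2}      # expected continuation

def surprisal(prev, word):
    return -math.log2(bigram[(prev, word)])

# "The horse raced past the barn fell": the spike at "fell" reflects
# its low probability after "barn"
print(surprisal("barn", "fell") > surprisal("barn", "."))  # -> True
```

High surprisal predicts a reading-time spike at exactly the point where human subjects slow down.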

Page 53: The Neural Basis of Thought and Language

Open for questions