Modelling Language Evolution
Lecture 5: Iterated Learning
Simon Kirby, University of Edinburgh
Language Evolution & Computation Research Unit

Transcript
Page 1:

Modelling Language Evolution
Lecture 5: Iterated Learning

Simon Kirby

University of Edinburgh

Language Evolution & Computation Research Unit

Page 2:

Models so far…

Models of learning language
Models of evolving ability to learn language
Models of differing abilities to learn differing languages

What do these have in common? The language comes from “outside”

[Diagram: LANGUAGE arriving from outside the LINGUISTIC AGENT]

Page 3:

Two kinds of models

[Diagram 1: Training sentences → Neural network → Weight settings]

[Diagram 2: Primary Linguistic Data (PLD) → Language Acquisition Device (LAD) → Grammatical Competence (GC)]

What can be learned?

What can evolve?

[Diagram: many separate agents, each a PLD → LAD → GC unit learning in isolation]

Page 4:

A new kind of model: Iterated Learning

[Diagram: a chain of PLD → LAD → GC units, in which each generation's GC produces the PLD for the next generation's LAD]

What kind of language evolves?

Page 5:

What can Iterated Learning explain?

My hypothesis: some functional linguistic structure emerges inevitably from the process of iterated learning without the need for natural selection or explicit functional pressure.

First target structure:

Recursive Compositionality: the meaning of an utterance is some function of the meaning of parts of that utterance and the way they are put together.

Compositional                   Holistic
walked                          went
I greet you                     Hi
I thought I saw a pussy cat     chutter

Page 6:

The agent

[Diagram: the agent (simulated individual). Meaning-signal pairs in (utterances from parent) and meanings (generated by the environment) feed a learning algorithm, which builds an internal linguistic representation; a production algorithm sends meaning-signal pairs out, to the next generation.]

Page 7:

What will the agents talk about?

Need some simple but structured “world”. Simple predicate logic:

Agents can string random characters together to form utterances.

loves(mary, john)
admires(gavin, heather)

thinks(mary, likes(john, heather))

knows(heather, thinks(mary, likes(john, heather)))
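One way to encode such meanings (my choice of representation, not specified in the slides) is as nested Python tuples, which makes the recursion explicit:

# Meanings as (predicate, arg1, arg2) tuples; an argument may itself be a
# whole meaning, which is what makes the meaning space recursive.
m1 = ('loves', 'mary', 'john')
m2 = ('thinks', 'mary', ('likes', 'john', 'heather'))
m3 = ('knows', 'heather', ('thinks', 'mary', ('likes', 'john', 'heather')))

def depth(meaning):
    """Embedding depth: 1 for a flat predicate, more for nested meanings."""
    return 1 + max((depth(a) for a in meaning[1:] if isinstance(a, tuple)),
                   default=0)

print(depth(m1), depth(m3))   # 1 3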

Page 8:

How do agents learn?

Not using neural networks: in this model, we are interested in more traditional, symbolic grammars.

Learners try to form a grammar that is consistent with the primary linguistic data they hear.

Fundamental principle: learning is compression. Learners try to fit the data they hear, but also to generalise; learning is a trade-off between these two.

Page 9:

Two steps to learning

INCORPORATION (for each sentence heard):

  S/loves(john, mary) → johnlovesmary
  S/loves(peter, mary) → peterlovesmary

GENERALISATION (whenever possible):

  S/loves(x, mary) → C/x lovesmary
  C/john → john
  C/peter → peter
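As a concrete illustration, here is a minimal sketch of the two steps in Python. The rule representation and the substring-matching heuristic are my simplifications, not the actual algorithm from the model:

def incorporate(grammar, meaning, signal):
    """INCORPORATION: store each heard meaning-signal pair as a holistic S rule."""
    grammar.append((meaning, signal))

def generalise(grammar):
    """GENERALISATION: if two rules differ in exactly one meaning slot and their
    signals differ only in the substring naming that slot, factor the
    difference out into a category C."""
    for i, (m1, s1) in enumerate(grammar):
        for m2, s2 in grammar[i + 1:]:
            if len(m1) != len(m2):
                continue
            diffs = [k for k in range(len(m1)) if m1[k] != m2[k]]
            if len(diffs) != 1:
                continue
            k, (a1, a2) = diffs[0], (m1[diffs[0]], m2[diffs[0]])
            if a1 in s1 and a2 in s2 and s1.replace(a1, '#') == s2.replace(a2, '#'):
                template = tuple('x' if j == k else v for j, v in enumerate(m1))
                return ([(template, s1.replace(a1, '{C}')),
                         (('C', a1), a1), (('C', a2), a2)] +
                        [r for r in grammar if r not in ((m1, s1), (m2, s2))])
    return grammar

# Two holistic rules collapse into one compositional rule plus two C rules:
g = []
incorporate(g, ('loves', 'john', 'mary'), 'johnlovesmary')
incorporate(g, ('loves', 'peter', 'mary'), 'peterlovesmary')
print(generalise(g))
# [(('loves', 'x', 'mary'), '{C}lovesmary'), (('C', 'john'), 'john'),
#  (('C', 'peter'), 'peter')]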

Page 10:

A simulation run

1. Start with one learner and one adult speaker, neither of which has a grammar.

2. Choose a meaning at random.

3. Get speaker to produce signal for that meaning (may need to “invent” random string).

4. Give meaning-signal pair to learner.

5. Repeat steps 2-4 150 times.

6. Delete speaker.

7. Make learner be the new speaker.

8. Introduce a new learner (with no initial grammar).

9. Repeat steps 2-8 thousands of times.
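The whole run can be written as a short driver loop. A minimal sketch, assuming a stand-in Agent that merely memorises pairs (the real learner is the grammar inducer from the previous slides); meanings are assumed hashable, e.g. the tuples sketched earlier:

import random
import string

class Agent:
    """Stand-in learner (my simplification): memorises pairs and invents a
    random string for meanings its 'grammar' cannot yet express."""
    def __init__(self):
        self.lexicon = {}                        # 1. no initial grammar
    def learn(self, meaning, signal):            # 4. store the heard pair
        self.lexicon[meaning] = signal
    def produce(self, meaning):                  # 3. may "invent" a string
        if meaning not in self.lexicon:
            self.lexicon[meaning] = ''.join(
                random.choices(string.ascii_lowercase, k=random.randint(2, 5)))
        return self.lexicon[meaning]

def iterated_learning(meanings, generations=1000, episodes=150):
    speaker, learner = Agent(), Agent()
    for _ in range(generations):                 # 9. repeat thousands of times
        for _ in range(episodes):                # 5. 150 learning episodes
            meaning = random.choice(meanings)    # 2. pick a meaning at random
            learner.learn(meaning, speaker.produce(meaning))
        speaker, learner = learner, Agent()      # 6-8. learner becomes speaker,
    return speaker                               #      fresh learner introduced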

Page 11:

Results 1a: initial stages

Initially, speakers have no language, so “invent” random strings of characters.

A protolanguage emerges for some meanings, but no structure. These are holistic expressions:

1. ldg “Mary admires John”

2. xkq “Mary loves John”

3. gj “Mary admires Gavin”

4. axk “John admires Gavin”

5. gb “John knows that Mary knows that John admires Gavin”

Page 12:

Page 13:

Results 1b: many generations later…

Signal, [word-by-word gloss], “meaning”:

6. gj h f tej m  [John Mary admires]  “Mary admires John”

7. gj h f tej wp  [John Mary loves]  “Mary loves John”

8. gj qp f tej m  [Gavin Mary admires]  “Mary admires Gavin”

9. gj qp f h m  [Gavin John admires]  “John admires Gavin”

10. i h u i tej u gj qp f h m  [John knows Mary knows Gavin John admires]  “John knows that Mary knows that John admires Gavin”

Page 14:

Page 15:

Quantitative results: languages evolve

Page 16:

What’s going on?

There is no biological evolution in the ILM. There isn't even any communication; there is no notion of function in the model at all. So why are structured languages evolving?

Hypothesis: languages themselves are evolving to fit the conditions of the ILM so that they are learnable.

The agents never see all the meanings… Only rules that are generalisable from limited exposure are stable.

Page 17:

Language has to fit through a narrow bottleneck

This has profound implications for the structure of language

[Diagram: the bottleneck cycle: linguistic competence → (production) → linguistic performance → (learning) → linguistic competence in the next generation]

Page 18:

A nice and simple model…

Language: meanings are 8-bit binary numbers; signals are 8-bit binary numbers.

Agents: an 8x8x8 neural network (not an SRN) that learns to associate signals with meanings.

[Diagram: a network mapping between MEANINGS and SIGNALS]
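A minimal sketch of such an agent, assuming sigmoid units trained by plain backpropagation (the learning rate, epochs, and weight initialisation are my guesses; the slides specify only the 8x8x8 architecture):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class NetAgent:
    """8x8x8 feed-forward agent: 8 meaning bits in, 8 signal bits out."""
    def __init__(self, rng, n=8):
        self.W1 = rng.normal(0.0, 0.5, (n, n))    # meaning -> hidden weights
        self.W2 = rng.normal(0.0, 0.5, (n, n))    # hidden -> signal weights

    def produce(self, meaning):
        """Map an 8-bit meaning vector to an 8-bit signal by thresholding."""
        h = sigmoid(meaning @ self.W1)
        return (sigmoid(h @ self.W2) > 0.5).astype(int)

    def learn(self, meaning, signal, lr=0.5, epochs=100):
        """Backprop on one meaning-signal pair (squared error, sigmoid units)."""
        for _ in range(epochs):
            h = sigmoid(meaning @ self.W1)
            o = sigmoid(h @ self.W2)
            d_o = (o - signal) * o * (1 - o)       # output-layer error term
            d_h = (d_o @ self.W2.T) * h * (1 - h)  # hidden-layer error term
            self.W2 -= lr * np.outer(h, d_o)
            self.W1 -= lr * np.outer(meaning, d_h)

agent = NetAgent(np.random.default_rng(0))
m = np.array([1, 0, 1, 1, 0, 0, 1, 0])
s = np.array([0, 1, 1, 0, 1, 0, 0, 1])
agent.learn(m, s)
print(agent.produce(m))   # after training, should be close to s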

Page 19:

Bottleneck

There is only one parameter in this model, the bottleneck: the number of meaning-signal pairs (randomly chosen) given to the next generation…

In each simulation, we can measure two things:

Expressivity: the proportion of the meaning space to which an adult agent can give a unique signal.

Instability: how different each generation's language is from that of the previous generation.

[Diagram: only a subset of meaning-signal pairs passes from each generation to the next]
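The two measures could be computed as follows (a sketch assuming an agent with the produce() method from the previous sketch):

import itertools
import numpy as np

# The full meaning space: all 256 8-bit vectors.
MEANINGS = [np.array(bits) for bits in itertools.product([0, 1], repeat=8)]

def expressivity(agent):
    """Proportion of meanings to which the agent gives a unique signal."""
    signals = [tuple(agent.produce(m)) for m in MEANINGS]
    counts = {s: signals.count(s) for s in set(signals)}
    return sum(counts[s] == 1 for s in signals) / len(MEANINGS)

def instability(parent, child):
    """Mean proportion of signal bits on which two generations disagree."""
    return float(np.mean([parent.produce(m) != child.produce(m)
                          for m in MEANINGS]))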

Page 20:

Results

Bottleneck too tight: unstable and inexpressive language

Page 21:

Results

Bottleneck too wide: eventually fairly stable and expressive

Page 22:

Results

Medium bottleneck: maximal stability and expressivity

Page 23:

Adaptation

Language is evolving to be learnable.

Structure in the mapping emerges: meanings and signals come to be related by simple rules of bit flipping and re-ordering (sketched below).

These rules can be learned from a subset.
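For illustration (the particular permutation and flip mask below are arbitrary examples, not taken from the slides), a mapping built from bit flipping and re-ordering looks like this:

# A structured mapping: re-order the meaning bits, then flip some of them.
# It is fixed by just 16 values (8 positions + 8 flips), so it can be
# recovered from a small subset of pairs, unlike an arbitrary 256-entry table.
PERM = [3, 0, 7, 2, 5, 1, 6, 4]   # which meaning bit feeds each signal bit
FLIP = [1, 0, 0, 1, 0, 1, 0, 0]   # which signal bits get inverted

def structured_signal(meaning):
    return [meaning[p] ^ f for p, f in zip(PERM, FLIP)]

print(structured_signal([0, 1, 0, 1, 1, 0, 1, 0]))   # [0, 0, 0, 1, 0, 0, 1, 1]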

Despite the hugely different model, this is a very similar result to the earlier simulation

Page 24:

Summary

Language is learned by individuals with innate learning biases

The language data an individual hears is itself the result of learning

Languages adapt through iterated learning in response to our innate biases

There’s more! Our learning biases adapt through biological evolution in response to the language we use.

Tomorrow… use a simulation package to look at “grounding” models in an environment

[Diagram: three interacting adaptive systems: individual learning, cultural evolution, biological evolution]