Introduction to probabilistic models of cognition Josh Tenenbaum MIT

Dec 30, 2015

Transcript
Page 1: Introduction to probabilistic models of cognition Josh Tenenbaum MIT.

Introduction to probabilistic models of cognition

Josh Tenenbaum, MIT

Page 2:

Why probabilistic models of cognition?

Page 3:

The fundamental problem of cognition

How does the mind get so much out of so little?

How do we make inferences, generalizations, models, theories and decisions about the world from impoverished (sparse, incomplete, noisy) data?

“The problem of induction”

Page 4:

Visual perception

(Marr)

Page 5:

• Goal of visual perception is to recover world structure from visual images.

• Why the problem is hard: many world structures can produce the same visual input.

• Illusions reveal the visual system’s implicit knowledge of the physical world and the processes of image formation.

Ambiguity in visual perception

(Shepard)

Page 6:

“horse”

“horse”

“horse”

Learning concepts from examples

Page 7:

Learning concepts from examples

“tufa”

“tufa”

“tufa”
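This kind of rapid generalization from a few positive examples can be captured by Bayesian concept learning with the "size principle": every hypothesis consistent with the examples is scored by its prior times (1/|h|)^n, so even three examples strongly favor the smallest consistent concept. A minimal sketch (the hypothesis names, priors, and extension sizes below are hypothetical, chosen only to illustrate the effect):

```python
# Bayesian concept learning with the "size principle":
# P(h | examples) ∝ P(h) * (1 / |h|)^n  for hypotheses consistent with the examples.
# Hypothesis names, priors, and extension sizes are hypothetical.

def posterior(hypotheses, n_examples):
    """hypotheses: dict name -> (prior, extension_size, consistent_with_examples)."""
    scores = {}
    for name, (prior, size, consistent) in hypotheses.items():
        # Inconsistent hypotheses get zero likelihood; consistent ones are
        # penalized by their size, raised to the number of examples.
        scores[name] = prior * (1.0 / size) ** n_examples if consistent else 0.0
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

hypotheses = {
    "tufa (subordinate)": (0.5, 4, True),     # small, specific concept
    "tree (superordinate)": (0.5, 100, True), # large, general concept
}

post = posterior(hypotheses, n_examples=3)
```

With three examples the small subordinate concept dominates the posterior, matching the intuition that three "tufa"s pin down a narrow category rather than a broad one.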

Page 8:

Causal inference

[Figure: recovery data for a "drug" group and a "no drug" group, each showing times to get over a cold of about 1 week]

Don’t press this button!

Does this drug help you get over a cold faster?
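One way to formalize the question on this slide is Bayesian model comparison: ask how much the observed recovery counts favor "the drug has an effect" over "no effect". A minimal sketch with hypothetical counts (the slide's own numbers are not assumed here) and uniform Beta priors on the recovery rates:

```python
from math import comb

# Bayes-factor sketch for "does the drug help?" using hypothetical counts.
# H0: both groups share one recovery rate; H1: each group has its own rate.
def beta_binom_ml(k, n):
    # Marginal likelihood of k successes in n trials under a uniform Beta(1,1)
    # prior on the rate: ∫ p^k (1-p)^(n-k) dp = 1 / ((n + 1) * C(n, k)).
    return 1.0 / ((n + 1) * comb(n, k))

k_drug, n_drug = 8, 10  # hypothetical: 8 of 10 drug-takers recover within a week
k_ctrl, n_ctrl = 4, 10  # hypothetical: 4 of 10 controls recover within a week

ml_h1 = beta_binom_ml(k_drug, n_drug) * beta_binom_ml(k_ctrl, n_ctrl)
ml_h0 = beta_binom_ml(k_drug + k_ctrl, n_drug + n_ctrl)
bayes_factor = ml_h1 / ml_h0  # > 1 favors a real drug effect
```

With these made-up counts the Bayes factor is modestly above 1, illustrating how sparse data can support only a weak causal conclusion.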


Page 10:

Language

• Parsing:

– Two cars were reported stolen by the Groveton police yesterday.

– The judge sentenced the killer to die in the electric chair for the second time.

– No one was injured in the blast, which was attributed to a buildup of gas by one town official.

– One witness told the commissioners that she had seen sexual intercourse taking place between two parked cars in front of her house.

(Pinker)

Page 11:

Language

• Parsing

• Acquisition:

– Learning the English past tense (rule vs. exceptions)

– Learning the Spanish or Arabic past tense (multiple rules plus exceptions)

– Learning verb argument structure (“give” vs. “donate”)

– Learning to be bilingual.

Page 12:

Intuitive theories

• Physics

– Parsing: Inferring support relations, or the causal history and properties of an object.

– Acquisition: Learning about gravity and support.
• Gravity -- what’s that?
• Contact is sufficient
• Mass distribution and location is important

• Psychology

– Parsing: Inferring beliefs, desires, plans.

– Acquisition: Learning about agents.
• Recognizing intentionality, but without mental state reasoning
• Reasoning about beliefs and desires
• Reasoning about plans, rationality and “other minds”.

Page 13:

The big questions

1. How does knowledge guide inductive learning, inference, and decision-making from sparse, noisy or ambiguous data?

2. What are the forms and contents of our knowledge of the world?

3. How is that knowledge itself learned from experience?

4. When faced with surprising data, when do we assimilate the data to our current model versus accommodate our model to the new data?

5. How can accurate inductive inferences be made efficiently, even in the presence of complex hypothesis spaces?

Page 14:

A toolkit for answering these questions

1. Bayesian inference in probabilistic generative models

2. Probabilities defined over structured representations: graphs, grammars, predicate logic, schemas

3. Hierarchical probabilistic models, with inference at all levels of abstraction

4. Adaptive nonparametric or “infinite” models, which can grow in complexity or change form in response to the observed data.

5. Approximate methods of learning and inference, such as belief propagation, expectation-maximization (EM), Markov chain Monte Carlo (MCMC), and sequential Monte Carlo (particle filtering).
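As a concrete instance of item 5, here is a minimal Metropolis MCMC sketch: it draws samples from a target density known only up to a normalizing constant, here a standard normal chosen purely for illustration:

```python
import math
import random

# Minimal Metropolis sampler: propose a Gaussian perturbation, accept with
# probability min(1, target(proposal) / target(current)).
def log_target(x):
    return -0.5 * x * x  # log of an unnormalized N(0, 1) density

def metropolis(n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal  # accept the move; otherwise stay at x
        samples.append(x)
    return samples

samples = metropolis(20000)
mean = sum(samples) / len(samples)
```

The sample mean approaches 0 and the sample variance approaches 1, recovering the target's moments without ever computing its normalizing constant.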

Page 15:

[Figure: probabilistic phrase structure grammar with rules such as S → NP VP, VP → Verb, VP → VP NP, NP → Det Adj Noun RelClause, RelClause → Rel NP V]

Phrase structure S

Utterance U

Grammar G

P(S | G)

P(U | S)

P(S | U, G) ∝ P(U | S) × P(S | G)

Bottom-up Top-down

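The slide's Bayes rule for parsing, P(S | U, G) ∝ P(U | S) × P(S | G), can be sketched numerically: score each candidate phrase structure by its prior under the grammar (top-down) times its likelihood of producing the utterance (bottom-up), then normalize. The two candidate parses and their probabilities below are hypothetical:

```python
# P(S | U, G) ∝ P(U | S) * P(S | G): combine the grammar prior over phrase
# structures with the likelihood of the observed utterance under each parse.
# Candidate parses and their probabilities are hypothetical.

candidates = {
    "parse_A": {"p_S_given_G": 0.6, "p_U_given_S": 0.2},  # favored a priori
    "parse_B": {"p_S_given_G": 0.4, "p_U_given_S": 0.9},  # fits the utterance better
}

unnorm = {s: v["p_S_given_G"] * v["p_U_given_S"] for s, v in candidates.items()}
z = sum(unnorm.values())
posterior = {s: w / z for s, w in unnorm.items()}
best = max(posterior, key=posterior.get)
```

Here the likelihood overrides the prior and parse_B wins, mirroring how bottom-up evidence and top-down expectations trade off in the posterior.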

Page 16:

[Figure: the phrase structure grammar rules again (S → NP VP, VP → Verb, VP → VP NP, NP → Det Adj Noun RelClause, RelClause → Rel NP V), now shown within a hierarchy from grammar to speech signal]

Phrase structure

Utterance

Speech signal

Grammar

“Universal Grammar” Hierarchical phrase structure grammars (e.g., CFG, HPSG, TAG)

P(phrase structure | grammar)

P(utterance | phrase structure)

P(speech | utterance)

P(grammar | UG)

Page 17:

(Han and Zhu, 2006)

Vision as probabilistic parsing

Page 18:
Page 19:

Principles

Structure

Data

Whole-object principle
Shape bias
Taxonomic principle
Contrast principle
Basic-level bias

Learning word meanings

Page 20:

Causal learning and reasoning

Principles

Structure

Data

Page 21:

Goal-directed action (production and comprehension)

(Wolpert et al., 2003)

Page 22:

Why probabilistic models of cognition?

• A framework for understanding how the mind can solve fundamental problems of induction.

• Strong, principled quantitative models of human cognition.

• Tools for studying people’s implicit knowledge of the world.

• Beyond classic limiting dichotomies: “structure vs. statistics”, “nature vs. nurture”, “domain-general vs. domain-specific”.

• A unifying mathematical language for all of the cognitive sciences: AI, machine learning and statistics, psychology, neuroscience, philosophy, linguistics…. A bridge between engineering and “reverse-engineering”.

Why now? Much recent progress in computational resources, theoretical tools, and interdisciplinary connections.

Page 23:

Summer school plan

• Weekly plan

– Week 1: Basic probabilistic models. Applications to visual perception, categorization, causal learning.

– Week 2: More advanced probabilistic models (grammars, logic, MDPs). Applications to reasoning, language, scene understanding, decision-making, neuroscience.

– Week 3: Further applications to memory, motor control, sensory integration, unsupervised learning and cognitive development. Symposia on open challenges and student research.

Page 24:

Summer school plan

• Daily plan

– 5 (or 6) lectures per day.

– Starting Wednesday, break-out sessions after lunch, for discussion with speakers.

– Evening tutorials: Matlab, Probability basics, Bayes net toolbox (for Matlab), SamIam, BUGS, Markov logic networks and Alchemy.

– Psych computer lab (available afternoons).

– Self-organizing activities: sign up for the calendar on 30boxes.com (email address: [email protected], password: “ipam07”).

Page 25:

Background poll

• Bayes’ rule
• Conjugate prior
• Bayesian network
• Plate notation for graphical models
• Mixture model
• Hidden Markov model
• Expectation-maximization (EM) algorithm
• Dynamic programming
• Gaussian processes
• Dirichlet processes
• First-order logic
• (Stochastic) context-free grammar
• Probabilistic relational models
• MCMC
• Particle filtering
• Partially observable Markov decision process
• Serotonin

Page 26:

Poll for tonight

• Matlab tutorial?

• Probability basics?