SuperToy: Using Early Childhood Development to Guide AGI
Scott Settembre, University at Buffalo, SNePS Research Group
ss424@cse.buffalo.edu
June 24, 2008



Overview

• Demonstration of the SuperToy interface
• Explore the capabilities and progression of human baby communication and thinking
  – What studies have been done in psychology?
  – Implications for developing an AGI?
• How is AGI different from classical AI?
  – Different approaches in AGI
  – My approach and observations
• Demonstrate
  – Baby LyLy: the learning agent interface


SuperToy Interface

• Visual Interface
  – 3D head, with the ability to mimic natural movements
  – Accurate lip-syncing
  – Ability to display emotional state
• Listening Interface
  – Speech recognition, allowing for hand-coded grammars or free speech
  – Ability to view HMM probabilities (and choose for myself)
• Talking Interface
  – Realistic voice; a child-like voice is preferred
  – Ability to change prosody (rhythm, stress, and intonation)


Baby Agent – Video Teaser


Adult Agent – Video Teaser


Child Psychology Studies

• Purpose: understanding how a child develops may indicate a progression of learning, a path that is possible (or perhaps necessary) for an AGI
  – Progression in world representation/objects
  – Progression in language development
  – Interesting indication of dependency
  – What does this imply in terms of:
    • Conversation and interaction
    • Mental representation and manipulation


Newborns

• 12-hour-old newborns can distinguish between native and foreign languages†
  – Maybe not so innate, since they probably learn the rhythms of language in the womb
• 24 hrs old – “Lexical Words” experiment†
  – Can distinguish between content words, like nouns/verbs, and words with no meaning of their own, like prepositions/articles (occurs even in different languages)

† Studies done by Janet Werker at the University of British Columbia, Canada.


Newborns

• 10 days old – “Face Recognition” experiment†
  – Newborn prefers the correct orientation of features, but favors contrast over features (A & C)
• 6 weeks old – same experiment
  – Baby prefers features over contrast (A & D)

† Study done by Daphne Maurer, McMaster University, Canada.

[Figure: four face stimuli, labeled A through D]


Newborns

• What is innate?
  – Crying – an innate signal of distress
  – Soothing – the mother’s voice will soothe the baby, but any human voice will work
  – Attention – visual attention is drawn to contrasts within the first two weeks
  – Language development? We will return to this…


Few week old Babies

• Develop simple rules for conversation
  – Gaze of the eyes, directed at the speaker
  – Facial expressions match the emotion in the voice
• “Eye gaze” experiment†
  – The baby gets frustrated if the mother averts her eyes while talking to the baby
• “Upside-down face” experiment†
  – An inverted image of the mother’s face is not recognized
• “Happy-Sad face” experiment†
  – The emotions of the face and of the voice should match

† Studies done in Darwin Muir’s lab, Queen’s University, Canada.


Few month old Babies

• The ability to “search” for and understand objects increases
• 6 mo. – “Hidden Object” experiment†
  – A toy hidden under a blanket cannot be found, but can be at 8 mo.
• 9 mo. – “The Search” experiment†
  – A toy hidden behind a wall cannot be found at 8 mo., but can be at 9 mo.

† Study done by Andrea Aguiar, University of Waterloo, Canada.


Few month old Babies

• 18 mo. – “Object Permanence” experiment†
  – Cannot find a toy hidden twice: hidden under a hand, then the hand hidden under a blanket. Once the hand is removed, the baby thinks the toy is still in the hand.
• These experiments are significant because although objects are understood at 2.5 months, not all of their properties and physical laws are.

† Study done by Andrew Meltzoff, Center for Mind, Brain & Learning, University of Washington.


Object Properties Timeline

• 2.5 mo. – understands that there are objects¹
• 5 mo. – width of objects; no big into small¹
• 6 mo. – simple addition: 1 toy and 1 toy = 2 toys²
• 7.5 mo. – height of objects; no long into short¹
• 9 mo. – occlusion: can find an obscured object¹
• 9 mo. – motion: can predict the path of a ball³
  – But cannot handle occlusion and motion at the same time
• 11 mo. – tool use⁴
• 18+ mo. – a twice-hidden object can be found⁵
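One way to make the timeline operational is to treat it as a curriculum for a learning agent, unlocking one property at a time in developmental order. A minimal sketch: the ages and labels come from the timeline above, while the `unlocked` gating function is an illustrative assumption.

```python
# Sketch: the object-property milestones encoded as a simple curriculum,
# so a learning agent could be restricted to the properties appropriate
# to its developmental "age". The gating scheme is illustrative.
MILESTONES = [
    (2.5, "objects exist"),
    (5.0, "width"),
    (6.0, "simple addition"),
    (7.5, "height"),
    (9.0, "occlusion"),
    (9.0, "motion prediction"),
    (11.0, "tool use"),
    (18.0, "twice-hidden object"),
]

def unlocked(age_months):
    """Properties the agent is allowed to train on at a given 'age'."""
    return [name for age, name in MILESTONES if age <= age_months]

print(unlocked(9.0))
```

The slide's caveat that occlusion and motion cannot be combined at 9 months suggests a real scheduler would also need pairwise restrictions, not just a per-property age gate.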


Object Learning - Summary

• Children first learn by observation, then by experimenting
  – How objects behave
  – Size and shape
  – Under, behind, inside
  – Falling, inclines, planes, volume, liquids, etc.
• They learn properties/categories one at a time
• At some stages, they cannot apply two physical laws/properties/categories at the same time


Baby vs Computer - Communication


Initial Language Development

• 6 mo. old – still able to distinguish all sounds†
• 10 mo. old – loses this ability†
  – Beginning to filter out sounds that are not part of the language they hear all the time
• Implies that babies “start out as universal linguists and then become culture-bound specialists”

† Study done by Andrea Aguiar, University of Waterloo, Canada.


New Words and Pointing

• 13 mo. old – “Joint Visual Attention” experiment†
  – The uniquely human gesture of pointing
  – Pointing must coincide with the gaze of the speaker in order to be meaningful
• How to do this in a text-based interface?
  – Even Helen Keller was able to select things to be named (or have them brought to her attention by touch and then named)

† Study done by Andrea Aguiar, University of Waterloo, Canada.


Conversation and Turn Taking

• 14 mo. old – “The Robot” experiment†
  – A robot covered in green fur makes a noise after the baby makes a noise, and they begin taking turns making noise
• 18 mo. old – “Imitation” experiment‡
  – Taking turns with a toy may be important for understanding how to take turns in a conversation

† Study done by Susan Johnson, Stanford University.
‡ Study done by Andrew Meltzoff, Center for Mind, Brain & Learning, University of Washington.


New Words and Shape

• 17–24 mo. olds – “Shape Bias” experiment†
  – Identify objects by shape, not color or texture
  – Getting a child to pay attention to shape can grow vocabulary 3x faster
• Implication? Perhaps language in humans is intertwined with the learning of objects and the properties of the world
  – Is the “language explosion” at two years old due to understanding objects and categories better?

† Study done by Susan Jones, Indiana University.


Language Learning Timeline

• 24-week-old fetus – can hear voices
  – Melody and rhythms of speech make the heart beat faster
• Birth – crying; the mother’s voice soothes the baby
• 12 hrs old – distinguishes the native language¹
• 24 hrs old – distinguishes between parts of speech¹
• Weeks old – needs eye contact and matching voice and face emotions²
• 6 mo. old – can distinguish all language sounds¹
• 10 mo. old – filters out non-native language sounds¹
• 13 mo. old – uses gestures and pointing to understand³
• 14 mo. old – understands the give and take of conversation⁴
• 18 mo. old – imitation useful in learning how to converse⁵
• 18 mo.–2 years – everything has a name; shape bias⁶
• “Language explosion” – micro-sentences, mimicking actions


Language Development - Summary

• Language initially taught by Mother-ese and Father-ese
  – Sing-song quality, pitched higher
  – Sentences reduced to short phrases
  – Exaggerated words, stretched vowels
  – Repetition of the words
  – Emphasis on the most important “meaningful” words
• Gesturing and facial expressions
  – Can convey intentions, feelings, desires
• Imitation and repetition
  – The “primary engine for learning” language
• Understands more words than it can use
  – At 2 yrs old a child uses 300 words, but understands 1,000


Anything Applicable to AGI

• An underlying representation of objects helps in learning a language
  – Maybe words are “verbal objects” and are subject to the same rules with respect to categories that objects are?
  – Maybe grammar is equivalent to physical laws?
  – Maybe articles/prepositions (non-content words) are properties/relations between objects?


…Applications cont.

• Humans have a clear progression in object understanding and language development
  – Is this sequential progression a limitation of neural architecture?
  – Is this sequential progression necessary for any kind of language learning and development?


…Applications cont.

• Are we having difficulty because we do not process visual data in connection with language development?
  – Forms of gesturing and pointing are necessary in language development (required)
  – At early ages (weeks old), eye contact and matching the emotions of voice and face occur (must be advantageous?)
  – Shape bias occurs at/around the time of the “language explosion” (coincidence?)


Baby vs Computer - Story Time


Artificial General Intelligence

• “…the construction of a software program that can solve a variety of complex problems in a variety of different domains, and that controls itself autonomously, with its own thoughts, worries, feelings, strengths, weaknesses and predispositions.” †

• Whereas normal AI would be “creating programs that demonstrate intelligence in one or another specialized area”

† Section content from “Contemporary Approaches to Artificial General Intelligence” by Cassio Pennachin and Ben Goertzel


Taxonomy of AGI Approaches

• AGI approaches:
  – symbolic
  – symbolic and probability- or uncertainty-focused
  – neural net-based
  – evolutionary
  – artificial life
  – program search based
  – embedded
  – integrative


AGI – Symbolic

• GPS – General Problem Solver
  – Uses heuristic search; breaks goals into subgoals
  – No learning involved
• Doug Lenat’s CYC project
  – Encodes all common-sense knowledge in first-order predicate logic
  – Uses humans to encode knowledge in CycL
  – A new effort called “CognitiveCyc” aims to create autonomous, creative, interactive intelligence
• Allen Newell’s SOAR
  – No real autonomy or self-understanding
  – Used as a limited-domain problem-solving tool


AGI – Symbolic & Probabilistic

• ACT-R framework – similar to SOAR
  – Models human performance on relatively narrow and simple tasks (mimicking the modularity of mind)
  – Uses probability, similar to some human cognition tasks
• Bayesian networks
  – Embody knowledge about probabilities and dependencies between events in the world
  – Learning the probabilities (what to learn) is a problem
• NARS – an uncertainty-based, symbolic AI system
  – Uncertain logic, with fuzzy logic and certainty theory
• The failed Japanese 5th Generation Computer System
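The point about Bayesian networks encoding dependencies can be illustrated with a two-node network built by hand (Rain → WetGrass): the structure states the dependency, the conditional tables state the probabilities, and Bayes' rule inverts them. The numbers here are made up for illustration; as noted above, learning them from data is the hard part.

```python
# Tiny hand-built Bayesian network (Rain -> WetGrass). The structure
# encodes a dependency between events; the tables encode probabilities.
# All numbers are invented for illustration only.
P_RAIN = 0.2
P_WET_GIVEN = {True: 0.9, False: 0.1}  # P(wet | rain)

def p_wet():
    # Marginalize over the hidden cause: P(wet) = sum_r P(wet|r) * P(r)
    return P_WET_GIVEN[True] * P_RAIN + P_WET_GIVEN[False] * (1 - P_RAIN)

def p_rain_given_wet():
    # Bayes' rule: P(rain | wet) = P(wet | rain) * P(rain) / P(wet)
    return P_WET_GIVEN[True] * P_RAIN / p_wet()

print(round(p_wet(), 3), round(p_rain_given_wet(), 3))  # -> 0.26 0.692
```

Observing the effect (wet grass) raises belief in the cause (rain) from 0.2 to about 0.69, which is exactly the kind of dependency knowledge the slide describes.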


AGI – Neural net-based

• Attempts include using:
  – Network models carrying out specialized functions modeled on particular brain regions
  – Cooperative use of a variety of different neural-net learning algorithms
  – Evolving differing net assemblies and piecing them together
  – Physical simulation of brain tissue


AGI – Evolutionary

• John Holland’s Classifier System
  – A hybridization of evolutionary algorithms and probabilistic-symbolic AI
  – Specifically oriented toward integrating memory, perception, and cognition to allow an AI system to act in the world
• CAM-Brain machine – Hugo de Garis
  – Evolving differing net assemblies and piecing them together


AGI – Artificial Life

• Approaches are interesting, but not fruitful above a basic level of cognition
  – “Creatures” games (social)
  – “Network Tierra” project (multicellular)
  – “AlChemy” project (lower biological processes)
  – No Alife agent with significant general intelligence


AGI - Program search

• General approach
  – Begin with a formal theory of general intelligence
  – Define impractical algorithms that are provably known to achieve general intelligence
  – Then approximate these algorithms with related, but less comprehensive, algorithms
• There is actually a solution to AGI, but the search for the algorithm takes a long time
  – AIXI (Marcus Hutter) can work, but requires infinite memory and an infinitely fast processor!
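A toy version of the "impractical but provably complete" idea above: exhaustively enumerate tiny programs until one explains all observed input/output pairs. This brute-force search is only a stand-in for proposals like AIXI; the operation set and examples are illustrative assumptions, and the exponential blow-up in program length shows why the full-scale version is impractical.

```python
# Illustrative brute-force program search: enumerate short sequences of
# primitive operations until one reproduces every (input, output) pair.
# A toy stand-in for exhaustive program-search approaches to AGI.
import itertools

OPS = {"add1": lambda x: x + 1, "double": lambda x: x * 2, "square": lambda x: x * x}

def search(examples, max_len=3):
    """Find the shortest op-sequence consistent with all (x, y) examples."""
    for length in range(1, max_len + 1):
        for prog in itertools.product(OPS, repeat=length):
            def run(x, prog=prog):
                for op in prog:
                    x = OPS[op](x)
                return x
            if all(run(x) == y for x, y in examples):
                return prog
    return None

print(search([(1, 4), (2, 6)]))  # -> ('add1', 'double')
```

With 3 primitives the search space is 3^n programs of length n; a realistic instruction set makes unguided enumeration hopeless, which is why the approach falls back on approximations.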


AGI – Integrative

• Take elements from various approaches and create a combined, synergistic system
• Create a unified knowledge representation and dynamics framework
  – Manifest the core ideas of the various AI paradigms within the universal framework
• Novamente project – Goertzel/Pennachin
  – Uses semantic networks, genetic algorithms, description logic, probability integration, and fuzzy truth values
  – Uses the psynet model of mind: http://www.goertzel.org/books/wild/chapPsynet.html


My Approach

• Develop two systems
  – Top-down – hand-coded AI and KR&R
  – Bottom-up – learns through interaction
  – The key idea is to use the same representation scheme for both systems
  – My intent is to develop an environment in which the needs of each system will constrain or expand the representation for the benefit of the other system
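The shared-representation idea can be sketched as a single store that both systems read and write, so each one's demands shape what the other must represent. The `SharedKR` class and its triple format are illustrative assumptions, not the actual SNePS scheme.

```python
# Sketch: one knowledge store shared by a hand-coded (top-down) system
# and a learning (bottom-up) system. The triple-based format is an
# illustrative assumption, not the real representation scheme.
class SharedKR:
    def __init__(self):
        self.facts = set()

    def assert_fact(self, subj, rel, obj):
        self.facts.add((subj, rel, obj))

    def query(self, subj=None, rel=None, obj=None):
        # None acts as a wildcard for pattern matching
        return [f for f in self.facts
                if (subj is None or f[0] == subj)
                and (rel is None or f[1] == rel)
                and (obj is None or f[2] == obj)]

kr = SharedKR()
kr.assert_fact("ball", "isa", "toy")         # top-down: hand-coded knowledge
kr.assert_fact("ball", "heard-with", "red")  # bottom-up: learned co-occurrence
print(kr.query(subj="ball"))
```

Because both systems go through the same `assert_fact`/`query` interface, anything the learner acquires is immediately available to the hand-coded reasoner, and vice versa.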


My Approach – cont.

• Visual modality as input is out
  – Too costly in processing and effort
  – This means there will be:
    • No eye-gaze consideration
    • No visual emotional content
    • No processing of gestures/pointing
    • No shape-bias benefits
  – These are serious drawbacks that will need to be compensated for


My Approach – cont.

• Verbal processing partly done by a speech recognition engine
  – Benefits include:
    • No need for learning to become a “culture-dependent language specialist”; it is already done
  – Drawbacks include:
    • No verbal emotional cues (face/voice matching is active within the first few weeks, so it might be important)
    • No distinguishing different people from voice alone


My Approach – cont.

• Speech output done by a TTS engine
  – Benefits include:
    • Speech-production learning is unnecessary
    • We can still produce emotional output in the speech for the benefit of the listener


Baby LyLy

• Purposes:
  – Playmate for Snow
    • Interact with Snow the way Snow interacts
  – Mimicry – of words, micro-sentences
  – Association – one word recognized may induce another word
  – Imitation – repeat the last word(s) of a longer sentence
  – Repetition – repeat one word over and over until it is repeated back
  – Emotional – frustration, happiness, boredom, clinical/serious
• Acquire words and micro-sentence use similar to Snow
  – Only a verbal interface for learning
    • Restrict teaching and feedback to speech
    • Discover what is necessary in such an interface
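Three of the listed behaviors (mimicry of single words, association of co-occurring words, and imitation of the last words of a longer sentence) can be sketched as simple rules over heard text. These rules are guesses at one possible implementation, not Baby LyLy's actual code.

```python
# Sketch of three verbal-only baby behaviors: mimicry, co-occurrence
# association, and imitation of a sentence's final words. The specific
# rules are illustrative guesses, not the agent's real implementation.
ASSOCIATIONS = {}  # word -> words heard immediately after it

def hear(sentence):
    words = sentence.lower().split()
    for a, b in zip(words, words[1:]):   # association: record co-occurrence
        ASSOCIATIONS.setdefault(a, []).append(b)
    if len(words) == 1:
        return words[0]                  # mimicry: echo a single word
    return " ".join(words[-2:])          # imitation: repeat the last words

print(hear("ball"))                      # -> ball
print(hear("look at the red ball"))      # -> red ball
```

The accumulated `ASSOCIATIONS` table gives the agent material for the association behavior: hearing "red" later could induce it to say "ball".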


Adult LyLy

• Purposes:
  – Productive interaction with the OS/humans
    • Hand-coded knowledge and conversation protocols
    • Innate knowledge of the environment and use of some of the web
    • Integration with various tools (like email and schedules)
    • Specialized grammars and modules:
      – Kitchen – recipes, calories, timers
      – Living room – TV schedules, movie database, lyric lookup
      – Office – web search, question answering, calculator
      – Bedroom – read stories, play games
  – Assist in development of a high-level knowledge representation and reasoning framework
    • Needs dictate the KR scheme
    • Needs dictate the specialized AI functions and tools to use


Additional Considerations

• Forms of communication from computer to human that we can exploit (common references)
  – Visual
    • Facial emotions and actions
    • Diagrams, pictures, timelines, representations
  – Auditory
    • Speech – with prosody
    • Recorded sound effects and speech
  – Other
    • Sliders/progress bars to indicate intensity of emotion, how well something is understood, and certainty
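The slider idea in the last bullet can be sketched as a text progress bar the interface could render to show, say, how certain the agent is that it understood. Purely illustrative; the rendering style is an assumption.

```python
# Sketch: a text "slider" showing a 0..1 score (e.g. the agent's
# confidence that it understood) as a bar the human can read at a
# glance. The rendering style is illustrative.
def confidence_bar(score, width=10):
    filled = round(score * width)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {score:.0%}"

print(confidence_bar(0.7))  # -> [#######---] 70%
```

The same widget could display emotion intensity or certainty by feeding it a different score.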


Demonstration of Interface