
Page 2

SUMMARY

• Introduction to cNSBL

• Limits of cNSBL

• Fibred Neural Nets

• The eNSBL language

• An example

• Concluding remarks and future work

Page 3

A chief goal for robotics today:

RECONCILING LOGICAL REASONING AND REACTIVE BEHAVIOR

Two subproblems:

1 - how to perform inference on large systems of rules while meeting temporal constraints;

2 - how to combine sensory information, provided by sensors as continuous signals, with typically discrete rule processing.

Page 4

• We demonstrated* that mechanisms of monotonic and non-monotonic forward reasoning can be implemented and controlled using McCulloch and Pitts neurons.

• We introduced a language, cNSBL, to represent these reasoning mechanisms.

• cNSBL programs can be compiled and implemented on a parallel processor (FPGA).

Subproblem 1 - how to perform inference on large systems of rules while meeting temporal constraints.

* [Burattini et al., 2000].

Page 5

cNSBL building blocks

- A literal L_i is represented by a neural element N_li;
- the negation of L_i is represented by another neural element N_~li.*

We stipulate that the truth-value of each propositional literal L_i can be True (N_li is active), False (N_~li is active), or Undefined (both N_li and N_~li are quiescent).**

* [Burattini et al., 2000]. ** [von Neumann, 1956].
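A minimal sketch of this encoding (illustrative Python, not part of the original cNSBL tool chain; the class and field names are ours):

from dataclasses import dataclass

@dataclass
class Literal:
    n_l: bool = False      # N_li active  -> L is True
    n_not_l: bool = False  # N_~li active -> L is False

    def value(self) -> str:
        if self.n_l and not self.n_not_l:
            return "True"
        if self.n_not_l and not self.n_l:
            return "False"
        return "Undefined"  # both neural elements quiescent

print(Literal(n_l=True).value())  # True
print(Literal().value())          # Undefined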

Page 6

Why three-valued propositions?

Epistemic (or even ontological) motivations:

robotic systems may be compelled to act even when the truth-value of some given sentence is undefined;

and the action to undertake when the truth-value of some given sentence is undefined may differ from actions the system would undertake if the sentence were either (known to be) true or (known to be) false.

Page 7

cNSBL operators

Let P = {p_i} (0 < i ≤ n) and Q = {q_j} (0 < j ≤ m) be sets of propositional literals (for some n, m ∈ ℕ).

Let ∧P be the conjunction of the elements of P, and let ∨Q be the disjunction of the elements of Q; let s be a literal.

IMPLY(P, s) is intuitively interpreted as “IF the conjunction ∧P of the literals in P is true THEN s is true”.

[Diagram: neural implementation of UNLESS: the neuron p_c receives the literals of P_j through unit-weight excitatory connections and the literals of Q_m through inhibitory connections of weight -j; its threshold is s_c = j - …]

UNLESS(P, Q, s) is intuitively interpreted as “IF the conjunction of literals P is true and the disjunction of literals Q is false or undefined THEN s is true”.

[Diagram: neural implementation of IMPLY: the neuron p_c receives the inputs p_1, p_2, …, p_n through connections a_{1,c}, a_{2,c}, …, a_{n,c} = 1; its threshold is s_pc = n - …]
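The following sketch gives a purely propositional reading of the two operators over three-valued literals (True/False/None for Undefined). It is illustrative Python, not the McCulloch-Pitts implementation described above; the helper names conj, disj, imply and unless are ours:

def conj(values):
    """Three-valued conjunction: true only if every conjunct is True."""
    return all(v is True for v in values)

def disj(values):
    """Three-valued disjunction: true if at least one disjunct is True."""
    return any(v is True for v in values)

def imply(P, s, state):
    """IMPLY(P, s): if the conjunction of the literals in P is true, assert s."""
    if conj(state[p] for p in P):
        state[s] = True

def unless(P, Q, s, state):
    """UNLESS(P, Q, s): if the conjunction of P is true and the disjunction of Q
    is false or undefined, assert s."""
    if conj(state[p] for p in P) and not disj(state[q] for q in Q):
        state[s] = True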

Page 8

Two Examples

• Traffic light example

• Ethological example

Page 9

• Traffic light example

Suppose that a robot has to cross the street. If there is a traffic light, then the robot crosses provided that the light is green; otherwise it waits. If there isn't a traffic light at all (the truth value of both green and not green is undefined), then the robot looks to the right and to the left in order to decide whether to cross the street or not.

IMPLY(wish_to_cross ∧ G, cross)
IMPLY(wish_to_cross ∧ ¬G, not_cross)
UNLESS(wish_to_cross, (G ∨ ¬G), look_around)
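A self-contained usage sketch of this program (illustrative Python; the literal "~G" stands for the separately represented "not green", and the evaluation is the propositional reading sketched after Page 7, not the compiled net):

state = {"wish_to_cross": True, "G": None, "~G": None}   # no traffic light: G and ~G undefined

def conj_true(lits):
    # three-valued conjunction over the named literals
    return all(state[l] is True for l in lits)

actions = {
    "cross":       conj_true(["wish_to_cross", "G"]),
    "not_cross":   conj_true(["wish_to_cross", "~G"]),
    "look_around": conj_true(["wish_to_cross"])
                   and not (state["G"] is True or state["~G"] is True),
}
print(actions)  # only look_around is True, so the robot looks right and left before deciding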

Page 10

Ethological example

There are animals which pretend to be dead in order to deceive their predators. The behaviour of a “smart” predator may be described as follows:

IMPLY(there_is_a_prey ∧ prey_is_alive, eat_it)
IMPLY(there_is_a_prey ∧ ¬prey_is_alive, go_away)
UNLESS(there_is_a_prey, (prey_is_alive ∨ ¬prey_is_alive), verify)

Page 11

FROM cNSBL TO FPGA IMPLEMENTATIONS

[Diagram: the compilation pipeline. Logic symbolic expressions in cNSBL (IMPLY(P, s), …) are translated by the neurosymbolic compiler into a formal neural network (the IMPLY/UNLESS nets of Page 7); a VHDL compiler then produces VHDL code for the neural network, which is implemented as an NSP on an FPGA connected to its inputs, to symbolic interfaces and to peripheral devices.]

library IEEE;
use IEEE.std_logic_1164.all;

entity neuron_ekb is
  port(
    clk, reset, edb : in std_logic;
    ekb : inout std_logic
  );
end neuron_ekb;

architecture SYN_USE_DEFA_ARCH_NAME of neuron_ekb is

……………….

end SYN_USE_DEFA_ARCH_NAME;

Page 12

A behaviour-based system is represented in cNSBL as a layer* which is connected to sensory transduction and motor actuation mechanisms.

At each time t (assuming discrete time), the state of the cNSBL layer is given by the truth-values of n propositional variables R = {r1, …, rn}.

A finite set of cNSBL propositions (a cNSBL program) specifies how the values of some cNSBL variables in R at time t+1 depend on the value of the variables in R at time t.

* [Aiello, Burattini, Tamburrini, 1995, 1998].
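A minimal sketch of this synchronous update (the scheduling below is our assumption, not the paper's compiled FPGA timing): every rule reads the state at time t and writes its conclusion into the state for time t+1.

def step(rules, state_t):
    state_next = dict(state_t)                 # a literal keeps its value unless some rule asserts it
    for premises, test, conclusion in rules:
        if test(*(state_t[p] for p in premises)):
            state_next[conclusion] = True
    return state_next

# Example: the single rule IMPLY(backward_end, turn_on), encoded as (premises, test, conclusion)
rules = [(("backward_end",), lambda v: v is True, "turn_on")]
print(step(rules, {"backward_end": True, "turn_on": None}))  # {'backward_end': True, 'turn_on': True}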

Page 13

Subsumption architectures

Suppression of behaviours – competitive action selection mechanisms

UNLESS(W, (G ∨ A), M)

UNLESS(G, A, M)

IMPLY(A, M)

[Diagram: the behaviour modules Wandering (W), Move_to_Goal (G) and Avoid (A) feeding the Motor module (M).]
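A self-contained sketch of this competitive selection (illustrative Python over three-valued literals, True/False/None; it shows which rule gets to drive M, not the actual motor commands):

state = {"W": True, "G": True, "A": None}

def active(l):                        # a literal counts as asserted only when True
    return state[l] is True

M_from_wandering = active("W") and not (active("G") or active("A"))   # UNLESS(W, (G ∨ A), M)
M_from_goal      = active("G") and not active("A")                    # UNLESS(G, A, M)
M_from_avoid     = active("A")                                        # IMPLY(A, M)

print(M_from_wandering, M_from_goal, M_from_avoid)  # False True False: Move_to_Goal suppresses Wandering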

Page 14

Behavioural sequencing

• IMPLY(a, backward_on)

• IMPLY(backward_end, turn_on)

• IMPLY(turn_end, forward_on)

[Diagram: the cNSBL layer sequences the discrete actions backward, turn and forward through the motor actuation layer: a triggers backward_on, backward_end triggers turn_on, and turn_end triggers forward_on.]

Page 15

cNSBL is not sufficiently powerful to specify some familiar robotic behaviours and cooperative control functions.

In the extended NSBL framework, behaviours are modelled as nets of threshold neurons (corresponding to sets of cNSBL rules), as fibred Neural Nets (fNN for short, introduced in [d’Avila Garcez and Gabbay, 2004]), or as a combination of both.

eNSBL is obtained by representing fNNs as real-valued variables, and by extending the semantics of IMPLY and UNLESS statements so as to admit real-valued variables as arguments.

Subproblem 2 - how to combine sensory information, provided by sensors as continuous signals, with typically discrete rule processing.

Page 16

Fibred Neural Nets (fNN)

A fibring function φ_i from A to B maps the weights W_j of B to new values, depending on the values of W_j and on the input potential I_i of the neuron i in A.

B is said to be embedded into A if φ_i is a fibring function from A to B, and the output of neural unit i in A is given by the output of network B.

The resulting network, composed of networks A and B, is said to be a fibred neural network.

[Diagram: a fibred neural network, in which the fibring function φ_i carries the input potential I_i of neuron i in A onto the weights W_j of the embedded network B.]
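A minimal sketch of the mechanism, under the assumption of simple linear units (illustrative Python; the function names phi and run_embedded are ours):

def phi(I_i, weights_B):
    # illustrative fibring function: modulate B's weights by A's input potential I_i
    return [I_i * w for w in weights_B]

def run_embedded(inputs_B, weights_B):
    # output of B (here a single linear unit); it becomes the output of neuron i in A
    return sum(x * w for x, w in zip(inputs_B, weights_B))

I_i = 0.5                  # input potential of the embedding neuron i in A
W_B = [1.0, -2.0]          # weights W_j of the embedded network B
print(run_embedded([0.3, 0.1], phi(I_i, W_B)))   # the fibred output of neuron i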

Page 17

eNSBL is obtained from cNSBL by allowing neurons to embed other neural networks via fibring functions.

The output of neuron i is represented as an eNSBL real value (which we refer to by the superscript ‘e’).

The statement IMPLY(a, b^e) is interpreted as “if a is true, then the network embedded in neuron b is enabled to compute a value for the eNSBL variable b^e”.

No additional constraints are imposed on the other neurons of embedded networks.

Page 18

As proved by d’Avila Garcez and Gabbay*, fibred neural networks can approximate any polynomial function to any desired degree of accuracy.

Here, fNNs may be used to calculate attractive or repulsive potentials, or cooperative coordination among behaviours; and, for each fibred neural network N_i, the corresponding embedding neuron i enables the embedded network.

* [d’Avila Garcez and Gabbay, 2004].

Page 19

Example of a potential field navigation mechanism based on eNSBL

An attractive (repulsive) potential is represented as a vector whose direction points towards (away from) the goal, and whose magnitude is directly proportional to the distance between the current position and the goal or some sensory cue.

A typical equation for the calculation of the repulsive vector magnitude is

V_magnitude = 1 - x/d   for 0 ≤ x ≤ d
V_magnitude = 0         for x > d

where x is the distance perceived by a range detector device and d is the maximum distance that the sensor can perceive.

These potential field functions can be modelled by fNNs.
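A direct transcription of the repulsive-magnitude formula (plain Python here, rather than the fNN that computes it in the paper):

def repulsive_magnitude(x, d):
    # |V| = 1 - x/d within sensor range, 0 beyond it
    if 0 <= x <= d:
        return 1 - x / d
    return 0.0

print(repulsive_magnitude(0.5, 2.0))  # 0.75: nearer obstacles repel more strongly
print(repulsive_magnitude(3.0, 2.0))  # 0.0: obstacle beyond the sensor range d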

Page 20

A sketch of the neural circuitry for calculating the potential fields

This example includes six neurons, three of which (b, c, and m) embed nested fNNs.

• b calculates a repulsive potential, with sonar readings as input;

• c calculates an attractive potential, taking as input the local position of the robot and a map that represents the target position;

• m blends the repulsive and attractive potentials by vectorial sum into one heading to be sent to the motors.

Page 21

Each of the three computations is triggered by a cNSBL variable. The eNSBL program for this network is:

IMPLY(p, b^e)
IMPLY(q, c^e)
IMPLY(s, m^e)
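A sketch of the whole example in plain vector arithmetic (illustrative Python; in the paper these quantities are computed by the fNNs embedded in b, c and m, each enabled by the corresponding cNSBL variable p, q or s, and the gain below is an assumed parameter):

import math

def repulsive_vector(obstacle_bearing, x, d):
    # neuron b: magnitude 1 - x/d, pointing away from the obstacle
    mag = max(0.0, 1 - x / d)
    return (-mag * math.cos(obstacle_bearing), -mag * math.sin(obstacle_bearing))

def attractive_vector(robot_xy, goal_xy, gain=0.1):
    # neuron c: magnitude proportional to the distance to the goal, pointing towards it
    return (gain * (goal_xy[0] - robot_xy[0]), gain * (goal_xy[1] - robot_xy[1]))

def blend(v_rep, v_att):
    # neuron m: vectorial sum of the two potentials, returned as a heading for the motors
    hx, hy = v_rep[0] + v_att[0], v_rep[1] + v_att[1]
    return math.atan2(hy, hx)

heading = blend(repulsive_vector(0.0, 0.5, 2.0), attractive_vector((0, 0), (5, 5)))
print(math.degrees(heading))   # the blended heading, in degrees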

Page 22

[Diagram: the neural circuitry of the fNN embedded in neuron b, computing the repulsive magnitude V_magnitude = 1 - x/d for 0 ≤ x ≤ d (0 otherwise). The enabling neuron p (W_p = 1) activates the fibring function φ_i; the annotated weights and potentials include W_K = φ_i · 1/d, I_Y = φ_i · W_K · x, W_J1 = -1 · (x/d) · φ_i, W_J2 = d · (x/d) · φ_i, and I_Z = W_J1 · 1 + W_J2 · (1/x) = 1 - x/d; the units use the activation functions f(x) = 1, f(x) = 1/x, f(x) = x and f(z) = z, and the result is gated through a threshold Th(I_Z), giving (1 - x/d) · Th(1 - x/d) · φ_i towards the blending neuron m.]

Page 23

CONCLUSIONS

cNSBL is a significant tool for robotic BBS (behaviour-based systems) insofar as it:
• enables one to meet reactive response times
• enables one to model competitive control

eNSBL extends cNSBL and:
• enables one to model cooperative control
• enables one to combine connectionist and McCulloch & Pitts nets.

Page 24

FUTURE WORK

• Learning in hierarchically organized eNSBL nets

• Wider logical repertoire for robotic control (modal logics, fragments of first order logic)

• Implementation of eNSBL on an FPGA processor.

Page 25

REFERENCES

1. Aiello, A., Burattini, E., Tamburrini, G., 1995, "Purely neural, rule-based diagnostic systems. I, II", International Journal of Intelligent Systems, Vol. 10, pp. 735-769.

2. Aiello, A., Burattini, E., Tamburrini, G., 1998, "Neural Networks and Rule-Based Systems", in Leondes, C. D. (ed.), Fuzzy Logic and Expert Systems Applications, Academic Press, Boston, MA.

3. Arbib, M., 1995, "Schema Theory", in M. Arbib (ed.), The Handbook of Brain Theory and Neural Networks, MIT Press, Cambridge, MA, pp. 830-834.

4. Arkin, R. C., 1998, Behavior-Based Robotics, MIT Press.

5. Brooks, R. A., 1986, "A Robust Layered Control System for a Mobile Robot", IEEE Journal of Robotics and Automation, pp. 14-23.

6. Burattini, E., Datteri, E., Tamburrini, G., 2005, "Neuro-Symbolic Programs for Robots", IJCAI 2005.

7. Burattini, E., De Gregorio, M., Tamburrini, G., 2000, "NeuroSymbolic Processing: non-monotonic operators and their FPGA implementation", in Proceedings of the Sixth Brazilian Symposium on Neural Networks (SBRN 2000), IEEE Press.

8. Burattini, E., Tamburrini, G., 1992, "A pseudo-neural system for hypothesis selection", International Journal of Intelligent Systems, Vol. 7, pp. 521-545.

9. d'Avila Garcez, A. S., Gabbay, D. M., 2004, "Fibring Neural Networks", in Proceedings of the 19th National Conference on Artificial Intelligence (AAAI 04), San Jose, California, USA, AAAI Press.

10. von Neumann, J., 1956, "Probabilistic logics and the synthesis of reliable organisms from unreliable components", in C. E. Shannon, J. McCarthy (eds.), Automata Studies, Princeton University Press.

Page 26

SOME DETAILS

Page 27

[Detailed view of the diagram on Page 22: the fNN embedded in neuron b computes the repulsive magnitude V_magnitude = 1 - x/d for 0 ≤ x ≤ d (0 otherwise). The enabling neuron p (W_p = 1) activates the fibring function φ_i; W_K = φ_i · 1/d; I_Y = φ_i · W_K · x; W_J1 = -1 · (x/d) · φ_i; W_J2 = d · (x/d) · φ_i; I_Z = W_J1 · 1 + W_J2 · (1/x) = 1 - x/d; the result is gated by a threshold unit, giving (1 - x/d) · Th(1 - x/d) · φ_i, which feeds the blending neuron m.]

Page 28

Arguments of IMPLY and UNLESS: variables are literals, and the first argument may be a conjunction of literals p_1, p_2, …, p_n or just one literal.

[Diagram: the net for IMPLY(P_j, p_c), in which p_c receives the inputs p_1, p_2, …, p_n through connections a_{1,c}, a_{2,c}, …, a_{n,c} = 1 and has threshold s_pc = n - …; and the net for UNLESS(P_j, P_m, p_c), in which p_c additionally receives inhibitory connections of weight -j from the literals of P_m and has threshold s_c = j - …]

Page 29

Neural Forward Chaining

[Diagram: forward chaining implemented as a neural network. A knowledge base (KB) of rules and a database (DB) of input literals over a, b, c, d, ¬d, e are encoded as interconnected neurons; activation propagates through the rule neurons until the output literals a, b and d are derived, with control neurons (ctrl, end) signalling termination of the chaining process.]