For this purpose the following essential characteristics are considered.
1) Information transmitter and receiver of a human being are identical; a person has a characteristic information source which is different from the information source of another person. Therefore a man supplies a stochastic process which is typical for him, with an entropy which characterizes him.
2) Information transmission between two persons is considered as a "stochastic synchronization" of this stochastic process. The entropy of each process therefore has a "free" and a "dependent" part. The latter represents the "received transinformation."
To describe this idea mathematically, the theory of Markov processes is used, and stationarity has to be assumed.
Even the philosophers of antiquity tried to describe the process of thinking by means of association and memory. Aristotle described with remarkable clearness the sequence of imagination as a statistical process with inner linkage, as we would call it nowadays, in his short essay "Memory and Recollection." According to him, recollection is characterized by the fact that "movements" (i.e., imaginations) follow each other "habitually" (i.e., mostly). The present sequence is essentially determined by the sequence of the earlier process. Influenced by the mechanical systems of Galilei and Newton, the psychology of association came to a concept of mechanistic-deterministic behavior in its search for a "physics of the soul." Locke and Hartley in England and Herbart in Germany attributed the process of thinking to a causally determined mechanism of association. This view has been corrected by the modern psychology of thinking, which recognized the importance of intuition. Naturally, the value of all of these theories is limited because their statements cannot be quantified. At the end of the 19th century, Galton, Ebbinghaus, and Wundt began with the experimental investigation of association processes, including association sequences. Finally, Shannon's fundamental work [1] rendered a quantitative description and therefore made a measuring of information possible.
The idea of the bidirectional communication theory was first presented by the author at the cybernetics congress at Kiel, Germany, in September 1965 [2] and in May 1966 with a lecture at the congress of the Popov Society in Moscow. An explicit representation of this theory is given in the journal Kybernetik [3], and short representations are given in [4] and [5]. Related mathematical proofs are presented in [6] and [7]. An extension to the communication of a group has been given by Neuburger [8], [9]. The bidirectional communication theory has so far been applied to behavioral sciences [10], [11]. Other applications to statistically coupled systems, e.g., economical systems, are possible.
Two-way communication channels have been investigated earlier by Shannon and others, especially the feedback channel [12], [13] and the crosstalk channel [14]. In these investigations, however, the conventional definition of transinformation is used, and the information source is considered independent and not statistically dependent, as in the present work. The same applies to previous work with the aim to investigate multivariate correlation using informational quantities [15]-[17]. The conventional information theory is capable of giving a generalized measure of correlation, but not of distinguishing the direction of information flow. This is exactly the aim of the bidirectional communication theory.
In the following part, a short description of Shannon's channel with inner statistical linkages between the symbols is presented. Then the model of a communication is established. The last part gives a mathematical representation of communication, and the variables of the directed information are defined. Information flow diagrams are presented for illustration. There it can be seen that Shannon's information channel is a special case of the more general communication theory given in this paper.
II. SHANNON'S TRANSMISSION CHANNEL
Fig. 1 shows the block diagram of unidirectional transmission according to Shannon. It consists of a message source (Q), a coding device (C) which codes the message in a way appropriate for the transmission, the transmission channel, a decoding device (D), and the receiver (R). The channel (Ch) contains a noise source (N) which represents the disturbance. It is essential for a quantitative description of information transmission. Thus the block diagram contains two statistical generators: the message source Q and the noise source N. According to the usual symbolism, the sources are represented by circles; all the other parts are passive and drawn as boxes. The receiver is passive, in contrast to the bidirectional communication model. The receiver is usually considered as ideal, supposing that it can evaluate the received message in its statistical properties optimally (i.e., it is supposed to have storage of infinite size). Because of these assumptions, the transmission becomes independent of the receiver and is determined solely by the properties of the message source and the channel. To apply Shannon's formulas to the general case of a channel with memory considered here, stationarity must be assumed. A sequence of symbols $x_n = x_1 x_2 \cdots x_n$ at the transmitter and a sequence $y_n = y_1 y_2 \cdots y_n$ at the receiver are observed. Both transmitted and received sequences are supposed to have $n$ symbols. The following probabilities are defined.
$p(x_n)$: Probability for the occurrence of the sequence $x_n = x_1 \cdots x_n$ at the transmitter.
$p(y_n)$: Probability for the occurrence of the sequence $y_n = y_1 \cdots y_n$ at the receiver.
$p(x_n, y_n)$: Joint probability for the occurrence of $x_n$ and $y_n$.
$p(x_n \mid y_n)$: Conditional probability for the occurrence of $x_n$ when $y_n$ is known.
$p(y_n \mid x_n)$: Conditional probability for the occurrence of $y_n$ when $x_n$ is known.
$\{\,\}$ designates the expectation value (mean value). The entropies related to one symbol can be calculated from the probabilities as follows.

Entropy at the transmitter:
$$H(x) = \lim_{n \to \infty} \frac{1}{n} \bigl\{-\log p(x_n)\bigr\}. \tag{1}$$

Entropy at the receiver:
$$H(y) = \lim_{n \to \infty} \frac{1}{n} \bigl\{-\log p(y_n)\bigr\}. \tag{2}$$
Fig. 1. Shannon's block schematic of a unidirectional communication channel.
Joint entropy:
$$H(x, y) = \lim_{n \to \infty} \frac{1}{n} \bigl\{-\log p(x_n, y_n)\bigr\}. \tag{3}$$

Equivocation:
$$H_y(x) = \lim_{n \to \infty} \frac{1}{n} \bigl\{-\log p(x_n \mid y_n)\bigr\}. \tag{4}$$

Irrelevancy:
$$H_x(y) = \lim_{n \to \infty} \frac{1}{n} \bigl\{-\log p(y_n \mid x_n)\bigr\}. \tag{5}$$
The mean transinformation between the sequence $x_n$ and the sequence $y_n$, referred to one symbol pair, is found to be
$$T = H(x) + H(y) - H(x, y) = H(x) - H_y(x) = H(y) - H_x(y). \tag{6}$$
This is the fundamental law of the information theory. It can be written in those three versions because of the relation
$$p(x_n, y_n) = p(x_n \mid y_n)\, p(y_n) = p(y_n \mid x_n)\, p(x_n).$$
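As a minimal numerical sketch of (1)-(6), the following Python fragment forms plug-in estimates of the entropies and of the transinformation from two aligned symbol sequences, assuming for simplicity a memoryless source ($n = 1$) so that the limits reduce to single-symbol entropies; the sequences and names are illustrative only.

```python
# Minimal sketch: plug-in estimates of H(x), H(y), H(x,y) and the
# transinformation T of (6), assuming a memoryless source (n = 1).
# The sequences are illustrative, not data from the paper.
from collections import Counter
from math import log2

def entropy(samples):
    """Empirical entropy in bit/symbol of a sequence of hashable symbols."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

x = "AABABBBAABABAABB"              # transmitted symbols
y = "AABABBAAABABAABA"              # received symbols, aligned in time

H_x = entropy(x)                    # (1)
H_y = entropy(y)                    # (2)
H_xy = entropy(list(zip(x, y)))     # joint entropy (3)
T = H_x + H_y - H_xy                # transinformation (6)
print(f"H(x)={H_x:.3f}  H(y)={H_y:.3f}  H(x,y)={H_xy:.3f}  T={T:.3f} bit")
```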
The coding theorems of information and transinformation (not being discussed in this paper) show that a channel defined in the manner described above is actually able to transmit messages of T binary digits (bits) with an arbitrarily small error probability if the message considered is infinitely long.
Shannon's conception describes a unidirectional linkage with an active transmitter and a passive receiver. To apply this conception to a bidirectional linkage, it could be repeated for the opposite direction. This, however, would yield two independent systems in which the two directions are treated entirely separately.
III. THE MODEL OF A COMMUNICATION AND THE GENERATION OF INFORMATION
Fig. 2 shows the block diagram of the new conception. It is supposed to describe the communication between two persons (from here on denoted M1 and M2) information-theoretically. Both have as an essential part the information source Q1, respectively, Q2, two statistical generators with a symbol alphabet which may in general be different (the information which they produce may be interpreted psychologically as conscious processes). Furthermore, they have, like Shannon's channel, a coding and a decoding device (neurophysiologically, the decoding device corresponds to the afferent and the coding device to the efferent signal processing). M1 and M2 are connected by two transmission channels (corresponding to this connection in the external world is, for example, an optical or acoustical linkage). These channels can generally, as in Shannon's case, be disturbed. However, it is not necessary to consider the disturbances in isolation; their influence is contained in the statistical description of the model.
The essential characteristics of the information generation and transmission according to this conception are as follows.
1) The receiver is active and identical with the transmitter of the same side. It generates information continuously, even when both transmission channels are interrupted.
2) The transinformation transmitted during a linkage causes a "stochastic synchronization" of the receiver, and because of this it influences its information.
Fig. 3 shows for M1 and M2 the stochastic processes and the statistical coupling, which is indicated by the dashed lines. The choice of the present message elements (symbols) is dependent on the past symbols of its own process and the past symbols of the other process. Influencing at the same time does not take place; with this, causality has been taken into account.
The following definitions are valid for Markov processes with decreasing statistical linkages; however, it has to be mentioned that all variables defined exist in a more general representation for which only stationarity is required. The entropy of the two processes is denoted $H_1$ and $H_2$. The directed mean transinformation is denoted $T_{12}$ for the direction M2 → M1 and $T_{21}$ for M1 → M2 (the first index refers to the receiver, the second to the transmitter).
Three cases can be distinguished.
1) Decoupling: The linkage is interrupted in both directions. Therefore, $T_{12} = 0$ and $T_{21} = 0$. The two stochastic processes are independent of one another.
2) Monologue, M1 → M2 or M2 → M1: The linkage is interrupted only in one direction, for instance, the lower one of Fig. 2. Then M2 is the transmitter and M1 is the receiver. Now the process of M2 is not influenced; the process of M1 is stochastically synchronized. Consequently $T_{21} = 0$ and $T_{12}$ exists.
3) Dialogue, M1 ⇄ M2: The linkage exists in both directions, and therefore the two processes influence each other. Generally, $T_{12}$ as well as $T_{21}$ exist.
Fig. 3. Time run of the stochastic processes of M1 and M2 and their statistical interdependence.
The last case (dialogue) is the most general; the first two cases are contained in it as special cases. The definitions of the following part therefore refer to this general case.
IV. MATHEMATICAL DESCRIPTION OF COMMUNICATION, DEFINITION OF THE DIRECTED TRANSINFORMATION AND THE FREE INFORMATION
The following conditional probabilities are to be defined; $x$ is a symbol of M1, and $y$ is a symbol of M2 at the same time.

$p(x \mid x_n)$: Conditional probability for the occurrence of $x$ when $n$ previous symbols $x_n$ of the own process are known.
$p(y \mid y_n)$: Similarly.
$p(x \mid x_n y_n)$: Conditional probability for the occurrence of $x$ when $n$ previous symbols of the own process $x_n$ as well as of the other process $y_n$ are known.
$p(y \mid y_n x_n)$: Similarly.
$p(xy \mid x_n y_n)$: Conditional probability for the occurrence of $x$ and $y$ when $n$ previous symbols of both processes are known.
The equations
$$p(x \mid x_n y_n) = p(x \mid x_m y_m), \qquad p(y \mid y_n x_n) = p(y \mid y_m x_m)$$
are valid for Markov processes of the order $m$ when $n \ge m$. The "transition probabilities" $p(x \mid x_m y_m)$ and $p(y \mid y_m x_m)$ determine the two stochastic processes completely; therefore they can be designated as "generator probabilities."
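To make the role of these generator probabilities concrete, the following Python sketch simulates two coupled binary processes of Markov order $m = 1$, each new symbol being drawn from its own generator probability conditioned on the past symbols of both processes, with no influence at the same time instant; the particular transition values are arbitrary assumptions chosen only for illustration.

```python
# Sketch: two coupled binary processes driven by "generator probabilities"
# p(x | x_prev, y_prev) and p(y | y_prev, x_prev), Markov order m = 1,
# with no influence at the same time instant. The numbers are arbitrary.
import random

P_X = {(0, 0): 0.2, (0, 1): 0.7, (1, 0): 0.3, (1, 1): 0.8}   # Pr{next x = 1}
P_Y = {(0, 0): 0.4, (0, 1): 0.4, (1, 0): 0.9, (1, 1): 0.6}   # Pr{next y = 1}

def simulate(n, seed=0):
    """Return n symbols of each process, each drawn from the joint past."""
    rng = random.Random(seed)
    xs, ys = [0], [0]                        # arbitrary initial symbols
    for _ in range(n):
        xp, yp = xs[-1], ys[-1]              # past symbols of both processes
        xs.append(1 if rng.random() < P_X[(xp, yp)] else 0)
        ys.append(1 if rng.random() < P_Y[(yp, xp)] else 0)
    return xs[1:], ys[1:]

xs, ys = simulate(10)
print(xs, ys)
```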
The symbol $\{\,\}$ represents expectation values. The following equations represent mean information values per symbol (entropy), according to the following definitions.

Total information of M1:
$$H_1 = \lim_{n \to \infty} \bigl\{-\log p(x \mid x_n)\bigr\}. \tag{7}$$

Free information of M1:
$$F_1 = \lim_{n \to \infty} \bigl\{-\log p(x \mid x_n y_n)\bigr\}. \tag{8}$$

Directed transinformation M2 → M1:
$$T_{12} = \lim_{n \to \infty} \left\{\log \frac{p(x \mid x_n y_n)}{p(x \mid x_n)}\right\}. \tag{9}$$

Total information of M2:
$$H_2 = \lim_{n \to \infty} \bigl\{-\log p(y \mid y_n)\bigr\}. \tag{10}$$

Free information of M2:
$$F_2 = \lim_{n \to \infty} \bigl\{-\log p(y \mid y_n x_n)\bigr\}. \tag{11}$$

Directed transinformation M1 → M2:
$$T_{21} = \lim_{n \to \infty} \left\{\log \frac{p(y \mid y_n x_n)}{p(y \mid y_n)}\right\}. \tag{12}$$

Coincidence M1 ⇄ M2:
$$K = \lim_{n \to \infty} \left\{\log \frac{p(xy \mid x_n y_n)}{p(x \mid x_n)\, p(y \mid y_n)}\right\}. \tag{13}$$
According to these definitions, the "total information" is equal to the usual entropy of the single process. It can be shown, for example in [6], that (7) and (1) correspond; the same is valid for (10) and (2). The "free information" is the entropy of one process with knowledge of the other. Numerically it is naturally smaller than the entropy without this knowledge. The difference of entropy $T_{12} = H_1 - F_1$ designates the statistical influence of the second process on the first process and is defined as the "directed transinformation." Furthermore, it can be shown [6] that the coincidence according to (13) agrees with Shannon's transinformation equation (6), with the assumption, as before, of decreasing statistical linkages. It could be called "undirected" or "total" transinformation. It couples the two processes in a symmetrical manner; therefore it does not have a special direction. The following important relation for the coincidence is valid:
$$K = T_{12} + T_{21}. \tag{14}$$
For the case "monologue," one of the two transinformations vanishes; therefore the other transinformation is equal to the coincidence and to Shannon's transinformation. With this, it is proven that Shannon's channel represents a special case of the bidirectional communication.

All the variables defined above are positive. The directed transinformation represents the information gain for the next own produced symbol due to the received message. The total information is the entropy of the own process; it has one part which is due to the received transinformation, and another part, the "free" information; this can be shown from the definitions, since
$$H_1 = T_{12} + F_1 \tag{15}$$
$$H_2 = T_{21} + F_2. \tag{16}$$
Finally, the following conditional entropies, called residual entropies, are introduced:
$$R_1 = H_1 - K = F_1 - T_{21} \tag{17}$$
$$R_2 = H_2 - K = F_2 - T_{12}. \tag{18}$$
Then the above relations can be represented clearly by the information flow diagram of Fig. 4. The two stochastical generators of M1 and M2 generate the free information. At the nodes, (15)-(18) are valid as "Kirchhoff laws."
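The following self-contained Python sketch illustrates these quantities with order-1 plug-in estimates for a simulated "monologue" M2 → M1 (the process of M2 is a fair coin toss, and M1 mostly copies the previous symbol of M2); the modelling choices and estimator names are illustrative assumptions, not taken from the paper.

```python
# Sketch: order-1 plug-in estimates of (7)-(18) for a simulated monologue
# M2 -> M1. All modelling choices are illustrative assumptions.
import random
from collections import Counter
from math import log2

def cond_entropy(pairs):
    """H(target | context) in bit, estimated from (context, target) samples."""
    joint = Counter(pairs)
    ctx = Counter(c for c, _ in pairs)
    n = len(pairs)
    return -sum(cnt / n * log2(cnt / ctx[c]) for (c, _), cnt in joint.items())

rng = random.Random(1)
ys = [rng.randint(0, 1) for _ in range(200_001)]     # process of M2 (free coin toss)
xs = [0] + [yp if rng.random() < 0.9 else rng.randint(0, 1) for yp in ys[:-1]]
xs, ys = xs[1:], ys[1:]                              # x_t mostly copies y_{t-1}

H1 = cond_entropy([(xs[t-1], xs[t]) for t in range(1, len(xs))])               # (7)
F1 = cond_entropy([((xs[t-1], ys[t-1]), xs[t]) for t in range(1, len(xs))])    # (8)
H2 = cond_entropy([(ys[t-1], ys[t]) for t in range(1, len(ys))])               # (10)
F2 = cond_entropy([((ys[t-1], xs[t-1]), ys[t]) for t in range(1, len(ys))])    # (11)

T12, T21 = H1 - F1, H2 - F2     # directed transinformations, cf. (15), (16)
K = T12 + T21                   # coincidence, relation (14)
R1, R2 = H1 - K, H2 - K         # residual entropies (17), (18)
print(f"H1={H1:.3f} F1={F1:.3f} T12={T12:.3f}  H2={H2:.3f} F2={F2:.3f} T21={T21:.3f}")
print(f"K={K:.3f} R1={R1:.3f} R2={R2:.3f}")   # node balances: H1=T12+F1, H2=T21+F2
```

For this monologue, $T_{21}$ comes out near zero and $R_1$ is approximately equal to $F_1$, as expected from the discussion of Fig. 5 below.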
Fig. 4. Information flow diagram with a bidirectional communication.
Fig. 4 shows, for example, that the total information $H_1$ is composed of the freely generated information $F_1$ and the received transinformation $T_{12}$. In the transmission direction $T_{12}$ has to be subtracted, for it is already known by the counterpart. $F_1$ is then transmitted, but due to the absence of a strong coupling, the residual entropy $R_1$ is subtracted. This may be interpreted as the effect of disturbances, if desired. Finally, $T_{21}$ reaches the counterpart and, together with its freely generated information $F_2$, yields its total information $H_2$. For the opposite direction, the same reasoning holds.
Fig. 5 represents the special case of "monologue" M2 → M1. It corresponds to Shannon's unidirectional transmission channel because $T_{21} = 0$. Here $R_2$ represents the equivocation and $R_1 = F_1$ the irrelevancy.

Fig. 6 finally shows the information flow diagram for the special case of "decoupling."
Now the laws which the information flows obey are considered. From Fig. 4 it follows that the sum of the arriving currents has to be equal to the sum of the leaving currents:
$$F_1 + F_2 = R_1 + R_2 + K = H_1 + H_2 - K. \tag{19}$$
From this we get an important statement. The larger the coincidence, that is, the total transmitted transinformation in both directions, the smaller is (for given entropies $H_1$ and $H_2$) the sum of the free informations. An effective communication limits the sum of the freely generated information, which seems to be logical because of the strong coupling of the two processes in this case. There is another limitation for the magnitude of the transinformation flows which results from the identity of the total information defined here with Shannon's entropy. The following inequalities are valid, since Shannon's transinformation, here the coincidence, is always less than $H(x)$ as well as $H(y)$:
$$K \le H_1 \tag{20}$$
$$K \le H_2. \tag{21}$$
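Continuing the numerical monologue sketch above (same variable names), the flow law (19) and the bounds (20) and (21) can be checked directly:

```python
# Continuation of the monologue sketch: flow law (19) and bounds (20), (21).
assert abs((F1 + F2) - (R1 + R2 + K)) < 1e-9    # (19), holds exactly by construction
assert abs((F1 + F2) - (H1 + H2 - K)) < 1e-9    # (19), second form
print(f"K={K:.3f} <= H1={H1:.3f}:", K <= H1)     # (20)
print(f"K={K:.3f} <= H2={H2:.3f}:", K <= H2)     # (21)
```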
Fig. 5. Information flow diagram with a "monologue" M2 → M1.

Fig. 6. Information flow diagram with decoupling.

Inserting this in (15)-(18) yields
$$T_{12} \le F_2 \tag{22}$$
$$T_{21} \le F_1 \tag{23}$$
$$R_1 \ge 0 \tag{24}$$
$$R_2 \ge 0. \tag{25}$$
With this the directed transinformations are limited. Especially
clear are (22) and (23). They state that the received transinformation at M1 cannot be larger than the freely generated information of M2. It is not possible, for example, that the information generated by M1 can be reflected and retransmitted by M2, because M1 knows this information already. Because of this limitation, no loop current can develop in the loop of Fig. 4. The two transinformations as well as the coincidence become maximum if the inequalities become equations. This case is called "maximum coupling." If the relations
$$T_{12} = F_2 \tag{26}$$
$$T_{21} = F_1 \tag{27}$$
exist, then the information flow diagram of Fig. 7 is valid. It can be seen that the free information of the opposite end appears as transinformation; that means it is accepted entirely. Both residual entropies vanish. The total information on both sides has the same magnitude and is composed of the sum of the two free informations.
It seems to be meaningful to define a "stochastical degree of synchronization" or, psychologically, a "degree of perception," which is given by the received transinformation referred to the total information:
$$u_1 = \frac{T_{12}}{H_1} \tag{28}$$
$$u_2 = \frac{T_{21}}{H_2}. \tag{29}$$
$u_1$ as well as $u_2$ can vary between 0 and 1.
The case $u = 1$ is called "suggestion." The free information of the receiver vanishes, and the total information is determined only by the received transinformation. It can be shown that either $u_1 = 1$ or $u_2 = 1$, but not both equal to one at the same time, can occur; i.e., a suggestion in both directions at the same time is impossible.
Fig. 7. Information flow diagram with maximum coupling.
For the sum of the two degrees of synchronization the following holds generally:
$$u_1 + u_2 \le \frac{T_{12}}{T_{12} + T_{21}} + \frac{T_{21}}{T_{21} + T_{12}} = 1. \tag{30}$$
It may be noted that the minimum values for $F_1$ and $F_2$ are introduced in this equation according to (22) and (23). The equality sign holds for the case of maximum coupling, according to (26) and (27). This means that $u_1 + u_2 = 1$ for maximum coupling. Given this case, $u_1 = 1$ corresponds to the case of suggestion M2 → M1, and $u_2 = 1$ to the case of suggestion M1 → M2.
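Continuing the monologue sketch, the degrees of synchronization in (28) and (29) and the bound (30) can be evaluated directly (here $u_2$ is near zero):

```python
# Continuation of the monologue sketch: degrees of synchronization (28), (29)
# and the bound (30).
u1, u2 = T12 / H1, T21 / H2
print(f"u1={u1:.3f}  u2={u2:.3f}  u1+u2={u1 + u2:.3f}  (bound: u1+u2 <= 1)")
```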
Fig. 8 shows the possible values of $u_1$ and $u_2$ according to (30). They are situated within a rectangular triangle. The hypotenuse represents the case of maximum coupling. The two cathetuses correspond to the monologue ($u_1 = 0$, respectively, $u_2 = 0$) and therefore to Shannon's unidirectional channel. Two corner points describe the suggestion, and the origin means decoupling. The diagram contains all special cases, and shows in which manner the bidirectional communication is a generalization of Shannon's information theory.
With the definitions suggested in this paper, it seems to be possible to give a quantitative description of the communication between human beings in terms of communication theory. The mathematical model requires the existence of two (or more) statistical processes, and describes their mutual coupling by means of a stochastical synchronization.
V. APPLICATIONS
The bidirectional communication theory has been applied to the social behavior of monkeys [11]. Two monkeys of a social group have been put together in one cage, and their actions have been observed and registered for a period of 15 min. Five typical behavioral activities (seating, leaving, genital display, hurr call (expressing aggressive motivation), pushing) formed two statistical time sequences to which the bidirectional communication theory has been applied. The results were typical for the social relation and agreed for each selected pair of two individuals. Two already known modifications of the dominance behavior showed typical values with regard to the communication quantities (see Fig. 9).
Fig. 8. Possible values of the degrees of synchronization $u_1$ and $u_2$.
Fig. 9. Information flow diagrams for bidirectional communication as group behavior between two monkeys: "dictator" behavior and "hero" behavior of the dominant and subdominant animal. (All figures given in bit/action.)
1) The behavior of "dictator," in which the free action, as expressed by the free entropy, of the dominant animal is greater as compared with the subdominant one. The major behavioral influence, as expressed by the directed transinformation, goes from the dominant to the subdominant animal.
2) The behavior of "hero," in which the free actions of both animals are about equal. Most of the behavioral influence goes in the opposite direction, namely, from the subdominant to the dominant animal.
The "dictator" modification is unstable. It occurred immediately after the establishment of the dominance relationship, and changed continuously in about 6 weeks' time into the "hero" modification, which proved to be the stable one.
The observations and evaluations have been performed as a cooperation between the Institut für Nachrichtentechnik of the Technische Hochschule München and the Deutsche Forschungsanstalt für Psychiatrie, München, by W. Mayer.
In biology and sociology, communication theory seems to have wide applications due to the fact that living beings are (by their actions) sources of information which influence each other in all possible directions.

In order to avoid misunderstanding, it has to be mentioned that the present theory is able to describe the communication between two persons or between two living beings only from an information theoretical point of view. The limited importance of this view results from the definition of the information based on the probability of the message. Furthermore, this conception is only a first approximation of this problem because stationarity is assumed, therefore excluding
learning processes. An extension to the nonstationary, respectively, the quasi-stationary case, considering learning and forgetting, seems to be possible and meaningful. Furthermore, an extension of this theory to communication relations within a group has been done in [9]. Using this, the investigation of multivariate systems, e.g., socioeconomical systems, seems to be possible in a similar way. The ability of distinguishing the direction of information flow with this theory may prove to be a useful tool for examining multivariate complex systems.
ACKNOWLEDGMENT

The author wishes to thank Dr. Neuburger, who helped with many fruitful discussions and mathematical proofs, and who extended the theory to the multidirectional case.
REFERENCES
[1] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, pp. 379-423, 623-656, 1948.
[2] H. Marko, "Informationstheorie und Kybernetik," in Fortschr. d. Kybernetik: Bericht über die Tagung Kiel 1965 der Deutschen Arbeitsgemeinschaft Kybernetik. Munich, Germany: Oldenbourg-Verlag, 1967, pp. 9-28.
[3] —, "Die Theorie der bidirektionalen Kommunikation und ihre Anwendung auf die Informationsübermittlung zwischen Menschen (subjektive Information)," Kybernetik, vol. 3, pp. 128-136, 1966.
[4] —, "Information theory and cybernetics," IEEE Spectrum, pp. 75-83, 1967.
[5] H. Marko and E. Neuburger, "A short review of the theory of bidirectional communication," Nachrichtentech. Z., vol. 6, pp. 320-323, 1970.
[6] —, "Über gerichtete Größen in der Informationstheorie. Untersuchungen zur Theorie der bidirektionalen Kommunikation," Arch. Elek. Übertragung, vol. 21, no. 2, pp. 61-69, 1967.
[7] E. Neuburger, "Zwei Fundamentalgesetze der bidirektionalen Kommunikation," Arch. Elek. Übertragung, no. 5, pp. 208-214, 1970.
[8] —, "Beschreibung der gegenseitigen Abhängigkeit von Signalfolgen durch gerichtete Informationsgrößen," Nachrichtentech. Fachberichte, vol. 33, pp. 49-59, 1967.
[9] —, Kommunikation der Gruppe (Ein Beitrag zur Informationstheorie). Munich, Germany: Oldenbourg-Verlag, 1970.
[10] G. Hauske and E. Neuburger, "Gerichtete Informationsgrößen zur Analyse gekoppelter Verhaltensweisen," Kybernetik, vol. 4, pp. 171-181, 1968.
[11] W. Mayer, "Gruppenverhalten von Totenkopfaffen unter besonderer Berücksichtigung der Kommunikationstheorie," Kybernetik, vol. 8, pp. 59-68, 1970.
[12] C. E. Shannon, "The zero-error capacity of a noisy channel," IRE Trans. Inform. Theory, vol. IT-2, pp. 8-19, Sept. 1956.
[13] J. P. M. Schalkwijk, "Recent developments in feedback communication," Proc. IEEE, vol. 57, pp. 1242-1249, July 1969.
[14] C. E. Shannon, "Two-way communication channels," in Proc. 4th Berkeley Symp. Math. Stat. and Prob., vol. 1. Berkeley: Univ. California Press, 1961, pp. 611-644.
[15] W. J. McGill, "Multivariate information transmission," Psychometrika, vol. 19, pp. 97-116, 1954.
[16] W. R. Ashby, "Information flow within coordinated systems," in Publ. 203, Int. Congr. Cybern., John Rose, Ed. London, 1969, pp. 57-64.
[17] S. Watanabe, "Information theoretical analysis of multivariate correlation," IBM J., pp. 66-82, 1960.