Czech Technical University in Prague
Faculty of Mechanical Engineering
Ph.D. Thesis Summary
Modeling of Complex Dynamic Systems by
Nonconventional Artificial Neural Architectures and
Adaptive Approach to Evaluation of Chaotic Time Series
Ing. Ivo Bukovský
In the field of
Control and Systems Engineering
Supervisor
Prof. Ing. Jiří Bíla, DrSc.
Prague, July 2007
The Ph.D. thesis was conducted at the Department of Instrumentation
and Control Engineering at the Faculty of Mechanical Engineering of
the Czech Technical University in Prague
Ph.D. Candidate: Ing. Ivo Bukovský
Supervisor: Prof. Ing. Jiří Bíla, DrSc.
Opponents:
The thesis summaries were distributed on:
The Ph.D. thesis defense is scheduled on:
in conference room # 17, 1st floor in the Faculty of Mechanical Engineering,
Czech Technical University in Prague, Technicka 4, Prague 6 – Dejvice,
examined by an ad hoc appointed committee in the field of Control and Sys-
tems Engineering.
The Ph.D. thesis is available from the Department of Science and Research
of the Faculty of Mechanical Engineering, CTU, Technicka 4, Prague 6 –
Dejvice.
Prof. Ing. Pavel Zítek, DrSc.
The Chairman of Postgraduate Studies Board
in Technical Cybernetics
Faculty of Mechanical Engineering
Czech Technical University in Prague
Abstrakt
The dissertation establishes the theory of nonconventional artificial neural architectures with nonlinearities in the synaptic neural operation, and the theory of continuous-time linear and nonlinear neural units with adaptable time delays on the neural inputs and in the state feedback. The nonlinear aggregation function, especially together with time delays implemented as neural parameters, substantially increases the approximation capability of individual artificial neurons and minimizes the number of neural parameters needed to approximate a complex system, whether by a neural network or by a standalone neural unit. A parallel is drawn between the mathematical structure of the proposed nonlinear neural architectures and the biological structure of neurons, with focus on the computational capacity of individual biological neurons. A practical technique ensuring the stable adaptation of dynamic neural units is proposed and applied. Applications and results are demonstrated on system approximation and control tasks. A new methodology for the adaptive evaluation of the variability of complex signals, based on the proposed nonconventional neural architectures, is introduced. The methodology rests on evaluating the variability of neural parameters during the adaptation of an artificial dynamic neural unit. A new class of artificial neural units with approximation capability increased by an adaptable input signal preprocessor is introduced. Applications and results are demonstrated on the sensitive detection of instantaneous changes in the variability of both simulated signals and real cardiac tachograms (beat-by-beat variability monitoring). A methodology for the visualization of the detected variability changes is proposed, which further enables exact evaluation by mathematical methods.
Abstract
First, the thesis establishes the theory of novel artificial neural architectures
with higher-order nonlinear synaptic neural operation and with linear and
nonlinear continuous-time neural architectures with adaptable time delays in
both the state feedback and neural inputs. The higher-order nonlinear aggre-
gation function, especially together with time-delay neural parameters of a
unit, increases the computational capability of the static and dynamic neural
units and thus simplifies the neural architecture and minimizes the number of
neural parameters necessary for complex system approximation. The parallel
between the novel neural architectures and a real biological neuron is drawn
especially with focus on higher computational capability expected from sin-
gle neurons. The practical stability-maintaining technique for the learning
algorithm of dynamic neural architectures is proposed and applied. The ap-
plications to the approximation and control of systems are shown. Second, a
novel methodology based on utilization of the proposed neural architectures
for adaptive evaluation of variability in complex signals is established. The
methodology is based on observation of neural parameters during the adapta-
tion of a dynamic neural unit. Novel class of neural units with approximating
capability increased by the adaptable signal preprocessor is established.
Applications and results are shown on sensitive sample-to-sample detection
of changes in dynamics of highly chaotic time series including simulated
time series and real heart-beat tachograms (beat-by-beat variability monitor-
ing). The methodology for visualization of the detected variability is devel-
oped and enables consequent exact evaluation by further mathematical
measures and methods.
Acknowledgements
I would like to express my sincere thanks to my supervisor, Prof. Ing. Jiří Bíla, DrSc.¹, for his professional and valuable supervision, open-minded attitude towards my research, and all support resulting in this Ph.D. thesis. Very special thanks belong to Professor Madan M. Gupta², who kindly introduced me to the research of nonconventional neural architectures. I would like to thank him for his excellent supervision during my visit to his research lab in 2003 and for his kind overseas co-supervision since 2003.
My sincere thanks also belong to Professor Zeng-Guang Hou³ for his valuable suggestions regarding my research, publication activity, and support within the scientific community.
The development of higher-order nonlinear neural units (HONNU), time-delay dynamic neural units (TmD-DNU), and time-delay dynamic higher-order nonlinear neural units (TmD-HONNU) are parallel branches of the research of nonconventional neural architectures. By courtesy of the NATO Science Fellowships Program and with the partial support of an internal grant of the Czech Technical University in 2003, the research of these nonconventional neural units was conducted in cooperation of the Department of Instrumentation and Control Engineering directed by Prof. Jiří Bíla with the Intelligent System Research Laboratory in Canada directed by Prof. M. M. Gupta, and also with Prof. Zeng-Guang Hou.
Ivo Bukovský Prague, March 2007
¹ Professor and director of the Department of Instrumentation and Control Engineering at the Faculty of Mechanical Engineering, Czech Technical University in Prague.
² Professor and director of the Intelligent System Research Laboratory at the University of Saskatchewan in Canada (IEEE life fellow, SPIA).
³ Professor at the Key Laboratory of Complex Systems and Intelligence Science, Institute of Automation at the Chinese Academy of Sciences, Beijing, China.
Contents
ACKNOWLEDGEMENTS
CONTENTS
GOALS OF THE THESIS
1 INTRODUCTION
2 STATE OF THE CURRENT RESEARCH
2.1 Heart Rate Variability and Nonlinear Methods
2.2 Heart Rate Variability and Artificial Neural Networks
2.3 Onsets of Nonconventional Artificial Neural Units
3 DETERMINISTIC CHAOS, HRV, AND AUTONOMOUS NERVOUS SYSTEM
4 DEVELOPMENT OF NONCONVENTIONAL ARTIFICIAL NEURAL ARCHITECTURES
4.1 The Learning Algorithm
4.2 Classification of Nonconventional Neural Units
5 DISCRETE HONNU AND ADAPTIVE APPROACH TO MONITORING OF VARIABILITY OF COMPLEX TIME-SERIES
6 CONCLUSIONS
6.1 Summary of Achievements
6.2 The Limitations and Challenges for Further Research
AUTHOR’S PUBLICATIONS AND SELECTED REFERENCES
Goals of the Thesis
The thesis introduces novel artificial neural units as a new tool for the evaluation of complex systems that generate complex output signals featuring chaos and multi-attractor behavior, and in which external and internal perturbations may occur. The new adaptive evaluation of the variability of chaotic time series has been established and has proven a very promising methodology.
1) The theory of nonconventional neural units is established as required for general nonlinear approximation (identification) of complex dynamic systems, with focus on high computational (approximation) capability, a minimum number of network parameters, and simplicity of neural architecture. This can be further categorized as follows:
1.1) To develop a tool capable of describing complex systems with a minimum
number of neural parameters and with a sufficient approximating capabil-
ity.
1.2) To benefit from cognitive capabilities of artificial neural network tools, and
1.3) To maintain simplicity of mathematical notation of a problem consisting in
a low number of parameters and simple dynamic structure.
2) The second objective is to introduce these new artificial neural architectures as a tool applicable to fast monitoring of changes in the levels of variability in signals generated by complex dynamic systems. This objective can be further categorized as follows:
2.1) To develop a tool reflecting important characteristics of the dynamics as well as appropriately responding to sudden and continuous changes in the dynamics of complex systems in real time, and
2.2) To establish foundations for a novel method for the variability evaluation of complex signals.
1 INTRODUCTION
Since the beginning of the development of artificial neurons and artificial neural networks (NN), the emphasis has been put on the linear synaptic operation (input aggregating function) of a neuron, while the neural somatic (output) operation has been considered nonlinear, usually except in the output layer. Nonlinear synaptic operation has not attracted much interest in the literature. A conventional structure of an artificial neuron (linear synaptic operation) can provide NN with the ability to
solve typical tasks such as pattern classification, identification, adaptive control, and signal prediction. A conventional NN still represents a black or gray box that does not provide users with a useful explicit mathematical description of a problem, and it prevents them from seeking natural and simple solutions by further mathematical system analysis. The implementation of multiple conventional neural units minimizes the chance of finding an appropriate mathematical solution in the form of corresponding, meaningful, and simple mathematical equations hidden in the structure of a trained or adapted NN.
The design of artificial neural architectures that would consist of a minimal
number of neural parameters and would sustain high computational power has been
one of the strongest motivations for this research.
The developed discrete and continuous static and dynamic higher-order nonlinear neural units (HONNU) [1][2][15], continuous time-delay dynamic neural units (TmD-DNU) [6][7][8], and continuous time-delay dynamic higher-order nonlinear neural units (TmD-DHONNU) [23] represent a movement toward the design of a more natural morphology of an artificial neural unit that is endowed, among other properties, with a natural ability to reveal an existing implicit mathematical solution in the form of corresponding static or dynamic equations. These new neural units can be applied to static or dynamic tasks for nonlinear static as well as dynamic systems. They can be implemented in a network or can function as standalone units undertaking the role of generally applicable adaptive algorithms for the approximation of high-order and nonlinear systems by comprehensible mathematical equations. The units promise to provide novel solutions to engineering problems dealing with complex systems.
The major motivations for the development of nonconventional neural units were
1. the need for a new computation tool applicable to complex systems,
2. the increase of computational capability of artificial neural units with a possible
functional and structural resemblance to the real biological neurons, and
3. the simplification of artificial neural architectures.
It has become apparent that the research of the evaluation of complex nonlinear dynamic systems, heart rate variability, and the development of HONNU, TmD-DNU, and TmD-DHONNU are closely related issues. A crucial simplification of an artificial neural network structure would result in a simpler acquisition of the knowledge stored in a minimum number of neural parameters, yielding an appropriate and as simple as possible mathematical description of a complex system. Such a simplification of system description can help with many current problems; in particular, it can influence many current engineering challenges regarding complex dynamic systems.
2 STATE OF THE CURRENT RESEARCH
2.1 Heart Rate Variability and Nonlinear Methods
The achievements and limitations of nonlinear methods in the evaluation of chaotic systems and HRV [16][17][22] are summarized and discussed in the thesis. This summary introduces the challenge of a novel evaluation of chaotic systems and the multi-attractor dynamics of HRV.
Common nonlinear methods suffer from the complex dynamic and nonlinear nature of the data (system), e.g., R-R diagrams (the interbeat time series). The major properties expected from signals for the correlation dimension (CD) to be reliably evaluated [16] are:
1. the signal is not too complex (the embedding dimension is not too high),
2. the signal is sufficiently self-returning (cycle-like behavior),
3. the signal is long enough,
4. the signal has an appropriately low noise-to-signal ratio.
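To make the CD-related conditions concrete, the correlation sum underlying the CD estimate can be sketched as follows. This is an illustrative Grassberger-Procaccia-style sketch, not code from the thesis; the embedding dimension m, lag tau, and radius r are hypothetical choices.

```python
import numpy as np

def correlation_sum(x, r, m=3, tau=1):
    """Correlation sum C(r): fraction of embedded point pairs closer than r.

    x is a scalar time series, delay-embedded with dimension m and lag tau.
    The CD is estimated from the slope of log C(r) vs. log r in a scaling region.
    """
    N = len(x) - (m - 1) * tau
    # delay-embedded trajectory, shape (N, m)
    X = np.column_stack([x[i * tau : i * tau + N] for i in range(m)])
    # pairwise Euclidean distances between embedded points
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(N, k=1)          # each pair counted once
    return float(np.mean(d[iu] < r))
```

Short or noisy signals leave too few close pairs in the scaling region, which is one way conditions 1 to 4 manifest in practice.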
Even though the above conditions #1 to #4 were originally formulated for CD, other common nonlinear methods, such as the Largest Lyapunov Exponent (LLE), also suffer when these conditions are violated. Physiological signals, including signals generated by the human cardiovascular system of healthy and steady subjects, and even deterministically simulated data [13], often break the above conditions to a considerable extent. The results of HRV evaluation by nonlinear methods therefore suffer from a considerable degree of uncertainty and have led to inconsistent results and conclusions. The above conditions should be viewed as relevant to a particular evaluated signal generated by a system that stays on a single attractor during the data acquisition. However, the system can alter its behavior by switching between chaotic modes and need not always run in a chaotic mode [9] to [13].
The possible existence of multiple strange attractors in HRV has already been indicated in the literature (Kanters et al.). The idea of the multi-attractor behavior of HRV is supported by research and simulation experiments [16][17][13]. It points out the low efficiency of common nonlinear methods for the evaluation of HRV: they have to be applied to long-enough time series of R-R interbeat recordings of not too complex dynamics, yet the dynamics (attractor) may change significantly within the evaluated signal during a single recording. The evaluation of the correlation dimension has also failed even for apparently single-attractor, highly chaotic signals generated by a model whose parameters were kept constant during simulation [9] to [13].
The behavior of the cardiovascular system becomes complex largely because of the fast beat-by-beat control influences of the autonomous nervous system (ANS); this has been investigated within the cooperation of U12110.3 CVUT FS and the 1st Faculty of Medicine of Charles University in Prague [19][20]. The ability of the ANS to develop complex (chaotic) heartbeat dynamics via the settings of even very few physiological parameters in the “control loop of ANS” has been revealed. High-level deterministic chaos in HRV can develop because of the time-delayed fast beat-by-beat control influences of the ANS. The chaos can be high even if these delays are kept constant [13]; the development of deterministic chaos in time-delay systems is in good conformance with observations made in technical systems and differential equations that are much simpler than the human cardiovascular system, e.g., the Mackey-Glass delay differential equation. The idea of chaos in HRV due to the fast ANS control corresponds well with the regular heart rate observed in patients after heart transplants, where the neural lines of the ANS were cut and the fast ANS feedback control was disabled.
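The Mackey-Glass equation mentioned above can be integrated with a simple Euler scheme to see chaos emerge from a constant delay. This is an illustrative sketch, not a model from the thesis; the parameter values (beta = 0.2, gamma = 0.1, n = 10, tau = 17) are the commonly cited chaotic regime and are assumed here.

```python
import numpy as np

def mackey_glass(beta=0.2, gamma=0.1, n=10, tau=17.0, dt=0.1, steps=5000, x0=1.2):
    """Euler integration of the Mackey-Glass delay differential equation:

        dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t)

    The history before t = 0 is taken as the constant x0.
    """
    k_tau = int(round(tau / dt))       # delay expressed in integration steps
    x = np.full(steps + 1, x0)
    for k in range(steps):
        x_del = x[k - k_tau] if k >= k_tau else x0
        x[k + 1] = x[k] + dt * (beta * x_del / (1.0 + x_del**n) - gamma * x[k])
    return x
```

Even though all parameters, including the delay, stay constant, the resulting trajectory is aperiodic, mirroring the observation that constant ANS delays can still produce high chaos in HRV [13].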
Interesting results are obtained by the recurrence-plot (RP) method (Eckmann et al. 1987), further developed and applied also to the evaluation of HRV with successful prediction of ventricular tachyarrhythmia [18]. The RP visualizes n-dimensional system behavior in a 2-D plot and thus reveals hidden periodicity (recurrences) of the evaluated signal; i.e., the RP displays a two-dimensional visualization of a self-approaching orbit in a reconstructed state space of an appropriately chosen embedding dimension (theoretically ED ≥ 2n + 1, Takens). The RP method is reported to be able to capture also inter-attractor transitions of a system (also referred to as “laminar states” [18]), where common nonlinear methods may not provide reliable results, as mentioned above.
The common nonlinear methods (CD, LLE, …) become ambiguous for systems with varying dynamics, whether due to varying system parameters, external perturbations, deterministic transients on multiple chaotic attractors, or too high chaos and high dimensionality (order, embedding) of a system [13].
The recurrence plot handles multi-attractor behavior to some extent, but the method works in a ‘sliding-window’ regime and has to wait for a relatively large number of new samples before significant changes become evident in the recurrence plot, i.e., before the plot can clearly indicate the modified dynamics (change of attractor) of a system. The recurrence plot cannot always display changes in dynamics clearly; this is demonstrated on the sudden change of the bifurcation parameter a of a chaotic time series generated by the well-known logistic equation in Figure 1.
Figure 1: The recurrence plot of a time series generated by the logistic equation; the recurrence plot does not clearly indicate the change of the bifurcation parameter a from 3.95 to 3.96 (see Figure 14 in this summary).
[Figure 1 plot: time series x(k) of the logistic equation x(k+1) = a·x(k)·(1 − x(k)) for k = 0 … 200, with the bifurcation parameter changing from a = 3.95 to a = 3.96.]
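The experiment of Figure 1 can be reproduced in outline: generate the logistic-map series with the bifurcation parameter switching from 3.95 to 3.96 and threshold the pairwise distances into a recurrence matrix. The switch index, the threshold eps, and the embedding dimension of 1 are illustrative assumptions, not the settings used for Figure 1.

```python
import numpy as np

def logistic_series(n, a_values, x0=0.5):
    """Iterate x(k+1) = a * x(k) * (1 - x(k)) with a possibly changing parameter a."""
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = a_values[k] * x[k] * (1.0 - x[k])
    return x

def recurrence_matrix(x, eps=0.02):
    """Thresholded distance matrix: R[i, j] = 1 if |x(i) - x(j)| < eps (embedding dim 1)."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

# parameter a switches from 3.95 to 3.96 halfway through, as in Figure 1
a = np.where(np.arange(200) < 100, 3.95, 3.96)
x = logistic_series(200, a)
R = recurrence_matrix(x)
```

Plotting R for both halves of the series shows visually similar recurrence textures, which illustrates why the small parameter change is hard to spot in the plot.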
2.2 Heart Rate Variability and Artificial Neural Networks
The use of NN for the prediction of chaotic signals with focus on the analysis of cardio-signals has been elaborated in [17] (Mankova 1997) and further studied in [16] (Vitkaj 2001). The conclusions from analytical and practical observations, resulting from the development of feed-forward neural-network (FFNN) models and their application to the prediction of chaotic signals and R-R diagrams (interbeat tachograms), introduce important facts that conform to the original ideas leading to this thesis. A summary of the conclusions from [16] most relevant to the conception of this thesis follows:
1) “Small” FFNN (~4/6/1) seemed to extract the characteristic orbit of the system or the characteristic transition between orbits. That is, an FFNN with a smaller number of neurons can still learn the chaotic behavior accurately enough to capture the transition to another (coexisting) attractor.
2) “Large” FFNN (16/24/1) generated a signal with a frequency spectrum more similar to the original signal. That is, an FFNN with a higher number of neurons tends to learn the dynamics of a system as a whole.
3) Low-dimensional chaotic systems with an appropriately “returnable orbit” are very well predictable by simple neural models. That is, a simple FFNN can very well predict systems behaving apparently on a single attractor.
4) Predictive NN models do not have to characterize the modeled process as a whole; the extracted characteristics can possibly be used for modeling and classification of chaotic systems. That is, even though the trained NN describes only the actual dynamics of a system for the single orbit (attractor) on which the system currently behaves, it can be used for modeling and classification of chaotic signals.
5) NN can extract an attractor’s geometrical characteristics from noisy data and a low number of input data; this NN property may even improve in the case of incremental adaptation of NN if artefacts or noise are present in a minority of data samples. For example, if 10% of samples were due to unwanted artefacts or affected by noise, then 90% of the weight increments accurately approach the system dynamics while only 10% divert the adaptation away.
6) NN can be successfully applied in cases where ECG recordings are not long enough and where common nonlinear methods cannot be used due to their insufficient convergence or instability.
7) Pathological changes within a critical group of patients can possibly be detected by generating the residua resulting from comparing the NN model of a patient with actual physiological recordings.
8) The design of an NN model of a patient will not be trivial because of the complexity and multi-attractor behavior.
9) NN models should also be used for the decomposition of multi-attractor dynamics. Due to the multi-attractor dynamics, the common nonlinear methods are incorrect over the whole length of the recorded signal, do not converge, or result in a significant variance of results, thus disabling precise and detailed medical diagnostics.
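Conclusions 1) to 3) can be illustrated by a “small” feed-forward network trained by gradient descent to predict the logistic map one step ahead. The network size (1/6/1), learning rate, and data below are hypothetical and do not reproduce the models of [16][17]; they only show the kind of one-step prediction task those conclusions refer to.

```python
import numpy as np

rng = np.random.default_rng(2)

# one-step-ahead samples from the logistic map x(k+1) = a * x(k) * (1 - x(k))
a = 3.95
x = np.empty(600)
x[0] = 0.5
for k in range(599):
    x[k + 1] = a * x[k] * (1 - x[k])
X, y = x[:-1, None], x[1:, None]          # input x(k), target x(k+1)

# "small" FFNN: 1 input, 6 tanh hidden units, 1 linear output
W1 = rng.normal(0, 0.5, (1, 6)); b1 = np.zeros(6)
W2 = rng.normal(0, 0.5, (6, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ W2 + b2

mu = 0.05
losses = []
for epoch in range(200):
    H, Y = forward(X)
    E = Y - y
    losses.append(float(np.mean(E**2)))
    # backpropagation through the two layers (full-batch gradient descent)
    gW2 = H.T @ E / len(X); gb2 = E.mean(0)
    dH = (E @ W2.T) * (1 - H**2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= mu * gW2; b2 -= mu * gb2
    W1 -= mu * gW1; b1 -= mu * gb1
```

The training loss shrinks steadily, i.e., even a very small network fits the one-step map of a single attractor well, consistent with conclusion 3).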
2.3 Onsets of Nonconventional Artificial Neural Units
Common architectures of neural networks (MLP, RBF, Hopfield networks, …) consist of conventional artificial neurons featuring linear synaptic operation, i.e., weighted summation as an aggregating function. These NN have a good ability to provide practical solutions to typical tasks such as pattern classification, system identification, adaptive control, signal prediction, and so on. From the point of view of exact mathematical solutions, represented by one or more governing implicit static or dynamic equations, conventional NN still represent a black box that does not allow users to obtain a useful explicit mathematical description of the problem. The complex structure of conventional NN prevents researchers from revealing natural and simple solutions based on further analysis of the governing equations describing complex systems. The first indications of the need to increase the computational power of individual artificial neurons were introduced by A. G. Ivakhnenko (late 1960s) in the polynomial neural networks (PNN), where Kolmogorov-Gabor polynomials were utilized in the aggregating function of neurons. Since then, the PNN have been further developed and have become a branch of neural network research.
Other signs of equipping artificial neural units with greater computational power, together with the utilization of the backpropagation gradient learning algorithm, were recently published (Wiley & Sons) by Gupta, Liang, and Homma in 2003 [15], where the first signs and rudimentary concepts of higher-order neural units are mentioned. In the spring of 2003 (when the author joined the research group of M. M. Gupta), the concept of the nonconventional artificial architectures had been briefly introduced in its basic principle, and static and simple dynamic modifications were consequently developed (Gupta, Song, Redlapalli, Bukovsky, [2]).
Figure 2: A simplified sketch of a biological neuron assuming nonlinearity in neural synapses (from Gupta, 2003), compared to a conventional artificial dynamic neural unit with linear aggregation (summation) of neural inputs and neural state feedback (Hopfield, Pineda, 1980s).
In the thesis, the primary focus is the research concept of artificial neural networks (ANN) that had been outlined by Hopfield and later also by Pineda. This concept has been further pursued and developed by the research group of Prof. M. M. Gupta (ISRL, University of Saskatchewan, Canada) [15], which is also close to the research group of Prof. J. Bila (FS CVUT, CR) [10] to [14]. The terminology in the thesis, regarding the development of new artificial neural architectures, is based on the terminology used by M. M. Gupta in his book on neural networks [15].
3 DETERMINISTIC CHAOS, HRV, AND AUTONOMOUS
NERVOUS SYSTEM
Even though the sinoatrial node itself is a very periodic pacemaker of the heart rhythm, complex heart rate variability is usually observed even in healthy people in a relaxed state and supine position. The complex heart rate variability results from the complexity of the dynamic system that the cardiovascular system represents.
Figure 3: Estimation of the correlation dimension and largest Lyapunov exponents of (deterministic) heart-beat tachograms simulated by the deterministic model [9] to [13], [19], [20].
[Figure 3 panels: simulated heart-beat tachograms HRV00112 and HRV00144.]
The complex heart rhythm is due to many physiological control mechanisms of the cardiovascular system, to inputs (perturbations) to the human cardiovascular system, such as mental activities and humoral influences, and, very importantly, also to the fast beat-by-beat control influences of the ANS. Evidence of the existence of a deterministic-chaos component in the heart rate is supported by simulation experiments and by the evaluation of heart rate variability [9] to [13], [19], [20]. The fast control feedback influences of the ANS on heart performance are only one of the reasons why the complex behavior of the heart rate develops. It is proposed in this thesis that the deterministic contributions of the ANS to changes in heart rate variability are crucial for the novel method of diagnostics based on tracking the dynamics of the deterministic heart rate variability component. The tracking of the dynamics of the deterministic component of heart rate variability is proposed to be performed by the new nonconventional neural architectures established in the thesis.
Table 1: Saturation of the correlation exponent for simulated heart-beat tachograms (4096 heartbeats, evaluated by Dataplore); rows are Td1 = 0.1 … 0.9, columns are Td2 = 0.2 … 0.8; “yes” marks delay pairs for which the correlation exponent saturated. CD and LLE were rarely evaluated with saturation of the correlation exponent due to the too complex variability and possible multi-attractor nature of the heart-beat tachograms simulated by the deterministic model [9] to [13].

Td1 = 0.1: yes, yes
Td1 = 0.2: yes, yes, yes
Td1 = 0.3: yes, yes
Td1 = 0.4: yes
Td1 = 0.5: (none)
Td1 = 0.6: (none)
Td1 = 0.7: yes
Td1 = 0.8: (none)
Td1 = 0.9: yes, yes

(The column positions of the “yes” entries within Td2 = 0.2 to 0.8 are not recoverable from the extracted layout.)
4 DEVELOPMENT OF NONCONVENTIONAL
ARTIFICIAL NEURAL ARCHITECTURES
Novel artificial neural architectures called higher-order nonlinear neural units (HONNU), linear time-delay dynamic neural units (TmD-DNU), and time-delay dynamic higher-order nonlinear neural units (TmD-DHONNU) are introduced, as they represent new classes of neural architectures suitable for system approximation with both a minimum number of neural parameters and a simple internal architecture. The computational power of HONNU is increased, compared to conventional neural units, by implementing a nonlinearity into the neural aggregating function fHONNU. This function can be understood as the composition of the complete synaptic neural operation and a part of the somatic neural operation. An example of the nonlinear aggregation of the quadratic neural unit is in Eq.(1).
ν = fHONNU(u, W) = ν_synaptic + ν_somatic = Σ_{i=0}^{n} Σ_{j=i}^{n} w_ij · u_i · u_j ,  u_0 = 1,  (1)

i.e., the nonlinear synaptic neural operation plus the nonlinear somatic neural operation (a simplified effect of further interactions in soma that affect the signal transmitted into the axon).
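A minimal sketch of the quadratic aggregation of Eq.(1), with the bias u_0 = 1 included in the input vector and the weights stored in an upper-triangular matrix; this storage layout is an implementation choice for illustration, not prescribed by the thesis.

```python
import numpy as np

def qnu_aggregation(u, W):
    """Quadratic aggregation nu = sum over i <= j of w_ij * u_i * u_j.

    u : input vector including the constant neural bias u[0] = 1
    W : upper-triangular weight matrix (entries below the diagonal unused)
    """
    nu = 0.0
    n = len(u)
    for i in range(n):
        for j in range(i, n):
            nu += W[i, j] * u[i] * u[j]
    return nu
```

Because u_0 = 1, the terms w_00, w_0j, and w_ij cover the constant, linear, and quadratic parts of the mapping with a single weight matrix, which is what keeps the parameter count of the unit low.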
Figure 4: The structure of a biological neuron compared to the time-delay dynamic quadratic neural unit, type 2, where wij are the neural weights of the aggregating nonlinearity fHONNU, and Tf and Ti are adaptable time delays considered additional neural parameters.
In order to increase the computational capability of conventional neural units while maintaining their linear aggregation function, the following types of TmD-DNU were introduced [6][7][8][23]:
1. TmD1-DNU with adaptable time delays on the neural inputs, and
2. TmD2-DNU with adaptable time delays both on the neural inputs and in the state feedback of the unit.
[Figure 4 diagram: a biological neuron (dendrites, synapses of neural inputs, soma, nucleus, axon, neural output) alongside the TmD2-DQNU block diagram: the inputs u_0, u_1, …, u_n (u_0 a constant neural bias) pass through adaptable delays T_0 … T_n into the quadratic aggregation ν(t) = Σ_{i=0}^{n+1} Σ_{j=i}^{n+1} w_ij · x_a,i · x_a,j over the augmented vector x_a, followed by integration (1/s) to the neural state ξ(t), the delayed state feedback ξ(t − Tf), and the somatic output y(t) = φ(ξ(t)); Tf ≥ 0, Ti ≥ 0 for i = 0 … n.]
Similarly to conventional linear neural units, TmD-DNU have a linear synaptic operation (weighted summation) on their neural inputs and either a nonlinear or a linear function on the neural output. Contrary to conventional dynamic neural units, the neural structure of TmD-DNU contains continuous time delays as adaptable neural parameters. By combining both features that increase the computational capability of neural units, i.e., the nonlinearity of the aggregating function and adaptable time delays, TmD-DHONNU were developed. TmD-DHONNU combine high approximating ability with a minimum number of neural parameters.
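As an illustration of time delays acting as parameters of a dynamic unit, a hypothetical first-order time-delay unit (a deliberate simplification; the exact dynamic structure of TmD2-DNU is given in the thesis and in Figure 4) can be integrated by the Euler method:

```python
import numpy as np

def simulate_tmd_dnu(w, T, Tf, u, dt=0.01, n_steps=2000):
    """Euler simulation of a hypothetical first-order time-delay dynamic unit:

        d xi/dt = -xi(t - Tf) + sum_i w[i] * u[i](t - T[i])

    w : input weights; T : input delays; Tf : state-feedback delay
    u : list of callables u[i](t); values before t = 0 are taken as 0.
    """
    xi = np.zeros(n_steps + 1)
    kf = int(round(Tf / dt))                   # feedback delay in steps
    for k in range(n_steps):
        t = k * dt
        xi_del = xi[k - kf] if k >= kf else 0.0
        drive = sum(wi * ui(t - Ti) if t >= Ti else 0.0
                    for wi, ui, Ti in zip(w, u, T))
        xi[k + 1] = xi[k] + dt * (-xi_del + drive)
    return xi
```

Making Tf and the Ti adaptable alongside the weights is what distinguishes TmD-DNU from conventional dynamic units: the same low-order structure can then match systems whose own dynamics contain transport delays.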
4.1 The Learning Algorithm
The supervised learning algorithm was developed for each type of the neural architectures proposed in the thesis. The neural parameters are adapted by the backpropagation (BP) gradient learning algorithm. For discrete HONNU, a technique for stable adaptation of neural weights was proposed as a combination of the use of static HONNU and their corresponding dynamic versions. For continuous neural units, i.e., HONNU, TmD-DNU, and TmD-DHONNU, stability-improving neural architectures were designed [6][7][8].
The principle of the learning rule for the adaptation of the neural parameters of linear TmD-DNU, i.e., the neural weights and time delays, is in Eq.(2):

w_i(k+1) = w_i(k) + Δw_i(k),
Δw_i(k) = −μ ∂(½ ε(t)²)/∂w_i = −μ ε(t) ∂ε(t)/∂w_i = μ ε(t) φ′(ξ(t)) ∂ξ(t)/∂w_i ,  (2)

where ε(t) is the output error, μ the learning rate, ξ(t) the neural state, φ the somatic output function, and the state sensitivity ∂ξ(t)/∂w_i follows from the linear dynamics of the unit (in the Laplace domain, from G(s) and the delayed inputs U_i(s)).

The principle of the learning rule for the neural parameters of continuous HONNU and TmD-DHONNU, i.e., with the nonlinear aggregating function fHONNU, is in Eq.(3):

Δw_ij = −μ ∂(½ ε(t)²)/∂w_ij = μ ε(t) φ′(ξ(t)) ∫₀ᵗ ∂fHONNU(x_a, W)/∂w_ij dτ ,  (3)

where x_a denotes the augmented vector of (delayed) neural inputs and state variables entering fHONNU.
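For a static QNU, where ∂ν/∂w_ij = u_i·u_j and the somatic function is taken as the identity, the gradient rules of Eqs.(2) and (3) reduce to a simple update. This is an illustrative sketch; the learning rate, target value, and input pattern below are hypothetical.

```python
import numpy as np

def qnu_forward(u, W):
    # nu = u^T W u with an upper-triangular W; identity somatic output phi(nu) = nu
    return float(u @ W @ u)

def bp_step(u, y_ref, W, mu=0.01):
    """One gradient-descent step for a static QNU with J = 0.5 * eps**2.

    Since d nu / d w_ij = u_i * u_j, the update is Delta w_ij = mu * eps * u_i * u_j.
    """
    eps = y_ref - qnu_forward(u, W)
    W_new = W + mu * eps * np.triu(np.outer(u, u))   # keep W upper-triangular
    return W_new, eps

# adapt the unit to reproduce a target value on one input pattern
u = np.array([1.0, 0.5, -0.3])      # u[0] = 1 is the constant neural bias
W = np.zeros((3, 3))
for _ in range(500):
    W, eps = bp_step(u, 1.7, W, mu=0.1)
```

The error decays geometrically because the update is linear in the weights; the dynamic units need the state-sensitivity terms of Eqs.(2) and (3) on top of this.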
Figure 5: The supervised learning for all TmD-DNU, dynamic HONNU, and TmD-DHONNU. The neural input u, internal neural state variables, and neural outputs from other units enter the calculation of the neural-parameter increments Δwi, ΔTi, ΔTf, Δτ, …
For the purpose of implementing discrete dynamic HONNU (Figure 6), the BP
gradient learning algorithm was extended to its dynamic form, Eq. (4).
Figure 6: General structure of the discrete dynamic quadratic neural unit.
(Figure 5 diagram: the output yr(t) of the UNKNOWN SYSTEM driven by the input u(t)
is compared with the output y(t) of the TmD-DNU; the error ε(t) drives the weight
adaptation with the use of the performance index J = ½ε², i.e., ∆wi = −µ ∂J/∂wi,
yielding the increments ∆wi, ∆Ti, ∆Tf, ∆τ.)
(Figure 6 diagram: the neural inputs u0 (a constant neural bias), u1, …, un and the
step-delayed state ξ(k−1) pass through the nonlinear synaptic preprocessor and the
neural dynamics, ν(k) = ξ(k) = Σ_{i=0..n+1} Σ_{j=i..n+1} wij xi xj, and the somatic
operation φ(ξ) yields the neural output y.)
\[
\Delta w_{ij}(k+1)
= \mu\,\varepsilon\,\frac{\partial \phi}{\partial \xi}\,
D^{m}\!\left\{ \frac{\partial f_{HONNU}(\mathbf{x}_a,\mathbf{W})}{\partial w_{ij}} \right\}
= \mu\,\varepsilon\,\frac{\partial \phi}{\partial \xi}\,
D^{m}\!\left\{ x_i\,x_j \right\} \quad \text{(for QNU)}
\tag{4}
\]
where D is the delay operator.
The static and dynamic BP algorithms were combined to prevent instability of the
learning algorithm for the general class of nonlinear (unbounded) functions used as
the synaptic and somatic aggregation operations of HONNU. This was the simplest
and most practical preventive measure for running a stable adaptation of dynamic
HONNU.
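The static part of this combined adaptation can be sketched as follows: a minimal static QNU with identity output function φ, trained by the static BP gradient rule ∆wij = µ·ε·xi·xj (the static counterpart of Eq. (4), with the delay operator D absent). The target function and all names are illustrative assumptions, not the thesis's experiments:

```python
import random

# Gradient (BP) adaptation of a static QNU with identity output function phi,
# so y = nu and the static weight-update rule is dw_ij = mu * eps * x_i * x_j.
def train_static_qnu(samples, n, mu=0.05, epochs=200):
    idx = [(i, j) for i in range(n + 1) for j in range(i, n + 1)]
    w = {ij: 0.0 for ij in idx}
    for _ in range(epochs):
        for u, target in samples:
            x = [1.0] + list(u)                      # x[0] = 1 is the bias input u0
            y = sum(w[(i, j)] * x[i] * x[j] for (i, j) in idx)
            eps = target - y                         # neural error
            for (i, j) in idx:
                w[(i, j)] += mu * eps * x[i] * x[j]  # static BP gradient rule
    return w

# Hypothetical target y = 2*u1*u2 is exactly representable by a QNU,
# so the weights converge: w_12 -> 2 and all other weights -> 0.
random.seed(0)
inputs = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(50)]
samples = [(u, 2 * u[0] * u[1]) for u in inputs]
w = train_static_qnu(samples, n=2)
```

Because the target lies in the span of the quadratic basis, the weight increments die out as the error vanishes, which is the stable behavior the combined static/dynamic scheme aims to preserve for dynamic HONNU as well.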
4.2 Classification of Nonconventional Neural Units
Besides the fact that the developed neural units can be implemented as either
continuous or discrete units, a novel classification of artificial neural units into new
classes is established in the thesis. The novel artificial neural units HONNU,
TmD-DNU, and TmD-DHONNU can be naturally classified according to three
distinguishing attributes:
1. the nonlinearity of the neural aggregating function ν=fHONNU ,
2. the dynamic order of a unit, i.e., the number of time integrations of ν or the
number of time integrations4 of the neural state variable ξ, and
3. the type of implementation of adaptable time delays within a neural unit.
These three attributes are generally distinct, and their combinations create distinct
subclasses of artificial neural units (Figure 7 to Figure 9). This novel classification of
the new neural units, including the conventional ones, is sketched in Figure 9, where
each axis represents one of the three design attributes.
4 For discrete dynamic neural units, the dynamic order corresponds to the number of step
delays in the path of neural aggregated variable ν.
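The three distinguishing attributes can be expressed as a simple record; a minimal sketch, where the integer encoding and the label mapping follow Figures 7 to 9 but are our own:

```python
from dataclasses import dataclass

# The three classification attributes of Section 4.2 as a simple record.
@dataclass(frozen=True)
class NeuralUnitClass:
    nonlinearity: int    # polynomial order of nu: 1 linear, 2 quadratic, 3 cubic
    dynamic_order: int   # number of time integrations of nu (0 = static unit)
    delay_type: int      # adaptable time-delay implementation: 0 none, 1, 2, 3

def label(u):
    agg = {1: "LNU", 2: "QNU", 3: "CNU"}[u.nonlinearity]
    if u.delay_type:
        return "TmD{}-{}".format(u.delay_type, agg)
    return ("Static " if u.dynamic_order == 0 else "Dynamic ") + agg

static_qnu = NeuralUnitClass(nonlinearity=2, dynamic_order=0, delay_type=0)
tmd1_cnu = NeuralUnitClass(nonlinearity=3, dynamic_order=1, delay_type=1)
```

Because the attributes are independent, every combination names a distinct subclass, which is exactly what the three-axis design space of Figure 9 depicts.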
Figure 7: Classification of nonconventional artificial neural units according to the
type of aggregating nonlinearity fHONNU=ν and the number of its time integrations
(and corresponding state feedbacks), i.e., the dynamic order.
Figure 8: Classification of time-delay dynamic neural units according to the type of
aggregating nonlinearity ν and the type of time delay implementation.
(Figure 7 diagram, "Novel Classification of Artificial Neural Units": the horizontal
axis is the nonlinearity of the aggregating function ν — '1' linear ν = Σi wi xi,
'2' quadratic ν = Σi Σj wij xi xj, '3' cubic ν = Σi Σj Σk wijk xi xj xk; the vertical
axis is the dynamic order, i.e., the number of integrations of the aggregated
variable ν — 0, 1, 2+. The cells contain the conventional static and dynamic linear
neural units (linear aggregating function), the static and dynamic quadratic and
cubic neural units (QNU, CNU), and the dynamic-order-extended linear, quadratic,
and cubic neural units.)
(Figure 8 diagram, "Classification of Time-Delay Dynamic Neural Units (TmD-DNU)":
the horizontal axis is again the nonlinearity of the aggregating function ν
('1', '2', '3'); the vertical axis is the type of time-delay implementation
('1', '2', '2+'). The cells contain the linear Type-1 and Type-2 time-delay
dynamic neural units (linear aggregating function), TmD1-QNU and TmD1-CNU,
TmD2-QNU and TmD2-CNU, and the extended time-delay units TmD3-QNU and TmD3-CNU.)
(Figure 9 diagram, "Classification of Basic Types of Nonconventional Neural Units
in the Design Space": the three axes are the nonlinearity of the aggregating
function ν ('1', '2', '3'), the number of integrations of the neural aggregated
variable ν ('1', '2', '2+'), and the type of time-delay implementation; plotted
units include the static and dynamic LNU, QNU, and CNU, the dynamic-order-extended
(DOE) LNU, QNU, and CNU, and the TmD1 and TmD2 variants; the conventional neural
units are those with a linear aggregating function ν.)
Figure 9: Classification of basic artificial neural units according to the
aggregating nonlinearity ν, its time integrations (i.e., the dynamic order), and
the type of adaptable time-delay implementation; not all possible types are shown
for simplicity of the picture.
5 DISCRETE HONNU AND ADAPTIVE APPROACH TO MONITORING OF VARIABILITY OF COMPLEX TIME SERIES
There are two distinct approaches to retrieving the underlying dynamics of a
system. In the first case, the system inputs and outputs are available, and the
number of state variables can be found or estimated, e.g., by analysis of the
system. In the second case, only output signals are available; then the embedding
dimension of the state space has to be reconstructed by appropriate methods, e.g.,
by the false-nearest-neighbors method (Kennel et al., 1992). A special neural unit,
HRV-HONNU, combining both approaches was designed for adaptive evaluation of
heart rate variability.
Continuous-time dynamic systems with possible occurrence of complex (chaotic)
behavior must be approximated by models of at least third order, since lower-order
continuous models can develop only periodic or quasiperiodic (forced-oscillator)
behavior, not chaos. However, even one-dimensional discrete (recurrent) nonlinear
systems can become chaotic, e.g., the well-known logistic equation (Figure 1, Figure 14).
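As a minimal illustration of the latter point, the logistic equation already exhibits sensitive dependence on initial conditions for a = 3.95, the value used in Figure 14:

```python
# The logistic equation x(k+1) = a*x(k)*(1 - x(k)) is chaotic for a = 3.95:
# two trajectories started 1e-6 apart diverge to order-one separation
# within a few dozen steps.
def logistic_series(a, x0, n):
    xs = [x0]
    for _ in range(n - 1):
        xs.append(a * xs[-1] * (1.0 - xs[-1]))
    return xs

s1 = logistic_series(3.95, 0.200000, 60)
s2 = logistic_series(3.95, 0.200001, 60)
# Separation over the late steps grows far beyond the initial 1e-6 offset.
divergence = max(abs(p - q) for p, q in zip(s1[30:], s2[30:]))
```

Despite this one-dimensional simplicity, the trajectory stays bounded in (0, 1) while never settling into a periodic orbit, which is why the logistic map serves as the benchmark signal in Figure 14.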
Because the mathematical structures of HONNU perform polynomial approximation,
and because the units can be enhanced with additional non-polynomial
nonlinearities, standalone HONNU can be used for the approximation of a wide
range of complex systems.
Dynamic HONNU may suffer from instability during adaptation, also because of the
generally nonlinear and unbounded function fHONNU implemented in the synaptic
operation of HONNU. A technique maintaining stable adaptation of discrete dynamic
HONNU was developed, and stability-improved structures of continuous HONNU,
TmD-DNU, and TmD-DHONNU were designed.
A plot of markers, the monitor plot, was designed for the evaluation of complex
dynamics using the developed neural architectures (Figure 12 to Figure 16). This
new detection method is based on monitoring unusual increments of the neural
parameters of a neural unit during its adaptation. Vertical patterns indicate
sudden changes in the dynamics of a signal, repeating horizontal patterns reveal
periodicity in a signal and can clearly reveal multi-attractor behavior, and blank
spots indicate intervals of similar variability (a single attractor). The
sensitivity of the detection is adjustable (scalable detection sensitivity) by the
detection sensitivity parameter p: the higher the value of p, the more significant
the detected weight increments, but the fewer markers drawn. The established
neural units together with the monitor plot can be used for sensitive short-term
(sample-by-sample) as well as long-term monitoring and evaluation of complex
systems.
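The marker-drawing principle can be sketched as follows; the thesis does not state the exact thresholding rule, so the choice "mark step k when |∆w(k)| exceeds p times the running mean of |∆w|" is our assumption:

```python
# Monitor-plot marker principle (a sketch; the thresholding rule is assumed):
# draw a marker at adaptation step k when a parameter increment |dw(k)|
# exceeds p times the running mean of |dw| seen so far. A larger detection
# sensitivity parameter p marks only the most significant increments,
# i.e., fewer markers are drawn.
def marker_steps(increments, p):
    markers, running_sum = [], 0.0
    for k, dw in enumerate(increments):
        running_sum += abs(dw)
        mean = running_sum / (k + 1)
        if k > 0 and abs(dw) > p * mean:
            markers.append(k)
    return markers

# A sudden change in the signal dynamics shows up as one large increment.
increments = [0.01] * 50 + [0.5] + [0.01] * 50
markers = marker_steps(increments, p=5)
```

Plotting such markers per neural parameter over the adaptation steps k yields exactly the vertical-pattern/blank-spot structure described above.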
All the proposed neural units can test an unknown system for the deterministic
nature of its behavior by observing the convergence of the neural parameters.
Outputs from a deterministic system should result in converging neural weights of
an adapted neural unit; the simpler the system or the system output (i.e., the
simpler the approximating model), the better the convergence of the neural
parameters expected during the approximation by a neural unit.
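This principle can be illustrated with the simplest possible unit, a linear neural (LMS) predictor; the deterministic and noise signals below are our illustrative choices, not data from the thesis:

```python
import math, random

# Principle: outputs of a deterministic system yield converging neural
# weights, so late-stage weight increments shrink toward zero; for an
# unpredictable (noise) signal they stay large.
def mean_late_increment(signal, n=2, mu=0.2):
    w = [0.0] * (n + 1)
    late = []
    for k in range(n, len(signal)):
        x = [1.0] + signal[k - n:k]               # bias + n past samples
        y = sum(wi * xi for wi, xi in zip(w, x))  # linear neural output
        eps = signal[k] - y                       # prediction error
        dws = [mu * eps * xi for xi in x]         # BP/LMS weight increments
        w = [wi + dwi for wi, dwi in zip(w, dws)]
        if k > len(signal) * 3 // 4:              # increments near the end
            late.append(max(abs(d) for d in dws))
    return sum(late) / len(late)

random.seed(1)
deterministic = [math.sin(k) for k in range(4000)]    # exactly AR(2)-predictable
noise = [random.uniform(-1, 1) for _ in range(4000)]  # unpredictable
det_inc = mean_late_increment(deterministic)   # near zero: weights converged
noise_inc = mean_late_increment(noise)         # stays large: no convergence
```

The same comparison carried out with HONNU instead of a linear unit is what lets the proposed architectures grade how close to deterministic a measured signal is.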
Figure 10: Power spectral density of a heart beat tachogram (copied and modified
from: John D. et al., summary by Ichiro Kawachi et al.,
http://www.macses.ucsf.edu/Research/Allostatic/notebook/heart.rate.html, retrieved
01/2007).
It has been proposed that the dynamics of the cardiovascular system has a
significant deterministic-chaos component due to fast beat-by-beat control
influences of the autonomic nervous system (ANS) [19][20][9]. It has been shown
that distinct physiological time delays in the transfer of information from the
baroreceptors to the centers in the brain and back to the heart tissue result in
significantly distinct heart rate variability, from periodic to highly chaotic
heart beat [9] to [13].
(Figure 10 annotations: the component reflecting the tonus of the vagus and
cardiac sympathetic nerves (~u2), and the component reflecting respiration (~u1).)
(Diagram of the Dynamic HRV CNU: the delayed neural inputs u(k−1), …, u(k−j), …,
u(k−m) and the delayed states ξ(k−1), …, ξ(k−m) enter the nonlinear synaptic and
somatic aggregation of neural inputs and neural dynamics (the past and present
states affect future neural states); the time enters as a neural input through the
neural input-signal preprocessor, producing u1 = w1 cos(ω1 t + φ1), a self-tuning
influence of the breathing rhythm, and u2 = w2 cos(ω2 t + φ2), a self-tuning
influence of the vagal tonus; the cubic aggregation ν(k) = ξ(k) = Σi Σj Σk wijk
xi xj xk and the somatic operation φ(ξ) yield the neural output y(k); u0 is a
constant neural bias.)
Figure 11: Static heart rate variability quadratic neural unit HRV-QNU; again,
the blue color highlights the adaptable neural parameters; the input-signal
preprocessor introduces the breathing (u1) and the vagal-tonus (u2) frequency
components.
(Figure 12 diagram: monitor plot by dynamic HRV-HONNU for n = 1 of the evaluated
time series (HRV00123); the adapted neural parameters w00, w01, w02, w03, w11,
w12, w13, w22, w33, ω1, φ1, ω2, φ2 are listed along the vertical axis; a blue dot
is drawn in every adaptation step k if a neural-parameter increment ∆ωi(k),
∆φi(k), or ∆wij(k) is unusually large. Vertical patterns of varying neural
parameters indicate sudden changes in the variability of the time series, blank
spots indicate intervals of similar variability (a single attractor), and
repeating horizontal patterns reveal periodicity in a signal and multi-attractor
behavior of a system.)
Figure 12: The monitor plot for the detection of changes in variability,
transitions to another attractor, artefacts, and system perturbations (LLE = 0.9
and CD = 0.88 by the program Dataplore).
(Figure 13 panels MP_HRV00112_n1_p50_p7, MP_HRV00112_n2_p100_p0.1, and
MP_HRV00112_n2_p100_p0.5: repeating horizontal patterns detect intervals of
similar dynamics and thus the occurrence of multiple attractors; a vertical
pattern detects artefacts, inter-attractor transitions, and perturbations.)
Figure 13: Monitor plot of a simulated heart beat tachogram of relatively low
variability (LLE = 0.46, [1] to [13]).
(Figure 14 panels, "Monitor Plot of the Logistic Equation
x(k+1) = a x(k) (1 − x(k))": time series y(k) and monitor plots of the adapted
parameters for a = 3.95 and for a sudden change from a = 3.95 to a = 3.96.)
Figure 14: Detection of dynamic changes in the time series generated by the
logistic equation with a small sudden increase of the bifurcation parameter a (on
the right). The increased density of blue dots in the top right corner indicates a
change in the dynamics of the time series; the recurrence plot is shown in red and
does not indicate the change in the dynamics.
(Figure 15 panels MP_MIT_BIH_203_epoch120_goal10_n3_p100_p7 and
MP_MIT_BIH_203_epoch120_goal10_n2_p50_p3, with the annotation "Sample # 794:
atrial fibrillation, ventricular couplets"; the adapted parameters ω1, φ1, ω2, φ2,
w00, w01, w02, …, wij, … are listed along the vertical axes.)
Figure 15: Detail of the detection of a sudden variability change at the start of
atrial fibrillation and ventricular couplets (sample # 794, MIT-BIH, record 203,
male, age 43).
6 CONCLUSIONS
6.1 Summary of Achievements
The particular goals were achieved as follows:
1) A theory of novel neural architectures for the approximation (modeling) of com-
plex dynamic systems is established in this thesis.
1.1) A theory of nonconventional higher-order nonlinear neural units was further
developed in this thesis.
A theory of linear continuous time-delay dynamic neural units was established.
A theory of nonlinear continuous time-delay dynamic neural units was estab-
lished.
A novel methodology for the approximation of complex systems using the developed
neural units was established.
1.2) The learning algorithm for the adaptation of each of the classes of units has
been derived. All the units can be adapted by the dynamic modification of the
gradient-based backpropagation learning algorithm.
1.3) All the neural architectures, the learning, and the stability-maintaining adapta-
tion technique of system approximation are naturally simple, universal, highly
customizable, and can be learned and used by researchers of various fields and
levels.
As an aside, for the purpose of system identification, a simple hybrid network of
static and dynamic HONNU with extended nonlinearity and extended dynamic order
was designed and applied. Each hybrid of static and dynamic HONNU in this net-
work uses its dynamic structure to generate its own state variables while the other
state variables are measured and introduced from the real system (if available). In
this way, the hybrids of static and dynamic neural units pursue both the stable
nature of static HONNU and the approximation accuracy of continuous-time dynamic
HONNU.
The achievements regarding new evaluation of complex systems are as follows:
2) A novel methodology was established for the adaptive monitoring of sudden as
well as smooth changes of variability (the level of chaos) in signals generated by
complex dynamic systems.
2.1) A novel class of special neural units (HRV-HONNU), with an adaptable input-
signal preprocessor, was designed as an adaptive neural tool approximating
complex dynamics and responding to sudden and continuous changes in the
dynamics of a cardiovascular system.
2.2) A novel and universally applicable methodology for adaptive evaluation of
signal variability was established based on the monitoring of the neural parame-
ters of the proposed novel neural units.
Due to its adaptive nature, the proposed methodology of adaptive evaluation of
variability of chaotic time series is capable of:
• detecting changes in the level of variability (chaos), inter-attractor
transitions, artefacts, internal or external perturbations of a system, and noise,
• locating intervals of similar dynamics, e.g., single-attractor behavior, in the
time series,
• reflecting the level of variability (complexity) of a signal in a particular
region of a single attractor,
• revealing the multiple-attractor behavior of a signal by detecting repeating
patterns of changes in the system dynamics.
The developed “monitor plot” provides useful visualization of the results.
The achievements regarding the novel evaluation and monitoring of the variability
of complex systems are demonstrated on simulated chaotic time series, on simulated
heart beat tachograms, and on real physiological R-R diagrams.
6.2 The Limitations and Challenges for Further Research
The method itself does not yet indicate the type of cardiac arrhythmia, nor has it
been investigated whether it could indicate an incoming occurrence. However, it
has been shown that when arrhythmias occurred, they were accompanied (preceded or
followed) by changes in variability that were clearly detected by HRV-HONNU and
visualized in the monitor plot.
The use of the novel neural units does not outperform the proper derivation of a
mathematical model of a complex system if such an analysis can be done. A combi-
nation of customization of the internal neural architectures together with the proper
mathematical analysis of a system can:
• maximize accuracy of the neural units,
• minimize the time of adaptation, and
• find initial neural parameters from which the unit converges to a more
appropriate minimum of the error function.
The primary challenges for further research regarding the proposed neural units
in engineering problems, such as control and system approximation, are as follows:
• Investigation of the applicability of stand-alone HONNU, TmD-DNU, and TmD-
DHONNU to system approximation and to control of nonlinear systems where
piece-wise-linearization control approaches are commonly used.
• Further research of neural units with an adaptable signal input preprocessor for
identification of unknown system input signals for the purpose of advanced
monitoring of internal as well as external system perturbations.
The primary challenges of further research regarding the proposed theory and meth-
odologies in biomedical engineering problems are as follows:
• Beat-by-beat HRV fetal monitoring, i.e., a new variability monitoring that re-
flects the level of oxygen delivered to the brain of a fetus.
• Development of Type-2 HRV-HONNU for adaptive evaluation of HRV where
the frequency component of the vagal nerve tonus would be due to the limit cy-
cle of the dynamic neural unit.
• Investigation of capabilities of HONNU to detect and distinguish between par-
ticular types of cardiac arrhythmias.
• New investigation of multi-attractor dynamics in complex systems. Applied by
HRV-HONNU to patients before, during, and after cardiac surgery, the proposed
methodology can introduce new knowledge of heart beat dynamics.
Author’s Publications and Selected References
[1] Bukovsky, I.: Development of Higher-Order Nonlinear Neural Units as a Tool
for Approximation, Identification and Control of Complex Nonlinear Dynamic
Systems and Study of Their Application Prospects for Nonlinear Dynamics of
Cardiovascular System, Final Report from NATO Science Fellowships re-
search, ISRL, University of Saskatchewan, Canada, FME Czech Technical
University in Prague (IGS #CTU0304112), 2003.
[2] Bukovsky I., S. Redlapalli, M. M. Gupta: “Quadratic and Cubic Neural Units
for Identification and Fast State Feedback Control of Unknown Non-Linear
Dynamic Systems”, Fourth International Symposium on Uncertainty Model-
ing and Analysis ISUMA 2003, IEEE Computer Society, ISBN 0-7695-1997-0,
Maryland USA, 2003, pp.330-334.
[3] Bukovsky I., Bila J. : “Development of Higher Order Nonlinear Neural Units
for Evaluation of Complex Static and Dynamic Systems”, Proceedings of
Workshop 2004, Part A, March 2004, vol.8, Special Issue, Czech Technical
University, Czech Republic, Prague, pp. 372-373.
[4] Bila, J., Bukovsky, I.: “Nonlinear Dynamic Neural Units for Parallel
Manipulator TRIPOD” (in Czech), In: Seminar Proceedings VZ MSM 212200008, vol.
1, [CD-ROM], CTU FME, ISBN 80-01-03105-5, Prague, 2004, pp. 66-68.
[5] Bukovsky, I.: “Extended Dynamic Neural Architectures HONNU with Mini-
mum Number of Neural Parameters for Evaluation of Nonlinear Dynamic Sys-
tems” (in Czech), In: New Methods and Approaches in the Fields of Control
Technology, Automatic Control, and Informatics, Czech Technical University,
ISBN 80-01-03240-X, Prague, 2005, pp. 93-97.
[6] Bukovsky, I., Bila, J., Gupta, M. M.: “Linear Dynamic Neural Units with
Time Delay for Identification and Control” (in Czech), In: Automatizace, vol.
48, No. 10, ISSN 0005-125X, Prague, Czech Republic, 2005, pp. 628-635.
[7] Bukovsky, I., Bila, J., Gupta, M. M.: “Stable Neural Architecture of Dynamic
Neural Units with Adaptive Time Delays”, 7th International FLINS Conference
on Applied Artificial Intelligence, ISBN 981-256-690-2, 2006, pp. 215-222.
[8] Bukovsky, I., Simeunovic, G.: “Dynamic-Order-Extended Time-Delay Dy-
namic Neural Units”, 8th Seminar on Neural Network Applications in Electri-
cal Engineering, NEUREL-2006, IEEE (SCG) CAS-SP, ISBN 1-4244-0432-0,
Belgrade, 2006.
[9] Bukovsky, I.: Analysis of Cardiovascular System Model and the Interpretation
of Chaotic Phenomena in Signals ECG and HRV [Diploma Thesis], Faculty of
Mechanical Engineering, CTU in Prague, 2002.
[10] Bila J., Zitek, P., Kuchar, P. and Bukovsky, I.: “Heart Rate Variability: Model-
ling and Discussion”, Proceedings of International IAESTED Conference on
Neural Networks, ISBN 0-88986-286-9, Pittsburgh, USA, 2000, pp. 54-59.
[11] Bila,J., Bukovsky, I.: “Modelling and Interpretation of Chaotic Phenomena in
Heart Rate”. In: Proceedings of 8th International Conference on Soft Comput-
ing, MENDEL 2002, ISBN 80-214-2135-5, Brno, Czech Republic, 2002, pp.
292-297.
[12] Bila, J., Bukovsky, I.: “Interpretation of Chaotic Phenomena in Heart Rate”,
In: Proceedings of Workshop 2002, Part B, vol.6, Special Issue, Czech Techni-
cal University, ISBN 80-01-02511-X, Prague, Czech Republic, 2002, pp. 908-
909.
[13] Bila, J., Bukovsky, I., Oliviera, T., Martins, J. I.: “Modeling of Influence
of Autonomic Neural System to Heart Rate Variability”, IASTED International
Conference on Artificial Intelligence and Soft Computing ~ASC 2003~, ISSN
1482-7913, ISBN 0-88986-367-9, Banff, Canada, 2003, pp. 345-350.
[14] Bila, J., Vitkaj, J., Musil, M., Bukovsky, I.: “Some Limits of Neural Networks
Use in Diagnostics”(in Czech), Automatizace, vol. 46, issue 11, ISSN 0005 -
125X, Prague, 2003, pp. 734-737.
[15] Gupta, M.M., Liang, J., Homma, N.: Static and Dynamic Neural Networks:
from Fundamentals to Advanced Theory, IEEE Press and Wiley-Interscience,
published by John Wiley & Sons, Inc., 2003.
[16] Vitkaj, J.: Analysis of Chaotic signals by Means of Neural Networks. [PhD.
Thesis] (in Czech), Faculty of Mechanical Engineering, Czech Technical Uni-
versity in Prague, Czech Republic, 2001.
[17] Mankova, R.: Prediction of Chaotic Signals Using Neural Networks with Fo-
cus on Analysis of Cardiosignals [Candidate Dissertation] (in Czech), Faculty
of Mechanical Engineering, Czech Technical University in Prague, Czech Re-
public, 1997.
[18] Marwan, N., Wessel, N., Meyerfeldt, U., Schirdewan, A., Kurths, J.: “Recur-
rence-Plot-Based Measures of Complexity and their Application to Heart-Rate-
Variability Data”, Physical Review, vol. 66 (2), E 66, 2002, pp. 026702.1-
026702.8.
[19] Zitek, P., Bila, J., Kuchar, P.: “Blood Circulation Model Establishing Heart
Rate Variability as Control Performance”, Computational Intelligence for
Modelling, Control & Automation, IOS Press, Vienna, Austria, 1999, pp.305-
310.
[20] Zitek, P., Vyhlidal, T.: “Low Order Time Delay Approximation of
Conventional Linear Model”. In: 4th MATHMOD Vienna Proc., Vienna, Aus-
tria, 2003, pp. 197-204.
[21] Bila, J., Brandejsky, T., Jelinek, I., Bukovsky, I.: “Software Support of
Conceptual Design Process”In: Workshop 2004, vol. A [CD-ROM], Czech
Technical University in Prague, ISBN 80-01-02945-X, Prague, Czech
Republic, 2004, pp. 378-379.
[22] Bila, J., Vitkaj, J., Musil, M., Bukovsky, I.: “Some Limits of Neural Networks
Use in Diagnostics” (in Czech), In: Automatizace, vol. 46, issue 11, ISSN
0005-125X, Prague, Czech Republic, 2003, pp. 734-737.
[23] Bukovsky, I., Bila, J.: “Basic Classification of Nonconventional Artificial
Neural Units” (in Czech), Proceedings of Seminar Nové Hrady, Czech Technical
University in Prague, FME, ISBN 978-80-01-03747-8, Czech Republic, 2007,
pp. 76-80.
[24] Bukovsky, I., Hou, Z.-G., Gupta, M. M., Bila, J.: “Foundation of Notation and
Classification of Nonconventional Static and Dynamic Neural Units”, accepted
paper for the special section on neural networks, ICCI 2007, The 6th IEEE
International Conference on Cognitive Informatics, California, USA, 2007.