Neural Information Systems
ANSER: Rainfall Estimating System
THONN: Financial Data Simulation System
FACEFLOW: Face Recognition System
Dr. Ming Zhang, Associate Professor
Department of Physics, Computer Science & Engineering
Christopher Newport University
1 University Place, Newport News, VA 23606, USA
Dr. Ming Zhang
11/1999 - 07/2000: Senior USA NRC Research Associate
ANSER: Artificial Neural network expert System for Estimation of Rainfall from satellite data
ANSER System (1991-2000)
- 1991-1992: US$66,000 supported by the USA National Research Council & NOAA
- 1995-1996: A$11,000 supported by the Australia Research Council & NOAA
- 1999-2000: US$62,000 supported by the USA National Research Council & NOAA
Why Develop ANSER ?
- More than $3.5 billion in property is damaged, and more than 225 people are killed, by heavy rain and flooding each year
- There is no rainfall estimating system in any GIS, and no real-time, working rainfall estimation system in the world
- Can ANN be used in the weather forecasting area? If yes, how should we use ANN techniques in this area?
Why Use Neural Network Techniques ?
- Two directions of new-generation computers: the Quantum Computer and the Artificial Neural Network
- Much quicker speed?
- Complicated pattern recognition?
- Unknown-rule knowledge base?
- Self-learning reasoning network?
- Superposition for multiple choice?
PT-HONN Simulator (1999 - 2000)
Polynomial and Trigonometric polynomial Higher Order Neural Network financial data simulator
- A$ 10,000 supported by the Australia Research Council
- US$ 46,000 supported by the USA National Research Council
PT-HONN Data Simulator
Simulating by PT-HONN Simulator
Structure of PT-HONN
Cloud Merger Operator
MI(i, j) is a black-and-white image which can be used to represent a satellite image.
Label Set: L = {0, 1, 2, …, M}
Each label corresponds to a different cloud merger.
Cloud Merger Recognising Operator CMR
CMR: MI(i, j) → L
Cloud Merger Operator Set
The cloud merger recognising operator CMR is the operator set:
CMR = {CMCI, CMR1, CMR2, CMS1, CMS2, CMS3, CMS4, CMM1, CMM2, CMM3, CMM4}
where CMCI: circle-input satellite data cloud merger recognising operator; …
Ternary Output of Cloud Merger Operator
L = 1, O(N_s,t) ≥ θ1 (cloud merger)
L = 2, θ2 ≤ O(N_s,t) < θ1 (further test needed)
L = 0, O(N_s,t) < θ2 (cloud not merger)
where the s-th level is the output layer of the NN. All other operators (CMR1, CMR2, CMS1, CMS2, CMS3, CMS4, CMM1, CMM2, CMM3, CMM4) have the same definitions as CMCI.
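The ternary labelling step can be sketched as a simple thresholding of the operator's output-layer activation. The function name and the threshold values below are illustrative assumptions, not values from the slides:

```python
# Sketch of the ternary cloud-merger labelling, assuming two thresholds
# theta_hi and theta_lo (illustrative values, not from the ANSER system).

def cloud_merger_label(o, theta_hi=0.8, theta_lo=0.2):
    """Map output-layer activation o of a CMR operator to a label:
    1 = cloud merger, 0 = cloud not merger, 2 = further test needed."""
    if o >= theta_hi:
        return 1   # confident: cloud merger
    if o <= theta_lo:
        return 0   # confident: not a merger
    return 2       # ambiguous band: further test needed
```

Activations in the middle band fall through to label 2, which is what triggers the "further test" path in the operator set.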
The network architecture of PT-HONN has combined both the characteristics of PHONN and THONN.
It is a multi-layer network that consists of an input layer with input-units, an output layer with output-units, and two hidden layers consisting of intermediate processing units.
Definition of PT-HONN
Z = Σ_(i,j=0)^n [ a_ij sin^i(x) cos^j(y) + b_ij x^i y^j + c_ij cos^i(x) sin^j(y) ]
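A minimal sketch of the PT-HONN output, assuming the combined polynomial-plus-trigonometric form Z = Σ_(i,j=0)^n (a_ij sin^i(x) cos^j(y) + b_ij x^i y^j + c_ij cos^i(x) sin^j(y)); the function name is hypothetical and the coefficient arrays stand in for trained weights:

```python
import math

def pt_honn_output(x, y, a, b, c):
    """Evaluate the assumed PT-HONN form for inputs x, y.
    a, b, c are (n+1) x (n+1) coefficient arrays (trained weights)."""
    n = len(a) - 1
    z = 0.0
    for i in range(n + 1):
        for j in range(n + 1):
            z += a[i][j] * math.sin(x) ** i * math.cos(y) ** j  # THONN part
            z += b[i][j] * x ** i * y ** j                      # PHONN part
            z += c[i][j] * math.cos(x) ** i * math.sin(y) ** j  # mixed part
    return z
```

With n = 0 every basis term is 1, so the output is just the sum of the three zeroth coefficients, which is a quick sanity check on the indexing.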
Knowledge of Rainfall: Half-Hour Rainfall (Inches)

Cloud Top      Cloud Growth Latitude Degree
Temperature     2/3     1/3      0
> -32 C        0.05    0.05    0.03
-36 C          0.20    0.13    0.06
-46 C          0.48    0.24    0.11
-55 C          0.79    0.43    0.22
-60 C          0.94    0.65    0.36
-70 C          1.55    0.85    0.49
< -80 C        1.93    0.95    0.55
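The rainfall knowledge can be held as a simple lookup keyed by cloud-top temperature. This helper is hypothetical (not code from ANSER), and the banding of temperatures between table rows is an assumption:

```python
# The slide's half-hour rainfall table (inches); columns are the
# cloud-growth latitude-degree factors 2/3, 1/3 and 0.
# The -32 key holds the "> -32 C" row.
RAINFALL_INCHES = {
    -32: (0.05, 0.05, 0.03),
    -36: (0.20, 0.13, 0.06),
    -46: (0.48, 0.24, 0.11),
    -55: (0.79, 0.43, 0.22),
    -60: (0.94, 0.65, 0.36),
    -70: (1.55, 0.85, 0.49),
    -80: (1.93, 0.95, 0.55),
}

def half_hour_rainfall(temp_c, growth_col):
    """growth_col: 0 -> factor 2/3, 1 -> 1/3, 2 -> 0."""
    # Pick the coldest tabulated temperature that temp_c reaches.
    for t in sorted(RAINFALL_INCHES):          # -80, -70, ..., -32
        if temp_c <= t:
            return RAINFALL_INCHES[t][growth_col]
    return RAINFALL_INCHES[-32][growth_col]    # warmer than -32 C
```

Colder cloud tops map to heavier half-hour rainfall, matching the monotone trend in the table.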
PT-HONN Results

Cloud Top     Cloud Growth      PHONN      PT-HONN
Temperature   Latitude Degree   |Error| %  |Error| %
……            ……                ……         ……
> -32 C       1/3               10.47      10.11
-36 C         1/3                3.50       4.25
-46 C         1/3                3.52       4.63
-55 C         1/3                0.22       2.04
-60 C         1/3                3.21       0.30
-70 C         1/3                9.01       5.08
< -80 C       1/3                3.89       1.21
……            ……                ……         ……
Average                          6.36%      5.68%
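The visible rows of the results table can be re-averaged as a quick check. The elided rows (the "……" entries) are not available, so these means are over the shown rows only and need not match the slide's overall averages of 6.36% and 5.68%:

```python
# |Error| % for the rows shown on the slide (1/3 latitude-degree column).
phonn_err  = [10.47, 3.50, 3.52, 0.22, 3.21, 9.01, 3.89]
pthonn_err = [10.11, 4.25, 4.63, 2.04, 0.30, 5.08, 1.21]

mean_phonn  = sum(phonn_err)  / len(phonn_err)
mean_pthonn = sum(pthonn_err) / len(pthonn_err)
```

Even on this subset, PT-HONN's mean absolute error is lower than PHONN's, consistent with the slide's conclusion.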
CONCLUSION
The results of the comparative experiments show that the THONN system is able to simulate higher-frequency and higher-order non-linear data, as well as discontinuous data.
The THONN model can be used not only for financial simulation but also for financial prediction.
Using the THONN System for Higher Frequency Non-linear Data Simulation & Prediction
FACEFLOW A Robot Vision System
For Moving Face Recognition
Dr. Ming Zhang
FACEFLOW (1992 - 2000)
A computer vision system for recognition of 3-dimensional moving faces using the GAT model (neural network Group-based Adaptive Tolerance Tree)
A$850,000 supported by SITA (Société Internationale de Télécommunications Aéronautiques)
A$40,500 supported by the Australia Research Council
A$78,000 supported by the Australia Department of Education
US$160,000 supported by the USA National Research Council
Neuron-Adaptive Neural Network Simulator
* The network architecture of NANN is a multilayer feed-forward network that consists of an input layer with input-units, an output layer with output-units, and one hidden layer consisting of intermediate processing units.
* There is no activation function in the input layer, and the output neurones are summing units (linear activation).
* Our activation function for the hidden layer processing units is a Neuron-Adaptive Activation Function (NAAF).
NANN
The activation function for the hidden layer processing units is a Neuron-Adaptive Activation Function (NAAF) defined as
NAAF(x) = a1 · sin(b1 · x) + a2 · e^(-b2 · x) + a3 / (1 + e^(-b3 · x))

where a1, b1, a2, b2, a3 and b3 are real variables which are adjusted (as well as the weights) during training.
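A sketch of the NAAF, assuming the reconstructed form NAAF(x) = a1·sin(b1·x) + a2·e^(-b2·x) + a3/(1 + e^(-b3·x)), with a1 through b3 as the free parameters the slides say are trained alongside the weights:

```python
import math

def naaf(x, a1, b1, a2, b2, a3, b3):
    """Neuron-Adaptive Activation Function: a sine term, a decaying
    exponential term, and a sigmoid term, each with its own trainable
    amplitude (a) and rate (b)."""
    return (a1 * math.sin(b1 * x)
            + a2 * math.exp(-b2 * x)
            + a3 / (1.0 + math.exp(-b3 * x)))
```

At x = 0 with all parameters set to 1 the three terms contribute 0, 1 and 0.5 respectively, a convenient spot check.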
Structure of NANN
Figure 1. An NANN with NAAF's for the hidden layer (Input layer → NAAF hidden units → Output layer)
NANN Group
Neuron-Adaptive Feedforward Neural network Group (NAFNG) is one kind of neural network group in which each element is a neuron-adaptive feedforward neural network (Fi). We have:
NAFNG = {F1, F2, F3, …, Fi, …, Fn}
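The group idea can be illustrated with a toy piecewise dispatcher: each member Fi of the group is responsible for one piece of the input domain, so together they can cover a piecewise-continuous target. The member functions and break points below are illustrative stand-ins, not trained NANNs:

```python
def make_group(members, breaks):
    """members: list of callables F1..Fn standing in for member networks.
    breaks: ascending cut points; member i handles [breaks[i], breaks[i+1])."""
    def group(x):
        for fi, lo, hi in zip(members, breaks, breaks[1:]):
            if lo <= x < hi:
                return fi(x)        # dispatch to the member owning this piece
        return members[-1](x)       # out-of-range inputs: last member
    return group

# Two members jointly representing a discontinuous step function,
# which a single continuous approximator would have to smooth over.
g = make_group([lambda x: 0.0, lambda x: 1.0], [0.0, 0.5, 1.0])
```

The point of the construction is that the discontinuity at 0.5 lives between members, so each Fi only ever has to approximate a continuous piece.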
Feature of NANN
Hornik (1991): If the activation function is continuous, bounded and nonconstant, then a standard FNN can approximate any continuous function.
Leshno (1993): A standard FNN can approximate any continuous function if the network's activation function is not a polynomial.
A neuron-adaptive feedforward neural network group with adaptive neurones can approximate any kind of piecewise continuous function.
Neural Network Group Models - complex systems
GAT Tree Model - real-time, real-world face recognition
Neuron-Adaptive Neural Network Models - best match to real-world data
Center of Motion Model - motion center
Second Order Vision Model - motion direction
NAAT Tree Model - a possibly more powerful model for face recognition
FACEFLOW: A Robot Vision System
Hornik, K. (1991)
“Whenever the activation function is continuous, bounded and nonconstant, then for an arbitrary compact subset X ⊆ R^n, standard multilayer feedforward networks can approximate any continuous function on X arbitrarily well with respect to uniform distance, provided that sufficiently many hidden units are available”
Leshno, M. (1993)
“A standard multilayer feedforward network with a locally bounded activation function can approximate any continuous function to any degree of accuracy if and only if the network’s activation function is not a polynomial”
Zhang, Ming (1995)
“Consider a neural network piecewise function group, in which each member is a standard multilayer feedforward neural network, and which has a locally bounded, piecewise continuous (rather than polynomial) activation function and threshold. Each such group can approximate any kind of piecewise continuous function, and to any degree of accuracy”
Conclusion - What Was Proven
Artificial Neural Network techniques can provide:
- Much quicker speed: 5-10 times quicker
- Complicated pattern recognition: cloud merger
- Unknown-rule knowledge base: rainfall
- Reasoning network: rainfall estimation
Conclusion - Next Steps
- Rebuild interface & retrain neural networks
- New neural network models: more complicated pattern recognition
- Self-expanding knowledge base: extract knowledge from real-time cases
- Self-learning reasoning network: automatic system to …
- Research 15 years in advance: Artificial Neural Network, one of the two directions of new-generation computer research