Transcript
Page 1:

Data Mining - CSE5230

Neural Networks 1

CSE5230/DMS/2002/5

Page 2:

Lecture Outline

Why study neural networks?
What are neural networks and how do they work?
History of artificial neural networks (NNs)
Applications and advantages
Choosing and preparing data
An illustrative example

Page 3:

Why study Neural Networks? - 1

There are two basic motivations for NN research: to model brain function, and to solve engineering (and business) problems

So far as modeling the brain goes, it is worth remembering:

“… metaphors for the brain are usually based on the most complex device currently available: in the seventeenth century the brain was compared to a hydraulic system, and in the early twentieth century to a telephone switchboard. Now, of course, we compare the brain to a digital computer.”

Page 4:

Why study Neural Networks? - 2

Historically, NN theories were first developed by neurophysiologists. For engineers (and others), the attractions of NN processing include:

inherent parallelism
speed (avoiding the von Neumann bottleneck)
distributed “holographic” storage of information
robustness
generalization
learning by example, rather than having to understand the underlying problem (a double-edged sword!)

Page 5:

Why study Neural Networks? - 3

It is important to be wary of the black-box characterization of NNs as “artificial brains”

Beware of the anthropomorphisms common in the field (let alone in popular coverage of NNs!)

“learning”, “memory”, “training”, “forgetting”

Remember that every NN is a mathematical model. There is usually a good statistical explanation of NN behaviour

Page 6:

What is a neuron? - 1

A (biological) neuron is a node that has many inputs and one output
Inputs come from other neurons or from sensory organs
The inputs are weighted; weights can be both positive and negative
Inputs are summed at the node to produce an activation value
If the activation is greater than some threshold, the neuron fires

Page 7:

What is a neuron? - 2

In order to simulate neurons on a computer, we need a mathematical model of this node

Node i has n inputs x_j

Each connection has an associated weight w_ij

The net input to node i is the sum of the products of the connection inputs and their weights:

$net_i = \sum_{j=1}^{n} w_{ij} x_j$

The output of node i is determined by applying a non-linear transfer function f_i to the net input:

$x_i = f_i(net_i)$
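As a concrete illustration, here is a minimal Python sketch of this node model (the names node_output, weights, inputs and transfer are illustrative, not from the lecture):

```python
def node_output(weights, inputs, transfer):
    """Output of a single artificial neuron (node i)."""
    # Net input: sum of products of inputs and their connection weights
    net = sum(w * x for w, x in zip(weights, inputs))
    # Apply the non-linear transfer function to the net input
    return transfer(net)

# e.g. with a simple "all-or-nothing" threshold transfer function:
fires = node_output([0.5, -0.3], [1.0, 1.0], lambda net: 1 if net > 0 else 0)
```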

Page 8:

What is a neuron? - 3

A common choice for the transfer function is the sigmoid:

$f(net_i) = \frac{1}{1 + e^{-net_i}}$

The sigmoid has similar non-linear properties to the transfer function of real neurons:

bounded below by 0
bounded above by 1
saturates when the input becomes large
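A minimal sketch of the sigmoid in Python (standard library only):

```python
import math

def sigmoid(net):
    """Sigmoid transfer function: bounded by (0, 1), saturating for large |net|."""
    return 1.0 / (1.0 + math.exp(-net))

print(sigmoid(0.0))    # 0.5
print(sigmoid(10.0))   # ~0.99995, near the upper bound of 1
print(sigmoid(-10.0))  # ~0.00005, near the lower bound of 0
```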

Page 9:

What is a neural network?

Now that we have a model for an artificial neuron, we can imagine connecting many of them together to form an Artificial Neural Network:

[Figure: a feedforward network with an input layer, a hidden layer, and an output layer]
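To make the layered structure concrete, here is a minimal Python sketch of a forward pass through such a network, reusing the sigmoid sketch above (the layer sizes and weight values are made up for illustration):

```python
def layer_forward(weight_matrix, inputs):
    """Outputs of one layer: each row of weights feeds one node."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)))
            for row in weight_matrix]

# A tiny 2-input, 2-hidden-node, 1-output network (illustrative weights)
hidden_weights = [[0.5, -0.6], [0.9, 0.2]]
output_weights = [[1.0, -1.0]]

inputs = [0.3, 0.7]
hidden = layer_forward(hidden_weights, inputs)   # hidden layer activations
output = layer_forward(output_weights, hidden)   # output layer activation
```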

Page 10:

History of NNs - 1

By the 1940s, neurophysiologists knew that the brain consisted of billions of intricately interconnected neurons

The neurons all seemed to be basically identical

The idea emerged that the complex behaviour and power of the brain arose from the connection scheme

This led to the birth of the connectionist approach to the explanation of:

memory, intelligence, pattern recognition, ...

Page 11:

History of NNs - 2

Warren S. McCulloch and Walter Pitts, “A logical calculus of the ideas immanent in nervous activity”, Bulletin of Mathematical Biophysics, 5:115-133, 1943.

Historically very significant as an attempt to understand what the nervous system might actually be doing

First to treat the brain as a computational organ

Showed that their nets of “all-or-nothing” nodes could be described by propositional logic

Page 12:

History of NNs - 3

Donald O. Hebb, The Organization of Behavior, John Wiley & Sons, New York, 1949.

Hebb proposed a learning rule for NNs:

“Let us assume then that the persistence of repetition of a reverberatory activity (or trace) tends to induce lasting cellular changes that add to its stability. The assumption can be precisely stated as follows: When an axon of cell A is near enough to excite cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place on one or both cells so that A's efficiency as one of the cells firing B is increased.”
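Hebb’s postulate is often formalized in the NN literature (the equation below is that standard formalization, not part of Hebb’s text) as a weight change proportional to the product of the activities of the two connected cells:

$\Delta w_{AB} = \eta \, x_A x_B$

where $w_{AB}$ is the strength of the connection from cell A to cell B, $\eta$ is a learning rate, and $x_A$ and $x_B$ are the activity levels of the two cells.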

Page 13:

History of NNs - 4

Frank Rosenblatt, “The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain”, Psychological Review, 65:386-408, 1958.

used random, weighted connections between layers of nodes

connection weights were updated according to a Hebbian-like rule

was able to discriminate between some classes of patterns

Page 14:

History of NNs - 5

Marvin Minsky and Seymour Papert, Perceptrons: An Introduction to Computational Geometry, MIT Press, Cambridge, MA, 1969.

AI community felt that NN researchers were overselling the capabilities of their models

highlighted the theoretical limitations of the Perceptron of the time (which had already been improved since the original version). The classic example is its inability to solve the XOR problem

Effectively stopped NN research for many years

Page 15:

History of NNs - 6

Some research continued:

Associative memories
» James A. Anderson, “A Simple Neural Network Generating an Interactive Memory”, Mathematical Biosciences, 14:197-220, 1972.
» Teuvo Kohonen, “Correlation Matrix Memories”, IEEE Transactions on Computers, C-21:353-359, 1972.

Cognitron - the first multilayer NN
» K. Fukushima, “Cognitron: A Self-organizing Multilayered Neural Network”, Biological Cybernetics, 20:121-136, 1975.

Hopfield Networks
» J. J. Hopfield, “Neural Networks and Physical Systems with Emergent Collective Computational Abilities”, Proceedings of the National Academy of Sciences, 79:2554-2558, 1982.

Page 16:

History of NNs - 7: The Multilayered Back-Propagation Association Networks

The limitations pointed out by Minsky and Papert were due to the fact that the Perceptron had only two layers (and was thus restricted to classifying linearly separable patterns)

Extending successful learning techniques to multilayer networks was the challenge

In 1986, several groups came up with essentially the same algorithm, which became known as back-propagation

This led to the revival of NN research

Page 17:

History of NNs - 8: The Multilayered Back-Propagation Association Networks

David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams, “Learning Representations by Back-Propagating Errors”, Nature, 323:533-536, 1986.

The idea of back-propagation is to calculate the error at the output layer, and then to trace the contributions to this error back through the network to the input layer, adjusting weights as one goes so as to reduce this error

Page 18:

History of NNs - 9: The Multilayered Back-Propagation Association Networks

Mathematically, this is a gradient descent training procedure

In fact, back-propagation is the neural analogue of a gradient descent algorithm discovered earlier

Paul Werbos, “Beyond regression: New Tools for Prediction and Analysis in the Behavioral Sciences”, Doctoral thesis, Harvard University, 1974.

The back-propagation algorithm uses the Chain Rule from calculus to extend more traditional regression to multilayer networks

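As a hedged illustration of this procedure (a minimal sketch, not the lecture’s derivation), here is one back-propagation update for the tiny 2-2-1 network sketched earlier, reusing layer_forward and exploiting the sigmoid derivative f'(net) = f(net)(1 - f(net)):

```python
def backprop_step(hidden_weights, output_weights, inputs, target, rate=0.5):
    """One gradient-descent update of all weights for a 2-2-1 sigmoid network."""
    # Forward pass
    hidden = layer_forward(hidden_weights, inputs)
    output = layer_forward(output_weights, hidden)[0]

    # Error term at the output layer, for squared error E = 0.5*(output - target)^2
    delta_out = (output - target) * output * (1 - output)

    # Chain Rule: trace each hidden node's contribution to the output error
    delta_hidden = [delta_out * output_weights[0][j] * hidden[j] * (1 - hidden[j])
                    for j in range(len(hidden))]

    # Gradient descent: adjust weights so as to reduce the error
    for j in range(len(hidden)):
        output_weights[0][j] -= rate * delta_out * hidden[j]
        for k in range(len(inputs)):
            hidden_weights[j][k] -= rate * delta_hidden[j] * inputs[k]
    return output
```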

Page 19:

History of NNs - 10: The Multilayered Back-Propagation Association Networks

Probably the most common type of NN in use today is a multilayer feedforward network trained using back-propagation (BP)

Often called a Multilayer Perceptron (MLP)

Despite the title of Werbos’ thesis, back-prop is now seen as a form of regression: a training set of input-output pairs is provided, and gradient descent is used to determine the parameters of a model (the NN) to fit this training data

Page 20:

History of NNs - 11

Other NN models have been developed during the last twenty years:

Adaptive Resonance Theory (ART)

» pattern recognition networks where activity flows back and forth between layers, and “resonances” form

» Gail Carpenter and Stephen Grossberg, “A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine”, Computer Vision, Graphics and Image Processing 37:54, 1987.

Self-Organizing Maps (SOMs)

» Also biologically inspired: “How should the neurons organize their connectivity to optimize the spatial distribution of their responses within the layer?”

» Can be used for clustering (more next week)

» Teuvo Kohonen, “Self-organized formation of topologically correct feature maps”, Biological Cybernetics 43:59-69, 1982.

Page 21:

Applications of NNs

Predicting financial time series
Diagnosing medical conditions
Identifying clusters in customer databases
Identifying fraudulent credit card transactions
Hand-written character recognition (cheques)
Predicting the failure rate of machinery
and many more…

Page 22:

Using a neural network for prediction - 1

Identify inputs and outputs
Preprocess inputs - often scale to the range [0,1] (see the sketch after this list)
Choose a NN architecture (see next slide)
Train the NN with a representative set of training examples (usually using BP)
Test the NN with another set of known examples
» often the known data set is divided into training and test sets. Cross-validation is a more rigorous validation procedure.
Apply the model to unknown input data
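A minimal Python sketch of the scaling and splitting steps above, assuming simple min-max scaling and a random split (the function names are illustrative):

```python
import random

def scale_to_unit(values):
    """Min-max scale a list of numbers to the range [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)  # constant input: nothing to scale
    return [(v - lo) / (hi - lo) for v in values]

def train_test_split(examples, train_fraction=0.8):
    """Randomly divide the known data set into training and test sets."""
    shuffled = examples[:]
    random.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]
```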

Page 23:

Using a neural network for prediction - 2

The network designer must decide the network architecture for a given application

It has been proven that one hidden layer (given enough hidden nodes) is sufficient to handle all situations of practical interest

The number of nodes in the hidden layer will determine the complexity of the NN model (and thus its capacity to recognize patterns)

BUT, too many hidden nodes will result in the memorization of individual training patterns, rather than generalization

Amount of available training data is an important factor - must be large for a complex model

Page 24:

An example

Note that here the network is treated as a “black-box”

[Figure: inputs (living space, size of garage, age of house, heating type, other attributes) feed into a neural network, which outputs the appraised value]

Page 25:

Issues in choosing the training data set

The neural network is only as good as the data set on which it is trained

When selecting training data, the designer should consider:

Whether all important features are covered
What the important/necessary features are
The number of inputs
The number of outputs
Availability of hardware

Page 26:

Preparing data

Preprocessing is usually the most complicated and time-consuming issue when working with NNs (as with any DM tool)

Main types of data encountered:

Continuous data with known min/max values (range/domain known). There are problems with skewed distributions: solutions include removing extreme values or using a log function as a filter
Ordered, discrete values: e.g. low, medium, high
Categorical values (no order): e.g. {“Male”, “Female”, “Unknown”} (use “1 of N coding” or “1 of N-1 coding”; see the sketch after this list)

There will always be other problems where the analyst’s experience and ingenuity must be used
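A minimal sketch of the two codings mentioned above, using the three-valued example from the slide (“1 of N” uses one indicator input per category; “1 of N-1” drops one category, which becomes the all-zeros code):

```python
def one_of_n(value, categories):
    """1 of N coding: one indicator input per category."""
    return [1.0 if value == c else 0.0 for c in categories]

def one_of_n_minus_1(value, categories):
    """1 of N-1 coding: the first category is encoded as all zeros."""
    return [1.0 if value == c else 0.0 for c in categories[1:]]

categories = ["Male", "Female", "Unknown"]
print(one_of_n("Female", categories))        # [0.0, 1.0, 0.0]
print(one_of_n_minus_1("Male", categories))  # [0.0, 0.0]
```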

Page 27:

Illustrative Example - 1 (following http://www.geog.leeds.ac.uk/courses/level3/geog3110/week6/sld047.htm ff.)

Organization
A building society with 5 million customers, using a direct mailing campaign to promote a new investment product to existing savers

Available data
The 5 million customer database
Results of an initial test mailing in which 50,000 randomly selected customers were mailed. There were 1000 responses (2%) in terms of product take-up

Objective
Find a way of targeting the mailing so that:
» the response rate is doubled to 4%
» at least 40,000 new investment holders are brought in
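As a quick worked check of what these targets imply (this arithmetic is not in the slides): to bring in at least 40,000 new investment holders at a 4% response rate, the society must mail at least 40,000 / 0.04 = 1,000,000 customers, i.e. the model must identify the best-responding fifth of the 5 million customer base.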

Page 28:

Illustrative Example - 2

For simplicity, we assume that only two attributes (features) of a customer are relevant in this situation:

TIMEAC: the time (in years) that the account has been open
AVEBAL: the average account balance over the past 3 months

Examining the data, it was obvious to the analysts that the pattern of the respondents is different from that of the non-respondents. But what are the reasons for this?

We need to know the reasons to select/develop a model for identifying such responding customers

Page 29:

Illustrative Example - 3

A neural network can be used to model this data without having to make any assumptions about the reasons for such patterns

Let a neural network learn the pattern from the data and classify the data for us

[Figure: AVEBAL and TIMEAC feed into a neural network, which outputs SCORE]

Page 30:

Illustrative Example - 4: Preparing the training and test data sets

We have 1000 respondents. Randomly split them into a training set and a test set, each balanced with an equal number of non-respondents:

Training set: 500 respondents + 500 non-respondents = 1000 examples
Test set: 500 respondents + 500 non-respondents = 1000 examples

The network is trained by making repeated passes over the training data, adjusting weights using the BP algorithm
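A minimal sketch of this balanced split, assuming respondents and non_respondents are lists of customer records (the names are illustrative):

```python
import random

def balanced_split(respondents, non_respondents, n=500):
    """Balanced training and test sets of n respondents + n non-respondents each."""
    r = random.sample(respondents, 2 * n)       # 1000 respondents, in random order
    nr = random.sample(non_respondents, 2 * n)  # 1000 non-respondents, likewise
    training = r[:n] + nr[:n]
    test = r[n:] + nr[n:]
    return training, test
```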

Page 31:

Illustrative Example - 5: Using the resultant network

Order the score values for the test set in descending order (see next slide)

The 45-degree line shows the results if random ranking is used (since the test set consists of 50% “good” customers)

The extent to which the graph deviates from the 45-degree line shows the power of the model to discriminate between good and bad customers

Now calculate the number of customers required to be mailed to achieve the company objective
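A minimal sketch of this ranking step (a cumulative gains curve), assuming scored is a list of (score, responded) pairs for the test set, with responded equal to 1 for a respondent and 0 otherwise:

```python
def cumulative_gains(scored):
    """Rank customers by descending score and accumulate responders found."""
    ranked = sorted(scored, key=lambda pair: pair[0], reverse=True)
    gains, found = [], 0
    for score, responded in ranked:
        found += responded
        gains.append(found)
    # Plotted against rank, a random ranking tracks the 45-degree line
    return gains
```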

Page 32:

Illustrative Example - 6

[Figure: cumulative gains chart - test-set customers ranked by descending score, compared with the 45-degree line for random ranking]

Page 33:

Illustrative Example - 7

Analysis shows that the company objectives are achievable:

40,000 product holders at a 4% response rate
Hundreds of thousands of dollars can be saved in mailing costs
Better than the other model in this example