of the implementation of the system, rather than
on weaknesses in the algorithms. For example, side-
channel attacks may use timing, power consumption,
or electromagnetic measurements on the security
device. Side-channel attacks are primarily of concern
for biometric encryption systems and match-on-card
devices where an attack could potentially be mounted
by iteratively improving the presented biometric. Very
little research has been done to explore the feasibil-
ity of side-channel attacks, but the success of attacks
on biometric template security and biometric encryp-
tion suggests that such attacks are certainly feasible.
Security and Liveness, Overview
Signal to Noise Ratio
Information is transmitted or recorded by variations in a
physical quantity. For any information storage or trans-
mission system, there will be intended variations in the
physical quantity (the signal) and unintended variations (noise).
noise. In an analog telephone system, the signal (voice) is
represented by variation in a voltage level. As the signal
is
transmitted along a phone line, it can pickup other
unintended variations e.g., leakage of other signals,
static from electrical storms that are noise. The ratio
of the signal level to the noise level is the signal to
noise
ratio (SNR). Since it is a ratio, SNR is dimensionless.
However, SNR can be expressed as either an amplitude
ratio (voltage ratio for the phone example) or a power
ratio (milliwatts for the phone example). This leads to
confusion. SNR is frequently expressed as the log (base
10) of the ratio. When expressed as a log, the dimension-
less unit of SNR is the decibel (dB).
What is signal and what is noise can depend on the
circumstances. Radio waves from lightning are noise to
an AM radio broadcast, but can be signal to a meteo-
rological experiment.
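Because power is proportional to the square of amplitude, the same SNR in decibels is obtained with a factor of 10 on a power ratio or a factor of 20 on an amplitude ratio. A minimal Python sketch (function names are illustrative, not from any standard library):

```python
import math

def snr_db_from_power(p_signal, p_noise):
    """SNR in decibels from power measurements (e.g., milliwatts)."""
    return 10.0 * math.log10(p_signal / p_noise)

def snr_db_from_amplitude(a_signal, a_noise):
    """SNR in decibels from amplitude measurements (e.g., volts).
    Power is proportional to amplitude squared, hence the factor 20."""
    return 20.0 * math.log10(a_signal / a_noise)

# A 10x amplitude ratio and a 100x power ratio describe the same SNR:
print(snr_db_from_amplitude(1.0, 0.1))  # 20.0 dB
print(snr_db_from_power(100.0, 1.0))    # 20.0 dB
```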
Iris Device
Signature Benchmark
Signature Databases and Evaluation
Signature Characteristics
Signature Features
Signature Corpora
Signature Databases and Evaluation
Signature Databases and Evaluation
MARCOS MARTINEZ-DIAZ, JULIAN FIERREZ
Biometric Recognition Group - ATVS,
Escuela Politecnica Superior, Universidad
Autonoma de Madrid, Campus de Cantoblanco,
Madrid, Spain
Synonyms
Signature benchmark; Signature corpora; Signature
data set
Definition
Signature databases are structured sets of collected
signatures from a group of individuals that are used
either for evaluation of recognition algorithms or as
part of an operational system.
Signature databases for evaluation purposes are, in
general, collections of signatures acquired using a
digitizing device such as a pen tablet or a touch-
screen. Publicly available databases allow a fair
performance comparison of signature recognition
algorithms proposed by independent entities. More-
over, signature databases play a central role in public
performance evaluations, which compare different
recognition algorithms by using a common experi-
mental framework. This type of database is covered
in this entry.
On the other hand, signature databases can also be
a module of a verification or identification system.
They store signature data and other personal informa-
tion of the enrolled users. This signature database is
used during the operation of the recognition system
to retrieve the enrolled data needed to perform the
biometric matching. This kind of database is not
addressed here.
Dynamic Signature Databases
Until the beginning of this century, research on auto-
matic signature verification had been carried out using
privately collected databases, since no public ones were
available. This has limited the possibility of comparing
the performance of different systems presented in the
literature, which may have been tuned to specific capture
conditions. Additionally, the use of small data
sets reduces the statistical significance of experiments.
The lack of publicly available databases has also been
motivated by privacy and legal issues, although the
data in these databases are detached from any personal
information. The impact of structural differences in
signatures across cultures must also be taken into
account when considering experimental results on a
specific database. As an example, in Europe, signatures
are usually formed by fast handwriting followed by a
flourish, while in North America, they usually corre-
spond to the signer's name with no flourish. On the
other hand, signatures in Asia are commonly formed
by Asian characters, which are composed of a larger
number of short strokes compared with European or
North American signatures.
While some authors have made public the data-
bases used for their experimental results [1], most
current dynamic signature databases are collected by
the joint effort of different research institutions. These
databases are, in general, freely available or can be
obtained at a reduced cost. Many signature databases
are part of larger multimodal biometric databases,
which include other traits such as fingerprint or face
data. This is done for two main reasons: the research
interest in multimodal algorithms and the low effort
required to incorporate the collection of other biomet-
ric traits once a database acquisition campaign has
been organized.
Two main modalities in signature recognition exist.
Off-line systems use signature images that have been
previously captured with a scanner or camera. On the
other hand, on-line systems employ digitized signals
from the signature dynamics such as the pen position
or pressure. These signals must be captured with spe-
cific devices such as pen tablets or touch-screens.
The most popular databases in the biometric research
community are oriented to on-line verification,
although in some of them, the scanned signature
images are also available [2, 3]. Some efforts have been
carried out in the handwriting community to collect
large off-line signature databases such as the GPDS-960
Corpus [4].
Unlike other biometric traits, signatures can be
forged with relative ease. Signature verification systems
must not only discriminate between traits from different
subjects (as, e.g., fingerprint systems do) but must also
discriminate between genuine signatures and forgeries. In general,
signature databases provide a number of forgeries for
the signatures of each user. The accuracy of the forgeries
depends on the acquisition protocol, the skill of
the forgers, and on how much time the forgers are
given to practice. Nevertheless, forgeries in signature
databases are usually performed by subjects with no
prior experience in forging signatures, which limits
the quality of the forgeries.
Most on-line signature databases have been cap-
tured with digitizing tablets. These tablets are, in
general, based on an electromagnetic principle, allowing
the detection of the pen position (x, y), the pen
inclination angles (azimuth and altitude), and the
pressure p. They allow recording of the pen dynamics
even when the pen is not in contact with the signing
surface (i.e., during
pen-ups). On the other hand, databases captured
with other devices such as touch-screens (e.g., PDAs)
provide only pen position information, which is
recorded exclusively when the pen is in contact with
the device surface.
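As an illustration of the difference between the two device classes, a hypothetical Python record for one captured sample, with a helper that keeps only the pen-down samples a touch-screen would record (all names and field choices are assumptions for illustration, not a standard format):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PenSample:
    """One sample from a digitizing device (illustrative layout)."""
    t: float                        # timestamp in seconds
    x: float                        # horizontal pen position
    y: float                        # vertical pen position
    pressure: float = 0.0           # 0 during pen-ups
    azimuth: Optional[float] = None # pen orientation; unavailable on touch-screens
    altitude: Optional[float] = None

def pen_down_samples(signature: List[PenSample]) -> List[PenSample]:
    """Touch-screen devices (e.g., PDAs) only record these samples;
    electromagnetic pen tablets also capture the hovering pen."""
    return [s for s in signature if s.pressure > 0]
```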
In the following, a brief description of the most
relevant available on-line signature databases is given
in chronological order.
PHILIPS Database
Signatures from 51 users were captured using a Philips
Advanced Interactive Display (PAID) digitizing tablet
at a sampling rate of 200 Hz [5]. The following signals
were captured: position coordinates, pressure, azi-
muth, and altitude.
Each user contributed 30 genuine signatures, leading
to 1,530 genuine signatures. Three types of forgeries are
present in the database: 1,470 over-the-shoulder for-
geries, 1,530 home-improved, and 240 professional
forgeries. There is no fixed number of forgeries avail-
able for each user. Over-the-shoulder forgeries were
produced by letting the forger observe the signing pro-
cess. Home-improved forgeries were produced by giving
the forgers samples of the signature static image and
letting them practice at home. Professional forgeries
were performed by forensic document examiners.
MCYT Bimodal Database
The MCYT bimodal database is comprised of signatures
and fingerprints from 330 individuals [2]. Signa-
tures were acquired using a Wacom Intuos A6 tablet
with a sampling frequency of 100 Hz. The users signed
repeatedly on a paper with a printed grid placed over the
pen tablet. The following time sequences are captured:
position coordinates, pressure, azimuth, and altitude.
There are 25 genuine signatures and 25 forgeries
per user, leading to 16,500 signatures in the database.
For each user, signatures were captured in groups of 5:
first, 5 genuine signatures, then 5 forgeries of another
user's signature, repeating this sequence until 25 signatures
of each type had been performed. Each user provided 5
forgeries for the 5 previous users in the database. As
the user is forced to concentrate on different tasks
between each group of genuine signatures, the varia-
bility between groups is expected to be higher than the
one within the same group.
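The totals follow directly from this protocol; a quick check in Python (a sketch of the acquisition bookkeeping, not official MCYT tooling; the circular wrap-around in `forgers_of` is an illustrative assumption):

```python
users = 330
genuine_per_user = 25    # 5 groups of 5 genuine signatures
forgeries_per_user = 25  # 5 forgeries received from each of 5 other users

total_signatures = users * (genuine_per_user + forgeries_per_user)
print(total_signatures)  # 16500

# Each user forges the 5 users preceding them in the database, so each
# user's signature is in turn forged by the 5 users that follow.
def forgers_of(user_id, n_users=330, window=5):
    return [(user_id + k) % n_users for k in range(1, window + 1)]
```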
Genuine signatures and forgeries corresponding
to 75 users from the MCYT database were scanned
and are also available as an off-line signature database.
This signature corpus is one of the most popular for the
evaluation of signature verification algorithms, being
used by more than 50 research groups worldwide.
BIOMET Multimodal Database
The BIOMET multimodal database [6] is comprised of
five modalities: audio (voice), face, hand, fingerprint,
and signature. The signatures were captured using
a Wacom Intuos2 A6 pen tablet and an ink pen with
a sampling rate of 100 Hz. The pen coordinates,
pen-pressure, azimuth, and altitude signals were cap-
tured. The database contains data from 84 users, with
15 genuine signatures and up to 12 forgeries per user.
Signatures were captured in two sessions separated by
3-5 months. In the first session, 5 genuine signatures
and 6 forgeries were acquired. The remaining 10
genuine signatures and 6 forgeries were captured in
the second session. Forgeries are performed by 4 dif-
ferent users (3 forgeries each). This database contains
2,201 signatures, since not all users have complete data:
8 genuine signatures and 54 forgeries are missing.
SVC2004 Database
Two signature databases were released prior to the
Signature Verification Competition (SVC) 2004 [7]
for algorithm development and tuning. They were
captured using a Wacom Intuos digitizing tablet and
a Grip Pen. Due to privacy issues, users were advised to
use invented signatures as genuine ones. Nevertheless,
users were asked to thoroughly practice their invented
signatures to reach a reasonable level of spatial and
temporal consistency.
The two databases differ in the available data, and
correspond to the two tasks defined in the competi-
tion. One contains only pen position information,
while the other provides pressure and pen orientation
(azimuth and altitude) signals also. Each database con-
tains 40 users, with 20 genuine signatures and 20 for-
geries per user acquired in two sessions, leading to
1,600 signatures per database. Forgeries for each user
were produced by at least four other users, aided by a
visual tool, which represented the signature dynamics
on a display. Both Occidental and Asian signatures are
present in the databases.
SUSIG Database
The SUSIG database consists of two sets: one cap-
tured using a pen tablet without visual feedback
(Blind subcorpus) and the other using a pen tablet
with an LCD display (Visual subcorpus) [8]. There are
100 users per database, but these do not coincide,
as the Visual subcorpus was captured 4 years after
the Blind one. For the Blind subcorpus, a WACOM
Graphire2 pen tablet was used. The Visual subcorpus
was acquired using an Interlink Electronics ePad-ink
tablet, with a pressure-sensitive LCD. In both subcor-
pora, the pen coordinates and the pen pressure signals
were captured using a sampling frequency of 100 Hz.
While performing forgeries, users had prior visual
input of the signing process on a separate screen or
on the LCD display for the Blind and Visual subcorpus
respectively.
For the Blind subcorpus, 8 or 10 genuine signatures
were captured in a single session. The users also
provided 10 forgeries from another randomly selected
user. Two sessions were performed in the Visual sub-
corpus. During each one, users provided 10 genuine
signatures and 5 forgeries.
MyIDea Multimodal Database
This signature set is a subset of the MyIDea Multimodal
Biometric Database [9]. A Wacom Intuos2 A4 graphic
tablet was used at a sampling rate of 100 Hz. Pen
position, pressure, azimuth, and altitude signals were
captured. This data set has the particularity that the
user must read aloud what he is writing, allowing what
the authors call CHASM (Combined Handwriting and
Speech Modalities). This corpus consists of ca. 70 users.
Signatures were captured in 3 sessions. During each
session, each user performed 6 genuine signatures
and 6 forgeries, with visual access to the images of the
target signatures.
BiosecurID Multimodal Database
This database was collected by 6 different Spanish
research institutions [3]. It includes the following bio-
metric traits: speech, iris, face, signature, handwriting,
fingerprints, hand, and keystroke. The data were cap-
tured in 4 sessions distributed over a 4-month time span.
The user population was specifically selected to contain
a uniform distribution of users of different ages
and genders. Nonbiometric data were also stored,
such as age, gender, handedness, vision aids, and manual-
worker status (whether the user has eroded fingerprints).
This allows studying specific demographic groups.
The signature pen-position, pressure, azimuth, and
altitude signals were acquired using a Wacom Intuos3
A4 digitizer at 100 Hz. During each session, two sig-
natures were captured at the beginning and two at the
end, leading to 16 genuine signatures per user. In each
session, each user performed one forgery of the signature
of each of three other users in the database. The skill level
of the forgeries is increased by incrementally showing
the forger more information about the target signature.
In the first session, forgers have only visual access
to one genuine signature; more data (i.e., signature
dynamics) are shown in further sessions, and forgers
are given more time to practice. Off-line signature data are
also available, since signatures were captured using an
inking pen.
BioSecure Multimodal Database
The BioSecure Multimodal Database was collected by
11 European institutions under the BioSecure Network
of Excellence [10]. It has three data sets captured in
different scenarios: DS1 was captured remotely over
the internet, DS2 was acquired in a desktop environ-
ment, and DS3 under mobile conditions. The database
covers face, fingerprint, hand, iris, signature, and
speech modalities and includes two signature subcor-
pora corresponding to the DS2 and DS3 data sets.
These two data sets were produced by the same group
of 667 users. The DS2 data set was captured using a
Wacom Intuos3 A6 digitizer at 100 Hz and an ink pen
while the user was sitting. On the other hand, the DS3
data set was captured with a PDA. Users were asked to
sign while standing and holding the PDA in one hand,
emulating realistic operating conditions. An HP iPAQ
hx2790 with a sampling frequency of 100 Hz was used
as capture device. The pen position, pressure, azimuth,
and altitude signals are available in DS2, while only the
pen position is available on DS3 due to the nature of
the PDA touch-screen.
Signatures were captured in two sessions and in
blocks of 5. An average of two months was left be-
tween each session. During each session, users were
asked to perform 3 sets of 5 genuine signatures and
5 forgeries between each set. Following this protocol,
each user performed 5 forgeries for the previous 4
users in the database. Thus, 30 genuine signatures and
20 forgeries are available for each user. Forgeries are
collected in a worst case scenario. For DS2, the
users had visual access to the dynamics of the signing
process of the signatures they had to forge on a com-
puter screen. In DS3, each forger had access to the
dynamics of the genuine signature on the PDA screen
and a tracker tool allowing them to see the original strokes.
Some users were even allowed to sign following the
strokes produced by the tracker tool, reproducing
thus the geometry and dynamics of the forged signa-
ture with high accuracy.
The DS3 data set is the first multisession database
captured on a PDA and represents a very challenging
database [11]. Apart from the high accuracy of the
forgeries, signatures from DS3 present sampling errors
and irregular sampling rates. Moreover, pen posi-
tion signals during pen-ups are not available, since
the acquisition device captures the pen dynamics only
when the PDA stylus is in contact with the touch-
screen surface.
The capture process for both DS2 and DS3 is shown
in Fig. 1. Examples of signatures from the BioSecure
Signature subcorpora corresponding to DS2 and DS3
are presented in Fig. 2. Unconnected samples indicate
that at least one sample is missing between them due to
sampling errors.
In Table 1, the main features of the described sig-
nature databases are presented.
Signature Verification Evaluation Campaigns
Even when a common database is used, one of the
main difficulties when comparing the performance of
different biometric systems is the different experimental
conditions under which each system is evaluated by
its designers. To overcome these difficulties, evaluations
and competitions provide a common reference
for system comparison on the same database and pro-
tocol. Public evaluations in the field of automatic sig-
nature verification are less common than for other
biometric modalities such as fingerprint or speech.
In particular, only evaluations for the on-line signature
verification modality have been proposed. These in-
clude the Signature Verification Competition (SVC),
which took place in 2004 [7], the signature modality of
the BioSecure Multimodal Evaluation Campaign held
in 2007 [12], and the BioSecure Signature Evaluation
Campaign in 2009 [13].
Signature Verification Competition (SVC 2004)
The Signature Verification Competition (SVC 2004)
represents the first public evaluation campaign in the
field of signature verification [7]. The competition was
divided into two tasks, depending on the available
signature signals. In Task 1, only the pen position
signals (x,y) and the sample timestamps were available.
In Task 2, the pen pressure p and the azimuth and altitude
angles were also available. Participants had prior access
to a signature data set for each task. These data
sets were later released for public access, and are
referred to as the SVC2004 database. Signatures from
40 users are present in each data set. This evaluation
has the particularity that users were advised to use
invented signatures because of privacy issues. More-
over, they did not have visual feedback from the sign-
ing process, since signatures were captured with a
digitizing tablet and a special pen.
The evaluation results were first released to par-
ticipants, who then had the choice to remain anony-
mous. The best Equal Error Rate (EER) in Task 1
was 2.84% against skilled forgeries and 1.85%
against random forgeries. In Task 2 (which included
pressure and pen-inclination signals), the lowest
EERs were 2.89% against skilled forgeries and 1.70%
against random forgeries.
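The EER figures above correspond to the operating point where the false rejection rate (FRR) equals the false acceptance rate (FAR). A minimal sketch of how an EER can be estimated from genuine and impostor match scores (assuming higher scores mean better matches; this is a generic estimator, not the official SVC evaluation code):

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Estimate the EER by scanning thresholds taken from the pooled
    scores and returning the mean of FRR and FAR at the point where
    they are closest."""
    genuine_scores = np.asarray(genuine_scores, float)
    impostor_scores = np.asarray(impostor_scores, float)
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, best_eer = 1.0, 0.0
    for th in thresholds:
        frr = np.mean(genuine_scores < th)    # genuine attempts rejected
        far = np.mean(impostor_scores >= th)  # impostor attempts accepted
        if abs(frr - far) < best_gap:
            best_gap, best_eer = abs(frr - far), (frr + far) / 2
    return best_eer
```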
Signature Databases and Evaluation. Figure 1 PDA signature
capture process in the BioSecure DS3 - Mobile Scenario
dataset (left) and pen-tablet capture process in the BioSecure
DS2 - Access Control Scenario dataset (right). The
acquisition setup and paper template used in DS2 is similar to
the ones used in MCYT, BIOMET, MyIDea and BiosecurID.
BioSecure Multimodal Evaluation Campaign (BMEC 2007)
The BioSecure Multimodal Evaluation Campaign
(BMEC) was held in 2007 with the aim of compar-
ing the performance of verification systems from
different research groups on individual biometric
modalities and fusion scenarios [14]. Two scenarios
were considered: access control and mobile condi-
tions. In particular, the Mobile Scenario consisted of
four modalities and fusion, using a subset of the
BioSecure Multimodal Database DS3 captured on
mobile conditions (i.e., using portable devices such
as a PDA).
Signature Databases and Evaluation. Figure 2 Examples of
signatures and associated signals from the BioSecure
Multimodal Database DS2 and DS3 signature subcorpora captured
using a pen tablet (top) and a PDA (bottom),
respectively. As can be seen, there are missing samples for the
signature captured with PDA, and no signals are available
during pen-ups, contrary to the pen-tablet case.
In this evaluation, a signature subset from the Bio-
Secure Multimodal DS3 database was used. A set of
signatures from 50 users was previously released to
participants for algorithm development and tuning.
For each user, 20 genuine signatures (15 from the first
session and 5 from the second) as well as 20 forgeries
were available.
Eleven signature verification systems from seven
independent European research institutions were pre-
sented to the evaluation. The results of the evaluation
and a description of each system that participated can
be found in [12]. Another evaluation study in similar
conditions, including a comparative analysis with
respect to the BMEC participants, can be found in
[11]. The best Equal Error Rate (EER) in the evalua-
tion was 4.03% for random forgeries and 13.43%
for skilled forgeries. The relatively high EER for skilled
forgeries reveals the high quality of the forgeries
acquired in this database.
BioSecure Signature Evaluation Campaign (BSEC 2009)
The BioSecure Signature Evaluation Campaign is
aimed at measuring the impact of mobile acquisition
conditions, time variability, and the information con-
tent of signatures in the performance of verification
algorithms [13]. Signature subsets from the BioSecure
Multimodal Databases DS2 (pen tablet) and DS3 (PDA
touch-screen) corresponding to 50 users have been
released to participants prior to the evaluation. At the
time of publication, the results of the evaluation cam-
paign were not yet available.
Related Entries
Biometric Sample Acquisition
Off-line signature verification
Performance Evaluation, Overview
Signature Recognition
References
1. Munich, M.E., Perona, P.: Visual identification by signature tracking. IEEE Trans. Pattern Anal. Mach. Intell. 25(2), 200-217 (2003)
2. Ortega-Garcia, J., Fierrez-Aguilar, J., et al.: MCYT baseline corpus: a bimodal biometric database. IEE Proc. Vis. Image Signal Process. 150(6), 391-401 (2003)
3. Fierrez, J., Galbally, J., Ortega-Garcia, J., Freire, M.R., Alonso-Fernandez, F., Ramos, D., Toledano, D.T., Gonzalez-Rodriguez, J., Siguenza, J.A., Garrido-Salas, J., Anguiano-Rey, E., de Rivera, G.G., Ribalda, R., Faundez-Zanuy, M., Ortega, J.A., Cardenoso-Payo, V., Viloria, A., Vivaracho, C.E., Moro, Q.I., Igarza, J.J., Sanchez, J., Hernaez, I., Orrite-Urunuela, C., Martinez-Contreras, F., Gracia-Roche, J.J.: BiosecurID: a multimodal biometric database. Pattern Analysis & Applications (to appear) (2009)
Signature Databases and Evaluation. Table 1 Summary of the most popular on-line signature databases. The symbols x, y, and p denote the horizontal pen position, vertical pen position, and pen pressure; az and alt denote the azimuth and altitude pen inclination angles. The Genuine and Forgeries columns give signatures per user

Name                    Device      Users    Sessions  Genuine  Forgeries  Signals           Interval between sessions
PHILIPS                 Pen tablet  51       3-5       30       up to 70   x, y, p, az, alt  1 week approx.
BIOMET                  Pen tablet  84       3         15       up to 12   x, y, p, az, alt  3-5 months
MCYT                    Pen tablet  330      1         25       25         x, y, p, az, alt  -
SVC2004 Task 1          Pen tablet  40       2         20       20         x, y              min. 1 week
SVC2004 Task 2          Pen tablet  40       2         20       20         x, y, p, az, alt  min. 1 week
SUSIG Blind subcorpus   Pen tablet  100      1         8 or 10  10         x, y, p           -
SUSIG Visual subcorpus  Pen tablet  100      2         20       10         x, y, p           1 week approx.
MyIDea                  Pen tablet  ca. 100  3         18       18         x, y, p, az, alt  days to months
BioSecurID              Pen tablet  400      4         16       16         x, y, p, az, alt  1 month approx.
BioSecure DS2           Pen tablet  ca. 650  2         30       20         x, y, p, az, alt  1 month approx.
BioSecure DS3           PDA         ca. 650  2         30       20         x, y              1 month approx.
4. Vargas, J., Ferrer, M., Travieso, C., Alonso, J.: Off-line handwritten signature GPDS-960 corpus. In: Proceedings of the Ninth International Conference on Document Analysis and Recognition, ICDAR, vol. 2, pp. 764-768. Curitiba, Brazil (2007)
5. Dolfing, J.G.A., Aarts, E.H.L., van Oosterhout, J.J.G.M.: On-line signature verification with Hidden Markov Models. In: Proceedings of the International Conference on Pattern Recognition, ICPR, pp. 1309-1312. IEEE CS Press, Brisbane, Australia (1998)
6. Garcia-Salicetti, S., Beumier, C., Chollet, G., Dorizzi, B., Jardins, J.L.L., Lanter, J., Ni, Y., Petrovska-Delacretaz, D.: BIOMET: A multimodal person authentication database including face, voice, fingerprint, hand and signature modalities. In: Proceedings of the IAPR International Conference on Audio- and Video-based Person Authentication, AVBPA, pp. 845-853. Springer LNCS-2688. Brisbane, Australia (2003)
7. Yeung, D.Y., Chang, H., Xiong, Y., George, S., Kashi, R., Matsumoto, T., Rigoll, G.: SVC2004: First international signature verification competition. In: Proceedings of the International Conference on Biometric Authentication, ICBA, pp. 16-22. Springer LNCS-3072 (2004)
8. Kholmatov, A., Yanikoglu, B.: SUSIG: an on-line signature database, associated protocols and benchmark results. Pattern Analysis & Applications (2008)
9. Dumas, B., Pugin, C., Hennebert, J., Petrovska-Delacretaz, D., Humm, A., Evequoz, F., Ingold, R., Rotz, D.V.: MyIDea - multimodal biometrics database, description of acquisition protocols. In: Proceedings of the Third COST 275 Workshop (COST 275), pp. 59-62. Hatfield, UK (2005)
10. Association BioSecure: BioSecure multimodal database. (http://www.biosecure.info) (2007). Last accessed 03 March 2009
11. Martinez-Diaz, M., Fierrez, J., Galbally, J., Ortega-Garcia, J.: Towards mobile authentication using dynamic signature verification: useful features and performance evaluation. In: Proceedings of the International Conference on Pattern Recognition, ICPR (2008)
12. TELECOM & Management SudParis: BioSecure Multimodal Evaluation Campaign 2007 Mobile Scenario - experimental results. Tech. rep. (2007). (http://biometrics.it-sudparis.eu/BMEC2007/files/Results_mobile.pdf). Last accessed 03 March 2009
13. TELECOM & Management SudParis: BioSecure Signature Evaluation Campaign, BSEC 2009. http://biometrics.it-sudparis.eu/BSEC2009
14. Alonso-Fernandez, F., Fierrez, J., Ramos, D., Ortega-Garcia, J.: Dealing with sensor interoperability in multi-biometrics: the UPM experience at the BioSecure Multimodal Evaluation 2007. In: Defense and Security Symposium, Biometric Technologies for Human Identification, BTHI, Proc. SPIE, vol. 6944. Orlando, USA (2008)
Signature Dataset
Signature Databases and Evaluation
Signature Features
MARCOS MARTINEZ-DIAZ1, JULIAN FIERREZ1,
SEIICHIRO HANGAI2
1Biometric Recognition Group - ATVS, Escuela
Politecnica Superior, Universidad Autonoma de Madrid,
Campus de Cantoblanco, Madrid, Spain
2Department of Electrical Engineering, Tokyo
University of Science, Japan
Synonyms
Signature characteristics
Definition
Signature features represent magnitudes that are extrac-
ted from digitized signature samples, with the aim of
describing each signature as a vector of values. The
extraction and selection of optimum signature features
is a crucial step when designing a verification system.
Features must allow each signature to be described in a
way that maximizes the discriminative power between
signatures produced by different users while tolerating
the variability among signatures from the same user.
On-line signature features can be divided into two
main types. Global features model the signature as
a holistic multidimensional vector and represent mag-
nitudes such as average speed, total duration, and
aspect ratio. On the other hand, local features are
time-functions derived from the signals, such as the
pen-position coordinate sequence or pressure signals,
captured with digitizer tablets or touch-screens.
In off-line signature verification systems, features
are extracted from a static signature image. They can
also be classified as global, if they consider the image as
a whole (e.g., image histogram, signature aspect ratio);
or local, if they are obtained from smaller image
regions (e.g., local orientation histograms).
This entry is focused on on-line signature features,
although a brief outline of off-line signature features
is also given.
Introduction
Several approaches to extract discriminative informa-
tion from on-line signature data have been proposed
in the literature [1]. The existing systems can be broadly
divided into two main types: Global systems, in which a
holistic vector representation consisting of a set of
global features (e.g., signature duration, direction
after first pen-up) is derived from the signature trajec-
tories [2, 3], and function-based systems, in which time
sequences describing local properties of the signature
are used for recognition [4, 5], (e.g., position, acceler-
ation). Although recent works show that global
approaches are competitive with respect to local meth-
ods in some circumstances [6], the latter approach has
traditionally yielded better results. Despite this advan-
tage, systems based on local features usually employ
matching algorithms, which are computationally more
expensive than global-feature ones.
Due to the usually low amount of training data
in signature verification, feature selection techniques
must be applied in order to reduce the feature vector
dimensionality. These techniques allow of finding the
optimal feature set for each system or scenario [7].
Feature extraction and preprocessing
Signature features are, in general, extracted from
the time functions captured from the pen dynamics
while an individual signs. In most cases, the capture
of time functions from the handwritten signature
is carried out with acquisition devices such as digitiz-
ing tablets or touch-screens. These devices provide
pen position information (i.e., horizontal x and verti-
cal y coordinates) and, in some cases, pen pressure
p and pen inclination (azimuth and altitude).
A diagram showing the nature of the captured signals
and an example of the signals from a real signature
are shown in Fig. 1. Other less common examples of
on-line signature acquisition devices are special pens
with dedicated hardware inside that captures signa-
ture data such as coordinate, force, or velocity
information.
The sampling rate of these devices is, in general, bet-
ween 100 and 200 Hz. Since the maximum frequencies
of the pen movements during handwriting are 20-30 Hz
[1], these sampling rates satisfy the Nyquist criterion.
Preprocessing steps before feature extraction may
be performed, such as position, size and rotation nor-
malization, noise filtering, or resampling. In some
works, resampling is avoided as it degrades the velocity
related features [4].
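As a sketch of such preprocessing, the following performs centroid centering and size normalization only; rotation normalization and filtering are omitted, and no resampling is done, so velocity-related features are preserved (function name and conventions are illustrative):

```python
import numpy as np

def normalize_position_and_size(x, y):
    """Center a pen trajectory at its centroid and scale it so the
    larger of its width/height becomes 1 (aspect ratio preserved)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    x, y = x - x.mean(), y - y.mean()
    # Guard against a degenerate (single-point) trajectory.
    scale = max(x.max() - x.min(), y.max() - y.min()) or 1.0
    return x / scale, y / scale
```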
Global features
Global feature-based systems describe each signature
as a multidimensional vector where each element
consists of a feature extracted from the whole pen
trajectory. Many feature sets have been proposed in
the literature [2, 3, 8, 9], with variable sizes and a
maximum size of 100 features [6]. Due to the train-
ing data scarcity and adverse effects of the curse of
dimensionality, feature selection techniques must be
applied to reduce the feature vector size. In Table 1,
the 100 features described in [6] are presented.
This global feature set includes most of the features
described in previous works from other authors.
Features are arranged in the order of descending
individual discriminative power. In Fig. 2, examples
of the distribution of global features presented in
Table 1 are shown.
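To make the notion of global features concrete, the following sketch computes a handful of Table 1-style features from sampled pen data; the dictionary keys and the pressure-based pen-up detection are our own illustrative choices, not part of the feature set definition in [6]:

```python
import numpy as np

def global_features(t, x, y, p):
    """Sketch of a few Table 1-style global features, computed from
    sampled time stamps t, pen coordinates x, y, and pressure p
    (pressure 0 marks a pen-up)."""
    t, x, y, p = map(np.asarray, (t, x, y, p))
    # Velocities estimated by finite differences over the time stamps.
    vx = np.gradient(x, t)
    vy = np.gradient(y, t)
    return {
        "total_duration": t[-1] - t[0],                         # feature 1
        "n_pen_ups": int(np.sum((p[:-1] > 0) & (p[1:] == 0))),  # feature 2
        "std_vy": vy.std(),                                     # feature 6
        "std_vx": vx.std(),                                     # feature 10
    }
```

The whole-signature scope of these quantities is what distinguishes them from the sample-by-sample local features described next.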
Local features
Local features represent time sequences extracted
from the signature raw captured data. A set of local
features leads to a multidimensional discrete se-
quence that describes a signature. Depending on the
matching algorithm, feature sets of varying sizes have
been proposed in the literature. As a rule of thumb,
Dynamic Time Warping-based algorithms employ
few local features, while systems based on Hidden
Markov Models or Gaussian Mixture Models employ
larger feature sets. In Table 2, the most popular local
features found in the literature are presented [2, 3, 4, 5,
10, 11, 12].
As in the case of global features, feature selection
algorithms must be applied to determine the best-
performing feature set. Usually, small feature sets are
selected for Dynamic Time Warping-based matching
algorithms. In these systems, speed-related features
extracted from the first derivative of the pen-coordinate
time sequences (features 10-11 in Table 2) have
been shown to be very effective [4]. On the other hand,
larger feature sets are used when Hidden Markov or
Gaussian Mixture Models are employed [5, 11] for
signature matching. Features related to second-order
derivatives (features 19-27 in Table 2) have not proved
to significantly improve the system verification perfor-
mance [3]. Examples of the local features presented in
Table 2 are depicted in Fig. 3.
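The following sketch shows how the speed-related local features mentioned above, together with the path-tangent angle and path velocity magnitude of Table 2, could be derived from sampled coordinates; central differences are one common choice for the derivative estimate, and the stacked output layout is our own convention:

```python
import numpy as np

def local_features(t, x, y):
    """Sketch of per-sample local features: the pen coordinates, their
    first derivatives (features 10-11 in Table 2), the path velocity
    magnitude (feature 5), and the path-tangent angle (feature 4)."""
    t, x, y = map(np.asarray, (t, x, y))
    x_dot = np.gradient(x, t)            # first derivative of x
    y_dot = np.gradient(y, t)            # first derivative of y
    v = np.hypot(x_dot, y_dot)           # path velocity magnitude
    theta = np.arctan2(y_dot, x_dot)     # path-tangent angle
    # One row per sample: a multidimensional discrete sequence.
    return np.column_stack([x, y, x_dot, y_dot, v, theta])
```

The resulting matrix, with one feature vector per sample, is exactly the kind of multidimensional sequence consumed by the matching algorithms discussed under Signature Matching.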
1186S Signature Features
The usage of features related to pen orientation
(azimuth and altitude) is a subject of controversy.
Although some authors report that these features in-
crease the verification performance [12], others have
reported a low discriminative power for these features
[2]. Moreover, these features are not always available,
since many touch-screen acquisition devices such as
Tablet-PCs or PDAs are unable to capture pen orienta-
tion information.
The fusion of the global and local feature-based
systems has been reported to provide better perfor-
mance than the individual systems [6].
Signature Features. Figure 1 (a) Representation of the position,
azimuth and altitude of the pen with respect to the
capture device. (b) Example of raw captured data from a
signature.
Signature Features. Table 1 Set of global features, sorted by
individual discriminative power (T denotes a time interval,
t a time instant, N a number of events, and θ an angle; some
symbols are defined in other features of the table, e.g., Δ in
feature 7 is defined in feature 15). The legible entries include:

1 signature total duration Ts; 2 N(pen-ups);
3 N(sign changes of dx/dt and dy/dt); 4 average jerk;
5 standard deviation of ay; 6 standard deviation of vy;
7 (standard deviation of y)/Δy; 8 N(local maxima in x);
9 standard deviation of ax; 10 standard deviation of vx;
11 jrms; 12 N(local maxima in y);
13 t(2nd pen-down)/Ts; 14 (average velocity)/vx,max;
16 (x of last pen-up - xmax)/Δx; 17 (x of 1st pen-down - xmin)/Δx;
18 (y of last pen-up - ymin)/Δy; 19 (y of 1st pen-down - ymin)/Δy;
22 (pen-down duration Tw)/Ts;
83 jy,max; 84 θ(2nd pen-down to 2nd pen-up);
85 jmax; 86 spatial histogram t3;
87 (1st t(vy,min))/Tw; 88 (2nd t(xmax))/Tw;
89 (3rd t(xmax))/Tw; 90 (1st t(vy,max))/Tw;
91 t(jmax)/Tw; 92 t(jy,max)/Tw;
93 direction change histogram c2; 94 (3rd t(ymax))/Tw;
95 direction change histogram c4; 96 jy;
97 direction change histogram c3; 98 θ(initial direction);
99 θ(before last pen-up); 100 (2nd t(ymax))/Tw.
(The remaining entries are not legible in this copy.)
Signature Features. Figure 2 Examples of genuine signatures and
forgeries (left) and scatter plots of 4 different
global features from the 100-feature set presented in Table 1
(right). The signatures belong to the BioSecure database
and the Figure has been adapted from [13].
Off-line signature features
Off-line signature verification systems usually rely
on image processing and shape recognition techni-
ques to extract features. As a consequence, additional
preprocessing steps such as image segmentation and
binarization must be carried out. Features are
extracted from gray-scale images, binarized images,
or skeletonized images, among other possibilities.
The feature sets proposed in the literature are notably
heterogeneous, especially when compared with the
case of on-line verification systems. These include,
among others, the usage of image transforms (e.g.,
Hadamard), morphological operators, structural
representations, graphometric features [14], direc-
tional histograms, and geometric features. Readers are
referred to [15] for an exhaustive listing of off-line
signature features.
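As an illustration of one of the feature types listed above, the following sketch computes a directional histogram from a gray-scale signature image; the fixed binarization threshold and the gradient-based direction estimate are simplifying assumptions of ours, not a method from the cited works:

```python
import numpy as np

def direction_histogram(image, threshold=128, n_bins=8):
    """Sketch of a directional-histogram feature for off-line
    verification: binarize a gray-scale image (dark pixels are ink) and
    histogram the local gradient directions at the ink pixels."""
    img = np.asarray(image, dtype=float)
    ink = img < threshold                   # binarization step
    gy, gx = np.gradient(img)               # image gradients
    angles = np.arctan2(gy, gx)[ink]        # direction at ink pixels
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    total = hist.sum()
    return hist / total if total else hist  # normalized histogram
```

In practice such histograms are computed over binarized or skeletonized images, often per image region, before being fed to a classifier.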
Related Entries
Feature Extraction
Off-line Signature Verification
On-line Signature Verification
Signature Matching
Signature Recognition
References
1. Plamondon, R., Lorette, G.: Automatic signature verification and writer identification: the state of the art. Pattern Recogn. 22(2), 107–131 (1989)
2. Lei, H., Govindaraju, V.: A comparative study on the consistency of features in on-line signature verification. Pattern Recogn. Lett. 26(15), 2483–2489 (2005)
Signature Features. Table 2 Extended set of local features. The
upper-dot notation (e.g., ẋn) indicates the time derivative.

1 x-coordinate: xn
2 y-coordinate: yn
3 Pen pressure: zn
4 Path-tangent angle: θn = arctan(ẏn/ẋn)
5 Path velocity magnitude: vn = sqrt(ẋn² + ẏn²)
6 Log curvature radius: ρn = log(1/κn) = log(vn/θ̇n), where κn is the curvature of the position trajectory
7 Total acceleration magnitude: an = sqrt(tn² + cn²) = sqrt(v̇n² + vn²·θ̇n²), where tn and cn are respectively the tangential and centripetal acceleration components of the pen motion
8 Pen azimuth: γn
9 Pen altitude: φn
10-18 First-order derivatives of features 1-9: ẋn, ẏn, żn, θ̇n, v̇n, ρ̇n, ȧn, γ̇n, φ̇n
19-27 Second-order derivatives of features 1-9
28 Ratio of the minimum over the maximum speed over a window of 5 samples: min{vn-4, ..., vn} / max{vn-4, ..., vn}
29-30 Angle of consecutive samples and its first-order difference: αn = arctan((yn - yn-1)/(xn - xn-1)), α̇n
31 Sine: sn = sin(αn)
32 Cosine: cn = cos(αn)
33 Stroke length to width ratio over a window of 5 samples
34 Stroke length to width ratio over a window of 7 samples
3. Richiardi, J., Ketabdar, H., Drygajlo, A.: Local and global feature selection for on-line signature verification. In: Proceedings of the Eighth IAPR International Conference on Document Analysis and Recognition, ICDAR, Seoul, Korea (2005)
4. Kholmatov, A., Yanikoglu, B.: Identity authentication using improved online signature verification method. Pattern Recogn. Lett. 26(15), 2400–2408 (2005)
5. Fierrez, J., Ramos-Castro, D., Ortega-Garcia, J., Gonzalez-Rodriguez, J.: HMM-based on-line signature verification: feature extraction and signature modeling. Pattern Recogn. Lett. 28(16), 2325–2334 (2007)
6. Fierrez-Aguilar, J., Nanni, L., Lopez-Penalba, J., Ortega-Garcia, J., Maltoni, D.: An on-line signature verification system based on fusion of local and global information. In: Proceedings of the IAPR International Conference on Audio- and Video-Based Biometric Person Authentication, AVBPA, Springer LNCS-3546, pp. 523–532 (2005)

Signature Features. Figure 3 Examples of functions from the
27-feature set presented in Table 2 for a genuine signature
(left) and a forgery (right) of a particular subject.
7. Jain, A.K., Zongker, D.: Feature selection: evaluation, application, and small sample performance. IEEE Trans. Pattern Anal. Mach. Intell. 19(2), 153–158 (1997)
8. Nelson, W., Turin, W., Hastie, T.: Statistical methods for on-line signature verification. Int. J. Pattern Recogn. Artif. Intell. 8(3), 749–770 (1994)
9. Lee, L.L., Berger, T., Aviczer, E.: Reliable on-line human signature verification systems. IEEE Trans. Pattern Anal. Mach. Intell. 18(6), 643–647 (1996)
10. Dolfing, J.G.A., Aarts, E.H.L., van Oosterhout, J.J.G.M.: On-line signature verification with Hidden Markov Models. In: Proceedings of the International Conference on Pattern Recognition, pp. 1309–1312. IEEE Press (1998)
11. Van, B.L., Garcia-Salicetti, S., Dorizzi, B.: On using the Viterbi path along with HMM likelihood information for online signature verification. IEEE Trans. Syst. Man Cybern. B 37(5), 1237–1247 (2007)
12. Muramatsu, D., Matsumoto, T.: Effectiveness of pen pressure, azimuth, and altitude features for online signature verification. In: Proceedings of the IAPR International Conference on Biometrics, ICB, Springer LNCS-4642 (2007)
13. Martinez-Diaz, M., Fierrez, J., Galbally, J., Ortega-Garcia, J.: Towards mobile authentication using dynamic signature verification: useful features and performance evaluation. In: Proceedings of the International Conference on Pattern Recognition, ICPR, pp. 1–6 (2008)
14. Sabourin, R.: Off-line signature verification: recent advances and perspectives. Lect. Notes Comput. Sci. 1339, 84–98 (1997)
15. Impedovo, D., Pirlo, G.: Automatic signature verification: the state of the art. IEEE Trans. Syst. Man Cybern. C Appl. Rev. 38(5), 609–635 (2008)
Signature Matching
MARCOS MARTINEZ-DIAZ1, JULIAN FIERREZ1, SEIICHIRO HANGAI2
1Biometric Recognition Group - ATVS, Escuela Politecnica Superior, Universidad Autonoma de Madrid, Madrid, Spain
2Department of Electrical Engineering, Tokyo University of Science, Tokyo, Japan
Synonyms
Signature similarity computation
Definition
The objective of signature matching techniques is to
compute the similarity between a given signature and a
signature model or reference signature set. Several pat-
tern recognition techniques have been proposed as
matching algorithms for signature recognition. In on-
line signature verification systems, signature matching
algorithms have followed two main approaches.
Feature-based algorithms usually compute the similarity
between multidimensional feature vectors extracted
from the signature data using statistical classification
techniques. Function-based algorithms, on the other
hand, perform matching by computing the distance
between time sequences extracted from the signature
data with techniques such as Hidden Markov Models
and Dynamic Time Warping. Off-line signature
matching has followed many different approaches,
most of which are related to image processing and
shape recognition.
This essay focuses on on-line signature matching,
although off-line signature matching algorithms are
briefly outlined.
Introduction
As in other biometric modalities, signature matching
techniques vary depending on the nature of the features
that are extracted from the signature data. In feature-
based systems (also known as global), each signature
is represented as a multidimensional feature vector,
while in function-based systems (also known as local)
signatures are represented by multidimensional time
sequences. Signature matching algorithms also depend
on the enrollment phase. Model-based systems estimate
a statistical model for each user from the training
signature set. In reference-based systems, on the other
hand, the features extracted from the set of training
signatures are stored as a set of template signatures.
Consequently, given an input signature, in model-
based systems the matching is performed against a
statistical model, while in reference-based systems the
input signature is compared with all the signatures
available in the reference set.
Feature-Based Systems
Feature-based systems usually employ classical pattern
classification techniques. In reference-based systems, the
matching score is commonly obtained using a distance
measure between the feature vectors of the input and
template signatures [1, 2], or a trained classifier. Distance
measures used for signature matching include the
Euclidean, weighted Euclidean, and Mahalanobis distances.
In model-based systems, trained classifiers are employed,
including approaches such as Neural Networks, Gaussian
Mixture Models [3], or Parzen Windows [4].
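A sketch of reference-based matching with a Mahalanobis-like distance, under the simplifying assumption (ours) of a diagonal covariance estimated from the user's own template vectors:

```python
import numpy as np

def match_score(query, templates):
    """Reference-based matching sketch: the score is the negated minimum
    Mahalanobis-like distance between the query feature vector and the
    enrolled template vectors, using per-feature variances estimated
    from the templates (diagonal covariance, a common simplification)."""
    templates = np.asarray(templates, dtype=float)
    query = np.asarray(query, dtype=float)
    var = templates.var(axis=0) + 1e-9        # avoid division by zero
    d = np.sqrt((((templates - query) ** 2) / var).sum(axis=1))
    return -d.min()                           # higher score = more similar
```

Taking the minimum over the reference set reflects the comparison of an input signature against all signatures in the reference set, as described in the Introduction.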
Function-Based Systems
In these systems, multidimensional time sequences
extracted from the signature dynamics are used as fea-
tures. Given the similarity of this task to others related
to speaker recognition, the most popular approaches
in local signature verification are related to algorithms
proposed in the speech recognition community.
Among these, signature verification systems using
Dynamic Time Warping (DTW) [5, 6, 7] or Hidden
Markov Models (HMM) [8, 9, 10, 11] are the most
popular approaches in signature verification. In such
systems, the captured time functions (e.g., pen coordi-
nates, pressure, etc.) are used to model each user sig-
nature. In the following, Dynamic Time Warping and
Hidden Markov Models are outlined, and a brief
overview of other techniques is also given.
Dynamic Time Warping
Dynamic Time Warping (DTW) is an application of
the Dynamic Programming principles to the problem
of matching discrete time sequences. DTW was origi-
nally proposed for speech recognition applications
[12]. The goal of DTW is to find an elastic match
among samples of a pair of sequences X and Y that
minimizes a predefined distance measure. The algorithm
is described as follows. Let us define two sequences

X = x1, x2, ..., xi, ..., xI
Y = y1, y2, ..., yj, ..., yJ    (1)

and a distance measure between sequence samples

d(i, j) = |xi - yj|    (2)

A warping path can be defined as

C = c1, c2, ..., ck, ..., cK    (3)

where each ck represents a correspondence (i, j) between
samples of X and Y. The initial condition of the
algorithm is set to

g(1) = g(1, 1) = d(1, 1) w(1)    (4)

where g(k) represents the accumulated distance after
k steps and w(k) is a weighting factor that must be
defined. For each iteration, g(k) is computed as

g(k) = g(i, j) = min over c(k-1) of [ g(k-1) + d(c(k)) w(k) ]    (5)

until the Ith and Jth samples of both sequences, respectively,
are reached. The resulting normalized distance is

D(X, Y) = g(K) / (w(1) + w(2) + ... + w(K))    (6)

where the sum of the weighting factors w(k) compensates
for the effect of the length of the sequences.
The weighting factors w(k) are defined in order to
restrict which correspondences among samples of both
sequences are allowed. In Fig. 1a, a common definition
of w(k) is depicted, and an example of a warping path
between two sequences is given. In this case, only three
transitions are allowed in the computation of g(k).
Consequently, Eq. (5) becomes

g(k) = g(i, j) = min of { g(i, j-1) + d(i, j),
                          g(i-1, j-1) + 2 d(i, j),
                          g(i-1, j) + d(i, j) }    (7)
which is one of the most common implementations
found in the literature. In Fig. 1b, an example of point
correspondences between two signatures is depicted to
visually show the results of the elastic alignment.
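The recursion of Eq. (7) can be sketched as follows, assuming 1-D sequences and the symmetric weighting of Fig. 1a (diagonal steps weighted by 2), under which the weights along any path sum to I + J:

```python
import numpy as np

def dtw_distance(X, Y):
    """DTW between 1-D sequences X and Y using the recursion of Eq. (7):
    horizontal and vertical steps weighted by 1, diagonal steps by 2.
    The accumulated distance is normalized by I + J, the sum of the
    weights along any warping path under this scheme (Eq. (6))."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    I, J = len(X), len(Y)
    d = np.abs(X[:, None] - Y[None, :])   # d(i, j) = |x_i - y_j|, Eq. (2)
    g = np.full((I, J), np.inf)           # accumulated distances
    g[0, 0] = d[0, 0]                     # initial condition, Eq. (4)
    for i in range(I):
        for j in range(J):
            if i == j == 0:
                continue
            candidates = []
            if j > 0:
                candidates.append(g[i, j - 1] + d[i, j])
            if i > 0 and j > 0:
                candidates.append(g[i - 1, j - 1] + 2 * d[i, j])
            if i > 0:
                candidates.append(g[i, j] if False else g[i - 1, j] + d[i, j])
            g[i, j] = min(candidates)     # recursion of Eq. (7)
    return g[-1, -1] / (I + J)            # normalized distance, Eq. (6)
```

A production system would match multidimensional local-feature sequences, replacing the absolute difference with a vector distance.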
The algorithm has been further refined for signature
verification by many authors [5, 7], reaching notable
verification performance. For example, the distance
measure d(i, j) can be defined alternatively, or other
normalization techniques may be applied to the
accumulated distance g(K) between sequences.
DTW can also be applied independently to each
stroke, which may be especially well suited for East
Asian signatures, since they are generally composed of
several strokes. Although the DTW algorithm has been
replaced in speech-related applications by more powerful
approaches such as HMMs, it remains a highly
effective tool for signature verification, as it is well
suited to the small amounts of training data that are
common in signature verification.
Hidden Markov Models
Hidden Markov Models (HMM) have been widely
used for speech recognition applications [13] as well
as in many handwriting recognition applications.
Several approaches using HMMs for dynamic signature
verification have been proposed in recent years
[8, 9, 10, 11]. An HMM represents a doubly stochastic
process, governed by an underlying Markov chain,
with a finite number of states and a set of random
functions that generate symbols or observations,
each of which is associated with one state [11].
Observations in each state are modeled with GMMs
in most speech and handwriting recognition applica-
tions. In fact, GMMs can be considered a single-state
HMM and have also been successfully used for signature
verification [14]. Given a sequence of multidimensional
observation vectors

O = o1, o2, ..., oN

corresponding to a given signature, the goal of HMM-
based signature matching is to find the probability

P(O | M)

that this sequence has been produced by a Hidden
Markov Model, where M is the signature model
computed during enrollment.
The basic structure of an HMM using GMMs to
model observations is defined by the following elements:
the number of hidden states N; the number of Gaussian
mixtures per state M; and the probability transition
matrix A = {aij}, which contains the probabilities of
jumping from one state to another or staying in the
same state.
In Fig. 2, an example of a possible HMM con-
figuration is shown. Hidden Markov Models are
usually trained in two steps using the enrollment
signatures. First, state transition probabilities and
observation statistical models are estimated using
a Maximum Likelihood algorithm. After this, a re-
estimation step is carried out using the Baum-Welch
algorithm. The likelihood between a trained HMM
and an input sequence (i.e., the matching score) is
computed by using the Viterbi algorithm. In [10],
the Viterbi path (that is, the most probable state tran-
sition sequence) is also used as a similarity measure.
A detailed description of Hidden Markov Models is
given in [13].
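A hedged sketch of the scoring step, reduced for brevity to a single diagonal Gaussian per state (i.e., M = 1 mixture) and Viterbi decoding in the log domain; real systems estimate the model with Maximum Likelihood and Baum-Welch training and use full GMM emissions, as described above:

```python
import numpy as np

def viterbi_log_likelihood(obs, log_A, means, variances, log_pi):
    """Score an observation sequence against a trained HMM with one
    diagonal Gaussian per state, returning the log-likelihood of the
    best (Viterbi) state path."""
    obs = np.asarray(obs, float)
    N = len(means)                               # number of hidden states

    def log_b(state, o):                         # log emission probability
        m, v = means[state], variances[state]
        return -0.5 * np.sum(np.log(2 * np.pi * v) + (o - m) ** 2 / v)

    # Initialization with the initial state distribution.
    delta = log_pi + np.array([log_b(s, obs[0]) for s in range(N)])
    # Recursion: best accumulated log-probability per state.
    for o in obs[1:]:
        delta = np.array([
            np.max(delta + log_A[:, s]) + log_b(s, o) for s in range(N)
        ])
    return np.max(delta)                         # log of best-path likelihood
```

The returned value plays the role of the matching score P(O | M); thresholding it (often after normalization) yields the accept/reject decision.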
Within HMM-based dynamic signature verifica-
tion, the existing approaches can be divided in regional
and local. In regional approaches, the extracted time
Signature Matching. Figure 1 (a) Optimal warping path between
two sequences obtained with DTW. Point-to-point
distances are represented with different shades of gray, lighter
shades representing shorter distances and darker
shades representing longer distances. (b) Example of
point-to-point correspondences between two genuine
signatures obtained using DTW.
sequences are further segmented and converted into
a sequence of feature vectors or observations, each
one representing regional properties of the signature
signal [9, 11]. Some examples of segmentation bound-
aries are null vertical velocity points [9] or changes in
the quantized trajectory direction [11]. On the other
hand, local approaches directly use the time functions
as observation sequences for the signature modeling
[8, 10, 14].
Finding a reliable and robust model structure for
dynamic signature verification is not a trivial task.
While overly simple HMMs may not model the users'
signatures properly, overly complex models may fail
to model future realizations due to overfitting. On the
other hand, as simple models have fewer parameters
to estimate, their estimation may be more robust than
that of complex models.
parameters are commonly considered while selecting
an optimal model structure: the number of states and
the number of Gaussian mixtures per state [8]. Some
approaches consider a user-specific number of states
[10], proportional to the average signature duration or
a user-specific number of mixtures [14]. Most of the
proposed systems consider a left-to-right configuration
without skips between states, also known as the
Bakis topology (see Fig. 2).
Other Techniques
Further signature matching techniques include Neural
Networks: Bayesian, multilayer, and time-delay Neural
Networks, as well as radial-basis-function networks,
among others, have been applied to signature matching.
Other examples include structural approaches, which
model signatures as a sequence, tree, or graph of
symbols. Support Vector Machines have also been
applied to signature matching. The reader is referred
to [15] for an exhaustive list of references related to
these approaches.
Fusion of the feature- and function-based
approaches has been reported to provide better perfor-
mance than the individual systems [4].
Off-line Signature Matching
The approaches proposed for off-line signature matching
are notably heterogeneous compared to those used in
on-line signature verification.
image and shape recognition techniques and classical
statistical pattern recognition algorithms. They include
Neural Networks, Hidden Markov Models, Support
Vector Machines and distance-based classifiers among
others. A summary of off-line signature matching tech-
niques can be found in [15].
Related Entries
Off-line Signature Verification
On-line Signature Verification
Signature Features
Signature Recognition
Signature Matching. Figure 2 Graphical representation of a
left-to-right N-state HMM, with M-component GMMs
representing observations and no skips between states.
References
1. Nelson, W., Turin, W., Hastie, T.: Statistical methods for on-line signature verification. Int. J. Pattern Recogn. Artif. Intell. 8(3), 749–770 (1994)
2. Lee, L.L., Berger, T., Aviczer, E.: Reliable on-line human signature verification systems. IEEE Trans. Pattern Anal. Mach. Intell. 18(6), 643–647 (1996)
3. Martinez-Diaz, M., Fierrez, J., Ortega-Garcia, J.: Universal Background Models for dynamic signature verification. In: Proceedings of the IEEE Conference on Biometrics: Theory, Applications and Systems, BTAS, pp. 1–6 (2007)
4. Fierrez-Aguilar, J., Nanni, L., Lopez-Penalba, J., Ortega-Garcia, J., Maltoni, D.: An on-line signature verification system based on fusion of local and global information. In: Proceedings of the IAPR International Conference on Audio- and Video-Based Biometric Person Authentication, AVBPA, Springer LNCS-3546, pp. 523–532 (2005)
5. Sato, Y., Kogure, K.: Online signature verification based on shape, motion and writing pressure. In: Proceedings of the Sixth International Conference on Pattern Recognition, pp. 823–826 (1982)
6. Martens, R., Claesen, L.: Dynamic programming optimisation for on-line signature verification. In: Proceedings of the Fourth International Conference on Document Analysis and Recognition, ICDAR, vol. 2, pp. 653–656 (1997)
7. Kholmatov, A., Yanikoglu, B.: Identity authentication using improved online signature verification method. Pattern Recogn. Lett. 26(15), 2400–2408 (2005)
8. Fierrez, J., Ramos-Castro, D., Ortega-Garcia, J., Gonzalez-Rodriguez, J.: HMM-based on-line signature verification: feature extraction and signature modeling. Pattern Recogn. Lett. 28(16), 2325–2334 (2007)
9. Dolfing, J.G.A., Aarts, E.H.L., van Oosterhout, J.J.G.M.: On-line signature verification with Hidden Markov Models. In: Proceedings of the International Conference on Pattern Recognition, ICPR, pp. 1309–1312. IEEE CS Press (1998)
10. Van, B.L., Garcia-Salicetti, S., Dorizzi, B.: On using the Viterbi path along with HMM likelihood information for online signature verification. IEEE Trans. Syst. Man Cybern. B 37(5), 1237–1247 (2007)
11. Yang, L., Widjaja, B.K., Prasad, R.: Application of Hidden Markov Models for signature verification. Pattern Recogn. 28(2), 161–170 (1995)
12. Sakoe, H., Chiba, S.: Dynamic programming algorithm optimization for spoken word recognition. IEEE Trans. Acoust. 26, 43–49 (1978)
13. Rabiner, L.R.: A tutorial on Hidden Markov Models and selected applications in speech recognition. Proceedings of the IEEE 77(2), 257–286 (1989)
14. Richiardi, J., Drygajlo, A.: Gaussian Mixture Models for on-line signature verification. In: Proceedings of the ACM SIGMM Workshop on Biometric Methods and Applications, WBMA, pp. 115–122 (2003)
15. Impedovo, D., Pirlo, G.: Automatic signature verification: the state of the art. IEEE Trans. Syst. Man Cybern. C Appl. Rev. 38(5), 609–635 (2008)
Signature Recognition
OLAF HENNIGER1, DAIGO MURAMATSU2, TAKASHI MATSUMOTO3, ISAO YOSHIMURA4, MITSU YOSHIMURA5
1Fraunhofer Institute for Secure Information Technology, Darmstadt, Germany
2Seikei University, Musashino-shi, Tokyo, Japan
3Waseda University, Shinjuku-ku, Tokyo, Japan
4Tokyo University of Science, Shinjuku-ku, Tokyo, Japan
5Ritsumeikan University, Sakyo-ku, Kyoto, Japan
Synonyms
Handwritten signature recognition; signature/sign
recognition
Definition
A signature is a handwritten representation of a
person's name. Writing a signature is the established
method for authentication and for expressing deliberate
decisions of the signer in many areas of life, such as
banking or the conclusion of legal contracts. A related
concept is a handwritten personal sign depicting
something other than a person's name. Compared to
text-independent writer recognition methods,
signature/sign recognition works with shorter
handwriting probes, but requires writing the same
name or personal sign every time. Handwritten
signatures and personal signs belong to the behavioral
biometric characteristics, as the person must become
active in order to sign.
Regarding the automated recognition by means of
handwritten signatures, there is a distinction between
on-line and off-line signature recognition. On-line sig-
nature data are captured using digitizing pen tablets,
pen displays, touch screens, or special pens and include
information about the pen movement over time (at
least the coordinates of the pen tip and possibly also the
pen-tip pressure or pen orientation angles over time).
In this way, on-line signature data represent the way a
signature is written, which is also referred to as signa-
ture dynamics. By contrast, off-line (or static) signa-
tures are captured as grey-scale images using devices
such as image scanners and lack temporal information.