EUROSTAT
DIRECTORS OF METHODOLOGY/IT DIRECTORS
JOINT STEERING GROUP
Luxembourg
June 28, 2016
From indicators to synthesis.
Methodological issues in the construction of complex indicators
Filomena Maggino (University of Florence, Italy)
Synthesis of indicators
Premise
We need to start by
- sketching out the main methodological steps aimed at developing indicators
- clarifying the different issues to face in synthesizing indicators
TOPIC
Premise
• considered a "niche field" from a scientific point of view
• never absent from any conference, workshop, or seminar on measuring socio-economic dimensions during the last decades
1.
Developing indicators and managing the complexity
Developing indicators
(1) a normative exercise
Why talk about "indicators"?
In order to start any measurement process, a crucial guiding principle is identified …
Developing indicators
Three approaches to measurement:
• fundamental process ⇒ not derived from other measures (length, volume)
• derived process ⇒ derived from other measures (density, velocity)
• defining process ⇒ achieved as a consequence of a definition (socio-economic status)
Developing indicators
In the social sciences, the measuring process requires:
• a robust conceptual definition
• a consistent collection of observations
• a consequent analysis of the relationship between observations and defined concepts.
Developing indicators
Indicator ⇒ what relates concepts to reality through observation
but … what actually is an "indicator"?
From Latin:
"indicator" ⇒ "who or what indicates"
"index" ⇒ "anything that is useful to indicate"
Developing indicators
Indicator ⇒ not simply crude statistical information, but a measure organically connected to a conceptual model
Developing indicators
Indicator ⇒ "purposeful statistics" (K. Land)
Developing indicators
index ⇒ indicator
when its definition and measurement occur within a conceptual model and are connected to a defined aim
Developing indicators
Indicators should be developed and managed so that they …
... represent different aspects of reality,
... picture reality in an interpretable way, and
... allow meaningful stories to be told
Developing indicators
RISK: lack of any logical cohesion and consistency ⇒ deforming reality through distorted results (sometimes hidden by applying sophisticated procedures and methods)
The normative nature of the selection of indicators cannot be denied ⇒ the process contains a "subjective" component
Developing indicators
(2) between objectivity and subjectivity
Developing indicators
Some clarification on the use of the term "subjective":
• are we talking about defining the phenomena?
• are we talking about components of the phenomenon?
• are we talking about defining the method of measurement and analysis?
Developing indicators
"subjective" in defining phenomena
The process of describing reality (the conceptual framework) is always subjective ⇒ related to the researcher's view of reality.
The conceptual definition represents only a "small window" through which only some facets of reality can be seen (reductionism).
Developing indicators
"subjective" as one of the components of reality
We can distinguish between:
• objective information, collected by observing reality
• subjective information, collected only from individuals and their assertions
Developing indicators
"subjective" in the measuring process
Sometimes in this context the dichotomy "subjective-objective" is considered equivalent to the dichotomy "qualitative-quantitative". However, the two dichotomies should be kept distinct.
Developing indicators
(3) the hierarchical design
The process allowing indicators to be developed:
hierarchical design ⇒ requires the definition of the different subsequent components
Developing indicators
(3) the hierarchical design
Hierarchical design: components, questions, and definitions
• Conceptual model: What is the phenomenon to be studied? ⇒ It defines the phenomenon, its domains and its general aspects.
• Variables: What aspects define the phenomenon? ⇒ Each variable represents an aspect allowing the phenomenon to be specified consistently with the conceptual model.
• Dimensions: What factors define the aspect to be observed? ⇒ Each dimension represents a factor defining the corresponding variable.
• Basic indicators: In which way should each dimension be measured? ⇒ Each indicator represents what is actually measured in order to investigate each variable and its dimensions.
The hierarchical design also requires defining:
a) the model aimed at data construction,
b) the spatial and temporal ambit of observation,
c) the aggregation levels (among indicators and/or among observation units),
d) the models allowing interpretation and evaluation.
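The hierarchical components above (conceptual model → variables → dimensions → basic indicators) can be pictured as a nested data structure. A minimal sketch; the class names and the well-being example are illustrative, not from the slides:

```python
from dataclasses import dataclass, field

@dataclass
class BasicIndicator:
    name: str                                  # what is actually measured

@dataclass
class Dimension:
    name: str                                  # a factor defining the variable
    indicators: list[BasicIndicator] = field(default_factory=list)

@dataclass
class Variable:
    name: str                                  # an aspect of the phenomenon
    dimensions: list[Dimension] = field(default_factory=list)

@dataclass
class ConceptualModel:
    phenomenon: str                            # what is studied
    variables: list[Variable] = field(default_factory=list)

# Illustrative (hypothetical) instance
model = ConceptualModel(
    phenomenon="well-being",
    variables=[Variable("subjective well-being", [
        Dimension("life satisfaction",
                  [BasicIndicator("overall life satisfaction, 0-10 scale")]),
    ])],
)
```

Each level answers the question posed in the table: the structure makes the hierarchy explicit and traversable.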
"Dimensionality" ⇒ theoretical
Two different situations can be observed:
• uni-dimensional ⇒ the variable assumes a unique, fundamental underlying dimension
• multidimensional ⇒ the variable assumes two or more underlying factors
"Dimensionality" ⇒ theoretical
The correspondence between the defined dimensionality and the selected indicators has to be
demonstrated empirically by testing the selected model of measurement.
Each indicator (K. Land):
- represents a component in a model
- can be measured and analysed to compare different situations/groups and to observe evolutions over time
- can be related and integrated (aggregated) to specify the model
Developing indicators
How many indicators?
1st option: each variable measured by one indicator ⇒ single-indicator approach
weak ⇒ low precision and low accuracy
2nd option: each variable measured by more than one indicator ⇒ multi-indicator approach
necessary with multidimensional variables
Developing indicators
Defining domains
- each variable
- each dimension
refers to domains ⇒ other indicators are needed!
Domains: segments of reality in which the relevant concepts and their dimensions have to be observed and assessed.
Developing indicators
(4) the model of measurement
The relationship between variable and indicators ⇒ model of measurement
[Diagram: conceptual model ⇒ variable I (dimensions 1 and 2) and variable II (dimension 1); each dimension measured by basic indicators B.I. a, b, c]
Two different models: reflective vs. formative
reflective
indicators ⇒ functions of the latent variable
explanatory perspective ⇒ top-down
⇒ changes in the latent variable are reflected in changes in the observable indicators

Statistical assumptions and properties (reflective)
• indicators are
- linearly related
- interchangeable (the removal of an indicator does not change the essential nature of the underlying construct)
• evidence for assessing the model ⇒ internal consistency:
- correlations between indicators can be interpreted only through the presence of latent variables
- two uncorrelated indicators cannot measure the same construct
- each indicator has an error term
• the total variance of each indicator can be expressed as a function of
i. the latent variable (⇒ uni/multi-dimensional ⇒ factors ⇒ communality)
ii. the individual indicator's characteristics (uniqueness)
• errors and disturbance factors are not interrelated and are not correlated with the latent variables
Total variance of each indicator ⇒ sum of three components:
1. common variance
• explained by ⇒ the latent variable (and its dimensionality)
• measured by ⇒ correlations between indicators
2. specific variance
• not correlated with the other indicators
3. error variance
• not correlated with the previous components
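The internal-consistency evidence required by the reflective model can be checked empirically. A minimal sketch using Cronbach's alpha, a standard reliability coefficient (not named in the slides) that rises toward 1 as the indicators share common variance:

```python
import numpy as np

def cronbach_alpha(X):
    """Internal consistency of k reflective indicators.
    X: array of shape (n_units, k_indicators)."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_variances = X.var(axis=0, ddof=1).sum()   # sum of the indicators' variances
    total_variance = X.sum(axis=1).var(ddof=1)     # variance of the sum score
    return k / (k - 1) * (1 - item_variances / total_variance)

# Perfectly correlated indicators -> alpha equals 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Low alpha would suggest that the indicators do not reflect a single latent variable, i.e. the reflective assumption fails.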
Multidimensional latent variables ⇒ exploratory vs. confirmatory approaches

Two different models: reflective vs. formative
formative
indicators ⇒ causal in nature
explanatory perspective ⇒ bottom-up
⇒ changes in the indicators determine changes in the definition/value of the latent variable

Statistical assumptions and properties (formative)
• indicators are not interchangeable (omitting an indicator is omitting part of the construct)
• two uncorrelated indicators can serve as meaningful indicators of the same construct (internal consistency is not important)
• indicators have no error term
From basic indicators to systems of indicators
Proper and accurate application of the hierarchical design ⇒ complex structure
⇒ each indicator measures and represents a distinct component of the phenomenon of interest
⇒ system of indicators
From basic indicators to systems of indicators
CONCEPTUAL MODEL × DOMAINS (1, 2, 3, …): each cell holds the indicator observed in that domain
variable i
- dimension 1
  - sub-dimension 1 ⇒ indicators A, B (one per domain)
  - sub-dimension 2 ⇒ indicator C (one per domain)
- dimension 2 ⇒ indicators … (one per domain)
- dimension … ⇒ indicators …
variable ii ⇒ dimension … ⇒ indicators …
variable iii, variable … ⇒ dimension … ⇒ indicators …
From basic indicators to systems of indicators
(1) functions
Descriptive and explanatory functions:
• Monitoring
• Reporting
Evaluation functions:
• Forecasting
• Accounting
• Program management and performance evaluation
• Assessment
These functions can be seen in cumulative terms (each one requires the previous one):
• Monitoring ⇒ capacity of the system to monitor changes over time and meet the need of improving knowledge
• Reporting ⇒ monitoring + analysis + interpretation
• Forecasting ⇒ observed trends can help in anticipating future trends and planning ex-ante analyses
• Accounting ⇒ supporting decisions concerning the allocation and destination of resources
• Program/performance evaluation ⇒ problem definition, policy choice, evaluation of alternatives, and program monitoring
• Assessment ⇒ certifying or judging subjects (individuals or institutions) by discriminating their performances, or inferring the functioning of institutions, enterprises or systems
From basic indicators to systems of indicators
(2) characteristics of indicators
Indicators developed through the hierarchical process are seen in relation to each other and show a meaningful and precise position in the system, consistently with the conceptual model:
(i) the perspective through which the indicators report the phenomenon to be observed
(ii) the communication context in which the indicators are used
(iii) the interpretation attributed to the indicators in statistical analyses
(iv) the criteria of their adoption
(v) their quality
(i) perspectives of observation
indicators can describe
- a status or a trend
- an objective fact or a subjective expression
- a positive or a negative aspect
- a conglomerative or a deprivational perspective
- a benefit or a cost
- an input or an outcome
- an impact
(ii) communication context
indicators can be
- cold indicators ⇒ complex and difficult, for specialists
- hot indicators ⇒ simple and easy
- warm indicators ⇒ a good balance between quality, comprehensibility and resonance
(iii) interpretation
indicators can be
- descriptive (informative)
- explicative (interpreting another indicator)
- predictive (delineating possible trends)
(iv) criteria
Interpretation of the results according to a specific reference frame, including particular standard values defined a priori:
• reference point (or critical value)
• signpost arrow (comparison with previous performances)
• best practice (a model to be followed)
Example indicator: blood pressure
- reference point: "the pressure is lower than normal"
- signpost arrow: "the pressure is getting lower"
Defining such criteria requires a wide consensus (not easy to reach) and involves cultural paradigms, normative demands, expert groups' pressure, and shared wishful ideas.
(v) quality
AN INDICATOR SHOULD BE:
• clear, meaningful, consistent ⇒ in describing the conceptual model and in relating to the defined aims and objectives (ACCURACY AND VALIDITY)
• appropriate, exhaustive, pertinent ⇒ in meeting the requirements underlying its construction: knowing, monitoring, evaluating, accounting, … (METHODOLOGICAL SOUNDNESS)
• repeatable, robust ⇒ in measuring the underlying concept with as low a degree of distortion as possible (PRECISION)
• reproducible, stable (RELIABILITY)
• transparent, ethically correct ⇒ in data collection and dissemination (OBJECTIVITY, INTEGRITY)
• relevant, credible ⇒ in meeting users' needs (APPROPRIATENESS, SERVICEABILITY)
• practicable, up-to-datable, thrifty ⇒ in observing through realistic efforts and costs in terms of development and data collection (PARSIMONY)
• well-timed, timely, punctual ⇒ in reporting the results with a short length of time between observation and communication; periodic, regular ⇒ in observing the phenomenon over time (AVAILABILITY)
• discriminant, disaggregable ⇒ in recording differences and disparities between units, groups, geographical areas and so on (COMPARABILITY)
• accessible, interpretable, comprehensible, simple, manageable ⇒ in being findable, accessible, useable, analyzable, and interpretable (USABILITY, ACCESSIBILITY)
Managing indicators: instructions for use
(1) A challenge: complexity
- Multidimensionality ⇒ different aspects to be identified, not necessarily consistent among themselves
- Nature ⇒ objective vs. subjective; quantity vs. quality
- Levels of observation ⇒ micro vs. macro
- Dynamics ⇒ internal levels vs. external conditions; trends, not necessarily linear; relationships between phenomena
Managing indicators: instructions for use
(2) A need: making relative
From the conceptual point of view, in terms of:
• consistency with the reference concept
• adequacy with reference to the territory (e.g., the number of hospital beds)
Making relative has strong implications for the comparability of indicators:
- over time
- across territories / areas
- between groups
at the level of concepts, data, and analysis.
Making relative has strong implications for the statistical treatment of indicators ⇒ equivalence / invariance of:
- sampling design
- questionnaire design
- data collection method
- …
Making relative has strong implications for the statistical treatment of indicators ⇒ normalization, which should consider:
• data properties
• the original meaning of indicators
• the values to be emphasized or penalized
• whether or not absolute values are used
• whether cases are compared to each other or to a reference unit
• whether units are evaluated across time
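The normalization choices listed above can be illustrated with three common transformations. A sketch; which one is appropriate depends on the data properties and on whether cases are compared to each other, to a reference unit, or across time:

```python
import numpy as np

def min_max(x):
    """Rescale to [0, 1]; emphasizes position between the observed extremes.
    Assumes the indicator actually varies across cases (max > min)."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def z_score(x):
    """Standardize; compares cases to the mean of the distribution."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std(ddof=1)

def ratio_to_reference(x, ref):
    """Compare cases to a reference unit (e.g., a benchmark country)."""
    return np.asarray(x, dtype=float) / ref

x = [10.0, 20.0, 40.0]
scaled = min_max(x)                    # 10 maps to 0, 40 maps to 1
relative = ratio_to_reference(x, 20.0) # values expressed as multiples of the reference
```

Note that min-max depends on the observed extremes, so adding a unit or a new time point can change every normalized value; z-scores instead shift with the mean of the set being compared.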
Managing indicators: instructions for use
(3) A risk: reductionism
reductionism ⇒ unavoidable, yet dangerous
System of indicators
Managing indicators: instructions for use
(3) A risk: reductionism
System of indicators
The complexity of the system of indicators may require approaches allowing more synthetic
views through more comprehensive measures ...
… by taking into account that
Managing indicators: instructions for use
(3) A risk: reductionism
System of indicators
… by taking into account that
all elements included in the system are
organically integrated
�they are not chosen independently
System of indicators
Reducing data complexity ⇒ to be considered an integral part of the process leading to indicator development.
Solutions:
(a) reducing the number of indicators
⇒ need for a solid conceptual framework
⇒ statistical rationale ⇒ correlations (e.g., the number of firemen vs. the amount of fire damage)
(b) synthesizing indicators
⇒ statistical rationale ⇒ correlations (e.g., composite indicators)
Composite indicators (Noll, 2009):
- answer the call by policy makers for condensed information
- improve the chance of getting into the media
- allow multi-dimensional phenomena to be made uni-dimensional
- allow situations to be compared more easily across time
- compare cases (e.g., countries) in a transitive way (ranking and benchmarking)
- allow clear-cut answers to defined questions (related to change across time, differences between population groups, or comparisons between cities, countries, and so on)
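A composite indicator in its simplest form: normalize the basic indicators, then aggregate them with weights into one score per case. A minimal sketch; real composites involve many more choices of normalization, weighting, and aggregation than this equal-weight arithmetic mean:

```python
import numpy as np

def composite(X, weights=None):
    """Weighted arithmetic aggregation of min-max normalized indicators.
    X: (n_cases, k_indicators); returns one synthetic score per case.
    Assumes every indicator varies across cases and 'higher is better'."""
    X = np.asarray(X, dtype=float)
    normed = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    if weights is None:
        w = np.full(X.shape[1], 1.0 / X.shape[1])   # equal weights by default
    else:
        w = np.asarray(weights, dtype=float)
    return normed @ (w / w.sum())

# Three cases, two indicators -> scores 0.0, 0.5, 1.0 (transitive ranking)
scores = composite([[1.0, 50.0], [3.0, 70.0], [5.0, 90.0]])
```

This is exactly where the reductionism risk bites: the ranking is clear-cut, but it depends entirely on the normalization and weighting choices.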
2.
Dealing with syntheses
in a system of indicators
Introduction
A synthesis should be meaningful ⇒ telling stories
[Pictured examples: "This is a meaningful synthesis" … "Is this a meaningful synthesis?"]
What makes a synthesis meaningful?
⇒ a conceptual framework producing a system of indicators
Statistics offer many instruments aimed at synthesizing …
Instruments
But synthesizing indicators is not only a matter of having analytical instruments.
Aspects of the system that can be synthesized:
• synthesis of units (cases, subjects, etc.) ⇒ aggregating the individuals' values of an indicator observed at micro level; aggregation goes through the rows (units), with reference to each indicator, in order to obtain a macro-unit's synthetic value
• synthesis of basic indicators ⇒ aggregating the values referring to several indicators for each unit (micro or macro); aggregation goes through the columns (indicators), with reference to each case, in order to obtain each case's synthetic value
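The two aggregation directions (rows ⇒ units, columns ⇒ indicators) correspond to collapsing different axes of the units × indicators matrix. A small sketch with illustrative numbers:

```python
import numpy as np

# rows = units (e.g., individuals), columns = basic indicators
data = np.array([[1.0, 4.0],
                 [2.0, 5.0],
                 [3.0, 6.0]])

# synthesis of units: collapse the rows, one value per indicator
# (e.g., a regional average of each indicator) -> array([2., 5.])
macro_values = data.mean(axis=0)

# synthesis of indicators: collapse the columns, one synthetic value per unit
# -> array([2.5, 3.5, 4.5])
unit_scores = data.mean(axis=1)
```

The same matrix supports both syntheses; only the axis of aggregation changes.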
Synthesizing units
MACRO-UNITS DEFINITION
Generally, pre-existent / pre-defined partitions, such as
- identified groups (social, generational, etc.)
- areas (geographical, administrative, etc.)
- time periods (years, decades, etc.)
Synthesizing units
Looking for a value synthesizing the distribution
Generally, the synthesis is obtained by averaging individuals' values at the level of interest (country, region, social group, and so on).
Which value?
Synthesizing units
Looking for a value synthesizing the distribution: different indexes according to the nature of the data

Nature of data                     | Value                        | Index
qualitative, disjoint              | label                        | mode
qualitative, ordinal               | natural / conventional order | median
quantitative (additive), discrete  | natural number               | mean
quantitative (additive), continuous| real number                  | mean
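The table above pairs the nature of the data with a synthesis index. A minimal sketch with Python's `statistics` module; the category labels and codings are illustrative:

```python
from statistics import mode, median, mean

# qualitative, disjoint (nominal labels) -> mode
top_category = mode(["urban", "rural", "urban"])   # "urban"

# qualitative, ordinal (conventional order, coded 1..5) -> median
middle_level = median([1, 2, 2, 4, 5])             # 2

# quantitative (additive) -> mean
average = mean([1, 3, 5])
```

Using the mean on ordinal codes would treat the conventional order as if the distances between levels were meaningful, which is exactly what the table warns against.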
Synthesizing indicators
Different perspectives to be taken into account:
a. Conceptual perspective ⇒ the conceptual design that guided the definition of the indicators (variables, dimensions, domains)
b. Model-of-measurement perspective ⇒ the theoretical definition of the indicators (reflective or formative)
c. Technical perspective ⇒ the technical issues of synthesis (weighting, aggregation techniques)
Synthesizing indicators
a. Conceptual perspective
Consistent application of the hierarchical design ⇒ complex structure:
- variables
- indicators
- domains
- cases
- etc.
[Conceptual model: a matrix crossing domains (1, 2, 3, …) with variables; each variable is articulated into dimensions and sub-dimensions, and each sub-dimension into indicators (indicator A, B, C, …) observed in every domain.]
The synthesis can be achieved through different perspectives:
- for each uni-dimensional variable
  - for a single domain (e.g., satisfaction at work)
  - across domains (e.g., life satisfaction → all domains)
- for each multidimensional variable
  - for a single domain (e.g., subjective wellbeing at work)
  - across domains (e.g., subjective wellbeing → all domains)
- across variables
  - for a single domain (e.g., work → all concepts)
Sometimes, somebody wishes to synthesize … too much …
b. Model-of-measurement perspective

The relationship between variable and indicators → the model of measurement.

[Diagram: the conceptual model links each variable to its dimensions, and each dimension to its basic indicators (B.I. a, B.I. b, B.I. c).]
Reflective model → indicators are functions of the latent variable.
Explanatory perspective → top-down: changes in the latent variable are reflected in changes in the observable indicators.
Formative model → indicators are causal in nature.
Explanatory perspective → bottom-up: changes in the indicators determine changes in the definition / value of the latent variable.
Aggregation based upon latent variables

When basic indicators are:
- Reflective → high correlations
  - variables difficult to observe
  - synthesis easily interpretable
- Formative → no correlation
  - variables difficult to interpret
  - variable with a normative meaning
The hypothesis can be tested through structural models, which:
• can estimate correlations between latent variables
• cannot produce individual scores (undetermined scores)
c. Technical perspective

Two different general approaches:
- aggregative-compensative techniques
  - based on correlations (reflective approach)
  - based on weights (formative approach)
- non-aggregative synthetic techniques
  - based on discrete mathematics
Aggregative-compensative techniques require decisions on:
1. level of aggregation (micro/macro level)
2. dimensionality (dimensional analysis)
3. importance of each indicator (weighting criteria) → formative indicators
4. aggregating approach (aggregation technique)
1. Level of aggregation

In many cases, indicators observed at the micro (individual) level can be synthesized at the micro level or at the macro level.

Example
- Variable → subjective wellbeing
- Dimension → affective
- Indicators → positive affects and negative affects
- Synthesis → affect balance

Computation
- micro version → (positive affects) – (negative affects), computed within each individual
- macro version → (positive affects) – (negative affects), computed on the aggregated values
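A minimal sketch of the micro vs. macro computation of the affect balance, on hypothetical per-individual (positive affects, negative affects) scores. For the simple difference the two levels coincide by linearity; the contrast matters for nonlinear syntheses, illustrated here with a ratio.

```python
from statistics import mean

# Hypothetical (positive affects, negative affects) per individual.
people = [(4.0, 2.0), (3.0, 3.0), (5.0, 1.0)]

# micro version: synthesize within each individual, then aggregate
micro = mean(p - n for p, n in people)

# macro version: aggregate each indicator first, then synthesize
macro = mean(p for p, _ in people) - mean(n for _, n in people)

# For a difference the two coincide (linearity of the mean);
# they diverge for nonlinear syntheses such as a ratio of affects:
micro_ratio = mean(p / n for p, n in people)
macro_ratio = mean(p for p, _ in people) / mean(n for _, n in people)
print(micro, macro, micro_ratio, macro_ratio)
```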
2. Dimensionality

Indicators are developed through a conceptual process:
concept → variable → dimension → indicator
Testing dimensionality proceeds in the opposite direction:
indicator → dimension → variable → concept
→ testing the model of measurement
→ testing the level of complexity
in order to evaluate the synthesis approach.
Highly correlated indicators
- Reflective → the indicators refer to the same conceptual dimension → consistency: they can be synthesized.
- Formative → redundancy: two highly correlated indicators are considered redundant. Recommendation → select only one.
Dimensionality can be analyzed through (reflective approach):
• Correlation Analysis
• Principal Component Analysis
• Multidimensional Scaling
• Cluster Analysis
• Factor Analysis
• Item Response Theory
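A dimensionality-check sketch on simulated reflective indicators (hypothetical data generated from a single latent variable): high pairwise correlations and a dominant first principal component support synthesizing the indicators into one dimension.

```python
import numpy as np

# Simulate three reflective indicators driven by one latent variable.
rng = np.random.default_rng(0)
latent = rng.normal(size=500)
X = np.column_stack([latent + 0.3 * rng.normal(size=500) for _ in range(3)])

corr = np.corrcoef(X, rowvar=False)        # correlation analysis
eigvals = np.linalg.eigvalsh(corr)[::-1]   # PCA on the correlation matrix
explained = eigvals[0] / eigvals.sum()     # variance share of 1st component
print(corr.round(2))
print(round(float(explained), 2))          # close to 1 → one dominant dimension
```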
3. Importance

The weighting system: a weight expresses an indicator's importance in measuring the conceptual dimension.

Who / what defines the importance?
Decisions concern:
- the proportional size of weights → equal or differential weighting
- the approach to obtain weights → objective or subjective
- the level for obtaining and applying weights → individual or group
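The stakes of the equal-vs.-differential choice can be sketched on hypothetical data: the same two units swap rank when the weights change, which is why weighting is a normative decision, not a neutral one.

```python
# Hypothetical units with two normalized indicator scores each.
units = {"A": (0.9, 0.2), "B": (0.4, 0.8)}

def composite(scores, weights):
    # weighted arithmetic aggregation
    return sum(w * s for w, s in zip(weights, scores))

equal = (0.5, 0.5)
differential = (0.8, 0.2)  # first indicator judged more important

top_equal = max(units, key=lambda u: composite(units[u], equal))
top_diff = max(units, key=lambda u: composite(units[u], differential))
print(top_equal, top_diff)  # the top-ranked unit changes with the weights
```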
4. Aggregating approach

Choosing the aggregation technique. Criteria → does the aggregating technique …
→ admit compensability among the indicators?
→ require comparability among indicators (direction and distribution)?
→ require homogeneity in the indicators' levels of measurement?
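The compensability criterion can be illustrated on hypothetical normalized scores: the arithmetic mean lets a high indicator fully offset a low one, while the geometric mean penalizes unbalanced profiles.

```python
from statistics import geometric_mean, mean

# Hypothetical scores normalized to (0, 1].
balanced = [0.5, 0.5]
unbalanced = [0.9, 0.1]

print(mean(balanced), mean(unbalanced))  # both 0.5: full compensation
print(geometric_mean(balanced))          # 0.5
print(geometric_mean(unbalanced))        # ~0.3: imbalance is penalized
```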
Moreover …
1. testing the robustness: the capacity of the synthesis to produce correct and stable measures
• uncertainty analysis
• sensitivity analysis
Testing robustness
→ evaluating the role and consequences of the choices made with reference to:
• the procedure for data management (missing data imputation, data standardization, …)
• the weights definition adopted
• the aggregation technique applied

This process can be included in the wider field of "what-if" analysis.
Testing robustness → two stages

1. Uncertainty analysis: to analyze to what extent the synthetic indicator depends on the information composing it.
Each individual score → a scenario; a scenario → a combination of choices producing a certain synthetic value.
2. Sensitivity analysis: to evaluate the contribution of each identified source of uncertainty by decomposing the total variance of the synthetic score obtained.

[Example decomposition of a composite indicator's variance: data selection (25%), selection of sub-indicators (20%), weighting scheme (15%), weights' values (15%), formula (12%), data editing (8%), data normalisation (5%).]
Sensitivity analysis also tests how much the synthetic score is sensitive to different choices (small differences reveal low sensitivity).
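An uncertainty-analysis sketch on hypothetical data: each scenario is one combination of methodological choices (here, weighting scheme × aggregation rule), and the spread of the synthetic score across scenarios measures how strongly it depends on those choices.

```python
import math
from statistics import mean, pstdev

# Hypothetical normalized indicator values for one unit.
indicators = [0.6, 0.8, 0.3]
weight_schemes = [(1/3, 1/3, 1/3), (0.5, 0.3, 0.2), (0.2, 0.3, 0.5)]

def aggregate(vals, weights, rule):
    if rule == "arithmetic":
        return sum(w * v for w, v in zip(weights, vals))
    # weighted geometric aggregation
    return math.prod(v ** w for w, v in zip(weights, vals))

# Each scenario = one combination of choices; collect all synthetic scores.
scores = [aggregate(indicators, w, r)
          for w in weight_schemes
          for r in ("arithmetic", "geometric")]
print(round(mean(scores), 3), round(pstdev(scores), 3))
```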
2. testing the discriminant capacity: the capacity of the final score to discriminate between cases (e.g., cut-points, cut-offs, …)
Exploring the capacity of the synthetic score in:
• discriminating between cases and/or groups (statistical tests)
• distributing all the cases without any polarization
• identifying particular values or reference scores
(cut-point → continuous data; cut-off → discrete data)
d. Criticisms

Conceptual, methodological and technical criticisms:
→ the aggregative-compensative approach is not able
- to reflect the complexity of a phenomenon
- to capture the complexity of the variables' relationships

Those in favor of the aggregative-compensative approach argue that it is
- objectively built
- easy to manage
- easy to communicate
Methodology is far from being aseptic. Each stage introduces some degree of arbitrariness in taking decisions concerning:
a. Data metrics → ordinal data treated as cardinal
b. Indicators selection → use of multidimensional analysis
c. Weights definition → objective vs. subjective
d. Indicators aggregation → no homogeneity, compensation, numerical weights
e. Indicators assessment → are the compared combinations really comparable?
4. Aggregating approach — a question: … and what about ordinal data?
Non-aggregative synthetic techniques

Main goal: assessing variables in a multidimensional ordinal setting by
1) respecting the ordinal nature of the data
2) avoiding any aggregation among indicators (i.e., no composite is computed)
3) producing a synthetic index

Main tool: Partial Order Theory
1. The focus is not on dimensions, but on «profiles» → combinations of ordinal scores, describing the «status» of an individual.
2. Profiles are mathematically described and analyzed through Partially Ordered Set (Poset) Theory, instead of using classical linear algebra tools (variances, correlations, …).
By considering all this, new challenges and perspectives can be identified in order to improve the technical strategies allowing social indicators to be constructed and managed, with reference to the challenging issues in managing complexity in terms of both:
• reducing the data structure in order to aggregate and combine both observational units (from micro-units to macro-units) and indicators
• correctly and significantly communicating the "picture" obtained through the indicators (correctly presenting results).
Partial Order Theory

Let's consider:
- three indicators d1, d2, d3
- their ordinal degrees 1, 2, 3, 4
(d1, d2, d3) → an individual profile

How many profiles with 3 four-degree indicators?
4 x 4 x 4 = 64 profile combinations
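The profile space above can be sketched directly: with three ordinal indicators, each taking degrees 1..4, the profiles are the Cartesian product of the degree sets.

```python
from itertools import product

# Three indicators (d1, d2, d3), each on ordinal degrees 1..4.
degrees = range(1, 5)
profiles = list(product(degrees, repeat=3))

print(len(profiles))              # 4 x 4 x 4 = 64
print(profiles[0], profiles[-1])  # worst profile (1, 1, 1), best (4, 4, 4)
```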
We cannot
- sum up the degrees
- manipulate them using classical statistical tools
- compute aggregated indicators
- …

We can say that:
• some profiles are better than others; for example, (4,3,3) is better than (2,3,2)
• some profiles are incomparable; for example, (4,3,3) and (3,3,4): the first profile is better on the first component, but worse on the third. So, which is the best of the two? We cannot say!
So, some profiles can be ordered, others cannot. What we get is a
PARTIAL ORDER
(«partial» since not every pair of profiles can be ordered).

We can represent the partial order of the well-being profiles very effectively by means of a graph, called a «Hasse diagram» (from the name of the German mathematician Helmut Hasse).
[Hasse diagram of the profiles, from the worst profile at the bottom to the best profile at the top.]
Only profiles linked by a descending path can be ordered. Two profiles not linked by a descending path are incomparable, even if they belong to different «layers». The meaning of the colors is explained below.
In order to make incomparable profiles comparable, a threshold should be defined, e.g.:
(1,2,2) (2,1,2) (2,2,1)

How to establish the threshold? → a normative problem
[Hasse diagram with the threshold marked.]
Green: profiles above the threshold.
Black: ambiguous profiles.
Red: profiles on or below the threshold.
The idea is to quantify this «relational position» in the graph. This idea can be formalized within partial order theory. What is important to stress here is that in this way we get a synthetic measure of deprivation
WITHOUT AGGREGATING ORDINAL SCORES
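The threshold-based coloring can be sketched with dominance alone. The classification rule below is one plausible reading of the slide's color coding, not the exact poset procedure used in the talk (which refines the treatment of ambiguous profiles further).

```python
from itertools import product

# Component-wise dominance: p >= q on every indicator.
def dominates(p, q):
    return all(pi >= qi for pi, qi in zip(p, q))

# Threshold profiles from the slide.
threshold = [(1, 2, 2), (2, 1, 2), (2, 2, 1)]

def classify(profile):
    # Hypothetical rule mirroring the color coding:
    if any(dominates(t, profile) for t in threshold):
        return "red"    # on or below the threshold
    if any(dominates(profile, t) for t in threshold):
        return "green"  # above the threshold
    return "black"      # incomparable with the threshold: ambiguous

profiles = list(product(range(1, 5), repeat=3))
counts = {c: sum(classify(p) == c for p in profiles)
          for c in ("red", "green", "black")}
print(counts)
```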
Final remarks

All elements composing the process should be organically integrated → they cannot be chosen independently.

However, sometimes they appear unconnected; in particular, data and statistical approaches are not able to rebuild the conceptual framework → this is obtained through subsequent approximations of the whole process.
Is this acceptable in a complex structure? How can we build a correct process without producing a chain of approximations?

In order to obtain a meaningful and interpretable picture, data should be managed in some way… in other words, indicators obtained through the hierarchical design are, and remain, a complex system.
Many thanks for your attention!