Software Dynamics: A New Method of Evaluating Real-Time Performance of Distributed Systems
Janusz Zalewski Computer Science
Florida Gulf Coast University
Ft. Myers, FL 33965-6565
http://www.fgcu.edu/zalewski/
FALSE2002, Nashville, Nov. 14-15, 2002
Talk Outline
• RT Software Architecture
• Evaluating S/W Architectures
• Timeliness & S/W Dynamics
• Conclusion
Basic Components of Real-Time Software Architecture
• Sensor/Actuator component
• User Interface component
• Communication Link component
• Database component
• Processing component
• Timing component.
The idea of grouping I/O information into different categories, which later determine the software architecture, follows the fundamental software engineering principle of separation of concerns (Parnas, 1970s).
We lack good (indeed, any) measures to characterize the behavioral properties of a software module (its dynamics).
Interrupt Latency
The time interval between the occurrence of an external event and the start of the first instruction of the interrupt service routine.
Interrupt Latency Involves
• H/W logic processing
• Interrupt disable time
• Handling higher H/W priorities
• Switching to handler code.
Dispatch Latency
The time interval between the end of the interrupt handler code and the first instruction of the process activated (made runnable) by this interrupt.
Dispatch Latency Involves
• OS decision time to reschedule (non-preemptive kernel state)
• Context switch time
• Return from OS call.
Real-Time Properties
* Responsiveness
* Timeliness
* Schedulability
* Predictability
How to measure these properties?
* Responsiveness - just outlined
* Timeliness - proposed below
* Schedulability - rate monotonic and deadline monotonic analyses.
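For the schedulability side, rate monotonic analysis admits a quick sufficient test via the Liu-Layland utilization bound; a minimal sketch in Python, where the task set itself is hypothetical:

```python
# Sufficient (not necessary) rate monotonic schedulability test using the
# Liu-Layland utilization bound; the task set below is hypothetical.

def rm_utilization_bound(n):
    """Worst-case schedulable utilization for n tasks under rate monotonic."""
    return n * (2 ** (1.0 / n) - 1)

def rm_schedulable(tasks):
    """tasks: list of (execution_time, period) pairs in the same time unit."""
    utilization = sum(c / t for c, t in tasks)
    return utilization <= rm_utilization_bound(len(tasks))

tasks = [(1, 4), (1, 5), (2, 10)]    # (C, T) in msec, hypothetical
print(rm_schedulable(tasks))         # utilization 0.65 vs. bound ~0.78 -> True
```

A task set that fails this test may still be schedulable; an exact answer requires response-time analysis.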
Two measures of timeliness:
* Overall time deadlines are missed (by a task)
* Number of times deadlines are missed by X percent
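Both measures can be computed directly from logged per-request completion times; a minimal sketch, with hypothetical sample data:

```python
# Compute the two timeliness measures from per-request completion times.
# All sample values (times, deadline) are hypothetical.

def overall_miss_time(completion_times, deadline):
    """Total time by which the deadline was missed, summed over all requests."""
    return sum(max(0, t - deadline) for t in completion_times)

def misses_over_pct(completion_times, deadline, pct):
    """Number of requests that missed the deadline by more than pct percent."""
    threshold = deadline * (1 + pct / 100.0)
    return sum(1 for t in completion_times if t > threshold)

times = [180, 210, 250, 195, 320]        # msec, hypothetical
print(overall_miss_time(times, 200))     # 10 + 50 + 120 = 180
print(misses_over_pct(times, 200, 20))   # 250 and 320 exceed 240 -> 2
```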
[Chart] Overall time the deadlines are missed for 100 experiments (CORBA).
[Chart] Overall time (in milliseconds) deadlines are missed for 20 aircraft (in 100 experiments).
[Chart] Number of times deadlines are missed by more than 20% for 20 aircraft (in 100 experiments).
[Diagram] Satellite Ground Control Station: FSI TLM Dish, Microdyne Combiner 3200, Microdyne Rx'er 1200, Internet Gateway, Workstations, DB Server, MUX/DEMUX, SGI Origin 2000, IEEE 1394 Splitter, Bit Sync, VME Bus; Analog I/O (AIO) Subsystem, Frontend (FEP) Subsystem, Realtime Processing (RTP) Subsystem; future expansion.
[Chart] Single DB Client Request Processing Time: time to process request (msec) vs. request number, for one DB client issuing 100 requests.
[Chart] Percent of deadlines missed for one DB Client: % deadlines missed vs. deadline set (msec).
[Chart] Five DB Clients Request Processing Time: time to process request (msec) vs. request number, for Clients #1-#5, 100 requests each.
[Chart] Percent of deadlines missed for five DB Clients: % deadlines missed vs. deadline set (msec), for Clients #1-#5.
Sensitivity: a measure of the magnitude of the system's response to changes.
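The slides report sensitivity values without giving a formula. Purely as an illustration, one could estimate a sensitivity-like figure as the steepest local slope of the %-deadlines-missed curve divided by its average slope; the function name and sample data below are assumptions, not the author's definition:

```python
# Illustration only: estimate a sensitivity-like measure as the steepest
# local slope of a %-deadlines-missed curve divided by its average slope.
# This formula and the sample data are assumptions, not the talk's method.

def max_normalized_slope(deadlines, pct_missed):
    avg_slope = (max(pct_missed) - min(pct_missed)) / (max(deadlines) - min(deadlines))
    local = [
        abs((pct_missed[i + 1] - pct_missed[i]) / (deadlines[i + 1] - deadlines[i]))
        for i in range(len(deadlines) - 1)
    ]
    return max(local) / avg_slope

deadlines = [100, 200, 300, 400, 500]    # msec, hypothetical
pct_missed = [100, 95, 40, 5, 0]         # hypothetical curve
print(round(max_normalized_slope(deadlines, pct_missed), 2))   # 2.2
```

A value near 1 then means the curve falls at a roughly uniform rate; larger values mean the system's behavior changes sharply around some deadline.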
[Chart] Sensitivity (Database): % deadlines missed vs. deadline set (msec). Sensitivity = 1.73.
[Chart] Sensitivity (Telemetry): % deadlines missed vs. deadline set (msec). Sensitivity = 1.00.
[Chart] Sensitivity (GPS): % deadlines missed vs. deadline set (msec). Sensitivity = 1.64.
Time constant: a measure of the speed of the system's response to changes.
• Settling Time: the time when the curve falls to 2% of its maximum
• Time Constant = 0.25 * Settling Time
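Applying this rule to a measured missed-deadline curve can be sketched as follows; the sample curve is hypothetical:

```python
# Time constant from a %-deadlines-missed curve, per the slide's rule:
# settling time = first deadline where the curve falls to 2% of its maximum,
# time constant = 0.25 * settling time. The sample curve is hypothetical.

def settling_time(deadlines, pct_missed):
    threshold = 0.02 * max(pct_missed)
    for d, p in zip(deadlines, pct_missed):
        if p <= threshold:
            return d
    return None   # the curve never settles within the measured range

def time_constant(deadlines, pct_missed):
    st = settling_time(deadlines, pct_missed)
    return None if st is None else 0.25 * st

deadlines = [0, 100, 200, 300, 400, 500, 600]   # msec, hypothetical
pct_missed = [100, 90, 60, 25, 8, 1, 0]
print(settling_time(deadlines, pct_missed))     # 500 (first pct <= 2)
print(time_constant(deadlines, pct_missed))     # 125.0
```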
[Chart] % Deadlines Missed (Database): % deadlines missed vs. deadline set (msec); original curve. Time Constant = 165 ms.
[Chart] % Deadlines Missed (TDA): % deadlines missed vs. deadline set (msec); original curve. Time Constant = 87.5 ms.
[Chart] % Deadlines Missed (GPS): % deadlines missed vs. deadline set (msec); original curve. Time Constant = 15 ms.
[Diagram] Distributed Embedded Simulation Architecture: Applet Interface; IG (VxWorks I); ITS (VxWorks II); SMC (SES/workbench); CGF (ObjecTime).
Statistical measures of timeliness:
* Round-trip time stability
* Service time effect
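Round-trip time stability can be summarized with basic statistics over logged round-trip times; a minimal sketch with hypothetical values, taking the coefficient of variation as one reasonable stability figure:

```python
# Summarize round-trip time stability as mean, standard deviation, and
# coefficient of variation (stdev/mean); a lower CV means more stable timing.
# The sample round-trip times are hypothetical.
import statistics

def rtt_stability(rtts):
    mean = statistics.mean(rtts)
    stdev = statistics.stdev(rtts)
    return mean, stdev, stdev / mean

rtts = [1.0, 1.2, 0.9, 1.1, 1.3, 1.0]    # time units, hypothetical
mean, stdev, cv = rtt_stability(rtts)
print(round(mean, 3), round(cv, 3))      # 1.083 0.136
```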
[Chart] Service time effect for a specific architecture: queue length vs. mean service time (0.01-10, log scale), for non-periodic traffic with inter-arrival time = 0.1, non-periodic traffic with inter-arrival time = 1.75, and periodic traffic only.
[Chart] Round-trip message time for 5-task simulation: time units vs. time units to run simulation, for messages sent to SES from VxWorks 1, to SES from VxWorks 2, and to ObjecTime from SES.