Page 1: Chapter 1

Chapter 1

Introduction

CONTENTS

1.1 DESCRIPTION OF THE QUEUEING PROBLEMS..............................................- 4 -

1.2 CHARACTERISTICS OF QUEUEING PROCESSES............................................- 5 -

1.3 NOTATION.......................................................................................................- 9 -

1.4 MEASURING SYSTEM PERFORMANCE........................................................- 10 -

1.5 SOME GENERAL RESULTS...........................................................................- 11 -

1.5.1 LITTLE’S FORMULAS........................................................................- 13 -

1.6 SIMPLE DATA BOOKKEEPING FOR QUEUES.............................................- 16 -

1.7 POISSON PROCESS AND THE EXPONENTIAL DISTRIBUTION.....................- 21 -

1.8 MARKOVIAN PROPERTY OF THE EXPONENTIAL DISTRIBUTION..............- 27 -

1.9 STOCHASTIC PROCESSES AND MARKOV CHAINS......................................- 29 -

1.9.1 MARKOV PROCESS............................................................................- 30 -

1.9.2 DISCRETE-PARAMETER MARKOV CHAINS......................................- 32 -

1.9.3 CONTINUOUS-PARAMETER MARKOV CHAINS.................................- 34 -

1.9.4 IMBEDDED MARKOV CHAINS...........................................................- 39 -

1.9.5 LONG-RUN BEHAVIOR OF MARKOV PROCESS: LIMITING

DISTRIBUTION, STATIONARY DISTRIBUTION, ERGODICITY............- 41 -

1.9.6 ERGODICITY......................................................................................- 45 -

1.10 STEADY-STATE BIRTH-DEATH PROCESSES.............................................- 50 -


Page 3: Chapter 1

Queueing System

Wait in line at post office, supermarket, highway, …, etc.

Wait in line within a time-sharing computer system.

Wait in line before a server-farm system.

Wait in line before a switch/router system.

Wait in line before a protocol layer program module.

Wait in line in a statistical multiplexer.

Design Issues:

Design system parameters so that the queueing system achieves optimal performance

Channel allocation in GSM/WCDMA/OFDMA systems.

Buffer design in switch/router communication systems…

Bandwidth allocation for statistical multiplexer,

e.g. EPON system.

Scheduling in the server-farm or switch/router system.


Page 4: Chapter 1

Performance Measures

How long must a customer wait?

→ waiting time and system time

How many people will form in the line?

→ average queue length

How productive is the server (counter)?

→ throughput, utilization

What is the blocking probability? Or, equivalently, how many communication channels should be prepared?

“Queuing Theory” attempts to answer these questions through

detailed mathematical analysis.


Page 5: Chapter 1

1.1 DESCRIPTION OF THE QUEUEING PROBLEMS

Fig. - Conceptual Queueing Model

An arriving customer could be a connection request asking for connection setup and a minimum-data-rate service.

The queueing model is commonly used in traffic control, scheduling, and system design.


Page 6: Chapter 1

1.2 CHARACTERISTICS OF QUEUEING PROCESSES

Arrival Pattern of Customers

The arrival pattern is measured in terms of the probability distribution, or the mean, of the arrival rate or the interarrival time.

The arrival pattern may be one customer at a time or in bulk.

In bulk-arrival situation, both the interarrival time and the

number of customers in the batch are probabilistic.

Customer reactions:

patient customers

balked (blocked) customer

reneged customer

jockey customer (switch from one line to another)

Stationary process (time-independent) vs. non-stationary process (time-dependent, correlated)

Service Pattern

Service time distribution (random process)

service may be single or batch

state-independent service or state-dependent service

Impatient customers

Service time; service rate


Page 7: Chapter 1

Queue Discipline (Service Discipline)

First In, First Out (FIFO)

Last In, First Out (LIFO)

Selection for service in random order (SIRO), independent of arrival time to the queue

Most delayed, first served (MDFS)

Priority

1. Preemptive

2. Non-preemptive

The Round-Robin Service Discipline

(Operating System Theory, E.G. Coffman, Jr.)


Page 8: Chapter 1

The shortest-elapsed-time Discipline

System Capacity

In some queueing processes there is a physical limitation on the amount of waiting room (buffer size), i.e., the system capacity is finite.

Fig. 1.2 - Multichannel queueing system


Page 9: Chapter 1

Number of Service Channels

Multiple channel queueing system

Routing and buffer sharing

Stages of Service

Fig. 1.3 - A multistage queueing system with feedback

Consider a communication network with ARQ recovery

control. It is a kind of multi-stage queueing system.

“Before performing any mathematical analysis, it is absolutely necessary to describe adequately the process being modeled. Knowledge of the aforementioned six characteristics is essential in the queueing analysis.”


Page 10: Chapter 1

1.3 NOTATION

Table 1.1 - Queueing Notation A/B/X/Y/Z
(A: interarrival-time distribution; B: service-time distribution; X: number of parallel servers; Y: system capacity; Z: queue discipline)


Page 11: Chapter 1

1.4 MEASURING SYSTEM PERFORMANCE

Blocking probability (Grade of service)

Waiting time or mean system time

(Quality-of-Service, requirement)

Mean queue length

(Average number of customers in the system)

System utilization (Throughput)


Page 12: Chapter 1

1.5 SOME GENERAL RESULTS

Consider the queueing model (G/G/1 or G/G/C)

Fig. - G/G/C Queueing Model

If ρ = λ/(Cμ) > 1, the queue will get bigger and bigger, the queue never settles down, and there is no steady state.

If ρ = 1, unless the processes of arrivals and service are deterministic and perfectly scheduled, no steady state exists, since randomness will prevent the queue from ever emptying out and allowing the servers to catch up, causing the queue to grow without bound.
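The instability at ρ ≥ 1 can be seen numerically with Lindley's recursion for successive waiting times, W_{k+1} = max(0, W_k + S_k - A_{k+1}). A minimal sketch (the exponential interarrival and service times, and all rates and sample sizes, are illustrative assumptions, not part of the text):

```python
import random

def avg_tail_wait(lam, mu, n=20000, tail=1000, seed=0):
    """Lindley's recursion W_{k+1} = max(0, W_k + S_k - A_{k+1}) for a
    single-server FIFO queue; returns the average waiting time of the
    last `tail` customers."""
    random.seed(seed)
    w, acc = 0.0, 0.0
    for k in range(n):
        if k >= n - tail:
            acc += w
        s = random.expovariate(mu)    # service time of customer k
        a = random.expovariate(lam)   # interarrival gap to customer k+1
        w = max(0.0, w + s - a)
    return acc / tail

stable = avg_tail_wait(lam=0.5, mu=1.0)    # rho = 0.5: waits settle down
unstable = avg_tail_wait(lam=1.2, mu=1.0)  # rho = 1.2: waits keep growing
```

With ρ < 1 the tail average stays near the steady-state mean wait; with ρ > 1 it grows roughly linearly with the number of customers served, matching the no-steady-state argument above.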


Page 13: Chapter 1

Notation : Description

N(t) : the number of customers in the system at time t
Nq(t) : the number of customers in the queue at time t
Ns(t) : the number of customers in service at time t
T : system time
Tq : waiting time (time in queue)
S : service time

Consider a C-server queue.

At steady state: N(t) = Nq(t) + Ns(t) and T = Tq + S.



Page 15: Chapter 1

1.5.1 LITTLE’S FORMULAS

John D.C. Little related L to W, and Lq to Wq,

where L = E[N] (mean number in system), Lq = E[Nq] (mean number in queue), W = E[T], Wq = E[Tq], and λ is the mean arrival rate.

Little’s Formulas :

L = λW (1.1a)

Lq = λWq (1.1b)

Fig. 1.4 - Busy-period sample path
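Little's formulas can be checked numerically on a simulated sample path: the area under N(t) equals the accumulated system time of all customers, so the time-average L must agree with λW. A small sketch (an M/M/1 queue is assumed only to generate a sample path; any arrival and service processes would do):

```python
import random

def little_check(lam=0.8, mu=1.0, n=100000, seed=42):
    """Simulate a FIFO single-server queue and return (L, lam_hat, W)."""
    random.seed(seed)
    arrivals, departures = [], []
    t_arr = t_dep = 0.0
    for _ in range(n):
        t_arr += random.expovariate(lam)
        t_dep = max(t_arr, t_dep) + random.expovariate(mu)
        arrivals.append(t_arr)
        departures.append(t_dep)
    horizon = departures[-1]
    # W: mean system time over all customers
    W = sum(d - a for a, d in zip(arrivals, departures)) / n
    # L: integrate N(t) over the merged arrival/departure event sequence
    events = sorted([(t, 1) for t in arrivals] + [(t, -1) for t in departures])
    area, nsys, last = 0.0, 0, 0.0
    for t, delta in events:
        area += nsys * (t - last)
        nsys, last = nsys + delta, t
    L = area / horizon        # time-average number in system
    lam_hat = n / horizon     # observed arrival rate
    return L, lam_hat, W

L, lam_hat, W = little_check()   # L equals lam_hat * W (Little's formula)
```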



Page 17: Chapter 1

A conceptual proof

L(t) = (1/t) ∫ N(u) du over (0, t) (1.2a)

W(t) = (accumulated customer-time in the system during (0, t)) / (number of arrivals in (0, t)) (1.2b)

Since the area under N(u) over (0, t) equals the accumulated customer-time in the system, letting t → ∞ gives

L = λW (1.1a)

Lq = λWq (1.1b)

for any queueing system.


Page 18: Chapter 1

r = λ/μ (1.3)

The expected number of customers in service in steady state (the offered workload).

For a single-server system: ρ = λ/μ = r.

For a multiple-server (C-server) system: ρ = λ/(Cμ), while r = λ/μ.

Table 1.2 - Summary of General Result for G/G/C Queue


Page 19: Chapter 1

L = λW : Little’s formula

Lq = λWq : Little’s formula

W = Wq + 1/μ : Expected-value argument

λ/(Cμ) : Busy probability of an arbitrary server

r = λ/μ : Expected number of customers in service ; Offered workload

ρ = λ/(Cμ) : Traffic intensity; workload to a server


Page 20: Chapter 1

1.6 SIMPLE DATA BOOKKEEPING FOR QUEUES

From Fig. 1.5, we can see that

Fig. 1.5 – Sample path for queueing process

Thus using input data shown in Table 1.3

Table 1.3 – Input Data


Page 21: Chapter 1

We can have the event-oriented bookkeeping shown in Table 1.4

Table 1.4 – Event-Oriented Bookkeeping


Page 22: Chapter 1

Column (2) : Arrival/Departure, Customer i

Master Clock Time | Arrival/Departure, Customer i
0 | 1-A
1 | 1-D
2 | 2-A
3 | 3-A
5 | 2-D

Column (3) : Time arrival i enters service

Column (4) : Time arrival i leaves service

Page 23: Chapter 1

Column (5) : Time in Queue:

Column (3) - Column (1)

Set

Column (6) : Time in System:

Column (4) - Column (1)

Column (7) : No. in Queue just after Master Clock Time:

A’s in Column (2) - D’s in Column (2) - 1

Column (8) : No. in System just after Master Clock Time:

A’s in Column (2) - D’s in Column (2)
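The column rules above can be mechanized. A small sketch (the function and variable names are my own; the single-server FIFO setting matches Table 1.4):

```python
def bookkeeping(arrival_times, service_times):
    """Event-oriented bookkeeping for a single-server FIFO queue.
    Returns, per customer: (arrival time, enters service [Col 3],
    leaves service [Col 4], time in queue [Col 5], time in system [Col 6])."""
    rows, server_free = [], 0.0
    for a, s in zip(arrival_times, service_times):
        start = max(a, server_free)   # waits if the server is still busy
        end = start + s
        rows.append((a, start, end, start - a, end - a))
        server_free = end
    return rows

rows = bookkeeping([0, 2, 3], [1, 3, 2])
# customer 3 arrives at 3, enters service at 5, leaves at 7
```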


Page 24: Chapter 1

Check Little’s Formula

Average system time (by Column (6)): W = 70/12

Average number in system (by Column (8)): L = 70/31

Mean arrival rate: λ = 12/31

Check: λW = (12/31)(70/12) = 70/31 = L.


Page 25: Chapter 1

1.7 POISSON PROCESS AND THE EXPONENTIAL DISTRIBUTION

The most common stochastic queueing models assume that:

The numbers of arrivals and of service completions follow a Poisson distribution, or equivalently, the interarrival times and service times obey the exponential distribution.

Consider an arrival process {N(t), t ≥ 0}, where N(t) denotes the total number of arrivals up to time t, with N(0) = 0, and which satisfies the following assumptions:

1. Pr{one arrival occurs between t and t + Δt} = λΔt + o(Δt), where λ is the arrival rate, and o(Δt) denotes a quantity that becomes negligible relative to Δt as Δt → 0;

2. Pr{more than one arrival between t and t + Δt} = o(Δt);

3. The numbers of arrivals in non-overlapping intervals are statistically independent.

Page 26: Chapter 1

Then we have

p_0(t + Δt) = p_0(t)(1 - λΔt) + o(Δt) (1.6)

p_n(t + Δt) = p_n(t)(1 - λΔt) + p_{n-1}(t)λΔt + o(Δt), n ≥ 1 (1.7)

where p_n(t) = Pr{N(t) = n}.


Page 27: Chapter 1



Page 28: Chapter 1

Divide the above two equations by Δt and let Δt → 0; we have

dp_0(t)/dt = -λ p_0(t) (1.11)

dp_n(t)/dt = -λ p_n(t) + λ p_{n-1}(t), n ≥ 1 (1.12)

Then we have p_0(t) = e^{-λt} and, for n = 1, p_1(t) = λt e^{-λt}.

We conjecture the general formula to be

p_n(t) = (λt)^n e^{-λt} / n!, n = 0, 1, 2, … (1.14)


Page 29: Chapter 1

That is a Poisson distribution.

This can be proven by mathematical induction.


Page 30: Chapter 1

We now show that if the arrival process follows the Poisson

distribution, the interarrival time follows the exponential

distribution.

Proof

Let T be the random variable “time between two arrivals”; then

Pr{T ≥ t} = Pr{no arrival in (0, t]} = p_0(t) = e^{-λt}.

Let F(t) be the CDF of T:

F(t) = Pr{T ≤ t} = 1 - e^{-λt}, t ≥ 0.

Thus T has the exponential distribution with mean interarrival time 1/λ.
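The equivalence can also be checked empirically: generate arrivals with i.i.d. exponential gaps, and the counts in a fixed window should be Poisson, whose mean equals its variance. A sketch (the rate, window, and sample sizes are arbitrary choices):

```python
import random

def counts_in_window(lam=4.0, horizon=1.0, trials=20000, seed=7):
    """Counts of exponential-gap arrivals in [0, horizon]; for a Poisson
    process both the mean and the variance should be close to lam*horizon."""
    random.seed(seed)
    counts = []
    for _ in range(trials):
        t, n = 0.0, 0
        while True:
            t += random.expovariate(lam)   # exponential interarrival gap
            if t > horizon:
                break
            n += 1
        counts.append(n)
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    return mean, var

mean, var = counts_in_window()   # both near lam * horizon = 4
```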



Page 32: Chapter 1

Conversely, it can be shown that if the interarrival times are independent and have the same exponential distribution, then the arrival process follows the Poisson distribution.

Proof

Let S_n = T_1 + T_2 + … + T_n denote the epoch of the nth arrival, with CDF F_{S_n}(t).

Then

Pr{N(t) = n} = Pr{S_n ≤ t} - Pr{S_{n+1} ≤ t} = F_{S_n}(t) - F_{S_{n+1}}(t), (1.15)

where F_{S_{n+1}} is an Erlang distribution, which is the sum of n+1 independent and identical exponential random variables.

Also, the density of S_{n+1} is λ(λx)^n e^{-λx} / n!.


Page 33: Chapter 1

Via manipulation, we have

Pr{N(t) = n} = (λt)^n e^{-λt} / n!.

That is, {N(t)} is a Poisson process.

※ Given the number of arrivals in an interval, the arrival epochs are uniformly distributed over that interval.


Page 34: Chapter 1

1.8 MARKOVIAN PROPERTY OF THE EXPONENTIAL DISTRIBUTION

The memoryless (Markovian) property: for the exponential interarrival time T,

Pr{T > s + t | T > s} = Pr{T > t}. (1.17)

The exponential distribution is the only continuous distribution which exhibits this memoryless property.

The proof of the above assertion rests on the fact that the only continuous function solution of the equation

f(s + t) = f(s) + f(t)


Page 35: Chapter 1

is the linear form

f(t) = Ct. (1.18)

Proof that if Pr{T > s + t | T > s} = Pr{T > t}, then T is exponential:

The memoryless property (1.17) can be rewritten in terms of the CCDF G(t) = Pr{T > t} as

G(s + t) = G(s)G(t). (1.19)

Let f(t) = ln G(t) and take the natural logarithm of (1.19); then f(s + t) = f(s) + f(t), so by the fact above f(t) = Ct, and G(t) = e^{Ct} = e^{-λt} with λ = -C > 0.


Page 36: Chapter 1

Using the above-mentioned fact, we have the desired exponential form.

There are many possible and well-known generalizations of the Poisson/exponential process; they will be taken up in greater detail in the text.
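The memoryless property can also be verified by direct sampling: among exponential samples that exceed s, the fraction that also exceed s + t should match the unconditional fraction exceeding t. A sketch (all parameter values are arbitrary):

```python
import math
import random

def memoryless_check(lam=2.0, s=0.5, t=1.0, n=200000, seed=3):
    """Compare Pr{T > s+t | T > s} with Pr{T > t} for exponential T."""
    random.seed(seed)
    samples = [random.expovariate(lam) for _ in range(n)]
    survivors = [x for x in samples if x > s]          # condition on T > s
    cond = sum(1 for x in survivors if x > s + t) / len(survivors)
    uncond = sum(1 for x in samples if x > t) / n
    return cond, uncond          # both estimate exp(-lam * t)

cond, uncond = memoryless_check()
```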


Page 37: Chapter 1

1.9 STOCHASTIC PROCESSES AND MARKOV CHAINS

A stochastic process {X(t), t ∈ T}: a family of random variables defined over some index set or parameter space T.

T : time range; X(t) : state of the process at time t.

If T is a countable sequence, for example T = {0, 1, 2, …}, then {X(t)} is said to be a discrete-parameter process defined over the index set T.

If T is an interval, for example T = {t : t ≥ 0}, then {X(t)} is called a continuous-parameter process defined over the index set T.

Stationary: the joint distribution of (X(t_1 + h), …, X(t_n + h)) is the same for every shift h.


Page 38: Chapter 1

Wide-sense stationary (W.S.S.):

E[X(t)] is independent of t;

the autocorrelation E[X(t)X(t + τ)] is a function of τ only.


Page 39: Chapter 1

1.9.1 MARKOV PROCESS

Continuous-parameter stochastic process: {X(t), t ≥ 0}

Discrete-parameter stochastic process: {X_n, n = 0, 1, 2, …}

{X(t)} is a Markov process if, for any set of n time points t_1 < t_2 < … < t_n in the index set or time range of the process, the conditional distribution of X(t_n), given the values of X(t_1), …, X(t_{n-1}), depends only on the immediately preceding value X(t_{n-1}). More precisely,

Pr{X(t_n) ≤ x_n | X(t_1) = x_1, …, X(t_{n-1}) = x_{n-1}} = Pr{X(t_n) ≤ x_n | X(t_{n-1}) = x_{n-1}}.

“Memoryless”: given the present, the future is independent of the

past.

Classification of Markov process:

According to:

(1) the nature of the parameter (index set, time range) space

of the process

(2) the nature of the state space of the process


Page 40: Chapter 1

Table 1.5 – Classification of Markov Processes

State space \ Parameter space (T) | Discrete | Continuous
Discrete | Discrete-parameter Markov chain | Continuous-parameter Markov chain
Continuous | Discrete-parameter Markov process | Continuous-parameter Markov process

Semi-Markov Process (SMP) or Markov Renewal Process

(MRP)

If the time T between two consecutive transitions is an arbitrary random variable, the process is called an SMP.

If T is exponentially distributed for continuous-parameter cases, or geometrically distributed for discrete-parameter cases, then the SMP reduces to a Markov process.


Page 41: Chapter 1

1.9.2 DISCRETE-PARAMETER MARKOV CHAINS

The conditional probability p_ij = Pr{X_{n+1} = j | X_n = i} : transition probability (single-step)

If these probabilities are independent of n, then the Markov chain is called a homogeneous chain.

P = [p_ij] : Transition Matrix

For a homogeneous chain, the m-step transition probabilities p_ij^(m) = Pr{X_{n+m} = j | X_n = i} are also independent of n.

From basic laws of probability,

p_ij^(m) = Σ_k p_ik^(m-1) p_kj : Chapman-Kolmogorov (C-K) equations

In matrix notation, if P^(m) = [p_ij^(m)], then P^(m) = P^(m-1) P, and P^(m) = P^m, where P^(1) = P.

Define the unconditional probability of state j at the mth trial by

π_j^(m) = Pr{X_m = j} : state probability at the mth trial, so that π^(m) = π^(0) P^m.

The initial distribution is given by π^(0).
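The relation P^(m) = P^m can be exercised directly. A sketch with a hypothetical two-state homogeneous chain (the matrix values are made up; plain lists keep it dependency-free):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def m_step(P, m):
    """m-step transition matrix P^m (Chapman-Kolmogorov)."""
    n = len(P)
    R = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    for _ in range(m):
        R = mat_mul(R, P)
    return R

# hypothetical homogeneous two-state chain
P = [[0.9, 0.1],
     [0.5, 0.5]]
P8 = m_step(P, 8)
# each row of P8 is close to the limiting distribution (5/6, 1/6)
```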


Page 42: Chapter 1

π^(m) = π^(m-1) P, where Σ_j π_j^(m) = 1 (1.24)

If the limiting probability π_j = lim_{m→∞} π_j^(m) exists, then at steady state π = πP with Σ_j π_j = 1 (or, in the continuous-parameter case, πQ = 0, where the generator Q has all row sums equal to zero).



Page 44: Chapter 1

1.9.3 CONTINUOUS-PARAMETER MARKOV CHAINS

p_ij(s, t) = Pr{X(t) = j | X(s) = i}, for t > s, where the state space is countable.

From C-K equations, intuitively,

p_ij(s, t) = Σ_k p_ik(s, u) p_kj(u, t), s < u < t (1.25)

In matrix notation,

P(s, t) = P(s, u) P(u, t).

Letting s = 0, we abbreviate p_ij(0, t) as p_ij(t).


Page 45: Chapter 1

π_j(t) : state probability of j at time t, regardless of the starting state.

※ Read the Poisson-process example in the textbook, p. 30.


Page 46: Chapter 1

Additional Theory:

If (1) Pr{a change of state occurs in (t, t + Δt] | X(t) = i} = c_i(t)Δt + o(Δt),

i.e., the probability of a change is linearly proportional to Δt, with proportionality constant c_i(t), which is a function of i and t,

and (2) Pr{more than one change occurs in (t, t + Δt]} = o(Δt),

then we have Kolmogorov’s forward and backward equations, respectively:

∂P(s, t)/∂t = P(s, t) Q(t), ∂P(s, t)/∂s = -Q(s) P(s, t) (1.28)


Page 47: Chapter 1

Let Δt → 0 and assume a homogeneous process, so that q_ij(t) = q_ij for all t.

Multiplying by the state probabilities and summing over the starting state gives, in matrix notation,

dπ(t)/dt = π(t) Q, where π(t) = (π_0(t), π_1(t), …) and Q = [q_ij].

At steady state, dπ(t)/dt = 0, so πQ = 0, where π = lim_{t→∞} π(t).

Note that Σ_j π_j = 1. ← It is intuitive
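For a finite chain, πQ = 0 together with Σ_j π_j = 1 is just a linear system. A sketch (the three-state generator is a made-up example; Gaussian elimination keeps it dependency-free):

```python
def ctmc_stationary(Q):
    """Solve pi Q = 0 with sum(pi) = 1 for a finite generator matrix Q
    (rows sum to zero, off-diagonal entries are nonnegative rates)."""
    n = len(Q)
    # transpose Q so the equations read Q^T x = 0, then replace the last
    # equation by the normalization sum(x) = 1
    A = [[Q[j][i] for j in range(n)] for i in range(n)]
    A[n - 1] = [1.0] * n
    b = [0.0] * (n - 1) + [1.0]
    for col in range(n):                  # Gauss-Jordan with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(n):
            if r != col:
                f = A[r][col] / A[col][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return [b[i] / A[i][i] for i in range(n)]

# hypothetical 3-state machine: working -> degraded -> failed -> working
Q = [[-1.0,  1.0,  0.0],
     [ 0.0, -2.0,  2.0],
     [ 3.0,  0.0, -3.0]]
pi = ctmc_stationary(Q)   # (6/11, 3/11, 2/11)
```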


Page 48: Chapter 1

Since P(Δt) ≈ I + QΔt,

we can have q_ij = lim_{Δt→0} p_ij(Δt)/Δt for i ≠ j,

and since each row of P sums to one, q_ii = -Σ_{j≠i} q_ij.

Q : intensity matrix, where q_ij is the transition rate from state i to state j.

For a Poisson process (pure birth process),

Q =
| -λ   λ   0   0  … |
|  0  -λ   λ   0  … |
|  0   0  -λ   λ  … |


Page 49: Chapter 1

The resulting forward equations are the same as (1.12).

For a birth-death process,

Q =
| -λ_0         λ_0           0          … |
|  μ_1   -(λ_1 + μ_1)        λ_1        … |
|  0           μ_2     -(λ_2 + μ_2)  λ_2 … |

The Poisson process is often called a pure birth process.



Page 51: Chapter 1

1.9.4 IMBEDDED MARKOV CHAINS

If we observe the system only at certain selected times, and the

process behaves like an ordinary Markov Chain, we say we have

an imbedded Markov Chain at those instants. (turn our attention

away from the truly continuous-parameter queuing process to an

imbedded discrete-parameter Markov Chain queuing process)

Consider the birth-death process at transition time


Page 52: Chapter 1

The transition probabilities are p_{i,i+1} = λ_i/(λ_i + μ_i) and p_{i,i-1} = μ_i/(λ_i + μ_i).


Page 53: Chapter 1

1.9.5 LONG-RUN BEHAVIOR OF MARKOV PROCESS: LIMITING DISTRIBUTION, STATIONARY DISTRIBUTION, ERGODICITY

If π_j = lim_{m→∞} p_ij^(m) exists for all i (independent of i),

we call π_j the limiting probability of the Markov Chain.

(Steady-state probability)

Consider the unconditional state probability after m steps, π^(m) = π^(m-1) P,

or equivalently, in the limit, π = πP, together with the boundary condition Σ_j π_j = 1. These well-known equations are called stationary equations, and their solution is called the stationary distribution of the


Page 54: Chapter 1

Markov Chain.

If the limiting distribution exists, this implies that the stationary distribution exists, and implies that the process possesses a steady state. (The stationary distribution = the limiting distribution.)

But the converse is not true.

Example 1.1

The stationary distribution of Eq. (1.32), π = πP, exists (a solution to π = πP exists), but lim_{m→∞} p_ij^(m) does not exist, and a steady state does not exist except when π^(0) = π:

(i) the limiting distribution does not exist; (ii) in general the stationary initial condition does not hold; (iii) if π^(0) = π, the process is strictly stationary;

(iv) otherwise the process still does not possess a steady state.

Strictly stationary: the joint distributions are invariant under time shifts, for all k and h,


Page 55: Chapter 1

i.e., the process possesses time-independent distribution functions:

π^(m) is independent of m.

A solution to π = πP does not imply strict stationarity, except when π^(0) = π.

But strict stationarity does imply that π^(m) is independent of m.

Example 1.2

The process possesses (i) a steady state, since the limiting distribution exists; (ii) it is strictly stationary when π^(0) = π; and (iii) it is not in general stationary unless π^(0) = π.

Example 1.3


Page 56: Chapter 1

and we have the stationary solution from π = πP.

(i) The limiting distribution exists.

(ii) The stationary distribution exists.

(iii) The process is not completely stationary unless π^(0) = π; it becomes stationary only in the limit.

For continuous-parameter processes, the stationary solution can be obtained from πQ = 0.

Thus, if the limiting distribution is known to exist, the solution can be obtained from πQ = 0 together with Σ_j π_j = 1.



Page 58: Chapter 1

1.9.6 ERGODICITY

{X(t)} is ergodic if time averages (statistics) equal ensemble averages (statistics), where

Time-average:

Continuous-parameter: lim_{T→∞} (1/T) ∫ X(t) dt over (0, T); Discrete-parameter: lim_{m→∞} (1/m) Σ_{n=1..m} X_n.

Ensemble-average:

E[X(t)], computed across the ensemble of sample paths at a fixed time.



Page 60: Chapter 1


Ergodic → all moments computed as time averages equal the corresponding ensemble moments (the same statistics)

Example 1.1

The process is not ergodic if π^(0) ≠ π.

The process is ergodic if π^(0) = π (but then it is also stationary).

Example 1.2

The process is ergodic (not stationary).

Two states, i and j, are said to communicate (i↔j)

if i is accessible from j (j→i) and j is accessible from i (i→j).

If all of its states communicate, the chain is called an

irreducible Markov Chain.


Page 61: Chapter 1

A state j is aperiodic if its period, gcd{m : p_jj^(m) > 0}, equals 1.

The Chain is said to be aperiodic if all states are aperiodic.

f_j : The probability that the chain ever returns to j

If f_j = 1, state j is recurrent; if f_j < 1, state j is transient.

When f_j = 1, define the mean recurrence time of j; state j is positive recurrent if the mean recurrence time is finite, and null recurrent otherwise.

Note: The stationary process is ergodic, but the ergodic process

need not be stationary.

Example on p. 51:

※ state 1 is recurrent,

Page 62: Chapter 1

※ states 0,3,4 are recurrent,

※ states 2,5 are transient.


Page 63: Chapter 1

Theorem 1.1

(a) An irreducible, positive recurrent discrete-parameter Markov Chain has a unique stationary distribution.

(b) Started from that distribution, the process becomes stationary → ergodic.

(c) If the Markov Chain is irreducible, positive recurrent, and aperiodic, then the process is ergodic, and has a limiting prob. distribution = stationary prob. distribution.

Theorem 1.2

An irreducible, aperiodic chain is positive recurrent if there exists a nonnegative solution of the system

π_j = Σ_i π_i p_ij

such that Σ_j π_j = 1.


Page 64: Chapter 1

Theorem 1.3

For continuous-parameter Markov Chain, the imbedded

Markov Chain need not be aperiodic as long as the holding

times in all states are bounded for Theorem 1.1 to be valid.



Page 66: Chapter 1

1.10 STEADY-STATE BIRTH-DEATH PROCESSES

At steady state, dp_n(t)/dt = 0.

From the balance equations

λ_0 p_0 = μ_1 p_1,
(λ_n + μ_n) p_n = λ_{n-1} p_{n-1} + μ_{n+1} p_{n+1}, n ≥ 1,

we have

p_n = p_0 ∏_{i=1..n} (λ_{i-1} / μ_i), n ≥ 1,

which can be proven by induction.
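The product-form solution can be computed directly for a finite birth-death chain. A sketch (the M/M/1/K rates in the example are arbitrary choices):

```python
def birth_death_steady_state(birth, death):
    """p_n = p_0 * prod_{i=1..n} (birth[i-1] / death[i]) for a finite
    birth-death chain with states 0..K; death[0] is unused."""
    terms = [1.0]                      # unnormalized term for n = 0
    for n in range(1, len(birth) + 1):
        terms.append(terms[-1] * birth[n - 1] / death[n])
    total = sum(terms)                 # normalization so probabilities sum to 1
    return [t / total for t in terms]

# M/M/1/K example with K = 4, lambda = 1, mu = 2 (rho = 1/2)
p = birth_death_steady_state([1.0, 1.0, 1.0, 1.0], [None, 2.0, 2.0, 2.0, 2.0])
# p[0] = 16/31, and p[n] halves with each step up
```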


Page 67: Chapter 1

Since Σ_{n=0..∞} p_n = 1, p_0 = [1 + Σ_{n=1..∞} ∏_{i=1..n} (λ_{i-1} / μ_i)]^{-1}.
