M/M/1 Queues

Introduction

This section explains the simplest type of single queue: the M/M/1 queue.

The simplest queueing system consists of two components (the queue and the server) and two attributes (the inter-arrival time, i, and the service time, t). The usual schematic representation of this system is as follows:

[Schematic: customers arrive from the source, join the queue, and are served by the server]

Customers arrive at the queue from the source randomly in time, with mean inter-arrival time i. Each customer takes a random amount of time to be served, with mean service time t. We assume in this case that both of these random times have an exponential distribution.

This system is illustrated by the applet below. You may enter the service time and inter-arrival time in the top-right fields. (Note: for the queue to be stable, i.e. not grow longer indefinitely as time goes on, the service time must be shorter than the inter-arrival time.)

The applet then uses queueing theory to calculate various performance measures for the queue. These are displayed immediately. You may simulate the queue by pressing 'Run'. This causes customers to be generated at the Source and move to the queue. Above the illustration is a representation of the Markov chain associated with this queue. The states of the Markov chain represent the number of customers in the system.

As the simulation runs, statistics are collected from it and displayed below the calculated measures. You can compare the two to see how quickly the simulated statistics approach the predicted values.
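If you would like to experiment away from the applet, the short Python sketch below runs a comparable simulation. It is not the applet's own code, and the names used (simulate_mm1, mean_interarrival, mean_service) are illustrative only. It tracks the number of customers in the system over time and returns the time average, which for a stable queue should approach the expected number of customers in the system derived in the next section.

import random

def simulate_mm1(mean_interarrival, mean_service, horizon=100000.0, seed=1):
    """Time-average number of customers in an M/M/1 system (illustrative sketch)."""
    rng = random.Random(seed)
    clock = 0.0
    n = 0                                   # customers currently in the system
    next_arrival = rng.expovariate(1.0 / mean_interarrival)
    next_departure = float("inf")           # no customer in service yet
    area = 0.0                              # integral of n over time
    while clock < horizon:
        t_next = min(next_arrival, next_departure)
        area += n * (t_next - clock)        # accumulate time spent with n customers
        clock = t_next
        if next_arrival <= next_departure:  # next event is an arrival
            n += 1
            next_arrival = clock + rng.expovariate(1.0 / mean_interarrival)
            if n == 1:                      # server was idle, so start a service
                next_departure = clock + rng.expovariate(1.0 / mean_service)
        else:                               # next event is a departure
            n -= 1
            next_departure = (clock + rng.expovariate(1.0 / mean_service)
                              if n > 0 else float("inf"))
    return area / clock

# Example: service time 1, inter-arrival time 2; the result should be close to 1.
print(simulate_mm1(mean_interarrival=2.0, mean_service=1.0))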


Queueing Theory

To solve the queueing system we rely on the underlying Markov chain theory. However, we can abstract away from this and use the traffic intensity to derive measures for the queue. The traffic intensity, which we write ρ to avoid confusion with the service time t, is the mean service time divided by the mean inter-arrival time: ρ = t/i. From this we calculate the measures used in the applet as follows.

Let pi represent the probability that there are i customers in the system. The probability that the system is idle, p0 (i.e. the Markov chain is in state 0), is given by

 p0 = 1 - ρ.

The utilisation, U, of the system is 1 - p0, i.e. the proportion of the time that the server is not idle:

 U = 1 - p0 = ρ.

The probability that the queue is non-empty, B, is the probability of not being in state 0 or state 1 of the Markov chain, i.e.

 B = 1 - p0 - p1 = 1 - (1 - ρ) - (1 - ρ)ρ = 1 - 1 + ρ - ρ + ρ^2 = ρ^2.

The expectation of the number of customers in the service centre, N, is the sum over all states of the number of customers multiplied by the probability of being in that state.

Since pi = (1 - ρ)ρ^i, this sum is (1 - ρ)(ρ + 2ρ^2 + 3ρ^3 + ...), which works out to be N = ρ/(1 - ρ).

The expectation of the number of customers in the queue is calculated similarly, except that for each state with at least one customer one multiplies by one less than the number of customers, since one of them is in service rather than waiting in the queue.

 This works out to be ρ^2/(1 - ρ).
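These formulas are straightforward to compute directly. The Python sketch below collects them in one function; the function and argument names (mm1_measures, mean_interarrival, mean_service) are mine rather than the applet's, but the inputs are the same two values the applet asks for.

def mm1_measures(mean_interarrival, mean_service):
    """Analytic M/M/1 measures from the formulas above (illustrative sketch)."""
    rho = mean_service / mean_interarrival          # traffic intensity
    if rho >= 1:
        raise ValueError("unstable: service time must be shorter than inter-arrival time")
    p0 = 1 - rho                                    # probability the system is idle
    utilisation = rho                               # U = 1 - p0
    non_empty_queue = rho ** 2                      # B = 1 - p0 - p1
    customers_in_system = rho / (1 - rho)           # N
    customers_in_queue = rho ** 2 / (1 - rho)       # expected queue length
    return p0, utilisation, non_empty_queue, customers_in_system, customers_in_queue

# Example: service time 1, inter-arrival time 2 gives rho = 0.5, so
# p0 = 0.5, U = 0.5, B = 0.25, N = 1.0 and an expected queue length of 0.5.
print(mm1_measures(2.0, 1.0))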


Markov chains

A Markov chain is a system which has a set S of states and changes randomly between these states in a sequence of discrete steps. The length of time spent in each state is the 'sojourn time' in that state, T. This is an exponentially distributed random variable, and in state i it has parameter qi. If the system is in state i and makes a transition, then it has a fixed probability, pij, of moving to state j.

We can construct a Markov chain from a queueing system as follows: assign each possible configuration of the queue a state, and define the probability of moving from one state to another by the probability of a customer arriving or departing. Thus state 0 corresponds to there being no customers in the system, state 1 to there being one customer, and so on. If we are in state i, then the probability of moving to state i-1 is the probability of a customer departing the system, and the probability of moving to state i+1 is the probability of a customer arriving in the system (apart from the special case of state 0, where no departure is possible).

The fact that we can construct Markov chains from queueing systems means we can use standard techniques from Markov chain theory to find, for example, the probability of the queue having a particular number of customers (by finding the probability of the corresponding Markov chain being in the corresponding state).
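As an illustration of that idea, the Python sketch below (the names are mine, not from this page) computes the stationary probabilities of the birth-death Markov chain for the M/M/1 queue, truncated at a finite number of states. It uses the detailed-balance equations of the chain and recovers, approximately, the geometric state probabilities pi = (1 - ρ)ρ^i used in the previous section.

def mm1_state_probabilities(mean_interarrival, mean_service, max_states=50):
    """Stationary probabilities of the truncated birth-death chain (illustrative sketch).

    Detailed balance between neighbouring states gives p[i+1] * mu = p[i] * lam,
    i.e. p[i+1] = p[i] * rho."""
    lam = 1.0 / mean_interarrival    # arrival rate
    mu = 1.0 / mean_service          # service rate
    p = [1.0]                        # unnormalised weight of state 0
    for _ in range(max_states - 1):
        p.append(p[-1] * lam / mu)   # balance the flow between states i and i+1
    total = sum(p)
    return [x / total for x in p]    # normalise so the probabilities sum to 1

# With service time 1 and inter-arrival time 2 (rho = 0.5), the result is close to
# the geometric distribution pi = (1 - rho) * rho**i, so the first two
# probabilities are roughly 0.5 and 0.25.
probs = mm1_state_probabilities(2.0, 1.0)
print(probs[0], probs[1])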

Tom Slater, June 2000