
Discrete event system simulation:

Discrete event simulation (DES) is the process of codifying the behavior of a complex system as
an ordered sequence of well-defined events. In this context, an event comprises a specific change
in the system's state at a specific point in time.
Common applications of DES include stress testing, evaluating potential financial investments,
and modeling procedures and processes in various industries, such as manufacturing and
healthcare.
As an example of a situation that lends itself to DES, consider the amount of electrical power
consumed by a corporation's office building as a function of time. Discrete events affecting this
function include power-up or power-down of any electrical device in the building. Instantaneous
changes of state in a device already powered-up are also discrete events; for example, a speed
change in a cooling fan or a brightness change in a desk lamp.
An effective DES process must include, at a minimum, the following characteristics (a minimal code sketch follows the list):

Predetermined starting and ending points, which can be discrete events or instants in
time.

A method of keeping track of the time that has elapsed since the process began.

A list of discrete events that have occurred since the process began.

A list of pending or expected discrete events (if such events are known) between the current time and the expected end of the process.

A graphical, statistical, or tabular record of the function being simulated.
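
To make these characteristics concrete, the following minimal Python sketch tracks the power drawn by the office building described above. The device names, times, and wattages are illustrative assumptions, not data from the source; the sketch simply shows the required bookkeeping: a predetermined ending point, a clock, a log of past events, a future event list, and a tabular record of the function being simulated.

import heapq

# Future event list (FEL): tuples of (event time, description, power change in watts).
# All times and devices here are hypothetical.
fel = [
    (0.0, "lamp power-up",       60.0),
    (2.5, "fan speed increase",  15.0),
    (8.0, "lamp power-down",    -60.0),
]
heapq.heapify(fel)          # keep the FEL ordered by event time

clock = 0.0                 # elapsed simulated time (hours)
end_time = 10.0             # predetermined ending point
power = 0.0                 # system state: total power drawn (watts)
past_events = []            # discrete events that have occurred so far
record = []                 # tabular record of power versus time

while fel and fel[0][0] <= end_time:
    time, event, change = heapq.heappop(fel)   # next imminent event
    clock = time                               # advance the clock
    power += change                            # apply the change of state
    past_events.append((clock, event))
    record.append((clock, power))

for t, p in record:
    print(f"t = {t:4.1f} h   power = {p:6.1f} W")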

DES is commonly used to monitor and predict the behavior of investments; the stock market is a
classic example. DES can also help administrators predict how a network will behave under
extraordinary conditions, such as the Internet during a major disaster.

Areas of simulation application


Simulation in science and engineering research:

Earlier, most experiments were carried out physically in laboratories, but today a majority of experiments are simulated on computers. Computer experiments, besides being much faster, cheaper, and easier, frequently provide better insight into the system than laboratory experiments do.
Simulation in soft sciences: Simulation can be expected to play an even more vital role in biology, sociology, economics, medicine, psychology, etc., where experiments can be very expensive, dangerous, or even impossible. Simulation has thus become an indispensable tool for the modern researcher in most social, biological, and life sciences.
Simulation for business executives: Many problems faced by management cannot be solved by standard operations research tools such as linear and dynamic programming, inventory theory, and queuing theory. Therefore, instead of making decisions solely on intuition and experience, a business executive can now use computer simulation to make better-informed decisions. Simulation has been used widely for inventory control, facility planning, production scheduling, and the like.

What is a good simulation application?

Systems where it is too expensive or risky to do live tests. Simulation provides an inexpensive,
risk-free way to test changes ranging from a "simple" revision to an existing production line to
emulation of a new control system or redesign of an entire supply chain.
Large or complex systems for which change is being considered. A "best guess" is usually a
poor substitute for an objective analysis. Simulation can accurately predict their behavior under
changed conditions and reduce the risk of making a poor decision.
Systems where predicting process variability is important. A spreadsheet analysis cannot capture the dynamic aspects of a system, aspects which can have a major impact on system performance. Simulation can help you understand how various components interact with each other and how they affect overall system performance.
Systems where you have incomplete data. Simulation cannot invent data where it does not
exist, but simulation does well at determining sensitivity to unknowns. A high-level model can
help you explore alternatives. A more detailed model can help you identify the most important
missing data.
Systems where you need to communicate ideas. Development of a simulation helps participants better understand the system. Modern 3D animation and other tools promote communication and understanding across a wide audience.

Steps of simulation studies:

1. Problem formulation:
Every study should begin with a statement of the problem. If the statement is provided by the policy makers, or those who have the problem, the analyst must ensure that the problem being described is clearly understood. If the problem statement is being developed by the analyst, it is important that the policy makers understand and agree with the formulation.
2. Setting of objectives and overall project plan:
The objectives indicate the questions to be answered by the simulation. At this point, a determination should be made as to whether simulation is the appropriate methodology for the problem as formulated and the objectives as stated. Assuming it is decided that simulation is appropriate, the overall project plan should include a statement of the alternative systems to be considered and a method for evaluating the effectiveness of these alternatives.
3. Model conceptualization:
The construction of a model of a system is probably as much art as science. The art of modelling is enhanced by an ability to abstract the essential features of a problem, to select and modify basic assumptions that characterize the system, and then to enrich and elaborate the model until a useful approximation results. Thus, it is best to start with a simple model and build toward greater complexity.
4. Data collection:
There is a constant interplay between the construction of the model and the collection of the needed input data. As the complexity of the model changes, the required data elements may also change. Also, since data collection takes such a large portion of the total time required to perform a simulation, it is necessary to begin it as early as possible, usually together with the early stages of model building.

5. Model translation:
Since most real-world systems result in models that require a great deal of information storage and computation, the model must be translated into a computer-recognizable format. We use the term "program" for the translated model.

6. Verified?
Verification pertains to the computer program prepared for the simulation model: is the program performing properly, i.e., does it faithfully implement the conceptual model? Debugging continues until the analyst is satisfied that it does.

7. Validated?
Validation is the determination that the model is an accurate representation of the actual system. Validation is usually achieved through the calibration of the model: an iterative process of comparing the model to actual system behaviour and using the discrepancies between the two, and the insights gained, to improve the model. A minimal sketch of such a calibration loop follows.
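
The loop below is a minimal, hypothetical sketch of calibration, not a method prescribed by these notes: a toy model with a single rate parameter is repeatedly compared against an observed mean from the real system, and the discrepancy is used to adjust the parameter.

import random

random.seed(42)

OBSERVED_MEAN = 4.0            # mean measured on the actual system (illustrative)

def simulate_mean(rate, n=10_000):
    # Toy model: mean of n exponentially distributed times with the given rate.
    return sum(random.expovariate(rate) for _ in range(n)) / n

rate = 0.5                     # initial guess for the model parameter
for iteration in range(20):    # iterative calibration loop
    model_mean = simulate_mean(rate)
    if abs(model_mean - OBSERVED_MEAN) < 0.05:
        break                  # model matches system behaviour closely enough
    # Use the discrepancy to improve the model: scale the rate so the
    # simulated mean moves toward the observed mean.
    rate *= model_mean / OBSERVED_MEAN

print(f"calibrated rate = {rate:.4f}")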

8. Experimental design:
The alternatives that are to be simulated must be determined. Often, the decision concerning which alternatives to simulate is a function of runs that have already been completed and analyzed.
9. Production runs and analysis:
Production runs, and their subsequent analysis, are used to estimate measures of performance for the system designs that are being simulated.
10. More runs?
Based on the analysis of the runs that have been completed, the analyst determines whether additional runs are needed and what design those additional experiments should follow. A small sketch of this estimate-and-decide loop follows.
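
As a hedged illustration of steps 9 and 10 (all numbers hypothetical), the sketch below makes several independent replications of a toy model, estimates a measure of performance with a normal-approximation confidence half-width, and uses that half-width to judge whether more runs are warranted.

import random
import statistics

def one_replication(seed):
    # One production run: estimate a hypothetical measure of performance,
    # here the mean of 1,000 exponentially distributed delays.
    rng = random.Random(seed)
    return statistics.fmean(rng.expovariate(0.25) for _ in range(1000))

results = [one_replication(seed) for seed in range(10)]   # ten production runs
estimate = statistics.fmean(results)
half_width = 1.96 * statistics.stdev(results) / len(results) ** 0.5

print(f"estimated mean delay: {estimate:.2f} +/- {half_width:.2f}")
# If the half-width is too wide for the decision at hand, schedule more runs.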

11. Documentation and reporting:
There are two types of documentation: program documentation and progress reports.

Program documentation is necessary for numerous reasons. If the program is going to be used again by the same or different analysts, it may be necessary to understand how the program operates.
Progress reports give a chronology of work done and decisions made. This can prove to be
of great value in keeping the project on course.

Event-scheduling/Time-advance algorithm
The mechanism for advancing simulation time and guaranteeing that all events occur in correct chronological order.
At any given time, the future event list (FEL) contains all previously scheduled future events and their associated event times (t1, t2, ..., tn). The FEL is ordered by event time, and the event times satisfy t ≤ t1 ≤ t2 ≤ ... ≤ tn, where t is the value of CLOCK, the current simulation time.

New system snapshot at time t1


Step 1 Remove the event notice for the imminent event (event 3, time t1) from
FEL.
Step 2 Advance CLOCK to imminent event (i.e., advance CLOCK from t to t1).
Step 3 Execute imminent event: update system state, change entity attributes,
and set membership as needed.
Step 4 Generate future events (if necessary) and place their event notices on FEL,
ranked by event time. (Example: event 4 to occur at time t*, where t2 < t* < t3.)
Step 5 Update cumulative statistics and counters.
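
Putting the five steps together, the following runnable sketch implements the event-scheduling/time-advance algorithm for a hypothetical single-server queue (Poisson arrivals at rate 1.0, exponential services at rate 1.25; the queue and its parameters are illustrative, not taken from these notes). The FEL is kept as a priority queue ordered by event time.

import heapq
import random

random.seed(1)

clock = 0.0
end_time = 100.0
fel = []            # future event list, ordered by event time
counter = 0         # tie-breaker so the heap never compares event kinds

def schedule(time, kind):
    # Step 4: place a new event notice on the FEL, ranked by event time.
    global counter
    heapq.heappush(fel, (time, counter, kind))
    counter += 1

in_system = 0       # system state: number of customers present
served = 0          # cumulative statistic

schedule(random.expovariate(1.0), "arrival")

while fel and fel[0][0] <= end_time:
    time, _, kind = heapq.heappop(fel)   # Step 1: remove the imminent event notice
    clock = time                         # Step 2: advance CLOCK to the imminent event
    if kind == "arrival":                # Step 3: execute it, updating system state
        in_system += 1
        schedule(clock + random.expovariate(1.0), "arrival")
        if in_system == 1:               # server was idle, so begin a service
            schedule(clock + random.expovariate(1.25), "departure")
    else:                                # departure
        in_system -= 1
        served += 1                      # Step 5: update cumulative statistics
        if in_system > 0:
            schedule(clock + random.expovariate(1.25), "departure")

print(f"served {served} customers by time {clock:.1f}")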

Principles used in data collection:


The methodological principles of data collection that underpin good practice are the
following:
Focus on the collection of data needed to improve estimates of key categories which are the
largest, have the greatest potential to change, or have the greatest uncertainty.
Choose data collection procedures that iteratively improve the quality of the inventory in line with
the data quality objectives.
Put in place data collection activities (resource prioritisation, planning, implementation,
documentation etc.) that lead to continuous improvement of the data sets used in the inventory.
Collect data/information at a level of detail appropriate to the method used.
Review data collection activities and methodological needs on a regular basis, to guide progressive and efficient inventory improvement.
Introduce agreements with data suppliers to support consistent and continuing information flows.
Develop a data collection strategy to meet data quality objectives regarding timeliness, consistency, completeness, comparability, accuracy, and transparency, using available QA/QC and verification guidance.
Undertake data acquisition activities, including generating new source data, dealing with restricted data and confidentiality, and using expert judgement.
Turn the raw data into a form that is useful for the inventory.

Difference between stochastic and deterministic models


A deterministic model is a model where:
1 - The material properties are well known, i.e., deterministic; none of them is random.
2 - The applied loads are also deterministic.

3 - A deterministic model corresponds to a design (an analytical decision) under certainty. A stochastic model, used in a random analysis, corresponds to a design under risk and relies on probabilistic definitions.

4 - A deterministic model is developed by applying first-principles equations, that is, mass balances, energy balances, kinetic rates, calculated physico-chemical parameters, and so on. It is also called a white-box model.

A stochastic model, on the other hand, has:


1 - Random properties, e.g., the Young's modulus is a random variable with a uniform distribution on [E1, E2] or a normal distribution (with a given mean and standard deviation).
2 - The applied load is a random variable, e.g., wind load or earthquake excitation (vibration of random amplitude and displacement).
3 - A stochastic model has the capacity to handle uncertainty in its inputs, because that uncertainty is built into it; for a deterministic model, the uncertainties are external to the model. The uncertainties in the inputs to a deterministic model can be handled through use of a Monte Carlo simulation (note that this does not make it a stochastic model), as the sketch after this list illustrates.

4 - A stochastic model is sometimes called a black-box model: you know the input and output values, and a non-deterministic model is applied to correlate the variables.
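
The contrast can be made concrete with a short sketch. Below, a deterministic first-principles formula (cantilever tip deflection, delta = P * L^3 / (3 * E * I)) is evaluated once with fixed inputs, and then wrapped in a Monte Carlo loop in which the Young's modulus E is a uniform random variable on [E1, E2], as in item 1 above. All numerical values are illustrative assumptions.

import random
import statistics

random.seed(7)

P, L, I = 1000.0, 2.0, 8e-6   # load (N), beam length (m), second moment of area (m^4)

def deflection(E):
    # Deterministic white-box model: cantilever tip deflection delta = P*L^3 / (3*E*I).
    return P * L**3 / (3 * E * I)

# Deterministic evaluation: every input is a known, fixed value.
print(f"deterministic deflection: {deflection(200e9) * 1000:.3f} mm")

# Monte Carlo treatment of input uncertainty: E ~ Uniform[E1, E2].
# The model itself stays deterministic; only its inputs are sampled.
E1, E2 = 180e9, 220e9
samples = [deflection(random.uniform(E1, E2)) for _ in range(10_000)]
print(f"Monte Carlo mean:  {statistics.fmean(samples) * 1000:.3f} mm")
print(f"Monte Carlo stdev: {statistics.stdev(samples) * 1000:.3f} mm")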

The hybrid model is a "mixture" of both deterministic and stochastic. Its treatment is quite similar to that of the stochastic model: the presence of a single random variable in the model necessitates stochastic treatment.
