Many of my peers have thought of simulation as a tool, an exercise, or an animation. But discrete-event simulation is much more than that: it is a process, and its execution should follow a procedure model. This is why, in this article, I present a discrete-event simulation procedure model.
Choosing a procedure model is not as straightforward as you might think. For example, kanban-based procedure models and rapid prototyping are popular among software engineers. However, even though simulation modeling is closely related to software development (it involves a lot of conceptual modeling and coding), I strongly advise against using these types of procedure models for executing a simulation study. Instead, I propose a sequential procedure model that moves in phases.
In this article I explain my procedure model and the motivation for applying a procedure model of this kind. The model moves from phase to phase: from problem definition to modeling, execution, and eventually interpretation and documentation.
Benefits of discrete-event simulation procedure model
Simulation studies are complex undertakings and need structured guidance. This is where procedure models come in. Complexity in simulation studies originates from the underlying system itself, its processes and data, and from the task of modeling it virtually. This can involve a lot of logic modeling, and thus a lot of coding.
Procedure models seek to reduce complexity by providing a structured strategy for how to proceed. Here I list some levers that I use to reduce complexity:
- I formulate the problem at a higher abstraction level, which reduces complexity
- I draw system boundaries, which excludes complexity from the model
- I define relevant KPIs, which helps me focus on what truly matters
- I formulate a clear problem description
- I define a clear research question or objective
- I develop an experiment plan that provides a clear strategy for simulation execution
- I develop a conceptual model ahead of actual coding, which aids model implementation
My discrete-event simulation procedure model
I display my discrete-event simulation procedure model in the figure below. This procedure model is a framework and guideline, not a set of fixed rules. Nevertheless, using this framework as an orientation will deliver many of the advantages pointed out above.
In the following sections I explain my discrete-event simulation procedure model in greater detail. I make use of inspiration that I collected from the following papers:
- Link: Improving the rigor of discrete-event simulation
- Link (German): Methode zur Durchführung betrieblicher Simulationen
Problem definition of current situation
During the initial steps of a simulation study one needs to understand the problem at hand. This is the most important step: it lays the foundation for the rest of the simulation study. Without a proper understanding of the problem and the system at hand, one cannot develop a good simulation model.
Good simulation models are driven by purpose. By that I mean that a good model answers a specific research question. As a simulation engineer I am responsible for defining specific research questions, and I must derive these from a well-written problem definition. I am responsible for delivering that problem definition, too.
The main steps executed during this phase are as follows:
- Understand processes and map them using flow charts
- Understand main entities involved
- Review relevant data and understand it
- Identify relevant stakeholders and include them
- Draw a relevant system boundary, defining what is and what is not part of the simulation study
- Formulate a problem definition
- Derive a research question or objective, and agree on system variables and constraints
- Agree on relevant key performance metrics for measuring the system throughout simulation runs
- Develop a simulation experiment plan
- Plan how to analyze simulation results
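The experiment plan from the steps above can be captured early as a simple data structure, which also makes the later execution phase easier to automate. Below is a minimal sketch; all scenario names, parameters, and KPI labels are hypothetical placeholders, not prescriptions:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """One row of a simulation experiment plan (illustrative fields)."""
    name: str
    machine_count: int
    processing_time_mean: float  # minutes; hypothetical parameter
    replications: int = 10       # independent runs per scenario

# Hypothetical experiment plan: a baseline plus two variants
experiment_plan = [
    Scenario("baseline", machine_count=2, processing_time_mean=5.0),
    Scenario("add_machine", machine_count=3, processing_time_mean=5.0),
    Scenario("faster_process", machine_count=2, processing_time_mean=4.0),
]

# KPIs agreed with stakeholders, e.g. throughput and mean lead time
kpis = ["throughput_per_hour", "mean_lead_time_min"]
```

Writing the plan down in this form forces agreement on exactly which parameters vary, which stay fixed, and how many replications each scenario gets.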
Model development and implementation
At this point I have understood the problem and formulated a strategy for solving it. That is, I have a plan for how to develop a simulation model and how to execute the simulation study so that it answers the research question. I can thus proceed with the next phase of the discrete-event simulation procedure model: model development.
I develop and implement the simulation model in this phase. It is very important to execute this phase in two separate steps: first, I develop a conceptual model; second, I implement the conceptual model in the chosen modeling framework. This points to another relevant step, namely choosing a suitable modeling framework. The framework must be fit for purpose, i.e. capable of housing the kind of simulation model the problem requires. In other words, I choose the modeling framework based on the problem at hand, and the simulation model that I develop in turn depends on the chosen framework.
An experienced simulation engineer will choose a simulation tool first, then develop a conceptual model and eventually implement it.
Conceptual modeling can comprise activities such as process mapping and flow charting; state transition diagrams are another effective conceptual modeling technique. The subsequent model implementation mainly consists of setting up the database, developing the visualization and animation models, and constructing the simulation model logic, e.g. by means of object-oriented programming. Model implementation should also consider statistics early and build them directly into the model.
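To give a flavor of what such model logic looks like in code, here is a minimal, tool-independent discrete-event core: a time-ordered future event list driving a single-server queue, with lead-time statistics collected directly in the model. This is a generic illustrative sketch, not the implementation of any particular framework; all names are invented:

```python
import heapq
import random

def simulate(n_jobs=500, arrival_mean=1.0, service_mean=0.8, seed=1):
    """Minimal single-server discrete-event simulation.
    Returns the mean lead time (waiting + service) per job."""
    random.seed(seed)
    events = []      # future event list: (time, seq, kind, job_id)
    seq = 0          # tie-breaker so equal-time events never compare payloads
    def schedule(t, kind, job):
        nonlocal seq
        heapq.heappush(events, (t, seq, kind, job))
        seq += 1
    queue = []       # jobs waiting for the server
    busy = False
    arrivals = {}    # job_id -> arrival time, for lead-time statistics
    lead_times = []
    # schedule all job arrivals up front (exponential interarrival times)
    t = 0.0
    for j in range(n_jobs):
        t += random.expovariate(1.0 / arrival_mean)
        schedule(t, "arrive", j)
    # main event loop: always process the earliest pending event
    while events:
        now, _, kind, job = heapq.heappop(events)
        if kind == "arrive":
            arrivals[job] = now
            queue.append(job)
        else:  # "depart": job leaves, record its lead time
            busy = False
            lead_times.append(now - arrivals[job])
        if queue and not busy:
            busy = True
            nxt = queue.pop(0)
            schedule(now + random.expovariate(1.0 / service_mean), "depart", nxt)
    return sum(lead_times) / len(lead_times)
```

The same event-list pattern scales to multiple entity types and stations; commercial tools hide this loop, but understanding it helps when translating a conceptual model into code.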
Appropriate collection of simulation output data is important. In that way I am able to conduct data analysis of simulation results in later phases of the discrete-event simulation procedure model.
Model verification and validation
I have to test every fully implemented simulation model before initiating experiment execution. These tests seek to verify and validate the model, i.e. to ensure that the simulation works properly. Here I distinguish between two terms: verification and validation.
Simulation model verification confirms that the model works as intended, i.e. it answers the question of whether the model behaves as defined in the conceptual model.
Simulation model validation is performed once the simulation model has been fully verified. Validation aims at testing whether the simulation model represents the real-world processes with sufficient accuracy. For example, the lead time of products in a manufacturing plant could be used to validate whether the simulation model correctly reflects relevant real-world manufacturing processes.
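The lead-time comparison described above can be automated as a simple validation check. In the sketch below, the lead-time figures and the 10% tolerance are placeholders; a real threshold would be agreed with stakeholders:

```python
import statistics

def validate_lead_time(simulated, observed, rel_tol=0.10):
    """Accept the model if the mean simulated lead time lies within
    rel_tol (here 10%, a placeholder) of the mean observed lead time.
    Returns (is_valid, relative_deviation)."""
    sim_mean = statistics.mean(simulated)
    obs_mean = statistics.mean(observed)
    deviation = abs(sim_mean - obs_mean) / obs_mean
    return deviation <= rel_tol, deviation

# Hypothetical lead times in hours: simulation output vs. shop-floor records
sim = [46.2, 44.8, 47.5, 45.1, 46.9]
real = [45.0, 48.3, 44.2, 46.7, 45.8]
valid, dev = validate_lead_time(sim, real)
```

Comparing means is only the simplest check; comparing distributions (e.g. with a statistical test) is stronger, but even this basic comparison catches gross modeling errors early.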
Execution of simulation runs and experiments
With a fully verified and validated simulation model available, the fourth phase of my discrete-event simulation procedure model mainly consists of execution, i.e. executing the experiment plan defined during the first phase of the procedure model.
Execution can be time consuming. There are two reasons for this:
- Each simulation run might require model adjustments, which in turn might require manual interaction by the simulation engineer. These adjustments can be costly; to avoid them I often try to develop a parametrized simulation model in the early stages of the procedure model.
- Computing each simulation run can take a long time, i.e. computational power can be a bottleneck. In projects that involve large simulation models and/or many simulation runs I therefore use a simulation tool that allows me to compile the simulation model into actual machine code. This aids speedy computation.
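A parametrized model lets the experiment plan drive execution without manual model edits between runs. The sketch below shows the pattern; `run_model` is a hypothetical stand-in for an actual simulation call, and the parameter grid is illustrative:

```python
import itertools

def run_model(machines, service_mean, seed):
    """Stand-in for a real parametrized simulation run; it returns a
    toy throughput figure purely so the execution loop is runnable."""
    return machines / service_mean + seed * 0.0

# Hypothetical parameter grid and replication count
machine_options = [2, 3]
service_means = [4.0, 5.0]
replications = 3

results = []
for machines, sm in itertools.product(machine_options, service_means):
    for rep in range(replications):
        throughput = run_model(machines, sm, seed=rep)
        results.append({"machines": machines, "service_mean": sm,
                        "rep": rep, "throughput": throughput})
```

With this structure, adding a scenario means adding a parameter value, not editing the model, and the `seed` per replication keeps runs reproducible.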
Analysis of simulation results
I analyze simulation results from previous simulation runs in this phase of the discrete-event simulation procedure model. This might result in additional simulation runs. I refer to these additional simulation runs as optimization runs. They usually seek to answer very specific questions or to test specific improvement potentials identified during the analysis of simulation results.
In addition, I frequently apply sensitivity tests at this point in the procedure model, having already specified them in the experiment plan during earlier stages. Sensitivity tests check how robust the findings are to changes in input parameters. For example: “How much does throughput change depending on the variance in processing times?”
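A sensitivity test of this kind can be structured as a sweep: vary one parameter while holding the rest at their base values, and record the KPI each time. In the sketch below, `toy_model` is a hypothetical closed-form stand-in for an actual simulation run (loosely inspired by the Kingman approximation for queue waiting time), used only so the sweep is runnable:

```python
def sensitivity_sweep(model, base_params, param, values):
    """Run `model` once per value of `param`, holding all other
    parameters at their base values; returns {value: kpi}."""
    results = {}
    for v in values:
        params = dict(base_params, **{param: v})
        results[v] = model(**params)
    return results

# Toy KPI: waiting time grows with processing-time variability
# (illustrative stand-in for a real simulation call)
def toy_model(utilization, cv_process):
    return (utilization / (1 - utilization)) * (1 + cv_process**2) / 2

out = sensitivity_sweep(toy_model,
                        {"utilization": 0.8, "cv_process": 1.0},
                        "cv_process", [0.5, 1.0, 1.5])
```

Plotting the resulting KPI against the swept parameter makes the sensitivity of the findings immediately visible to stakeholders.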
Discrete-event simulation procedure model summary
Discrete-event simulation is a technique frequently applied to very complex problems. And, the resulting simulation models can become very complex, too. For this reason it is of paramount importance to use a structured discrete-event simulation procedure model.
In this article I presented my discrete-event simulation procedure model. It provides structured guidance for executing a complex simulation study by proceeding in phases. Each phase has a clear objective: The first phase aims at defining a clear strategy for how to approach the problem. During this phase all relevant data is already collected and analyzed.
The second phase aims at constructing the simulation model. A good approach here is to first decide which simulation tool to use. Second, a conceptual model should be developed, e.g. using flowcharts and state transition diagrams. Lastly, the conceptual model is implemented in the simulation tool, i.e. by writing code.
The third phase ensures that the simulation model works as it should, and that it represents reality sufficiently well.
The fourth phase executes the simulation experiment plan. This experiment plan should have been developed in the first phase of the discrete-event simulation procedure model.
The fifth and last phase of the discrete-event simulation procedure model comprises the analysis of simulation results. Conclusions are drawn in this final phase, which might require additional simulation runs, e.g. for optimization purposes or for sensitivity analysis.
Data scientist focusing on simulation, optimization and modeling in R, SQL, VBA and Python