How many times have you worked painstakingly on something unfamiliar, only to have it destroyed in seconds by one wrong move? A new car that ended up in a fender bender because you were fiddling with the indecipherable buttons on the dashboard? Or the many lines of code that almost solve the problem but are still one step away from being glitch-free? If you could simulate the effects of what you are working on, you would have a far better way of analyzing issues and finding solutions.
In the world of science and experimentation, simulation techniques are invaluable in situations where a mathematical formulation of the problem is not feasible.
The technique was first advocated around half a century ago in research conducted by Prof. John von Neumann and Prof. Stanislaw Ulam, who applied it to nuclear shielding problems that were too complex for mathematical analysis and too expensive to study through real experimentation. In today's era of high-speed digital computers, tackling many real-life business problems through computer simulation can be highly cost-effective.
Simulation can be understood as a quantitative technique for studying alternative courses of action using a replica (model) of the actual system. A series of trial-and-error experiments is conducted on this replica to predict the behavior of the system over a period of time; studying these experiments is much like studying the real system in operation. To learn how the real system would react to certain changes, the management team can vary a number of variables in the model and test it accordingly. The data captured from the model's reaction then guides the necessary modifications to the real system.
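As a concrete illustration of this trial-and-error approach, the minimal sketch below simulates a single-server waiting line and varies the service rate to observe how average waiting time responds. This is a hypothetical model of my own; the function name, rates, and parameters are illustrative assumptions, not taken from the text.

```python
import random

def simulate_queue(arrival_rate, service_rate, n_customers, seed=0):
    """Replica of a single-server queue: returns mean waiting time."""
    rng = random.Random(seed)      # fixed seed makes the trial repeatable
    arrival = 0.0                  # arrival time of the current customer
    server_free = 0.0              # time at which the server next becomes free
    total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(arrival_rate)   # random inter-arrival time
        start = max(arrival, server_free)          # wait if the server is busy
        total_wait += start - arrival
        server_free = start + rng.expovariate(service_rate)  # random service time
    return total_wait / n_customers

# Trial-and-error experiments: change one variable (the service rate)
# and observe how the replica of the system reacts.
for mu in (1.1, 1.5, 2.0):
    avg_wait = simulate_queue(arrival_rate=1.0, service_rate=mu, n_customers=10_000)
    print(f"service rate {mu}: average wait {avg_wait:.2f}")
```

Each run is one experiment on the replica; comparing the average waits across service rates tells the analyst how the real system would likely respond to a faster server, without touching the real system at all.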
Examples: simulating earthquake-resistant structures, crash tests, pressure variation inside an aeroplane, missile trajectory tracking, etc.