National Grid UK, the electricity transmission system owner for England and Wales and the system operator for all of Great Britain, uses @RISK’s probabilistic modeling techniques to measure network restoration performance in a variety of simulated failure scenarios. In other words, if the system, or part of it, somehow breaks down, the people at the utility want to know what it will take to get it working again, how long the recovery will take, and what the challenges will be along the way.
@RISK can be used in conjunction with MS Project and Excel to model the schedule and cost risks inherent in large, complex projects. This example demonstrates the use of @RISK to build a complete model of the construction of a new commercial venue. The model includes uncertainty in task times, a Risk Register for calculating contingencies, and a link to real-time cash flows in an NPV calculation model.
@RISK probability distributions have been assigned to the durations of several tasks in the schedule, some with individually specified distributions and others through Risk Categories. The uncertain task times are assumed to be uncorrelated.
A Risk Register lists three possible risk events that could impact the project schedule and costs. Through the RiskProjectAddDelay function, these risks introduce schedule delays and associated costs; the function allows the model to generate new tasks dynamically, depending on whether or not each risk occurs. Changes are reflected at run time only, so it is necessary to run a simulation to see the impact and results of the Risk Register.
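The example itself is built with @RISK, Microsoft Project, and Excel, but the underlying logic is easy to sketch. The Python snippet below is only a minimal illustration of the same idea, not the @RISK model: the task durations, risk probabilities, delays, and cost impacts are hypothetical placeholders.

```python
import random

# Hypothetical tasks with uncertain durations (triangular: min, most likely, max), in days.
TASKS = [
    ("Groundworks", (40, 50, 70)),
    ("Structure",   (90, 110, 150)),
    ("Fit-out",     (60, 75, 100)),
]

# Hypothetical risk register: probability of occurrence, schedule delay (days)
# and cost impact if the risk event occurs.
RISK_REGISTER = [
    ("Permit delay",      0.30, 20, 150_000),
    ("Ground conditions", 0.15, 35, 400_000),
    ("Supplier failure",  0.10, 15, 250_000),
]

N_ITERATIONS = 10_000
durations, risk_costs = [], []

for _ in range(N_ITERATIONS):
    # Sample uncorrelated task durations (tasks assumed sequential for simplicity).
    duration = sum(random.triangular(lo, hi, mode) for _, (lo, mode, hi) in TASKS)
    cost = 0.0
    for _, prob, delay_days, impact in RISK_REGISTER:
        if random.random() < prob:          # does this risk event occur on this iteration?
            duration += delay_days
            cost += impact
    durations.append(duration)
    risk_costs.append(cost)

risk_costs.sort()
print(f"Mean project duration: {sum(durations) / N_ITERATIONS:.0f} days")
print(f"80% contingency for the Risk Register: {risk_costs[int(0.80 * N_ITERATIONS)]:,.0f}")
```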
The example also contains a model of cash flows that leads to the NPV of the project. In particular, the project costs feed a Timescaled Data report, which collects cumulative costs during a simulation so that, afterward, you can see how the project’s total cost grows over time.
The other reports generated are the NPV and the Contingency for the Risk Register, at different confidence levels. Finally, the cash flow also includes a Revenue Adjustment calculation that takes the portion of the year in which Sales are initiated and applies a discount to the predicted annual revenue.
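As another hedged sketch (the discount rate, annual revenue, yearly costs, and the year in which sales begin are made-up placeholders, not figures from the example model), the cash-flow and NPV logic, including the fraction-of-year revenue adjustment, might look like this in Python:

```python
import random

DISCOUNT_RATE = 0.08                 # assumed annual discount rate
ANNUAL_REVENUE = 2_000_000           # assumed full-year revenue once sales begin
ANNUAL_COST = [1_500_000, 1_200_000, 300_000, 300_000, 300_000]  # hypothetical costs per year

def npv(rate, cash_flows):
    """Net present value of a series of end-of-year cash flows."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

def simulate_npv():
    # Hypothetical: sales begin at some uncertain point during year 3.
    sales_start_year = 3
    fraction_of_year_with_sales = random.uniform(0.2, 0.9)
    cash_flows = []
    for year in range(len(ANNUAL_COST)):
        revenue = 0.0
        if year + 1 == sales_start_year:
            # Revenue adjustment: only the portion of the year after sales
            # begin earns revenue, so the annual figure is prorated.
            revenue = ANNUAL_REVENUE * fraction_of_year_with_sales
        elif year + 1 > sales_start_year:
            revenue = ANNUAL_REVENUE
        cash_flows.append(revenue - ANNUAL_COST[year])
    return npv(DISCOUNT_RATE, cash_flows)

results = sorted(simulate_npv() for _ in range(10_000))
print(f"Mean NPV: {sum(results) / len(results):,.0f}")
print(f"10th percentile NPV: {results[len(results) // 10]:,.0f}")
```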
“[@RISK] is a key strategic tool for Thales, assisting us in our process of reaching informed business-critical decisions.”
Senior Consultant, Thales Management Consultancy
“With @RISK, students quickly turn static models into dynamic, probabilistic models that show all the possible outcomes.”
PROFESSOR ROY NERSESIAN
Columbia University’s School of International & Public Affairs
“When you are trying to communicate statistics to the medical community, people can get lost. But if you show them @RISK, they get it instantly.”
DR. JOHN FONTANESI
Director of the Center for Management Science in Health,
University of California San Diego School of Medicine
When you consider what an option is and how it functions in the equity markets, this speed is mind-bending––and, not incidentally, worrisome. An option is a contract that gives its holder the right to buy or sell an asset at a given price. The option itself is purchased, and this creates a market for options as well as for their underlying equities. An option to sell is called a put, and an option to buy is called a call. Options have expiration dates after which they are worthless. Between the time when an option is purchased and when it expires, its value can fluctuate. This makes the trading of options pretty risky, and an obvious application for Monte Carlo in Excel software. Monte Carlo simulation is just one of a number of statistical analysis techniques used to value options, but it is the one used by the high-speed entrepreneurs.
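For readers who want to see the mechanics, here is a minimal Python sketch of Monte Carlo option valuation: it simulates terminal stock prices under an assumed geometric Brownian motion and averages the discounted payoff of a European call. The spot price, strike, rate, volatility, and maturity are purely illustrative; real trading desks use far more elaborate models.

```python
import math
import random

def monte_carlo_call_price(spot, strike, rate, vol, years, n_paths=100_000):
    """Estimate a European call value by simulating terminal prices under
    geometric Brownian motion and discounting the average payoff."""
    total_payoff = 0.0
    for _ in range(n_paths):
        z = random.gauss(0.0, 1.0)
        terminal = spot * math.exp((rate - 0.5 * vol**2) * years + vol * math.sqrt(years) * z)
        total_payoff += max(terminal - strike, 0.0)
    return math.exp(-rate * years) * total_payoff / n_paths

# Hypothetical contract: stock at 100, strike 105, six months to expiration.
print(f"Estimated call value: {monte_carlo_call_price(100, 105, 0.03, 0.25, 0.5):.2f}")
```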
The drawback to this mind-boggling speed is that it’s mind boggling. If they want to keep up, investors may not have even a moment to call or put.
A recent and very telling study by two actuarial experts makes clear the important perspective and depth that can be added to financial risk analysis by running Monte Carlo simulations with different probability functions for the same variable.
In the article, Pierre Bogacz of HFA Partners, a firm specializing in risk management for nonprofits, examines a typical decision hospitals face regularly: whether to renew an existing letter of credit (LOC) or turn to variable rate financing. He recommends that each financing vehicle be stress-tested using a financial risk simulation that takes into account the hospital’s entire balance sheet. Financial risk modeling–I assume using Monte Carlo software–is the way to calculate the risk-adjusted cost of debt, and the resulting simulations can serve as a valid basis of comparison among various sources of credit.
Bogacz makes the important point that it is easy to become comfortable with a current lender, and this in itself is a risk. He believes that an expanded search for lenders is not only a sound risk avoidance practice but also often yields information otherwise hard to come by–like how your current lender stands among its competitors. He is not saying "Don’t trust your banker." He’s saying "Do the math, run a financial risk analysis, and come up with what it really costs to borrow that money."
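Bogacz’s article does not publish a model, so the Python sketch below is only a toy version of the kind of comparison he describes: simulate interest-rate paths, apply a hypothetical fee or spread for each financing vehicle, and compare the distributions of total borrowing cost. Every figure and fee level here is an assumption made up for illustration.

```python
import random

N_ITERATIONS = 10_000
YEARS = 5
PRINCIPAL = 50_000_000          # hypothetical borrowing

def simulate_avg_rate(start=0.02, drift=0.001, vol=0.004):
    """One random-walk path of short-term rates; returns the average rate over the term."""
    rate, total = start, 0.0
    for _ in range(YEARS):
        rate = max(0.0, rate + drift + random.gauss(0.0, vol))
        total += rate
    return total / YEARS

loc_costs, alt_costs = [], []
for _ in range(N_ITERATIONS):
    avg_rate = simulate_avg_rate()
    # Option A: variable-rate debt backed by a letter of credit (hypothetical 0.75% LOC fee).
    loc_costs.append(PRINCIPAL * (avg_rate + 0.0075) * YEARS)
    # Option B: alternative variable-rate financing without the LOC (hypothetical 0.50% spread).
    alt_costs.append(PRINCIPAL * (avg_rate + 0.0050) * YEARS)

def percentile(xs, p):
    return sorted(xs)[int(p * len(xs))]

for name, costs in [("LOC-backed", loc_costs), ("Alternative", alt_costs)]:
    print(f"{name}: mean {sum(costs) / len(costs):,.0f}, 95th pct {percentile(costs, 0.95):,.0f}")
```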
When it comes to predicting atmospheric events, the TV weather guys are pretty good at getting it right for the next few days. That’s because their forecasts are based on accurate models, and one of the factors in their accuracy is that they account for geographical space in small increments. For short-term forecasting, the grid spacing is a few tens of kilometers.
I had long believed that Monte Carlo simulation was developed by a team working at Los Alamos Scientific Laboratory during the 1940s. The blog mentioned Stanislaw Ulam playing solitaire. Both turned out to be true. Ulam was part of the team working on nuclear weapons at Los Alamos, and he prefaced his own account of his inspiration from solitaire by saying, "After spending a lot of time trying to estimate them by pure combinatorial calculations, I wondered whether a more practical method than 'abstract thinking' might not be to lay it out say one hundred times and simply observe and count the number of successful plays. This was already possible to envisage with the beginning of the new era of fast computers. . . ." He and John von Neumann began to work on the calculations that eventually became essential to the Manhattan Project.
So far, so true. But how did Monte Carlo simulation enter the finance arena? The blog fast-forwards thirty years to 1976 and Roger G. Ibbotson and Rex A. Sinquefield with their publication of "Stocks, Bonds, Bills, and Inflation: Simulations of the Future."
True––but not so fast. In the intervening years, and especially during the 1950s, there was considerable development and dissemination of the Monte Carlo simulation technique by the U.S. Air Force and the RAND Corporation. This brought the technique closer to the realm of finance, but we’re not there yet.
The earliest publication I can dig up on Monte Carlo and financial risk simulation is David B. Hertz’s "Risk Analysis in Capital Investment," published in the Harvard Business Review in 1964.
From Harvard Business Review, circa 1964
Okay, the 1960s. That still leaves almost fifty years unattended by history: the advent of desktop computing, the commercialization of Monte Carlo software, acceleration through parallel computing, and the wafting up on the horizon of cloud computing.
So, in the words of too many finance journals, "more research is necessary."