Author: Palisade Corporation

@RISK used by National Grid UK in Electricity Network Restoration Planning


National Grid UK, the electricity transmission system owner for England and Wales and the system operator for all of Great Britain, uses @RISK’s probabilistic modeling techniques to measure network restoration performance in a variety of simulated failure scenarios. In other words, if the system, or part of it, somehow breaks down, the people at the utility want to know what it will take to get it working again, how long that will take, and what the challenges will be along the way.

» Read the case study

@RISK for Cost and Schedule Risk Using Risk Registers (with Example Model)

@RISK can be used in conjunction with MS Project and Excel to model the schedule and cost risks inherent in large, complex projects. This example demonstrates the use of @RISK to build a complete model of the construction of a new commercial venue. The model includes uncertainty in task times, a Risk Register for calculating contingencies, and a link to real-time cash flows in an NPV calculation model.

@RISK probability distributions have been assigned to the durations of several tasks in the schedule, some with individually chosen distributions and others through Risk Categories. The uncertain task times are assumed to be uncorrelated.
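Inside @RISK these distributions are attached directly to the Project schedule; as a rough stand-alone illustration of the same idea, the Python sketch below samples uncorrelated triangular durations for three hypothetical tasks on a single path and summarizes the resulting schedule length. The task names, distribution choices, and parameters are illustrative assumptions, not values from the example model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of simulation iterations

# Hypothetical tasks on a single path, each with an uncertain duration (days)
# modeled as triangular(min, most likely, max). Parameters are illustrative,
# not taken from the @RISK example model.
tasks = {
    "Site preparation": (20, 25, 40),
    "Structural work":  (60, 75, 110),
    "Fit-out":          (30, 35, 55),
}

# Draw uncorrelated durations for every task, matching the assumption above
durations = np.column_stack([
    rng.triangular(lo, mode, hi, size=N) for (lo, mode, hi) in tasks.values()
])

total = durations.sum(axis=1)  # total path duration per iteration

print(f"Mean schedule length: {total.mean():.1f} days")
print(f"90th percentile:      {np.percentile(total, 90):.1f} days")
```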

A Risk Register lists three possible risk events that could impact the project schedule and costs. By using the RiskProjectAddDelay function, these risks introduce schedule delays and associated costs. Specifically, this function allows the model to generate new tasks dynamically, depending on whether the risks occur or not. Changes are reflected at run time only, so it is necessary to run a simulation to see the impact and results of the Risk Register.
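In the example model, RiskProjectAddDelay does this work inside Microsoft Project. As a plain-Python analogue of the underlying logic only, the sketch below draws a Bernoulli occurrence for each hypothetical register entry and, when a risk fires, adds its delay and cost to that iteration's totals. All probabilities, delays, and costs are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

# Hypothetical risk register: probability of occurrence, delay (days) if the
# risk occurs modeled as triangular(min, most likely, max), and a fixed extra
# cost. Values are illustrative only; the @RISK model drives this through
# RiskProjectAddDelay, which inserts new delay tasks into the schedule.
risk_register = [
    {"name": "Permit dispute",     "p": 0.20, "delay": (10, 20, 45), "cost": 150_000},
    {"name": "Ground conditions",  "p": 0.10, "delay": (15, 30, 60), "cost": 400_000},
    {"name": "Key supplier fails", "p": 0.05, "delay": (20, 40, 90), "cost": 250_000},
]

extra_delay = np.zeros(N)
extra_cost = np.zeros(N)

for risk in risk_register:
    occurs = rng.random(N) < risk["p"]                      # Bernoulli occurrence
    lo, mode, hi = risk["delay"]
    delay_if_occurs = rng.triangular(lo, mode, hi, size=N)  # delay only when it occurs
    extra_delay += np.where(occurs, delay_if_occurs, 0.0)
    extra_cost  += np.where(occurs, risk["cost"], 0.0)

print(f"Mean added delay: {extra_delay.mean():.1f} days")
print(f"Mean added cost:  ${extra_cost.mean():,.0f}")
print(f"P90 added cost:   ${np.percentile(extra_cost, 90):,.0f}")
```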

The example also contains a model of cash flows that leads to the NPV of the project. In particular, the project costs feed a Timescaled Data report, which collects cumulative costs during the simulation; after a simulation, you can see how total project costs grow over time.

The other reports generated are the NPV and the Contingency for the Risk Register, each at different confidence levels. Finally, the cash flow also includes a Revenue Adjustment calculation that discounts the predicted annual revenue according to the portion of the year remaining when sales begin.
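None of the example model's figures are reproduced here, but the way these reports fit together can be sketched with invented numbers: an uncertain construction cost, an uncertain date on which sales begin (driving the Revenue Adjustment proration), a discounted cash flow that yields an NPV per iteration, and percentiles of the results at several confidence levels. Every parameter below is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 10_000
rate = 0.08            # assumed annual discount rate
annual_revenue = 4e6   # assumed full-year revenue once the venue opens
horizon = 6            # years of revenue considered

# Illustrative uncertain inputs (not taken from the example model)
total_cost = rng.triangular(9e6, 10e6, 13e6, size=N)  # construction cost, paid in year 1
sales_start = rng.triangular(1.1, 1.3, 1.8, size=N)   # years from project start

npv = np.zeros(N)
for year in range(1, horizon + 1):
    # Revenue Adjustment: in the year sales begin, only the remaining fraction
    # of that year earns revenue; later years earn the full amount.
    fraction = np.clip(year - sales_start, 0.0, 1.0)
    cash_flow = annual_revenue * fraction
    if year == 1:
        cash_flow = cash_flow - total_cost             # costs hit in year 1
    npv += cash_flow / (1 + rate) ** year

# Report the NPV at several confidence levels, plus the chance of a loss
for p in (50, 75, 90, 95):
    print(f"P{p} NPV: ${np.percentile(npv, p):,.0f}")
print(f"Probability NPV < 0: {(npv < 0).mean():.1%}")
```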

» Download the model (Requires @RISK 6.x Professional or higher; Microsoft Project must be installed.): XLSX file | MPP file
» Download a trial of @RISK Industrial

Call and Put as Fast as You Can

In the always high-pressure world of the equity markets, the pressure has been dialed up on one of the most intensely speculative forms of financial risk analysis: the pricing of equity options. Earlier this year a software developer in Ireland teamed up with a British hardware company to release a white paper describing what they are calling a new speed record in the pricing of options: 1 million options evaluated in 17 seconds.

When you consider what an option is and how it functions in the equity markets, this speed is mind-bending and, not incidentally, worrisome. An option is a contract that gives its holder the right to buy or sell an asset at a given price: the right to buy is a call, and the right to sell is a put. The option itself is purchased, and this creates a market for options as well as for their underlying equities. Options have expiration dates after which they are worthless, and between the time an option is purchased and the time it expires, its value can fluctuate. This makes the trading of options pretty risky, and an obvious application for Monte Carlo simulation in Excel. Monte Carlo simulation is just one of a number of statistical analysis techniques used to value options, but it is the one used by the high-speed entrepreneurs.
 

Conveniently, for their blazing model the two companies used what’s called a European option, which can be exercised only on its expiration date. This simplified their calculations by eliminating variability in time and the conditions that change over time, and as anyone who builds financial analysis models will tell you, simplicity and speed are directly related. Nevertheless, when the acceleration software and the hardware were cranked up to full speed, the white paper reports, they were processing 30 billion option pricing iterations per second.
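The white paper’s model and data are not reproduced here, but the core Monte Carlo valuation of a European option is compact enough to sketch: simulate the terminal price of the underlying asset under geometric Brownian motion and average the discounted payoffs. The parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def european_option_mc(S0, K, r, sigma, T, n_paths=100_000, kind="call"):
    """Monte Carlo price of a European option under geometric Brownian motion.

    Because a European option can be exercised only at expiry, one draw of the
    terminal price per path is enough; there is no time stepping, which is
    exactly the simplification noted above.
    """
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.maximum(ST - K, 0.0) if kind == "call" else np.maximum(K - ST, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Illustrative parameters: spot 100, strike 105, 2% rate, 25% vol, 6 months
print(f"Call: {european_option_mc(100, 105, 0.02, 0.25, 0.5, kind='call'):.3f}")
print(f"Put:  {european_option_mc(100, 105, 0.02, 0.25, 0.5, kind='put'):.3f}")
```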
 

The drawback to this mind-boggling speed is that it’s mind-boggling. If they want to keep up, investors may not have even a moment to call or put.

Squeezing the Risk Out of Reinsurance

A recent and very telling study by two actuarial experts makes clear the important perspective and depth that can be added to financial risk analysis by running Monte Carlo simulations with different probability functions for the same variable.
 

Writing about a hypothetical case in the reinsurance industry, Lina Chan and Domingo Joaquin sought to predict how a stop-loss underwriting opportunity would affect a reinsurer’s bottom line. Chan, a managing partner in CP Risk Solutions, is a fellow of the Society of Actuaries, and Joaquin is an associate professor of finance at Illinois State University.
 
To create their predictions, they first established what level of loss in capital position would be unacceptable, and then, using Monte Carlo simulations in Excel, they analyzed three variations of the hypothetical underwriting arrangement.  For each version of the deal, they ran simulations using log-normal, inverse Gaussian, and log-logistic probability functions. 
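Their data and model are not reproduced here, but the effect they exploited, namely that different distribution families fitted to the same losses can imply very different tails, is easy to sketch. The snippet below fits log-normal, inverse Gaussian, and log-logistic distributions (SciPy’s lognorm, invgauss, and fisk) to the same synthetic claim sample and compares the fitted means and 99th percentiles; the sample itself is an invented stand-in.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Stand-in "historical" stop-loss claim severities (illustrative, not the
# data used by Chan and Joaquin): a right-skewed sample.
claims = rng.lognormal(mean=11.0, sigma=0.9, size=500)

candidates = {
    "log-normal":       stats.lognorm,
    "inverse Gaussian": stats.invgauss,
    "log-logistic":     stats.fisk,   # SciPy's name for the log-logistic
}

print(f"{'distribution':<18}{'mean':>14}{'99th percentile':>18}")
for name, dist in candidates.items():
    params = dist.fit(claims, floc=0)    # fit each family to the same data
    frozen = dist(*params)
    print(f"{name:<18}{frozen.mean():>14,.0f}{frozen.ppf(0.99):>18,.0f}")
```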
 
I was surprised at the sunshine-to-gloom differences in the researchers’ simulation results. The gloomiest was obtained by the model using the log-logistic function, and this prompted Chan and Joaquin to endorse the reinsurance deal involving the most sharing of risk and, of course, of profit. But what was most striking about their study was the range of possible courses of action that could have resulted from the analysts’ reliance on only one probability function. By creating a multi-perspective set of risk analyses, they demonstrated how to effectively squeeze the riskiness of the hypothetical deal down to almost nothing.

» Full text of case study.  

Risk and the Cost of Debt

Recently a great deal of public attention has focused on economic efficiency in the health care industry, but one rarely mentioned element of overall efficiency is the financing of hospitals. Although many hospitals are nonprofit, even these need operating capital, and an infusion of capital usually is accompanied by financial risk.  A recent article in Becker’s Hospital Review highlighted the importance of financial risk analysis in the process of choosing sources of credit, and its observations should ring true for any business, for-profit or not.   

In the article, Pierre Bogacz of HFA Partners, a firm specializing in risk management for nonprofits, examines a decision hospitals face regularly: whether to renew an existing letter of credit (LOC) or turn to variable-rate financing. He recommends that each financing vehicle be stress-tested using a financial risk simulation that takes into account the hospital’s entire balance sheet. Financial risk modeling (I assume using Monte Carlo software) is the way to calculate the risk-adjusted cost of debt, and the resulting simulations can serve as a valid basis of comparison among various sources of credit.
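Bogacz does not publish a model, so the following is only a toy version of the idea: simulate a floating-rate index over the life of the debt, compare the certain cost of a fixed, LOC-backed structure with the distribution of the variable-rate cost, and read the risk-adjusted comparison off a high percentile. The rate model, spreads, amounts, and horizon are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 10_000
principal = 50e6   # hypothetical borrowing amount
years = 5

# Hypothetical alternatives: a renewed LOC-backed structure at an all-in 4.5%,
# versus variable-rate debt at a floating index plus 1.5%.
fixed_all_in = 0.045
variable_spread = 0.015

# Simulate the floating index with a simple random-walk-with-drift model
# (an illustrative assumption, not how HFA Partners models rates).
index = np.full(N, 0.02)
variable_cost = np.zeros(N)
fixed_cost = np.zeros(N)
for year in range(years):
    index = np.clip(index + rng.normal(0.003, 0.008, size=N), 0.0, None)
    variable_cost += principal * (index + variable_spread)
    fixed_cost += principal * fixed_all_in

print(f"Fixed/LOC cost (certain):        ${fixed_cost[0]:,.0f}")
print(f"Variable cost, median:           ${np.median(variable_cost):,.0f}")
print(f"Variable cost, 95th percentile:  ${np.percentile(variable_cost, 95):,.0f}")
```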

Bogacz makes the important point that it is easy to become comfortable with a current lender, and this in itself is a risk. He believes that an expanded search for lenders is not only a sound risk avoidance practice but also often yields information otherwise hard to come by, such as how your current lender stands among its competitors. He is not saying "Don’t trust your banker." He’s saying "Do the math, run a financial risk analysis, and come up with what it really costs to borrow that money."

Computational Power for Truly Long-Term Forecasts

Monte Carlo simulation is often referred to as a computational space hog. But how much space a simulation hogs is, of course, a matter of how much data it handles, how many variables it includes, and how complex the statistical analysis is. Climate prediction is a wonderful case in point.

When it comes to predicting atmospheric events, the TV weather guys are pretty good at getting it right for the next few days.  That’s because their forecasts are based on accurate models, and one of the factors in their accuracy is that they account for geographical space in small increments.  For short-term forecasting, the grid spacing is a few tens of kilometers.

But climate change models attempt to look a hundred years ahead, as well as a hundred years back, and these truly long-term environmental risk predictions are not nearly so accurate, even with the widespread use of Monte Carlo software. The inaccuracies stem from the limited computing power available at individual institutions, which forces present climate change models to use much coarser grid spacing; otherwise, the computation for the statistical analysis involved would overwhelm the computers trying to run the models. That is why Oxford University professor Tim Palmer has proposed a "global" facility to meet the computational needs of climate scientists. "We do not," he says, "have the computing power to solve the known partial differential equations of climate science with sufficient accuracy."

Particle physicists also need huge models, including some with mammoth Monte Carlo simulations, and they have met that need with CERN, the European Organization for Nuclear Research. Palmer’s idea is a parallel organization in which national climate change centers collaborate on international climate prediction, which he believes would support significant advances in our understanding of climate change. At such a facility, dedicated computing power would allow scientists to squeeze down the grid scale, still run Monte Carlo software to control for approximations at that level of detail, and reveal in much higher resolution how the earth’s climate is changing and how human activities affect it.
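A rough back-of-envelope calculation shows why finer grids are so expensive. If computational cost grows with the number of horizontal grid cells (spacing to the power of minus two) and, through numerical stability limits, with the number of time steps (roughly spacing to the power of minus one), then cost scales roughly with the cube of the resolution increase. This scaling rule is a simplifying assumption for illustration, not a figure from Palmer’s proposal.

```python
# Back-of-envelope scaling: relative compute cost of refining the grid,
# assuming cost ~ (horizontal cells) x (time steps) ~ spacing**-3.

def relative_cost(spacing_km, reference_km=100.0):
    """Compute cost relative to a reference grid spacing, ~spacing^-3."""
    return (reference_km / spacing_km) ** 3

for spacing in (100, 50, 25, 10):
    print(f"{spacing:>4} km grid: ~{relative_cost(spacing):,.0f}x the compute "
          f"of a 100 km grid")
```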

Legends of the Monte Carlo Technique

A recent blog in Investment Week that mentioned the history of Monte Carlo simulation and its use in finance led me to take a harder look at what I thought I knew about how financial risk analysis was launched.

I had long believed that Monte Carlo simulation was developed by a team working at Los Alamos Scientific Laboratory during the 1940s.  The blog mentioned Stanislaw Ulam playing solitaire.  Both turned out to be true.  Ulam was part of the team working on nuclear weapons at Los Alamos, and he prefaced his own account of his inspiration from solitaire by saying, "After spending a lot of time trying to estimate them by pure combinatorial calculations, I wondered whether a more practical method than 'abstract thinking' might not be to lay it out say one hundred times and simply observe and count the number of successful plays. This was already possible to envisage with the beginning of the new era of fast computers. . . ."  He and John von Neumann began to work on the calculations that eventually became essential to the Manhattan Project.
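Ulam’s "lay it out and count" idea scales down to a toy you can run in seconds. The sketch below is not his solitaire problem; it estimates a different combinatorial probability, that a shuffled deck leaves no card in its original position, in exactly the way he describes: repeat the experiment many times and count the successes.

```python
import random

random.seed(1)

def no_card_in_place(n_cards=52):
    """One 'play': shuffle a deck and check whether no card lands in its
    original position (a derangement)."""
    deck = list(range(n_cards))
    random.shuffle(deck)
    return all(card != position for position, card in enumerate(deck))

trials = 100_000
hits = sum(no_card_in_place() for _ in range(trials))
print(f"Estimated probability: {hits / trials:.4f}")  # exact answer is ~1/e, about 0.3679
```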

So far, so true.  But how did Monte Carlo simulation enter the finance arena?  The blog fast-forwards thirty years to 1976, when Roger G. Ibbotson and Rex A. Sinquefield published "Stocks, Bonds, Bills, and Inflation: Simulations of the Future."

True, but not so fast.  In the intervening years, and especially during the 1950s, there was considerable development and dissemination of the Monte Carlo simulation technique by the U.S. Air Force and the RAND Corporation.  This brought the technique closer to the realm of finance, but we’re not there yet.

The earliest publication I can dig up on Monte Carlo and financial risk simulation is David B. Hertz’s "Risk Analysis in Capital Investment," published in the Harvard Business Review in 1964.

From Harvard Business Review, circa 1964

Okay, the 1960s.  That still leaves almost fifty years unattended by history: the advent of desktop computing, the commercialization of Monte Carlo software, acceleration through parallel computing, and the wafting of cloud computing up over the horizon.

So, in the words of too many finance journals, "more research is necessary."