# On-Demand Webinar: “Choosing the Right Distribution”

» Watch now: “Choosing the Right Distribution”

Which distribution should you use to represent uncertain values in your spreadsheet model? This webinar introduces probabilistic analysis and Monte Carlo simulation right in your Excel spreadsheet using @RISK, demonstrating techniques for choosing a distribution to represent uncertain variables. We will work through a variety of examples to help you become more comfortable modeling uncertainty with distributions.

For more than 30 years, Palisade software and solutions have been used to make better decisions. Cost estimation, NPV analysis, operational risk registers, portfolio analysis, insurance loss modeling, reserves estimation, schedule risk analysis, budgeting, sales forecasting, and demand forecasting are just some of the ways in which the tools are applied. This webinar will demonstrate how easy – and necessary – it is to select distributions to use in quantitative risk analysis for any business.

» Watch “Choosing the Right Distribution” now

# New Functions in @RISK 7.5!

Functions are at the heart of risk analysis involving Monte Carlo simulation. In version 7.5 we have added a total of 22 new functions – 16 distribution functions and 6 new statistical functions. @RISK’s new distribution functions will appeal to a variety of industries and applications:

• RiskDagum – This distribution is most closely associated with modeling income distribution and is useful in many actuarial applications.
• RiskFréchet – Used to quantify extreme events, the Fréchet distribution is helpful in modeling rare, unexpected events such as radioactive emissions, seismic activity, and peak single-day rainfall and flooding.
• RiskCauchy – This distribution is useful in scientific and engineering applications to model resonance behavior, measurement repeatability and light dispersion.
• RiskBurr12 – Burr distributions are used to model household income, insurance risk and reliability data.
• RiskFatigueLife – If you are looking to model material reliability, FatigueLife distributions are effective for estimating the failure of materials over time.

These new functions support accurate, insightful estimation of uncertainty and provide useful statistics on simulation results. From insurance risk to reliability engineering to modeling household income, @RISK 7.5 has your risk analysis needs covered!
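For readers who want to explore these distribution families outside of Excel, the same families are available in SciPy. The sketch below uses SciPy's own parameterizations (which differ from @RISK's), and the parameter values are purely illustrative:

```python
# Sampling analogues of the new @RISK 7.5 distributions with SciPy.
# Note: SciPy's parameterizations are not @RISK's, and these shape
# parameters are illustrative only.
from scipy import stats

n = 1000
samples = {
    "Dagum (RiskDagum)": stats.burr(c=3.0, d=2.0).rvs(n, random_state=1),      # Dagum = Burr Type III
    "Frechet (RiskFrechet)": stats.invweibull(c=2.0).rvs(n, random_state=2),   # inverse Weibull
    "Cauchy (RiskCauchy)": stats.cauchy(loc=0, scale=1).rvs(n, random_state=3),
    "Burr XII (RiskBurr12)": stats.burr12(c=2.0, d=3.0).rvs(n, random_state=4),
    "Fatigue Life (RiskFatigueLife)": stats.fatiguelife(0.5).rvs(n, random_state=5),  # Birnbaum-Saunders
}
for name, x in samples.items():
    print(f"{name}: sample median = {sorted(x)[n // 2]:.3f}")
```

Each entry is a vector of 1,000 draws; in a spreadsheet model, a single draw per iteration would feed the uncertain cell the function occupies.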

» Learn about all the new functions in @RISK 7.5

# @RISK and DecisionTools Suite 7.5 Now Available

The latest versions of our popular risk analysis tools, @RISK 7.5 and DecisionTools Suite 7.5, are now available! Version 7.5 offers a range of improvements for any decision maker, from general-use enhancements to new, specialized analytical features. New and enhanced graphing options, faster performance, and sophisticated analytics make DecisionTools Suite 7.5 the only decision analysis toolset you’ll ever need.

Register for a free webinar on What’s New in @RISK 7.5

### Key Features Include:

• New and Improved Tornado Graphs in @RISK
• Faster Optimization with RISKOptimizer
• Over 20 New @RISK Functions
• Graphing and Reporting Improvements in @RISK
• Optimized for Windows 10 and Excel 2016

### Other Important Features:

• Run Optimizations During Simulation Without Coding
• New StatTools Analyses

» Wednesday, July 20th, 1pm EDT (NYC) | 10am PDT (Los Angeles) – Register

» Thursday, July 21st, 10am EDT (NYC) | 3pm BST (London) | 7:30pm IST (Delhi) – Register

» Wednesday, July 27th, 10am AEST (Sydney) – Register

# Palisade Client Discusses Project Success in Economia

Previously, Visual Reporting, a Denmark-based business reporting consultancy, used @RISK to help the Danish Government comply with Danish legislation requiring that all Government-related IT business cases exceeding €1.33m be risk-assessed.

Laurits Søgaard Nielsen, CEO at Visual Reporting, believes the Monte Carlo simulation (MCS) in @RISK was integral to the solution Visual Reporting implemented for the Danish Government, and he discussed that success in the January 2015 issue of economia, which covers technical accountancy content, business, economics, management, and finance. In the piece, “Why do IT projects fail?”, Nielsen gives his insight:

“The main reason large IT projects fail is that they grow in complexity, and humans are very poor at handling complex tasks,” says Laurits Søgaard Nielsen, CEO of Visual Reporting. “But often the creation of a business case does not follow best practice and can be severely affected by managers wanting to influence the process of project approval. ‘I want this project to happen’ often outweighs the poor business case.”

With Palisade’s risk and decision analysis tools, complex tasks and projects can be handled with confidence.

Read the original case study here.

# Shipping Industry Relies on PrecisionTree, @RISK, to Manage Risks to Crew, Environment, and Property

DNV GL SE, an international organization for ship classification, safety and quality control, uses PrecisionTree and @RISK software (part of the DecisionTools Suite, Palisade’s complete toolkit of risk and decision analysis tools) to determine the risk of an incident occurring to a ship or its systems and the consequences should one occur. This enables cost-benefit analysis so that informed decisions can be made about the best strategies to mitigate risk events.

DNV GL is the world’s largest ship and offshore classification organization and, in that capacity, a key player in major research and development projects for the shipping industry. A core element of its focus is enhancing safety and security in shipping through the proactive use of risk assessment and cost-benefit techniques.

Part of their risk assessment method involves Palisade’s decision tree software, PrecisionTree, to develop the risk models that apply to: (1) crew; (2) environment; and (3) property.

They then use @RISK to perform Monte Carlo simulation, which factors in the uncertainty of input parameters for the risk models, and illustrates the impact on the estimates provided for the assessment of a current level of risk, as well as the cost-efficiency of risk-mitigating measures.

“PrecisionTree makes it straightforward to quickly develop event trees. These are essential to our analysis when looking at how to reduce both the number of shipping incidents and the consequences should they occur,” explains Dr. Rainer Hamann, Topic Manager Risk Assessment, Safety Research at DNV GL SE. “At the same time we have to be realistic about the accuracy of our data inputs and, by means of distributions and Monte Carlo simulation, @RISK enables us to be clear about the level of uncertainty contained in each model. Embedded in the Microsoft environment, it also allows us to incorporate standard Excel features, making it easy to use.”

Read the original case study here.

# Palisade VP Gives Tips on Planning for Climate Change in IT Business Edge Magazine

IT Business Edge, an online business publication, recently published a piece by Palisade Vice President Randy Heffernan titled “Using Monte Carlo Simulations for Disaster Preparedness,” a slideshow featuring key tips on how to apply this statistical method to planning for extreme weather events.

As the article states, “the U.S. National Research Council recently suggested the necessity of a ‘national vision’ that will take precautionary, rather than reactionary, approaches to flooding, particularly in the Atlantic and Gulf coasts, where water has reached flood levels an average of 20 days per year since 2001.”

As an expert in quantitative risk analysis, Heffernan offered tips for handling the key problems businesses face in planning for climate change risk.

Fundamentally, all these tips are anchored by the application of Monte Carlo simulation, which, the article explains, “performs risk analysis by building models of possible results by substituting a range of values — a probability distribution — for any factor that has inherent uncertainty. It then calculates results over and over using a different set of random values from the probability functions.”
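The Monte Carlo idea the article describes can be sketched in a few lines: replace an uncertain input with a probability distribution and recompute the result many times. The scenario and numbers below (a storm-related outage and its daily cost) are purely hypothetical:

```python
# Minimal Monte Carlo sketch: substitute distributions for uncertain
# factors, then recalculate the outcome over and over.
import random

random.seed(0)
N = 100_000
losses = []
for _ in range(N):
    downtime_days = random.triangular(1, 14, 3)    # uncertain outage length (days)
    cost_per_day = random.gauss(50_000, 10_000)    # uncertain daily cost ($)
    losses.append(downtime_days * cost_per_day)

losses.sort()
print(f"mean loss: ${sum(losses) / N:,.0f}")
print(f"95th percentile loss: ${losses[int(0.95 * N)]:,.0f}")
```

The output is a distribution of losses rather than a single point estimate, which is exactly what makes the approach useful for precautionary planning.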

Want to make sure your business is better prepared for extreme weather? Check out the full article here, and check out @RISK, the Monte Carlo simulation software that makes planning for climate change events possible.

# @RISK Creates an 80% Cost-Savings for Nuclear Power Company

Building a new power plant in the U.S. is not for the risk-averse. The average cost of building one of these plants is historically 300% of the planned cost at the start of construction. Because of these prohibitive cost risks, as well as unfavorable public sentiment in reaction to the Three Mile Island accident in 1979, U.S. nuclear power plant construction had ground to a halt for decades.

Almost 30 years later, the first contracts for new plants were signed by a major U.S. nuclear power company in 2008. The total expected project cost for the three projects was over \$30 billion. Hoping to avoid the abysmal track record of cost overruns in the industry, the nuclear power company wanted to use a new and improved risk-management approach to the planning process—so they hired Dr. Sola Talabi, Risk Management Professor at Carnegie Mellon University and Risk Management Consultant for Pittsburgh Technical, to serve as Nuclear Power Plant Risk Manager. “They knew what the history had been with these projects, and they didn’t want it to happen again,” says Dr. Talabi.

In previous power plant construction projects, risk management was only an objective of project managers. This time, risk management was both an objective and a function; the company created a dedicated risk organization whose only role was to manage risk. Furthermore, the company switched from deterministic risk planning methods to probabilistic methods—using @RISK. Armed with the software, Dr. Talabi and his risk management team were able to characterize the uncertainty associated with cost items and perform quantitative risk analysis using Monte Carlo simulation. They were also able to perform a sensitivity analysis to identify key cost drivers, which revealed supply chain issues to be the greatest risk driver of all.

Thanks to this analytical, @RISK-driven approach, Dr. Talabi and the risk organization were able to dramatically impact these nuclear power plant projects, reducing cost overruns by roughly 80% relative to the historical trend. “While we still have cost overruns, it’s nothing like what we had before,” says Dr. Talabi. “I think it’s very important for decision-makers to realize that the rampant cost overruns of the past can be much better managed using tools like @RISK.”

# @RISK Helps Zero-In on U.S. Senate Race Outcomes

The midterm Senate race is fast approaching—and so are the speculations on its outcome. Previously, Lawrence W. Robinson, Professor of Operations Management at Cornell University’s Johnson Graduate School of Management, used @RISK to statistically predict the Senate races, drawing on data from the stats-centered news site FiveThirtyEight. FiveThirtyEight was founded by statistician and political analyst Nate Silver, who, in his forecasts earlier in the year, initially summed the probabilities of either the Democrats or the Republicans winning all their races.

Robinson took this analysis a step further by adding Monte Carlo simulation to the mix. While Silver warned in previous articles that assuming races are uncorrelated is “dubious,” and that Monte Carlo simulation requires variables to be uncorrelated, Robinson demonstrated that it is in fact very possible to include correlation in Monte Carlo analyses.

He started by creating a “lower bound” (zero correlation) and an “upper bound” (total correlation) in his model, and showed that the Democrats’ chances of retaining control fell somewhere between 41% and 50%.

### The FiveThirtyEight Approach

Fast-forward a few months, and FiveThirtyEight’s models have gotten considerably more complex and data-rich, and their interactive forecasts are updated almost daily. As of this writing, the model predicts that Democrats have a 42% chance of retaining the Senate next year.

Unlike their earlier forecasts, “they’ve also included a correlation, of a type, in their model,” says Robinson. “They do not explicitly use a correlation coefficient, as I did—instead, they change the distribution of the candidate’s lead.” Robinson explains that Silver and FiveThirtyEight introduce correlation through an additional random variable representing what they’ve labelled “national error,” which they generate and add to the mean margin of victory of every candidate.

This national error “could be a sex scandal, or some underlying and largely uncaptured sentiment in the nation,” Robinson explains. “For example, in the 2012 presidential race, it might have been Hurricane Sandy, and how presidential Obama looked in response.”

In the FiveThirtyEight forecast model, if the national error (whatever it represents) turns out to be +3 for the Republicans, they shift the mean margin of victory three points towards each and every Republican. “Unfortunately, nowhere in their post do they specify the probability distribution for this new ‘national error’ random variable,” Robinson says. “Thus it is not possible to know how correlated the individual races are with one another.”
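The national-error mechanism can be sketched directly: one shared random shift is added to every race's mean margin before each race is simulated, which is what induces correlation between races. All numbers below are hypothetical, and the national error is assumed normal purely for illustration (FiveThirtyEight, as Robinson notes, never specified its distribution):

```python
# One shared "national error" draw shifts every race's mean margin,
# inducing correlation between otherwise independent races.
import random

random.seed(1)
mean_margins = [4.0, 1.5, -0.5, -2.0]  # hypothetical mean leads (+ favors party A)
sd_race = 4.0        # independent race-level noise
sd_national = 3.0    # spread of the shared national error (assumed normal)

n_sims = 100_000
total = 0
for _ in range(n_sims):
    national_error = random.gauss(0, sd_national)   # same draw for all races
    total += sum(
        1 for m in mean_margins
        if random.gauss(m + national_error, sd_race) > 0
    )

print(f"average seats won by party A: {total / n_sims:.2f}")
```

A large national-error draw pushes all races in the same direction at once, so wave outcomes (sweeps either way) become more likely than under independence.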

### @RISK Presents an Alternative Method

Because FiveThirtyEight’s methods are not entirely clear, Robinson wanted to devise a way to arrive at these forecasts using his own statistical methods, with a correlation that is explicitly defined. Instead of just using FiveThirtyEight’s “Leader’s chance of winning,” which was only given to the nearest percent, Robinson started with the mean and (estimated) standard deviation of the margin of victory, and calculated the probability of winning by assuming that the margin of victory on Election Night was normally distributed. “Although Silver says he assumes that the victory margin is leptokurtic [has fat tails] for finding the probability of winning, he never specifies its probability distribution,” says Robinson. “I found that the standard assumption that the margin was normally distributed better matched his reported analysis.”
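Under that normality assumption, the win probability is just the chance that the margin of victory exceeds zero. A one-line sketch with hypothetical numbers:

```python
# Win probability from a normally distributed margin of victory.
from statistics import NormalDist

mean_margin = 2.5  # hypothetical mean margin of victory, in points
sd_margin = 5.0    # hypothetical standard deviation of that margin

# P(win) = P(margin > 0) under the normality assumption
p_win = 1 - NormalDist(mean_margin, sd_margin).cdf(0)
print(f"P(win) = {p_win:.3f}")
```

Equivalently, P(win) = Φ(μ/σ), so a 2.5-point lead with 5 points of uncertainty translates to roughly a 69% chance of winning.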

Robinson then built a Monte Carlo model in Excel using @RISK, treating the outcome of each race as a Bernoulli (0/1, win/lose) random variable. He introduced a correlation matrix capturing the correlation between every pair of races, and ran 27 different simulations (each one simulating 400,000 elections) for correlations ranging from 0% to 100%. His results closely match FiveThirtyEight’s, showing the probability that Democrats will retain control of the Senate as a function of the correlation among races. “Now we can say that, as long as the correlation is between 20% and 97%, the probability that the Democrats will retain control will be between 40% and 42%,” says Robinson. “The advantage of this approach is that we specify the correlation precisely, and that we conduct robustness analysis to see how the results change with the correlation.”
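Robinson's @RISK model isn't reproduced here, but the same idea (correlated Bernoulli race outcomes swept over a common correlation) can be sketched with a Gaussian copula. The race probabilities and seat threshold below are hypothetical stand-ins, not his data:

```python
# Correlated win/lose outcomes via a Gaussian copula with one common
# correlation, then a seat count per simulated election.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
p_win = np.array([0.9, 0.7, 0.55, 0.5, 0.45, 0.3])  # hypothetical race probabilities
seats_needed = 3                                     # hypothetical control threshold
n_sims = 200_000

def retain_probability(rho: float) -> float:
    """Estimate P(at least `seats_needed` wins) under common correlation rho."""
    k = len(p_win)
    cov = np.full((k, k), rho)
    np.fill_diagonal(cov, 1.0)
    z = rng.multivariate_normal(np.zeros(k), cov, size=n_sims)
    u = norm.cdf(z)                     # correlated uniforms (Gaussian copula)
    wins = (u < p_win).sum(axis=1)      # Bernoulli outcome per race
    return float((wins >= seats_needed).mean())

for rho in (0.0, 0.5, 0.99):
    print(f"correlation {rho:.2f}: P(retain) ~ {retain_probability(rho):.3f}")
```

Sweeping `rho` from 0 to 1 reproduces the robustness analysis Robinson describes: the retain probability is read off as a function of the (unknown) correlation rather than from a single assumed value.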

Interested in playing a political prognosticator? Check out our models here and run the @RISK simulation yourself.

# @RISK Takes Nate Silver’s Senate-Race Predictions a Step Further

Statistician superstar and FiveThirtyEight editor Nate Silver has gained fame for his spot-on predictions around elections; he accurately predicted all 50 states during the 2012 presidential election. While it seems that Silver has the power of precognition, he actually relies on refined statistical methods—including weighting various political polls—to make predictive models.

Recently, Silver and FiveThirtyEight put out a forecast for the 2014 Senate race which examines the races on a probabilistic basis. Silver’s analysis, which simply sums the probabilities of each side winning all its races, projects that the Democrats are slightly more likely to lose control of the chamber than to retain it.

Lawrence W. Robinson, Professor of Operations Management at Cornell University’s Johnson Graduate School of Management, took this research one step further by adding Monte Carlo simulation to the mix. Robinson set out to determine, in his words, “the probability that the Democrats hold at least 50 seats in the new Senate.” Only 50 seats are needed because Joe Biden, in his role as President of the Senate, will break ties in the Democrats’ favor. “What we really want to know is, what chance will the Democrats have to retain control?”

After using @RISK to crunch the numbers, Robinson found that the Democrats have only a 41% chance of retaining control of the Senate.

While Silver warned in previous articles that assuming races are uncorrelated is “dubious,” and that Monte Carlo simulation requires variables to be uncorrelated, Robinson demonstrated that it’s in fact very possible to include correlation in Monte Carlo analyses.

By creating a “lower bound” (zero correlation) and an “upper bound” (total correlation), Robinson showed that the Democrats’ chances of retaining control hover somewhere between the aforementioned 41% and 50%.

With the upper and lower bounds in place, Robinson went on to create a model that allows the coefficient of correlation between every pair of elections to vary between 0% and 100%, and found the probability that the Democrats will hold the Senate for each correlation coefficient value.

As Robinson says, “It would be very difficult to determine the correlations among all the different Senate races. However, if the coefficient of correlation is anywhere in the wide range between 20% and 85%, then the probability that the Democrats will retain control of the Senate (i.e., hold 50 or more seats) will be somewhere in the very narrow band of 45% ± 0.1%.”

Stay tuned as the mid-term Senate race approaches this fall—Robinson plans to model more nuanced and data-rich predictions as the election nears.