Month: August 2010

@RISK Quick Tips: Automating @RISK risk analysis software with VBA

Presented at the Palisade Risk & Decision Analysis Conference, New York City
Chris Albright, author of the book VBA for Modelers, presented a number of examples of how to automate @RISK, RISKOptimizer, and StatTools in Excel using Excel’s VBA and Palisade’s built-in object-oriented Excel Developer Kit. These examples include production applications, scheduling applications, World Series simulation, and more. All examples include macros written by Dr. Albright, so you’ll need to enable macros when you open them.

» Download the examples
» Order Dr. Albright’s book "VBA for Modelers"

Free Webcast This Thursday: “The Use of the DecisionTools Suite in Biotechnology Project and Portfolio Decision Making”

On Thursday, September 2, 2010, Svetlana A. Sigalova will present a free live webcast entitled "The Use of the DecisionTools Suite in Biotechnology Project and Portfolio Decision Making."

Given the uncertainty of outcomes in the biotech industry, consideration of variability is an inherent part of the decision process. Often, the mean (average) is not a relevant decision criterion. This is especially true for smaller biotech companies like Vertex – the opportunity costs are extremely high because scarce capital resources could be invested elsewhere, with a higher probability of a realistic return. For example, a company may reject a project that is profitable on average (positive Net Present Value) because some of the possible outcomes are unacceptable to the decision maker. Consideration of variability allows decision makers to bring their own risk tolerance into the decision. A similar argument applies when estimating a safety margin above a base case (e.g. in cost budgeting).

» Register now (FREE)
» View archived webcasts

Free Webcast This Thursday: The Use of the DecisionTools Suite in Biotechnology Project and Portfolio Decision Making

Vertex Pharmaceuticals, Inc. is a global biotechnology company based out of Cambridge, MA. The Company’s strategy is to commercialize its products both independently and in collaboration with major pharmaceutical companies. Vertex’s product pipeline is focused on viral diseases, cystic fibrosis, inflammation, autoimmune diseases, cancer, and pain.

Vertex’s strategy and analytics group, within the corporate finance division, seeks to provide senior management with a dynamic revenue and profit forecasting methodology that helps identify the types of drugs that should be developed given a finite amount of cash and resources. A traditional financial view allows the user to identify scenarios and potential outcomes, but lacks the ability to show the range of potential values within each and every outcome. Vertex’s team uses the DecisionTools Suite to establish the average outcome and the variability of outcomes, and to pressure-test the risk and uncertainty of a particular scenario throughout the decision process.

Vertex’s team built a complex financial risk analysis model using @RISK to enhance its portfolio process. Monte Carlo simulation and optimization are used to analyze and optimize project and portfolio decisions, given short- and long-term corporate strategy. @RISK is also frequently used throughout the business development process: simulating across multiple sales forecasts provides the BD team with a range of potential outcomes, making it easy to pinpoint a particular scenario on a curve, along with its probability and value. TopRank turns sensitivity analysis into a quick and seamless exercise, answering multiple what-if questions within minutes. Franchise and program leaders can now see the dollar effect of their program being delayed or advanced, adding supplementary indications to the development plan and even addressing price uncertainties all at the same time. The simple interface of PrecisionTree, along with tornado chart outputs, makes it easy to explain the effect and importance of a particular assumption or decision to an audience with no finance background.

As the company continues to grow, adding more drugs and collaborations to its development pipeline, this free live webcast will show how the DecisionTools Suite remains one of Vertex’s analytical tools of choice to enhance and guide the decision-making process.

» Register now (FREE)
» View archived webcasts

The Better to Be Believed

In his blog yesterday for Smart Data Collective, Dean Abbott makes a worthy, commonsense observation: no matter how accurate a predictive model is, it is of no use to the enterprise unless it is presented in such a way that all the decision makers understand what factors and techniques went into the analysis and why.
 
The reason the ‘best understood’ model is more effective than the ‘best’ model is that when the people with authority over a particular decision are presented with a statistical analysis that is beyond their ken, they may or may not pretend to understand it. But in any event, they are not likely to buy into the results if they can’t retell the story the model describes.
 
Take, for instance, a Monte Carlo simulation that focuses on credit risk analysis for a particular loan. Everyone in the line of authority will be held responsible for the real-world outcome of what the Monte Carlo software describes in the Excel spreadsheet. And if you are one of these decision makers, how can you take responsibility for something you may not quite understand?
 
The problem of acceptance of a predictive model presents the analyst with a tough question: Do I present the model that I know is true and statistically accurate? Or do I present a ruder, cruder analysis that tells a story that can be immediately understood?
 

Abbott suggests a compromise: streamline your plot by masking (Abbott says "removing") fields that contribute to the robustness of the analysis but involve statistical twists and turns that are distracting to decision makers who may not be fascinated with technique and just want to see how the story turns out. This, he explains, allows you to work from a model both you and the decision makers can believe in.

Your thoughts? 

Are solar panels a sound investment? A risk analysis case study

The UK’s new coalition government has said that, as part of its ‘Green Deal’, it will encourage home energy efficiency improvements paid for by savings from energy bills. It seems likely that, in the year that energy regulator Ofgem warned of 20 percent electricity price hikes by 2020, this initiative will include solar panel technology.

Currently the UK still lags behind many other countries in Europe and the rest of the world when it comes to harnessing solar power. Not only do we have fewer hours of sunshine than many regions, but there is also a lack of clarity as to the ‘payback’ time when it comes to users seeing a return on investment.

This is where Palisade customer Tioga Energy, based in California, makes an interesting case study. Whilst it may seem unfair to compare the UK with the west coast of America when talking about solar-related matters, the sunnier climate does not reduce the need to prove ROI for customers with solar energy agreements.

Tioga Energy provides project financing through its solar Power Purchase Agreements (PPAs), and maintains and operates solar systems on behalf of its customers. Tioga’s offering delivers predictably priced power and enables organisations both to ‘green’ their operations and to reduce energy costs. To illustrate the benefits of solar, estimating future electricity prices and making comparisons by showing the savings from a new solar system, Tioga enlisted the help of @RISK for risk analysis solutions.

To forecast possible price increases, Tioga Energy inputs California’s historical electricity rate data into a quantitative risk analysis model developed using @RISK. This generates a probability distribution for electricity rate rises over the 20-year PPA period, which shows that there is a 25 percent likelihood that price increases will be less than 4.8 percent, and a 25 percent chance that rate rises would be more than 8.7 percent.
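Outside @RISK, the underlying bootstrap idea can be sketched in plain Python. This is an illustration only, not Tioga's model: the historical figures below are invented, standing in for California's actual rate data, and the quartile outputs will not match the 4.8/8.7 percent figures above.

```python
import random
import statistics

# Illustrative annual electricity-rate increases (%); invented figures,
# standing in for California's historical rate data.
historical_increases = [3.1, 5.6, 7.2, 2.4, 9.8, 4.5, 6.7, 8.1, 3.9, 5.2]

def simulate_avg_increase(years=20):
    """One trial: bootstrap-resample an annual increase for each year of the
    PPA and return the average annual rate rise over the contract."""
    return statistics.mean(random.choice(historical_increases) for _ in range(years))

def quartiles(trials=10_000):
    """25th and 75th percentiles of the simulated average rate rise."""
    sims = sorted(simulate_avg_increase() for _ in range(trials))
    return sims[trials // 4], sims[3 * trials // 4]

low_q, high_q = quartiles()
print(f"25% chance the average rise is below {low_q:.1f}%")
print(f"25% chance the average rise is above {high_q:.1f}%")
```

The quartiles of the simulated distribution are what let a customer ask "how likely is it that rates rise faster than my PPA escalator?"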

The @RISK risk analysis model therefore helps Tioga Energy evaluate the likelihood that a customer will save money for a variety of PPA scenarios (i.e. the rate at which electricity would initially be charged and the amount by which it would then increase each year). It also calculates the magnitude of savings for the different combinations of first year costs and subsequent rises. Consumers are therefore able to better understand the pricing and make an informed decision about whether to sign up for a PPA.

Using historical data and @RISK’s risk modelling capabilities, Tioga offers consumers a robust view of the potential benefits of a solar PPA. This enables them to hedge against rising electricity rates, as well as feel confident that they are playing a part in tackling global warming.

» Read the Tioga Energy case study

Craig Ferri
EMEA Managing Director of Risk & Decision Analysis

Market decline versus speed to market – ‘A bird in the hand…’

I recently saw an interesting @RISK cashflow model from the portable phone industry. It modeled the uncertainty in the length and decline of overall market demand for a particular technology against five strategies for getting various application products to market as soon as possible. 

Using @RISK’s Simtable function, combined with Excel’s Index function, it was possible to run multiple simulations and see which strategy could take best advantage of the potential market, given the uncertainties in the development process, the possibility of competitors, the market take-up and the margins that might be achieved.
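The underlying logic, running one simulation per strategy and comparing the results, can be sketched in plain Python. The strategies, demand figures, and decline rates below are all invented for illustration; the real model's inputs were, of course, proprietary.

```python
import random
import statistics

# Invented strategies: (months to get product to market, margin per unit)
strategies = {"fast, lower margin": (6, 10.0),
              "slow, higher margin": (18, 15.0)}

def mean_npv(months_to_market, margin, trials=5_000):
    """Average simulated cash flow for one strategy: uncertain market
    lifetime, initial demand, and monthly demand decline; a later market
    entry captures less of the declining market."""
    results = []
    for _ in range(trials):
        lifetime = random.uniform(36, 72)        # market window, in months
        demand0 = random.uniform(8_000, 12_000)  # initial monthly unit demand
        decline = random.uniform(0.03, 0.08)     # monthly demand decay rate
        cash = sum(margin * demand0 * (1 - decline) ** m
                   for m in range(months_to_market, int(lifetime)))
        results.append(cash)
    return statistics.mean(results)

for name, (months, margin) in strategies.items():
    print(f"{name}: mean simulated cash flow {mean_npv(months, margin):,.0f}")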

As is often the case in all aspects of life, the simulation revealed that ‘a bird in the hand is better than two in the bush’; it’s very comforting to know that @RISK risk analysis solutions can cut through loads of detail and come back with an answer that echoes received wisdom!

Ian Wallace, ACMA
Palisade Training Team

@RISK Quick Tips: Event and Operational Risk Analysis

@RISK risk analysis software using Monte Carlo simulation is used for a wide variety of applications. In this model, we have an example of a general usage to address Operational Risk.

In many circumstances one wishes to calculate the aggregate impact of many possible yes/no type events. For example, it is often important to answer questions such as "What is the loss amount that will not be exceeded in 95% of cases?" @RISK simulation can be used to answer such questions.
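The mechanics behind such a model can be sketched in plain Python (this is an illustration of the technique, not the @RISK example itself; the event probabilities and loss amounts below are invented):

```python
import random

# Invented portfolio of yes/no risk events:
# (probability the event occurs, loss amount if it does)
events = [(0.05, 1_000_000), (0.10, 250_000), (0.20, 100_000), (0.02, 5_000_000)]

def simulate_total_loss():
    """One trial: each event independently occurs or not; sum the losses."""
    return sum(loss for p, loss in events if random.random() < p)

def loss_percentile(trials=10_000, pct=0.95):
    """The loss amount not exceeded in `pct` of trials (e.g. the 95th percentile)."""
    losses = sorted(simulate_total_loss() for _ in range(trials))
    return losses[int(pct * trials) - 1]

print(f"95% of trials lose no more than {loss_percentile():,.0f}")
```

In @RISK the same yes/no events would typically be modelled with binomial (0/1) distributions multiplied by loss amounts, with the percentile read off the simulated output distribution.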

» Download the example: EventandOperationalRisks.xls

Rating the Polls

With the New York State primaries coming up September 14 and the general election on November 2, I predict that as soon as summer turns the corner into September, we’ll start hearing lots and lots about polls that predict election outcomes.  To find out if there was any early discussion of polls, polling, and outcomes, I returned to my favorite election forecast site from the 2008 presidential elections, FiveThirtyEight: Politics Done Right.

 
Sure enough, there it was, a comparative rating of pollsters. This will give people like me, who tend to believe any poll just because it’s covered in the news, a way to assess poll reliability. FiveThirtyEight is the brainchild of Nate Silver, and 538 is the number of members of the Electoral College. Silver’s primary business is Baseball Prospectus, which is also fueled by Monte Carlo simulation and other risk analysis techniques, but FiveThirtyEight has done well enough for the New York Times to want to incorporate it in its online coverage during the coming elections.
 
Silver’s grasp of statistical analysis becomes immediately evident when you go to his page on the pollsters, and he’s more than happy to discuss the statistical methods he uses to rate the pollsters–regression analysis of raw data, Monte Carlo software in an Excel spreadsheet, weighting of poll performance data, and so forth. His take on these matters may be of practical interest to any of you who use these techniques in financial risk analysis.

Elections are all about decision making under uncertainty, especially voter decisions under uncertainty, and according to Nate Silver, only polls taken within 21 days of an election are reasonably reliable.  So when the national campaigns are ramping up in October, keep one eye on the polls and one on FiveThirtyEight.  

Graphing with your Mouse – Part III: Scatter Plots from Tornado Diagrams in @RISK

You can quickly generate a graph of any simulated output relative to a given input using the tornado graphs in @RISK risk analysis software. Just drag a bar from the tornado graph onto a blank space, and a scatter plot of that input relative to the simulated output appears. This is a great communication tool when you’re trying to assess the most important risk variables in your risk analysis model and want to “zoom in” on a particular input to see what’s going on. It can really help you understand the relationships in your model.

See this video for a quick demo:

» View "Getting Started in @RISK" video tutorials

Is @RISK a forecasting tool or a decision-making tool?

Most people understand that @RISK and Monte Carlo simulation are designed to be an improvement on single-point estimates.  In practice, however, I often see people using @RISK as a forecasting tool to get yet another single-point estimate, such as the 90th percentile, without putting it into the context of the potential range of outcomes.

This is probably the difference between forecasting and decision-making. The former tends to focus on historical or observed trends and on developing specific scenarios (e.g. best, most likely, worst) based on expert opinion, while the latter is concerned with confidence ranges and likelihood.

Indeed, it’s not until you add probability, as with @RISK, that you start to measure the quality of your forecasts (i.e. your confidence level) and calculate the margin of error – something that’s crucial in all walks of life!
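As a plain-Python illustration (with an invented three-line-item cost model), here is the difference between reporting a single 90th-percentile figure and placing that same figure in the context of the full range:

```python
import random
import statistics

# Invented project-cost model: three uncertain line items summed per trial.
def one_trial():
    return (random.triangular(80, 140, 100)  # labour: low, high, most likely
            + random.gauss(50, 10)           # materials
            + random.uniform(10, 40))        # contingency items

trials = sorted(one_trial() for _ in range(10_000))

def pct(p):
    """Cost not exceeded in fraction p of trials."""
    return trials[int(p * len(trials)) - 1]

# The single-point "forecast" ...
print(f"90th percentile cost: {pct(0.90):.0f}")
# ... versus the decision-making view: the same number in context.
print(f"10%-90% confidence range: {pct(0.10):.0f} to {pct(0.90):.0f}")
print(f"mean {statistics.mean(trials):.0f}, std dev {statistics.stdev(trials):.0f}")
```

The second and third lines of output are what turn the estimate into a statement about confidence and margin of error.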

In my opinion, therefore, @RISK is much more of a decision-making tool than a forecasting tool.  Both involve trying to predict the future but the addition of probability gives decision-makers vital insight to a problem. 

Don’t you just love semantics!

Ian Wallace, ACMA
Palisade Training Team