Month: March 2009

Real World Gamesmanship

A few years ago, Gamasutra, the online magazine for computer game developers, published an article by game developer Alan Carpenter extolling the virtues of using risk analysis to balance role-playing games.  In this case, balance meant designing a game that is neither too difficult nor too easy, a costly, time-consuming goal for game development companies (costly meaning an average of $3 million a title!).

Carpenter had observed that by using the same operational risk software the oil and gas industry uses to make decisions under uncertainty, a game designer could take advantage of the thousands of possible scenarios spun out by a Monte Carlo simulation. This adds a new element of reality to games in which context and emotion may be big draws for the gamer but cannot, by themselves, sustain entertaining play.

Carpenter advises game developers that many games could be designed using the same Monte Carlo Excel spreadsheet.  He offers a lengthy technical blueprint for games based on conflict (war, street fights, and the like), and in revisiting it, what intrigues me is the idea that the same infrastructure of algorithms and probability functions Carpenter lays out to capture events in an imaginary world could just as easily be applied to real-world political and military events.
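Carpenter's actual spreadsheet isn't reproduced here, but the core idea (score a tuning of hit points and damage ranges by simulating thousands of fights) can be sketched in a few lines. Everything below, the damage ranges, hit-point values, and win-rate target, is a made-up illustration, not Carpenter's model:

```python
import random

def simulate_battle(player_hp, enemy_hp, player_dmg=(5, 15), enemy_dmg=(4, 12), rng=random):
    """One battle: combatants trade uniformly random damage until one falls."""
    hp_a, hp_b = player_hp, enemy_hp
    while True:
        hp_b -= rng.uniform(*player_dmg)
        if hp_b <= 0:
            return True           # player wins
        hp_a -= rng.uniform(*enemy_dmg)
        if hp_a <= 0:
            return False          # enemy wins

def win_probability(trials=10_000, seed=42, **kwargs):
    """Monte Carlo estimate of the player's win rate for a given tuning."""
    rng = random.Random(seed)
    wins = sum(simulate_battle(rng=rng, **kwargs) for _ in range(trials))
    return wins / trials

# A designer might tune enemy_hp until the estimate lands near a target
# (say, 60% player wins) so the fight is neither trivial nor hopeless.
p = win_probability(player_hp=100, enemy_hp=100)
```

With thousands of trials per tuning, the designer sees the whole distribution of fight outcomes rather than a single averaged guess, which is exactly the advantage Carpenter borrows from operational risk analysis.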

If risk simulation is being used to help plan world events, it’s not widely talked about.  But I suspect this is going on, and I would love to hear from anyone out there who knows more about this than I do.  

The Finance Guy

Up here in New York, where everyone is eyeing the market anxiously and many are still muttering woulda-shoulda recriminations about risk assessment, a gust of fresh air just blew in. It came up from, of all places, Florida, where the real estate market has been reduced to rubble and there are thousands of empty hotel rooms.  But even in this setting, Dave Eller, The Finance Guy, is optimistic about the future of risk analysis and its essential role in what he calls the "stewardship" of resources. When it comes to decision evaluation, Eller is high on the goodness factor in decision makers.

The Finance Guy has been building simulations since before the advent of dedicated Monte Carlo software and has had extensive experience in large corporations, particularly in oil and banking. Two years ago, writing for the hospitality industry, he predicted today’s financial turmoil. "I’m not a doomsayer, not at all," he told me, "but I could see it coming. Now the typhoon has struck, and we’re sitting on the beach." There’s no need to blame this on risk simulation, he says, just bad simulation based on self-serving assumptions.

On the beach or not, no need to despair. "The country is going through a cleansing process. It’s almost biblical," he says about the many businesses working through the results of many bad decisions. "The heart of this cleansing process is making good decisions–stewardship of business resources." By his definition, good decisions involve people of principle who use detailed, objective models that tell them how to factor cost and risk into operations management. "Stewardship is very simple," he declares, repeating the mantra his clients know well, "Cut costs, reduce risk, make money." Cut costs, reduce risk, make money.  Cut costs, reduce risk, make money.

Further Successes with DecisionTools Suite training in Europe

Following the introduction of the DecisionTools Suite 5.0 in 2008, we modified the structure of the public training courses, which now consist of a two-day @RISK course followed by a one-day course on the other DTS products.  Although this requires a major commitment from customers in terms of time away from the office, the decision to structure the course this way has worked well in Europe. First, it allows customers to choose explicitly whether they wish to learn only @RISK, only the non-@RISK products, or the full set. Second, there are certainly enough problem-solving applications that can be addressed with the software to use the time productively in a value-added way. Third, we generally run the courses Tuesday through Thursday, so that people can be in their offices at the beginning and end of the week. By staying in touch with their offices during the course (e.g. using the e-mail facilities provided to customers for our courses based in the London office), most participants are able to take the time out without excessive disruption to their work. All the customers I spoke to who attended all three days reinforced the feeling that this three-day approach is appropriate and value-added, and that it allows customers to sign up for only those parts of the course they find most useful, whether pure @RISK, the non-@RISK DTS products, or the full Suite.

More details of the courses can be found on the Palisade web-site. As a reminder of the contents of the non-@RISK day (day 3), we cover TopRank as a tool to audit and conduct sensitivity analysis on general Excel models, and PrecisionTree to build decision trees for making decisions under uncertainty. We then move on to optimization modelling using the genetic algorithm optimization methods of Evolver and RISKOptimizer. Finally, we use StatTools to conduct a variety of statistical analyses and NeuralTools to perform predictive analysis based on neural network logic. As always, the course is built around practical examples and applications.

View the training schedules:

» Europe, Africa and the MidEast
» North America
» Brazil
» Latin America
» Asia-Pacific

» Live Web Training

Dr. Michael Rees
Director of Training and Consulting

March without the Madness

Two behavioral economists at the Wharton School have recently published a statistical analysis of win-loss patterns in NCAA basketball games.  Jonah Berger and Devin Pope collected data from more than 6,500 games since 2005, and guided by a psychological/economic model called Prospect Theory, they cranked their win-loss data through a specialized regression analysis (probably not a standard offering in Microsoft Excel statistics).  Then they replicated this modeling with data from a lab experiment using keyboard striking instead of basketball moves.

Their primary finding from both datasets?  Losing can be a powerful motivator.  Both basketball teams and keyboard strikers were apt to move from a position of slight disadvantage to winning.  In the case of the basketball scores, the teams that were one point behind at halftime were more apt to win than teams that were one point ahead at halftime.

Because Prospect Theory addresses the problem of decision making under uncertainty (in fact, decision making in which all the alternatives involve uncertainty), I began to wonder if March Madness coaches could use Monte Carlo simulation to strengthen their strategy.  Maybe the coaches could use the same NCAA data and Berger and Pope’s results to run various point-up, point-down scenarios through their Monte Carlo software.  This would allow them to prepare strategies in advance for any number of halftime scenarios.
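As a purely hypothetical sketch of such a scenario run: model the second-half scoring swing as random noise around the halftime margin, give the one-point-down team a small "motivation bump" in the spirit of Berger and Pope's finding, and estimate win probabilities by simulation. The swing and bump figures below are invented, not taken from the study:

```python
import random

def second_half_win_prob(halftime_margin, trials=20_000, seed=7,
                         swing_sd=11.0, motivation_bump=2.5):
    """Estimate a team's chance of winning given its halftime margin.

    The second-half scoring swing is modeled as normal noise (swing_sd is
    a made-up figure), and motivation_bump is a hypothetical extra expected
    margin for a team trailing by exactly one point, echoing Berger and
    Pope's finding that a slight deficit can motivate.
    """
    rng = random.Random(seed)
    bump = motivation_bump if halftime_margin == -1 else 0.0
    wins = sum(
        halftime_margin + bump + rng.gauss(0.0, swing_sd) > 0
        for _ in range(trials)
    )
    return wins / trials

down_one = second_half_win_prob(-1)   # trailing by one at the half
up_one = second_half_win_prob(+1)     # leading by one at the half
```

Under these invented assumptions, the simulated down-one team edges out the up-one team, reproducing the paper's counterintuitive pattern; a coach could rerun the same model for any halftime margin.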

This kind of risk analysis could certainly rationalize game strategy.  But then March would be March without the Madness, wouldn’t it?

Expert Advice is Needed in Tough Times

When the economy takes a turn for the worse, tough times call for cutbacks. Cutbacks might include extras you don’t really need, goods that haven’t yet outlived their usefulness, or services you can perform yourself. Just because you can – and perhaps should – do without some of the ‘luxuries’ does not mean everything should be cut from your budget. Expert advice is one of those areas – and it doesn’t have to break the bank.

Weighing important decisions demands advanced analytics and informed insights to uncover the value of one choice over another. With quantitative analysis, you can gain valuable insights into underlying risks and their implications, and from there formulate effective responses to those risks. Statistical analysis in Microsoft Excel is useful, but without the ability to account for risk and uncertainty, a static model adds limited value.  George Box’s adage that “all models are wrong, but some are useful” is an apt caution for static analysis. An essential extension to the basic Excel model is quantitative risk analysis.
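To make the contrast concrete, here is a minimal sketch (in Python rather than Excel) of the same profit formula computed first as a static point estimate and then under uncertainty via Monte Carlo simulation; all the distributions and thresholds are invented for illustration:

```python
import random
import statistics

# Static model: a single-point profit estimate, Excel-style.
units, price, unit_cost = 1000, 50.0, 30.0
static_profit = units * (price - unit_cost)      # one number, no view of risk

# Risk model: the same formula with uncertainty on demand and unit cost.
# The distributions below are illustrative assumptions, not calibrated inputs.
def simulate_profit(trials=10_000, seed=1):
    rng = random.Random(seed)
    profits = []
    for _ in range(trials):
        u = rng.gauss(1000, 150)                 # uncertain demand
        c = rng.uniform(27.0, 36.0)              # uncertain unit cost
        profits.append(u * (price - c))
    return profits

profits = simulate_profit()
mean_profit = statistics.mean(profits)
downside = sum(p < 10_000 for p in profits) / len(profits)  # chance profit falls short
```

The static model reports one optimistic number; the simulation reports a mean and, more usefully, the probability of an unacceptable outcome, which is precisely the information a static spreadsheet cannot give.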

Palisade’s training and consulting services are cost-effective ways to extract more from your risk analysis software. In short order you can gain better control over your simulation models through training, or you can work with the experts who developed these tools to aid you in efficient modeling of your given situation. Bespoke models provide you with information relevant to your organization. Add efficient application of @RISK (analysis with Monte Carlo simulation), RISKOptimizer (Monte Carlo simulation with optimization for decision making under uncertainty), or PrecisionTree (decision analysis in Microsoft Excel using decision trees and influence diagrams) and you get much more. Then you’ll find out just how useful models can be.

Thompson Terry
Palisade Training Team

The Innovation Imperative in Manufacturing

In a recent report, The Innovation Imperative in Manufacturing: How the United States Can Restore Its Edge, produced jointly by The Boston Consulting Group, the National Association of Manufacturers, and the Manufacturing Institute, the United States ranked 8th out of 110 countries in innovation leadership.

The Top Ten List (overall)

  1. Singapore
  2. South Korea
  3. Switzerland
  4. Iceland
  5. Ireland
  6. Hong Kong
  7. Finland
  8. United States
  9. Japan
  10. Sweden

The first half of this report painted a pretty bleak picture for the US.  Among the depressing statements:
“The United States is losing its distinction as an innovation leader and may be under-investing in the future.”

 “The United States is disadvantaged in several key areas, including workforce quality and economic, immigration and infrastructure policies.”
Some companies are even “moving R & D centers abroad to capitalize on leading-edge talent and lower cost scientists and engineers or to better meet local market needs . . .”

This information is difficult to swallow, but it clearly makes the point:

It’s time for change.

In the report, they actually use the phrase “innovate or die!”

Even without the current economic crisis, we know we have issues and challenges at many levels, and if this report helps spur our companies and government into action, then I’m all for it.

The second half of the report does a nice job detailing what the authors consider to be the Four Key Factors for Success:

  • Idea generation
  • Structured Processes
  • Leadership
  • Skilled Workers

 
The researchers feel the US government plays a vital part in encouraging innovation: its role is to boost companies’ payback on innovation through consistent programs, such as supporting innovation activities in government-funded laboratories and research labs. Tax credits are another common approach, but these are deemed more of a “thank you” than a motivator, and are often inconsistent from year to year.

Although they state that some recommendations are beyond the scope of the report, they suggest the US make concrete improvements in six areas:

  1. Strengthen the workforce
  2. Lead by example
  3. Make innovation easier
  4. Maintain a strong manufacturing base
  5. Improve the payback
  6. Be consistent

Although I feel there is a lot of good information to be learned from the report, one should keep in mind that the bulk of the information was gathered through a NAM survey of corporate members with only ~1,000 respondents and a series of 30 one-hour interviews with “senior executives.” It’s truly hard to know how representative the sample is of the actual population.

To reiterate, we know that innovation, quality, and jobs in the US have been on the decline for years. It’s time to act. Adopt an innovation, product development, or quality program such as Design for Six Sigma (DFSS), Design for Lean Six Sigma (DFLSS), or Critical Parameter Management (CPM), or whatever you want to call it, deploy it, and stick to it! As I may have mentioned before, many companies I have worked with that claimed successful deployment never really got off the ground floor.
 

You can read the entire report at www.nam.org/~/media/AboutUs/ManufacturingInstitute/innovationreport.ashx

Some Best Practice Principles in Excel Modelling

This post briefly lists some fairly standard (but not fully accepted, and more often simply not implemented!) “best practice principles” in Excel modelling. A later post discusses a related topic: whether risk modelling (building Monte Carlo simulation models using @RISK in Microsoft Excel) requires the same or a modified set of principles.

The following principles are generally to be applied to Excel models (in fact, in practice, many of these may need to be varied or interpreted in a slightly different way than it might seem at first; my book Financial Modelling in Practice discusses some of these issues):

  • The model should be objectives driven, that is, it supports the decision-making situation; its structure and allowed sensitivities should be aligned to the decisions that will be taken with it
  • It should be kept “as simple as possible, but no simpler” (to paraphrase Einstein)
  • Error-checks should be built in. For example, the same quantity could be calculated in two different ways and the difference between the calculations (which should always be zero) could be set as an output. Excel DataTables (or Palisade’s TopRank) could be used to check that the error is always zero under a wide range of input scenarios (when using DataTables, conditional formatting of the cells can be used to highlight any non-zero values)
  • It should have a modular structure with related calculations kept as close together as possible
  • It should be compact, with no linked workbooks, and as few worksheets as possible; the total length of all audit trails in the model should be minimized.
  • There should be a clear logical flow (usually left-to-right, top-to-bottom), with no “mixed” formulae – every numerical quantity is either a number or a calculation
  • There should be short, transparent calculation steps (that can be understood within a short space of time)
  • Formatting should be used to highlight the structure and flow of the model (e.g. borders around the modules, bold text, colour-coding, shading, “significant figure” rule for the number of decimals, etc.)
  • There should be no circular references (some very limited exceptions apply)
  • Named ranges should be used selectively, not excessively
  • Adequate documentation should be provided: key assumptions, limitations or key restrictions on the logic
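The error-check principle above translates directly outside Excel. A minimal Python sketch, assuming a toy profit model: compute the same total by two independent routes and sweep scenarios to confirm the difference stays at zero:

```python
# "Calculate it twice" error check: total profit computed by two
# independent routes should agree in every scenario, and the difference
# becomes a monitored output.
def profit_top_down(prices, costs, volumes):
    revenue = sum(p * v for p, v in zip(prices, volumes))
    cost = sum(c * v for c, v in zip(costs, volumes))
    return revenue - cost

def profit_bottom_up(prices, costs, volumes):
    return sum((p - c) * v for p, c, v in zip(prices, costs, volumes))

def error_check(prices, costs, volumes, tol=1e-9):
    """The two routes should differ by ~0; a non-zero value flags a bug."""
    return abs(profit_top_down(prices, costs, volumes)
               - profit_bottom_up(prices, costs, volumes)) < tol

# Sweep input scenarios, as one would with an Excel DataTable or TopRank.
scenarios = [
    ([10.0, 20.0], [6.0, 11.0], [5, 3]),
    ([0.0, 99.9], [0.0, 50.0], [1, 1]),
    ([7.5], [2.5], [1000]),
]
all_ok = all(error_check(p, c, v) for p, c, v in scenarios)
```

In a spreadsheet the equivalent is a dedicated check cell holding the difference, with conditional formatting to flag any non-zero value across a DataTable sweep.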

Many of these issues apply in risk modelling with @RISK (risk analysis using Monte Carlo simulation software add-in for Microsoft Excel, for decision making under uncertainty), but some additional points may require consideration, as discussed in another posting.

Dr. Michael Rees
Director of Training and Consulting

Wetware vs. Software

I have just smushed up against a brand new piece of computer jargon: wetware.  Also referred to as meatware, it is the living analog of software: the biological processes involved in cognition.

Where, you may reasonably wonder, did I encounter this attractive term?  In a recent article in Wired that more or less announced the victory of machine intelligence over the human mind in the ancient game of Go.  I say more or less because, unlike the victory of Deep Blue over Garry Kasparov in chess, the computer needed a handicap against the professional human player.  Under these conditions, artificial intelligence has beaten the human kind in six games.

What it took for a computer to achieve these victories over wetware was Monte Carlo software, the same statistical analysis software used in so many everyday situations that involve decision making under uncertainty: risk analysis for financial transactions, operations management in engineering, exploration and production in natural resources, and product strategies.

While the article made much of the power of artificial intelligence, it did not mention that the simulations produced by Monte Carlo software require keen wetware to produce good decision evaluation.  Nevertheless, it went on to confidently predict that complete domination of Go by computers is close at hand.
 
Taking the jargon for machine substitution for humans a little further, suppose software does conquer wetware in the ancient game. Then what? Go-bots? 
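The Go programs in question score moves with Monte Carlo playouts (in practice inside a more sophisticated tree search). The toy sketch below shows the flat-playout version of the idea applied to one-pile Nim rather than Go, since Go itself is far too big for a blog-sized example; the game, playout counts, and seed are all illustrative:

```python
import random

def playout(pile, rng):
    """Finish a game of one-pile Nim (take 1-3 stones; taking the last
    stone wins) with both sides moving uniformly at random. Returns True
    if the player to move at the start of the playout wins."""
    to_move_wins, turn = None, 0
    while pile > 0:
        take = rng.randint(1, min(3, pile))
        pile -= take
        if pile == 0:
            to_move_wins = (turn % 2 == 0)
        turn += 1
    return to_move_wins

def best_move(pile, playouts=2000, seed=3):
    """Flat Monte Carlo move selection: try each legal move, estimate the
    resulting win rate by random playouts, pick the highest."""
    rng = random.Random(seed)
    scores = {}
    for take in range(1, min(3, pile) + 1):
        remaining = pile - take
        if remaining == 0:
            scores[take] = 1.0            # taking the last stone wins outright
            continue
        # After our move the opponent moves, so we win when they lose.
        wins = sum(not playout(remaining, rng) for _ in range(playouts))
        scores[take] = wins / playouts
    return max(scores, key=scores.get)
```

Even with purely random playouts and no game knowledge, the win-rate estimates steer the program toward the mathematically correct move (leaving the opponent a multiple of four stones), which is a miniature of how Monte Carlo methods gave computers a foothold in Go.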

Advanced Analytics for Business Intelligence

Business Intelligence (BI) is all the rage. Businesses want business intelligence. Analytics and reports are at the heart of BI. Decision makers want decision intelligence. Analysis, especially quantitative risk analysis and Monte Carlo simulation, yields more thorough intelligence for effective decision making under uncertainty.

Some, Ralph Kimball among them, contend that advanced analytical tools, “as powerful as they are, can be understood and used effectively only by a small percentage of the potential … business-user population.” What’s missing from the assertion is a recognition that data are about what has already happened. If you’re forecasting or planning strategically, you need predictions moving forward. It’s not just about data mining; it’s how you employ the data to make effective and well-informed decisions.

According to one BI developer, “use of advanced analytics has been limited to power analysts,” leaving most users with the reporting capabilities at the bottom of the technology pyramid. But risk and decision analysis tools from Palisade Corporation are not just for power analysts; anyone can make ready use of these Monte Carlo software tools. Indeed, several large corporations make @RISK accessible to both the business and production sides of the organization.

@RISK, and each of the DecisionTools Suite applications, has built-in reporting capabilities to communicate the analyses you’ve performed. In @RISK 5, you can even have your reports generated immediately after completing a simulation (see the Simulation Settings dialog), whether those reports are in standard form and layout or customized templates. Any statistic you can derive from a data set can be reported in easy-to-use Excel spreadsheets. Quants can use the reports to communicate the risk analytics behind a strategic decision, and those on the front lines can generate a quick assessment of uncertainty in operational actions.

Thompson Terry
Palisade Training Team

Recalculating the Calculators

In an article this week in the online publication TheStreet, Taylor Smith takes aim at online calculators for retirement planning.  He says the problem with them is that most are based on Monte Carlo software, and then goes on to make some broad and inaccurate statements about the characteristics of Monte Carlo simulation and the ways it skews risk assessment.

For starters, he quotes an investment "expert," who is, coincidentally, president of a concern that produces risk analysis software not driven by Monte Carlo methods, to the effect that good financial planning should take into account more than market returns, including taxes, income, and expenses.  This implies that Monte Carlo simulation is not capable of factoring these elements into its predictions.  Nothing, as a recent risk analysis model offered by both the Society of Actuaries and the Casualty Actuarial Society aptly demonstrates, could be farther from the truth.  This model, developed by professors of finance and mathematics at Illinois State University and the University of Illinois at Urbana-Champaign, helps pension and insurance planners forecast not only projected market returns but also the effects of such critical economic factors as interest rates, equity price levels, inflation and unemployment rates, and real estate prices.

My point is that a Monte Carlo simulation is what you make it.  It can be very simple and limited to one or two economic factors, or it can be a complex mix of many factors.  If the online retirement calculators are too simplistic to usefully account for reality, their builders have plenty of room to improve them.
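As a sketch of what "a complex mix of many factors" can look like, the toy simulation below folds market returns, inflation-driven spending, and a crude tax drag into a single success probability. Every figure in it (the return distribution, tax rate, and spending levels) is an invented illustration, not financial advice and not any particular calculator's model:

```python
import random

def retirement_success_prob(balance=500_000, years=30, spend=40_000,
                            trials=5_000, seed=11):
    """Probability the nest egg lasts, under illustrative assumptions:
    normal annual returns (7% mean, 12% sd), normal inflation (3% mean,
    1% sd) scaling spending, and a flat tax drag on positive returns."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        b, s = balance, spend
        ok = True
        for _ in range(years):
            r = rng.gauss(0.07, 0.12)
            r -= max(r, 0.0) * 0.15          # crude tax on positive returns
            s *= 1 + rng.gauss(0.03, 0.01)   # inflation-adjusted spending
            b = b * (1 + r) - s
            if b <= 0:
                ok = False
                break
        successes += ok
    return successes / trials

p_success = retirement_success_prob()
```

A one-factor calculator would project a single compounding curve; this version instead answers the question retirees actually care about, "what are the odds my money outlasts me?", and adding further factors is a matter of adding lines, not changing methods.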