Day: April 15, 2010

20 Questions in a New Orbit

An Ottawa toy developer is trying to make a jet-propelled leap from an online game to space travel. His vehicle? A neural network designed as the back-end system for a game of 20 questions. Twelve years ago Robin Burgener wrote a neural net program that trains on the sequences of player responses to the questions it poses, beginning with Animal? Vegetable? Mineral?
The game does more than pose simple yes-or-no questions to lead you to a conclusion. The neural network algorithm can pose different questions in different orders, and it gets the right answer about 80 percent of the time.
Now, apparently, the sky’s the limit for Burgener’s neural network. He was scheduled to make a presentation late last month at the Goddard Space Flight Center explaining the potential uses for a neural-network-driven game of 20 questions on board a spacecraft. These uses center broadly on troubleshooting technical and equipment problems and anticipating future ones.
If, as he claims, his neural net guessing program can work around responses that are misleading or downright lies, then for space travelers, he concludes, "if a sensor fails, you’re able to see past it."

I know what he means, I think, but I don’t tend to look past sensors myself.
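Burgener's claim about tolerating lies can be illustrated with a toy sketch. To be clear, this is not his actual 20Q network (that is proprietary); it is a minimal, hypothetical scoring scheme in which candidates are ranked by how well they agree with the answers rather than eliminated outright, so one lie (or one failed sensor) is outvoted by the remaining answers.

```python
# Toy illustration (NOT Burgener's 20Q network): score candidates by
# agreement with all answers so far, instead of eliminating a candidate
# the moment one answer contradicts it. A single lying answer, or a
# single failed sensor, then lowers a score without ruling anything out.

# Each candidate maps a question to its expected yes/no answer.
# The objects and questions here are invented for the example.
CANDIDATES = {
    "cat":     {"alive": True,  "flies": False, "metal": False, "sings": False},
    "sparrow": {"alive": True,  "flies": True,  "metal": False, "sings": True},
    "drone":   {"alive": False, "flies": True,  "metal": True,  "sings": False},
}

def best_guess(answers):
    """Return the candidate that agrees with the most answers."""
    def score(traits):
        return sum(traits[q] == a for q, a in answers.items())
    return max(CANDIDATES, key=lambda name: score(CANDIDATES[name]))

# The "alive" answer below is wrong for a drone (a lie, or a bad sensor),
# yet "drone" still wins because it matches the other three answers.
answers = {"alive": True, "flies": True, "metal": True, "sings": False}
print(best_guess(answers))  # -> drone
```

A real neural approach learns which questions are informative from play data; the point here is only that scoring, unlike hard elimination, lets a system "see past" one bad input.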

Robust Risk Analysis for the Time/Expertise Poor – Part 2

In my last blog I introduced the idea of a customised risk analysis solution to problems commonly faced in project risk management, especially cost estimation. Of course this idea is not uniquely applicable to project costs, but this paradigm is the simplest to explore, and that’s what I’m about to do.

Picture a risk register in a worksheet that has been created at a macro level to encapsulate most (all?) of the risks your projects may face. For any given project only a subset of these will be relevant – what is the best way to get these risks into a risk model on the next worksheet? By pressing a button, of course! It is almost trivial to write code that picks up all selected risks and places them and the relevant data fields in the model worksheet. Sure beats manually copying and pasting individual line items and the transcription errors that follow.
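The "pick up the selected risks" step is simple enough to sketch. The register below and its field names are hypothetical (an actual workbook implementation would be VBA reading worksheet rows), but the macro's logic is just filter-and-copy:

```python
# Hypothetical sketch of the "press a button" step: gather the risks a
# workshop flagged as relevant and copy their data fields into the model.
# Field names ("selected", "likelihood", the three-point estimate) are
# illustrative, not taken from any particular @RISK workbook.

RISK_REGISTER = [
    {"id": "R1", "name": "Ground conditions", "selected": True,
     "likelihood": 0.30, "low": 50_000, "likely": 120_000, "high": 400_000},
    {"id": "R2", "name": "Design change",     "selected": False,
     "likelihood": 0.10, "low": 10_000, "likely": 25_000,  "high": 90_000},
    {"id": "R3", "name": "Weather delay",     "selected": True,
     "likelihood": 0.50, "low": 20_000, "likely": 60_000,  "high": 150_000},
]

def build_model(register):
    """Copy only the selected risks, with all their fields, into the model."""
    return [row.copy() for row in register if row["selected"]]

model = build_model(RISK_REGISTER)
print([row["id"] for row in model])  # -> ['R1', 'R3']
```

Because the fields are copied programmatically rather than retyped, a likelihood or three-point estimate can never be transcribed against the wrong risk.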

The next problem is utilising the workshopped parameters (likelihood of event, three-point estimates for severity etc.) in a logical way to be referenced by appropriate @RISK functions. Once a model structure has been agreed upon, a macro button can place @RISK distributions where they ought to go, either logically due to the paradigm (using RiskBinomial, for example) or via a drop-down selection for dollar impact (RiskPert or RiskGamma, say). My clients have been especially thankful when I limit their choice of distribution and provide a simple flow-chart to follow when making this very decision. Reducing the propensity for arguments in risk workshops is worth its weight in gold – assuming, of course, that reduced risk 'weighs' plenty!
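To make that concrete, here is a hypothetical sketch, in Python rather than @RISK worksheet formulas, of what the placed distributions compute for a single risk: an occurrence draw gated on the workshopped likelihood (the RiskBinomial part) and a severity draw from the three-point estimate (the RiskPert part). The parameter values are illustrative only.

```python
import random

# Hypothetical sketch of one event-risk cell: the risk contributes cost
# only if it occurs (a Bernoulli draw at the workshopped likelihood), and
# its severity comes from a PERT built on the three-point estimate.
# All numbers are illustrative.

def pert(low, likely, high):
    """Sample a standard (lambda = 4) PERT via its underlying Beta."""
    alpha = 1 + 4 * (likely - low) / (high - low)
    beta = 1 + 4 * (high - likely) / (high - low)
    return low + random.betavariate(alpha, beta) * (high - low)

def risk_cost(likelihood, low, likely, high):
    """One trial: cost is zero unless the risk event occurs."""
    if random.random() < likelihood:    # the RiskBinomial(1, p) part
        return pert(low, likely, high)  # the RiskPert(low, likely, high) part
    return 0.0

random.seed(1)
trials = [risk_cost(0.3, 50_000, 120_000, 400_000) for _ in range(10_000)]
mean = sum(trials) / len(trials)
# The PERT mean is (low + 4*likely + high) / 6 = 155,000, so the expected
# cost of this risk is about 0.3 * 155,000 = 46,500; the simulated mean
# should land near that.
```

In the workbook, one common pattern for the same structure is a cell of the form =RiskBinomial(1, p) * RiskPert(low, likely, high).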

Similarly, one or two instances of the simulation settings are likely to satisfy all requirements, so these too can be activated by macro buttons. In this way a user can't run a 'poor' simulation and thus create spurious results. The required simulation output can be placed into a report template attached to the model template and generated using yet another simply-labelled macro button. The result is consistent reporting across the organisation, allowing decision makers to become familiar and comfortable with simulation results they might otherwise ignore or be unaware of.
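The reporting side can be sketched the same way. The function below is hypothetical: it stands in for a fixed report template that summarises every model with the same statistics, computed the same way, so reports are comparable across the organisation. The P10/P50/P90 choice is illustrative.

```python
import math

# Hypothetical fixed report template: every simulation is summarised with
# the same organisation-wide statistics so results are directly comparable.
# The choice of P10/P50/P90 is illustrative.

def report(samples, percentiles=(10, 50, 90)):
    """Summarise simulation output with a fixed set of statistics."""
    ordered = sorted(samples)
    n = len(ordered)
    summary = {"mean": sum(ordered) / n}
    for p in percentiles:
        # nearest-rank percentile: deliberately simple and reproducible
        rank = max(1, math.ceil(p / 100 * n)) - 1
        summary[f"P{p}"] = ordered[rank]
    return summary

print(report([12, 7, 3, 20, 15]))
# -> {'mean': 11.4, 'P10': 3, 'P50': 12, 'P90': 20}
```

Because the statistics and their definitions never vary, a P90 on one project report means exactly the same thing as a P90 on any other.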

A risk model created by this process may not be the theoretically optimal one, but it will be valid and appropriate to its intended use. It will certainly be easy to use! The results will be consistent and should satisfy management's desires as well as regulatory requirements.
Project cost estimation is but one example, and the possibilities above are far from the only ones imaginable. Additional complexity or alternative needs could be met just as easily with different code, essentially without practical limit. You don't need to be an expert in Monte Carlo techniques and software to run robust, credible risk analyses. All you need is a risk analysis consultant who puts the cumbersome probabilistic elements, appropriate simulation options and reporting procedures under macro control. Ask for me by name!


Rishi Prabhakar