Month: October 2008

The Devil Is in the . . . Whah?

Many commentators on the current financial woes in the U.S. have blamed the credit crunch on the “CDO,” or collateralized debt obligation. CDOs are an unregulated type of credit product: asset-backed securities constructed from portfolios of fixed-income assets. They come in many shapes and sizes, and they are rated much as bonds are. These are not simple products, and the risks associated with buying and selling CDOs are therefore not easy to quantify. It’s a tricky business, like option valuation.

As the current financial turmoil has made painfully clear, either the risk analysis done by credit rating agencies and by institutions buying and holding CDOs was inadequate, or these risk assessments were optimistically ignored. Furthermore, some pundits have suggested that because CDOs are not sold on the open market, they are not priced according to their risks; in other words, they are too easy to acquire.

How should we evaluate investment risk in a package of many debts, each of which would be assigned a different value-at-risk at any particular point in time? We have many slick statistical analysis techniques available, and perhaps these alone should have been up to the task. But as the current liquidity crisis demonstrates, the devil is not only in the details of risk analysis. It is in the failure to take these probabilities to heart.
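
For the curious, here is roughly what taking those probabilities seriously looks like. This is a minimal sketch, not any rating agency’s actual model: a hypothetical pool of loans whose defaults are tied together by a single common economic factor (a one-factor Gaussian copula). Every parameter below is an assumption for illustration. The point it makes is how quickly correlation fattens the loss tail, even when each loan looks safe on its own.

```python
# A minimal sketch, not any rating agency's model: Monte Carlo losses on a
# hypothetical pool of debts whose defaults are correlated through one
# common economic factor (a one-factor Gaussian copula).
from statistics import NormalDist
import numpy as np

rng = np.random.default_rng(42)
n_loans, n_sims = 500, 10_000
p_default = 0.02          # assumed per-loan default probability
loss_given_default = 0.6  # assumed fraction of face value lost on default
rho = 0.3                 # assumed loading on the common factor

z = rng.standard_normal(n_sims)               # common economic factor
eps = rng.standard_normal((n_sims, n_loans))  # loan-specific factors
latent = np.sqrt(rho) * z[:, None] + np.sqrt(1 - rho) * eps
threshold = NormalDist().inv_cdf(p_default)   # default when latent falls below
losses = (latent < threshold).mean(axis=1) * loss_given_default

print(f"Expected loss:         {losses.mean():.2%}")
print(f"99th-percentile loss:  {np.quantile(losses, 0.99):.2%}")
print(f"P(loss > 5x expected): {(losses > 5 * losses.mean()).mean():.2%}")
```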

Project Risk Management in Six Sigma

The main constraint on any business activity, when it comes to improving efficiency, is time. A project typically has definite start and end dates, which means any improvement initiative has to be completed within that limited window, working in parallel with the project’s activities as they actually happen. The second constraint is budget: besides completing the project on time, can we do it on budget?

As defined by the Project Management Institute, “Project management is the application of knowledge, skills, tools, and techniques to project activities to meet project requirements.”

The DMAIC and DFSS approaches tend to focus on controls for the improvements and for process and product development, not on control of the project management process itself. This division between project management and project improvement/development is fine if the project has both a Project Manager, preferably a PMP, and a Black Belt, whether DFSS or DMAIC. But the reality is that we are positioning our Black Belts as project leaders, and most of the time both roles are performed by the Black Belt. This approach is acceptable only if the Black Belt is adept at both. In both DMAIC and DFSS training classes, the focus is typically on the tools used to complete the project; little (if any) time is spent on predicting and managing project duration or cost. Critical decisions must be made based on the probability of completing on time and on budget. If these probabilities are too low, the project may need to be redefined or even scrapped.

One excellent tool that certified PMPs use is @RISK for Project, which uses Monte Carlo simulation to show you many possible outcomes in your project and tells you how likely each is to occur. This means that you finally have, if not perfect information, the most complete picture possible. You can determine which tasks matter most, and then manage those risks appropriately. It can help you choose the best strategy based on the available information, which is why many PMPs and large companies with in-house training programs are standardizing on Palisade Corporation’s @RISK for Project for project management.
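
To make the idea concrete, here is a minimal sketch in Python of the kind of calculation @RISK for Project automates inside an actual project plan. This is not @RISK’s own code; the tasks, the triangular duration estimates (min, most likely, max, in days), and the 60-day deadline are all hypothetical.

```python
# Monte Carlo schedule risk sketch: sample each critical-path task's duration
# from a triangular distribution and estimate the chance of finishing on time.
import numpy as np

rng = np.random.default_rng(7)
tasks = {                      # sequential tasks on the critical path
    "Define":  (5, 10, 20),   # (min, most likely, max) in days - hypothetical
    "Measure": (8, 12, 25),
    "Analyze": (6, 10, 18),
    "Improve": (10, 15, 30),
}
n_sims = 100_000
total = np.zeros(n_sims)
for lo, mode, hi in tasks.values():
    total += rng.triangular(lo, mode, hi, size=n_sims)

deadline = 60
print(f"Mean duration: {total.mean():.1f} days")
print(f"P(finish within {deadline} days): {(total <= deadline).mean():.1%}")
```

If that probability comes back too low, that is exactly the signal, per the discussion above, that the project needs to be redefined before resources are committed.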

A good article on integrating project management into a Six Sigma system, authored by Daniel Zucker, can be found in iSixSigma’s Library.

Nest Eggs and Bear Markets

If your retirement is near or, in fact, here, you’re probably concerned about how to cope in today’s bear market. And if you are one of the almost 40 percent of investors age 56 to 65 whose retirement planning led them to invest too heavily in equities, and who the Employee Benefit Research Institute says now hold about 80 percent of their wealth in stocks, you’re probably downright worried.

But it may be comforting to learn that yours is not the first group of retirees to cope with bad timing on top of decision making under uncertainty. It should be helpful to look at the strategies that helped other bear-market retirees deal successfully with tough conditions. The investment firm T. Rowe Price recently did a risk analysis study of investors who retired into earlier bear markets. Its decision evaluation pinpointed strategies that allowed these retirees to stretch their savings over a thirty-year period. According to its findings, the worst decision a retiree can make is to sell stocks when the market has bottomed out, and the best is to hold on to stocks and adjust plans for withdrawing funds.

One obvious way to reduce withdrawals is to continue or resume working, but there are others. One tactic that the investment firm’s Monte Carlo software found to give an investment fund an 89 percent probability of lasting thirty years is much less drastic: keep taking withdrawals, but do not increase them to adjust for inflation until the market rebounds. This won’t be painless, but it will allow you to cash in later on the growth of the stocks you hold now.
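
For readers who like to see the machinery, here is an illustrative sketch of that kind of retirement simulation. To be clear, this is not T. Rowe Price’s model: the balance, return, volatility, and withdrawal figures are hypothetical, and the “wait for the rebound” rule is approximated crudely as resuming inflation raises only once the balance recovers to its starting value.

```python
# Illustrative retirement Monte Carlo (not T. Rowe Price's model): test the
# rule "skip inflation adjustments until the portfolio recovers."
import numpy as np

rng = np.random.default_rng(1)
n_sims, years = 20_000, 30
start_balance, withdrawal0 = 500_000.0, 20_000.0  # assumed 4% initial rate
inflation = 0.03                                  # assumed inflation rate

survived = 0
for _ in range(n_sims):
    balance, withdrawal = start_balance, withdrawal0
    for _ in range(years):
        ret = rng.normal(0.07, 0.15)   # assumed mean return and volatility
        balance = (balance - withdrawal) * (1 + ret)
        if balance <= 0:               # nest egg exhausted early
            break
        if balance >= start_balance:   # raise withdrawals only after recovery
            withdrawal *= 1 + inflation
    else:
        survived += 1

print(f"Probability the nest egg lasts {years} years: {survived / n_sims:.0%}")
```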

For one columnist’s more detailed advice on negotiating a bear market after retirement, go to www.retirementrevised.com.

What Is Risk Management, Anyway? Who Can I Ask?

The financial crisis has the media, regulators, even accountants screaming for better “risk management.” OK, the need seems pretty obvious in 20/20 hindsight. But what does that really mean?

Lots of very smart people are attempting to answer that question.  A just-released report from Marsh Insurance Brokers outlines “how insurance and risk management strategies can help UK firms improve liquidity, free up cash, strengthen their financial resilience and continue operating profitably during deteriorating economic conditions.”  Everyday bloggers are posting reasonable steps for managing risk.   ZDNet offers three great steps for mitigating your IT risks. And ever-reliable Wikipedia offers its customarily thorough definition and approach.

With so many competing “theories” of risk management, which bullet list do you follow? Which is right for your industry? Wouldn’t you just like to talk to someone else facing the same problems and ask, “How are you handling this?” There need to be more forums or meetings where people facing risks can answer these questions. Sure, there is RIMS: the Risk and Insurance Management Society has a huge annual meeting that offers great value to insurance and reinsurance professionals. And the SRA (Society for Risk Analysis) does great work relating to human health and environmental risk.

[Image: Wilmott magazine founder Paul Wilmott (right) discusses risk with a banking delegate at an earlier Palisade Risk & Decision Analysis conference.]

But what if you’re not in those industries? Everyone is facing risk now. The financial crisis means that banks and the money sector are especially hard hit, but the ripple effect extends to energy, healthcare, aerospace, manufacturing, and more. The Palisade Risk & Decision Analysis Conference (New York City, November 13-14) is one such forum, gathering executives from many industries who face risk. Over 20 case studies provide a learning environment in which professionals demonstrate their own approaches to risk and uncertainty. Software training in Monte Carlo simulation and other techniques is also available. Learning directly from peers in this way can be particularly valuable for making change quickly.

DMUU Training Team

Monte Carlo Simulation in Tolerance Design and Stack Analysis

The appeal of Monte Carlo simulation lies in its applicability in very general settings and in the precision that can be increased simply by running more iterations. In particular, Monte Carlo can be used in all situations where other techniques (linear and nonlinear propagation, and numerical integration) can be used, but it can yield more precise estimates. For this reason, Monte Carlo is easily the most popular tool used in tolerance problems.


Until recently, the caveat was the amount of processing power and time it took to complete a simulation. Now @RISK from Palisade Corporation can utilize the full processing power and dual cores of today’s personal computers, so this is no longer an issue. The other concern, the quality of pseudorandom number generators, has also been addressed: @RISK offers eight tested and proven random number generators, including Mersenne Twister, RAN3I, MRG32k3a, MWC, and KISS, to name a few. These are just a few of the reasons why @RISK is becoming the standard Monte Carlo simulation package in DFSS training and Six Sigma classes around the world.
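
To see what a tolerance stack-up simulation looks like in practice, here is a minimal sketch in Python using NumPy’s Mersenne Twister bit generator, one of the generators named above. This is an illustration of the technique, not @RISK itself; the four parts, their tolerances, and the spec limits on the stack are hypothetical.

```python
# Minimal tolerance stack-up sketch (hypothetical parts and specs), driven by
# the Mersenne Twister generator mentioned above.
import numpy as np

rng = np.random.Generator(np.random.MT19937(2008))
n_sims = 1_000_000

# Four parts stacked end to end; each dimension ~ Normal(nominal, sigma),
# with sigma = tolerance / 3 (assuming +/- tolerance spans 3 sigma).
nominals = np.array([25.0, 40.0, 15.0, 20.0])   # mm
tols = np.array([0.10, 0.15, 0.08, 0.12])       # mm
dims = rng.normal(nominals, tols / 3, size=(n_sims, 4))

gap = dims.sum(axis=1)                 # the stack dimension of interest
lsl, usl = 99.7, 100.3                 # hypothetical spec limits on the stack
out_of_spec = ((gap < lsl) | (gap > usl)).mean()
print(f"Stack mean {gap.mean():.3f} mm, std {gap.std():.4f} mm")
print(f"Estimated fraction out of spec: {out_of_spec:.4%}")
```

Note the assumption that each ± tolerance corresponds to ±3σ of the manufacturing process; with a different process capability, the spread of the stack, and the out-of-spec fraction, change accordingly.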


Reference: “Design for Tolerance of Electro-Mechanical Assemblies: An Integrated Approach,” by Narahari, Sudarsan, Lyons, Duffy, and Sriram.

Tornado Graphs: Basic Interpretation

When using @RISK (risk analysis software for conducting Monte Carlo simulations in Microsoft Excel), one of the output graphs is a tornado graph. Such graphs have their most direct interpretation for linear models with independent input distributions, as in most typical cost budgeting models. In these cases, the regression coefficients provide a measure of how much the output would change if the input were changed by one standard deviation (the correlation coefficients provide a broadly similar measure, but are slightly different, as is covered in another posting). In @RISK 5, the “mapped values” feature shows the absolute figures, i.e., the absolute change in the output as each input is changed in this way.

For models that have dependencies between the input distributions (e.g., correlation or parameter dependencies), and models where the output behaves in a non-linear way with respect to some of the inputs, these statements hold only with some qualification, or may not hold at all. In such cases, the interpretation of the coefficients will in general be specific to the nature of the model. For example, in linear models with correlated input distributions, the regression coefficient will still provide a measure of how much the output would change if the input were changed by one standard deviation, but only assuming that such a change could be implemented independently, i.e., without affecting the other variables.
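
For readers who want to see the arithmetic behind a tornado graph, here is a small sketch, independent of @RISK itself, that runs a hypothetical linear cost model with independent inputs and recovers the standardized regression coefficients and “mapped values” described above.

```python
# Sketch of the regression behind a tornado graph: for a linear model with
# independent inputs, each standardized coefficient measures the output change
# per one-standard-deviation change in that input. Model is hypothetical.
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
labor    = rng.normal(100, 20, n)
material = rng.triangular(40, 60, 110, n)
overhead = rng.uniform(10, 30, n)
cost = 2.0 * labor + 1.5 * material + overhead   # linear output

X = np.column_stack([labor, material, overhead])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize inputs
ys = (cost - cost.mean()) / cost.std()           # standardize output
beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

for name, b in sorted(zip(["labor", "material", "overhead"], beta),
                      key=lambda t: -abs(t[1])):
    # b * cost.std() is the "mapped value": output change per 1-sigma input move
    print(f"{name:9s} std. coefficient {b:+.3f}  mapped {b * cost.std():+.2f}")
```

Sorted by absolute coefficient, these are exactly the bars of the tornado, largest on top.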

» Watch a video demonstrating Tornado Graphs in @RISK

Dr. Michael Rees
Director of Training and Consulting

Enterprise Risk Management: A New Look

The concept of Enterprise Risk Management, or the incorporation of risk assessment in all functional areas of an organization, is not especially new. In 2002, for example, Sarbanes-Oxley required internal controls on financial reports, usually including risk assessment. However, this tidbit from an article on “Smarter Risk Management” from Director of Finance Online caught our attention:

“Standard & Poor’s has recently indicated that they will begin incorporating consideration of the strength of enterprise risk management practices as a component of their credit ratings methodology. This is yet another incentive for ensuring that a company’s approach to risk management is robust, capable of being articulated and will stand up to scrutiny.”

If your S&P score depends, at least in part, on your risk management methods, there may be hope for us yet. There are other signs that enterprise risk management is making a comeback, or at least remaining in the corporate consciousness. A September piece from Business World Online notes:

“Risk management has recently come into prominence in the corporate suite. … Risk management exists because a company wants to take advantage of or minimize risks that affect it. These factors include political risks, foreign exchange risks, interest rate risks, liquidity risks, price risks, market risks, operational risks, credit risks, and employee risks.”

What many of these reports miss, however, is the value of learning about ERM directly from others facing risk, and not just in your own industry. The Palisade Risk & Decision Analysis Conference in New York City (Nov 13-14, 2008) is an example of a risk forum bringing together executives from a variety of industries for the purpose of exchanging ideas on risk. Over 20 case studies form the core of the learning model, along with software training.

DMUU Training Team

Next-to-Real-Time Polls

Number crunchers, rejoice! With the U.S. presidential election only two weeks away, polling is in full swing. It’s continual, in fact, and spews out a steady stream of fresh data for waiting analysts. For their part, the analysts are now able to spin out probable outcomes of the election faster than in any earlier presidential race. The data crunching takes place in close to real time.

For those who want the inside track on what it all means before the election results leave no doubt, there are a number of websites that show current (up-to-the-last-minute, in fact) projections: 270towin, FiveThirtyEight, and RealClearPolitics. Although the results you see when you visit these sites are hardly what you could call risk analysis or risk assessment, they are products of simulations by Monte Carlo software.

If you visit one of these sites more than once in quick succession, you may notice that the projections differ slightly each time. This is because the Monte Carlo method draws random samples from ranges of probable outcomes, so each fresh simulation run produces slightly different numbers. So, while the press may give you the impression that two weeks from now anything is possible, the polling websites churn through the flood of data to give you only those results that are most probable.
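
Here is a toy version of the kind of simulation these sites run. The six “states,” their electoral votes, and the win probabilities below are purely hypothetical placeholders, not anyone’s actual poll data.

```python
# Toy electoral-college Monte Carlo: simulate many election nights from
# per-state win probabilities (all numbers hypothetical).
import numpy as np

rng = np.random.default_rng(2008)
ev = np.array([55, 34, 27, 21, 20, 13])             # hypothetical electoral votes
p = np.array([0.90, 0.60, 0.45, 0.70, 0.50, 0.30])  # hypothetical P(A wins state)
n_sims = 100_000

wins = rng.random((n_sims, ev.size)) < p     # each row: one simulated night
ev_totals = wins.astype(int) @ ev            # candidate A's total each run
needed = ev.sum() // 2 + 1                   # majority threshold
print(f"P(candidate A wins): {(ev_totals >= needed).mean():.1%}")
print(f"90% interval for A's electoral votes: {np.percentile(ev_totals, [5, 95])}")
```

Run it twice with different seeds and the headline probability wobbles a little, which is exactly the effect described above.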

Debt and Hope

Political scientist Ted Lowi once said, “Where there’s debt, there’s hope.” And certainly that’s been the optimistic philosophy under which the financial powers-that-be operate. But fundamentally I think Lowi was getting at the way we think about money and the future: we have always understood it as a trade-off between risk and reward. Without debt, and the risk it carries, there is no opportunity for reward. Since the seventeenth century, mathematicians, with economists, financiers, and gamblers close on their heels, have been working to express the perfect balance point between the two.

Which brings me to the stock indexes we are watching with such fascination today.  After the stock market crash of 1929, economists began to focus heavily on risk assessment in investment, and the result was the first modern method for valuation of stocks: discounted cash flow analysis.   This was based on notions about the time value of money, and it gave investors a great tool for quantifying uncertainty.
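
Discounted cash flow itself fits in a few lines, and layering Monte Carlo on top shows how it became a tool for quantifying uncertainty. The investment, cash flows, and discount rate in this sketch are hypothetical.

```python
# Discounted cash flow in a few lines, plus the Monte Carlo twist that turns
# a single NPV into a distribution. All figures are hypothetical.
import numpy as np

def npv(rate, cashflows):
    """Net present value: each year's cash flow discounted back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Deterministic DCF: invest 1,000 today for an expected 300 a year for 5 years
print(f"Point-estimate NPV: {npv(0.08, [-1000, 300, 300, 300, 300, 300]):.2f}")

# The Monte Carlo layer: treat each year's cash flow as uncertain
rng = np.random.default_rng(0)
sims = np.array([npv(0.08, [-1000, *rng.normal(300, 75, 5)])
                 for _ in range(20_000)])
print(f"Mean NPV: {sims.mean():.0f}, P(NPV > 0): {(sims > 0).mean():.1%}")
```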

Over the past 80 years, a number of specialized versions of this kind of risk analysis have been introduced into standard accounting practice, and now you can even go online to test some of these methods.

But, as the wild ride of the global markets in recent weeks has demonstrated, we still need to improve our forecasting methods. Never fear, though: necessity is the mother of invention. If the crash of 1929 brought us discounted cash flow analysis, what kind of quantitative forecasting will the current upheaval of the markets bring us?

@RISK: A Tool for the Six Sigma Practitioner’s Tool Box

Last month, I attended a free webinar, Minitab Methods to Deal with “Bad” Data, given by Master Black Belt Rick Haynes of Smarter Solutions, which I wrote about previously. If you didn’t attend, I strongly suggest you read the article or watch the archived video.

Since then, I have been in contact with Mr. Haynes and was very happy to hear that he is a proponent of using @RISK and recommends that all of his students consider adding it to their Six Sigma tool boxes. Below are his thoughts:

“Lean Six Sigma methods are all about solving problems.  When you start proposing improvements outside of the current wisdom, it is quite difficult to support a recommendation.   The creation of a simple Y = F(x) model of an actual system, using @RISK and Excel, provides a platform that you are able to provide very accurate estimates of future performance that are very compelling to the executive leadership with little or no investment at risk. No other tools in the Lean Six Sigma tool box provide this ability like process simulation.  All Black Belts and Master Black Belts can benefit from the use of simulation, particularly in transactional and service processes where experimentation and pilot testing is nearly impossible.” – Rick Haynes, Master Black Belt and Statistician, Smarter Solutions Inc.
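
As a postscript, here is a minimal sketch of the Y = F(x) idea Mr. Haynes describes, written in Python rather than @RISK and Excel. The transactional loan-processing model, its input distributions, and the 48-hour service-level target are all hypothetical assumptions for illustration.

```python
# Sketch of a Y = F(x) process simulation in the spirit of the quote above
# (done in Python rather than @RISK/Excel; all parameters hypothetical).
import numpy as np

rng = np.random.default_rng(6)
n_sims = 200_000

# Inputs (the x's): stages of a transactional process, in hours
data_entry = rng.triangular(0.5, 1, 3, n_sims)
review     = rng.lognormal(mean=np.log(8), sigma=0.5, size=n_sims)
approval   = rng.triangular(2, 4, 24, n_sims)
rework     = (rng.random(n_sims) < 0.15) * rng.triangular(4, 8, 16, n_sims)

# Output (the Y): total turnaround time
y = data_entry + review + approval + rework

target = 48.0
print(f"Mean turnaround: {y.mean():.1f} h")
print(f"P(meet {target:.0f} h target): {(y <= target).mean():.1%}")
print(f"Estimated DPMO: {(y > target).mean() * 1e6:,.0f}")
```

Because the model runs entirely in software, a proposed improvement, say, cutting the 15 percent rework rate in half, can be tested by changing one line, with no investment at risk, which is precisely the appeal Haynes describes for transactional and service processes.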