I enjoy talking with Phil Rogers, who teaches statistical analysis and managerial decision making, because he enjoys talking about his students. His class at the University of Houston’s C. T. Bauer College of Business is filled with M.B.A. candidates who are already holding down managerial positions in their day jobs.
Phil worked for Exxon for many years, and he took on any number of operations research assignments for which he developed mathematical models. So he likes to invite his students to bring to class the real-world questions facing them in their jobs, and they work together to find the best mathematical approaches to decision evaluation.
Not long ago, Phil told me about some pretty sophisticated analyses his students have produced of questions that I don’t usually associate with "managerial." Patient deposits for organ transplants. Allocation of turbines at a wind farm. The United States’ bid to become the venue for 2022 World Cup soccer.
He thinks the key to helping people learn modeling techniques is speed: a short learning curve and tools that students can become comfortable with fast. For instance, he likes Monte Carlo software that goes to work quickly for the students, without them having to understand how the software works.
This year he had a great opportunity to test this theory about speed. Sinopec and CNPC, two big Chinese oil companies (big meaning numbers 15 and 25 or some such on the Fortune World 500), each sent a small group of senior managers to Houston to brush up on quantitative analysis. Phil had three days to teach these execs how to build a mathematical model. Three days was all the time they could afford to be away from their home offices.
How did it go? His high-powered students, he thought, did quite well, which is what he expected, because, he says, if you can get up to speed quickly in a familiar, universal interface, the way you do with, say, statistics in Microsoft Excel, "you don’t have to be able to program in Fortran to build a model."
@RISK (risk analysis using Monte Carlo, software for Excel) can be a simple yet effective tool to explore statistical concepts and properties of distributions. For example, one interesting question is whether the reciprocal of a positively skewed distribution is positively or negatively skewed.
One’s first thought may be that such a reciprocal is negatively skewed. Of course, when reflecting on such an issue for the lognormal distribution it becomes clear that this is not the case. Since the lognormal distribution is the result of multiplying many independent random processes, the reciprocal of such a distribution is the result of multiplying the reciprocals of these individual distributions. Therefore the reciprocal of a lognormal distribution is itself lognormal, and hence always positively skewed.
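This property is easy to check by simulation. Here is a minimal sketch in Python using NumPy and SciPy rather than @RISK (the distribution parameters are chosen arbitrarily), sampling a lognormal distribution and confirming that both it and its reciprocal are positively skewed:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(42)

# Sample a lognormal distribution (mu=0, sigma=0.5 chosen arbitrarily).
x = rng.lognormal(mean=0.0, sigma=0.5, size=1_000_000)

# The reciprocal of a lognormal(mu, sigma) variable is lognormal(-mu, sigma),
# so both the distribution and its reciprocal should be positively skewed.
print(f"skewness of X:   {skew(x):.3f}")
print(f"skewness of 1/X: {skew(1 / x):.3f}")
```

Because the reciprocal here is lognormal with the same sigma, the two reported skewness values should come out roughly equal as well as both positive.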
Turning to triangular distributions, the situation is not so clear. @RISK can be used to sample a distribution and calculate its reciprocal. The @RISK statistics functions can be used to compute the moments of the input distributions (mean, standard deviation, and skewness, using e.g. RiskTheoSkewness), and the corresponding statistics for the reciprocals are available after the simulation (using RiskSkewness etc.).
The left graph shows a range of triangular distributions which are either symmetric or positively skewed (as model inputs), and the right-hand graph shows the (simulated) reciprocals. It is interesting to note that the reciprocal of the symmetric distribution is positively skewed, and that as the parameters of the distribution are adjusted, the reciprocal may be either positively or negatively skewed, but with a general prevalence of positive skew.
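For readers without @RISK to hand, the same experiment can be sketched in Python with NumPy and SciPy; the triangular parameters below are illustrative stand-ins for those in the graphs, not the exact values used there:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(7)
n = 1_000_000

# A symmetric triangular distribution on [1, 3] with mode 2 (a stand-in for
# something like RiskTriang(1, 2, 3)); its own skewness is zero by symmetry.
sym = rng.triangular(left=1.0, mode=2.0, right=3.0, size=n)

# A positively skewed triangular distribution on [1, 4] with mode 1.5.
pos = rng.triangular(left=1.0, mode=1.5, right=4.0, size=n)

# The reciprocal of the symmetric input comes out positively skewed.
print(f"input skew (symmetric): {skew(sym):.3f}, reciprocal skew: {skew(1 / sym):.3f}")
print(f"input skew (positive):  {skew(pos):.3f}, reciprocal skew: {skew(1 / pos):.3f}")
```

Varying the three triangular parameters and re-running shows the behaviour described above: the reciprocal's skewness can flip sign, but positive skew predominates.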
These properties could of course be explored further through mathematical manipulations, many of which are not trivial. However, @RISK provides an easy and intuitive way to explore such issues for the best possible decision making under uncertainty.
Dr. Michael Rees
Director of Training and Consulting
During these deteriorating economic times, it is more important than ever for organizations to be vigilant about cutting costs and boosting the bottom line. One option is to instill a Lean Six Sigma culture when tackling projects to save money. The American Society for Quality (ASQ) is once again hosting the 2009 Lean Six Sigma Conference, March 2–3, 2009, in Phoenix, Ariz., to teach professionals the skills they need to take back to their organizations.
Palisade will be on hand to demonstrate the use of Monte Carlo Simulation and to explain the benefits of utilizing @RISK to save time and money in your Lean Six Sigma and Design for Six Sigma projects. If you are planning to attend, please make time to come by to pick up a free trial CD of @RISK and to say hello.
Despite the current state of our economy it appears there will be an excellent turnout for this event.
Because Palisade Corporation will be exhibiting, our customers can save 50% on their second registration for the conference. Call ASQ Customer Care at 800-248-1946 before February 25 and use priority code CEJDB69 to take advantage of this great savings opportunity and to start making a difference in your organization and career today! They will fill you in on the rest of the details when you call.
Hope to see you there!
Cheatgrass. Invasive mussels. A brown tree snake. They are "exotic," they are "weedy," and they are just a few of the invading culprits the USDA hopes to subdue with grants amounting to $4.4 million to fourteen U.S. universities.
Weeds do, in fact, bring want. According to the USDA, more than 50,000 weedy species have been introduced to the U.S., where they rack up more than $100 billion in crop damage every year. So you may be surprised to learn (I was) that not all of the funds were awarded for biological controls. One of the larger grants went to Colorado State University for the development of risk assessment and decision evaluation tools to optimize strategies for managing Drooping Brome. Ranchers and farmers call this cheatgrass, and they hate the stuff. Not only is it a very successful weed, but with increasing levels of CO2, cheatgrass is running amok. So CSU will bring out its heavy computer artillery and take an operations management approach to holding off any further advances from this invader.
As mentioned in an earlier posting, correlation models do not necessarily capture the directionality between variables. We showed an example where asset prices whose changes are positively correlated may still trend in opposite directions.
Co-integrated time series, which are becoming more and more important in financial econometrics, attempt to capture the directionality between variables. This requires a different modelling approach, such as establishing the long-run equilibrium of the spread, ratio, or difference between the prices and then modelling these.
An example of co-integrated series is shown in the screenshot. This statistical analysis was completed with @RISK, Monte Carlo software for Microsoft Excel. In the series, a relationship has been established between the price levels of two assets, with random and uncorrelated changes to each of the levels. From each sample of the series, the price changes can be calculated and the correlation coefficient between the series worked out. There is a positive (but varying) correlation coefficient as a result of the link between the price levels. The average correlation coefficient in this example (calculated by conducting 1000 iterations to generate 1000 samples of the correlation coefficient) is about 44%.
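One way to build a model of this kind (a Python sketch rather than the @RISK model in the screenshot, with illustrative parameters) is to let one price level follow a random walk and tie the second level to the first plus independent noise; the period-to-period changes then come out positively correlated even though every underlying shock is uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps, n_trials = 100, 1000
corrs = []

for _ in range(n_trials):
    # Asset 1: a random walk in levels.
    x = np.cumsum(rng.normal(0.0, 1.0, n_steps))
    # Asset 2: linked to asset 1's level, plus its own independent noise
    # (all parameter values here are illustrative).
    y = x + rng.normal(0.0, 1.5, n_steps)
    # Correlation of the period-to-period changes.
    corrs.append(np.corrcoef(np.diff(x), np.diff(y))[0, 1])

print(f"average correlation of changes: {np.mean(corrs):.2f}")
```

With these particular noise scales the average correlation lands in the same general neighbourhood as the 44% quoted above; the exact figure depends on the ratio of the walk's volatility to the linking noise.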
Dr. Michael Rees
Director of Training and Consulting
Lean Six Sigma projects are performed in many areas of business. A few require an estimation of future performance when there is no chance to test or evaluate the new process. On February 26, Rick Haynes of SmarterSolutions will share his expertise in a free live webcast that documents a case where a reliability testing effort provided a reliability model that needed to be extrapolated in order to estimate the total impact on warranty costs.
The reliability model was developed through a logistic design of experiments. The resulting model was coded into an Excel spreadsheet and then simulated using @RISK to answer questions about future failure percentages. The results were used as inputs to focus on the need for proactive actions by the supplier in order to maintain a good customer experience. In the end, no additional actions were taken by the supplier, and business continued with a manageable liability rather than with an unknown future risk.
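A stylised version of that workflow can be sketched in Python; the logistic coefficients, the stress variable, and its distribution below are invented for illustration and are not taken from the webcast case:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logistic reliability model: probability of failure as a
# function of a stress variable. In practice b0 and b1 would come from a
# logistic design of experiments; the values below are made up.
b0, b1 = -6.0, 0.04

def p_fail(stress):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * stress)))

# Field stress is uncertain, so represent it as a distribution (here a
# normal with an assumed mean and spread) and simulate, @RISK-style.
stress = rng.normal(loc=100.0, scale=15.0, size=100_000)
failures = p_fail(stress)

print(f"expected failure rate: {failures.mean():.2%}")
print(f"95th percentile:       {np.percentile(failures, 95):.2%}")
```

The point of the simulation is the spread, not just the mean: the percentiles of the failure rate are what drive a warranty-liability decision.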
Thank you for your feedback on the Six Sigma’s Thirst for Information blog posting. Working with industry experts to develop the webinar series, and armed with the voice-of-the-customer input you supplied, we’ve been able to ensure the topics are of interest to you. So please keep your thoughts coming.
An item from the Department of the More Things Change, the More They Stay the Same.
Last week, speaking at a conference on managing retirement income, an executive with a U.S. division of Deutsche Bank announced that with the "failure" of diversified investing strategies, Modern Portfolio Theory was dead. R.I.P. balanced portfolios. R.I.P. the Nobel Prize-winning work of Harry Markowitz. R.I.P. Monte Carlo simulation projections.
Instead, announced Phillip Hensler, "Advisors who offer predictability will prevail"– isn’t predictability the goal of all those portfolio managers who rely on statistical analysis techniques for risk assessment? And he foresees that we will enter a new era of "Outcome Driven Investing"–isn’t outcome what drives all investment activity?
In this new era financial planners will help their clients match their "health risks, market risks, and longevity risks with specific guaranteed and non-guaranteed" investment products. Two questions: What else have financial planners been doing for the past decade? And just how are they going to measure that risk?
Maybe in this new era, sound investment advice won’t be based on Modern Portfolio Theory and risk evaluation won’t be the work of Monte Carlo software. But just exactly what will be the era’s guiding principles and analytical techniques? Post-Modern Portfolio Theory and Las Vegas computational tools?
This is the second in a series of postings about correlation modelling. In the first posting we discussed the idea of correlation as representing a proxy model of dependency between random variables. In this posting, we discuss the often overlooked point that relationships of correlation do not necessarily imply any directionality.
In modelling time series of financial asset prices, it is common practice to correlate the changes (returns) in prices (within each period). One way to see that the variables will not necessarily trend in the same direction is by simple reference to the formula for correlation (see for example Excel’s Help and search for the CORREL function). The formula is based on the differences of each point from the average. If for example one variable has a positive average change and the other a negative average change, then they will drift in different directions, even if the changes are positively correlated. Correlation refers to the idea that each variable moves relative to its own average in a common way.
The screenshot shows an example of two time series with this property; one has a positive average drift and the other a negative average drift, and the series have a positive correlation coefficient of 34%.
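An example like this is easy to reproduce. The following Python sketch (the drift, volatility, and correlation values are illustrative, not those behind the screenshot) generates two series whose changes are positively correlated but whose drifts have opposite signs:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000

# Two series of changes with correlation 0.5 but opposite drifts
# (+0.1 and -0.1 per period); all parameter values are illustrative.
z1 = rng.normal(size=n)
z2 = 0.5 * z1 + np.sqrt(1 - 0.5**2) * rng.normal(size=n)
changes_a = 0.1 + z1
changes_b = -0.1 + z2

# Cumulate the changes into price levels starting from 100.
series_a = 100 + np.cumsum(changes_a)
series_b = 100 + np.cumsum(changes_b)

print(f"correlation of changes: {np.corrcoef(changes_a, changes_b)[0, 1]:.2f}")
print(f"final levels: {series_a[-1]:.1f} vs {series_b[-1]:.1f}")
```

The correlation of the changes is clearly positive, yet the two price paths drift apart, one up and one down, exactly because correlation measures co-movement around each series' own average.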
Capturing the directionality between variables is something that requires a different modelling approach, such as establishing the long-run equilibrium of the spread, ratio, or differences between the prices and modelling these. This is the topic of co-integrated time series, which are becoming more and more important in financial econometrics, and will be the subject of a future posting.
Dr. Michael Rees
Director of Training and Consulting
Two hundred years ago yesterday Charles Darwin was born. It was the midst of the Industrial Revolution, and machines had just begun to replace human labor. Darwin had some formal medical education and a good bit of informal scientific education. He also had an imagination powerful enough to envision the links between geological time and biological variation.
Darwin hypothesized about biological vehicles for introducing variation in living organisms–he called these "gemmules"–but neither he nor anyone else in his generation had knowledge of genes. He would have found remarkable the mathematical processes that emulate biological processes. I am thinking, of course, of genetic algorithms and neural networks. And I think he would have found it fascinating that our use of these mathematical stand-ins has progressed to the point where robotics–the latest, most sophisticated example of machines taking over for human labor–is an everyday occurrence.
But I bet Darwin would have been blown away by last week’s announcement of a robot that "evolves." Engineers at Robert Gordon University (UK) have combined neural network technology, like that used to analyze CRM data, with evolutionary algorithms, like those used for genetic algorithm optimization, to create a robot with a "brain" that gradually "evolves" an optimal system to control the robot’s movements. The result is an ever more smoothly running robot. In other words, this machine is now taking over the human work of improving itself.
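For readers curious what "evolving" means mechanically, here is a toy evolutionary algorithm in Python. The "controller" and its fitness function are invented stand-ins: fitness is simply distance to a known target vector, rather than any actual measure of how smoothly a robot runs.

```python
import random

random.seed(0)

# Evolve a controller parameter vector toward one that minimises an error
# function, via the classic loop of selection, crossover, and mutation.
TARGET = [0.2, -0.5, 0.9, 0.1]  # hypothetical "ideal" controller settings

def fitness(genome):
    # Higher is better: negative squared distance to the target.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.3, scale=0.1):
    # Randomly nudge some genes with small Gaussian perturbations.
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]

def crossover(a, b):
    # Single-point crossover of two parent genomes.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.uniform(-1, 1) for _ in TARGET] for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # selection (elitism)
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(40)]                 # crossover + mutation
    population = parents + children

best = max(population, key=fitness)
print(f"best fitness after 200 generations: {fitness(best):.4f}")
```

Because the top performers survive each generation unchanged, the best fitness never degrades, and over a couple of hundred generations it climbs close to the optimum of zero.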
Okay, Charlie, where do we go from here?
Because risk assessment and risk analysis have taken such a beating in the press for their supposed role in the financial failures of the last few months, I want to call your attention to another good-news item about risk analysis.
On February 3 I reported on the Gates Foundation and its use of Monte Carlo simulation in evaluating the best combinations of medical tools to combat childhood malaria. Last week the World Health Organization published a cost-effectiveness study of preventive treatment for malaria in children. It used Monte Carlo simulation, along with other statistical analysis techniques, to predict that, in Mozambique and Tanzania at least, treating children preventively with a drug called sulfadoxine-pyrimethamine would be highly cost-effective.
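To give a flavour of how Monte Carlo simulation works in a cost-effectiveness setting, here is a deliberately stylised Python sketch. All of the figures below are invented for illustration and are not taken from the WHO study:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Hypothetical uncertain inputs: cost per child treated (USD) and the
# number of malaria cases averted per 1,000 children treated.
cost_per_child = rng.triangular(0.5, 1.0, 2.0, n)
cases_averted_per_1000 = rng.normal(300, 60, n).clip(min=1)

# Simulated distribution of the cost-effectiveness ratio.
cost_per_case_averted = cost_per_child * 1000 / cases_averted_per_1000

print(f"mean cost per case averted: ${cost_per_case_averted.mean():.2f}")
print(f"90% interval: ${np.percentile(cost_per_case_averted, 5):.2f} "
      f"- ${np.percentile(cost_per_case_averted, 95):.2f}")
```

The output is not a single number but a distribution, which is exactly what lets policymakers say how confident they can be that an intervention stays below a cost-effectiveness threshold.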
The beauty of applying risk analysis to a question like this is that it used real-world data to address a life-and-death question and did this very efficiently. And in the case of malaria, as the Gates Foundation report points out, this kind of efficiency is in itself lifesaving because it speeds our progress against a disease that has a head start of many centuries of killing.