As a result of the world financial crisis, China is taking risk management seriously. The Chinese financial regulatory system has instituted testing for all workers in the financial sector, including risk managers, certified financial analysts and information security engineers. The workers are required to pass exams within one year, or lose their jobs.
“There are new sources of volatility that threaten our sound and stable growth,”
the China Banking Regulatory Commission said in a statement in October. “It is
important to recognize these new problems and make careful decisions to cope
with them.” (NYTimes.com, December 25, 2008) The new exams seem to be an effort to ensure that all financial sector workers are familiar with good practices in decision making under uncertainty.
It appears that China has not been affected as drastically as the West during the recent financial turmoil. While China owns a lot of real estate in foreign markets that have taken a hit, those total holdings are only a fraction of China’s wealth. Because Chinese banks are more cautious and more heavily regulated, they have managed to avoid the worst downturns. The financial sector is aiming to stay diversified, and it appears banks may even be planning to enter new markets such as financial derivatives. Risk analysis is central to these efforts.
Perhaps the success in avoiding the worst of the market pitfalls has reinforced the historic tendency toward regulation in China. While many in Western markets are pointing to the U.S. Securities and Exchange Commission for its role in neglecting regulation, China is doubling down on its own regulation and adding risk assessment as a central part of the equation. In order to be most effective, Monte Carlo software such as @RISK should be a part of the new initiatives.
DMUU Training Team
It has skin. It responds to sensation. It has visual memory and reacts to the expression on your face. It is lifelike in the truest sense. But it is not alive. It serves the same function as a companion animal. But it is not an animal.
It is the MIT Media Lab’s latest robot, “Huggable.” In its current manifestation Huggable is a Teddy Bear loaded with physical sensors, and the brain that links all these sensors is an elaborate neural network. The software that makes Huggable tick is based on the same kind of computational mathematics that powers many operations management tools, medical diagnosis classification systems, calculations to resolve tricky resource allocation problems, and predictive models for many other kinds of decision making under uncertainty.
But Huggable is not about business or sensible decision making. Huggable is about feeling. This sensitive machine, which can distinguish among nine different types of sensation, is designed to produce human sentiment. According to the robot’s inventors at MIT, the perfect use for Huggable is as a go-between for grandparent and grandchild. Through a Web link, a grandparent can hear and see a child through the neural network perceptions of a robot. Art imitates life–or life imitates art?
First of all, what is Monte Carlo simulation?
Monte Carlo simulation is a computerized mathematical technique that allows people to account for variability in their processes to enhance quantitative analysis and decision making. The technique is used by professionals in such widely disparate fields as finance, project management, energy, manufacturing, engineering, research and development, insurance, oil & gas, transportation, and the environment.
Where did it come from?
The term Monte Carlo was coined in the 1940s by physicists working on nuclear weapon projects in the Los Alamos National Laboratory.
How Monte Carlo simulation works:
Monte Carlo simulation performs variation analysis by building models of possible results, substituting a range of values—a probability distribution—for any factor that has inherent uncertainty. It then calculates results over and over, each time using a different set of random values from the probability functions. Depending on the number of uncertainties and the ranges specified for them, a Monte Carlo simulation can involve thousands or tens of thousands of recalculations before it is complete. The output is a distribution of possible outcome values.
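The loop described above can be sketched in a few lines of Python. The profit model, its triangular and normal input distributions, and all parameter values below are illustrative assumptions for the sketch, not figures from any real analysis:

```python
import random

def simulate_profit(n_trials=10_000, seed=42):
    """Monte Carlo sketch: profit = units_sold * (price - unit_cost) - fixed_cost,
    recalculated many times with uncertain inputs drawn from probability
    distributions instead of single-point estimates."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_trials):
        units_sold = rng.triangular(800, 1500, 1000)  # (low, high, most likely)
        unit_cost = rng.gauss(12.0, 1.5)              # normally distributed cost
        outcomes.append(units_sold * (20.0 - unit_cost) - 5000.0)
    return sorted(outcomes)

results = simulate_profit()
mean = sum(results) / len(results)
p5, p95 = results[len(results) // 20], results[-len(results) // 20]
print(f"mean profit: {mean:,.0f}  (90% of trials between {p5:,.0f} and {p95:,.0f})")
```

Instead of one “best guess” profit figure, the result is a whole distribution, from which the probability of any particular outcome can be read off.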
Monte Carlo simulation provides a number of advantages over deterministic, or “single-point estimate” analysis:
- Probabilistic Results. Results show not only what could happen, but how likely each outcome is.
- Graphical Results. Because of the data a Monte Carlo simulation generates, it’s easy to create graphs of different outcomes and their chances of occurrence. This is important for communicating findings to other stakeholders.
- Sensitivity Analysis. With just a few cases, deterministic analysis makes it difficult to see which variables impact the outcome the most. In Monte Carlo simulation, it’s easy to see which inputs had the biggest effect on bottom-line results.
- Scenario Analysis. In deterministic models, it’s very difficult to model different combinations of values for different inputs to see the effects of truly different scenarios. Using Monte Carlo simulation, analysts can see exactly which inputs had which values together when certain outcomes occurred. This is invaluable for pursuing further analysis.
- Correlation of Inputs. In Monte Carlo simulation, it’s possible to model interdependent relationships between input variables. It’s important for accuracy to represent how, in reality, when some factors go up, others go up or down accordingly.
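On that last point, the standard two-variable trick is to build the second input from the first plus independent noise: y = rho*x + sqrt(1 - rho^2)*z gives standard normals with correlation rho. A small Python sketch (the 0.7 correlation and the cost interpretation are made-up illustrations):

```python
import math
import random

def correlated_normals(rho, n, seed=1):
    """Pairs of standard normals with correlation rho, built via the
    two-variable Cholesky construction: y = rho*x + sqrt(1 - rho^2)*z."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x = rng.gauss(0, 1)  # first input's shock
        z = rng.gauss(0, 1)  # independent noise
        pairs.append((x, rho * x + math.sqrt(1 - rho * rho) * z))
    return pairs

# Illustration: raw-material cost and energy cost tending to rise together.
pairs = correlated_normals(0.7, 50_000)
mx = sum(x for x, _ in pairs) / len(pairs)
my = sum(y for _, y in pairs) / len(pairs)
sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pairs) / len(pairs))
sy = math.sqrt(sum((y - my) ** 2 for _, y in pairs) / len(pairs))
corr = sum((x - mx) * (y - my) for x, y in pairs) / (len(pairs) * sx * sy)
print(f"sample correlation: {corr:.3f}")  # close to the requested 0.7
```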
Here are a few examples of Monte Carlo simulation for Six Sigma and Design for Six Sigma for you to explore.
A broad-reaching reevaluation has been forced on both financial planners and retirees by the drastic down-trend in the financial markets. In my last commentary on retirement planning, I cited the work of Wharton professor Jeremy Siegel, who has used risk analysis and other forms of statistical analysis to estimate that it takes about twenty years for a “balanced” stock portfolio–retirement or otherwise–to produce optimum returns.
Okay, 20 years, two decades. I assume this includes the ten years that the financial media keep referring to as the “lost decade”–the ten years from 1998 to the present. Apparently, the reason it is dubbed lost is that, after inflation is accounted for, the S&P 500 has gained only 1.3% in the past decade, and investors in equities saw their funds stand still while the economic engines idled. Ten years is a big portion of a person’s life and puts a big dent in a retired person’s income security. The large-scale effect of this is magnified by the fact that increasing numbers of people have turned to equities in their retirement planning.
Most of the comment I’ve read on how investors can avoid another lost decade of flat returns and outright losses identifies portfolio balance as the best protection. Although balance is defined by any number of criteria, the key element is diversification–both among types of investments and among stocks. And interestingly enough, Monte Carlo simulation is consistently cited as the tool to calculate the returns and timing of returns of various balancing schemes.
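A stripped-down Python sketch of that kind of calculation: grow $1 for twenty years under different stock/bond mixes and compare the simulated distributions of final wealth. The return distributions and their parameters below are purely illustrative assumptions, not market estimates:

```python
import random

def simulate_portfolio(stock_frac, years=20, n_paths=10_000, seed=7):
    """Distribution of final wealth from $1 under a fixed stock/bond mix,
    with assumed (illustrative) annual return distributions."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        wealth = 1.0
        for _ in range(years):
            stock_r = rng.gauss(0.07, 0.18)  # assumed: mean 7%, sd 18%
            bond_r = rng.gauss(0.03, 0.06)   # assumed: mean 3%, sd 6%
            wealth *= 1 + stock_frac * stock_r + (1 - stock_frac) * bond_r
        finals.append(wealth)
    return sorted(finals)

for frac in (0.3, 0.6, 0.9):
    f = simulate_portfolio(frac)
    print(f"{frac:.0%} stocks -> median {f[len(f) // 2]:.2f}, "
          f"5th percentile {f[len(f) // 20]:.2f}")
```

The interesting output is not the median alone but the downside percentile: heavier stock allocations widen the gap between the two, which is exactly the balance-versus-growth trade-off the commentators are weighing.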
The continuous uniform distribution represents a situation where all outcomes in a range between a minimum and maximum value are equally likely.
From a theoretical perspective, this distribution is a key one in risk analysis; many Monte Carlo software algorithms use a sample from this distribution (between zero and one) to generate random samples from other distributions (by inversion of the cumulative form of the respective distribution).
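As a concrete illustration of that inversion step, here is a Python sketch that draws exponential samples by pushing Uniform(0,1) draws through the inverse cumulative distribution function (the exponential is chosen only because its inverse CDF has a simple closed form):

```python
import math
import random

def sample_exponential(rate, n, seed=0):
    """Inverse-transform sampling: u ~ Uniform(0,1) is mapped through the
    inverse CDF of the target distribution. For Exponential(rate),
    F(x) = 1 - exp(-rate*x), so F^-1(u) = -ln(1 - u) / rate."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

samples = sample_exponential(rate=2.0, n=100_000)
mean = sum(samples) / len(samples)
print(f"sample mean: {mean:.3f}")  # theoretical mean is 1/rate = 0.5
```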
On the other hand, there are only a few real-life processes that have this form of uncertainty. These could include for example: the position of a particular air molecule in a room, the point on a car tyre where the next puncture will occur, the number of seconds past the minute that the current time is, or the length of time that one may have to wait for a train. In oil exploration, the position of the oil-water contact in a potential prospect is also often considered to be uniformly continuously distributed.
For the distribution to apply in each situation, implied assumptions need to hold, and it is the validity of these assumptions that can be questioned. In the example concerning the waiting time for a train, one would need to assume that trains arrive at regular intervals but that we have no knowledge of the current time, nor of other indicators (sound, wind) that a train is in the process of arriving. For this reason, the distribution is sometimes called the “no knowledge” distribution. One of the reasons that such a distribution does not occur frequently in the natural world is that in many cases it is readily possible to establish more knowledge of a situation, and in particular there is usually a base case or most likely value that can be estimated.
Dr. Michael Rees
Director of Training and Consulting
In the midst of the holiday season, I want to bring up the subject of applying Six Sigma to food preparation, mainly baking. I am not suggesting that you apply Six Sigma variation-reducing techniques to anyone’s holiday baking, as it could cause negative unintended consequences, like boxed macaroni and cheese dinners for the New Year.
As a child, I recall sitting around the dinner table after consuming a huge holiday meal, listening to the discussions about my grandmother’s homemade cheesecake and lemon meringue pie: “the cheesecake was the best ever,” “this year’s lemon meringue pie wasn’t as tart as last year’s,” “the crust came out perfect,” and so on. To be honest, as a 10-year-old I was not able to discern such subtleties. Now that I am an adult, I question whether they really could either, particularly after such an eating event, not to mention comparing samples 12 months apart. With that said, the desserts were always phenomenal.
Now, onto the present day . . . why not apply Lean Six Sigma to baking? Well, some do! A few years ago a regional supermarket chain in the mid-Atlantic region hired a Lean Six Sigma consultant to optimize their chocolate cake for ultimate customer satisfaction, taste, pricing and, of course, profitability. Using taste tests, QFD, Kano models and a little DOE, they were able to identify the characteristics that were most important, then worked on reproducing those characteristics every time with little variation.
The project was a success for both the customers and the company, producing the ultimate chocolate cake experience. Going back to my statement about unintended negative consequences, the Black Belt may have gained a few extra pounds during that assignment.
What’s next? If we can apply Lean Six Sigma to baking cakes to maximize profits and customer satisfaction, doesn’t it make sense to apply it to all food industries?
An aspect often missing from risk analysis discussions is the stochastic nature of risk aversion. In calibrating a financial model, the resulting implied values are the ones that would prevail, in that they are risk-adjusted: the expected cash flows are discounted at a rate that takes risk aversion into account. If risk aversion varies stochastically over time, the knowledge that some unknown (and possibly unknowable) future degree of risk aversion will prevail tomorrow, such that future prices will be accurately determined, is of little comfort. It is the unknowing that carries a hidden cost.
Typically, financial models tend to price future risk either as a constant or as a deterministic function of time. Unfortunately, there is evidence that risk aversion changes, and that it changes in an unpredictable way over time. The risk analyst cannot be confident that today’s calibration will be valid in the future. We need only examine commodities markets over the last few years to see how volatility risk has affected prices and behaviors in a variety of markets. Estimates do appear, if we use a long enough historical time horizon, to offer at least a provisional distribution for future market prices. Once we have an estimate, we can create a time-series Monte Carlo simulation of the distribution to represent that history and obtain, through the simulated scenarios, a future joint probability distribution of the variable and of the traditional observable risk factors. The Monte Carlo simulation generates the data we need to perform the statistical analysis, assisting in the estimates needed for effective valuation and decision analysis.
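One way to sketch such a time-series Monte Carlo simulation in Python is to let the volatility of a price path itself follow a mean-reverting random process, so that the market’s price of risk is not a constant, and then look at the simulated distribution of terminal prices. Every parameter here is an illustrative assumption, not a calibrated estimate:

```python
import math
import random

def simulate_price_paths(n_paths=2_000, n_steps=250, seed=3):
    """Price paths over ~250 trading days whose annualized volatility follows
    its own mean-reverting random process (illustrative parameters only)."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        price, vol = 100.0, 0.20
        for _ in range(n_steps):
            # volatility drifts back toward 20% but takes its own random shocks
            vol = max(0.01, vol + 0.1 * (0.20 - vol) + 0.02 * rng.gauss(0, 1))
            # lognormal daily return using the current (stochastic) volatility
            step = -0.5 * vol * vol / 252 + vol / math.sqrt(252) * rng.gauss(0, 1)
            price *= math.exp(step)
        finals.append(price)
    return sorted(finals)

finals = simulate_price_paths()
median = finals[len(finals) // 2]
spread = finals[-len(finals) // 20] - finals[len(finals) // 20]
print(f"median terminal price: {median:.1f}, 5%-95% spread: {spread:.1f}")
```

The simulated paths supply exactly the kind of joint distribution described above: terminal prices together with the volatility history that produced them.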
Palisade Training Team
Okay, so as the news this week makes infuriatingly clear, in my ongoing round-up of finance types I have forgotten one of the rarer but most risk-loving creatures of the finance world. This critter is the con, the crook. If the published allegations are true, the case of Bernard Madoff is remarkable, and not only for the dollar amount of the swindle. The number of investors involved, the sophistication of his Ponzi scheme, and the longevity of the scam are astounding.
Madoff had to have a very detailed understanding of the workings of the equities and bond markets and real mastery of the operations of a legitimate hedge fund. If the everyday Wall Street hedger uses statistical analysis to analyze historical performance and Monte Carlo software to sort through decisions on value-at-risk and option valuation, what kind of complex formulas and high-powered tools did Madoff have at his fingertips in order to maintain a double set of books whose secrets were kept even from the family members who were his employees?
Many commentators on the scandal have pointed out that even the slickest Ponzi scheme is doomed to collapse. While I am very curious about the technology and analytical techniques Madoff might have used, I am truly intrigued by the mind that perpetrated the scam. If he had the smarts and education to make such an elaborate swindle work, why not put these talents to work legally? Why take a risk with such adverse probabilities? As the sophistication of his scheme implies, he must have had a finely honed estimate of the risk of being discovered. Is it possible that he enjoyed walking down this razor’s edge every day?
This morning I received a couple of interesting emails. The first was from Vijay Bajaj, WCBF’s Founder & CEO; the second was from Michael Cyger, Founder of iSixSigma. For those of you who may not be familiar with either organization, iSixSigma is one of the premier commercial organizations providing information and networking to the Six Sigma and Design for Six Sigma communities through iSixSigma Magazine, networking events and isixsigma.com. WCBF Six Sigma Solutions (Worldwide Conferences and Business Forums) focuses exclusively on Six Sigma and related quality conferences and events.
The WCBF is giving away a limited number of complimentary conference passes, which can be used at any of their Lean and Six Sigma conferences and summits in 2009. In order to “win” one, you must complete a survey. The survey asks for the latest burning, need-to-know issues that should be addressed at events, and asks you to submit your recommendations for cutting-edge, provocative and perspective-shaking speakers. These survey results are certainly going to be used to decide future conference agendas. A bit complicated, but they seem to be gathering the VOC to ensure their future conferences meet the needs of their market. Additionally, they seem to be positioning themselves to bolster event attendance if needed, knowing travel budgets have been slashed in many organizations.
iSixSigma is offering huge discounts for their upcoming iSixSigma Live! Summit & Awards in January, and throwing in two (2) free 3-hour Master Class Workshops. Additionally, they offer suggestions on how to “pinch pennies” while traveling.
These actions are extraordinary, considering both organizations are commercial firms and have called the shots over the past years. The current economic crisis seems to be taking a toll. I applaud them for their efforts and hope they continue to gather and evaluate the VOC even after these challenging times are behind us.
Recently, I compiled a list of Lean Six Sigma and related events for 2009. Currently, there are 47 national tradeshows and events scheduled. As I indicated in a past post, I wish each of these organizations would hold one to two per year instead of a dozen or more. This would allow better idea and best practices sharing and potentially reduce costs for the event organizers.
For those of you who, like me, assumed speculators, hedgers, and arbitrageurs were all the same critter with different species names, I’ve now considered two of the three creatures of finance who make their living by risk assessment and the balancing of probabilities: speculators and hedgers. The difference in how they work is this: speculators accept a certain level of risk in return for a certain probability of payoff; hedgers accept a certain level of risk in one market but try to balance that risk by making an investment with probabilities of an opposite payoff in another market. So far, so good. I’ve left the hardest for last: the arbitrageur.
The arbitrageur–who is often not a person but a bank or a brokerage–makes matching deals in an asset–usually a financial instrument–that trades in two different markets at two different prices. So the arbitrageur buys an asset (or borrows money) in one market and sells it in another market. In a perfect world the value of the asset would be the same in both markets. But the arbitrageur is banking on a less than perfect world, because there is profit to be made in the difference between the two different prices.
Ideally, to avoid a change in either price over time, the buying and selling in the two markets take place simultaneously. But the arbitrageur knows it is not a perfect world and therefore has to accept some level of price risk. Successful arbitrage relies on risk analysis, an understanding of option valuation and VAR (value-at-risk), and, especially, on nearly simultaneous electronic trading. Balancing values and risks clearly calls for Monte Carlo software and genetic algorithm optimization. But even with the most sophisticated software and the most favorable odds, the timing of these complicated deals is everything. Profit depends on a cool hand at the computer.
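The residual price risk in that timing gap is easy to quantify with a toy Monte Carlo model. In the Python sketch below, the locked-in spread, the size of the price move during the execution delay, and its distribution are all made-up illustrative assumptions:

```python
import random

def arbitrage_pnl(spread=0.50, delay_sigma=0.30, n_trials=100_000, seed=9):
    """Toy model: lock in `spread` per unit between two markets, but the second
    leg fills after a short delay during which the price moves by a normal
    shock with standard deviation `delay_sigma` (all values illustrative)."""
    rng = random.Random(seed)
    pnls = [spread + rng.gauss(0, delay_sigma) for _ in range(n_trials)]
    mean_pnl = sum(pnls) / n_trials
    loss_rate = sum(1 for p in pnls if p < 0) / n_trials
    return mean_pnl, loss_rate

mean_pnl, loss_rate = arbitrage_pnl()
print(f"mean P&L per unit: {mean_pnl:.3f}, chance of a losing trade: {loss_rate:.1%}")
```

Even with a positive expected profit, a meaningful fraction of trades lose money when execution is slow, which is why near-simultaneous electronic trading matters so much.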