This week my book “Financial Modelling in Practice: A Concise Guide for Intermediate and Advanced Level” (John Wiley & Sons) became available to a US audience on amazon.com (the original plan was for the launch date to be the end of December 2008).
I have high hopes for the book in the US market, which is one of the most sophisticated markets for financial modelling and its applications. My belief is that many modellers have a reasonable knowledge of core Excel functionality, but wish to extend and consolidate their knowledge in a way that is prioritised, practical, and application-driven. In addition, I felt that there were few if any really good texts out there that help modellers to design, structure, and build models that are relevant, accurate, and readily understandable. Many texts and training courses in the modelling area put their emphasis on Excel functionality, on financial theory, or on mathematical models, but hardly address the modelling process itself. Finally, most modelling texts either do not adequately treat the topic of risk analysis, or treat it from a mathematical perspective that is both inaccessible to many modellers and lacking in practical tools.
The book starts with a review of the Excel functions that are generally most relevant for building intermediate and advanced level models, including functions relevant to statistical analysis. It then discusses the principles involved in designing, structuring, and building relevant, accurate, and readily understandable models. Topics covered include the use of sensitivity analysis, best practice modelling principles and related issues, and model auditing tools. A chapter is devoted to the modelling of financial statements and of valuation using discounted cash flow analysis. The book then moves on to discuss risk assessment and uncertainty modelling. Many practical applications and example models are presented in an intuitive and accessible way, and the @RISK Monte Carlo software from Palisade Corporation is used to implement most of them. The topic of options and real options modelling is then covered, treating these as a natural extension of risk modelling. Classical option valuation methods are discussed, as well as practical methods of modelling real options, including the implementation of decision trees. Chapter 6 covers VBA for financial modelling applications. The topics were selected by considering the core types of financial models that frequently require the use of VBA, and the chapter provides beginners in this area with a solid base from which to discover the richer possibilities that VBA offers modellers.
» Buy now at Amazon.com
» Buy now at Amazon.co.uk
Dr. Michael Rees
Director of Training and Consulting
IBM has just announced that it will head up a collaboration among five universities and its own researchers to develop a computer that will mimic the function of a mammalian brain. But isn’t that what neural network technology does? Not quite.
In the past couple of years neural networks have come into wide–almost everyday–use in science and medicine, industry and business. They have been hailed for their ability to solve complex problems requiring prediction and classification, and they have been used for decision evaluation, production forecasting, and other questions that arise in operations management; in medicine, where their capacity for classification has become valuable, they now play a big role in predictive diagnostics.
The way neural networks manage to accomplish all this cool stuff is by mimicking the activity of neurons in the biological brain. But the lead IBM researcher points out that the way a brain actually works is through a network of synapses, the chemical junctions through which neurons communicate with other neurons and with cells.
The difference between the two types of networks may sound too subtle to be important, but for a computer it’s a tall order. The project is also a cat-and-mouse game that will involve scientists and scholars from fields as diverse as psychology and materials science, in addition to computer scientists. Some of these experts participated in a recent attempt to simulate the synaptic networks in a mouse brain, and the new project’s ultimate goal is to create electronic circuits that simulate the workings of a cat’s brain. They’ve got the mouse, so in the spirit of that famous cartoon cat-and-mouse duo Tom and Jerry: bring on the cat!
Even before we move into the last month of 2008, the financial year has already proved itself an “outlier”–a term from statistical analysis that denotes the unusual, the anomalous. It has brought the kind of stock market lows that investors have been advised to avoid, or at least outlast, by using “long-term” stock investment strategies. The idea behind long-term investing is that the investor relies on time, in addition to portfolio balancing, to smooth out the dips and peaks in returns on investment. Okay, so how much time does long-term balancing require?
Recent research by Wharton professor Jeremy Siegel proposes that at least 20 years is the optimum “term” for investments. Using 100 years’ worth of data, Siegel calculated that over any 20-year period, stock returns will better bonds’ conservative returns. What does this mean for investors looking to retire?
Timing is everything. And a number of mutual fund firms offer investors target-date portfolios that provide some assurance that once the investor retires, the cash will last as long as the investor. The guiding force behind these target-date portfolios is Monte Carlo software, which applies risk analysis to historical data in order to project year-by-year returns for the investor. These risk assessments do take into account outlier low years like 2008, but in the simulations the better returns in surrounding years offset the sinking spells.
In other words, never mind the pain of the present; time heals all wounds. But–worst case scenario–what if your target date coincides with an outlier low like our current nadir? Hang in there for a while. It may not take all 19 years.
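The simulation idea behind these target-date projections can be sketched in a few lines. This is a minimal illustration only, not the methodology of Palisade or any fund firm; the return, volatility, and withdrawal figures are made-up assumptions, and real tools sample from fitted historical distributions rather than a simple normal.

```python
import random

def simulate_portfolio(start_balance, years, mean_return=0.07, stdev=0.15,
                       annual_withdrawal=0.0, trials=10_000, seed=42):
    """Monte Carlo sketch: sample normally distributed annual returns and
    estimate how often the portfolio survives the full horizon."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        balance = start_balance
        for _ in range(years):
            balance *= 1 + rng.gauss(mean_return, stdev)  # one year's return
            balance -= annual_withdrawal                  # retiree's spending
            if balance <= 0:
                break
        if balance > 0:
            survived += 1
    return survived / trials

# Usage: estimated survival probability over a 20-year horizon
print(simulate_portfolio(100_000, 20, annual_withdrawal=8_000))
```

A single outlier year like 2008 shows up here as one bad draw among twenty; as the post notes, it is usually offset by the surrounding years unless withdrawals are aggressive.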
I am a consumer just like everyone else, and I’m seriously concerned about the retail service industry in the United States. To our chagrin, many manufacturing and telephone customer service positions have been outsourced to places like Mexico, China, and India. Ever since this migration started, we have struggled to maintain these types of industries in the US, because outsourcing can produce the goods and perform the services at a much lower cost than we can.
What I want to address today is the need for our retail establishments serving the general public to invest in sound Six Sigma and Lean process improvement principles. The processes used in many of our service industries, from automotive to healthcare, are extremely flawed and waste a great deal of time and money on waiting, errors, reprocessing, and rework. Besides the sheer inconvenience and the high prices that result from these ineffective procedures, they put our health at risk: directly, in the case of a medical misdiagnosis or flawed work on our automobiles, and indirectly, through the stress and aggravation one must endure to get things done properly on the first, second, or even third attempt! Never mind the stress of the high prices.
With this said, applying the ideas and concepts of Six Sigma for reducing variation in a process (never mind advanced concepts such as Design of Experiments or Monte Carlo simulation), together with Lean principles for reducing waste – such as using poka-yoke to error-proof tasks performed by relatively unskilled staff – could go a long way toward fixing the flawed processes that now put our wallets and our health at risk.
In all fairness, American workers are the best in the world. It is not their fault; they are given faulty and sometimes broken processes and systems to work with, and no support or time to fix them. I am sure they are as frustrated as, or more frustrated than, the general public, because they are caught in the middle and have to deal with us when things go wrong. How do we fix this? Please tell me your thoughts!
Last week, Palisade Corporation held its North American User Conference; it was a very successful event that brought together @RISK users from around the world. Presentations and discussions touched on topics such as the subprime mortgage crisis, financial risk management, flu modeling, project risk management, and, of course, ways to use Monte Carlo simulation in Six Sigma.
It was great to see such a high level of interest in the Six Sigma related presentations and the buzz they created, both in the social networking opportunities and in the feedback forms submitted after the conference. This shows that, despite the economic difficulties and the natural tendency to eliminate all unessential spending, Six Sigma and Design for Six Sigma are rightfully viewed as part of the solution.
SigmaFlow’s president, Jay Holstine, presented Process Mapping for Knowledge Transfer: Doing More with Less. This very pertinent topic for today’s economic times will be presented live as an ISSSP Focused Session on November 25 at 2pm EST. Please join us.
Ed Biernat from Consulting with Impact led a presentation on the use of Six Sigma in process industries. If you are interested, his presentation, Lean Six Sigma Application of @RISK Part I, can be viewed online. Part II will be live on December 12, 2008 at 1pm EST, where he will dive more deeply into the use of @RISK in this case study. Please join us.
A recent article in the Wall Street Journal, Executives Switch to Survival Mode, indicates that two of the top issues in crisis management can be addressed with a strong Lean Six Sigma program:
- Excellence in Execution – Whether on the shop floor or in administrative processes, there is no longer room for inaccuracies or waste.
- Speed, flexibility, and adaptability to change are another area where a strong Six Sigma program mitigates the effects of crisis.
The interest at our User Conference in exploring the use of @RISK to reduce project cycle times and costs indicates to me that smart business leaders are looking to reduce risks and strengthen their companies during this time of crisis.
There’s been a real bloom in the number of decision evaluation tools being offered online to help farmers analyze the risks and opportunities in their planting and cultivation plans. Farming has always been a risky business because it’s dependent on weather, crop yields, commodity prices, and–at least since the New Deal–on government subsidy programs.
What’s always been complicated is now downright complex, and the agriculture press churns out regular operations management advice for the “agriculture industry.” One currently hot topic in these columns is the new federal program known as ACRE–Average Crop Revenue Election–which requires farmers to make planting and crop rotation decisions that will be carried out over multiple years.
Decision making under this kind of uncertainty means that, in order to do reliable production forecasting, farmers must become increasingly familiar with statistical analysis techniques. They have to spend as much time in front of a computer as in the tractor seat.
Gone are the days of the farmer as a rube. Welcome the agricultural manager.
An area gaining traction in the risk analysis world is the application of Monte Carlo simulation to environmental risk. There are numerous uncertainties in the natural world, and they affect plans in any number of ways. Think of a typical construction project. What is the impact of weather, specifically inclement weather, on the progress of that project? Are the delays significant enough to trigger penalties? Could a risk assessment help determine that likelihood? Are there mitigating steps to take to better ensure the progress of the job? Could decision trees help weigh the consequences and determine the best course of action? Statistical analysis would certainly provide useful data to support a well-informed decision.
Specific to the area of renewable energy generation, climate plays a huge role. Variability in forecasting the availability of wind or water translates to uncertainty when planning for power generation and delivery. Think of a hydro installation. What capacity should be planned? What is the minimum generation that can be guaranteed? What will long-term climate shifts yield, and what is the resulting impact on power generation?
Tools such as @RISK and PrecisionTree add the relevant analytical techniques to spreadsheet models, allowing you to explore these kinds of questions, develop plans around these issues, and set policy for future decisions.
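The weather-delay question above can be illustrated with a bare-bones Monte Carlo sketch. This is not how @RISK itself works internally, and every figure here (rain probability, delay per rain day, penalty threshold) is a hypothetical assumption for illustration only.

```python
import random

def weather_delay_risk(base_days=120, rain_prob=0.25, delay_per_rain_day=0.5,
                       penalty_threshold=140, trials=10_000, seed=7):
    """Estimate the probability a construction schedule slips past the
    penalty threshold, given a chance of inclement weather on each
    scheduled day and a fixed slippage per rain day."""
    rng = random.Random(seed)
    over = 0
    for _ in range(trials):
        # Count rain days across the baseline schedule (binomial sampling)
        rain_days = sum(rng.random() < rain_prob for _ in range(base_days))
        finish = base_days + rain_days * delay_per_rain_day
        if finish > penalty_threshold:
            over += 1
    return over / trials

# Usage: estimated chance the job triggers the late-completion penalty
print(weather_delay_risk())
```

The same skeleton extends naturally to the mitigation question: rerun the simulation with a reduced delay-per-rain-day (representing a mitigation step) and compare the two penalty probabilities.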
Palisade Training Team
This week Palisade sponsored the first SMI-organized conference on Financial Modelling in the Oil and Gas Industry, held in London (UK). Palisade’s Michael Rees spoke on the use of @RISK, PrecisionTree, Evolver, and RISKOptimizer within the industry. The talk included examples of using the software to conduct reserves estimation, to model exponential decline for production forecasting, to model prices, costs, and investments, and to generate an integrated risk-based decision evaluation process. Other examples included using the software to help make decisions concerning exploration and production, to implement real options valuation, and to optimize production schedules using Evolver’s and RISKOptimizer’s genetic algorithm optimization capabilities.
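For readers unfamiliar with the exponential decline model mentioned above, the standard (Arps) exponential decline formulas are easy to sketch. This is textbook background, not material from the conference talk itself; units are whatever is consistent (e.g. rate in barrels/day with the decline rate per day).

```python
import math

def decline_rate(qi, D, t):
    """Arps exponential decline: production rate q(t) = qi * exp(-D * t),
    where qi is the initial rate and D the nominal decline rate."""
    return qi * math.exp(-D * t)

def cumulative_production(qi, D, t):
    """Closed-form cumulative production: Np(t) = (qi - q(t)) / D.
    As t grows, this approaches the recoverable volume qi / D."""
    return (qi - decline_rate(qi, D, t)) / D
```

In a risk-based workflow of the kind described in the talk, qi and D would themselves be uncertain inputs (distributions rather than point values), making the forecast production profile a natural target for Monte Carlo simulation.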
Palisade recently also offered the first European regional training seminar dedicated solely to Oil and Gas applications. This was held in Oslo, Norway in late October. Due to the success of this event, others are likely to be held in 2009 – watch this space!
DMUU Training Team
Over the past year or so I’ve become aware of how terms from quantitative analysis are filtering into everyday conversation. This month a juicy tidbit from television made it clear that tech terms are becoming household words. It’s a scene from “The Office”: Angela is caught up in an off-beat love triangle with Dwight and Andy. Does she take a traditional pull-the-petals-from-a-daisy approach to the messy business of love and commitment?
Nope. Ignoring a deadline from Dwight and under pressure from Andy, Angela clears up this irrational scenario by taking a rational approach. We find her making a decision tree.
She is weighing the pros and cons of both men, a kind of romantic risk assessment of the heart. I found her pad and pencil a little too dated to put much faith in her analysis. But she could easily drop that pencil and click open her Microsoft Excel statistics worksheet and begin a risk analysis model, complete with option valuation and value-at-risk, before moving on to genetic algorithm optimization. Clearly the most rational method of choosing a man. Wouldn’t you love to see the probability functions she assigned?
The finance techie terms I just cited haven’t made it into everyday lingo yet, but stay tuned . . . for the next episode of romance analysis.
One of the typical complaints I have heard over the years in the Six Sigma community is that transferring data between Minitab and other programs is a hassle. Sure, you can cut and paste from one program to another, but that becomes repetitive and is a potential source of error.
Palisade Corporation has been working over the past few months with SixSigmaIn Team, a very competent and respected consulting and training company in Italy. Besides their competency in training and consulting, they have created a very useful software application called MTBridge, which automates the transfer of information between @RISK, MS Excel, and Minitab. This allows you to focus on your project, not on the mechanics of getting the data from Minitab to @RISK where you need it.
“Let’s play together for winning” is a creative, entertaining, and informative video that demonstrates MTBridge’s capability. The creative part of the presentation uses Microsoft’s Text to Speech technology in conjunction with MTBridge to provide the voices for Minitab, Excel, and @RISK. Enjoy!