Schedules Announced for Risk Conferences in Copenhagen, Amsterdam, and Dubai

Presenters include Deloitte Risk Services, Royal HaskoningDHV, Lego Group, DSM, Saudi Aramco, and Kuwait Petroleum

In their ninth year, the Palisade Risk Conferences have become “must attend” events for quantitative risk and decision analysis professionals. The schedules are set for Copenhagen, Amsterdam, and Dubai. Each day promises a full slate of intensive software training by Palisade trainers, in-depth looks at risk modelling from industry experts, networking opportunities, and the chance to review your own model with Palisade consultants.

Copenhagen, 31st March
Amsterdam, 3rd April
Dubai, 8th April



Other 2014 dates include Bogotá, Madrid, Perth, Brisbane, London, Frankfurt, Johannesburg, and Melbourne, with more dates to be announced soon.


Free Webcast this Thursday: "Using @RISK in Evaluating Full (late stage) Compound Development in the Pharmaceutical Industry"

Register now for a free webcast to be presented by Venkat Raman, ACA, MBA, Managing Principal of VR Advisors LLC.

"Using @RISK in Evaluating Full (late stage) Compound Development in the Pharmaceutical Industry" will discuss the financial evaluation of late stage development of compound in the pharmaceutical industry, but the rational, methodology and analysis discussed here has universal applicability to any multi-stage product development activities across industries. We will not discuss the real options analysis here, as this is not the intended objective of this presentation. But the webcast will:

  • Present the case - a miniature model of a full-blown real-world case
  • Discuss the two financial models – the deterministic and the probabilistic models
  • Frame the case
  • Model the case
  • Discuss insights and results

JOIN US THIS THURSDAY – January 30, 2014 – 11:00am EST

» Register Now: "Using @RISK in Evaluating Full (late stage) Compound Development in the Pharmaceutical Industry"


Venkat Raman is a management consultant with over 25 years of extensive global experience in strategy and corporate finance across large and small enterprises. He began his career with the Big 6 and over the years has held leadership positions in management consulting, insurance, technology services, and entrepreneurial ventures. Venkat brings his collective experience, wisdom, and judgment to every engagement. He holds an MBA from Indiana University, is a qualified CPA, and is a Chartered Accountant. Extended bio here.


Minimising Financial Risk in Infrastructure PPP Projects

Recent practice in Public-Private Partnership (PPP) transport projects has seen participating governments and public agencies gradually moving from demand-based contracts to availability-based ones. Under the latter agreements, the public partner bears the financial implications of actual demand running over or under forecast, while risks associated with construction and service availability are transferred to the private partner fulfilling the contract.

Civil engineering consultancy Solvĕre has developed a methodology that enables the partners in PPP infrastructure projects to minimise their financial risks by accounting for each element of the project that can affect its financial status, and therefore its profitability. To do this, Solvĕre uses @RISK to estimate performance and to forecast the potential deductions in the payment mechanism for each project. (The ‘payment mechanism’ determines how much, and at what intervals, the government pays the contractor. It is tied to the quality of the service provided by the private partner, whose revenue therefore depends on its performance score and the incentive or penalty rules of the contract.)

The key objective is to quantify, for various levels of probability, the economic impact of the performance criteria not being met. Solvĕre’s @RISK model takes into account the contract specifications and the resources committed by the operator to complete the project and maintain the infrastructure, combining them to evaluate the expected level of performance in this base scenario.
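To make that concrete, here is a minimal sketch in Python (not Solvĕre's actual model – the contract parameters and penalty rule are invented) of how simulated performance scores flow through a payment mechanism to a distribution of annual revenue:

```python
import random

# Hypothetical contract parameters -- illustrative only, not Solvere's model
BASE_PAYMENT = 10_000_000   # annual availability payment
TARGET_SCORE = 0.95         # score at or above which no deduction applies
MAX_DEDUCTION = 0.30        # cap on deductions as a share of the payment

def simulate_year():
    """Draw one year's performance score and apply the penalty rule."""
    score = random.betavariate(20, 2)                 # skewed toward high performance
    shortfall = max(0.0, TARGET_SCORE - score)
    deduction = min(MAX_DEDUCTION, 2.0 * shortfall)   # 2% of payment lost per 1% shortfall
    return BASE_PAYMENT * (1 - deduction)

random.seed(7)
revenues = sorted(simulate_year() for _ in range(10_000))
print(f"P10 revenue: {revenues[1_000]:12,.0f}")
print(f"P50 revenue: {revenues[5_000]:12,.0f}")
print(f"P90 revenue: {revenues[9_000]:12,.0f}")
```

The output percentiles are exactly the kind of "economic impact at various levels of probability" figures such a base-scenario model produces.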



Solvĕre believes that the highly-complex nature of PPP contracts, coupled with payment mechanisms being subject to a significant degree of uncertainty, requires in-depth analysis in terms of probability and risk. Using @RISK, it has developed a way to do this, thereby enabling informed decisions regarding the feasibility of the project and a proper risk allocation between the partners.

» Read more about Solvĕre's use of @RISK

 


Finding the Weakest Link: Joseph Yacura Discusses Supply Chain Risk Management

Businesses around the world rely on supply chains to create, manage and distribute goods and services. Most of these supply chains reach beyond local borders and operate in a globalized environment that prioritizes cost-competitive production. To hit these cost targets, companies have taken all unnecessary capacity and inventory out of the supply chain to maximize revenue. As a result, if anything goes wrong in just one link of these chains, entire businesses can stall. For most companies, these supply chains are at considerable risk of getting bogged down by threats such as natural disasters, criminal activities, and cyber-attacks.
 
Joseph Yacura, a supply chain expert and consultant at ISG-ONE, recently discussed how to manage potential risks facing supply chain operations in a video report released last month (see video below). According to Yacura, supply chains and supply chain risk management are becoming “a critical function of almost every company in the world.” 
 
Yacura explained that in certain regulated industries, such as financial services, insurance and healthcare, new regulatory requirements have been implemented. These regulations require that suppliers be monitored and managed; failure to do so exposes the regulated business to significant financial penalties. This is an added risk for affected companies to manage.
 
According to the video, supply chain disruptions have become much more frequent, and their severity has increased “dramatically”.
 
To keep these increasingly complex and sophisticated supply chains secure moving forward, Yacura stated that companies must prioritize supply chain risk management, and recommended companies have a supply chain risk officer in place to deal with these issues.
 
Currently, the best most companies can do is monitor and react to supply chain disruptions that have already occurred. In the future, leading companies will use predictive modeling to anticipate risks and disruptions within their supply chains. Risk associations shared between suppliers will also be identified to mitigate future supply chain disruptions.
 
 
 
 

Guarding against the risk of ‘unbankable’ community projects in South Africa

The legacy of apartheid in South Africa has left much of the country without basic services such as housing, water and electricity. The government initiatives in place to tackle the issue do not have enough resources to meet the scale of the need, so commercial finance programmes will play a key role in delivering these services.  But poor planning without enough information makes it difficult to recoup the costs of a project, thereby making it unattractive to potential commercial financiers – in other words, ‘unbankable’.  

For example, an engineer might design a high-specification water system.  However, the focus on design may make it over-complex and therefore expensive – with the end result that it does not meet the needs of the community, the government or the financing organisation.

Bigen Africa, a consulting company that describes itself as a ‘development activist’, tackles this issue with risk analysis.  It uses @RISK as a tool to enable it to identify, manage and mitigate the risk associated with each project and ensure it attracts funding and is successful.

@RISK risk modelling software helps Bigen Africa understand and demonstrate that it is the number and type of houses that drive the demand for services – where this demand is, what it will be in the future, and who will use the services.  This analysis forms the basis of engineering and planning, the financial risk analysis model, the revenue model and strategy, affordability analysis, and the integration between services (housing, roads, solid waste, water, sanitation, electricity, etc.).

@RISK provides the level of detail that banks require before making a decision on whether to finance a project.  At the same time, the methodology benefits from simplicity and is easily understood by the wider audience involved in the project but not necessarily familiar with the specific concept of demand risk.

» Case Study: Bigen Africa uses @RISK to encourage funding for basic community services in South Africa


Free Webcast this Thursday: “Modeling Behavior Using @RISK and PrecisionTree: Why probability estimates aren’t always what they seem”

Join us this Thursday, October 6, 2011, for a free live webcast entitled, "Modeling Behavior Using @RISK and PrecisionTree: Why probability estimates aren’t always what they seem," to be presented by Christopher Brand.

Over the past three decades, behavioral economists and psychologists have gathered a significant amount of evidence suggesting that most people find it surprisingly difficult to make accurate judgments about probabilities. This is a cause for concern in real-world decisions, which normally involve at least some degree of risk and uncertainty. Even more troubling are the consequences for long-term decisions, in which it is necessary to account for multiple possible sequences of events; if each event includes an erroneous prior probability judgment, this will have the effect of multiplying the overall error in the decision.

This free live webcast will discuss some examples of probabilistic decision making where intuitions and judgments are regularly incorrect – such as the Monty Hall problem, the base rate fallacy, and the conjunction fallacy – and demonstrate these cases via implementation in @RISK risk analysis software and PrecisionTree decision tree modeling software. The presentation will also explain how models of behavior during decision making can be developed using Palisade software.
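As a flavour of how simulation settles these arguments, here is a quick Monty Hall sketch in plain Python (my illustration, not the presenter's @RISK/PrecisionTree implementation). Switching wins about two-thirds of the time, against most people's 50/50 intuition:

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one game; return True if the player wins the prize."""
    prize = random.randrange(3)
    choice = random.randrange(3)
    # The host opens a door that is neither the player's choice nor the prize.
    # (When two doors qualify, picking either leaves the win rates unchanged.)
    opened = next(d for d in range(3) if d not in (choice, prize))
    if switch:
        choice = next(d for d in range(3) if d not in (choice, opened))
    return choice == prize

N = 100_000
print(f"Stay:   {sum(monty_hall_trial(False) for _ in range(N)) / N:.3f}")  # ~0.333
print(f"Switch: {sum(monty_hall_trial(True) for _ in range(N)) / N:.3f}")   # ~0.667
```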

Chris Brand is an Associate at Captum Capital Limited, where he provides consulting and training services to early stage life science companies in the behavioral aspects of business development. He is also a PhD student at Birkbeck, University of London, where he researches the psychology of decision making. Chris holds an MSc in Cognitive and Decision Sciences from University College London, an MA in Philosophy from the University of York, and a BSc in Philosophy and Psychology from Keele University.

» Register now (FREE)
» View archived webcasts


Unknown Unknowns – Probabilistic Modeling in Projects

Yuri Raydugin, P.Eng, MBA, PhD, Principal Consultant at Risk Services & Solutions, Inc., points out that project management practitioners recognize the need to consider “unknown unknowns” in project risk management.  However, clear and consistent recommendations on incorporating these uncertainties into risk models have yet to be proposed.

In a recent white paper, Dr. Raydugin outlines a useful thinking process to address this problem and comes up with practical recipes on handling unknown unknowns using Monte Carlo simulation.  Four dimensions of unknown unknowns are discussed: the novelty of a project, the phase of project development, the type of industry, and bias. A discussion on unknown unknowns vs. corporate risks is provided. Practical recommendations on including unknown unknowns into probabilistic cost and schedule risk models are put forward.

» Read the full paper

Customised Solutions Using @RISK and VBA for Excel

If you missed Palisade trainer Rishi Prabhakar's webcast "Customised Solutions Using @RISK and VBA for Excel," you can still view it in our archive.

The hour-long presentation explores the use of VBA for Microsoft Excel to control @RISK functionality, simplifying the process of risk analysis for resource-strapped businesses. Rishi explains the advantages (and limitations) of macro control for modelling and running simulations.

Simple examples are worked through to show the XDK (@RISK’s automation library) in action, from generic examples to a cost estimation model. These cover elements of model construction, various simulation settings, and finally reporting. The emphasis is on exposing the viewer to the possibilities the XDK offers rather than on an in-depth VBA for Excel coding session.
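The XDK itself is driven from VBA, so the following is only a language-neutral sketch in Python of the pattern the webcast demonstrates – define a model, fix the simulation settings, run, then report – with invented distributions; it does not call the @RISK API:

```python
import random
import statistics

def cost_model():
    """A stand-in cost-estimation model: two uncertain line items."""
    labour = random.triangular(80, 150, 100)    # low, high, mode
    materials = random.gauss(mu=200, sigma=25)
    return labour + materials

# The "simulation settings" a macro would normally set through the API
N_ITERATIONS = 5_000
random.seed(42)   # fixed seed so repeated runs give identical reports

results = sorted(cost_model() for _ in range(N_ITERATIONS))

# The "reporting" step: figures a macro might write back to a worksheet
print(f"Mean cost:       {statistics.mean(results):7.1f}")
print(f"Std deviation:   {statistics.stdev(results):7.1f}")
print(f"95th percentile: {results[int(0.95 * N_ITERATIONS)]:7.1f}")
```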

Rishi Prabhakar holds a BSc in Mathematics from the University of Technology, Sydney, Australia. He has experience in the resources, infrastructure and primary industries, telecommunications, scientific research, and banking and finance, with an emphasis on operational risk.

With technical skills in the areas of modelling, simulation, statistical analysis, cost estimation, time series forecasting, customised solutions utilising VBA for Excel, and extreme value theory, Rishi has provided training and consulting services in risk and decision analysis for Palisade’s Asia Pacific office since 2005.


» Customised Solutions Using @RISK and VBA for Excel
» Webcast archive

Oops! Didn’t see that coming!

We are pleased to introduce you to consultant and trainer David Roy, our first guest blogger. Dave comes to us from Six Sigma Professionals, Inc. (SSPI), and taught Jack Welch and his entire staff their Six Sigma Green Belt training. David’s post will be the first in a series, and this initial entry also has a quick survey at the end for your input on structuring DFSS training.

--Steve Hunt

How often do we hear these words after we have made a change to a product, service or process?

 

We frequently solve one problem only to discover a new one, or find that the solution we selected didn’t really resolve the original problem.

 

There are many reasons for these surprises. Problem solving sometimes addresses the symptoms and not the root cause, and useful solutions often have harmful side effects that we did not consider.

 

You may now be thinking: “Wow, if everything we do is going to turn out badly, let’s not change anything.”   The reality is that change is inevitable. Whether driven by rising customer expectations, innovative new technologies, or even variation in inputs over time, change will occur.

 

Managing the design and implementation of these changes requires a more formal methodology than the prevalent “Launch and Learn” method.

 

The sophistication of the methodology will vary with the magnitude of the risks associated with the change. If we are problem solving for variation in a standard process and trying to regain control, simple tools such as Cause and Effect diagrams, Failure Mode Effects Analysis, and Standard Work may be all that is required.

 

When we start to explore reducing variation or introducing new technologies or processes, we need to bring in a Design For Six Sigma (DFSS) methodology, which incorporates elements such as Change Management, Robust Design, Reliability, Modeling & Simulation, and Piloting & Prototyping.

 

Over the next four posts we will cover the four phases of a DFSS project under the framework of I-dentify, C-onceptualize, O-ptimize, and V-erify – ICOV for short. We will give a high-level look at the steps within these phases and the tools used to reduce the risk of the change and unintended consequences.

 

On another note, if you are able, we’d like to ask for your guidance by completing a short marketing survey to help SSPI structure our training in a way that is most useful to our community. This eight-question survey should take less than five minutes, and it is anonymous. Your opinions are greatly appreciated.

http://www.surveymonkey.com/s.aspx?sm=2aQk8QF1eLB5MFQJC1pUXA_3d_3d

 

BIO:

 

David Roy is an integral part of the Six Sigma community. He taught GE’s Jack Welch and his entire staff Six Sigma, and served as Senior Vice President of Textron Six Sigma. He is a Certified GE Master Black Belt, was instrumental in developing GE’s DMADV (DFSS) methodology, and has taught three waves of DFSS Black Belts. Dave’s experience spans both product and transactional work, so his examples are of interest to all. David holds a BS in Mechanical Engineering from the University of New Hampshire and is co-author of “Service Design for Six Sigma: A Roadmap for Excellence”.

 


Profitability Projections in a Manufacturing Environment of High Uncertainty

The other night, I had the opportunity to watch a free webcast titled “Use of @RISK for Probabilistic Decision Analysis of a Manufacturing Forecast in an Environment of High Uncertainty”. This presentation was extremely timely, since many companies are struggling to survive in these challenging economic times. Dr. Jose Briones did an excellent job discussing and illustrating how profitability projections in a manufacturing environment are directly tied to how the sales forecast fits with the capability of the operation, and how different manufacturing capacities and production rates impact the output of the plant and the allocation of the fixed cost of production.

In the example he presents, a company is trying to decide how best to balance the sales of certain families of products to maximize revenue, maintain a diverse product line, and properly price each individual product based on the impact to the manufacturing schedule and fixed cost allocation.

He spends an appropriate amount of time discussing input distributions such as the Triangular, Normal, PERT and Gamma distributions, as well as sharing his recommendations on when to use each. He also shares his expertise on fixed cost allocation by product: he warns of the dangers of the common method of dividing the fixed cost by the total production, and recommends instead allocating fixed costs based on the projected run time of each product family. Lastly, he spends some time discussing the interpretation of the results, which I feel does a great job wrapping up the information presented in the webcast.
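To see why the allocation method matters, here is a toy Python illustration with invented numbers (my sketch, not Dr. Briones' model): dividing fixed cost by total production charges every unit the same, while allocating by projected run time shifts cost onto slow-running families.

```python
# Invented figures: one plant, two product families
FIXED_COST = 1_200_000   # annual fixed cost of the plant

families = {
    "A": {"units": 50_000, "hours_per_unit": 0.010},  # fast-running commodity line
    "B": {"units": 10_000, "hours_per_unit": 0.120},  # slow specialty line
}

total_units = sum(f["units"] for f in families.values())
total_hours = sum(f["units"] * f["hours_per_unit"] for f in families.values())

for name, f in families.items():
    per_unit_by_volume = FIXED_COST / total_units
    family_hours = f["units"] * f["hours_per_unit"]
    per_unit_by_time = FIXED_COST * (family_hours / total_hours) / f["units"]
    print(f"{name}: fixed cost/unit by volume = {per_unit_by_volume:6.2f}, "
          f"by run time = {per_unit_by_time:6.2f}")
```

In this invented example, family B ties up most of the line time, so allocating by run time roughly quadruples its per-unit fixed cost while family A's falls by about two-thirds – exactly the pricing signal the volume-based method hides.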
 

Dr. Jose A. Briones is currently the Director of Operations for SpyroTek Performance Solutions, a diversified supplier of specialty materials, BPM software and innovation consulting services. Dr. Briones holds a PhD in Chemical Engineering from Clemson University and is a graduate of the Business Administration Program of the Wharton Business School. If you have any questions about the webcast, you can contact Jose at Brioneja@SpyroTek.com or through Jameson Romeo-Hall at Palisade Corporation.
 

 


Goldilocks Had It Easy

Ed Biernat of Consulting with Impact has been in touch to respond to my recent question about analysis paralysis: how do you know when you've done enough decision analysis – no more and no less than will benefit you?
 
Here's Ed's take on the issue:  "Goldilocks had it easy.  She eventually got it right on the third try. This issue is one that we wrestle with in Lean Six Sigma overall, because it is easy to become enamored with the analysis of data.  Analysis paralysis kills the speed of an implementation and must be vanquished at all costs.  Inertia is the biggest foe we face in implementing Lean Six Sigma.  It was one of the big problems with the old model of embedding statisticians in businesses (and why it is now hard to find a pure statistician in anything but actuarial endeavors). What the issue really comes down to is the basic question: What problem are you solving?
 
Golf makes a quick analogy.  Let’s take the greatest 7-iron player in the world.  This person can play the 7-iron like nobody’s business.  In fact, they use it more than any other club in their bag, and the crowd really appreciates this virtuoso of the 7-iron.  But what is the purpose of the game – to use the 7-iron, or to get the lowest score on the course?  For risk-analysis geniuses, we can substitute the risk analysis tool for the 7-iron.  It is a great tool, a powerful tool – but only if it helps us solve the problem we are facing.  And that problem is probably not to build the world’s best model.
 
If you have addressed the question that you started with when you built the model, then you have done enough analysis.  In our consultancy, our bias is to get close and move forward unless we are dealing with a mission-critical decision. We fully admit that we are not modeling experts, and we are OK with that. That is not why our clients engage our services.  We solve problems and help them to change their culture.   Modeling helps with that by getting the team familiar with issues and sensitivities before we do a full deployment.  Once they can see the impact of this variation and their assumptions, and once they have a framework for going forward, we put the model away because it's done its job."

Thanks, Ed, for giving this some thought!
 
 

Data Issues Part 1

In a recent public training workshop (for @RISK for Excel) I was reminded of an unusual fact regarding data.

Commonly, @RISK for Excel is used to fit distributions to historical data for use in risk modelling – and it sure beats wildly guessing obscure parameters. However, there is (naturally) a litany of woe-inducing problems with any historical data set: non-stationary data series, extreme values and outliers, data recording errors, seasonality, and heteroskedasticity, to name a few. Excessive ‘cleansing’ of the data set is commonly prescribed, but the statistician in me cringes to even type those words! Quality control and transforming the data will help to eliminate most of those problems – but what about outliers?

In the early Noughties I was working for a large Australian bank, forecasting their daily call centre volumes for the purposes of planning staff levels and predicting service levels. A particular call centre averaged 30,000 calls per weekday. Yet on September 12th, 2001, calls dropped to fewer than 10,000. Along with the rest of the world, Australians were watching the terrorist attacks on television and the internet rather than calling to fix spelling mistakes in their contact details or transfer small sums of money between accounts. But what to do with that data point? Presuming the forecasting model is not intended to include such extreme events as terrorist attacks, the point could simply be filtered out of the data set and not thought of again.

But now consider a process that should include rarer events, such as flood damage or operational risk, as one of the risks in a model. Suppose you have 10 years of good data, but the set includes an event that should occur only once every 100 years. That event is drastically overrepresented in the data – it appears at ten times its true frequency – and any fitted distribution will be biased toward such extremes. Yet the data point cannot be completely ignored: such values can occur, and the simulation models must have the capacity to sample them (though with a reasonable likelihood). This is where the artistry of fitting distributions to data comes to the fore. The data point could be removed from the set, but not from our decision-making process.

From the range of distributions that can be selected, the optimal choice should not only represent the remaining data well but also have a tail that samples events in the vicinity of those excluded from the analysis, with reasonable probability. No, that’s not always easy to do. But as with many elements of probabilistic modelling, it simply must be done in order to provide useful information to decision makers.
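Here is a minimal sketch of that idea in Python with synthetic data (an illustration, not a prescription): the extreme point is excluded from the fit, and the candidate fits are then compared on how much probability they leave in the tail near it.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic "10-year" loss history: routine losses plus one extreme event
# that a short record drastically overrepresents.
routine = rng.lognormal(mean=1.0, sigma=0.5, size=119)
data = np.append(routine, 60.0)   # the 1-in-100-year observation

# Remove the extreme point from the fit -- but not from the decision process.
fit_data = data[data < 50]
extreme = 60.0

for dist in (stats.lognorm, stats.gamma):
    params = dist.fit(fit_data, floc=0)
    print(f"{dist.name:8s} P(X >= {extreme:.0f}) = {dist.sf(extreme, *params):.2e}")
```

A candidate whose fitted tail gives the excluded event a small but plausible probability is preferable to one that makes it effectively impossible.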

Thus the context of the modelling goes a long way toward determining the most appropriate steps to take with your data set. If that sounds like a subjective guideline, you read it correctly. Not enough people realise just how important experience and intuition are in the seemingly prescriptive fields of mathematics and statistics. Fitting distributions to data is no different.

And yet that isn’t the unusual fact I was reminded of in the workshop! But I’ll leave that for Part 2 of my Data Issues blog.

Rishi Prabhakar
Trainer/Consultant

Adopting a healthy approach to risk

Having talked in previous posts about why it’s important – and how accessible it is – for any size of organisation to adopt a healthy approach to risk, I’ll now take you through my top ten tips on how to maximise your risk management programme:

1. Get buy-in
Risk management is not an optional extra. It is a business-critical tool, an asset, and an integral part of the project. The company culture must be developed to embrace QRM (quantitative risk management) and DMU (decision making under uncertainty) so that everyone understands their benefits and therefore accepts the need for them.

2. Get budget
Business tools cost money, but managing risk is an investment – not an overhead – and must be regarded as such. Allocating resources and making it a formal business process should be seen as an insurance policy.  Not only will it help organisations make better decisions that will save them money in the long term but, by identifying potential risks and adverse events, it can protect them against unexpected costs in the future.

3. Get words
As with any organisational change, it is essential that everyone is clear on the new processes. Therefore a common risk language – or 'glossary' – needs to be developed to avoid misunderstanding and to ensure a consistent approach to QRM and DMU.

4. Get numbers
Qualitative assessment is essential, but numbers are more powerful – for example, the percentage chance of meeting a deadline or budget. Monte Carlo simulation uses random sampling to provide the margin of error for a venture and is a good way to illustrate the consequences of different courses of action (a minimal sketch of such a calculation follows this list). Risk management experts must ensure everyone understands these figures, and accepts them.

5. Get structure
Managing risk in order to make better-informed decisions requires an appropriate organisational structure. Individuals and groups need clearly defined roles, and must then each take responsibility for their own area of expertise.

6. Get lateral
Every organisation has risks that it deals with on a daily basis and which must therefore be factored in to the decision-making model. However, no enterprise operates in isolation, so other external variables must be included. For example, even a small rise in fuel costs could have a major effect on revenues if raw materials need transporting long distances.

7. Get perspective
Political, cultural and social risk factors can be explored by involving all stakeholders.  Investing time and money in consultation and research ensures that businesses have a clear idea of the complete environment in which they operate, and therefore minimise the chances of products and services failing.

8. Get reporting
Risks, and the management of them, must be reviewed regularly – and the programme amended if necessary. This requires a regular reporting process, in which risks are clearly identified and prioritised.

9. Get with it
Being risk aware does not mean being risk averse. Businesses should guard against rigidly adhering to a 'the way we've always done it' approach, instead keeping up to date, learning new tricks and not being afraid to be bold.  Although risky on the surface, these tactics prevent being left behind – and much of the potential uncertainty can be removed with QRM and DMU.

And finally…

10. Get it documented
Back up the commitment to a thorough QRM and DMU programme with documentation. This validates the budget and buy-in requested at the start. And it’s good for business – organisations this thorough gain a real competitive edge.
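As promised in tip 4, here is a minimal sketch of the kind of number Monte Carlo simulation delivers – the simulated chance of meeting a budget – written in Python with invented three-point task estimates:

```python
import random

BUDGET = 1_000
tasks = [            # (low, most likely, high) cost estimates -- invented
    (100, 150, 250),
    (200, 300, 500),
    (300, 400, 600),
]

N = 20_000
random.seed(3)
under_budget = sum(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks) <= BUDGET
    for _ in range(N)
)
print(f"Chance of meeting the {BUDGET} budget: {under_budget / N:.1%}")
```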

Craig Ferri
EMEA Managing Director of Risk & Decision Analysis

Targeted Analyses and Compelling Communication: A Formula for Successful Value Creation in Management Science

Michael A. Kubica is Founder and President of Applied Quantitative Sciences, Inc. He has over 18 years' experience within the healthcare industry and has been providing quantitative sciences consultancy since 1999. Michael has extensive experience in providing quantitative decision support solutions for leading pharmaceutical, medical device/diagnostics, and biotechnology companies, addressing a wide range of business issues. Prior to establishing AQS, Michael held the position of Vice President, Operations for Magellan Health Services. During his career he has also held the positions of Director of Quality Management, Regional Director of Business Operations & Finance, and Hospital Administrator. Throughout his career, Michael has employed sophisticated quantitative methods to forecast performance, streamline operations, and improve quality. Michael holds an MBA and a Master of Science in Psychology, and serves as Adjunct Professor of Research Design and Statistical Analysis at St. Thomas University in Miami, FL.

Applied Quantitative Sciences, Inc. (AQS) is a consultancy specializing in helping medical device, pharmaceutical and biotechnology companies make decisions under conditions of complexity and uncertainty. AQS is a market leader in providing simulation and optimization models used by industry leaders for forecasting, new technology valuation, business and strategic planning, supply chain management, and resource planning.

Mr. Kubica will present a case study later this week at the 2009 Palisade Conference: Risk Analysis, Applications, & Training, 21 - 22 October at the Hyatt Regency in Jersey City (10 minutes by PATH from Manhattan's Financial District).

See the abstract for his case study below, and see the full schedule for the Conference here.

Targeted Analyses and Compelling Communication: A Formula for Successful Value Creation in Management Science

The value of quantitative science projects too often remains unrealized for would-be consumers. Despite flawless analyses, sophisticated reports and dazzling presentations, the message goes unheeded by those who could most benefit – if only they understood how to operationalize the results. The clarity with which quantitative scientists view the practical application of results is often paralleled only by their inability to generate that same clarity in their customers. The result is that good management science is at best ignored and, at worst, misunderstood (and misapplied). This workshop describes steps we as quantitative scientists can take to foster understanding, generate novel insights and stimulate actionable results with our clients.

This Week: October 21-22 in NYC

Building on the success of last year’s record-breaking event, the conference will offer a wide range of software training, model building, and real-world case study sessions. Last year, the event drew over 150 practitioners and decision-makers from a broad spectrum of industries. The @RISK and DecisionTools software tracks were more popular than ever. This year, we’re expanding software training with sessions that let you walk through examples and try the tools directly. This will enable you to take some new tips back to the office. Please join us in October for a great opportunity to learn and connect with colleagues.

Simulating the U.S. Economy: Where will we be in 100 years?

William Strauss is the President and founder of FutureMetrics. He brings more than thirty years of strategic planning, project management, data analysis, and modeling experience into the company’s stock of knowledge capital. Bill’s professional history includes executive positions as director, president, and senior vice president, as well as positions as senior analyst and field coordinator. He has an MBA (specializing in Finance) and a PhD (Economics).

Dr. Strauss will present a case study at the 2009 Palisade Conference: Risk Analysis, Applications, & Training. The conference is set to take place on 21 - 22 October at the Hyatt Regency in Jersey City, 10 minutes by PATH from Manhattan's Financial District.

See the abstract for his case study below, and see the full schedule for the Conference here.

Simulating the U.S. Economy:
Where will we be in 100 years?


There is an assumption that drives all of our expectations for how our economy will be in the future: that of endless economic growth. Clearly endless exponential growth is impossible, yet that is what we base all of our expectations upon. We all agree that zero or negative economic growth is bad (just look around now at the effects of the Great Recession). But we also know logically that 2% or 4% annual growth every year leads to an exponential growth outcome that is unsustainable. 

To see where this growth imperative will take us, we first have to see how we got to where we are today. This work first models the 20th century. The model is both complex and simple: the basic schematic of its relationships is easy to understand, and at its core is a simple production function that combines capital, labor, and the useful work derived from energy to generate the output of the economy. The complexity is contained in the solutions to the internal workings of the model. What is unique is that there are no exogenous economic variables. Once the equations’ parameters are calibrated, setting the key outputs to "one" in 1900 results in time paths that very closely predict U.S. GDP and its key components from 1900 to 2006. 

The experiment in this work is about the future. If the model can very closely replicate the last 100 years, what does it have to say about the next 100 years? From 1900 to 2006 there are periods in which there was parameter switching. (The optimal parameters and the years for the switching were found using a constrained optimization technique.) That suggests that in the future there will also be changes. The experiment uses @RISK’s features to generate new combinations of parameters for each of tens of thousands of runs of the simulation. Changes in the parameters represent potential exogenous policy choices.

The "doing what you did gets you what you got" scenario leads to a surprising and unsettling outcome. The experiments using @RISK do find a path that works. Obviously if it is not "business-as-usual" that leads to a stable outcome, it is some other way. The policy choices that lead to a stable outcome suggest that the future of capitalism is not going to be what we expect it to be.

Please join us in October in New York for software training in best practices in quantitative risk analysis and decision making under uncertainty, real-world case studies from risk services consultants and experts, and networking with practitioners from many different fields including oil and gas, pharmaceuticals, academia, finance, Six Sigma, and more.


Palisade’s Custom Development Services

Palisade Corporation now offers custom development services. Our consulting team can help you to automate your risk and decision analysis models so they can be easily used by everyone in your company, or even outside of it. 

We offer different options that include Excel add-ins and Windows- and Web-based applications. Our consultants can help you design, program and deploy these applications. A typical application might connect an Excel spreadsheet to your company’s database, extract data, then fit probability distributions to it so they can be used in dynamic risk or optimization models. The structure of reports can also be customized, and reports can be published as PDFs or to the Web.
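Schematically, such an application amounts to the pipeline below, sketched here in Python with an in-memory database and invented names (a real deployment would be an Excel add-in or Windows/Web application using @RISK's fitting and simulation engine):

```python
import random
import sqlite3
import statistics

# Stand-in company database: table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_history (monthly_demand REAL)")
conn.executemany("INSERT INTO sales_history VALUES (?)",
                 [(random.gauss(1_000, 150),) for _ in range(36)])

# Extract: pull the historical data out of the database.
history = [row[0] for row in conn.execute("SELECT monthly_demand FROM sales_history")]

# Fit: a crude normal approximation stands in for proper distribution fitting.
mu, sigma = statistics.mean(history), statistics.stdev(history)

# Simulate: propagate the fitted uncertainty through a simple profit model.
UNIT_MARGIN = 12.50
profits = [max(0.0, random.gauss(mu, sigma)) * UNIT_MARGIN for _ in range(10_000)]
print(f"Mean simulated monthly profit: {statistics.mean(profits):,.0f}")
```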

Palisade Custom Development can incorporate Monte Carlo simulation, probability distributions, distribution fitting, graphs, reports, and many other features of @RISK into any Windows-based application. In addition, we can integrate genetic algorithm optimization from RISKOptimizer or Evolver. This allows you to apply powerful, proven analytics to applications outside Excel. Applications can be run in a desktop, network, or Web environment.

You may wish to customize your @RISK or DecisionTools Suite spreadsheet models, restricting access to model components for some users or automating reports and other aspects of your analysis. Using the DecisionTools built-in Excel Developer Kit (XDK) and custom Excel VBA programming, Palisade can help you build powerful, easy-to-use risk models for one user or for an entire work group.

We are currently working on a new website where you will find more information and project samples.  Upcoming posts will discuss examples of custom Excel VBA programming.

» More about Palisade Custom Development

Dr. Javier Ordóñez
Director of Custom Development

Expert Advice is Needed in Tough Times

When the economy takes a turn for the worse, tough times call for cutbacks. Cutbacks might include extras you don’t really need, goods that haven’t yet outlived their usefulness, or services you can perform yourself. Just because you can – and perhaps should – do without some of the ‘luxuries’ does not mean everything should be cut from your budget. Expert advice is one of those areas – and it doesn’t have to break the bank.

Weighing important decisions demands advanced analytics and informed insights to uncover the value of one choice over another. With quantitative analysis, you can gain valuable insights into underlying risks and their implications, and then formulate effective responses to those risks. Statistical analysis in Microsoft Excel is useful, but without the ability to account for risk and uncertainty, a static model adds limited value.  George Box’s adage – “All models are wrong. Some models are useful” – is an apt perspective on static analysis. An essential extension to the basic Excel model is quantitative risk analysis.

Palisade’s training and consulting services are cost effective ways to extract more from your risk analysis software. In short order you can gain better control over your simulation models through training, or you can work with the experts who developed these tools to aid you in efficient modeling of your given situation. Bespoke models provide you with information relevant to your organization. Add efficient application of @RISK (analysis with Monte Carlo simulation) and RISKOptimizer (Monte Carlo simulation with optimization for decision making under uncertainty) or PrecisionTree (decision analysis in Microsoft Excel using decision trees and influence diagrams) and you get much more. Then you’ll find out just how useful models can be.

Thompson Terry
Palisade Training Team

Transformation Partners Company Now Featuring @RISK in Black Belt Certification Courses



I earned my Lean Six Sigma Black Belt Certification from Transformation Partners Company (TPC), a full-service Lean Six Sigma consultancy based in Fairport, NY. TPC has chosen Palisade's @RISK to be used exclusively for the Monte Carlo simulation modules taught during its Black Belt, Master Black Belt, Design for Six Sigma (DFSS), and Product Development programs.

TPC will be starting a new wave of certification courses this month featuring @RISK in four different locations across New York and Pennsylvania. Check out TPC's website for more information on their programs and to find links to sign up for upcoming classes.
