As every risk manager knows, no matter how good a risk model is, the validity of its forecasts depends on the quality of the input data. One approach to gaining assurance that forecasts from Monte Carlo analysis are realistic is to measure the capability of the risk management process used to produce the risk models.
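To see why input quality matters so much, consider a minimal illustrative sketch (not the RMM itself, and not @RISK code): a Monte Carlo cost model that sums three-point (minimum, most likely, maximum) task estimates. The task names and figures below are invented for illustration; the point is that over-optimistic input ranges shift the whole forecast distribution, here summarised by the 80th-percentile ("P80") cost.

```python
import random

def p80_total_cost(tasks, n=20_000, seed=42):
    """Monte Carlo: sum triangular (min, mode, max) cost draws per task
    and return the 80th-percentile total cost."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(n)
    )
    return totals[int(0.8 * n)]

# Hypothetical three-task project, costs in arbitrary currency units.
# Same tasks, two sets of input assumptions: (min, most likely, max).
optimistic = [(1.0, 1.2, 1.5), (2.0, 2.2, 2.6), (0.5, 0.6, 0.8)]
realistic  = [(1.0, 1.4, 2.5), (2.0, 2.6, 4.0), (0.5, 0.8, 1.5)]

print(f"P80 with optimistic inputs: {p80_total_cost(optimistic):.2f}")
print(f"P80 with realistic inputs:  {p80_total_cost(realistic):.2f}")
```

The simulation machinery is identical in both runs; only the input ranges differ, yet the realistic inputs produce a materially higher P80 forecast. Auditing the process that generated those ranges, which is what a risk maturity assessment does, is therefore as important as the modelling itself.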
The Project Risk Maturity Model (RMM) performs this measurement and has been demonstrated to help produce risk models that result in realistic forecasts (including models using Palisade’s risk analysis software, @RISK).
For example, the UK’s Ministry of Defence (MoD) equipment projects include the development and manufacture of new military equipment for the UK’s Army, Royal Navy and Royal Air Force. These large, complex projects often carry heightened risk because their objectives push the limits of technical feasibility. As a result, the MoD is highly committed to its project risk management process.
However, in 2001, the MoD recognised that too many of its projects ran late and over budget. It traced this to over-optimistic risk analysis forecasts in the early stages, leading to projects passing approval points without adequate scrutiny. This realisation prompted the MoD to invest in the Project RMM. As a result, risk models used for project approvals became more reliable and realistic.
This is a useful insight for @RISK users, because it enables them to check that the process used to develop their input data has been good enough to support their model. As a result, they can be confident in their predictions, regardless of the industry in which they operate.
For those who want to find out more, further details about the Project Risk Maturity Model and its lead developer, Martin Hopkinson, are in the case study on our website.