White Papers

Technical descriptions of the use of risk and decision analysis software and solutions. More at http://www.palisade.com/articles/whitepapers.asp

MRAG AP Quantifies Illegal Fishing in the Pacific Islands with Palisade’s @RISK

MRAG Asia Pacific (MRAG AP) is an independent fisheries and aquatic resource consulting company based in Brisbane, Australia. An international leader in the field of aquatic resource consulting, the company is dedicated to promoting sustainable use of natural resources through sound, integrated management policies and practices. MRAG AP used Palisade’s @RISK software to estimate the volume, species composition and value of illegal, unreported and unregulated (IUU) fishing in Pacific Islands tuna fisheries. The study produced recommendations on ways to strengthen monitoring, control and surveillance arrangements, to help minimize the future financial impact of IUU fishing on Pacific Island economies.

Illegal, unreported and unregulated (IUU) fishing is a global problem that causes significant financial losses every year. Previous estimates put the value at anywhere from US$707 million to more than US$1.5 billion per year in the Western Pacific Ocean area alone. However, IUU fishing is by its very nature secretive, making it extremely difficult to accurately quantify the nature and extent of potential damages, or to plan effective Monitoring, Control and Surveillance (MCS) activities.

As part of a European Union-funded project, MRAG AP was commissioned to estimate the volume and value of IUU fishing in the tuna fisheries of the Pacific Islands region. The company took a ‘bottom up’ approach to the study, analyzing detailed information at a local scale to build a more accurate picture of IUU fishing activity, particularly the variation in the nature and scale of the activity associated with each IUU risk in each main fishery. Estimates obtained in this way were then added together to develop an overall estimate of IUU catch and value.


According to Duncan Souter, CEO of MRAG AP, “The challenge with this approach is that it is time-consuming and information is often very patchy and hard to collect. There are therefore many gaps to fill that require analytical methodologies of varying degrees and complexity.” For this study, the company broke down the ‘IUU problem’ into discrete, quantifiable units – volume, species composition and value – before aggregating them to produce a regional-scale estimate.
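
The general shape of this ‘bottom up’ aggregation is easy to sketch in code. The Python fragment below is a minimal illustration of the idea – not MRAG AP’s actual model – in which hypothetical per-fishery IUU volumes, species shares and prices (all placeholder values) are sampled and summed into a regional value distribution, as @RISK would do in Excel:

```python
# Minimal sketch of a 'bottom up' IUU value estimate. All numbers are
# hypothetical placeholders, not MRAG AP's data.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo iterations

# Hypothetical IUU volume (tonnes) per fishery/risk, as triangular
# (min, most likely, max) distributions reflecting patchy data.
volumes = {
    ("purse seine", "misreporting"):    (2_000, 6_000, 15_000),
    ("longline",    "unlicensed"):      (1_000, 3_500, 9_000),
    ("longline",    "quota violation"): (500, 1_500, 4_000),
}

# Hypothetical species split and ex-vessel prices (USD per tonne).
species_share = {"skipjack": 0.6, "yellowfin": 0.3, "bigeye": 0.1}
price = {"skipjack": 1_500, "yellowfin": 2_400, "bigeye": 8_000}

total_value = np.zeros(N)
for (fishery, risk), (lo, mode, hi) in volumes.items():
    vol = rng.triangular(lo, mode, hi, N)        # tonnes of IUU catch
    for sp, share in species_share.items():
        total_value += vol * share * price[sp]   # USD

print(f"median: ${np.median(total_value)/1e6:,.0f}M")
print(f"90% interval: ${np.percentile(total_value, 5)/1e6:,.0f}M "
      f"- ${np.percentile(total_value, 95)/1e6:,.0f}M")
```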

“@RISK is very user friendly, considering the complex analytical techniques we were undertaking, and provided useful outputs that allowed clear presentation of our results,” said Souter.

» Read the full case study
» MRAG AP’s report: Towards the Quantification of Illegal, Unreported and Unregulated (IUU) Fishing in the Pacific Islands Region

Free Minicourse in Renewable Energy Modeling using @RISK, with example models

How do we ensure a reliable energy supply when using renewable energy sources?


Solar power is inherently unreliable, fluctuating with the time of day and degree of cloudiness. Wind power is at the mercy of air flow patterns. To prevent blackouts, renewable energy sources need to be backed up with conventional power sources; in effect, they require virtually 100% backup from fossil, nuclear or hydro power. Consider the repercussions for renewable output when a solar eclipse coincided with calm winds in Europe in 2015.

In this free on-demand minicourse, Professor Roy Nersesian models the complexities of this problem – and demonstrates solutions – using @RISK for Excel.

Integrating Renewables with Electricity Storage, using @RISK

Materials include:

  • 1-hour webinar delivered by Roy Nersesian
  • Energy example models
  • 50-page whitepaper
  • Presentation slidedeck

» Go to free minicourse now

» Case study: @RISK Helps Integrate Renewable Energy Sources

Will Aviation Biofuels Fly? @RISK Helps Assess Two Different Government Policies

Developing biofuels for aircraft is a risky endeavor, as transforming plant material into liquid fuel is still very expensive compared to the price of fossil fuels. Dr. Wallace Tyner and his colleagues at Purdue University used @RISK to conduct a cost-benefit analysis of building an aviation biofuel plant, and to determine the potential impacts of two different government policies to jump-start this technology: reverse auction and capital subsidy. In a reverse auction, the government would put out a request to supply aviation biofuels, and different private investors place bids on the price per gallon of fuel, with the lowest bidder winning the contract. The government must pay the contracted price per gallon of the biofuel, regardless of the current price of oil. A capital subsidy involves the government paying for a portion of the capital costs in developing the biofuel.

The Purdue researchers used a discounted cash flow model to find the net present value (NPV) of a theoretical aviation biofuel plant. They incorporated four variables that have a large impact on the non-risk-adjusted breakeven fuel price: capital cost, feedstock cost, final fuel yield, and hydrogen cost (the price of the hydrogen input used in producing biofuels). They built empirical distributions for these variables from the literature and expert input, then used @RISK to incorporate the uncertainty with a PERT distribution. The researchers created projections to forecast future fuel prices and the breakeven biofuel price that would be required.
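
Since the PERT distribution is just a scaled Beta, the core of such a model is straightforward to reproduce outside Excel. The sketch below, with entirely hypothetical plant parameters, shows how PERT-distributed draws for the four key inputs feed a simple discounted cash flow NPV; it illustrates the technique, not the Purdue team’s actual spreadsheet:

```python
# PERT-driven Monte Carlo NPV sketch. All parameter values below are
# hypothetical, chosen only to illustrate the method.
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

def pert(lo, mode, hi, size):
    """Sample a PERT distribution via its Beta representation."""
    a = 1 + 4 * (mode - lo) / (hi - lo)
    b = 1 + 4 * (hi - mode) / (hi - lo)
    return lo + (hi - lo) * rng.beta(a, b, size)

# Hypothetical ranges for the four key uncertain inputs.
capital   = pert(400e6, 500e6, 700e6, N)   # plant capital cost, USD
feedstock = pert(60, 80, 110, N)           # feedstock cost, USD/tonne
yield_gal = pert(55, 70, 80, N)            # fuel yield, gal/tonne
hydrogen  = pert(1.2, 1.6, 2.2, N)         # hydrogen cost, USD/kg

tonnes_per_year = 700_000  # assumed feedstock throughput
fuel_price = 2.50          # assumed selling price, USD/gal
h2_kg_per_tonne = 30       # assumed hydrogen use per tonne of feedstock
opex_other = 40e6          # other annual operating cost, USD
r, years = 0.10, 20        # discount rate, plant life

gallons = tonnes_per_year * yield_gal
annual_cash = (gallons * fuel_price
               - tonnes_per_year * (feedstock + h2_kg_per_tonne * hydrogen)
               - opex_other)
annuity = (1 - (1 + r) ** -years) / r      # present value annuity factor
npv = -capital + annual_cash * annuity

print(f"mean NPV: ${npv.mean()/1e6:,.0f}M")
print(f"P(NPV > 0): {(npv > 0).mean():.1%}")
```

A policy such as a reverse auction or capital subsidy can then be tested by adding its cash flows to the model and comparing the resulting NPV distributions.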

They found that both policies reduced the risk of investing in aviation biofuels; however, the reverse-auction policy reduced it more. Their research is detailed in the article “Field to flight: A techno-economic analysis of the corn stover to aviation biofuels supply chain,” published in the March/April 2015 issue of Biofuels, Bioproducts & Biorefining.

Dr. Tyner uses @RISK for research, as well as for teaching his course in benefit-cost analysis. “I use it to teach all my students to introduce uncertainty into project evaluation,” he says. “It’s been a tool in my portfolio for a long, long time.” He also notes how it has made his work drastically more efficient. “You can do something much less expensive and much easier with @RISK today compared to what I painstakingly did years ago,” he says.

The full article is available here.

Assessing the Sun: A Scenario Analysis Weaves Solar Power with Hydroelectric in South America

As many experts have discussed, the road to sustainable energy must be paved with multiple types of renewable resources. Elio Cuneo, an electrical engineer and Chair of Energy Management and Administration at the Universidad Santa Maria in Venezuela, tackled this conundrum in his white paper, "Water and sun, certainty and volatility: Ideal Pairing for Electricity Supply in SIC."

Sistema Interconectado Central (SIC), Spanish for Central Interconnected System, is the main alternating-current power grid in Chile. In July 2008, a solar measurement station with a tracking system was installed in the north of Chile. The objective of these measurements was to assess the potential of global and direct radiation for possible solar power production in the north of Chile to feed the SIC, which relies largely on hydroelectric power.

Cuneo ran stochastic models using Palisade's @RISK software and found that including solar power generation acts as a kind of “rain” insurance for the SIC system, especially for hydroelectric plants that operate with reservoirs; indeed, the high certainty of solar power generation is particularly valuable during drought scenarios.

Solar energy would also be beneficial in non-drought conditions: water is stored, leading to lower operating costs and lower spot market prices. Cuneo goes on to explain that “as the benefit is received by end-users as well as by hydroelectric plants with reservoir capacity, the cost of this ‘insurance’ must be prorated between all of them” in order to make solar technology development financially attractive.
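
To see why solar can behave like “rain insurance,” consider a toy simulation – not Cuneo’s model, and with every number a hypothetical placeholder – in which uncertain hydro output and a low-variance solar block jointly meet demand, with any shortfall served by expensive thermal generation:

```python
# Toy illustration of solar as 'rain insurance' for a hydro-heavy grid.
# All quantities are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

demand = 20_000  # annual demand, GWh
hydro = rng.normal(15_000, 3_000, N).clip(min=5_000)  # inflow-driven hydro, GWh
solar = rng.normal(2_000, 100, N)                     # low-variance solar block, GWh
thermal_cost = 120                                    # marginal thermal cost, USD/MWh

shortfall_no_solar = np.clip(demand - hydro, 0, None)
shortfall_solar    = np.clip(demand - hydro - solar, 0, None)
# Saving from solar displacing thermal: GWh * 1000 MWh/GWh * USD/MWh
saving = (shortfall_no_solar - shortfall_solar) * 1_000 * thermal_cost

dry = hydro < np.percentile(hydro, 20)  # driest 20% of years
print(f"mean annual saving: ${saving.mean()/1e6:,.0f}M")
print(f"mean saving in dry years: ${saving[dry].mean()/1e6:,.0f}M")
```

The saving concentrates in dry years, which is exactly the insurance-like behavior Cuneo describes.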

Read the white paper here.

See also: As Hydropower Moves to Small Dams, Big Dams use Risk Analysis Solutions to Meet Energy Conservation Goals

Modeling Today and Tomorrow’s Risk: One Insurance Company’s Strategy

In a recent white paper from Government Entities Mutual (GEM), Inc., which writes liability, workers’ compensation, and property reinsurance coverage, underwriting manager Joel Kress posed the question, “How risky are we?”
 
To answer this question, Kress and his team decided to model only the most detrimental and most quantifiable risks: Underwriting Risk and Reserve Development Risk. For Underwriting Risk, they sought to quantify their annual risk transfer contracts. For Reserve Development Risk, they outlined and measured the risk associated with all past contracts they had written. Since GEM is almost 10 years old, they knew there would be years (and decades) of further Incurred But Not Reported (IBNR) development on GEM’s balance sheet, a type of risk that accumulates geometrically as the years move on.
 
Since GEM’s loss experience alone was limited, and thus statistically non-credible, Kress and his team supplemented it with loss experience from other industry reinsurance data. With this combination, they were able to create a single loss distribution that provides a statistical estimate of the company’s predictability of loss.
 
Using @RISK's Monte Carlo simulation, GEM then created a profile for each contract written in the most recent policy year (2011), and distilled the information from each contract into the exposure to loss – simply frequency × severity – that GEM held as the risk-bearing captive. Kress and GEM actuaries then estimated the risk for historical policy periods by using the selected loss distribution to measure the variability around the expected loss reserves. This variability – or, of greater concern, the possibility of losses costing more than expected – was the third piece of GEM’s risk metric. GEM’s selected loss distribution looked like many other (re)insurance loss distributions: skewed to the right, indicating a chance, albeit slim, of a large, calamitous loss.
 
The majority of this risk came from contracts currently being written, since the insurable events had not yet occurred. Turning to @RISK again, Kress and his team used the input variables to estimate GEM’s losses for the current policy year’s contracts, and then ran the simulation for 10,000 hypothetical policy years. From this wealth of data, they were able to determine key statistical metrics.
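
The frequency × severity structure described above is the classic collective risk model, and a bare-bones version of the 10,000-year simulation can be sketched as follows. The per-contract assumptions are hypothetical placeholders, not GEM’s figures:

```python
# Frequency-severity sketch in the spirit of the GEM study, not its
# actual algorithm. All contract parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(2011)
N = 10_000  # simulated policy years, as in the study

# Hypothetical per-contract assumptions: Poisson claim frequency and a
# right-skewed lognormal severity, echoing the skewed loss distribution
# the paper describes.
contracts = [
    {"freq": 2.0, "sev_mu": 11.0, "sev_sigma": 1.2},  # liability
    {"freq": 5.0, "sev_mu": 10.0, "sev_sigma": 1.0},  # workers' comp
    {"freq": 0.5, "sev_mu": 12.5, "sev_sigma": 1.5},  # property
]

total = np.zeros(N)
for c in contracts:
    counts = rng.poisson(c["freq"], N)             # claims in each year
    for i, n in enumerate(counts):                 # annual loss = sum of severities
        total[i] += rng.lognormal(c["sev_mu"], c["sev_sigma"], n).sum()

expected = total.mean()
p60 = np.percentile(total, 60)
print(f"expected annual loss: ${expected:,.0f}")
print(f"shortfall beyond expected at 60% confidence: ${p60 - expected:,.0f}")
```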
 
Once all the simulations were finished, it was time to measure the results. GEM used Surplus as a measuring stick, since it is easily understood, readily calculable, and of concern to most interested parties. GEM found that at a 60% confidence level, its Surplus would need to make up a $965,000 shortfall in losses – that is, the amount beyond expectations that the modeled risk in GEM’s current and historical contracts could cost.
 
The last step in this process was to measure GEM against five statistical benchmarks of ruin: total Captive’s Contributed Capital, Company Action Level, Regulatory Action Level, Authorized Control Level, and Mandatory Control Level. GEM was able to assign probabilities to all of these potential outcomes, ranging from 17.2% down to 0.4%.
 
Thus, using @RISK, Joel Kress and GEM were able to assess their risk for current and future books of business. According to Kress, “None of this minutia would be possible without the power of computers. It is one thing to program an algorithm to do a set of tasks, as outlined above. It is another thing entirely to make the computer work for you.”
 
 
 

Money-saving Meds: Researchers Use Pharmaceutical Risk Assessment to Determine New Drug’s Cost-Cutting Benefits

In the April issue of the International Journal of Nephrology and Renovascular Disease, researchers from DaVita Clinical Research in Minneapolis, MN used Palisade's @RISK software to develop a cost-offset model for end-stage renal disease care based on an innovative pharmacological treatment.
 
Currently, about 400,000 end-stage renal disease (ESRD) patients in the US undergo dialysis three or more times per week. The costs for these treatments are staggering—with 85% of patients relying on Medicare as the primary payer, the estimated amount spent on their care is $29 billion.
 
A major portion of this cost comes from metabolic maintenance medications: the kidney is responsible for both regulating phosphorus levels and stimulating red blood cell production, and dialysis is unable to mimic these particular behaviors of a healthy kidney. ESRD patients can therefore suffer from significant anemia, as well as bone and mineral dysregulation that results in calcium being deposited in arteries instead of bone, with associated increases in clinical events such as fractures and cardiovascular and cerebrovascular events. Patients must therefore use oral phosphate binders to decrease their serum phosphorus levels, and receive regular injections of epoetin alfa (an erythropoiesis-stimulating agent [ESA]) to stimulate red blood cell production, as well as intravenous (IV) iron. All told, oral and injectable medications account for more than half of outpatient dialysis expenses.
 
An experimental oral phosphate binder, ferric citrate, has been found both to manage serum phosphate levels and to increase measures of iron and iron storage in the blood, indicating that this single drug may serve as both an oral phosphate binder and an iron source in ESRD patients with anemia. The authors of the study therefore developed a budget impact model estimating the monthly cost associated with the use of ferric citrate to treat hyperphosphatemia, with the added benefit of treating the iron deficiency associated with ESRD anemia, versus the cost of other currently available phosphate binders. The model was constructed from the perspective of a US managed care plan.
 
 
Monte Carlo simulation in @RISK was used to address the high uncertainty in the cost-offset model parameters. The simulation showed that for each patient with ESRD, a managed care organization such as Medicare will likely save between US$104 and US$184 (90% confidence interval) per month with ferric citrate use. These savings translate into a monthly saving of between US$52,164 and US$92,186 (90% confidence interval) per 500 ESRD patients when ferric citrate is compared with other conventional phosphate binders. The monthly model input variables were then projected out to produce annual cost estimates: an additional Monte Carlo simulation demonstrated (at 90% probability) that a provider serving 500 dialysis patients could save between US$626,000 and US$1,106,000 annually with the use of ferric citrate.
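
A schematic version of such a cost-offset simulation is easy to write down. The sketch below uses hypothetical triangular cost distributions – not the published model’s inputs – to show how per-patient savings draws roll up into confidence intervals of the kind reported above:

```python
# Schematic cost-offset sketch, not the published budget impact model.
# All monthly per-patient costs (USD) are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(90)
N = 10_000
patients = 500

# Triangular(min, mode, max) distributions for uncertain monthly costs.
binder_conventional = rng.triangular(500, 650, 800, N)  # current binder
binder_ferric       = rng.triangular(600, 750, 900, N)  # ferric citrate
esa_saving          = rng.triangular(100, 180, 280, N)  # reduced epoetin alfa
iv_iron_saving      = rng.triangular(30, 60, 100, N)    # reduced IV iron

saving_per_patient = (binder_conventional - binder_ferric
                      + esa_saving + iv_iron_saving)

lo, hi = np.percentile(saving_per_patient, [5, 95])
print(f"per-patient monthly saving, 90% CI: ${lo:,.0f} to ${hi:,.0f}")
print(f"per 500 patients, monthly: ${lo*patients:,.0f} to ${hi*patients:,.0f}")
print(f"per 500 patients, annual (mean): "
      f"${saving_per_patient.mean()*patients*12:,.0f}")
```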
 
With the help of @RISK, the researchers were able to show that this promising new drug could help reduce expenses for a health care system that is desperate for cost reductions.
 
Read the original study here.
 

@RISK Utilized in Spelman College Grant Research Project on Health Care Costs of Former Inmates

According to Dr. Marionette Holmes, assistant professor at Spelman College, homeless and previously incarcerated people represent one of the most vulnerable demographics in the United States in terms of health care, and the cost of caring for these individuals can be staggering. For example, the monthly cost of caring for those with a triple diagnosis of mental illness, HIV, and substance abuse can range from approximately $4,000 to $40,000. In a grant research project entitled "Examining the Health and Economic Impact of a Policy Driven Supportive Housing Program for Formerly Incarcerated Homeless Individuals in New York City," Holmes is utilizing @RISK in a cost-benefit analysis that compares the economic and health impacts of prisoners moving from incarceration to homelessness with those of prisoners moving from incarceration to supportive housing.

The supportive housing program Holmes examined is located in New York City and specifically targets individuals who have cycled through both the penal and health care systems more than four times over the past five years. One application of @RISK in this research is a model that converts risky sexual behavior into HIV transmission risk and uses the findings to project future HIV transmission probabilities.
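
Models of this kind typically treat each sexual act as an independent Bernoulli trial, so cumulative risk over n acts with per-act risk p is 1 - (1 - p)^n. The sketch below illustrates that general conversion with hypothetical parameters; it is not Dr. Holmes's model:

```python
# Generic per-act HIV transmission risk sketch. All parameter values
# are hypothetical placeholders, not study inputs.
import numpy as np

rng = np.random.default_rng(5)
N = 10_000

p_act      = rng.triangular(0.0005, 0.001, 0.002, N)  # per-act transmission probability
prevalence = rng.triangular(0.02, 0.05, 0.10, N)      # partner-pool HIV prevalence
condom_eff = 0.80                                     # assumed risk reduction per protected act
condom_use = rng.uniform(0.3, 0.9, N)                 # share of acts protected
acts       = rng.poisson(100, N)                      # acts per year

# Effective per-act risk, mixing protected and unprotected acts.
p_eff = p_act * prevalence * (1 - condom_eff * condom_use)
annual_risk = 1 - (1 - p_eff) ** acts  # cumulative probability over a year

print(f"median annual transmission risk: {np.median(annual_risk):.4%}")
print(f"95th percentile: {np.percentile(annual_risk, 95):.4%}")
```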

Findings of the study are expected to be released in December.

Dr. Holmes has also assisted her research students with the use of @RISK. Ms. Andrea Brown, a recent graduate of the Spelman College Mathematics Department, is writing a research paper entitled “The Impact of Condom Attitudes of African American Female College Students on HIV/STD Transmission Risk,” which utilized @RISK in a cost-benefit analysis. Though the paper is still being written, the research took first place in a recent research competition.

Unsafe Seafood? Monte Carlo Analysis Finds Increased Cancer Risk Due to Arsenic-heavy Seafood

While finding risk analysis solutions is important for companies and businesses, it is paramount when it comes to environmental and human health. Simulations that help evaluate risk are crucial in helping scientists understand, and try to mitigate, some of the threats living organisms face. Take, for example, a new study to be published in August 2013 that measures the probabilistic risk of arsenic consumption via seafood for people living in the southwestern region of Taiwan.
 
Arsenic is a known poison and carcinogen that occurs in both organic and inorganic compounds. Organic arsenic compounds tend to be less toxic, while inorganic compounds are more so. Inorganic arsenic compounds tend to come from industrial and agricultural sources, and can make their way into the environment and eventually be taken up into the food chain.
 
In this study, the researchers had to tackle both the variability of arsenic levels in the seafood and individual consumption habits. Using Monte Carlo simulation, they conducted an assessment of exposure to arsenic, finding that arsenic consumption from five types of fish and shellfish at the 95th percentile falls below the threshold set by the Food and Agriculture Organization and the World Health Organization; however, it exceeds the target cancer risk. According to the authors of the paper, “this study demonstrates the importance of the individual variability of seafood consumption when evaluating a high exposure sub-group of the population who eat higher amounts of fish and shellfish than the average Taiwanese.”
 
Palisade's @RISK Monte Carlo simulation software can be used to conduct the types of analyses used in this study, yielding key probabilities for risk of disease and environmental concerns.
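
A generic version of that kind of probabilistic exposure assessment looks like the following, where daily dose is concentration × intake ÷ body weight and lifetime cancer risk is dose × an oral slope factor. All input distributions are hypothetical placeholders, not the study’s values:

```python
# Generic probabilistic exposure/cancer-risk sketch, not the study's
# model. All distribution parameters are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(95)
N = 100_000

conc   = rng.lognormal(np.log(0.5), 0.6, N)  # inorganic As in seafood, mg/kg
intake = rng.lognormal(np.log(60), 0.8, N)   # seafood intake, g/day
bw     = rng.normal(60, 10, N).clip(min=35)  # body weight, kg

dose = conc * (intake / 1_000) / bw          # daily dose, mg/kg body weight/day
slope_factor = 1.5                           # assumed oral slope factor, (mg/kg/day)^-1
cancer_risk = dose * slope_factor            # incremental lifetime cancer risk

p95 = np.percentile(cancer_risk, 95)
print(f"95th percentile lifetime cancer risk: {p95:.2e}")
print(f"share above 1e-6 target risk: {(cancer_risk > 1e-6).mean():.1%}")
```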
 
 

Research at Curtin University of Technology Explores Tools for Teaching Probability and Risk

Textbooks that touch upon technology-related subject matter face the challenge of topics becoming outdated by the time the books go to press. Darren O’Connell found this type of content stagnation to be readily evident in how risk analysis is taught in institutes of higher learning. In fact, he found risk analysis lessons were not only behind the times, but not nearly as accurate as more current methods of calculating risk.

For his doctoral research at Curtin University of Technology (Australia), O’Connell compared the traditional method of teaching risk, based on normal distributions, with a probabilistic method featuring @RISK, StatTools and RISKOptimizer. Having used Palisade solutions professionally for financial risk analysis, O’Connell was convinced that newer technologies offered more efficient and accurate means of teaching risk. To illustrate his point, O’Connell presented methods of modeling the stochastic price process of two illiquid securities under uncertainty through probabilistic techniques, in order to manage price risk within a Value-at-Risk (VaR) framework, drawing on multiple Palisade solutions. His research found that the newer methods were not only more accurate, but also more user-friendly and cost-efficient.
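
As a flavor of the probabilistic approach, the fragment below simulates a security’s price over a 10-day horizon and reads VaR off the simulated profit-and-loss distribution. It uses a plain geometric Brownian motion with hypothetical parameters, not O’Connell’s methodology; a fitted non-normal distribution – one benefit of the larger distribution universe he describes below – could be swapped in for the normal draw:

```python
# Bare-bones Monte Carlo VaR sketch using geometric Brownian motion.
# Drift, volatility and horizon are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(99)
N = 10_000

price0 = 100.0
mu, sigma = 0.05, 0.30  # assumed annual drift and volatility
horizon = 10 / 252      # 10-day VaR horizon, in years

# Simulated price at the horizon under GBM.
z = rng.standard_normal(N)
price_h = price0 * np.exp((mu - 0.5 * sigma**2) * horizon
                          + sigma * np.sqrt(horizon) * z)

pnl = price_h - price0
var_99 = -np.percentile(pnl, 1)  # 99% VaR: loss at the 1st percentile
print(f"10-day 99% VaR per unit held: {var_99:.2f}")
```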

“The benefits gained from using Palisade software were the ability to select probability distributions from a large universe, which opened up greater statistical modeling possibilities that develop more realistic solutions to problems encountered by industry practitioners,” reports O’Connell. “The seamless integration of Palisade products into the Excel development environment is a huge advantage, and allows practitioners to learn to use DecisionTools as if it’s a natural extension of learning Excel. This in turn reduces training and system development costs, because risk departments are not investing in expensive / extensive proprietary system solutions.”

» Read more about O’Connell’s research
» View a detailed slideshow

Troubled Waters: Report Calls for New Risk Analysis Services when Estimating Flood Insurance

As weather patterns change and become more severe, it can’t be denied: climate change is upon us, and with it come serious changes to life as we know it. Take, for example, a recent report commissioned by FEMA and written by the National Research Council that outlines the true risks and costs of flooding in the Missouri and Mississippi river floodplains, and attempts to price flood insurance in a way that recovers actual costs.

The report highlighted the fact that flooding is bound to increase in severity, making it crucial to take a modern, statistically sound, risk-analysis-based approach to analyzing and managing flood risk in areas protected by levees. Palisade’s @RISK software is used around the world for these types of analyses, and has in fact been used to analyze the risks and rewards of flood mitigation in the UK. The National Research Council called for state-of-the-art estimates of how well levees will perform, in order to paint an accurate picture of the risks communities might face during flood conditions. Scenario analysis is key: multiple variables must be analyzed, such as how high a river might rise and how many times it will crest its specified flood level.
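
The kind of scenario analysis the report calls for can be caricatured in a few lines: simulate the annual peak river stage, apply an uncertain levee fragility, and tally overtopping and failure losses. Everything below – stage parameters, the fragility curve, damage amounts – is a hypothetical placeholder, not the report’s methodology:

```python
# Toy levee-risk scenario sketch. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(8)
N = 100_000  # simulated years

stage = rng.gumbel(loc=6.0, scale=1.2, size=N)  # annual peak river stage, m
levee_crest = 9.0                               # levee crest height, m

# Probability the levee fails before overtopping rises with stage
# (a simple linear fragility between 7 m and the crest).
p_fail = np.clip((stage - 7.0) / (levee_crest - 7.0), 0, 1)
failed = (stage > levee_crest) | (rng.random(N) < p_fail)

# Hypothetical damage when the levee is overtopped or fails, USD.
damage = np.where(failed, rng.triangular(50e6, 150e6, 400e6, N), 0.0)

print(f"annual chance of flooding behind the levee: {failed.mean():.2%}")
print(f"expected annual damage: ${damage.mean()/1e6:,.1f}M")
```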

According to Gerald Galloway, an engineering professor at the University of Maryland who chaired the panel that produced this report, flood losses are continuing to rise. In fact, the National Weather Service predicts roughly $8 billion per year in flood losses – a number that is bound to grow as climate change continues.

The report states that, as administrator of the nation’s flood insurance program, FEMA must adopt a more up-to-date approach to analyzing and managing the risks of flooding behind levees. “Property owners would be more favorably inclined to buy flood insurance if individual risk is well-known, understood, and insurance rates are priced to match the probability of flooding and financial impact of flooding events,” the report says.

If FEMA chooses to tackle the problem with a tool like @RISK, it will be well equipped to answer these complex questions.

 

See also: Using Risk Analysis to measure the impact of climate change