Author: Lauren Roberts

Is it Best to Hedge Your Lettuce? @RISK and StatTools Help Answer the Question

Agriculture is traditionally one of the highest risk economic activities. In California, many produce farm operations use a rule-of-thumb to manage their seasonal finances–often aiming to contract 80% of their crop in advance to buyers at set prices, and leaving the remaining 20% to be sold at spot prices in the open market. The rationale for this is based on an assumption that costs, and a reasonable margin, can be covered with 80% of production hedged by forward contracts. The hope is the remaining 20% of production will attract high prices in favorable spot markets, leading to substantial profits on sales. Of course, spot prices might not be favorable, in which case any losses could be absorbed by the forward sales.

Steven Slezak, a Lecturer in the Agribusiness Department at Cal Poly, San Luis Obispo, and Dr. Jay Noel, the Agribusiness Department Chair, used @RISK to conduct a case study on an iceberg lettuce producer that uses the rule-of-thumb approach to manage production and financial risks. “We wanted to know if the 80% hedge actually covers costs over the long-term and if there are really profits in the spot market sales. We wanted to know if the return on the speculation was worth the risk. We found the answer is ‘No’.”

Slezak and his colleagues created an @RISK revenue distribution model with inputs such as past revenue, harvest costs, and crop yields. They used StatTools to estimate the distribution parameters. Next, @RISK was used to simulate combinations of all cost and revenue inputs under different hedge ratios, ranging from 100% hedging to no hedging at all. By comparing the results of these simulations in terms of their effect on margins, it was possible to determine the effectiveness of the 80% hedging rule of thumb and the value added by holding back 20% of production for spot market sales.
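The Cal Poly model itself is not published as code, but the comparison it performs can be illustrated with a minimal Monte Carlo sketch. Every figure and distribution below is an illustrative assumption, not an input from the actual @RISK model:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # simulated growing seasons

# Illustrative assumptions (not the Cal Poly inputs); all figures are per carton.
forward_price = 9.00                                                # price locked in by contract
spot_price = rng.lognormal(mean=np.log(9.00), sigma=0.45, size=N)   # volatile open-market price
total_cost = rng.normal(8.00, 0.60, size=N)                         # production and harvest cost

def margin(hedge_ratio):
    """Per-carton margin when hedge_ratio of production is sold forward."""
    revenue = hedge_ratio * forward_price + (1 - hedge_ratio) * spot_price
    return revenue - total_cost

for h in (0.0, 0.5, 0.8, 1.0):
    m = margin(h)
    print(f"hedge {h:>4.0%}: mean margin {m.mean():5.2f}, "
          f"5th pct {np.percentile(m, 5):6.2f}, P(loss) {np.mean(m < 0):.1%}")
```

Comparing the downside percentiles and loss probabilities across hedge ratios is the same kind of trade-off the case study evaluates when it asks whether the unhedged 20% earns its keep.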

“While growers have to give up some of the upside, it turns out the downside is much larger, and there is much more of a chance they’ll be able to stay in business,” says Slezak. In other words, the cost-benefit analysis does not support the use of the 80% hedged rule-of-thumb. It’s not a bad rule, but it’s not an optimal hedge ratio.

Slezak is a long-time user of @RISK, and has relied on the software to perform economic and financial analysis on a wide range of problems in fields as diverse as agribusiness, energy, investment management, banking, interest rate forecasting, education, and health care.

Read the complete case study here.

@RISK Weighs Water’s True Cost


Many of us assume that water is cheap—it flows freely from the tap whenever we need it. The true cost of water, however, is far from negligible. For many companies pursuing corporate water stewardship strategies, a common barrier to action is this perceived low cost of water. With water costs often as low as $2.00 per thousand gallons, large water reclamation projects can be difficult, if not impossible, to justify.

A major beverage manufacturer was moving forward with implementation of a water recycling system, but faced a decision barrier around the perceived low cost of water. They hired Antea Group, an international environmental, health, safety, and sustainability (EHS&S) consulting firm, to conduct a quantitative analysis of the proposed project. John Estes, consultant with Antea Group, used @RISK to examine the beverage manufacturer’s operations and look at the hidden costs of the water they used.

“There’s the rinsing, cleaning of equipment, the steam needed to sterilize,” he says. “Essentially, a significant amount of water is used that doesn’t go into the product and instead is discharged to the sanitary sewer system.” The cost of sending this wastewater to the sewer is over $7.50 per thousand gallons—more than 3.5 times the cost of getting water from the tap. Estes adds that there are additional hidden costs from the heating and cooling processes, system operations and maintenance, and effluent treatment.

Using @RISK, Estes and his colleagues found that the most significant cost driver was the discharge to the sanitary sewer. They were also able to determine a more accurate cost of water as a baseline. With this true cost number in hand, they then ran probabilistic models with @RISK to explore future scenarios, discovering that once the recycling facility comes on-line, the projected cumulative savings will range between $10.5 million and $14.5 million over 10 years.
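As a rough illustration of how hidden costs shift the baseline, here is a minimal sketch of that kind of calculation. The $2.00 intake and $7.50 sewer figures come from the article; the volumes and remaining cost components are assumptions invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Rates are per thousand gallons; intake and sewer rates echo the article,
# everything else is an illustrative assumption.
intake_rate = 2.00                                      # municipal supply
sewer_rate = 7.50                                       # discharge to the sanitary sewer
other_rate = rng.triangular(1.0, 2.0, 4.0, size=N)      # heating/cooling, O&M, effluent treatment

annual_use_kgal = rng.normal(50_000, 5_000, size=N)     # assumed annual intake, thousand gallons
discharge_frac = rng.uniform(0.30, 0.50, size=N)        # assumed share that never reaches the product

true_cost = (intake_rate + discharge_frac * sewer_rate + other_rate) * annual_use_kgal
naive_cost = intake_rate * annual_use_kgal

print(f"naive 'tap water' view:  ${naive_cost.mean() / 1e6:.2f}M per year")
print(f"modeled true cost:       ${true_cost.mean() / 1e6:.2f}M per year "
      f"(90% interval ${np.percentile(true_cost, 5) / 1e6:.2f}M to ${np.percentile(true_cost, 95) / 1e6:.2f}M)")
```

The gap between the two totals is the kind of "true cost of water" baseline that made the recycling project's savings visible.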

@RISK has proven itself to be an invaluable tool for Antea Group, Estes says. “What’s so nice about @RISK is its transparency,” he explains. “We can show all the formulas, and we can watch the simulations as they run. Other risk analysis software have more of a ‘black box’ design. With @RISK, you can set your variables, change them, and rerun your simulation and see exactly how it changes—it’s very helpful.”

@RISK Helps Keep Pupfish from the Brink of Extinction

The Devils Hole pupfish (Cyprinodon diabolis) is one of the world’s most endangered animals. The only wild population lives in a single aquifer-fed thermal pool in Nye County, Nevada, and has been perched on the brink of extinction, numbering just 35–68 fish in 2013. A major strategy for conserving the pupfish has been to create additional captive populations, but scientists needed to know how best to extract wild pupfish for breeding purposes without unduly accelerating the extinction risk for the population in Devils Hole. Dr. Steven Beissinger, Professor of Conservation Biology at the University of California, Berkeley, constructed a population viability analysis (PVA) using @RISK to better inform this dilemma.

Dr. Beissinger first created models of extinction risk for the pupfish in the wild, which showed that most simulated populations became extinct within 50 years. Median and mean times to extinction were 26 and 27 years, and 17 and 22 years, respectively. Next, to evaluate how different strategies for starting a captive breeding program would affect the wild population, Dr. Beissinger modeled the effects of removing different numbers of individuals (0–14) at the start of each simulated year.

Dr. Beissinger then turned to the question of the life stage at which pupfish should be harvested. The model showed that removing pupfish eggs had the least effect on the wild population. Indeed, Dr. Beissinger calculated that removing 25 eggs for captive breeding is equivalent to removing a single adult in terms of its influence on population dynamics.
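The published PVA is far more detailed, but the structure of the question can be shown with a crude stochastic sketch. The growth variability, population ceiling, and removal policies below are placeholders, not Dr. Beissinger's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_population(n0=50, years=50, removals=0, runs=2_000):
    """Toy population viability sketch: log-normal annual growth with an assumed
    ceiling, and a fixed number of adults removed each year for captive breeding."""
    extinct_year = np.full(runs, np.inf)
    for r in range(runs):
        n = n0
        for t in range(1, years + 1):
            growth = rng.lognormal(mean=0.0, sigma=0.35)    # environmental stochasticity (assumed)
            n = min(int(n * growth), 120)                   # assumed habitat ceiling of ~120 fish
            n -= removals                                   # harvest for the captive population
            if n <= 1:
                extinct_year[r] = t
                break
    return extinct_year

for k in (0, 2, 6):
    ext = simulate_population(removals=k)
    print(f"remove {k} adults/yr: P(extinct within 50 yr) = {np.mean(ext <= 50):.2f}")
```

A comparable experiment in which egg removals are discounted by their much lower survival is what underlies the 25-eggs-per-adult comparison.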

This modeling work has helped to inform decisions made by the U.S. Fish and Wildlife Service to conserve the Devils Hole pupfish; the agency has now begun to remove pupfish eggs from Devils Hole to start a captive population in its state-of-the-art breeding facility. “The modeling helped everyone to see what some of the trade-offs would be and made the various outcomes more explicit,” says Dr. Beissinger.

Dr. Beissinger uses @RISK for both research and teaching, and finds it to be a valuable tool: “@RISK makes Monte Carlo processes easy for professionals and students to understand,” he says. “The nice thing about it, from my perspective, is that it functions within Excel, which makes visualizing the information a lot easier and more intuitive for people. It makes what would otherwise be a lot of tedious steps a lot easier to do.”

Read the complete case study.

 

Counting Cars: @RISK Models Traffic Congestion

In their article “Estimation of Mixed Traffic Densities in Congested Roads Using Monte Carlo Analysis,” published in the Air & Waste Management Association’s April 2015 issue of EM Magazine, authors Brian Freeman, Bahram Gharabaghi, and Jesse Thé use @RISK to create a novel method for estimating the number and type of vehicles on a congested 1-km stretch of roadway.

When researchers need to model traffic patterns, they typically go out and count cars, multiple times a day at multiple locations, explains researcher Brian Freeman. He wanted to devise a more efficient method for evaluating traffic congestion.


Freeman and his colleagues first assumed that, in congested traffic, each vehicle occupies road space based on its length (L) and inter-vehicle gap (IVG). Both L and IVG are independent variables subject to a wide range of values. Average vehicle lengths range from 1.8 m for a sedan up to 9.7 m for a large bus. IVGs are independent of vehicle type, driven instead by driver behavior and changes in the speed of the vehicle traveling ahead. The authors accounted for four vehicle classes—sedans, SUVs, midsized buses, and large buses—and considered speeds from 5 to 40 KPH.

Each vehicle length was assigned its own PERT distribution based on vehicle manufacturer data. Using @RISK, the authors ran their stochastic model for 5,000 iterations on each variable, covering speeds of 5, 10, 15, 20, and 40 KPH simultaneously, for a 1-km stretch of road. During each iteration, a vehicle class was randomly selected from the four classes for each space.
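The published model runs its @RISK distributions directly in Excel, but the same packing logic can be sketched in a few lines of Python. The 1.8 m sedan and 9.7 m bus lengths and the speed range echo the article, while the PERT parameters and gap assumptions are invented for the illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def pert(low, mode, high):
    """Beta-PERT sample: a standard way to encode min / most-likely / max estimates."""
    a = 1 + 4 * (mode - low) / (high - low)
    b = 1 + 4 * (high - mode) / (high - low)
    return low + rng.beta(a, b) * (high - low)

# (min, most likely, max) vehicle lengths in metres; the sedan and large-bus
# endpoints echo the article, the rest are illustrative assumptions.
VEHICLES = [
    (1.8, 4.5, 5.2),    # sedan
    (4.2, 4.8, 5.5),    # SUV
    (6.0, 7.5, 9.0),    # midsized bus
    (9.0, 9.7, 12.0),   # large bus
]

def vehicles_on_1km(speed_kph, iterations=1_000):
    """Pack randomly chosen vehicles plus inter-vehicle gaps into 1,000 m of lane."""
    counts = np.empty(iterations, dtype=int)
    for i in range(iterations):
        used, n = 0.0, 0
        while True:
            low, mode, high = VEHICLES[rng.integers(len(VEHICLES))]   # random vehicle class
            length = pert(low, mode, high)
            gap = pert(1.0, 1.0 + 0.15 * speed_kph, 3.0 + 0.4 * speed_kph)  # assumed speed-dependent IVG
            if used + length + gap > 1000:
                break
            used += length + gap
            n += 1
        counts[i] = n
    return counts

for v in (5, 10, 20, 40):
    print(f"{v:>2} KPH: mean vehicles per km of lane = {vehicles_on_1km(v).mean():.0f}")
```

Fitting a curve through the mean counts at each speed gives the kind of power-curve relationship described in the results below.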

The resulting graphs of the @RISK model yielded a power curve that approximates the expected number of cars at each speed, thus giving researchers a fast, convenient tool for better understanding and estimating vehicle numbers in traffic. For Freeman, the benefit of using @RISK for this research was its efficiency. “You can quickly create a model in Excel that would otherwise require a very complicated statistical tool package to set up,” he says. “You can easily create your model without having to become an expert in statistics.”

Read the EM Magazine article here, and see the Palisade case study here.

 

Review our Software on Capterra

Palisade software users–have our tech tools been valuable to you and your work? If so, we’d love you to take a few moments to review our products on Capterra.

Capterra provides software directories, blogs, infographics and ebooks to help customers find the exact type of software tool they’re looking for. This tech concierge service has been used by major industry leaders–such as Warner Brothers, Coca-Cola, Walmart, Whole Foods, and The Home Depot–to find the right tech solutions.

Palisade software is listed on Capterra, and we’d like to help people find it.  If you can, please take a moment to review our software tools on Capterra and help spread the word:

Visit the Capterra @RISK page
Visit the Capterra DecisionTools Suite page
Visit the Capterra BigPicture page

 

With Great Power Comes Great Responsibility: @RISK plays a part in finance analysis for Spider-Man 2 and other movie favorites

The next time you settle in at the Cineplex to watch the latest Hollywood blockbuster, you may have @RISK to thank in part for its production.

Movies such as Spider-Man 2, The Interview, and 22 Jump Street have all benefited from @RISK’s quantitative risk analysis—thanks to Benjamin Waisbren. Waisbren, a partner at the law firm Winston & Strawn, is also the President and a shareholder of LSC Film Corporation, which funds major film productions–such as V for Vendetta, Blood Diamond, and 300. In his role as a film financier, Waisbren uses @RISK in all his negotiations.

Last year, Waisbren relied on @RISK to navigate a $200 million co-financing deal with Sony. The deal gave Waisbren’s team, LStar Capital, a stake in nearly all of Sony’s movies. Waisbren says the success was in part thanks to Palisade’s risk analysis software: “I am happy to say that @RISK played a very significant part in my work on that deal,” he says. “I did the modeling behind it, and because of @RISK, we did not hire any outside advisor or consultant.” Additionally, because of the risk modeling tool, “our closing costs were about 20 million dollars lower than other deals.” After closing the agreement, Waisbren was then asked to be manager of the new film fund for LStar Capital.

Since then, Waisbren has continued to use @RISK for statistical analysis quantifying the risk and volatility of motion picture businesses. For example, the software has helped him make uncannily accurate predictions of how much a film will earn on its opening day or weekend—or how much money a film production will burn through in a certain number of months. In short, it has proven itself invaluable: “I’m really grateful for this product—it has provided a substantial competitive advantage in this business,” he says. “I’m a huge believer in this tool.”

Read the full case study here.

Oil and Gas Companies Rely on @RISK to Evaluate Insurance Structure

In a volatile industry sector, many large oil and gas companies establish their own insurance subsidiary. These entities, called captive insurance companies, help manage the overall group risk. Due to the way in which they manage reserves, a captive can typically afford to retain higher risks than individual operating companies or business units. Alesco Risk Management Services Limited (Alesco), an independent energy insurance broker and risk management consultant, works with oil and gas companies to determine how much risk they should retain in their business units, how much should reside with their captive(s), and at what point they should transfer the excess or catastrophe risk to local and international reinsurance markets.

To help its clients develop an informed risk management strategy, Alesco uses @RISK to design models that forecast future insurance losses. This enables alternative insurance structures to be tested to see how different balances of business unit retention (usually a simple deductible or excess on the policy that will be applied before any insurance claim is made), captive retention, and, beyond that, commercial insurance, affect premium levels and capital requirements. The objective is to find the optimal structure that balances an acceptable premium cost with the client’s financial ability to retain risk, and its appetite to do so.
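Alesco's models are proprietary, but the layering idea they test can be sketched with a simple frequency/severity simulation. The claim distributions, deductible, and captive limit below are assumptions chosen only to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000  # simulated policy years

# Illustrative assumptions, not Alesco's figures (amounts in $ millions).
deductible = 0.5      # retained per claim by the operating business unit
captive_limit = 10.0  # annual aggregate retained by the captive above those deductibles

def simulate_year():
    n_claims = rng.poisson(3.0)                                     # assumed claim frequency
    severities = rng.lognormal(mean=0.0, sigma=1.2, size=n_claims)  # assumed claim sizes
    unit = np.minimum(severities, deductible).sum()                 # paid by the business units
    above_ded = np.maximum(severities - deductible, 0).sum()        # flows up to the captive
    captive = min(above_ded, captive_limit)
    reinsured = above_ded - captive                                 # transferred to the market
    return unit, captive, reinsured

results = np.array([simulate_year() for _ in range(N)])
for name, col in zip(["business units", "captive", "reinsurers"], results.T):
    print(f"{name:>15}: mean ${col.mean():5.2f}M, 99th percentile ${np.percentile(col, 99):6.2f}M")
```

Re-running the simulation with different deductibles and captive limits, and pricing the transferred layer, is how alternative structures can be compared on premium cost versus retained volatility.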

“Palisade’s @RISK makes it quick and simple to run Monte Carlo simulations directly in Excel, thereby avoiding the need to build complex models with thousands of rows of data and code,” explains Derek Thrumble, Partner at Alesco. “As a result, we can undertake complex forecasting for our clients within a realistic time-frame to influence decisions that meet their corporate, financial, and legal requirements, and determine the insurance strategy that is the best fit for them at that time.”

Read the full case study here.

Play Ball! Palisade Software Used in Award-Winning Sports Analytics Research

Clayton Graham being interviewed by Jody Avirgan for FiveThirtyEight, the statistics-minded ESPN website founded by statistician Nate Silver.

Baseball fans, take note: next time you’re looking to place a bet on a game, you may want to use @RISK. That’s what DePaul University professor Clayton Graham did to create his baseball wagering model, presented at the prestigious ninth annual MIT Sloan Sports Analytics Conference in 2015. Dr. Graham’s research submission, “Diamonds on the Line: Profits through Investment Gaming,” topped the “Business of Sports” track and earned third place overall.

In the paper, Dr. Graham discusses how he used Palisade’s DecisionTools Suite to create a baseball investment model that calculates the probability of winning individual games and the economic consequences of each wager measured against the game’s betting line. Additionally, the research sought to determine the optimal bet size based upon the risk tolerances of the investor.

Creating the model required the following five steps:

  1. Building a predictive function that determined the probability of winning each game
  2. Defining the betting line of each game, which determines the payoff or loss
  3. Establishing an economic relationship between the production function and betting line
  4. Creating a risk-return based investment function compatible with the production model
  5. Quantifying the model’s results (a simplified sketch of steps 3 and 4 follows this list)
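The paper's actual production and investment functions are not reproduced here, but steps 3 and 4 can be illustrated with a standard pairing: converting the betting line to an implied probability and sizing the wager with a fractional Kelly rule. The 58% win probability, the -120 line, and the half-Kelly scaling are hypothetical values chosen for the example:

```python
def implied_prob(moneyline):
    """Win probability implied by an American moneyline (ignoring the bookmaker's vig)."""
    return 100 / (moneyline + 100) if moneyline > 0 else -moneyline / (-moneyline + 100)

def kelly_fraction(p_win, moneyline, scale=0.5):
    """Fraction of bankroll to wager; 'scale' applies a fractional-Kelly haircut
    reflecting the investor's risk tolerance."""
    b = moneyline / 100 if moneyline > 0 else 100 / -moneyline  # net payout per $1 staked
    f = (b * p_win - (1 - p_win)) / b                           # classic Kelly criterion
    return max(0.0, scale * f)

# Hypothetical game: the predictive model gives a 58% win probability against a -120 line.
p_model, line = 0.58, -120
edge = p_model - implied_prob(line)
stake = kelly_fraction(p_model, line) * 1_000  # $1,000 bankroll, as in the paper's experiment

print(f"model edge over the betting line: {edge:+.1%}")
print(f"suggested wager: ${stake:.0f}" if edge > 0 else "no bet: insufficient edge")
```

A rule of this kind also explains why most games warranted no wager at all: whenever the model's probability offered no edge over the line, the recommended stake was zero.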

“Palisade’s DecisionTools Suite was invaluable to the success of this lengthy project, as it quickly and easily computed the myriad statistical scenarios,” said Dr. Graham. “Baseball has a seemingly infinite set of possibilities with each at-bat, and the intricacies of determining what may happen would be impossible to determine manually, with any degree of expediency. DecisionTools Suite is also very easy to use and intuitive because it operates in Microsoft Excel. I can say, without hesitation, that this project would not have been possible without DecisionTools Suite and the technical support Palisade offers.”

Once the model was complete, an initial bankroll of $1,000 was used to place wagers (about two per day) on Major League Baseball games, beginning on June 16, 2014, and running through the conclusion of the World Series on October 29, 2014. The amount wagered on each game was determined by the analysis, and 75 percent of the time games didn’t offer enough statistical edge to warrant a wager at all. Key results included:

  • 68 percent of wagers resulted in wins,
  • 35 percent return on daily at-risk capital,
  • Initial $1,000 investment grew more than 1,400 percent during the season.

“While this paper is not a typical utilization of DecisionTools Suite, the research clearly shows the power and accuracy of the software, and we are thrilled that our solution played a part in achieving such a prestigious honor,” said Randy Heffernan, Vice President, Palisade. “It is especially gratifying to see Professor Graham’s tireless work being recognized at this level, as he has been a Palisade customer and champion for more than three decades.”

Check out Dr. Graham’s presentation at the Sloan Sports Analytics Conference:

Read the full research paper here.

Read the press release here.

Rave Review for BigPicture from Mind Mapping Software Blog

BigPicture has garnered an in-depth and enthusiastic review from the Mind Mapping Software Blog, a leading source for news, trends and resources related to visual mapping. Chuck Frey, widely regarded as one of the leading experts on visual mapping and visual thinking, explored the software’s numerous features and says, “You’ll be very impressed.”

Frey investigates BigPicture’s basic diagram-building capabilities, noting that it has “a unique capability that leverages the inherent strengths of Excel” and that BigPicture enables “a fast way to build your diagram from scratch or to leverage existing data. Very cool! This is a capability that you won’t find in any dedicated mind mapping or diagramming program.”

Frey then delves into data mapping–‘where the real power is’–exploring the software’s ability to take reams of data and convert them into diagrams. “If your eyes glaze over when you look at a large number of columns and rows of data in Excel like I do, you’ll immediately appreciate this aspect of BigPicture,” Frey writes.

He is also impressed with BigPicture’s automated and elegant organizational chart functions: “This promises to be a big time-saver for human resources managers!” and its convenient slide-show mode: “quickly build presentations that have excellent continuity from one slide to the next.”

Frey concludes: “If you’re looking for an affordable, easy-to-use diagramming and mind mapping tool and you already use Excel, then you owe it to yourself to check out BigPicture. I think you’ll be very impressed with its excellent toolset and ease of use. This is backed up by a help file and videos that actually do an excellent job of explaining how to perform common tasks. That’s all too rare today!”

To read the full Mind Mapping Software Blog review, click here.

Download your complimentary version of BigPicture.

How Safe is That Shrimp? @RISK Tackles Health Hazards After Deepwater Horizon Spill

Southeast Louisiana is home to a large population of Vietnamese Americans who rely heavily on the shrimp caught in the Gulf for their livelihood and as a food source. When the Deepwater Horizon oil spill happened in April 2010, Vietnamese shrimpers were particularly concerned. While the U.S. Food and Drug Administration (FDA) conducted risk assessments on seafood contaminant levels and health risks, the Vietnamese community worried that the FDA’s risk assessment did not accurately take into account their much higher levels of shrimp consumption and lower-than-national-average body weight. They were also concerned that the FDA did not source specimens from the key areas where they commonly fished for shrimp.

At the request of a prominent Vietnamese community organization, Dr. Jeffrey Wickliffe, Associate Professor of Global Environmental Health Sciences at the Tulane University School of Public Health and Tropical Medicine, conducted a targeted health risk analysis of the Vietnamese shrimping community and its potential for heightened risk from the Deepwater Horizon spill. He and his colleagues collected key data, including concentrations of PAHs in shrimp; daily shrimp intake rates of surveyed Vietnamese community members; assumed durations of exposure to PAHs; individuals’ self-reported body weights; and averaging times (the time used to average out the dose of PAHs). Using @RISK, the model for these inputs was simulated 10,000 times, and a sensitivity analysis was conducted to determine the most influential parameters. The analysis revealed that the concentration of chemicals and the daily shrimp intake rate were the most influential in determining risk levels. However, “The study showed that the shrimp were really low in PAHs overall,” says Dr. Wickliffe. “In fact, the testing did not actually detect any of the known carcinogens.”
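The study's actual parameters are not reproduced here, but the inputs listed above fit the standard ingestion-dose risk calculation, which can be sketched as follows. Every distribution and the slope factor below are placeholder assumptions, not the measured Gulf data:

```python
import numpy as np

rng = np.random.default_rng(11)
N = 10_000  # iterations, matching the count described; all inputs below are illustrative

C  = rng.lognormal(np.log(2.0), 0.8, N)    # PAH concentration in shrimp, ng/g (assumed)
IR = rng.lognormal(np.log(60.0), 0.5, N)   # daily shrimp intake, g/day (assumed)
ED = rng.uniform(5, 30, N) * 365           # exposure duration, days (assumed)
BW = rng.normal(60, 8, N)                  # self-reported body weight, kg (assumed)
AT = 70 * 365                              # averaging time for lifetime risk, days

# Lifetime average daily dose (mg per kg-day), then excess cancer risk via an assumed
# benzo[a]pyrene-like oral slope factor.
LADD = (C * 1e-6) * IR * ED / (BW * AT)    # ng/g x g/day -> mg/day, averaged over AT
slope_factor = 7.3                         # (mg/kg-day)^-1, placeholder
risk = LADD * slope_factor

print(f"median excess lifetime risk: {np.median(risk):.1e}")
print(f"99th percentile:             {np.percentile(risk, 99):.1e}  (screening level 1e-4)")
```

Running a sensitivity analysis on the simulated risks is how the team identified concentration and intake rate as the dominant drivers.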

Nonetheless, to be extremely conservative in their analysis, Dr. Wickliffe and his team modeled health risks using even more cautious assumptions about the presence and carcinogenicity of the PAHs. Only under the most conservative approach did excessive health risks (> 1 in 10,000 at the 99th percentile) appear, and even then only in the extreme upper tail of the modeled risk distribution. While these results are reassuring in terms of the overall health risk posed to the Vietnamese American shrimping community, Wickliffe’s team cautions that this approach is not currently tenable for policy-based chemical risk assessment because of the dearth of knowledge regarding the toxicology of these modeled compounds.

Dr. Wickliffe uses @RISK in all the courses he teaches. “Probabilistic analysis is where the regulatory agencies are going with risk assessments,” says Dr. Wickliffe. “So for students who are getting a degree in public health and environmental health sciences, this is the kind of training they need—they need to know how to conduct this kind of risk assessment.”

Read the full case study here.

Learn more about @RISK.