Month: October 2010

Calm Those Tweets

Sooner or later, it had to happen. . . .  Tweets have been linked to stock market behavior.  This was not a case of inside information.  Researchers from Indiana University have demonstrated that public mood, as expressed in millions of Tweets, can predict stock market behavior with fair reliability.
 
Analyzing a collection of 9.8 million Tweets from 2.7 million users in 2008, the team used a "subjectivity analysis" tool called OpinionFinder and a Profile of Mood States (a psychological measure) to create a time series that tracked daily variation in public mood as exhibited by the language of the Tweets.  The team then compared the fluctuations in mood with those of the closing values of the Dow Jones index.
 
To make these comparisons, the team trained a neural network on the data.  Of course, this was not just any neural network.  It was a Self-Organizing Fuzzy Neural Network, one that organizes its own "neurons" during the training process.
 

The patterns that this neural network identified revealed that terms in Tweets conveying a sense of calmness anticipated upward movement in the stock market.  These predictions were 87.6 percent accurate.  Although I have been unable to track down the statistical analysis methods behind the mood measures, these odds would seem to be impressive.
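
Out of curiosity about the mechanics, here is a minimal sketch of the pipeline's shape.  It swaps in plain logistic regression as a stand-in for the study's Self-Organizing Fuzzy Neural Network and runs on synthetic data; the calm scores, the three-day lead, and the effect size are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, lag = 500, 3  # hypothetical three-day lead of mood over the market

# Synthetic daily "calm" scores (a stand-in for OpinionFinder/POMS output).
calm = rng.normal(0.0, 1.0, n_days)

# Synthetic market direction: an up-day is more likely when the public
# mood was calm `lag` days earlier (invented effect size of 1.5).
logits = 1.5 * np.concatenate([np.zeros(lag), calm[:-lag]])
up_day = (rng.random(n_days) < 1 / (1 + np.exp(-logits))).astype(float)

X = calm[:-lag]   # mood on day t
y = up_day[lag:]  # market direction on day t + lag

# Plain logistic regression fit by gradient descent -- a stand-in for the
# Self-Organizing Fuzzy Neural Network used in the study.
w = b = 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(w * X + b)))
    w -= 0.1 * np.mean((p - y) * X)
    b -= 0.1 * np.mean(p - y)

pred = (w * X + b) > 0.0  # predicted probability of an up-day > 0.5
print(f"directional accuracy: {np.mean(pred == y):.1%}")
```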

Does the relation between Tweeting and the stock market work only one way?  Or does this result imply that if we want to avoid another Black Swan dive in the financial markets, we should just think calm thoughts and Twitter slowly?

Decision trees and the rescue of the Chilean miners

As the world celebrates with Chile following the rescue of the 33 miners trapped underground for 69 days, it is interesting to review the various technologies that have been instrumental in supporting the engineering feat that lifted the men to ground level.

For example, during the crisis, mining expert Manuel Viera, CEO and managing partner of Metaproject, an engineering consultancy and Palisade partner, proposed a model to determine the rescue method that would subject the miners to the least risk.  He used Palisade’s decision tree analysis tool, PrecisionTree, to evaluate the various alternatives from a technical and economic perspective.

The decision evaluation challenge was how to rescue the miners as quickly as possible while ensuring that their mental and physical health was maintained as the rescue mission was planned and implemented.  The rescue operation was very risky, not least because another landslide could occur, with causal factors including geological faults, a lack of accurate plans of the inside of the mine, and insufficient knowledge of the mine’s structural geology. The additional drilling required to rescue the miners could have caused walls to collapse further as a result of microfractures and faults in the rock.

A key decision in the risk analysis was whether to raise the miners to 300 meters below the surface, or to keep them in their current location near the refuge at 700 meters.  There were also several drilling options for reaching the trapped miners.
 
PrecisionTree presented a matrix of statistical analysis results for each tree branch (i.e., rescue option).  This made it possible to ascertain, for example, that for some of the drilling options it was feasible to move the miners in two stages, but for others it was not, due to logistical problems. Information such as this is invaluable when decision making under uncertainty is a matter of life and death.
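
The Metaproject model itself is not reproduced here, but the arithmetic behind "rolling back" a decision tree is easy to illustrate.  The sketch below scores two hypothetical rescue branches by expected value; every probability and outcome value is invented for illustration and none comes from the PrecisionTree analysis.

```python
# Minimal decision-tree rollback: expected value of each rescue option.
# All probabilities and outcome scores below are hypothetical illustrations,
# not figures from the Metaproject PrecisionTree model.

def expected_value(branches):
    """branches: list of (probability, value) pairs at one chance node."""
    return sum(p * v for p, v in branches)

options = {
    # Move the miners up to 300 m first, then lift them in a second stage.
    "two-stage via 300 m": expected_value([
        (0.70, 100),   # both stages succeed
        (0.20, 40),    # delay: re-drill after a partial collapse
        (0.10, -50),   # further collapse during the move
    ]),
    # Drill straight down to the refuge at 700 m.
    "direct to 700 m": expected_value([
        (0.80, 90),    # the single bore succeeds (slower, one operation)
        (0.15, 30),    # the drill deviates; restart the bore
        (0.05, -50),   # collapse at depth
    ]),
}

for name, ev in options.items():
    print(f"{name}: expected value {ev:.1f}")
print("preferred option:", max(options, key=options.get))
```

A real model would have many more branches and chance nodes, but the comparison of expected values at the root is the core of the method.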

The actual rescue operation went straight to 700 meters and used three drills at the same time: Drill A, the Strata 950 raise bore machine; Drill B, the Schramm T-130 machine; and Drill C, the RIG 442 machine.  As predicted by the Metaproject PrecisionTree analysis, Drill B was the first to reach the miners.

» Read more about the PrecisionTree model

The 2010 Palisade Risk Conference series continues – This week in Sydney!

The 2010 Palisade Risk Conference is being held this Wednesday and Thursday, October 20-21, at The Radisson Plaza Hotel Sydney. This event is an excellent opportunity to network with other professionals and find out how they’re using Palisade solutions to make better decisions. Our keynote speaker, Dr Frank Ashe of Q Group Australia and Macquarie University Applied Finance Centre, will deliver the presentation "Risk Management and Organisational Culture."  On Thursday, Andrew Kight of Aon Global Risk Consulting will deliver the plenary session, "Applications of @RISK in Risk Financing Strategy."

Other case studies will demonstrate how Palisade customers use the world’s leading risk and decision analysis software. Kimball Fink-Jensen of Kaizen Institute will present “How @RISK Adds Value to Lean Business Models and Six Sigma”; Evan Hughes of the Sunbury Electrification Project will present “Construction Risk Management in LOR”; and David Thompson will present “Decision Making in the Face of Uncertainty”. In addition, we have case studies from Marsh Risk Consulting, University of New South Wales, TBH Capital Advisers, Value Adviser Associates, and WEL Networks Ltd.

Sam McLafferty, CEO of Palisade Corporation, will provide a bit of background on Palisade’s history and describe what sets Palisade apart in the market. He will give an overview of @RISK and the DecisionTools Suite and then describe the latest enhancements and additions to these products before providing a glimpse into what’s coming next from the company.

Join us to find out why Palisade stands at the forefront of risk and decision analysis software!

» 2010 Palisade Risk Conference in Sydney

The Unsupervised Neural Net

In an article in last week’s Oil & Gas Journal, Tom Smith focused on the use of neural networks in oil and gas exploration.  Because of the technology’s usefulness in classifying data and identifying patterns, it has become widely used to reduce the risk and time involved in siting oil and gas wells. All well and good, but apparently not good enough to satisfy the growing intensity of exploration.
 
Oil and gas apparently aren’t the only things spurting out of the oil fields. These areas are gushing data, so much data that conventional neural networks can’t process all of the information.  Author Smith believes that the next step in reducing risk and wasted time in exploration will be the "unsupervised" neural network.  It pushes the Known off the computer screen and replaces it with an Automated Unknown.  
 
While the "supervised" neural network processes classified data, that is, known information, the "unsupervised" neural net can classify unclassified data and then process the patterns that result.  This makes it invaluable for seismic interpretation, that is, for detecting and analyzing subtle geological variations that may be related to the potential to extract usable oil or gas.
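
Smith’s article does not tie the idea to a single architecture, but the classic example of a network that organizes its own "neurons" around unlabeled data is the self-organizing map (SOM).  Here is a minimal SOM sketch that clusters synthetic "seismic attribute" vectors; the attributes, the hidden facies, and the grid size are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "seismic attribute" vectors (e.g., amplitude, frequency, phase)
# drawn from three hidden facies -- the labels are never shown to the net.
centers = np.array([[0.2, 0.8, 0.1], [0.7, 0.2, 0.5], [0.4, 0.5, 0.9]])
data = np.vstack([c + 0.05 * rng.normal(size=(200, 3)) for c in centers])
rng.shuffle(data)

# A small self-organizing map: a 4x4 grid of neurons, each a weight vector.
grid = 4
weights = rng.random((grid, grid, 3))
coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid),
                              indexing="ij"), axis=-1)

epochs = 20
for epoch in range(epochs):
    lr = 0.5 * (1 - epoch / epochs)                      # decaying learning rate
    radius = max(grid / 2 * (1 - epoch / epochs), 0.5)   # shrinking neighborhood
    for x in data:
        # Best-matching unit: the neuron whose weights are closest to x.
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Pull the BMU and its grid neighbors toward the sample.
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
        influence = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
        weights += lr * influence[..., None] * (x - weights)

# Each sample is now "classified" by its best-matching neuron.
labels = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=-1)),
                           (grid, grid)) for x in data]
print("distinct clusters found:", len(set(labels)))
```

The same mechanics scale to real attribute volumes; the grid size and the two decay schedules are the usual tuning knobs.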
 
Smith predicts that unsupervised neural networks will be a "disruptive" technology in seismic interpretation.  A disruptive technology is an unexpected innovation that changes the direction of progress in an industry, like digital downloads in the music industry.  If he’s right, it just got a whole lot easier to strike oil. 
 

Free Webcast This Thursday: “Using Simulation Techniques in Litigation”

On Thursday, October 7, 2010, Mike Pellegrino will present a free live webcast in which he will explore the use of @RISK risk analysis functionality for intellectual property valuations in litigation. He will discuss the basic applications of Monte Carlo simulation techniques to create defensible intellectual property valuations. He will then discuss the advantages of using simulation techniques in a litigation forum, common attack points from opposing counsel, and how to defend against such attacks.
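
The webcast’s models are not public, but the flavor of the technique is easy to convey.  Below is a minimal Monte Carlo sketch of a royalty-based patent valuation; every distribution and parameter is a placeholder invented for illustration, not a figure from the presentation.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Hypothetical inputs for a royalty-based patent valuation -- placeholder
# distributions only, not figures from the webcast.
unit_sales = rng.triangular(50_000, 120_000, 300_000, n_trials)  # units/year
price = rng.normal(40.0, 5.0, n_trials)                          # $ per unit
royalty_rate = rng.uniform(0.03, 0.07, n_trials)                 # share of revenue
years = 8
discount_rate = 0.12

# Present value of the royalty stream, assuming flat sales over the term.
annual_royalty = unit_sales * price * royalty_rate
annuity = (1 - (1 + discount_rate) ** -years) / discount_rate
value = annual_royalty * annuity

print(f"mean value: ${np.mean(value):,.0f}")
print(f"5th pct:    ${np.percentile(value, 5):,.0f}")
print(f"95th pct:   ${np.percentile(value, 95):,.0f}")
```

The 5th-to-95th percentile spread is the kind of defensible range that a single point estimate cannot provide.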

Attendees should be comfortable with basic @RISK functionality and basic asset valuation theory, and should have a basic understanding of intellectual properties such as patents, copyrights, trademarks, and trade secrets.

Mike Pellegrino is the founder and President of Pellegrino & Associates. He is a former practicing software engineer, chief financial officer, and accountant. This diverse background gives Mike the requisite experience to credibly and defensibly value intellectual properties that include best-selling books, top-selling branded products, cutting-edge software, patents that drive many of today’s high-tech products, and even John Dillinger’s publicity rights.

» Register now (FREE)
» View archived webcasts

Monte Carlo’s Place in Bioscience

Mentions of Monte Carlo simulation in the popular press are increasingly frequent, and they usually refer to its use in the realm of finance, for such applications as determining value-at-risk, reserve estimation, and credit risk management, because this is where quantitative analysis hits us directly in the pocketbook and where the technique is relatively easy to explain.  But there is a parallel upturn in coverage in the realm of medicine, particularly in pharmaceutical risk management, that is mostly taking place out of the public eye.
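
Value-at-risk illustrates why the finance applications are the easy ones to explain: simulate many possible one-day outcomes and read off the loss at a chosen percentile.  A minimal sketch with hypothetical portfolio figures:

```python
import numpy as np

rng = np.random.default_rng(7)
n_trials = 100_000

# Hypothetical one-day returns for a $1M portfolio: normally distributed,
# with 5% annual drift and 20% annual volatility scaled to one trading day.
portfolio = 1_000_000
daily_returns = rng.normal(0.05 / 252, 0.20 / np.sqrt(252), n_trials)
pnl = portfolio * daily_returns

# 95% value-at-risk: the loss exceeded on only 5% of simulated days.
var_95 = -np.percentile(pnl, 5)
print(f"one-day 95% VaR: ${var_95:,.0f}")
```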
 
This coverage appears in specialized periodicals, such as Genetic Engineering & Biotechnology News, in their online counterparts, and in the offerings of online aggregators targeting audiences in medicine, public health, and the pharmaceutical industry.  These articles deal with statistical analyses that are not so easy to explain, such as pharmaceutical risk assessment in drug trials, diagnostic probabilities in new treatment regimes, and risk analysis of public health hazards, and only a limited number of readers can understand them.

I mention this parallel stream of publishing because of the sheer number of medical, pharmaceutical and biotechnology studies that rely on Monte Carlo simulation. The steady rise in the number of Google alerts I receive is pretty clear evidence that the technique has escaped corporate headquarters and is deeply entrenched in the biosciences, going to work on life-and-death issues.