The Flash Crash

Last May 6, the Dow Jones Industrial Average made a rapid series of inexplicable drops and, in fact, fell more than 500 points in one five-minute period.  Then, just as inexplicably, the market recovered.  The causes of the so-called Flash Crash remained mysterious until September, when the SEC issued a report on the rapid fluctuation of the market.  It found that a single "large fundamental trader" had used an algorithm to hedge its market position both aggressively and quickly.
 
Since then, the role of neural networks and algorithms in automated transactions has received a good deal of attention from the media.  The online edition of this month's Wired offers a fascinating perspective on algorithms as investors.  It reveals how neural networks and other automated types of statistical analysis can chew through news of the financial markets--essentially a big pile of data--to instantaneously produce a financial risk analysis, weigh the results of a prospective trade in portfolio risk management terms, and make the trade.  The speed with which a computer can function as an investor is part of the problem. It produces a kind of feedback loop in which each instantaneous trade produces instantaneous responses from other computers trolling the markets.
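To make the feedback idea concrete, here is a minimal sketch--a toy simulation, not any real trading system--in which a group of momentum-chasing algorithms each react to the last price move. The FEEDBACK_GAIN below is an invented parameter; because the combined reaction it represents is larger than the move that triggered it, small shocks snowball into outsized swings.

```python
import random

random.seed(1)

price = 100.0
history = [price, price]
FEEDBACK_GAIN = 1.1   # combined momentum reaction of all the algorithms

for tick in range(60):
    last_move = history[-1] - history[-2]
    shock = random.gauss(0, 0.05)               # stand-in for incoming news
    move = FEEDBACK_GAIN * last_move + shock    # the algorithms react to the last move
    price += move
    history.append(price)

biggest = max(abs(b - a) for a, b in zip(history, history[1:]))
print(f"final price {price:.2f}, largest single-tick move {biggest:.2f}")
```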
 
The trend toward computer control of financial markets, however, does not continue unfettered. The month after the Flash Crash, the SEC instituted some "circuit breakers," rules that halt trading when the feedback loops become too intense and the markets fluctuate too rapidly.

All of this raises a larger and more interesting question: How much control can we delegate to computers--not just in the financial realm but in our social and creative lives--before we have to scramble to catch up with them and regain control?

The Summit Looms!

There's an important event coming up in just a few short weeks--"Profit through Process," the IQPC Lean Six Sigma summit.  The summit is an almost week-long conference in Orlando, January 17 through January 20. IQPC has rounded up about 800 experts to deliver their knowledge, insights, and often inspiration to the folks who attend.   These presenters represent some high-profile companies like BP and PayPal, as well as leading consulting firms.

The sessions have been organized to satisfy the hankerings of everybody in the Six Sigma-process optimization crowd, from the chief quality officer to the Black Belt. And one thing I think will be really productive for the folks who get down to Orlando is the conference's emphasis on integrating a Lean Six Sigma program so that it becomes a living part of the organization.

Eight hundred is a lot of experts, who should have a lot of useful ideas about driving profit through process improvement.  And there will be a lot of useful tools to check out.  Which brings me to my role at the summit--how can I sign off on this blog without putting in a plug for Palisade?  I'll be there with our Monte Carlo software--booth #7--extolling the virtues of Monte Carlo simulation for Lean Six Sigma.  Monte Carlo simulation is a relatively recent addition to the Six Sigma toolkit, and while you may not think risk analysis is central to process optimization, it is really about getting a grip on uncertainty--and you can certainly relate to that.

Please note that IQPC is offering a "Free Hall Pass" for those interested in networking or exploring the latest Lean, Six Sigma, and BPM solutions.

Remember, it's January, and you can come to Florida!  Here's the website--www.leansixsigmasummit.com/Event.aspx?id=341814.

I hope to see you there!


2011 Palisade Risk Conference in Amsterdam Announced

Amsterdam User Conference 29-30 March 2011

Following on from the 2010 Palisade Risk Conferences in London, Sydney, Las Vegas, and Peru, we now have the pleasure of announcing the first of our 2011 series. The dates for your diary are the 29th–30th March, and the location is the city of Amsterdam. The venue for this event will be the West Indische Huis, the historic former headquarters of the Dutch West India Company, famous for purchasing Manhattan for the sum of 60 guilders.

The 2011 Palisade Amsterdam Risk Conference will follow the format of previous events. Over the course of two days, industry experts will present a selection of real-world case studies about innovative and interesting approaches to risk and decision analysis. The event will also include the ever-popular workshops and training given by Palisade consultants. In addition, you’ll see sneak previews of new software in the pipeline from Palisade, including the very exciting @RISK and DecisionTools Suite 6.0.

The conference will feature plenty of opportunities to network with other professionals, including a pre-dinner cruise followed by a meal at another location steeped in Dutch history, the award-winning D’Vijff Vlieghen (Five Flies) restaurant.

» Register


Calm Those Tweets

Sooner or later, it had to happen. . . .  Tweets have been linked to stock market behavior.  This was not a case of inside information.  Researchers from Indiana University have demonstrated that public mood, as expressed in millions of Tweets, can predict stock market behavior with fair reliability.
 
Analyzing a collection of 9.8 million Tweets from 2.7 million users in 2008, the team used a "subjectivity analysis" tool called OpinionFinder and a Profile of Mood States (a psychological measure) to create a time series that tracked daily variation in public mood as exhibited by the language of the Tweets.  It then compared the fluctuations in mood with those of the closing values of the Dow Jones index.
 
To make these comparisons, the team trained a neural network on the data.  Of course, this was not just any neural network.  It was a Self-Organizing Fuzzy Neural Network, one that organizes its own "neurons" during the training process.
 
The patterns that this neural network identified revealed that Tweeting terms conveying a sense of calmness anticipated upward movement in the stock market.  These predictions were 87.6 percent accurate. Although I have been unable to track down the statistical analysis methods behind the mood measures, that accuracy seems impressive.
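For readers who want a feel for the shape of the experiment, here is a hedged sketch. The Indiana team used OpinionFinder, a Profile of Mood States, and a Self-Organizing Fuzzy Neural Network on real Tweets and real Dow closes; the toy below substitutes a plain feed-forward network from scikit-learn and entirely synthetic data, just to show how a lagged mood series can be used to predict next-day market direction.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
days = 400
calm = rng.normal(0, 1, days)                     # stand-in daily "calm" score

# Synthetic ground truth: the market tends to rise a few days after calm days.
up = (0.8 * np.roll(calm, 3) + rng.normal(0, 0.6, days)) > 0

# Features: the previous three days of the calm series.
X = np.column_stack([np.roll(calm, k) for k in (1, 2, 3)])[3:]
y = up[3:].astype(int)

split = int(0.75 * len(X))
model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print(f"hold-out accuracy: {model.score(X[split:], y[split:]):.2f}")
```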

Does the relation between Tweeting and the stock market work only one way?  Or does this result imply that if we want to avoid another Black Swan dive in the financial markets, we should just think calm thoughts and Twitter slowly?

The 2010 Palisade Risk Conference series continues – this week in Sydney!



The 2010 Palisade Risk Conference is being held this Wednesday and Thursday, October 20-21, at The Radisson Plaza Hotel Sydney. This event is an excellent opportunity to network with other professionals and find out how they’re using Palisade solutions to make better decisions. Our keynote speaker, Dr Frank Ashe of Q Group Australia and Macquarie University Applied Finance Centre, will deliver his presentation, "Risk Management and Organisational Culture."  On Thursday, Andrew Kight of Aon Global Risk Consulting will deliver the plenary session, "Applications of @RISK in Risk Financing Strategy."

Other case studies will demonstrate how Palisade customers use the world’s leading risk and decision analysis software. Kimball Fink-Jensen of Kaizen Institute will present “How @RISK Adds Value to Lean Business Models and Six Sigma”; Evan Hughes of Sunbury Electrification Project presents “Construction Risk Management in LOR”; and David Thompson presents “Decision Making in the Face of Uncertainty”. In addition, we have case studies from Marsh Risk Consulting, University of New South Wales, TBH Capital Advisers, Value Adviser Associates, and WEL Networks Ltd.

Sam McLafferty, CEO of Palisade Corporation, will provide a bit of background on Palisade’s history and describe what sets Palisade apart in the market. He will give an overview of @RISK and the DecisionTools Suite and then describe the latest enhancements and additions to these products before providing a glimpse into what’s coming next from the company.

Join us to find out why Palisade stands at the forefront of risk and decision analysis software!

» 2010 Palisade Risk Conference in Sydney



The Unsupervised Neural Net

In an article in last week's Oil & Gas Journal, Tom Smith focused on the use of neural networks in oil and gas exploration.  Because of the technology's usefulness in classifying data and identifying patterns, it has become widely used to reduce the risk and time involved in siting oil and gas wells. All well and good, but apparently not good enough to satisfy the growing intensity of exploration.
 
Oil and gas apparently aren't the only things spurting out of the oil fields. These areas are gushing data, so much data that conventional neural networks can't process all of the information.  Author Smith believes that the next step in reducing risk and wasted time in exploration will be the "unsupervised" neural network.  It pushes the Known off the computer screen and replaces it with an Automated Unknown.  
 
While the "supervised" neural network processes classified data, that is, known information, the "unsupervised" neural net can classify unclassified data and then process the patterns that result.  This makes it invaluable for seismic interpretation, that is, for detecting and analyzing subtle geological variations that may be related the potential to extract usable oil or gas.
 
Smith predicts that unsupervised neural networks will be a "disruptive" technology in seismic interpretation.  A disruptive technology is an unexpected innovation that changes the direction of progress in an industry, like digital downloads in the music industry.  If he's right, it just got a whole lot easier to strike oil. 
 

Taking the guesswork out of water pipe replacement, using risk modeling software

Bournemouth & West Hampshire Water (BWHW) provides clean drinking water to around half a million people in an area in the south-west of the UK spanning over 1000 square kilometres. This amounts to an average of about 150 million litres of drinking water every day, through nearly 3000 kilometres of water mains.

This is no mean feat.  Not only is it a sizeable area, but BWHW’s large infrastructure is complex and aging: pipes laid in the Victorian era are still in service, and a wide variety of materials — including cast iron, ductile iron, cement, PVC and other plastics — is used for buried pipes, which range in diameter from 50mm up to 900mm. At the same time, BWHW must manage the system in line with the stringent, risk-based requirements set out by OFWAT (the regulator for the water and sewerage industry in England and Wales) to provide good quality service at a fair price for consumers.

A key challenge is pipe replacement, particularly in terms of identifying which sections should be a priority for renewal. Halcrow was commissioned to develop a risk-based model to consider and improve the capital efficiency of BWHW’s water distribution pipe network replacement programme. It did this using the risk analysis software @RISK, in conjunction with its own ‘cluster’ analysis tool, to combine the probability of pipe failures across the region with the consequential benefits of those pipes not failing in the future.
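To give a rough sense of the approach--this is a hedged sketch with invented burst rates and costs, not Halcrow's actual @RISK and cluster model--each segment's simulated failures can be weighted by the consequence of a failure, and the segments ranked by expected annual risk:

```python
import random

random.seed(7)

# (segment id, expected bursts per year, consequence of one burst in GBP)
segments = [
    ("victorian_cast_iron_A", 0.30, 80_000),
    ("ductile_iron_B",        0.10, 120_000),
    ("pvc_C",                 0.05, 20_000),
    ("trunk_main_D",          0.02, 500_000),
]

def simulate_annual_risk(rate, consequence, trials=20_000):
    """Mean annual cost of bursts; burst counts are Poisson via exponential gaps."""
    total = 0.0
    for _ in range(trials):
        bursts, t = 0, random.expovariate(rate)
        while t < 1.0:                     # count bursts falling within one year
            bursts += 1
            t += random.expovariate(rate)
        total += bursts * consequence
    return total / trials

ranked = sorted(((name, simulate_annual_risk(rate, cons)) for name, rate, cons in segments),
                key=lambda item: item[1], reverse=True)
for name, risk in ranked:
    print(f"{name:22s} expected annual risk ~ GBP {risk:,.0f}")
```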

The result is a risk analysis model that allows BWHW to make more informed decisions about targeting investment.  From there the water company is able to replace the pipes that, should they burst, would have the most significant impact on the level of service it offers its customers, as well as on the direct costs incurred by the organisation.

Taking a risk-based approach to its pipe replacement programme gives BWHW the assurance that it is achieving the best possible return on its investment.  It also ensures it can maintain its good record for providing good value and service to customers.

Craig Ferri
EMEA Managing Director of Risk & Decision Analysis

Big Data

The August issue of McKinsey Quarterly is devoted to the year's ten most significant "tech-enabled" business trends.  Of the ten, the one that caught my eye was Experimentation and Big Data.  Like many of us in IT-based businesses, I had been wide awake to Big Data--which are really nano-data in vast quantity--but only peripherally aware of entrepreneurial experimentation using it.  
 
Big Data, of course, are the billions of bytes of information that come to a business through its use of the Internet--clicks, click-throughs, patterns of browser use.  The more successful your e-commerce operation, the heavier the flow of Big Data.  But even the smallest e-enterprise can slurp an astonishing amount of it.  What's big about Big Data are the patterns in which they arrive, and these patterns are where the Experimentation comes in.  They tempt the enterprise to employ new stratagems to improve the effectiveness of its e-operations, and because of the fluidity of the Internet medium, a great deal of tinkering results.  If one ploy doesn't prove beneficial, it's pretty easy to try another one.
 
In effect, Big Data provide a focus group, or a series of focus groups.  This, it seems to me, is where risk analysis, especially financial risk analysis, has a Big Role to play.  With all the historical data in your lap, any standard operations risk simulation can tell you a lot about the potential results of Ploy A versus the results of Ploy B.  Train a neural network on the data.  Then bring up Excel and the Monte Carlo software, and let them chomp through the known patterns.  This should yield fair predictions of potential results. The ideal implementation, of course, is one in which the analytics play in real time as Big Data roll in.
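A minimal sketch of the Ploy A versus Ploy B comparison might look like the following. The conversion rates, visitor counts, and margins are invented; the point is only that simulation gives you a spread of outcomes rather than a single guess.

```python
import numpy as np

rng = np.random.default_rng(3)
trials, visitors, margin = 10_000, 50_000, 4.0   # margin per converted visit

def simulate(rate_low, rate_high):
    """Revenue outcomes when the ploy's true conversion rate is only known to a range."""
    rates = rng.uniform(rate_low, rate_high, trials)   # uncertainty about the ploy
    conversions = rng.binomial(visitors, rates)        # visitor-level randomness
    return conversions * margin

ploy_a = simulate(0.020, 0.030)   # e.g. a redesigned checkout page
ploy_b = simulate(0.018, 0.036)   # e.g. a targeted-offer campaign: higher upside, wider spread

for name, rev in (("Ploy A", ploy_a), ("Ploy B", ploy_b)):
    print(f"{name}: mean ${rev.mean():,.0f}, 5th-95th pct "
          f"${np.percentile(rev, 5):,.0f} - ${np.percentile(rev, 95):,.0f}")
print(f"P(Ploy B beats Ploy A) ~ {np.mean(ploy_b > ploy_a):.2f}")
```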

While the McKinsey authors don't trouble themselves about specific statistical analysis techniques, they make an important point: the fluidity and speed of both data acquisition and quantitative analysis require a new managerial mindset toward experimentation and decision evaluation.  
 

Health Care Management: Decision Making at Two Levels

Reading recent reviews of two books on healthcare caused me to realize that in spite of the rapidly increasing number of clinical studies that use risk analysis and neural networks to sort out the best treatment choices, there has been very little published on how to use quantitative tools like decision trees and Monte Carlo software to manage health care better. Given the recent national debates on health care reform, this is actually quite surprising. 
 
There's health care management, and then there's health care management.  On the macro level, decision evaluation focuses on the organization. Marian C. Jennings's Health Care Strategy for Uncertain Times (2000) prescribes ways for corporate health care managers to reshape the ways their organizations deal with uncertainty by adopting the same quantitative techniques used in the commercial realm by enterprises like investment firms and utility companies.  On the micro level, health care management focuses on you, your body. Thomas Goetz's The Decision Tree (2010) prescribes how to apply a number of these same decision analysis techniques to your own health. 
 
Essentially, what both books are saying is, "Look, the only certainty is uncertainty.  But you have some numbers.  Here are the tools to turn those numbers into plans you can reasonably rely on." These tools shouldn't be news to you as a reader of this blog, but apparently, if the popularity of Goetz's book and renewed attention to Jennings's are any indication at all, the health care management arena is plenty ripe for quantitative decision support tools.
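For readers new to the tool both books lean on, here is a bare-bones decision-tree calculation with invented probabilities and outcome scores: each chance node is rolled back to an expected value, and the treatment with the better expected value is chosen.

```python
treatments = {
    "Surgery": [
        (0.70, 0.95),   # (probability, quality-of-life score) full recovery
        (0.20, 0.60),   # partial recovery
        (0.10, 0.20),   # complication
    ],
    "Medication": [
        (0.50, 0.85),
        (0.40, 0.65),
        (0.10, 0.40),
    ],
}

def expected_value(branches):
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * value for p, value in branches)

for name, branches in treatments.items():
    print(f"{name}: expected outcome {expected_value(branches):.3f}")

best = max(treatments, key=lambda n: expected_value(treatments[n]))
print("Choose:", best)
```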

Another take on the BP Oil Spill

We are pleased to introduce you to consultant and trainer Sandi Claudell, today’s featured guest blogger. Sandi is CEO of MindSpring Coaching, and has been a valued Palisade Six Sigma Partner for quite some time. She is a Six Sigma Master Black Belt (Motorola), and is a Lean Master (Toyota Motors - Japan) among other notable achievements.

--Steve Hunt


Part 1: The Platform Disaster

Much has been said about the disastrous BP oil spill in the Gulf of Mexico. If we apply the theory of probability and reliability, then having so many different companies responsible for a very complex construction and operation added to the chance of failure.

 

There is probably a cultural issue at work where each entity wanted to give the other what they wanted to hear rather than the truth. (For historic and recent examples: the NASA Challenger and the recent Toyota Prius problems.) When we lose sight of quality and reliability of parts, construction, maintenance, testing under ALL conditions rather than the obvious few, etc., then we run high risks of failure. When you have built 100+ wells and avoided disasters  . . . perhaps people fool themselves into thinking there never WILL be a disaster. They don’t look at a model that demonstrates that the longer you go without such an event (given the input factors of how each element can and will fail), the closer you come to the event we all want to avoid.

 

They may or may not have used an integrated Systems Design  . . . not simply an engineering system but a system for how individuals work together, communicate with each other, and act either as a conforming unit or as a more self-directed, autonomous unit looking for and generating solutions outside the box--a team that is innovative and willing to look at all the possibilities and create a breakthrough design that was / is more mistake-proof.

 

If they had used DFSS (Design for Six Sigma), then their designs would have been more robust, taking into consideration all the necessary safety precautions for human life as well as immediate response to a potential failure. As part of DFSS we use a statistical tool called Design of Experiments (Strategy of Formulations, Central Composites, etc.) where we can try very complex interactions (factors) with minimal effort / cost and maximum statistical accuracy. DoE creates prediction equations that allow us to model and ask questions about what would happen under different conditions. More importantly, we can look at many different quality metrics (responses, outcomes, etc.) with the same experimental trial. If we replicate the test, then we can even forecast which elements cause variation (very hard to detect in highly complex systems without the use of statistics).
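To show what a prediction equation from a designed experiment looks like, here is a small sketch. The factors and response values are invented for illustration and have nothing to do with BP's actual operations; a two-level, three-factor design is fitted by least squares, and the fitted equation is then used to ask a "what if" question at an untried setting.

```python
import numpy as np
from itertools import product

# Coded factor levels (-1 / +1) for three hypothetical factors.
runs = np.array(list(product((-1, 1), repeat=3)), dtype=float)
response = np.array([52, 61, 55, 70, 50, 66, 58, 79], dtype=float)  # made-up responses

# Model: y = b0 + b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3
X = np.column_stack([
    np.ones(len(runs)),
    runs[:, 0], runs[:, 1], runs[:, 2],
    runs[:, 0] * runs[:, 1],
    runs[:, 0] * runs[:, 2],
    runs[:, 1] * runs[:, 2],
])
coefs, *_ = np.linalg.lstsq(X, response, rcond=None)

terms = ["1", "x1", "x2", "x3", "x1*x2", "x1*x3", "x2*x3"]
print("prediction equation: y =",
      " + ".join(f"{c:.2f}*{t}" for c, t in zip(coefs, terms)))

# Use the equation to ask "what if" questions at untried settings:
new_run = np.array([1, 1, -1, 1, -1, 1, -1], dtype=float)  # x1=+1, x2=-1, x3=+1
print("predicted response at x1=+1, x2=-1, x3=+1:", float(new_run @ coefs))
```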

 

If they had used an FMEA (Failure Mode and Effects Analysis  . . . a tool used in Six Sigma), then they could have anticipated failures and put error-proofing devices in place to detect and/or respond to potential faults BEFORE they become irreversible. If we add a Monte Carlo simulation to potential working conditions, then the model forecasts probability plots and identifies key factors that will be critical to success or failure.
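Again purely as illustration, with made-up scores rather than anything from the actual platform: an FMEA ranks failure modes by a risk priority number (severity x occurrence x detection), and a small Monte Carlo loop can express uncertainty in the occurrence scores rather than treating them as fixed.

```python
import random

random.seed(11)

# (failure mode, severity 1-10, (low, high) occurrence range, detection 1-10)
modes = [
    ("cement seal fails",        10, (3, 7), 8),
    ("blowout preventer jams",   10, (2, 5), 6),
    ("pressure sensor misreads",  7, (4, 8), 4),
    ("mud weight miscalculated",  8, (2, 6), 5),
]

def simulated_rpn(severity, occ_range, detection, trials=10_000):
    """Mean risk priority number when the occurrence score is uncertain."""
    total = 0
    for _ in range(trials):
        occurrence = random.randint(*occ_range)    # uncertain occurrence score
        total += severity * occurrence * detection
    return total / trials

results = [(name, simulated_rpn(sev, occ, det)) for name, sev, occ, det in modes]
for name, rpn in sorted(results, key=lambda r: r[1], reverse=True):
    print(f"{name:28s} mean RPN {rpn:7.1f}")
```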

 

Perhaps they did indeed run a Monte Carlo simulation using Crystal Ball. It is a good product, but if they had used Palisade’s @RISK and added some of the other tools provided by Palisade, such as RISKOptimizer, NeuralTools, etc., then they could have analyzed the system in other dimensions besides a simple Monte Carlo, thus uncovering weaknesses BEFORE designing and/or building the platform and well.

 

Part 2: Capping the well head

 

In Lean there is a whole discipline called “Error Proofing Devices”. As part of the design effort we need to create, first and foremost, safety and other devices that prevent the error from occurring in the first place. If that line of defense fails, then there should be devices built into the process designed to cap the well. If that line of defense fails as well, then there should be a disaster response plan, created, practiced, and tested, to ensure that the spill is contained immediately.

 

Part 3: Treating the resulting spill

 

Again, Design of Experiments could test different materials, chemicals, and methods to find the right combination to contain or otherwise manage the resulting oil spill. Trying only one chemical may be the age-old definition of madness . . . trying the same thing over and over again expecting different results. Again, a robust design of experiments could aid in the process of finding the most effective solution and, with multiple tests on the same samples, ensure that it is the safest for the environment and for the population most directly in the path of the oil spill. These tests are ideally run years before such a spill; however, doing something now is better than simply standing by and watching it happen.

 

Last but not least:

 

Management (executives down to line managers) should have coaches--coaches who can speak to the culture, the systems design, and the tools and methods used in Lean Six Sigma, and who can verify data analysis and help with the accurate interpretation of the data. These coaches should be independent . . . not full-time employees of the corporation, as independent coaches are more likely to speak the truth and highlight risks as well as opportunities.

 

Now BP and all the other entities may have done some of what I mentioned above. But I would assume they must have left out one or more of the listed items or we wouldn’t be looking at the oil traveling into the wetlands around New Orleans right now. Hindsight is always brilliant but we can learn from our mistakes. We can create better cultures, systems, error proofing devices, Experimental Designs etc.

 

 

BIO:  

 

Sandi Claudell is CEO of MindSpring Coaching. She is a Master Black Belt in Six Sigma, a Lean Master and has worked as a consultant for many companies to initiate worldwide improvements. For more information or to contact Sandi please visit http://www.mindspringcoaching.com/.


Neural Nets Writ Small

Of all the statistical analysis techniques I receive news alerts for, the neural network flashes up on my screen most often.  While I, like many of you, really enjoy the big-screen futuristic applications of neural nets--prediction of sun storms is a splendid recent example--there is a quieter trend ramping up at a more down-to-earth level. The nano level, that is: the itsy-bitsy, teeny-weeny, molecular level.
 
For at least the past five years, the nanotechnology industry has been predicting and prototyping ways to incorporate neural networks into nano-machines.  This innovation has proved to be very handy for sensing devices.  The nano-sensor combines receptor particles with electronics controlled by a neural network algorithm.  The neural net sorts through the sensor responses to uncover patterns that trigger alerts.
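As a rough picture of what that sorting involves--this is a generic sketch with invented signatures, and simple nearest-signature matching stands in for the neural network a real device would use--an array of receptor readings is compared against stored response patterns, and an alert fires when the closest pattern belongs to a target substance.

```python
import numpy as np

signatures = {                      # mean response of a hypothetical 8-receptor array
    "clean air": np.array([0.1, 0.1, 0.2, 0.1, 0.1, 0.2, 0.1, 0.1]),
    "explosive": np.array([0.9, 0.2, 0.8, 0.1, 0.7, 0.3, 0.9, 0.2]),
    "solvent":   np.array([0.3, 0.8, 0.2, 0.9, 0.2, 0.8, 0.3, 0.7]),
}
TARGETS = {"explosive"}

def classify(reading):
    """Return the nearest stored signature and whether it should raise an alert."""
    best = min(signatures, key=lambda s: np.linalg.norm(reading - signatures[s]))
    return best, best in TARGETS

rng = np.random.default_rng(5)
sample = signatures["explosive"] + rng.normal(0, 0.1, 8)   # a noisy "sniff"
label, alert = classify(sample)
print(f"classified as '{label}', alert={alert}")
```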
 
This year there was a flurry of media attention focused on one of these sensing technologies, the nano-nose, which uses an array of nano-receptors coordinated by a neural network.  These sensors are being promoted to sniff out everything from explosives to disease.  
 
One indication of the expected adoption of applications that combine nano with neural is the advertising for neural network algorithms that can downsize to nano. But more than one of the nano-machine innovators has commented on the need to develop more robust statistical analysis techniques to improve the accuracy of the sensors.  Which means that there will be more neural network to shrink, which means that the algorithms advertised today may already be outdated.

Whatever the commercial considerations and no matter how blasé we become about technological possibility, there is still a big wow factor in packing a high-powered computing technique into such infinitesimal space, and you can be certain the nano people will be harnessing neural networks to many new kinds of more-mini-than-micro machines.

20 Questions in a New Orbit

An Ottawa toy developer is trying to make a jet-propelled leap from an online game to space travel. His vehicle? A neural network designed as the back-end system for a game of 20 questions. Twelve years ago Robin Burgener wrote a neural net program to train on the sequences of player responses to questions--beginning with Animal? Vegetable? Mineral?--posed by the neural network.
 
 
The game does more than pose simple yes-or-no questions to lead you to a conclusion. The neural network algorithm is able to pose different questions in different orders, and it gets the right answer about 80 percent of the time.
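Burgener's program is a neural network trained on millions of games, which is nothing I can reproduce here; the toy guesser below simply picks, at each turn, the yes/no question that splits the remaining candidates most evenly. It is only meant to show why the order of questions can differ from game to game while still converging on an answer.

```python
# Toy 20-questions engine over a tiny, invented object-attribute table.
objects = {
    "cat":      {"alive": 1, "bigger_than_breadbox": 0, "man_made": 0, "flies": 0},
    "airplane": {"alive": 0, "bigger_than_breadbox": 1, "man_made": 1, "flies": 1},
    "oak tree": {"alive": 1, "bigger_than_breadbox": 1, "man_made": 0, "flies": 0},
    "marble":   {"alive": 0, "bigger_than_breadbox": 0, "man_made": 1, "flies": 0},
}

def play(secret):
    candidates = dict(objects)
    asked = set()
    while len(candidates) > 1:
        questions = {q for attrs in candidates.values() for q in attrs} - asked
        # Prefer the question whose yes/no split is closest to 50/50.
        q = min(questions, key=lambda q: abs(
            sum(attrs[q] for attrs in candidates.values()) - len(candidates) / 2))
        answer = objects[secret][q]
        print(f"Is it {q.replace('_', ' ')}? -> {'yes' if answer else 'no'}")
        candidates = {o: a for o, a in candidates.items() if a[q] == answer}
        asked.add(q)
    print("I guess:", next(iter(candidates)))

play("oak tree")
```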
 
Now, apparently, the sky's the limit for Burgener's neural network.  He was scheduled to make a presentation late last month at the Goddard Space Flight Center explaining the potential uses for a neural-networked 20 questions on board a spacecraft. These uses center broadly on troubleshooting technical and equipment problems and subsequently anticipating future problems.
 
If, as he claims, his neural net guessing program can work around responses that are misleading or downright lies, what that would mean for space travelers, he concludes, is that "if a sensor fails, you're able to see past it."
 
I know what he means, I think, but I myself don't tend to look past sensors.        

Neural Nets vs. the Ripple Effect

About a week ago the Financial Times ran an article about a "new" investment analysis technique that could cut through turbulence in the financial markets: neural network analysis.  I thought okay, this isn't new but maybe the application is innovative.  Besides, I liked the metaphor the reporter used, a metal ball dropped in a vat of oil and the ensuing ripples that disturb the oil.
 
The article is about software developed by a Danish investment firm that turned its back on "linear" models to adopt a neural network approach that continually reclassifies the investments in a portfolio and then makes suggestions about which equities to buy and which to sell. The proprietary software chews through a heap of data--prices, price-earnings ratios, and interest rates, for starters--and its performance benchmark is the Russell 1000 index.
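The firm's software is proprietary, so the sketch below is only a stand-in for the general idea: a k-means clustering (in place of the neural network) regroups equities by a few invented features, and each cluster's recent return decides whether its members get flagged "buy" or "sell".

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
tickers = [f"EQ{i:02d}" for i in range(12)]

# Invented features per equity: price/earnings ratio, 3-month return,
# and sensitivity to interest-rate moves.
features = np.column_stack([
    rng.uniform(8, 35, len(tickers)),      # P/E
    rng.normal(0.0, 0.08, len(tickers)),   # 3-month return
    rng.normal(0.0, 1.0, len(tickers)),    # rate sensitivity
])

# Standardise, then let the model regroup the portfolio.
z = (features - features.mean(axis=0)) / features.std(axis=0)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

for cluster in range(3):
    members = [t for t, l in zip(tickers, labels) if l == cluster]
    avg_return = features[labels == cluster, 1].mean()
    signal = "buy" if avg_return > 0 else "sell"
    print(f"cluster {cluster} ({signal:4s}): {', '.join(members)}")
```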
 
The test portfolio used to prove out the method was acquired in 2007, just before the ball dropped into the oil.  For a time it seemed to hold up but then got caught in the turbulence and its undertow. It has now recovered nicely, ahead of the Russell 1000 in fact, and the asset managers are looking for more investors. This is a sweet success story, especially given the demon turbulence looming over the project and the fact that the assets are apparently owned by the Danish state pension plan.

I understood the use of neural network software to counter nonlinear events like market turbulence, and I understood the continual classification and reclassification.  But I was intrigued that nowhere in the article was there a mention of risk, risk analysis, or even risk assessment.  Maybe it was there all the time, incorporated in the proprietary software, and maybe it just wasn't mentioned.  Certainly the asset managers who developed the program were aware they were at risk--they were chewing their nails as their fund slid down right beside all the other funds that were dropping in value.  But assessing risk doesn't seem to have been a factor in the firm's new defense against mayhem in the markets.  
 
So.  Is it time to shut down your Monte Carlo software?  I don't think so. . . .   

New Approaches to Risk and Decision Analysis



Risk analysis and decision-making tools are relevant to most organisations, in most industries around the world.  This is demonstrated by the speaker line-up at this year's European User Conference, an event at which we believe it is important to bring together customers from a wide range of market sectors.

We are holding 'New Approaches to Risk and Decision Analysis' at the Institute of Directors in central London on 14th and 15th April 2010.  As in previous years, the programme aims to provide everyone attending with practical advice to enhance the decision-making capabilities of their organisation.  Customer presentations, which offer insight into a wide variety of business applications of risk and decision analysis, include:
  • CapGemini: Faldo's folly or Monty's Carlo – The Ryder Cup and Monte Carlo simulation
  • DTU Transport: New approaches to transport project assessment; reference scenario forecasting and quantitative risk analysis
  • Georg-August University Research: Benefits from weather derivatives in agriculture: a portfolio optimisation using RISKOptimizer
  • Graz University of Technology: Calculation of construction costs for building projects – application of the Monte Carlo method
  • Halcrow: Risk-based water distribution rehabilitation planning – impact modelling and estimation
  • PricewaterhouseCoopers: PricewaterhouseCoopers and Palisade: an overview
  • Noven: Use of Monte Carlo simulations for risk management in pharmaceuticals
  • SLR Consulting: Risk sharing in waste management projects - @RISK and sensitivity analysis
  • Statoil: Put more science into cost risk analysis
  • Unilever: Succeeding in DecisionTools Suite 5 rollout – Unilever's story
We will also look at the recently-launched language versions of @RISK and DecisionTools Suite, which are now available in French, German, Spanish, Portuguese and Japanese.  Software training sessions will provide delegates with practical knowledge to ensure they can optimise their use of the tools and implement business best practice and methodologies.

With over 100 delegates from around the world attending, the event is also a good opportunity to network and share knowledge with fellow risk professionals.

» Complete programme schedule, more information on each presentation,
   and registration details




Palisade is proud to announce our first Health Risk Analysis Forum in San Diego on March 31st 2010




Why attend?

This one-day forum is a great way to find out how others in the Healthcare Industry are using our software, as well as to learn new approaches to the problems Healthcare professionals face every day. We will have six software training sessions, and six real-world case studies presented by industry experts covering risk and decision analysis from all angles specific to the Healthcare sector.

You will also see how new versions of @RISK, PrecisionTree, RISKOptimizer, TopRank, NeuralTools, StatTools, and other Palisade software tools work together to give you the most complete picture possible in your situation.

Who should attend?


Professionals in risk and financial analysis in: Care Equipment & Services, Pharmaceuticals, Biotechnology & Life Sciences, Hospital Care & Management, or related services

How much?


For a limited time, the cost of attending the Health Risk Analysis Forum has been discounted by $100.

$295 covers all sessions, continental breakfast, lunch and a cocktail networking reception. Attendees will also receive a welcome package that includes a 15% discount on their next software purchase.

Please contact Jameson Romeo-Hall at jromeo-hall@palisade.com if you are interested in attending.

Location
The Westin Gaslamp Quarter
910 Broadway Circle
San Diego, CA 92101
(619) 239-2200

Book your room at a discounted rate (subject to availability).



The Rise of the NOMFET

By now we've become accustomed to the marvels of neural network technology and, in fact, inured to the advances it brought in statistical analysis with its computational simulations of nerve cells.  Its many everyday applications--especially in online retailing--seem kind of ho-hum, and we'd be put out if for some reason they weren't in use. Wasn't it only four or five short years ago that neural nets themselves were big news?  
 
Last week there was more big news about neural networks: a French research team's announcement of an "organic" transistor that mimics a brain's synapse. Neural network computing is based on computational stand-ins for biological neurons, and linking these neurons with electronic synapses currently requires at least seven transistors.  One new "organic" transistor can take the place of those. 
 
The key here is nano. Tiny.  Tinier than tiny.  The new transistors are made of nanoparticles of gold and pentacene on a plastic substrate. The resulting connector is called a nanoparticle organic memory field-effect transistor: a NOMFET. 
 
Not only will the NOMFETs accelerate the performance of neural network circuits, but because the human brain uses on the order of 10,000 times as many synapses as neurons, the space-saving NOMFETs will help make possible a generation of computers inspired by the human brain.
 
The rise of the NOMFET may also make possible another kind of advance, one that I find a little scary to contemplate.  Because it's built on plastic, the NOMFET could potentially be used to link a computer with living tissue.  Get back, Frankenstein.

A Downturn for the Better

In keeping with a time-honored tradition for the turn of the year, I've been looking back over the year just past to do a little retrospective trend-spotting.  Here's one that took me by surprise: in spite of the downturn in the economy, there was also a downturn in online fraud. It's counterintuitive--historically, hard times are correlated with rising crime--but apparently true.
 
Late last year, DigitalTransactions, an online publication catering to businesses engaged in the "electronic exchange of value," reported that the results of a survey of principals in these businesses showed an overall decline in fraud of about 1 percent.
 
The survey, sponsored and carried out annually by a California risk management company, is the first in its eleven-year history to show a fraud rate this low.  In 2009 North American merchants were expected to lose (a mere) $3.3 billion, in contrast to their loss of $4.0 billion in 2008.  
 
What's behind this good-news downturn?  Probably not increased honesty.  There was no data on attempted fraud, and the assumption is that the increased use of automated fraud detection tools cut merchants' losses. The level of sophistication of these tools has ratcheted up to the point where neural network classification, risk analysis, and statistical analysis of correlated data can take place in real time during the processing of a transaction.  Furthermore, the combination of operational risk software with device identification of the purchaser's computer now makes it difficult for a single computer to mob an online merchant with multiple bogus orders.
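A much-simplified sketch of those two ingredients, with made-up orders and hand-picked weights standing in for a trained model: each transaction gets a risk score, and a device fingerprint counter throttles a machine that keeps submitting orders.

```python
import math
from collections import Counter

orders = [
    {"device": "dev-A", "amount": 40,  "mismatched_address": 0, "new_account": 0},
    {"device": "dev-B", "amount": 900, "mismatched_address": 1, "new_account": 1},
    {"device": "dev-B", "amount": 850, "mismatched_address": 1, "new_account": 1},
    {"device": "dev-B", "amount": 920, "mismatched_address": 1, "new_account": 1},
    {"device": "dev-C", "amount": 120, "mismatched_address": 0, "new_account": 1},
]

seen_per_device = Counter()

def risk_score(order):
    # Hand-picked weights for illustration; a real system would learn these.
    z = (-4.0 + 0.003 * order["amount"]
         + 2.0 * order["mismatched_address"] + 1.0 * order["new_account"])
    return 1 / (1 + math.exp(-z))

for order in orders:
    seen_per_device[order["device"]] += 1
    score = risk_score(order)
    too_many_from_device = seen_per_device[order["device"]] > 2
    decision = "decline" if score > 0.5 or too_many_from_device else "accept"
    print(f"{order['device']}: score {score:.2f}, "
          f"orders from device {seen_per_device[order['device']]} -> {decision}")
```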

So the good news is not about improvements in human nature.  It's about improving the defenses of this booming sector of the economy.  

Digital Eyes on Alien Life

University of Chicago geoscientist Patrick McGuire has big plans for Mars.  Previously he worked on an imager for a Mars orbiter that could identify different types of soil and rock by detecting infrared and other wavelengths, and now he is drawing on that experience to develop a space suit with digital "eyes" and a neural network that rides on the hips of the spacesuit and can sort out living biological material from other matter.
 
The digital eyes will detect and plot colors, and the neural net, which is known as a Hopfield neural network, will compare these color patterns to a database of information previously gathered from that area of the planet in order to make an animal-vegetable-mineral determination.
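The Hopfield network is a classic, well-documented model, so a bare-bones version is easy to sketch; the "color patterns" below are invented +/-1 vectors rather than anything from the Mars imager. Patterns are stored in a weight matrix, and a noisy observation is allowed to settle back toward the closest stored pattern.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 64                                              # length of each pattern

# Three reference "color patterns", coded as +/-1 vectors.
stored = np.where(rng.random((3, n)) < 0.5, -1.0, 1.0)

W = sum(np.outer(p, p) for p in stored) / n         # Hebbian weight matrix
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Let a (possibly noisy) pattern settle toward a stored one."""
    state = state.copy()
    for _ in range(steps):                          # synchronous updates
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# Corrupt stored pattern 1 by flipping 10 of its 64 entries, then recall it.
noisy = stored[1].copy()
flip = rng.choice(n, size=10, replace=False)
noisy[flip] *= -1

recovered = recall(noisy)
matches = [int(np.sum(recovered == p)) for p in stored]
print("agreement with each stored pattern:", matches)   # pattern 1 should score highest
```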
 
This complex AI system has already been tested at the Mars Desert Research Station in Utah, and McGuire and his colleagues were satisfied that the Hopfield algorithm could learn colors from just a few images and could recognize units that had been observed earlier.
 
McGuire's concept is that a human wearing this neural network could simply walk around the red planet and record every nearby object, rapidly gathering information.  

Obviously, such a clothing item awaits a manned Mars mission.  But in the meantime, why not have the next Rover suit up?  

The Cat is Out of the Bag

At November's supercomputing conference in Portland, Oregon, IBM announced that its researchers working with a team from Stanford University had succeeded in developing an accurate simulation of human brain function. The simulation will be capable of emulating sensation, perception, action, interaction and cognition.
 
This algorithm simulating a living neural network, called BlueMatter (spelled as one word, like everything else in computerese these days), is an important milestone in IBM's mission to build a cognitive computing chip, because it begins to advance large-scale simulation of a cortical neural network and it synthesizes neurological data.  BlueMatter is built with Blue Gene (two words for this pun in the singular) architecture, which, in combination with specialized MRI images, allowed the team to create a wiring diagram of the human brain.  This map of the brain is, according to IBM's press release, "crucial to untangling its vast communication network and understanding how it represents and processes information."

To be more accurate, what BlueMatter has thus far demonstrated is the potential to achieve neural network technology that operates on the scale of complexity of the human brain.  The algorithm's current simulation approximates the cortical system of a cat.  Hence, the title of the paper announcing IBM's accomplishments: "The Cat Is Out of the Bag."  Even so, this is an operations research accomplishment that dwarfs such mundane analytical tasks as option valuation, value-at-risk, or reserve estimation.
 
One of the goals of the company's cognitive computing program is to create a chip that operates with the energy efficiency of the human brain (20 watts).  But in order to emulate the brain activity of a cat, the research team had to bring out one of the largest supercomputers in the world, the IBM Dawn Blue Gene/P, which comprises about 150 thousand processors and contains 144 terabytes of main memory.
 
This cat came out of a pretty big bag.  

New Approaches to Risk & Decision Analysis at the 2010 Conference in London



Following on from the resounding success of the last Palisade Risk Conference in London, which attracted over 110 attendees from industry and academia, the 2010 Palisade Risk Conference will be taking place on April 14th-15th. The location for this event will again be the Institute of Directors on Pall Mall, London, and already there are a number of exciting presentations confirmed from the likes of Unilever, PricewaterhouseCoopers and Halcrow.

The 2010 Palisade Risk Conference will be a two-day forum which will cover a wide variety of innovative approaches to risk and decision analysis. Featuring real-world case studies from industry experts, best practices in risk and decision analysis, risk analysis software training, and sneak previews of new software in the pipeline, the event is also an excellent opportunity to network with other professionals and find out how they’re using Palisade risk analysis solutions to make better decisions.

Call for Papers

If you have an unusual or interesting application of Palisade software which you would like to present, please send a short abstract to cferri@palisade.com. The closing date for abstracts to be submitted is Friday, 11th December, 2009.