I understood the use of neural network software to counter nonlinear events like market turbulence, and I understood the continual classification and reclassification. But I was intrigued that nowhere in the article was there a mention of risk, risk analysis, or even risk assessment. Maybe it was there all the time, incorporated in the proprietary software, and maybe it just wasn't mentioned. Certainly the asset managers who developed the program were aware they were at risk--they were chewing their nails as their fund slid down right beside all the other funds that were dropping in value. But assessing risk doesn't seem to have been a factor in the firm's new defense against mayhem in the markets.
Risk analysis and decision-making tools are relevant to most organisations in most industries around the world. This is demonstrated by the speaker line-up at this year's European User Conference, an event at which we believe it is important to bring together customers from a wide range of market sectors.
We are holding 'New Approaches to Risk and Decision Analysis' at the Institute of Directors in central London on 14th and 15th April 2010. As in previous years, the programme aims to provide everyone attending with practical advice to enhance the decision-making capabilities of their organisation. Customer presentations, which offer insight into a wide variety of business applications of risk and decision analysis, include:
- CapGemini: Faldo's folly or Monty's Carlo – The Ryder Cup and Monte Carlo simulation
- DTU Transport: New approaches to transport project assessment; reference scenario forecasting and quantitative risk analysis
- Georg-August University Research: Benefits from weather derivatives in agriculture: a portfolio optimisation using RISKOptimizer
- Graz University of Technology: Calculation of construction costs for building projects – application of the Monte Carlo method
- Halcrow: Risk-based water distribution rehabilitation planning – impact modelling and estimation
- PricewaterhouseCoopers: PricewaterhouseCoopers and Palisade: an overview
- Noven: Use of Monte Carlo simulations for risk management in pharmaceuticals
- SLR Consulting: Risk sharing in waste management projects - @RISK and sensitivity analysis
- Statoil: Put more science into cost risk analysis
- Unilever: Succeeding in DecisionTools Suite 5 rollout – Unilever's story
With over 100 delegates attending, the event is also a good opportunity to network and share knowledge with risk professionals from around the world.
» Complete programme schedule, more information on each presentation, and registration details
This one-day forum is a great way to find out how others in the Healthcare Industry are using our software, as well as to learn new approaches to the problems Healthcare professionals face every day. We will have six software training sessions, and six real-world case studies presented by industry experts covering risk and decision analysis from all angles specific to the Healthcare sector.
You will also see how new versions of @RISK, PrecisionTree, RISKOptimizer, TopRank, NeuralTools, StatTools, and other Palisade software tools work together to give you the most complete picture possible in your situation.
Who should attend?
Professionals in risk and financial analysis in: Care Equipment & Services, Pharmaceuticals, Biotechnology & Life Sciences, Hospital Care & Management, or related services
For a limited time, the cost of attending the Health Risk Analysis Forum has been discounted by $100.
$295 covers all sessions, continental breakfast, lunch and a cocktail networking reception. Attendees will also receive a welcome package that includes a 15% discount on their next software purchase.
Please contact Jameson Romeo-Hall at firstname.lastname@example.org if you are interested in attending.
The Westin Gaslamp Quarter
910 Broadway Circle
San Diego, CA 92101
Book your room at a discounted rate (subject to availability).
So the good news is not about improvements in human nature. It's about improving the defenses of this booming sector of the economy.
Obviously, such a clothing item awaits a manned Mars mission. But in the meantime, why not have the next Rover suit up?
To be more accurate, what BlueMatter has thus far demonstrated is the potential to achieve neural network technology that operates on the scale of complexity of the human brain. The algorithm's current simulation approximates the cortical system of a cat. Hence, the title of the paper announcing IBM's accomplishments: "The Cat Is Out of the Bag." Even so, this is an operations research accomplishment that dwarfs such mundane analytical tasks as option valuation, value-at-risk, or reserve estimation.
Following on from the resounding success of the last Palisade Risk Conference in London, which attracted over 110 attendees from industry and academia, the 2010 Palisade Risk Conference will take place on April 14th-15th. The location for this event will again be the Institute of Directors on Pall Mall, London, and a number of exciting presentations are already confirmed from the likes of Unilever, PricewaterhouseCoopers and Halcrow.
The 2010 Palisade Risk Conference will be a two-day forum which will cover a wide variety of innovative approaches to risk and decision analysis. Featuring real-world case studies from industry experts, best practices in risk and decision analysis, risk analysis software training, and sneak previews of new software in the pipeline, the event is also an excellent opportunity to network with other professionals and find out how they’re using Palisade risk analysis solutions to make better decisions.
Call for Papers
If you have an unusual or interesting application of Palisade software which you would like to present, please send a short abstract to email@example.com. The closing date for abstracts to be submitted is Friday, 11th December, 2009.
Take this convenient and inexpensive opportunity to learn from Palisade’s trainers and software developers. Learn how to use the elements of the new DecisionTools Suite 5.5 as a comprehensive risk analysis, optimization, and statistical analysis toolkit. See how each of the products in the Suite — @RISK, RISKOptimizer, Evolver, PrecisionTree, TopRank, StatTools, and NeuralTools — can be used to solve practical, real-world problems.
The conference also features case studies demonstrating how to use @RISK and DecisionTools Suite, from risk management experts in the fields of finance, healthcare and pharmaceuticals, energy, oil and gas, DFSS and Six Sigma, project management, operations management, manufacturing, and more.
See the full schedule for the Conference here.
Next Week: October 21-22 in NYC
Building on the success of last year’s record-breaking event, the conference will offer a wide range of software training, model building, and real-world case study sessions. Last year, the event drew over 150 practitioners and decision-makers from a broad spectrum of industries. The @RISK and DecisionTools software tracks were more popular than ever. This year, we’re expanding software training with sessions that let you walk through examples and try the tools directly. This will enable you to take some new tips back to the office. Please join us in October for a great opportunity to learn and connect with colleagues.
About fifty years ago, logician and child psychologist Jean Piaget designed what has become a classic experiment to test the memory and learning of babies. It is a game of hiding and finding, and through it we have discovered that when infants up to 10 months old are repeatedly shown a toy being hidden in a certain place, they continue to look for the toy there, even when they have also seen it hidden in some other place. By the age of one year, however, they get it. They figure out they can look in more than one place.
This experiment has been used for decades by scientists interested in human development, and most recently it has led to a finding by a Hungarian team that the ability of young infants to read social cues actually misleads them and causes them to perform worse in the hide-and-seek game. Older babies, however, were still able to see through the deception.
Now scientists at the University of Iowa have used a neural network model to prove that the problem is not the infants' mistaken reading of a social cue but simply distraction claiming the attention of the young infants and thereby disrupting their memory of the actual hiding of the toy.
How, you are wondering, did the UI team verify an internal cognitive process with a neural network? Their neural net trained on the responses of the infants in many different versions of the hiding game, and the team then programmed an interruption in the flow of computation so that the computer stopped "paying attention" to the hiding event. Then it too flunked the memory test.
What is the take-home message from this? That you can fool all of the babies only part of their lives? Or that a neural network is never too old to be immature?
Dr. Strauss will present a case study at the 2009 Palisade Conference: Risk Analysis, Applications, & Training. The conference is set to take place on 21 - 22 October at the Hyatt Regency in Jersey City, 10 minutes by PATH from Manhattan's Financial District.
See the abstract for his case study below, and see the full schedule for the Conference here.
Simulating the U.S. Economy:
Where will we be in 100 years?
There is an assumption that drives all of our expectations for how our economy will be in the future. That assumption is one of endless economic growth. Clearly endless exponential growth is impossible. Yet that is what we base all of our expectations upon. We all agree that zero or negative economic growth is bad (just look around now at the effects of the Great Recession). But we also know logically that 2% or 4% annual growth every year leads to an exponential growth outcome that is unsustainable.
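The unsustainability of compounding is easy to check with simple arithmetic (the figures below are an illustration, not from the case study): even modest annual rates, sustained for a century, multiply the economy many times over.

```python
# Compound growth over a century: (1 + rate) ** years.
# 2% a year roughly septuples the economy in 100 years;
# 4% a year multiplies it about fifty-fold.
for rate in (0.02, 0.03, 0.04):
    multiple = (1 + rate) ** 100
    print(f"{rate:.0%} annual growth for 100 years -> {multiple:.1f}x")
```

Extend the horizon to a second century and the multiples square, which is the sense in which "endless exponential growth is impossible."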
To see where this growth imperative will take us, we first have to see how we got to where we are today. This work first models the 20th century. The model is both complex and simple. The basic schematic of the model’s relationships is easy to understand. Furthermore, the core of the model is a simple production function that combines capital, labor, and the useful work derived from energy to generate the output of the economy. Complexity is contained in the solutions to the internal workings of the model. What is unique is that there are no exogenous economic variables. Once the equations’ parameters are calibrated, setting the key outputs to "one" in 1900 results in their time paths very closely predicting the U.S. GDP and its key components from 1900 to 2006.
The experiment in this work is about the future. If the model can very closely replicate the last 100 years, what does it have to say about the next 100 years? From 1900 to 2006 there are periods in which there was parameter switching. (The optimal parameters and the years for the switching were found using a constrained optimization technique.) That suggests that in the future there will also be changes. The experiment uses @RISK’s features to generate new combinations of parameters for each of tens of thousands of runs of the simulation. Changes in the parameters represent potential exogenous policy choices.
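The abstract does not give the model's equations, and the real experiment uses @RISK inside Excel. Still, the shape of the exercise — draw a fresh combination of parameters, run the model forward, record the outcome, repeat tens of thousands of times — can be sketched in plain Python. Everything below (the Cobb-Douglas-style production function, the parameter ranges, the growth rates) is an assumed stand-in for illustration only.

```python
import random

def production(k, l, u, alpha, beta):
    """Assumed Cobb-Douglas-style production function combining
    capital k, labor l, and useful work from energy u."""
    return (k ** alpha) * (l ** beta) * (u ** (1 - alpha - beta))

def simulate_century(alpha, beta, g_k, g_l, g_u, years=100):
    """Start all inputs at 1.0 (the 1900-style normalization),
    grow each at a fixed annual rate, and return the output path."""
    k = l = u = 1.0
    path = []
    for _ in range(years):
        path.append(production(k, l, u, alpha, beta))
        k *= 1 + g_k
        l *= 1 + g_l
        u *= 1 + g_u
    return path

random.seed(42)
# The real experiment runs tens of thousands of @RISK iterations;
# 10,000 plain-Python runs stand in for them here.
outcomes = []
for _ in range(10_000):
    alpha = random.uniform(0.2, 0.4)   # capital share (assumed range)
    beta = random.uniform(0.4, 0.6)    # labor share (assumed range)
    rates = [random.uniform(0.0, 0.04) for _ in range(3)]
    outcomes.append(simulate_century(alpha, beta, *rates)[-1])

print(min(outcomes), max(outcomes))
```

Each random draw plays the role of one "exogenous policy choice"; scanning the distribution of century-end outcomes is what lets the experiment hunt for parameter combinations that produce a stable path.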
The "doing what you did gets you what you got" scenario leads to a surprising and unsettling outcome. The experiments using @RISK do, however, find a path that works. And if it is not "business-as-usual" that leads to a stable outcome, it must be some other way. The policy choices that lead to a stable outcome suggest that the future of capitalism is not going to be what we expect it to be.
Please join us in October in New York for software training in best practices in quantitative risk analysis and decision making under uncertainty, real-world case studies from risk services consultants and experts, and networking with practitioners from many different fields, including oil and gas, pharmaceuticals, academia, finance, Six Sigma, and more.
We offer different options that include Excel add-ins, Windows, and Web-based applications. Our consultants can help you design, program, and deploy these applications. A typical application might connect an Excel spreadsheet to your company’s database, extract data, then fit probability distributions to it so they can be used in dynamic risk or optimization models. The structure of reports can also be customized and published as PDFs, or to the Web.
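To make that workflow concrete, here is a minimal sketch in plain Python of the middle steps — pull records from a database, fit a distribution to them, and draw simulated scenarios. The table, column, and figures are hypothetical, and the standard library stands in for @RISK's distribution-fitting features.

```python
import random
import sqlite3
import statistics

# Hypothetical company database: an in-memory table of cost records
# (names and values are illustrative, not from the article).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE costs (amount REAL)")
conn.executemany("INSERT INTO costs VALUES (?)",
                 [(98.0,), (103.5,), (99.2,), (101.8,), (100.4,)])
amounts = [row[0] for row in conn.execute("SELECT amount FROM costs")]

# Fit a normal distribution by maximum likelihood: for a normal,
# the MLE is just the sample mean and population std deviation.
mu = statistics.fmean(amounts)
sigma = statistics.pstdev(amounts)

# The fitted parameters can then drive a dynamic risk model,
# e.g. by drawing simulated cost scenarios:
random.seed(0)
scenarios = [random.gauss(mu, sigma) for _ in range(1000)]
print(round(mu, 2), round(sigma, 2), len(scenarios))
```

In a production application the fitted parameters would feed @RISK distributions rather than `random.gauss`, and the results would flow into the customized reports described above.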
Palisade Custom Development can incorporate Monte Carlo simulation, probability distributions, distribution fitting, graphs, reports, and many other features of @RISK into any Windows-based application. In addition, we can integrate genetic algorithm optimization from RISKOptimizer or Evolver. This allows you to apply powerful, proven analytics to applications outside Excel. Applications can be run in a desktop, network, or Web environment.
You may wish to customize your @RISK or DecisionTools Suite spreadsheet models, restricting access to model components for some users or automating reports and other aspects of your analysis. Using the built-in DecisionTools Excel Developer Kit (XDK) and custom Excel VBA programming, Palisade can help you build powerful, easy-to-use risk models for one user or for an entire work group.
We are currently working on a new website where you will find more information and project samples. Upcoming posts will discuss examples of custom Excel VBA programming.
» More about Palisade Custom Development
Dr. Javier Ordóñez
Director of Custom Development
Sure is, and a word to the wise: Protect your identity. Hunt and peck no more.