Last month I noticed a slight uptick in press coverage of computer-vs.-brain stories and thought I had neatly disposed of the topic with my blog post on wetware vs. software. Not so.
The uptick has become an upswing, though not yet quite an obsession. On one side of campus, biologists are patiently explaining to lay readers why the AI version of a neural network can never approach the complexity and capacity of its biological model, while several buildings away computer scientists are optimistically predicting a "brain on a chip."
For the moment, the brain-on-a-chip crowd, exemplified by the consortium of European scientists attempting to map and then simulate the workings of biological neurons and synapses, has practical goals in mind. Smaller, higher-capacity formats for computation will add speed and layers of complexity to now-commonplace analytical processes in business and industry, such as statistical analysis, genetic-algorithm optimization, and simulations for risk assessment. In the long view, though, they are tantalized by the prospect that computers could take over the job of thinking from humans.
For the beauty-of-the-biological-brain crowd, represented this week by neuroscientists Sam Wang and Sandra Aamodt in the New York Times, the best use of computer science is to give neuroscience a scale on which to measure the information-processing capacity of the human brain. They tell us, for instance, that the human brain is so compact it can store about a third of all the archived information on the entire Internet. By that measure, it would take a whopping amount of parallel processing for a computer to begin to stand in for even a single synapse.
I’m not sure why the idea of the brain as a computer, or the computer as a brain, is so compelling, but it is clear that it will be a long time (think geological time) before there is a meeting of the minds.