Biomedical Computation Review, published by Simbios (an NIH-funded center), carried a cover story by Roberta Friedman on Reverse Engineering the Brain. You can see it here. It is thoroughly researched, and covers the work of Gerald Edelman, Kwabena Boahen, Tomaso Poggio, Thomas Serre, Eric Knudsen, and myself, amongst others.
Archives for April 2009
"Well, you see, Norm, it’s like this. A herd of buffalo can only move as fast as the slowest buffalo. And when the herd is hunted, it’s the slowest and weakest ones at the back that are killed first. This natural selection is good for the herd as a whole, because the general speed and health of the whole group keeps improving by the regular killing of the weakest members.
In much the same way, the human brain can only operate as fast as the slowest brain cells. Now, as we know, excessive intake of alcohol kills brain cells. But naturally, it attacks the slowest and weakest brain cells first. In this way, regular consumption of beer eliminates the weaker brain cells, making the brain a faster and more efficient machine.
And that, Norm, is why you always feel smarter after a few beers."
Vitaly Feldman is a theoretical computer scientist at IBM Almaden. Today we had the great intellectual pleasure of listening to a wonderful and stimulating white-board talk from Vitaly on his upcoming paper in Neural Computation.
Over a lifetime cortex performs a vast number of different cognitive actions, mostly dependent on past experience. Previously it has not been known how such capabilities can be reconciled, even in principle, with the known resource constraints on cortex, such as low connectivity and low average synaptic strength. Here we describe neural circuits and associated algorithms that respect the brain’s most basic resource constraints and support the execution of high numbers of cognitive actions when presented with natural inputs. Our circuits simultaneously support a suite of four basic kinds of task that each require some circuit modification: hierarchical memory formation, pairwise association, supervised memorization, and inductive learning of threshold functions. The capacity of our circuits is established via experiments in which sequences of several thousands of such actions are simulated by computer and the circuits so created are tested for subsequent efficacy. Our underlying theory is apparently the only biologically plausible systems-level theory of learning and memory in cortex for which such a demonstration has been performed, and we argue that no general theory of information processing in the brain can be considered viable without such a demonstration.
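One of the four tasks the abstract mentions, inductive learning of threshold functions, can be illustrated with a classic textbook sketch. The code below is not the paper's algorithm or circuit model; it is a minimal, hypothetical demonstration using the standard perceptron update rule to learn a threshold function ("fire when at least 2 of 3 inputs are active"), with all names and parameters chosen for illustration only.

```python
# Illustrative sketch only: perceptron-style learning of a threshold
# function. This is NOT the circuit-level algorithm from the paper.
import random

def learn_threshold(examples, n_features, epochs=100):
    """Learn weights w so that sign(w . x) matches each example's label."""
    w = [0.0] * n_features
    for _ in range(epochs):
        for x, label in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
            if pred != label:
                # Perceptron update: nudge w toward the misclassified example
                w = [wi + label * xi for wi, xi in zip(w, x)]
    return w

# Target concept: output +1 when at least 2 of 3 binary inputs are active.
random.seed(0)
data = []
for _ in range(50):
    x = [random.choice([0, 1]) for _ in range(3)]
    data.append((x + [1], 1 if sum(x) >= 2 else -1))  # last input is a bias

w = learn_threshold(data, 4)
correct = sum(
    (1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1) == y
    for x, y in data
)
print(correct, len(data))
```

Because "at least 2 of 3" is linearly separable, the perceptron convergence theorem guarantees the sketch classifies all training examples correctly after enough epochs.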
This week’s Economist carried a story on the wiring diagram of the brain. It is worth reading.
"The result of all this effort, it is hoped, will be precise circuit-diagrams of brains. The first brains to be mapped will probably have belonged to mice. Besides being cheap and disposable, a mouse brain weighs half a gram and packs a mere 16m neurons. Human brains (1.4kg and 100 billion neurons) will come later, when all the wrinkles have been ironed out in rodents, and proper methods devised to analyse the results. But come they will. And when they do, the most complicated object in the known universe will begin to give up the secrets of how it really works."