Clark Goble, at his Mormon Metaphysics blog (which I highly recommend), recently posted on the question, "Brains like Computers?" In his post, he references an article written some time ago by Chris Chatham, at Developing Intelligence, on "10 Important Differences Between Brains and Computers". In this post, I'll share my thoughts in response to the interesting differences pointed out by Chris.
Difference #1: Brains are analogue; computers are digital
Brains are analog to the extent of our ability to observe them, but they (along with all things) may exist in a universally digital substrate. Such speculation aside, even if brains are absolutely analog, digital systems can simulate analog systems. It may prove possible for digital simulations of the brain, as they become more complex, to simulate well beyond whatever degree of minute detail is pertinent to the proper functioning of an analog brain. Even if an infinite regress of analog detail is pertinent to proper functioning, a brain simulator itself would have to be built within the context of such a regress, would therefore share analog properties with the analog brain, and would need only simulate details above their shared substrate.
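The idea that digital systems can approximate analog ones to whatever precision is pertinent can be illustrated with a minimal sketch in Python (my own illustrative example, not from Chris's article): a digital, fixed-step simulation of a continuous process gets arbitrarily close to the exact analog value as resolution increases.

```python
import math

def simulate_decay(rate, duration, steps):
    """Digitally simulate the analog process dx/dt = -rate * x
    with fixed-step Euler integration, starting from x = 1."""
    x, dt = 1.0, duration / steps
    for _ in range(steps):
        x += -rate * x * dt
    return x

exact = math.exp(-1.0)  # the "analog" (closed-form) value at t = 1
for steps in (10, 100, 10_000):
    approx = simulate_decay(1.0, 1.0, steps)
    print(steps, abs(approx - exact))  # error shrinks as resolution grows
```

The finer the digital step, the smaller the gap between simulation and the analog ideal; the question is only how much detail is pertinent, not whether digital can reach it.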
Difference #2: The brain uses content-addressable memory
The brain uses content-addressable memory when we observe it systematically at a particular magnitude. Likewise, as the author points out, computers use content-addressable memory when we observe them systematically at the magnitude of the Internet. We've begun to see advances toward a semantic web, in which content is labeled or otherwise identified in ways that increasingly enable computers to recall data like our brains do.
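To make the contrast concrete, here is a minimal sketch (my own, hypothetical example) of content-addressable retrieval: items are recalled by what they contain, not by where they are stored.

```python
def recall(memories, cue):
    """Content-addressable lookup: return stored items whose content
    contains the cue, rather than fetching by numeric address."""
    return [m for m in memories if cue.lower() in m.lower()]

memories = [
    "Paris is the capital of France",
    "The mitochondria is the powerhouse of the cell",
    "France won the 1998 World Cup",
]
print(recall(memories, "france"))  # both France-related memories surface
```

A partial cue pulls up every related memory, which is much closer to how recollection feels than looking up byte offset 0x7F3A.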
Difference #3: The brain is a massively parallel machine; computers are modular and serial
Parallel computing is advancing exponentially. This trend has long manifested itself in the growth of networks, and is now manifesting itself in processor architecture. In 2006, Intel had dual-core processors on the consumer market. More recently, Intel put quad-core and dual quad-core processors on the consumer market. They've also announced plans for 80-core processors by 2011, as would be predicted by an exponential trend. Assuming this trend continues, it will not be long (only decades) before computers are far more massively parallel, both in networking and processor architecture, than human brains.
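Parallelism is already routine at the software level too. A minimal Python sketch (using the standard library's `concurrent.futures`; the task here is a trivial placeholder) shows independent work fanned out across a pool of workers:

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    return n * n

# Fan independent tasks out across a pool of workers, then gather results.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(8)))
print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The same pattern scales from a handful of threads to thousands of networked machines; what changes is the number of workers, not the shape of the program.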
Difference #4: Processing speed is not fixed in the brain; there is no system clock
Different components of a computer have different clock speeds, and so, in the aggregate, a computer does not have a fixed processing speed. In recent years, we've begun adding more special-purpose processors to the composition of computers, which makes overall processing speed increasingly complex. Moreover, at a higher magnitude of computing complexity, such as a network or the Internet, it may prove more useful to model processing speed in analog terms than in digital terms.
Difference #5: Short-term memory is not like RAM
If beneficial, the architecture of RAM could be modified to reflect that which permits short-term memory in the human brain. However, RAM architecture could prove superior in efficiency while yet enabling all the functions associated with short-term memory in the human brain. Decreasing costs and advances in computer memory may, at least to the extent desired, increasingly dissolve distinctions between short- and long-term memory.
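One way a RAM-like store could be made to behave more like short-term memory is by limiting capacity and evicting the least recently used item. The sketch below is my own loose analogy, not a claim about how the brain actually works; the "seven, plus or minus two" capacity is the classic rough figure for human short-term span.

```python
from collections import OrderedDict

class ShortTermStore:
    """A capacity-limited store: adding beyond capacity evicts the
    least recently used item, loosely analogous to short-term memory."""
    def __init__(self, capacity=7):  # roughly "seven, plus or minus two"
        self.capacity = capacity
        self.items = OrderedDict()

    def remember(self, key, value):
        self.items[key] = value
        self.items.move_to_end(key)  # most recent goes to the back
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # forget the oldest

    def recall(self, key):
        return self.items.get(key)

store = ShortTermStore(capacity=3)
for i in range(5):
    store.remember(f"item{i}", i)
print(list(store.items))  # only the 3 most recent items survive
```

Unlike ordinary RAM, nothing here is addressed or retained unconditionally; retention depends on recency, which is part of what the analogy to short-term memory requires.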
Difference #6: No hardware/software distinction can be made with respect to the brain or mind
Software does not exist independent of hardware. Software is a pattern across a hardware substrate. The software pattern is material and observable, of course, which allows the hardware to interact with it (or, in other words, with itself). Likewise, the brain maintains material and observable patterns, which do not exist independent of the brain. A difference, at least for now, is that we cannot transfer with high fidelity the patterns in one brain to another (presumably, education does this with low fidelity), or to a non-biological equivalent. That may change in the future, as we improve our ability to scan and simulate brains at increasing magnitudes of detail.
Difference #7: Synapses are far more complex than electrical logic gates
Computers can use many electrical logic gates to emulate the function of a single synapse. As suggested above, it may prove possible for digital simulations of a synapse, as they become more complex, to simulate well beyond whatever degree of minute detail is pertinent to proper functioning of an analog synapse.
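At the crudest level, the function a synapse contributes to (weighting an input into a neuron's decision to fire) can be emulated with a handful of arithmetic operations, each of which is itself built from many logic gates. A minimal, hypothetical sketch:

```python
def neuron(inputs, weights, threshold):
    """A crude neuron model: each (input, weight) pair stands in for a
    synapse; the unit fires (1) when the weighted sum of its inputs
    reaches a threshold, and stays silent (0) otherwise."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# An AND-like unit: fires only when both inputs are active.
print(neuron([1, 1], [0.6, 0.6], 1.0))  # 1
print(neuron([1, 0], [0.6, 0.6], 1.0))  # 0
```

Real synapses are far richer than a single weight, of course; the point is that the digital side can keep adding detail (dynamics, chemistry, timing) until the simulation captures whatever level of detail proves pertinent.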
Difference #8: Unlike computers, processing and memory are performed by the same components in the brain
Computers can simulate computer processors in memory. They can, likewise, enable the simulation to modify itself. Indeed, some computer components have already been designed in this manner, using genetic algorithms to modify themselves toward better congruence with environmental factors. Such flexibility, when coupled with the processing speed of computers, can enable rapid technological evolution.
Difference #9: The brain is a self-organizing system
Computers are becoming increasingly self-organizing systems. Genetic algorithms combined with environmental inputs result in complex behavior that, in many areas, is quickly approaching levels commonly associated with intelligence.
Difference #10: Brains have bodies
Computers also have bodies, and they always have, since the time of their mechanical ancestors. Commonly today, their observations arrive through their keyboard and mouse body parts, and their actions are expressed through their monitor body parts. Yet they are becoming increasingly complex. They have eyes in the form of cameras and ears in the form of microphones. They've been connected to wheels and other forms of locomotion, to the point even of proving capable of walking up and down stairs on two legs. Perhaps most promising at this time, they've been given virtual bodies in virtual worlds, where they can interact with diverse inputs, including human avatars, and modify their behavior accordingly.
Bonus Difference: The brain is much, much bigger than any [current] computer
The "[current]" should be emphasized, as should the fact that we've observed exponential advance in computing power for a long time. Assuming this trend continues, and there are good reasons to suppose that it may even accelerate, a single $1000 computer should have the computing capacity of a human brain around the year 2033, if not sooner. Around 2050, following the trend further, a single $1000 computer should have the computing capacity of all human brains combined. You can see, from this, that even if neurologists' liberal estimates of the computing power of the human brain are short of reality by an order or two of magnitude, a continuing exponential advance of computing power would attain that level within a short period of time.
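The closing claim, that even a one-or-two-order-of-magnitude underestimate is quickly absorbed, is simple arithmetic. A small sketch (assuming, for illustration, the commonly cited 18-month doubling period; the exact figure is debated):

```python
import math

def years_to_multiply(factor, doubling_years=1.5):
    """Years for capacity to grow by `factor`, given a fixed doubling
    period (1.5 years is the commonly cited Moore's-law figure)."""
    return math.log2(factor) * doubling_years

# Even if brain-capacity estimates are two orders of magnitude too low,
# exponential growth closes a 100x gap in about a decade:
print(round(years_to_multiply(100), 1))  # ~10 years at 18-month doublings
```

So an estimate that is off by 100x shifts the projected dates by roughly ten years, not by a different era, which is why the error bars matter less than the exponential trend itself.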