QR6.3.11 The Silicon Chip Speculation

The classic brain information theory argument is the silicon chip thought experiment, which claims that replacing every nerve in the brain with an equivalent silicon chip wouldn't alter consciousness:

… imagine that one of your neurons is replaced by a silicon chip prosthesis that has the exact same input/output profile as the neuron it replaces. At the core of this thought experiment is the presumption that such a replacement would be unnoticeable to you or to anyone observing your behavior. Presumably, you would continue to experience pain even though the physical realization of those mental events includes a silicon chip where an organic neuron used to be. Now imagine that, one by one, the rest of your neurons are swapped for silicon prostheses. Presumably there would be no change in your mental life even though your brain, which was once made of lipid and protein neurons, is now entirely composed of silicon neuronoids. (Mandik, 2004).

No evidence supports this speculation beyond the belief that brains are biological computers. Neuroscience doesn't support the idea that the brain equates to a set of wired chips: transistors are insulated from electromagnetic fields, while nerves broadcast them, as brainwaves show.

Even if a chip could replace a nerve and its synapses, it can't transmit as nerves do. Replacing cellphone transmitters with computers that can't transmit would diminish the network, so by the same logic, replacing every brain nerve with a chip would reduce consciousness. The silicon chip experiment would give a supercomputer with no more consciousness than a mass of metal. The silicon chip speculation, like the singularity prediction (Kurzweil, 1999), is science fiction posing as science fact.

A similar thought experiment asks whether, if one day we could copy physical things atom-for-atom, replicating the brain would copy consciousness. Physical realism says it would, as the result is physically identical to the original and the physical world is all there is. But if one “me” tends the garden while another cooks the dinner, how can “I” experience both events? If a copy of me went to work while “I” lay in the sun, I would experience the sun, not a day at work, so the physical copy didn't replace me at all. For Chalmers, the original is conscious and the copy is a zombie, while for Dennett, both are zombies imagining they are conscious. In quantum realism, physically identical brains would generate two conscious beings that, despite their similarity, choose independently, just as identical twins raised the same way make their own choices and live different lives.

Cutting the nerves joining the hemispheres to “split” the brain doesn't stop data exchange; it stops them forming a single consciousness. When the hemispheres dissociate, each with its own consciousness, they can come into conflict, and having two “I”s isn't beneficial. If in the future I made a perfect copy of myself that was also conscious, who is to say it wouldn't decide to kill me? The brain evolved to unify consciousness, not to divide it, for a reason.
