Reality Checking AI (1/5) – Brains and Computers

Before I start, I should point out that I am an expert in neither AI nor computer engineering. My thoughts and opinions are based on my limited understanding as a (somewhat) informed layperson.

 

The Brain / Computer Analogy


Is the brain a computer? Well, at a crude level brains can be thought of as information processing systems; that is to say, systems which accept inputs, perform some kind of operation on them, and then produce outputs. Since computers can also be described as information processors, in some sense the brain is a computer. However, relying too heavily on this analogy conceals at least as much as it reveals, because a brain is simply not like any computer we know of, nor is it even like any futuristic variant that anyone has any realistic idea of how to build.

Despite having this broad definition in common, a human brain is so much more than a computer in all the ways that matter. Yes, it is an information processing system, but to leave it at that almost laughably simplistic definition is to deliberately obscure what a brain actually does. If the brain is an information processing system, it is one that produces outputs no physical information processing system can, or (according to materialistic naturalism) should be able to, produce: things like thoughts, emotions, qualia, consciousness, and even just common-sense understanding or know-how. This doesn't mean there must be a 'ghost' in the machine; all it means is that no computer on earth can do what a human brain does, and no one has the faintest idea what it would take to get us from the former to the latter.

 

But wouldn’t you say a calculator is a computer even though no calculator can do what a computer can? Yes, I would.

So, why can't I claim that a computer is a brain even though no computer can do what a brain does (yet)? The problem here is that there is still a vast difference between brains and computers, and it isn't clear that this difference is one of degree, as if computers were on the same ladder as brains, just a few rungs lower. A calculator is clearly recognisable as a computing machine. It accepts user input, performs some operation on it, and produces an output. This is exactly what the fastest supercomputer does; the only difference is one of complexity.

On the other hand, it just isn't obvious that a brain does exactly what a computer does, just with more complex input/processing/output. In fact, if we are honest and look at the evidence (as opposed to listening to the hype), no computer on the planet is even remotely capable of doing what a human brain does and, I'll say it again, no one has the faintest idea what it would take to get us from the former to the latter. The difference between brains and computers is one of kind, not degree.

 

So, brains aren’t really computers, but there is another reason I am opposed to people throwing around this analogy. In making the crude comparison between brains and computers without keeping in mind all the ways brains are so different, it becomes easier and easier to slide into a simplistic and ultimately erroneous conception of brains (and therefore human experience) as something mechanical and meaningfully reducible to neuronal impulses and chemical interchanges.

What’s wrong with this? While the brain is fundamentally mechanical and reducible to its physical parts, it isn’t essentially those things. We know the mind emerges from the brain (no brain, no mind) and we know the brain is a physical organ comprised of physical parts, none of which are conscious, capable of subjective experiences, able to think, etc. This is what the brain fundamentally (on the level of its smallest parts) is. And yet, the output of the brain is something quite remarkably unlike anything mechanical or even physical. This is what the brain essentially (on the level at which we live) is.

If you say, and hear other people say, that the brain is a computer enough times, you will eventually start to think humans are computers; that is to say, nothing more than lumps of matter with some fancy moving parts. If nothing sounds wrong with that statement, I would suggest you haven't spent enough time with either computers or humans.

 

The Brain as Hardware / The Mind as Software


As with the brain/computer analogy, this one is also partially accurate. In computers, the hardware is the tangible, physical components; the software is the intangible programs operating through the hardware. Similarly, in a human, the hardware is the physical brain while the software is the intangible mind. However, the analogy seems pretty seriously flawed to me. The physical brain as hardware is okay but the mind as software doesn’t seem to track at all.

All software is a set of instructions which allows the user to interact with the hardware. It tells the hardware what to do. One needs software in order for the hardware to do anything; i.e. software → hardware, where the arrow means something like 'activates' or 'controls'. One consequence of this is that the two (software and hardware) are completely different things.

The mind's relationship with the brain, however, is absolutely nothing like this. The mind emerges naturally from the way neurons are arranged in the brain, and it has nothing to do with providing instructions to an otherwise inert lump of matter. First, the mind and brain aren't, and can't be, separate. Treating the mind as software, ironically, implies a ghost in the machine (the one thing most of us can agree doesn't exist), separate from the brain, acting on it. Secondly, one needs the brain in order to have a mind; in other words, brain → mind, where the arrow means something like 'produces'. This is the polar opposite of the software/hardware relation.

 

Hold on. Software is just information. The mind is also information; specifically, information about how the parts of the brain are to be arranged.

So, what does 'information' mean here? Software is information about how the hardware is to operate (ultimately perhaps, about whether an electric current is allowed to pass or not). The mind is information about how neurons are arranged (ultimately perhaps, about whether connections between neurons exist and how strong they are to be). Now, this does seem fairly similar (both sets of information contain details about the structure of the 'parts' of their respective hardware), but, as with the brain/computer analogy, it obscures as much as it reveals.

The attempt is to reduce both software and the mind to information, but the word 'information' is being used in two different ways. The kind of 'information' in software won't tell you how to build a computer (the hardware); it only tells you what it will instruct the hardware to do. On the other hand, the kind of 'information' the mind is claimed to be is nothing more than a description of how the parts of the brain (the 'hardware') are arranged. On such a loose application of the word 'information', a table is also information (i.e. a description of how the atoms in a table are arranged), but surely we wouldn't want to claim that a table is a computer, much less a brain!

 

NAND Gates and Substrate Independence


Max Tegmark in Life 3.0 gives a good description of NAND gates. Essentially, a NAND gate is any system that takes two bits as input and outputs one bit: a 0 if both inputs are 1, and a 1 otherwise. Such a simple setup is what modern computers are based on and, if one digs deep enough, (possibly) how brains work (a neuron 'fires' or 'doesn't fire' depending on the input). Indeed, a neuron is (allegedly) a NAND gate, just an organic one. [1]
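
To make the idea concrete, here is a minimal sketch of my own (not from Tegmark's book) showing a NAND gate as a tiny Python function, along with the standard constructions of NOT, AND, and OR out of nothing but NAND. This 'functional completeness' is why, in principle, any digital computation can be built from enough NAND gates wired together.

# A NAND gate as a function: outputs 0 only when both inputs are 1, otherwise 1.
def nand(a: int, b: int) -> int:
    return 0 if (a == 1 and b == 1) else 1

# The textbook constructions: other basic gates built purely from NAND.
def not_gate(a: int) -> int:
    return nand(a, a)

def and_gate(a: int, b: int) -> int:
    return nand(nand(a, b), nand(a, b))

def or_gate(a: int, b: int) -> int:
    return nand(nand(a, a), nand(b, b))

# Print the truth tables to check the constructions behave as expected.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  NAND={nand(a, b)}  AND={and_gate(a, b)}  OR={or_gate(a, b)}")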

So, Tegmark concludes, all that is required to create a lump of matter that remembers, computes, and learns (like a human brain) is a system of sufficiently complexly interconnected NAND gates. Hence human-like intelligence is substrate independent, which means that there is nothing special about organic matter as far as intelligence is concerned; intelligence can arise on any suitably arranged collection of NAND gates.
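
As a toy illustration of the 'remembers' part of that claim (again, my own sketch, not Tegmark's), two cross-coupled NAND gates form what engineers call an SR latch: a one-bit memory. Pull one input low to store a 1, pull the other low to store a 0, and with both inputs held high the circuit simply holds whatever it last stored.

# Two cross-coupled NAND gates as a one-bit memory (an SR latch).
# Inputs are 'active low': set_n = 0 stores a 1, reset_n = 0 stores a 0,
# and set_n = reset_n = 1 leaves the stored bit unchanged.
def nand(a, b):
    return 0 if (a == 1 and b == 1) else 1

def settle(set_n, reset_n, q, q_bar):
    # Let the two gates feed back into each other until the state stabilises.
    for _ in range(4):
        q, q_bar = nand(set_n, q_bar), nand(reset_n, q)
    return q, q_bar

q, q_bar = 0, 1                       # arbitrary starting state
q, q_bar = settle(0, 1, q, q_bar)     # pulse 'set'      -> stores a 1
q, q_bar = settle(1, 1, q, q_bar)     # both inputs high -> still remembers the 1
print(q)                              # prints 1
q, q_bar = settle(1, 0, q, q_bar)     # pulse 'reset'    -> stores a 0
q, q_bar = settle(1, 1, q, q_bar)     # both inputs high -> still remembers the 0
print(q)                              # prints 0

This, at least, shows the 'remembers' part falling out of nothing but NAND gates, in the narrow engineering sense of storing a bit.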

I’m agnostic about this. It sounds good in theory, but in practice (isn’t this what science is supposed to be based on?), no collection of NAND gates that we have constructed has demonstrated anything remotely like the capabilities of even a human child. It could be that we don’t have the technology that allows us to connect our artificial NAND gates to each other in a sufficiently complex fashion yet, but it could equally be true that there is something special about organic matter arranged in an organic body able to ‘organically’ interact with the world to realise goals. Maybe this level of involvement with the world is precisely what is required for the development of true human-like intelligence.

Remember that although we say a computer 'remembers' and 'learns', it doesn't do these things in anything like the way human brains do. (If it did, how different would our interactions with computers be?) Is this just because we don't have enough NAND gates connected together yet? Maybe. Can anyone tell us how they need to be connected to get true human-like learning? No. In light of this, maybe we ought to keep our expectations a little more tightly tethered to reality.

 


 

[1] The words ‘possibly’ and ‘allegedly’ are very important here. Although many assume neurons operate just like NAND gates and are digital in nature, the evidence seems to indicate that neurons communicate in digital and analogue modes simultaneously.
