
Machines Like Us

Feral neurons

Sunday, 03 February 2013
by Peter Hankins

Credit: Wikimedia commons

Dan Dennett confesses to a serious mistake here, about homuncular functionalism.

An homunculus is literally a “little man.” Some explanations of how the mind works include modules which are simply assumed to be capable of carrying out the kind of functions that normally require the abilities of a complete human being. This is traditionally regarded as a fatal flaw, equivalent to saying that something is done by “a little man in your head,” which is no use because it leaves us the job of explaining how the little man does it.

Dennett, however, has defended homuncular explanations in certain circumstances. We can, he suggests, use a series of homunculi so long as they get gradually simpler with each step, and we end up with homunculi who are so simple we can see that they are only doing things a single neuron, or some other simple structure, might do.

That seems fair enough to me, except that I wouldn’t call those little entities homunculi; they could better be called black boxes, perhaps. I think it is built into the concept of an homunculus that it has the full complement of human capacities. But that’s sort of a quibble, and it could be that Dennett’s defence of the little men has helped prevent people being scared away from legitimate “homuncular” hypotheses.

Anyway, he now says that he thinks he underestimated the neuron. He had been expecting that his chain or hierarchy of homunculi would end up with the kind of simple switch that a neuron was then widely taken to be; but he (or ‘we’, as he puts it) radically underestimated the complexity of neurons and their behaviour. He now thinks that they should be considered agents in their own right, competing for control and resources in a kind of pandemonium. This, of course, is not a radical departure for Dennett, harmonising nicely with his view of consciousness as a matter of ‘multiple drafts’.

It has never been really clear to me how, in Dennett’s theory, the struggle between multiple drafts ends up producing well-structured utterances, let alone a coherent personality, and the same problem is bound to arise with competing neurons. Dennett goes further and suggests, in what he presents as only the wildest of speculations, that human neurons might have some genetic switch turned on which re-enables some of the feral, selfish behaviour of their free-swimming cellular ancestors.

A resounding no to that, I think, for at least three reasons. First, it confuses their behaviour as cells, happily metabolising and growing, with their function as neurons, firing and transmitting across synapses. If neurons went feral it is the former that would go out of control, and as Dennett recognises, that’s cancer rather than consciousness. Second, neurons are just too dependent to strike out on their own; they are surrounded, supported, and nurtured by a complex of glial cells which is often overlooked but which may well exert quite a detailed influence on neuronal firing. Neurons have neither the incentive nor the capacity for independence. Third, although the evolution of neurons is rather obscure, it seems probable that they are an opportunistic adaptation of cells originally specialised for detecting elusive chemicals in the environment, so they may well be domesticated twice over, and not at all likely to retain any feral leanings. As I say, Dennett doesn’t offer the idea very seriously, so I may be using a sledgehammer on butterflies.

Unfortunately Dennett repeats here a different error which I think he would do well to correct: the idea that the brain does massively parallel processing. This is only true, as I’ve said before, if by ‘parallel processing’ you mean something completely different to what it normally means in computing. Parallel processing in computers involves careful management of processes which are kept discrete, whereas the brain provides processes with complex and promiscuous linkages. The distinction between parallel and serial processing, moreover, just isn’t that interesting at a deep theoretical level; parallel processing is just a handy technique for getting the same processes done a bit sooner; it’s not something that could tell us anything about the nature of consciousness.
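To make the computing sense of the term concrete, here is a toy sketch (my own illustration, nothing Dennett proposes): independent tasks are farmed out to workers that are deliberately kept isolated from one another, and their results are only combined at the end. The names here (`square`, `parallel_squares`) are made up for the example.

```python
# Toy illustration of 'parallel processing' in the ordinary computing sense:
# tasks are kept discrete, each worker computes in isolation, and results
# are merged only at the end. (For CPU-bound work a real program would use
# ProcessPoolExecutor; threads keep this sketch simple and portable.)
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # Each task is self-contained; no task reads or modifies another's state.
    return n * n

def parallel_squares(numbers):
    with ThreadPoolExecutor(max_workers=4) as pool:
        # map() dispatches the tasks concurrently and collects the
        # results in input order.
        return list(pool.map(square, numbers))

if __name__ == "__main__":
    print(parallel_squares([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

The point of the sketch is the discipline it embodies: the workers never interact, which is precisely the opposite of the promiscuous cross-linkage found among neurons.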

Always good to hear from Dennett, though. He says his next big project is about culture, probably involving memes. I’m not a big meme fan, but I look forward to it anyway.

Peter Hankins is author of the Conscious Entities weblog.