If qualia can exist in the absence of any subjective self-awareness, that suggests they’re not as tightly tied to people as we might have thought. That would surely give comfort to the panpsychists, who might be happy to think of qualia blossoming everywhere, in inanimate matter as well as in brains. I don’t find this an especially congenial perspective myself, but if it were true, we’d still want to look at personal thisness and how it makes the qualia of human beings different from the qualia of stones.
I think making a decision and becoming aware of having made that decision are two different things, and I have no deep problem with the idea that they may occur at different times. The delay between decision and awareness does not mean the decision wasn’t ours, any more than the short delay before we hear our own voice means we didn’t intend what we said.
I have noticed as an empirical matter that when someone vigorously criticises philosophy, they are generally about to offer us some.
There’s some important element of the way the brain works that just doesn’t seem to be computational. But why the hell not?
One of John Searle’s popular debating points is that you don’t get wet from a computer simulation of rain. Actually, I’m not sure how far that’s true; if the computer simulation of a tropical shower is controlling the sprinkler system in a greenhouse at Kew Gardens, you might need your umbrella after all.
You could argue that human mental states operate according to a program which is simply implicit in the structure of the brain, rather than being kept separately in some neural register somewhere; but even if we accept that there is a difference, why is it a difference that makes a difference? I can't say for sure, but I suspect it has to do with intentionality, with meaningfulness. Meaning is one of those things computers can't really handle, which is why computer translations remain rather poor: to translate properly you have to understand what the text means, not just apply a look-up table of vocabulary, as the little sketch below illustrates. It could be that in order to mean something, your mental states have to be part of an appropriate pattern of causality, which operating according to a script or program will automatically mess up.

I would guess further that it has to do with a primitive form of indexicality or pointing which lies at the foundation of intentionality: if your volitions aren't in a direct causal line with your behaviour, you don't really intend your actions, and if your perceptions aren't in a direct causal line with sensory experience, you don't really feel them. If that general line of thought is correct, of course, we could never program or build a conscious entity; but we should be able to create circumstances in which consciousness arises or evolves.
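To make the look-up-table point concrete, here is a deliberately naive sketch of my own (the lexicon, function name, and example sentence are all illustrative inventions, not any real translation system): a word-for-word "translator" that gets every individual substitution right and the meaning entirely wrong.

```python
# A toy word-for-word "translator": pure look-up, no understanding.
# The lexicon and example are illustrative inventions, not a real MT system.
lexicon = {
    "il": "it", "pleut": "rains", "des": "some", "cordes": "ropes",
}

def lookup_translate(sentence: str) -> str:
    """Substitute each word independently; unknown words pass through unchanged."""
    return " ".join(lexicon.get(word, word) for word in sentence.lower().split())

# "Il pleut des cordes" is a French idiom meaning "it's pouring"
# (literally, "it's raining ropes"). Every word is looked up "correctly",
# yet the English result is nonsense:
print(lookup_translate("Il pleut des cordes"))  # -> it rains some ropes
```

Nothing in the table knows that the sentence is about heavy rain; the substitution is causally connected to the input characters, but not to what they mean, which is exactly the gap the paragraph above is pointing at.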