Consciousness is a riot of experiences
by Jeremy W Bowman
The word 'qualia' seems to be entering everyday usage. (It's a plural — the singular is 'quale'.) A quale is a distinctive sort of conscious experience, such as the subjective experience of blue (i.e. what we consciously experience when we are actually looking at a clear cloudless sky, or dreaming about swimming in the Aegean, etc.). How might qualia be explained from the perspective of evolutionary theory?
The really mysterious thing about qualia is this. The nerve endings send "signals" to the brain via the sensory neurons, like messages along telephone wires, and the brain reacts appropriately by sending "signals" back along the motor neurons to the muscles. Although there is an obvious need for the nerves to work like telephone wires, there doesn't seem to be any obvious need for conscious experience to enter the picture at all. And yet, the life of a conscious creature is a riot of subjective experiences — distinctive colors, various subjective feelings such as hunger and pain, and so on. Why?
Here's a very quick answer:
All living creatures are programmed to seek goals such as food, reproduction, safety, etc.
Having an internal "map" of the outside world helps animals to achieve their goals. This internal map is a belief system. It works like the onboard computer map in a cruise missile, which looks at the terrain below and guides it towards its target. Of course, a cruise missile has just one goal and a very limited sort of map, but the basic idea is the same.
Programming a cruise missile is no doubt complicated, but maintaining a belief system is even more complicated: it calls for a lot of self-regulation. A belief system needs to perceive situations in the outside world, naturally, but it must also make choices, delay the achievement of some goals in favor of others, discard some beliefs when other beliefs are more likely to be true, and so on.
All of that entails having a higher-level "map". This is more than just a "map" of the outside world like a cruise missile — it's a "map" for perceiving one's own internal states, and one's overall position in the world. For example, discarding one belief in favor of another belief involves having second-level beliefs about which first-level beliefs are more likely to be true than others.
We are now in a position to ask: what is consciousness? Answer: consciousness is constantly updated knowledge of our own states, and it mostly consists of higher-level states like the ones just mentioned.
For example, consider the reaction to injury. A creature without any such higher-level states (and therefore not conscious) might have a simple defense mechanism that makes it recoil when injured. But a creature with a higher-level "map" of its own states can decide between carrying on regardless, if the injury is not too serious, and stopping to nurse the wound, if it is serious enough. How serious an injury counts as depends on the circumstances. If the creature is running away from a predator, it should keep running at all costs. If the worst it risks by resting is going without a meal, it should stop and rest, unless it is in danger of starving to death, in which case it should not.
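The trade-off just described can be sketched as a toy decision procedure. The threshold, the state names, and the return strings here are all illustrative assumptions of mine, not part of the argument; the point is only that the decision depends on weighing the injury against the creature's other goals.

```python
def respond_to_injury(severity, fleeing_predator, near_starvation):
    """Toy model of a conscious creature's injury response.

    A non-conscious creature would simply recoil; a creature with a
    higher-level "map" of its own states can weigh the injury against
    its circumstances. The severity scale (0..1) and the 0.3 cutoff
    are arbitrary illustrations.
    """
    if fleeing_predator:
        # Escape overrides everything: keep running at all costs.
        return "carry on regardless"
    if severity < 0.3:
        # A minor injury is not worth interrupting other goals.
        return "carry on regardless"
    if near_starvation:
        # Here going without a meal is itself life-threatening.
        return "keep foraging despite the wound"
    # Otherwise the worst outcome of resting is a missed meal.
    return "stop and nurse the wound"

print(respond_to_injury(0.8, fleeing_predator=True, near_starvation=False))
```

Note that the same injury (severity 0.8) yields three different responses depending on context, which is exactly why a fixed reflex is not enough.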
The decision-making capacity of these second-level states is a bit like that of a political assembly. Each member wants what is best for their own constituents, but the decisions of the whole are taken in the interests of the whole. This is achieved when the representations made by each member have their own distinctive character and degree of insistence.
For example, having a distinctive sort of pain is normally a matter of having an injury together with an internal state that indicates the severity and location of the injury, so that the "assembly" of second-level states (i.e. consciousness) can make an informed decision about whether or not to override its demand for attention.
As a first pass, above, I said that consciousness is constantly updated knowledge of our own states. Now I will fine-tune that by saying that consciousness consists of second-level representations of first-level representations of states of the world outside our heads.
For example, suppose I burn my finger. That is a state of the world (my body) outside my head. The injured part of my finger sends a signal to my brain, which then forms a state that normally co-occurs with that sort of injury, and so works as an "indicator" of its presence. This is a first-level representation of such an injury. So far, consciousness has nothing to do with the process. But now my brain has to take account of my overall state, and make decisions based on the various indications that are available to it. Doing that involves perceiving various internal states — such as the first-level representation of the injury to the finger — and weighing them up in terms of their urgency, type, and so on. That involves forming a second-level representation of first-level representations. In a sense, each of the first-level representations has to "make a case" for itself by having distinctive qualities that demand more or less attention, this or that type of attention, and so on.
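The weighing-up just described can be sketched as a toy arbiter. Each first-level representation carries the distinctive qualities it "makes its case" with, and a second-level process surveys them and selects one for attention. The class, the field names, the numbers, and the 0.2 boost for unwelcome states are all my own illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class FirstLevelState:
    """A first-level representation: an indicator of some bodily or
    worldly condition, with the qualities it presents to the second
    level. Fields and scales are illustrative."""
    label: str
    urgency: float    # how insistently it demands action (0..1)
    unpleasant: bool  # valence: is the indicated condition unwelcome?

def attend(states):
    """Toy second-level representation: weigh the first-level states
    and pick the one whose case is most pressing. Unwelcome states get
    a fixed boost, since they usually call for sooner action."""
    def priority(s):
        return s.urgency + (0.2 if s.unpleasant else 0.0)
    return max(states, key=priority).label

states = [
    FirstLevelState("burnt finger", urgency=0.7, unpleasant=True),
    FirstLevelState("seeing a blue object", urgency=0.1, unpleasant=False),
    FirstLevelState("mild hunger", urgency=0.4, unpleasant=True),
]
print(attend(states))  # -> burnt finger
```

The burnt finger wins attention not because it is the only signal present, but because its distinctive qualities (high urgency, unpleasantness) make the strongest case in the "assembly".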
In order to be represented appropriately at the second level, a first-level representation has to be distinctive. That is why it "feels like something or other". What these states feel like is a product of how they are physically realized, whether they are welcome or unwelcome, what sorts of decisions have to be made given their occurrence, and so on.
For example, the first-level states that occur with injury are realized in different ways depending on which part of the body is injured. Almost all of them are unpleasant, because almost all injury is unwelcome. Most of them are "insistent" because most of them require some sort of action, taken sooner rather than later.
Or again, consider the first-level states that typically occur with the presence of an (objectively) blue object. Blue objects are unusual in nature, so the first-level states that accompany them are very distinctive, arouse curiosity, etc. Mostly these states are pleasant because most blue objects are safe, and some are valuable in some way. The first-level state of perceiving a blue object (i.e. the "experience of blue") is not an especially "insistent" sort of state because action is rarely needed in response.
I hope it's reasonably clear that having "qualia" is a "functional" business that can add significantly to reproductive success.