Draft copy. Please do not quote.
Second Sense: A Theory of Sensory Consciousness
Consciousness, like love, is something so intimate and vital to our sense of ourselves as humans that explanation, even definition, risks robbing humanity of its soul. The proposal that consciousness could be fully explained in terms of physical stuff like neurons challenges the idea that we are special, mysterious beings, deserving of considerations such as inalienable human rights. Yet we need not think of mystery and explanation as incompatible. Consider life. Despite knowing the basic facts about human reproduction – DNA, cell division and the like – giving birth was the most mysterious process I have ever experienced. The sense of mystery here seems to come from sources other than inexplicability: complexity, cultural significance and symbolic value. Similarly, I do not expect the mystery of consciousness to be dispelled once consciousness has been successfully explained. At a minimum, a successful explanation must provide criteria for determining when consciousness is present and when it is not.
The first step, of course, is to determine exactly what sort of ‘consciousness’ is to be explained, a project I begin in Section I. Section II elaborates the description of consciousness at issue, which I call sensory consciousness, and proposes an explanation. One distinctive element of the explanation is its representational structure. The explanation is first-order, or flat, rather than higher-order. Section III argues for another distinctive element, the central role of a mechanism for integrating conscious sensory states, the second sense. I conclude by considering four possible objections to the theory in Section IV.
I. WHAT ‘CONSCIOUSNESS’?
It is well established that ‘consciousness’ is multiply ambiguous and so is useless for theoretical explanation without a careful description of the phenomenon in question. My target will be the sort of consciousness one has when one has sensations and thoughts, and moreover, one feels one’s sensations and thoughts in characteristic ways. So, for example, if one has an all-day headache, it is reasonable to say that the sensations of pain remain all day, yet the characteristic hurtfulness of the pain comes and goes. The phenomenon of interest here is the pain when it hurts. To narrow the field further still, I will deal exclusively with conscious sensations. Let us call the form of consciousness at issue sensory consciousness and call the individual sensations exhibiting this form of consciousness conscious sensory states. An explanation of sensory consciousness, then, will allow us to distinguish conscious sensory states from unconscious ones. In the example of the all-day headache, an adequate explanation should tell us what factors determine when we feel our pain and when we do not.
Before going any further, it will be useful to distinguish between questions about the contents of conscious sensory states and the more general question of what makes those contents conscious. Conscious sensory states have all sorts of contents – the smell of morning coffee, the sound of a Mozart concerto from the stereo, the feeling of exhaustion, you name it. The phenomenon of sensory consciousness I wish to consider is that which is common to all conscious sensory states, regardless of their content. Much of the argument about consciousness (of any variety) since the revival of the problem two decades ago has revolved around the contents of conscious sensory states, in particular the phenomenal content known as ‘qualia’. Since ‘qualia’ is used as ambiguously as ‘consciousness’ and ‘sensation’, for present purposes I will take qualia to be that which determines the characteristic way something feels. When I look at grass, why does it look green to me rather than red? Might qualia be inverted, so things that look green to me look red to you? How does the way round things feel differ from the way round things look? These are questions about qualia and are some of the most puzzling problems in philosophy of mind. But these questions, as with any questions about a particular sort of content, can be separated from the question of sensory consciousness per se. Even presuming we could solve problems of qualia and other sorts of mental state content, there is still the question of what makes those mental states conscious as opposed to unconscious.
In his review of current consciousness theory, Joseph Levine puts the point this way:
There are two questions a Materialist theory has to answer: (1) what distinguishes conscious experiences from mental states (or any states, for that matter) that aren’t experiences? and (2) what distinguishes conscious experiences from each other; or, what makes reddish different from greenish, what determines qualitative content? (Levine 1997b, 388)
The first of these questions is the question I propose to answer, involving what I have called sensory consciousness. The second question can be answered separately from the first because sensory states can be distinguished by what features of the world they represent. In other words, sensory states can be individuated in terms of intentional content. We can call a sensory state ‘red’ (or ‘reddish’) as opposed to ‘green’ (or ‘greenish’) just in case it represents red features as opposed to green features. Now, how we go about determining what counts as a ‘red’ feature is another problem. My point here is that this problem is separate from the problem raised by the first question because it applies to unconscious as well as conscious sensory states. Discrimination tests suggest that both unconscious and conscious sensory states can represent sensory features like color. Blindsight patients, for example, are able to guess, with better-than-chance accuracy, whether the stimulus presented in the blind area of their visual field is red or green. (Farah 1997; Stoerig 1998; Stoerig and Cowey 1996) This suggests that whatever distinguishes red sensory states from green sensory states is present even when the state is not conscious.
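The independence of the two questions can be pictured with a toy model (entirely my own illustrative sketch, with invented attribute names, not a claim about mental architecture): a sensory state's intentional content and its being conscious are separate attributes, so two states can share a content while differing in consciousness.

```python
from dataclasses import dataclass

@dataclass
class SensoryState:
    """Toy model: content and consciousness vary independently."""
    content: str      # intentional content, e.g. "red" or "green"
    conscious: bool   # whether the state is a conscious one

# A blindsight patient's state and a sighted person's state can share
# the same intentional content ("red") while differing in consciousness.
blindsight = SensoryState(content="red", conscious=False)
sighted = SensoryState(content="red", conscious=True)

# Levine's second question individuates states by content alone:
assert blindsight.content == sighted.content
# Levine's first question asks what the remaining difference consists in:
assert blindsight.conscious != sighted.conscious
```

The point of the sketch is only that answering the second question (what makes a state ‘red’ rather than ‘green’) leaves the first question (what makes it conscious) untouched.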
My question, then, is what distinguishes unconscious sensory states from conscious ones? What distinguishes the sensory state of a blindsight patient representing a red stimulus from the sensory state of a normally sighted person representing a red stimulus? As I have said, my discussion is limited to sensory states, such as when one visually represents a red or green stimulus. So seeing the words on the page or feeling the paper turn in your hand are the sort of mental states I will be considering.
In order to explain sensory consciousness, we will need to get clearer about what conscious sensory states are. This will not be an easy task. Because so many different terms are used to describe the various phenomena involved, it can be difficult to determine when the dispute is purely verbal and when substantive disagreement lies behind terminological differences. To offset these terminological problems somewhat, I begin with a rough taxonomy into three general kinds of sensory state: unconscious, conscious and self-conscious. As I describe these kinds of sensory state I will consider how they interrelate and how other mental state taxonomies fit into this schema. Though I restrict my explanandum to conscious sensory states, not everyone does the same. So I will use the terminology and descriptions of the authors where appropriate, drawing connections to my target, conscious sensory states, where applicable.
Unconscious sensory state. A creature has an unconscious sensory state when the state is a sensory representation but there is ‘nothing it is like’ (Nagel 1974/1991) to have that sensory state.
Though some may be inclined to think that all mental states are conscious, David Rosenthal (1997) has argued persuasively for the possibility of unconscious mental states. There are many cases where a person’s behavior or mental activity is best explained by the existence of unconscious mental states. The all-day headache, for example, is reasonably considered to be a single sensation that is intermittently conscious. One does not feel the pain when distracted, so during that time of distraction the sensory state is unconscious. Yet it is reasonable to say the same headache endured throughout. As Rosenthal observes, “it would be odd to say that one had had a sequence of brief, distinct, but qualitatively identical pains or aches.” (Rosenthal 1997, 731) Rosenthal also points to the way we attribute thoughts, desires or emotions to a person even when those mental states are not at all conscious. (Rosenthal 1997, 731) I may be angry about something but not be aware of my anger until someone else remarks on my scowls and quick temper. Freudian repressed states explain otherwise random neurotic behaviors, and unconscious reasoning processes explain the ‘light bulb phenomenon’, as when the answer to a problem suddenly occurs to you even though you had been consciously thinking about something else.
Current psychological literature provides additional examples of unconscious sensory states. As noted earlier, in blindsight studies patients are asked to make guesses about objects presented in the blind area of their visual field. Because the guesses of blindsight patients are accurate at a rate better than chance, psychologists conclude that visual stimuli from the blind area are processed unconsciously. (Stoerig 1998) Blindsight patients represent visual features of objects, such as color or shape, as is shown by their accurate guesswork, but these visual representations are unconscious. There is ‘nothing it is like’ for the blindsight patient to have these mental states. More familiarly, conscious processes such as speech are usually subserved by unconscious sub-processes (parsing, word choice, etc.) that are routine and operate in parallel. (Dennett 1991a)
Rosenthal offers a second argument: if all mental states are conscious, no explanation of consciousness is possible. We cannot explain consciousness in terms of the mental if consciousness is built into the nature of mentality. The only alternative form of explanation, one in purely physical terms, is a daunting project, if not an impossible one. As Rosenthal states (perhaps a bit too strongly), “nothing that is not mental can help to explain consciousness. So, if consciousness is essential to mentality, no informative, nontrivial explanation of consciousness is possible at all.” (Rosenthal 1991c, 463) We would have no other choice but to accept consciousness as primitive.
Conscious sensory state. A creature has a conscious sensory state when the state is a sensory representation of something in the world, and there is something it is like to have that state.
As a first stab, this description of sensory consciousness tells us two things: a conscious sensory state is a representational state whose object is external, and there is ‘something it is like’ to have a conscious sensory state. Neither feature is particularly helpful, however; the first is too broad and the second is too vague. In the next section I will provide a fuller description of conscious sensory states within the context of a theoretical explanation. At the moment my principal aim is to contrast conscious states with unconscious and self-conscious states in order to provide a schematic framework for identifying sensory consciousness.
With this negative project in mind, the example below highlights some of the typical differences in mental states that have been taken as markers of consciousness. I will argue that none of these markers is a necessary condition for sensory consciousness.
Riding my bicycle down a steep hill, I unexpectedly hit a deep groove in the road and am thrown from the bike. I get up, put on my glasses and walk over to pick up my bike. At least, I am later told by witnesses that I have done this. I remember nothing from the time I was riding down the hill until much later being questioned by medics.
Oftentimes, an ability to respond to a sensory stimulus is described as a form of consciousness. My movements to get up, put on my glasses and pick up my bicycle show that I have seen the glasses, bicycle, etc. in some sense, because I have reacted appropriately. But the sense of ‘seen’ involved in sensory responsiveness is neither a necessary nor a sufficient condition for sensory consciousness. It is not necessary because motor functions may be disconnected from other mental processes. A person may have fully conscious auditory sensations but be unable to respond to them due to paralysis or another motor disorder. Therefore, lack of responsiveness does not imply lack of sensory consciousness. Nor is sensory responsiveness sufficient for sensory consciousness. The chair-shifting behavior I describe at the beginning of the next section is a case of sensory responsiveness without sensory consciousness. Though the pain from my shoulders and legs was not conscious – I did not feel the hurtfulness of the pain – I nonetheless responded to it by adjusting my position several times. Similarly, a person may be in a state of dreamless sleep and move in response to a muscle cramp or a poke from a bed partner. So the sensory representations that facilitate motor responses need not be conscious sensory states.
While there may be some question about whether or not I had conscious sensory states when picking up my glasses and bicycle, it seems more likely that I had conscious sensory states later while I was talking to the medic. Because we tend to remember what we experience consciously, memory has also been taken as an indicator of sensory consciousness. If I remember an event, such as talking to the medics, this memory indicates that I was having conscious sensory states at the time of the conversation. Memories are notoriously fallible, however, and so can only serve as an indicator that the remembered sensory state was conscious. One may be tempted to make a strong claim and say that in the case of episodic memory the remembered sensory state must have been conscious. In episodic memory a person remembers an event as if she were there at the time as a participant or observer rather than having learned of the event in some other way. I remember looking into the face of the medic and can recall sensory details, such as the mole on his left cheek, by producing a mental image of the event. But even episodic memories can be distorted if the memory is inconsistent with other memories or beliefs. (Dennett 1991a) Perhaps the mole was on the left cheek of the store clerk I had seen moments before the accident and I somehow confabulated the image of the medic. So, even though episodic memory often indicates conscious sensory states at the time of the remembered event, it does not do so reliably.
Episodic memories are not a sufficient criterion for identifying past conscious sensory states, nor are memories in general a sufficient criterion for identifying present conscious sensory states. In other words, memories of all kinds can be unconscious. Right now, presumably, we all possess multitudes of memories that are not conscious. If prompted I can remember what I had for breakfast or who phoned me yesterday. If the event made a sufficient impression, I may even recall some episodic memories, full of sensory detail. But without such a prompt, these memories would likely remain unconscious states. The existence of unconscious memories is evidenced by their quick and (relatively) consistent recall as well as the way they can influence behavior. After a bad meal in a restaurant I may avoid returning there, even if I don’t consciously remember the event as my reason. So it is possible to have memories that are not themselves conscious and do not imply the existence of past conscious sensory states.
It is also possible to have conscious sensory states without any episodic memories whatsoever. In cases of anterograde amnesia, patients are unable to form new memories. Though alert and attentive, patients with brain disorders such as Korsakoff’s Syndrome may lack normal memory function. Like the case of motor dysfunction described above, these amnesia patients likely have fully conscious sensory states but lack the ability to remember them. (Kolb and Whishaw 1990, 552) Though sensory consciousness without episodic memory is possible, some form of memory may be necessary for sensory consciousness. Skill-based memories such as recognition and conditioned response are retained in even the most severe amnesia cases. However, because skill-based memory is so basic to mental function – arguably necessary for perception and the acquisition of simple concepts – its relation to sensory consciousness is not likely to be very informative. The role such memories play in sensory consciousness is probably duplicated for many other mental functions. Only episodic memory seems to be bound in an interesting way to sensory consciousness, but as I have argued, even this form of memory fails to be necessary or sufficient for having conscious sensory states.
Putting aside memory, we are left with the vague idea that there is ‘something it is like’ to have a conscious sensory state. This condition does not get us terribly far in itself, because there is no way to further analyze ‘what it’s like’. The phrase is used more like a demonstrative than a description, to point to something that cannot be otherwise identified. While it is not always clear exactly what is being pointed out, looking at how different authors use the phrase may help determine some identifying markers for sensory consciousness.
One use for the ‘what it’s like’ locution is in comparing Rosenthal’s ‘state consciousness’ with my ‘sensory consciousness’. Rosenthal describes ‘state consciousness’ as a property of mental states that it is like something to be in, while there is nothing it is like to be in an unconscious state. (Rosenthal 1993b, 357) I fully agree with Rosenthal when he says that the question at hand is “what it is for a mental state to be conscious. Assuming that not all mental states are conscious, we want to know how the conscious ones differ from those which are not.” (Rosenthal 1997, 729) I restrict my claims to conscious sensory states rather than attempting to explain all forms of conscious state, but we both agree that there is ‘something it is like’ to be in a conscious state. Additionally, conscious states represent something in the world on both our accounts. Though we disagree on what constitutes sensory consciousness, we agree that conscious states represent external objects. Given these two features, ‘sensory consciousness’ seems to be a subset of Rosenthal’s ‘state consciousness’. Thus, I take ‘state consciousness’ to be the general category that includes sensory consciousness as well as other forms of conscious state.
Self-conscious sensory states, which comprise my third category, differ from conscious sensory states in their representational object. While conscious sensory states are mental states representing external objects, self-conscious sensory states are mental states representing other sensory states. Yet both are ‘conscious’ in the sense that there is something it is like to have that state. Thus,
Self-conscious sensory state. A creature has a self-conscious sensory state when the state is a mental state about one’s own sensory representations, and there is something it is like to have the self-conscious sensory state.
This description of a self-conscious sensory state is similar to Armstrong’s definition of ‘introspective consciousness’ as “a mental event having as its (intentional) object other mental happenings that form part of the same mind.” (Armstrong and Malcolm, 108) A reconstrual in more fashionable representational terms might be: one’s self-conscious sensory states represent one’s own sensory representations. I use the term ‘self-conscious’ rather than ‘introspective’ to emphasize the continuity between conscious sensory states and self-conscious sensory states. In my view, self-conscious sensory states are very much like conscious sensory states, except with respect to their representational object. While garden-variety conscious sensory states represent the world, self-conscious sensory states represent one’s own sensory states. Since my target is exclusively conscious states, I will not elaborate on or argue for this description of self-conscious sensory states except to mark a distinction between conscious states and self-conscious states. To help keep this distinction clear, I will restrict my use of the term ‘conscious state’ to refer to mental states that represent the world and will use ‘self-conscious state’ to refer to mental states that represent one’s mental states. The question of sensory consciousness, as I construe it, is the question of what determines whether a sensory state is a conscious state or whether it is an unconscious state.
II. A THEORY OF SENSORY CONSCIOUSNESS
Now that we have an idea about what sensory consciousness is not, we need a clearer description of what it is. So here is another example of the difference between unconscious and conscious sensory states:
For the last 20 minutes I have been shifting around in my chair, crossing one leg and then the other, sitting forward, then back. Only now when I have turned my attention away from my task do I become aware of the bodily condition that has been causing my movement. Now I notice the ache in my shoulders and the cramped feeling in my legs.
One minute my leg and shoulder cramps instigate their effects without benefit of sensory consciousness, and the next minute they are conscious sensory states. What is the difference? I suggest that when unconsciously representing the tension and inflammation in my legs and shoulders, I am responding to the pain signals without incorporating the information about my bodily condition into my overall representation of what is happening at the moment. Though my sensory representations of bodily damage are sufficiently coordinated with motor systems to produce my periodic fidgeting in the chair, they are not coordinated into this broader representation of the present moment. My writing task encompasses my world at that time, and anything else is represented as a peripheral element in that world or not represented as part of that world at all. When the task is finished and my attention is free to turn to other concerns, my representation of leg and shoulder cramps is then incorporated into my coordinated sensory representation of what is happening at the moment.
My proposal is that this overall representation of what is happening at the moment constitutes the content of conscious sensory states. Sensory representations coordinated into conscious states are the best approximation of what the world is like at the present moment. In the next few pages I will try to unpack some of the central elements in this formulation: 1. that they are the best approximation of the world, 2. that they are sensory representations of the world, and 3. that they are representations of the world at the present moment.
First, conscious sensory states are the best approximation of what the world is like at the present moment. The coordination processes involved in sensory consciousness are, on my view, aimed toward response. The sensory representations coordinated into conscious states are selected on the basis of their usefulness in generating appropriate responses. When only general responses are required, with no particular end in view, it may be best to incorporate whatever sensory representation happens to be floating about. Often when we are in between tasks our conscious sensory states are filled with such random representations. Body pains, a glimpse of the mail carrier on the porch, and the taste from a swig of hot coffee are among the sensory representations that flood the mind while a decision about the next action is deliberated. When we turn to more specific tasks sensory representations are more carefully selected, combined or even confabulated in order to form a coherent representation that will facilitate the necessary responses.
The so-called ‘cutaneous rabbit’ experiments described by Daniel Dennett and Marcel Kinsbourne show how subjects often misrepresent stimuli in order to form coherent representations. Subjects in these studies receive sets of three taps administered first on the wrist, then on the elbow, and finally on the upper arm. Strangely, subjects do not report feeling these stimulus sets but instead report feeling a series of taps evenly spaced along the arm – as if a rabbit is hopping from wrist to shoulder. However, if subjects receive only the first three wrist taps without the subsequent elbow and upper arm taps, they then accurately report feeling all the taps at the wrist – the ‘rabbit’ hops in place. (Dennett and Kinsbourne 1992, 186) One way to account for this phenomenon is to postulate a time delay required for the wrist-taps to be represented in a conscious sensory state, allowing later taps to influence the representation of earlier taps. The influence of later taps thereby causes a revision in the conscious representation of earlier taps. Dennett and Kinsbourne call this Stalinesque editing because it involves manufacturing a representation of evenly spaced taps to appear in conscious sensory states. But another possible explanation of the cutaneous rabbit reports is that the editing occurs after the taps are incorporated into sensory consciousness but before the final report. (Dennett and Kinsbourne 1992, 190f) Dennett and Kinsbourne call this second alternative Orwellian editing because the subject’s memory of a veridical conscious representation is changed. The subject has fleeting conscious sensory states that accurately represent three taps at the wrist, elbow and upper arm, but those accurate representations are wiped out and replaced by a false report of evenly spaced taps along the arm.
According to Dennett and Kinsbourne, there is no functional difference between Stalinesque and Orwellian forms of editing because there is no way to determine such a thing as a “charmed circle of consciousness” where all and only conscious sensory representations are located. (Dennett and Kinsbourne 1992, 193)
No doubt significant methodological problems stand in the way of arbitrating between Stalinesque and Orwellian forms of explanation. Sometimes it seems as if the apparently hopeless task of determining the physical substrate of conscious sensory states is the reason Dennett and Kinsbourne reject the difference between these two interpretations. The difference, if one can be determined, is a matter of exactly which representations are part of a conscious sensory state, whether the editing from three 3-tap intervals to evenly spaced taps occurs before or after the taps are consciously represented. Subjective reports cannot arbitrate between Stalinesque and Orwellian explanations because both explanations predict the same sorts of report. Therefore the only way to determine a difference is to determine a physical difference between conscious sensory representations and unconscious ones. At this point we are still struggling to agree on a good operational definition of sensory consciousness and a long, long way from the sort of neurological identification of conscious sensory states that might provide independent verification of their contents. Still, there seems to be no principled reason why future research could not make some such identification and thereby distinguish Stalinesque from Orwellian forms of editing.
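To make vivid why subjective reports cannot arbitrate between the two hypotheses, here is a schematic toy simulation (my own sketch; the functions and data are invented and model no actual neural process). The two pipelines differ only in whether the editing occurs before or after the ‘conscious buffer’, yet they yield identical final reports:

```python
# Illustrative sketch: two editing hypotheses, identical reports.
# 'Taps' are (location, time) pairs; editing smooths grouped taps
# into evenly spaced hops along the arm.

def smooth(taps):
    """Replace grouped taps with evenly spaced ones (the 'rabbit')."""
    return [("arm_position_%d" % i, i) for i in range(len(taps))]

raw_taps = [("wrist", 0), ("wrist", 1), ("wrist", 2),
            ("elbow", 3), ("elbow", 4), ("upper_arm", 5)]

def stalinesque(taps):
    # Editing happens BEFORE the conscious buffer: consciousness
    # only ever contains the edited, evenly spaced taps.
    conscious_buffer = smooth(taps)
    return conscious_buffer            # report reflects the buffer

def orwellian(taps):
    # Editing happens AFTER the conscious buffer: a fleeting veridical
    # experience is overwritten in memory before the report is made.
    conscious_buffer = list(taps)
    return smooth(conscious_buffer)    # memory revised before report

# Subjective reports are identical, so they cannot arbitrate:
assert stalinesque(raw_taps) == orwellian(raw_taps)
```

Only an independent identification of the contents of the conscious buffer itself, of the sort imagined above, could distinguish the two.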
Nonetheless Dennett and Kinsbourne strongly maintain that Stalinesque and Orwellian editing are metaphysically indistinguishable.
Both the Orwellian and the Stalinesque version...can deftly account for all the data – not just the data we already have, but the data we can imagine getting in the future. (Dennett and Kinsbourne 1992, 193)
...if one wants to settle on some moment of processing in the brain as the moment of consciousness, this has to be arbitrary. One can always “draw a line” in the stream of processing in the brain, but there are no functional differences that could motivate declaring all prior stages and revisions unconscious or preconscious adjustments. (Dennett and Kinsbourne 1992, 194)
There can be no functional property that is necessary and sufficient for a sensory state to be conscious, Dennett and Kinsbourne claim, because “there is no further functional or neurophysiological property K over and above the properties that account for the various ‘bindings’ and effects on memory, speech, and other behavior.” (Dennett and Kinsbourne 1992, 236) I agree with this statement, but do not see how it implies that there is no functional property K which is necessary and sufficient for a sensory state to be conscious. Indeed, I propose just such a form of ‘binding’ as necessary and sufficient for a sensory state to be conscious. A sensory state is conscious, on my account, when it is bound into a coordinated representation of the world at the present moment. There is a lot of work to be done in order to support this account and even more to determine a physical substrate for such coordinated representations. The question for Dennett and Kinsbourne, however, is why this sort of binding could not in principle account for sensory consciousness.
To move on then, what is included in such a coordinated representation of the world? I have said that the coordination of representations into conscious sensory states is geared toward action in the world, and the sort of sensory representations coordinated varies according to the sort of responses required. The coordination of some sensory representations rather than others into conscious states is based on what combination will most likely help the creature respond to current environmental conditions. To get an idea about what factors might determine such a combination, consider a comparison of the sorts of stimuli that tend to be processed consciously with those that tend to be processed unconsciously. Neuropsychologist Bernard Baars notes that unconscious sensory processing occurs when stimuli are of short duration or low intensity, when the range of contents is limited, and when stimuli and response are habituated and routine. Applied to the example above, my feeling of body cramps as I work remains unconscious because it is not terribly intense or complex, and my responses are routine. Conscious sensory processing, on the other hand, is required for long-lasting or high-intensity stimuli, wide ranges of content, or novel stimulus-response patterns. So when I cease working, my feeling of cramps becomes conscious due to a change in working conditions. I am no longer concentrating on a specific task, so other general-purpose stimuli can be incorporated into my current conscious state. The cramp stimuli become part of the complex range of stimuli that flood the mind following the release of focused attention. Additionally the need to choose a new task might generate a search for new stimuli, a search which soon hits upon the signals from leg and shoulder. Alternatively, a very intense cramp might have interrupted my computer task, calling for immediate response.
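Baars's observations can be caricatured as a simple gating rule (a toy sketch of my own; the feature names and thresholds are arbitrary placeholders, not values drawn from Baars): long-lasting, intense, wide-ranging or novel stimuli are routed to conscious processing, while brief, weak, narrow, habituated ones are handled unconsciously.

```python
def requires_conscious_processing(stimulus):
    """Toy gating rule loosely after Baars: conscious processing for
    long-lasting or high-intensity stimuli, wide ranges of content,
    or novel stimulus-response patterns. Thresholds are placeholders."""
    return (stimulus["duration"] > 5.0         # long-lasting (seconds)
            or stimulus["intensity"] > 0.8     # high intensity (0-1 scale)
            or stimulus["content_range"] > 3   # wide range of contents
            or not stimulus["habituated"])     # novel, non-routine pattern

# Mild, routine cramp signals while absorbed in a task stay unconscious:
cramp = {"duration": 2.0, "intensity": 0.3,
         "content_range": 1, "habituated": True}
assert not requires_conscious_processing(cramp)

# A sudden, intense cramp would interrupt the task:
bad_cramp = {"duration": 2.0, "intensity": 0.95,
             "content_range": 1, "habituated": True}
assert requires_conscious_processing(bad_cramp)
```

The rule is a caricature, of course; on my account the real work is done by the need to incorporate a stimulus into the coordinated representation of the present moment.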
A common element among stimuli requiring conscious sensory processing is that they present an immediate challenge. They alert the creature to a potential obstacle or item of interest. Long-lasting or high-intensity stimuli suggest a dramatic change in the environment that might call for reassessment of current action plans. Novel stimuli might also present motivation for altering the present course of action. The need for conscious evaluation is less clear regarding wide ranges of content. Often wide ranges of contents can be integrated without becoming conscious, as when integrating letter-discrimination and motor processes while typing. So there must be some other factor that determines whether integration involves conscious or unconscious processing. If the function of sensory consciousness is to represent what the world is like in order to respond effectively, as I have suggested, perhaps the determining factor is the need to represent integrated elements as occurring at the present moment. I represent some things as occurring ‘now’ in order to assess the world at that time. Representations are incorporated insofar as they are useful in making my assessment. The coordinated representations form a roughly unified representation of what the world is like at a moment. A series of such representations can then help a creature keep track of myriad environmental changes over time. In short, my proposal is this: conscious sensory states are representations of what the world is like at each moment. As time marches on and things change externally, sensory consciousness provides essential updates of current conditions. Sensory consciousness represents what is happening right now, at the present moment.
What moment is represented by conscious sensory states as the present moment? As Dennett and Kinsbourne have argued, the time represented is not just the time at which sensory representations first occur. (Dennett and Kinsbourne 1992, 188f) Otherwise the wrist taps would be represented as grouped into sets, since the sensory representations occur in sets. So some other principle for representing time must be at work here. A good analogy for the way sensory consciousness represents time is a stock market trade board. The board is continuously updated as stockbrokers watch their clients’ various holdings jump up or down with market fluctuations. By watching the board, brokers can make quick comparisons among stocks and shout ‘buy’ or ‘sell’ as appropriate. The trade board is a relatively simple coordinating device, collecting together information about stock sales activity into a current market price. No choices need to be made about what sorts of information will appear on the board or when it will appear, since all such matters are determined in advance by the program designers. Yet the trade board illustrates how the time represented as current differs from the time that the representation occurs. At any moment the trade board represents the current price of stocks. If you log a sale while that price is on the board, that is the price you get. But other people are simultaneously buying and selling around the world, which means that by the time the sale is logged, another price is the ‘real’ price at that moment. The ‘real’ price is the price that includes all the selling and buying as it occurs. But the ‘real’ price is impossible to represent because it takes time to collect all the necessary sales information. The price on the board represents the coordination of all the available information about stock activity when the price was listed on the board. The ‘current’ market price represents this compilation.
Since time moves on and trading activity continues, the ‘real’ price has changed by the time the current market price is listed. Given the time required to produce a price for the trade board, the representation of stock prices that appears on the board now (and so by stipulation of trading practice represents them as current) actually refers to the price of stocks from a moment ago. The time of stock price representing (now) differs from the time represented by the stock price (a moment ago).
Because the stock price represented by the board as current actually represents the price of a moment ago, the shrewd broker must anticipate market trends in order to take advantage of prices listed as current. Her experience allows her to predict whether a price will keep going up or whether the stock has reached a peak; this experience is why brokers get paid for their services. Likewise, action on the part of conscious creatures involves predicting how the world will be at the next second, as well as representing the world at the present moment. As with the trade board, the moment represented as present by a conscious sensory state has already passed. It takes time for stimuli from whatever object or event is represented to reach the sensory organs and more time for sensory processing prior to the coordination of representations into a conscious sensory state. Therefore, effective action requires predicting how the world will be at the next moment in addition to representing some moment as present. This pull toward the future may be one of the reasons it is so difficult to determine the difference between Stalinesque and Orwellian forms of representational editing. In the cutaneous rabbit experiment, we can ask whether the tap-representations are edited before conscious sensory states are produced in order to provide the best approximation of what the world is like at the moment the taps occurred (Stalinesque), or whether they are edited after conscious sensory states are produced in order to better anticipate the bunny’s next move (Orwellian). At this point in our understanding of sensory consciousness, the answer could go either way. Pace Dennett and Kinsbourne, however, I maintain that there is a principled difference between the two answers. There is a fact of the matter about which of these explanations is true: either evenly spaced taps are incorporated into a coordinated sensory representation of the present moment or they are not.
It is important to note that neither Stalinesque nor Orwellian forms of editing require that the representation of time involves timed representations. For example, in the cutaneous rabbit experiments, it is not necessary that subjects produce an evenly spaced sequence of tap representations in order to have a representation of evenly spaced taps. Even if the Stalinesque form of representational editing turns out to be true, there are at least two ways such a representation might be produced, both compatible with my proposal. One possibility is that very short intervals of time are compressed into a single conscious representation of ‘now’. In this case, the evenly spaced taps would be represented as a unit of movement occurring at that moment. It makes good sense that the brain would interpret common patterns of movement holistically rather than represent each microsecond separately. Temporal clumping, like spatial clumping, would be an efficient short-cut in figuring out what sort of event/object is out there as well as in anticipating what sorts of changes are likely to follow. According to Richard Warren in his commentary on the Dennett and Kinsbourne article, the most evolutionarily plausible hypothesis about the cutaneous rabbit stimuli is that a single agent produced all the wrist taps, and this hypothesis would result in the representation of an evenly spaced sequence of taps. “This agent cannot readily jump abruptly from one of the three stimulated positions to the next within a single inter-tap interval (which ranged from 50 to 200 ms).” (Warren 1992, 231) Warren argues that a series of brief events such as the wrist tap series is likely represented as a ‘temporal compound.’
A second Stalinesque possibility is that an evenly spaced sequence of conscious tap representations is indeed produced. Though a sequence is not necessarily represented by a sequence, sometimes it is best represented as such. Dennett and Kinsbourne admit as much when they say: “If someone thinks the thought, ‘One, two, three, four, five,’ his thinking ‘one’ occurs before his thinking ‘two’ and so forth. The example does illustrate a thesis that is true in general and does indeed seem unexceptioned, so long as we restrict our attention to psychological phenomena of ‘ordinary,’ macroscopic duration.” (Dennett and Kinsbourne 1992, 200) Because of the mistakes we make in representing temporal sequences of very short durations, it is unclear exactly how those sequences are represented. If the Stalinesque interpretation of the cutaneous rabbit experiment is true, then a conscious representation of evenly spaced taps could be represented by either a single representation of sequence or a sequence of representations. The point is that neither Stalinesque nor Orwellian interpretations of the cutaneous rabbit experiment are committed to any form of vehicle/content confusion. We do not yet know how the rabbit stimuli are consciously represented, but we should not assume that the absence of knowledge entails the absence of a matter of fact.
Though we do not know how conscious sensory states represent the wrist taps, and at the moment have no clue how we might find out, my claim is that the answer can be settled by determining which sensory representations are conscious. What may be unsettling about this claim is the prospect that we could be mistaken about the content of our own conscious sensory states. If, for example, the Orwellian description of the cutaneous rabbit phenomenon is true, then we have a veridical conscious representation of separate sets of taps at wrist, elbow and shoulder. Then, milliseconds later, we report our conscious representation as having been a sequence of evenly spaced taps. How could we possibly be so wrong about our own conscious sensory states so soon after they have occurred? Unsettling as this prospect may be, we should not be surprised that even this aspect of the much-maligned notion of introspective infallibility should prove problematic. Because there is no necessary connection between the content of conscious sensory states and reports about that content, reports about one’s own conscious sensory states, like any other reports, are fallible. Therefore, we cannot rule out the possibility that the Orwellian interpretation of our conscious sensory representations might be true, as disturbing as this possibility might seem.
In happier news, the proposal that conscious sensory states are coordinated representations of the present moment provides support for one of our other long-held ideas about sensory consciousness: the notion of a stream of consciousness. From the subjective point of view there seems to be a stream of conscious sensory states flowing from one moment to the next. This stream may be relatively coherent, as when focusing on a task, or it may contain a random collection of junk, as when one has no aim in particular. Whatever the content, each moment of sensory consciousness seems to form a unified collection, conjoined with moments preceding and following. The description of conscious sensory states as coordinated representations of the present moment dissolves the apparent tension between distributed, parallel brain processing and the unified representation of the world by conscious sensory states. Conscious sensory states represent the world as unified, even though many brain processes, perhaps even including the processes that compose conscious sensory states, are distributed and parallel.
Let me expand a bit here, as the proposal of unified sensory representations may lead one to think that these representations must come together at some single place in the brain. The coordination of representations into conscious sensory states is determined by what things are represented as occurring now, and there is no reason to require that this coordination occurs at a single neurological locus. Quite the contrary, in fact. On the present hypothesis, conscious sensory states are composed of unconscious sensory representations that are coordinated by some means so as to represent the present moment. If such a coordination were a matter of physically reproducing each sensory representation into one large, master conscious representation at a single spot in the brain, then one would expect there to be a single spot that housed all and only conscious sensory representations. No such spot exists. Various sorts of brain damage result in various sorts of deficits in conscious representation, but there is no form of brain damage that wipes out all conscious sensory representation while preserving other mental functions. Alternatively, there may be one master representation for each conscious sensory state, but the master representations occur at different points in the brain. One conscious state might be located in the anterior cingulate gyrus, for example, and another in the superior temporal sulcus. Such a proposal seems biologically arbitrary, however, with conscious sensory states popping up all over the brain like popcorn.
Pragmatic reasons, moreover, militate against any form of the single spot hypothesis. Reproducing sensory representations into one master conscious sensory representation is redundant. If there are already representations lying around in the brain, why not just use them? In addition to the waste of redundancy, the extra step of reproduction takes time. Simply using existing representations without reproducing them into a master representation is already a time-consuming operation. To go to the additional trouble of reproducing representations would be startlingly inefficient in the split-second world of conscious sensory representation. It would be more effective for the brain to simply make use of existing representations wherever they are located, which, by current accounts, are widely distributed across the cortex. So we can set aside another of Dennett’s worries – that coordination of representation will necessitate a single place where the representations ‘all come together’. (Dennett 1991a, 107) Representations do come together in the sense that their content is coordinated. But this coordination need not occur at a single place.
Now that we have addressed some of the concerns attendant to the idea that representations can ‘come together’, we need to ask why representations would come together into a coordinated sensory representation of the present moment. The suggestion that conscious sensory states integrate information from various separate representational systems is not uncommon. Bernard Baars has proposed that conscious states coordinate the specialized work done by multiple, independently functioning sub-processors so as to initiate coherent action. (Baars 1993, 1997a; Baars and Fehling 1992, 204) Robert Van Gulick also opts for an integrationist view, suggesting that the informational structure of conscious sensory states likely requires “the simultaneous interaction of many brain regions and representational systems”. (Van Gulick 1992, 229) My own integrationist spin is that conscious sensory states coordinate representations so as to inform the creature’s overall plan of action. Sensory states need to be coordinated into conscious sensory states when a creature acquires the capacity for decision-making. As Bruce Bridgeman notes, the ability to decide between two alternative actions is the point at which a creature needs an internal system for making and executing a plan of action.
An organism that merely reacts to sensory information has no need for consciousness – it simply does what the environment demands and its psychology is one giant transfer function. As soon as more than one plan drives behavior, however, there must be an internal rather than external trigger for action. Along with this must come a planning mechanism that makes plans, stores them in memory and keeps track of where the execution of each one stands. (Bridgeman 1992, 207-208)
The function of conscious sensory states, in my view, is to serve as part of the planning mechanism Bridgeman describes. Conscious sensory states are a representation of the present moment, formed in order to keep track of the relation between the organism’s actions and environmental conditions. Conscious sensory states represent current conditions in order to make sure there is no new danger or opportunity on the horizon. If there is, then a change in plans may be in order to instigate new actions, like avoiding that puddle up ahead or buying that delicious donut in the window. If a sensory state does not carry information required for this sort of plan assessment, then it need not be incorporated into the representation of the present moment. Only those sensory states most likely to figure in the ongoing decision-making procedure of the creature are selected by a second sense.
Thus, innate and habituated sensory-response patterns usually remain unconscious, despite the fact that they are instrumental in overall response effectiveness. If there is a reason to direct attention to these otherwise unconscious sensory states, then they can become conscious. But so long as response follows automatically from sensation, there is no need to incorporate the sensation into one’s representation of the world at the present moment. Sensory states all occur in the present moment, but they may or may not be incorporated into a representation of the present moment. The immediacy of representation is one of the ways sensation differs from cognition: sensation does not represent in the absence of the stimulus. But there is a difference between now representing x and representing x as now. Though sensory states represent features currently present, those features need not be included in the conscious sensory state that forms a representation of the present. Habitual stimulus-response patterns, for example, need not be included in a representation of the present because they are not required for planning future action. When the stimulus occurs, it generates an appropriate response automatically, without the need for conscious processing. One reason very strong stimuli are processed consciously, in addition to generating an automatic response, is their likely importance for future plans. Though touching something very hot results in immediate withdrawal, the pain is nonetheless conscious. Because pains of this sort are quite important to planning one’s next action (getting a hot pad, moving further away from the stove), the sensation is coordinated into a representation of the present moment.
Because conscious processing takes time, its value probably lies in long-term direction, rather than immediate action or even the inhibition of action. As anyone who has ever tried to stop herself in the middle of making a social gaffe knows, once an action is begun, it is nearly impossible to restrain. When lucky, there is just time enough to plan an appropriate apology. We keep track of the present moment largely in order to plan for the ones that follow. For immediate action we tend to rely on our previous plans, those informed by previous conscious sensory states and those hard-wired by evolution. Let’s not forget that we are speaking on the time-scale of seconds here. Although my conscious sensory representation of ‘now’ may not direct my actions at the same moment represented by ‘now’, the representation of ‘now’ influences my actions in the next second. Such an influence can be considerable, and it usually is.
III. SO WHAT GOOD IS A SECOND SENSE?
If you found Section II as persuasive as I hope, you may be tempted to stop now. We have a good operational definition of conscious sensory states: coordinated sensory representations of what the world is like at the present moment. The difference between conscious sensory states and unconscious sensory states is that conscious sensory states represent what the world is like at the present moment, while unconscious sensory states are not included in these representations. So what does a second sense add to the theory?
One critical question remains in explaining the difference between conscious and unconscious sensory states: how is it that some sensory states are conscious and others are unconscious? We now have an idea about what constitutes the difference between conscious and unconscious sensory states and about why there exists such a difference. What remains is to show how some sensory states rather than others come to be conscious, for if we fail to address this additional problem, then we have failed to explain sensory consciousness and have merely described it. An adequate description would be no small achievement in itself, given the disagreements and difficulty in investigating the topic. Yet an explanation of how conscious sensory states come about is necessary to dispel the sense that they are inherently mysterious, somehow beyond our full comprehension. My efforts up to this point have been to identify a phenomenon to be explained, sensory consciousness, and to offer a theoretical identification of that phenomenon. The next step is to explain how conscious sensory states come about.
I suggest that a second sense causes some sensory states to be conscious. Like higher-order inner sense theorists, I maintain that some kind of sensory mechanism makes sensory states conscious. Decidedly unlike the higher-order inner sense theory, the second sense produces conscious sensory states that represent the world. An inner sense, according to the higher-order theory, produces representations about sensory states. By virtue of these higher-order representations, sensory states become conscious. At issue here is exactly what we are conscious of when we have conscious states. On the higher-order view, we are conscious of our sensory states. On my view, which is a first-order or flat theory, we are conscious of the world.
For both types of theory, conscious states represent the world. The difference lies in how states become conscious. Higher-order theories take conscious states to be those mental states that are objects of some special kind of higher-order representation. Flat theories, on the other hand, take conscious states to be a special kind of representation of the world. One of the goals of Section II was to isolate the features of conscious sensory states that make them special: they are coordinated representations of the world at the present moment. In this Section we will see how such representations might come about.
While the following offers a causal explanation, in this case describing the cause of conscious sensory states is still part of the project of describing what conscious sensory states consist in. I have said that conscious sensory states are coordinated representations of the present moment. Now I will argue that what it means to be a ‘coordinated representation’ is to be selected by a second sense. In other words, to be a conscious sensory state requires that the state represent the present moment and that it consist of sensory representations coordinated by a second sense.
To be a ‘coordinated’ representation implies some sort of organization. There is good reason to believe that the sort of coordination involved in sensory consciousness is task-related. Representations are organized so as to most effectively complete whatever task is at hand. To accomplish this organization, then, some sort of mechanism is required to select and combine the representations needed for the task. I take the second sense to be this mechanism. Sensory states are conscious when selected by the second sense and coordinated into a representation of the present moment. Sensory states not selected remain unconscious.
The importance of organization to sensory consciousness is often overlooked in our rapture over its phenomenal aspect. Sensory consciousness is closely related to attention in that we can, to some extent, control the content of conscious sensory states. By a shift of attention some things rather than others become the content of our conscious sensory states. Though not normally state conscious of my body position, I can become state conscious of how I am sitting, the tilt of my head, the seat pressing against my legs, etc. by focusing on these various body positions. Similarly, I can shift my attention from the computer screen to the hum coming from the next room, thereby becoming state conscious of the refrigerator sound. Thus, previously unconscious sensory representations – of body position, of background sounds – become conscious through purposive shifts of attention. The ability to direct attention so as to be state conscious of some things rather than others is the function of a second sense.
If conscious sensory states are coordinated sensory representations that facilitate effective action in the world, there must be some way to control which representations are selected. In many situations, some sorts of sensory representations are clearly irrelevant or downright distracting. Without some way to select some sensory representations and eliminate others, sensory consciousness would be overloaded with a random mix of sensory content. While such a psychedelic panoply of sensation could very well be conscious in some sense, it is certainly pathological.
Think again of the computer processing example. While working away at my computer, I had many sorts of sensory representations that remained unconscious. Some examples are my representations of body position and the subtle motor adjustments required to wiggle about without falling off my chair. These representations were peripheral to my current task and so remained unconscious. Which sensory representations are conscious and which remain unconscious is task-relative to a large extent, so there must be some way to select which sensory representations need to be coordinated into a conscious sensory state in order to accomplish the task at hand. In order to be successful in my work, I need a mechanism that selects and coordinates appropriate sensory representations. Without this sort of control over the content of conscious sensory states, the value of sensory consciousness in facilitating effective response would be lost. For conscious sensory states to be effective, the second sense must be able to select appropriate sensory representations to coordinate into conscious sensory states.
Of course, the content of conscious sensory states is not always under our control – nagging tunes and disturbing images are familiar examples of uninvited conscious contents. Such cases raise the question of who controls a second sense mechanism. While it is possible for us as conscious creatures to exercise a great deal of control over the contents of our conscious sensory states, it is useful to keep in mind that sensory consciousness is, evolutionarily, a relatively basic phenomenon. As Bridgeman suggests, it probably arrives with the ability to decide among possible actions. (Bridgeman 1992, 207-208) On the proposed hypothesis, conscious sensory states are coordinated sensory representations of the present moment so as to facilitate effective action. Often ‘effective action’ will simply be whatever fulfills the creature’s current goals and desires, and so the content of conscious sensory states will reflect those purposes. At other times, however, the urgency of the stimulus may override the creature’s other wishes, as when one feels severe pain or hears a sudden, loud sound. The evolutionary advantage of immediate conscious appreciation of these sorts of sensory stimuli is fairly clear. The value of a nagging tune or disturbing image is harder to explain. But such intrusions may exploit the usefulness of other sorts of representations that have more obvious survival value. Nagging tunes tend to nag because they have been repeated so often, and repetition is a useful way to sort between transitory things and things that recur. Recurring things are good to remember because they can serve as markers, both for positive and negative features of the environment. Likewise, really shocking things are good to remember – the shock is usually quite pleasant or unpleasant – so it is handy to recognize them easily so as to pursue or avoid them.
Whatever the final explanation, we should consider the control function of a second sense in larger terms than the satisfaction of a particular creature’s current goals and desires. The ‘effective action’ which forms the function of conscious sensory states is primarily a matter of evolutionary effectiveness, and is only secondarily directed toward individual goals.
It is also worth pointing out that a second sense is in this way similar to the external senses. I can send my eyes or limbs on a specific mission to acquire information about one or another aspect of the current environment, yet I cannot fail to see what is directly in front of my eyes or feel what my limbs are touching. I may fail to see that it is an apple in front of me or feel that the substance in the bowl is macaroni, but I cannot fail to sense the stimuli that impact my sensory organs. In this way, the senses are passive. Yet they are also active and can be to some extent directed, with sight and touch the most amenable to direction, smell and taste the least directable, and hearing somewhere in between. I can hear some things better by turning my head, or cupping an ear, but by and large I will hear whatever sounds are available. A second sense would likely fall in line with sight and touch on this continuum. When deeply focused on a task, I am able to eliminate almost all distractions. Yet when first waking in the morning, I am bombarded by sensations with little ability for selection or control.
Given the analogy here between external senses and second sense, this is a good point to consider the reasons to call this mechanism a ‘sense.’ In what ways is a second sense similar to the external senses, and in what ways is it different? I see three features as characteristic, if not strictly necessary, of a ‘sense’: 1. it is non-cognitive; 2. it serves a relay function; and 3. it has particular forms of inputs and outputs. All three features apply to a second sense.
First, the mechanism in question is non-cognitive: sensation requires no concepts. The minimum requirement for having a concept of a thing is the ability to individuate it, to separate it from its surround. Since it is possible to sense an object without being able to isolate clear borders between that object and its neighbors, sensation does not meet this requirement. The most I may be able to say about the content of my sensations is some vague gesture, like ‘stuff over there.’ Nor does sensation require the ability to identify an object or to re-identify it upon repeated presentation. Further, sensory representation relies on tracking, keeping an object in view (or hearing, or touch). Once the object is out of sight, sensory representation ceases. On the other hand, one critical marker of conceptual representation is the ability to represent an object in absence. Thinking about a cup does not require that there be a cup in front of me, but seeing a cup does. In line with this difference, sensory representation is more detailed than conceptual representation. Concepts abstract across differences in particular presentations of an object so as to isolate the key features that will secure re-identification. Because sensations maintain the object in view, abstraction of detail is not required. This brief examination gives us three markers (not to be confused with necessary and/or sufficient conditions) for the distinction between sensation and cognition: sensation is non-conceptual, involves tracking, and is detailed, whereas cognition is conceptual, represents in absence and abstracts from detailed presentations.
The second sense exhibits all three of these markers for sensation but differs from external sensation in one important respect. For external senses, the causal source of sensory input is the same as the object represented in sensation. If I have a sensory representation of an apple, provided the representation is veridical, the causal source of that representation is an apple. I track the apple with my eyes, and the detail represented is about the apple. For the second sense, however, causal source and object represented come apart. Sensory states are the causal source of conscious sensory representations, but they are not the object represented, on my account. So if I have a conscious sensory representation of an apple, the causal source of that representation is a sensory representation of the apple. While it is the sensory representations that I track with my second sense – these are what get selected and combined into conscious states – the detail represented by conscious sensory states is about the apple. This separation between causal source and object represented forms the primary difference between external senses and second sense and will figure in the remaining two points as well.
The second reason to call the second sense a ‘sense’ is that it is a relay mechanism. External senses take various forms of physical stimuli as input and relay this information in the form of sensory representations to cognitive structures, motor systems, and, per hypothesis, to the second sense. Similarly, the second sense takes the information supplied by the external senses and relays it in the form of conscious sensory representations to cognitive structures, motor systems and, in all likelihood, back to the external senses. Being a relay mechanism alone is clearly not sufficient to qualify anything as a ‘sense’, since cognitive structures relay information and are non-sensory (as are other relay mechanisms like telephone wires, fiber-optic cables, etc.). It is the sort of information relayed – non-cognitive mental representations – that distinguishes a sense.
Third, as the flip side of the previous point, the second sense has specific forms of input and output. Its input is limited to sensory representations; this is the only form of input to which a second sense is sensitive. Where external senses take physical stimuli as input and produce sensory representations as output, the second sense takes sensory representations as input and produces conscious sensory representations as output. The output is specific as well. Each external sense produces its own variety of representation, which represents the object in a specific way: eyes produce visual representations, ears produce auditory representations, and so on. The second sense produces conscious sensory representations, coordinated sensory representations of the world at the present moment. Because conscious sensory representations combine features from several sensory modalities, they are not unique in the kind of feature they represent. They are unique in the way they represent those features, as coordinated sensory representations of the world at the present moment. No other representations represent in just this way.
There are no doubt other differences between external senses and the second sense, but the similarities are sufficient at least to say that the operations involved are more like sensation than cognition. The ability to control the content of conscious sensory states suggests that there is a mechanism of some sort dedicated to producing them. This mechanism does not require concepts, relays a specific sort of information (sensory representation), and produces a specific sort of representation (conscious sensory representation). For these reasons, it seems justifiable to call the second sense a ‘sense’.
IV. SOME TOUGH QUESTIONS
With the main elements of the theory in view, I see four serious questions about the second sense theory that should be answered immediately, as they address central claims about the necessary conditions for conscious sensory states. The first question deals with the number of sensory states necessary for coordination, specifically whether more than one sensory state is required. The second concerns the degree of coordination necessary for sensory consciousness. The third raises the complicated issue of how a second sense coordinates sensory states, and the possibility that the mechanism cannot both be flat and fulfill the coordination function required. The fourth is whether the second sense is some kind of homunculus, or ‘little person’ in the head, that merely pushes the problems of sensory consciousness back a level.
First, we can ask how many sensory states must be coordinated to constitute sensory consciousness. I suggest that at least two sensory states are required. Semantically, it makes no sense to speak of ‘coordinating’ a single state. So to admit conscious sensory states composed of a single state would mean abandoning the idea that conscious sensory states are coordinated representations. And if we abandon that idea, then we can no longer claim that the function of sensory consciousness is to coordinate sensory representations so as to facilitate effective action. Creatures that are unable to make decisions probably do not have conscious sensory states; they simply react to sensory information according to set patterns. It is when several different actions become possible that conscious sensory states are required to keep track of current environmental conditions. Conscious sensory states inform the creature about how the world is ‘now’, what effects previous actions have had and whether any new prospects or dangers are afoot. To admit that singular sensory states can be conscious is to give up this explanation of the function of sensory consciousness.
Furthermore, conscious sensory states seem phenomenologically to represent multiple features rather than simple features such as the color white or the sound of middle C. While there is nothing logically impossible about the idea of a conscious sensory state constituted by the sensory representation of a simple feature, such an idea stretches the imagination. Normally our conscious sensory states are cluttered with so many varieties of sensory representations that it is difficult to imagine what sensory consciousness would be like if it represented one feature exclusively. If that feature were the color white, then the display would have to be absolutely homogeneous, with no wrinkles or shadings. For if these occurred, then arguably two features – two shades of white – are represented, not one. All of a person’s proprioceptive receptors would also need to be stilled, since in the absence of other forms of stimulation, it is likely that these normally unconscious sensory states would become the focus of second sensory operations. At this point, without the normal array of sensory representations and without any proprioceptive feedback, the form of consciousness exhibited by a single conscious sensory state would be significantly different from the sort of sensory consciousness we enjoy.
Less hypothetically, it may be the case that mystics can achieve states constituted by a simple sensory representation. The practice of deep meditation may allow the mystic to focus so narrowly as to exclude all sensory states but one. Again, though, this form of consciousness is significantly different from the normal sort of sensory consciousness, and I suspect the mystic would agree. If such a form of consciousness exists, its function is likely quite different from the function of sensory consciousness. For the mystic, a lifetime of training and practice is necessary to achieve the rarefied mental states of mystical consciousness. Mystical abilities are not naturally occurring, evolutionarily endowed talents. Consequently, such cases should not be included among the standard examples to be accounted for by a theory of sensory consciousness. Once we have agreement about common cases like long-distance driving and Spot-sight, we can tackle more unusual phenomena such as mystical states and the possibility of singleton conscious sensory states.
Of course, the representation of complexity does not imply complex representations. A single sensory state could represent many features without being a coordination of many representations, each of which represents one feature. To assume otherwise would be to infer from content to vehicle without argument. So here is an argument. At minimum, sensory representation is individuated by sensory organ: the eyes produce visual representations, the ears produce auditory representations, and so on. Yet conscious sensory states represent a variety of sensory features as occurring together at a moment in time. So a conscious sensory state that represents two kinds of feature, say color and sound, must be composed of two sensory representations. On the other hand, as has been noted, sensory representations can be coordinated and yet remain unconscious. So the fact that conscious sensory states represent complex features does not require that they themselves have a composite representational structure. The necessary coordination processes could take place prior to the production of a conscious state, and that single, already-coordinated mental state could then become conscious. However, this suggestion also leaves us without a function for sensory consciousness. Why do we have conscious sensory states if all the coordination work has been previously accomplished? What benefit does sensory consciousness bring to a creature already endowed with coordinated representations?
As for the second question, I have said that conscious sensory states are coordinated sensory representations of the present moment. Coordination is a matter of degree, so how much coordination is required? How well integrated must the sensory states be in order to count as conscious? What is the minimum amount of coordination necessary to constitute sensory consciousness? It seems to me that degree of coordination parallels degree of sensory consciousness, such that a little coordination would constitute a little sensory consciousness. Those in-between states as we drift off to sleep, characterized by a confused, random mix of sensations, are conscious sensory states on the edge of unconsciousness. Such loosely coordinated states facilitate only the most general forms of action, like changing position or pulling a blanket around your head. On the other end of the continuum, the most well-coordinated states involve highly focused attention on a very specific task, such as the jeweler requires when repairing the intricate inner workings of an antique watch.
The third question raises an altogether new issue: how does the second sense perform its coordination function? Specifically, how does the second sense choose which sensory representations to combine into a conscious sensory state? If the second sense coordinates sensory representations from several sources, there must be some way for it to identify the representations in order to get the combinations right. Suppose I am currently representing a variety of colors, sounds and shapes, all of which get coordinated into my conscious sensory representation of a train passing. The question is, how does the second sense select from all of the available sensory representations to produce this useful, albeit disappointing, conscious sensory representation of the train passing? How does the second sense manage to choose just the sensory representations of the train at this moment? It would seem that the second sense must first identify what the sensory representations are about and when they occurred in order to select the right ones. But identifying a sensory representation requires representing it. To identify a sensory representation as a representation of this particular train, the second sense would have to have some way to represent that it is a representation about the train and that it represents some particular moment. Thus, it would seem that the second sense must produce higher-order representations, representations about its sensory representations, in order to accomplish its coordination function. If so, the second sense cannot be flat after all; it must be higher-order.
While identifying one’s representations by content is certainly one way to coordinate them, it does not seem to be the only way. Nature is filled with marvelous devices for the selection and combination of things that do not rely on the ability to identify them. In photosynthesis, for example, plant cells take in water from the soil and carbon dioxide from the air and use the energy of sunlight to produce the carbohydrates needed for the plant’s growth and development. No one supposes photosynthesis involves representations, so it cannot be a matter of sorting on the basis of representational content. Therefore, some other mechanism of selection and combination must be involved. Perhaps the second sense operates in the same sort of way. Or consider another familiar selection and combination mechanism, the computer. Computers are selection and combination wizards, yet they are the foremost example of devices incapable of appreciating content. I see no reason why the second sense could not function like one of these non-representational devices. If so, it must be the case that features of the content of representations are somehow encoded in their vehicles. The vehicle of representation must wear its content on its sleeve at least to an extent that would allow it to be sorted by the kind of ‘stupid’ mechanism I propose. I have no specific suggestions about the sort of coding system used by a second sense to select and combine representations, but there seems to be no principled objection to such a system.
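A toy computational sketch may make the ‘stupid’ sorting proposal concrete. Suppose, purely for illustration, that each vehicle carries tags such as its modality, a timestamp and a rough spatial index (these tags are hypothetical stand-ins, not claims about actual neural coding); a selector could then bundle representations by tag alone, without ever reading their content:

```python
from dataclasses import dataclass

@dataclass
class SensoryRep:
    # Vehicle-level tags: properties of the representation itself,
    # available without interpreting what it represents.
    modality: str      # which sense organ produced it
    timestamp: float   # when it was produced
    location: str      # rough spatial index
    # The representation's content -- never inspected below.
    content: object

def coordinate(reps, now, window=0.1):
    """Bundle representations by vehicle tags alone.

    Groups every representation stamped within `window` seconds of
    `now` by its spatial tag. The `content` field is never examined,
    so the selector remains 'stupid' in the relevant sense.
    """
    bundles = {}
    for r in reps:
        if abs(r.timestamp - now) <= window:
            bundles.setdefault(r.location, []).append(r)
    return bundles
```

The point of the sketch is only that selection and combination can be driven entirely by properties worn on the vehicle’s sleeve; nothing in the mechanism identifies what any representation is about.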
Moreover, even if something like a higher-order representation were required to coordinate sensory representations, the resulting theory would still differ significantly from existing higher-order theories. For the sake of argument, let us say a second sense does indeed require higher-order representations in order to perform its coordination function. Note that the conscious sensory states which are the output of this new hybrid sense still represent the world; they are the result of coordinating sensory representations about the world. On this hybrid sense theory, as in my second sense theory, conscious sensory states are coordinated sensory representations of the world at the present moment. But unlike the flat theory, the hybrid theory utilizes higher-order representations in order to produce these flat representations about the world. By contrast, higher-order theories claim that a sensory state becomes conscious when represented by a higher-order state. In the higher-order view, the representation relation itself constitutes sensory consciousness. There is no new coordinated representation of the world, according to higher-order theorists, only the original sensory representations now enjoying the dubious honor of being represented.
The hybrid theory seems to be an unwieldy combination of flat and higher-order theories and so I will not consider it further. I suggest it only to show that, even if we admit the need for higher-order representation in producing coordinated sensory representations, this is not in itself an argument for any higher-order theory as it currently stands. Instead of exploring a hybrid theory, we should think more about how a truly flat second sense might work. It is an engineering problem: given these sorts of input, how could a device produce these sorts of output? I wish I had a ready answer, but I see no reason to believe an answer would be impossible.
Finally, the fourth question: is the second sense some kind of homunculus? Doesn’t the selection process of the second sense require that there be a ‘little person’ who ‘sees’ the representations and thereby chooses which ones to collect into a conscious sensory state? This sort of homunculus would indeed be problematic, for then we would need to explain how our internal homunculus ‘sees’ the representations and chooses them. We would have the same problem of sensory consciousness, once removed. As Dennett notes, the key to avoiding the problematic sort of homunculus, call it the Cartesian homunculus, is to ensure the mechanisms involved are “relatively stupid functionaries.” (Dennett 1991a, 14) As I have suggested in my response to the third question above, the second sense may operate in a way similar to the ‘relatively stupid’ functional operations of photosynthesis and computer processing. There are no infinite regress worries concerning these sorts of selection/combination mechanisms, so to the extent that the second sense runs on similar principles, it is spared such worries as well.
Now, if we cannot find a purely flat engineering solution to the selection question, the second sense will necessarily be a much smarter sort of homunculus. On the hybrid view, the second sense must produce higher-order representations of sensory representations in order to determine which representations to select. Yet even here the mechanism can be sufficiently stupid to avoid the regress of the Cartesian homunculus. Lycan conceives of a higher-order sensor as “an internal scanner or monitor that outputs second-order representations of first-order psychological states.” (Lycan 1996, 31) There need not be any ‘executive’ scanner, and scanners can be directed at “representational subsystems and stages of same.” (Lycan 1996, 32) The result is a decentralized model of organization that Dennett has characterized as ‘pandemonium,’ where specialist mechanisms compete for control of mental processes. (Dennett 1991a, 239) Lycan approvingly describes Dennett’s “’Joycean’ machines that formulate synthesized reports of our own passing states.” (Lycan 1996, 31) Though I believe the content of conscious sensory states is more coherent than the pandemonium model suggests, it is clear that neither the flat second sense nor the hybrid sense requires a Cartesian homunculus to do its job.
There you have the complete account of the second sense theory of sensory consciousness. I have argued that conscious sensory states are coordinated sensory representations of the world at the present moment. A second sense is the mechanism that selects sensory representations and coordinates them into conscious sensory states. Therefore, the operations of the second sense are necessary to determine which sensory states are conscious and which are not. This selection function explains the way a creature can (to some extent) control the content of her conscious sensory states. Though second sensing is not like external sensing in every respect, there are sufficient similarities to call this mechanism a kind of sense. It is non-cognitive, relays information, and takes a particular form of input and output.
As this is a new theory, there are no doubt scores of objections and refinements that lie ahead. Whatever becomes of the details, I hope two central elements of the theory find fertile ground. 1. In keeping with contemporary moves toward externalism, I claim that conscious sensory states represent the world. Though we are capable of representing our own states, sensory consciousness is not a matter of higher-order representation. 2. Nonetheless, I do not deny that something importantly different occurs when mental states are conscious as opposed to unconscious. Conscious states are a special sort of representation: a coordinated representation of the present moment, produced by a second sense. In my view, these two elements of the second sense theory are the key to a satisfying explanation of sensory consciousness.
 Some may object to the notion that thoughts are felt in any way at all. No matter. This initial canvass of usage will be whittled and tailored to focus on a particular form of consciousness for examination, one that does not include thoughts.
 I do not intend the theory to apply to conscious thoughts. The two main reasons for this restriction are: 1) Conscious sensory states are arguably more basic than conscious thoughts. Indeed, I will argue that there can be, logically if not practically, conscious sensory states without any concepts whatsoever. 2) As the discussion of ‘qualia’ suggests, sensory states have been the primary focus of discussion about consciousness generally, so a theory of consciousness should account for at least this sort of mental state.
 As I understand the most common version, the inverted spectrum problem is the possibility that two people could have identical functional systems yet experience different qualia. Whether or not an inverted spectrum is possible, the answer is to be found in a different kind of investigation than the one I am conducting. Whether someone could say they see red (and instantiate all of the functional roles of ‘red’, however we define ‘functional roles’) when in fact they experience green is a separate question from the question of whether they are experiencing at all. It is the latter question that concerns me here.
 On another construal, reddish and greenish are essentially conscious and so are not so easily separated from the explanation of conscious sensory states. But such a view arbitrarily separates factors in common between unconscious and conscious sensory states. For a full argument against identifying sensory qualities and conscious sensory states, see Rosenthal 1991a.
 See the expanding literature on perception, especially color perception.
 If you object to the language of representation here, you may substitute whatever form of non-representational language is preferable. Here, for example, one could say ‘when one has red or green visual sensation’ or ‘when one is seeing red or green.’
 Some may argue that there is ‘nothing it is like’ to have conscious thoughts and so this is a misleading characterization of an unconscious state. I believe there is something it is like to have a conscious thought, but this is not the place to argue the point. I have limited my discussion to conscious sensory states, and most agree there is ‘something it is like’ to have a conscious sensory state. Nonetheless, on occasion I will discuss conscious and unconscious thoughts as they arise in other theories.
 Some philosophers have chosen this route. David Chalmers, for example, takes consciousness to be a primitive property on a par with the most basic physical properties. (Chalmers 1996)
 I assume here that sensory states are or have the function of being representational. Thinking of sensory states as representational helps clarify the distinction I want to make between conscious sensory states (sensory states that represent external objects) and self-conscious sensory states (sensory states that represent internal objects, viz. mental states). ‘External’ means ‘external to the mind.’ So, even a seemingly purely internal event, such as a wave of nausea, could represent an external object, viz. one’s upset stomach. Similarly, pains and itches arguably represent the state of the body where they occur. For more on the idea that sensations are representational, see Michael Tye (1992, 1995, 1998). Though I am partial to viewing sensations as representations, it is worth noting that nothing in the following theory requires that sensory states be wholly or even partly representational. Later I will consider the possibility that representation may be the function of conscious sensory states, which would be an advantage of considering sensory states to be representational. It is in answering questions of function that non-representationalists are on particularly shaky ground, and this is one of the strongest reasons in favor of a representationalist theory of sensory states as well as sensory consciousness.
 I am grateful to Crawford Elder for this example.
 Of course, one could call such confabulations ‘false memories’ in which case all true episodic memories indicate conscious states at the time of the event remembered. This would not be much help, however, since most episodic memories are distorted in some way. Moreover, it is impossible from a subjective point of view to determine which are the true and which are the false (parts of) memories.
 William Lycan (1996, 4) uses ‘what it’s like’ in a similar way to identify the target of his explanation.
 Some qualification is in order here. Self-conscious or introspective states, to be discussed shortly, can also be conscious, and these states represent internal objects, namely mental states. This sort of state is rare and, like conscious thoughts, raises problems of its own. In the name of expedience, therefore, I will put them aside. Further, Rosenthal does not think that sensations are representational (Rosenthal 1993c, 202) and so they do not represent external objects on his view. Nonetheless it is sensory and representational states that become conscious according to Rosenthal’s higher-order thought theory and representational states do represent external objects. A final concern is the possibility of conscious sensory states representing things that don't exist, internally or externally. My sentiments lie with the teleofunctionalist proposal that such states constitute malfunctions of a system designed to represent veridically. As far as I can see, though, any theory of intentional inexistents could apply here as well.
 Again, ‘the world’ includes states of one’s body.
 The title recalls Stalin’s show trials where false evidence was manufactured in order to convict political insurgents.
 In Orwell’s novel 1984 historians busily rewrite history to suit current governmental policy.
 Several commentators on Dennett and Kinsbourne’s article in Behavioral and Brain Sciences (1992) make this point. See especially Clark 207-8, Glymour et al. 209-10, Van Gulick 228-9.
 See Baars et al. 1998; Baars 1993, 131; Baars and Fehling 1992, 204. Cf. Lahav 1993 for a similar account of the distinction between unconscious and conscious processing.
 A non-representational description would be: The time represented is not just the time at which the sensations occur. Otherwise, the wrist taps would be represented as grouped into sets, since the sensations occur in sets.
 Dennett (1991a, 132) is confounded by this idea, calling it the ‘bizarre category of the objectively subjective’ or ‘how things seem to you even though they don’t seem that way to you.’ I do not find this category at all bizarre but think of it as simply another example that we know less about ourselves than we formerly believed.
 At this point one might wonder whether sensory consciousness is best described as a series of states or whether it is more like a single continuous process. The choice is not critical to the larger discussion, however. Like water in a stream, conscious sensory states flow continuously, periodically disrupted by sharp shifts in content and periods of unconsciousness. Nonetheless it is analytically useful to freeze the process so as to better examine the states that compose it. For this analytical benefit I will speak of sensory consciousness in terms of individual states.
 Not only has the moment represented as ‘now’ passed, but actions occurring ‘now’ were initiated even earlier.
 The appropriate sort of higher-order representation is a perceptual representation according to the inner sense theory; it is a conceptual representation according to the higher-order thought theory, the other popular higher-order view.
 Rosenthal (1993, 1997) uses ‘conscious of’ broadly to mean any mental state with an object, whether or not that state is a conscious state. I find this usage confusing and so have coined the locution ‘state conscious of’ to mark the restriction to conscious states with a direct object.
 Sensory consciousness is basic relative to such developments as language and other symbolic forms of representation.
 The direction of the external senses tends to parallel the direction of the second sense during a focused task, but the two can come apart in unfocused moments. Changes in head and body position often occur as part of routine behaviors for which no conscious attention is required.
 One could also individuate an object purely conceptually – by learning its name, for example – in ways that clearly do not involve perceptually isolating the borders of the thing. As I am looking at the distinction between conception and sensation here, only the requirements for sensation-based concepts are relevant to the point.
 One common objection to calling the inner sense a ‘sense’ is that there is no obvious end organ. But, as Armstrong notes, there is no one end organ for proprioception. (Armstrong 1984, 111) Lycan makes the additional point that there is no claim here that inner sensing is like external senses in every single respect. (Lycan 1996, 28)
 Searle (1980) presents the classic argument for this point.