Consciousness After Postmodernism

Ralph Ellis
Clark Atlanta University
[email protected]

Postmodernists have been suspicious of the term 'consciousness,' because it seems to suggest the existence of a separate ego-subject, standing over against an object which it 'represents,' and to neglect the sense in which this subject-object relation is an artificial creation of modernity (Globus 1994). The modernist notion of consciousness, which seems to presuppose such a bifurcated subject-object relation, has led to the need to choose between a mind-body dualism and its equally problematic alternative, reductionistic physicalism; it has encouraged naive-objectivist epistemologies such as empiricism and logical positivism; it leads to misunderstandings of the 'unconscious' and the role of unconscious hermeneutic contributions to the ways people experience reality; it exacerbates the problems of self-absorbed egoism, socio-political atomism, and the attendant unworkable contractarian approaches to political theory; and these are just a few of the worst problems that arguably can be blamed on the subject-object paradigm and the related notion of individual consciousness.

At the same time that postmodernism has shied away from discussions of consciousness, so have more traditionally oriented philosophers of mind and cognitive theorists, but for different reasons: Recent theorists have been obsessed with the computer metaphor and with an insistence on reducing 'the mental' to something scientifically (and 'physically') explainable. So the 'computational' model of mind, which viewed 'consciousness' as merely an epiphenomenon of unconscious computational processes in the brain, became a strange bedfellow to postmodernism. For a generation of traditionally oriented neurophilosophers and scientists, the attempt to understand those aspects of experiential systems such as human minds which are not analogous to computer functioning, or to partes extra partes mechanical systems, got swept under the rug.

This eschewal of consciousness-talk by both cognitive theorists and postmodernists was an uneasy partnership, however, because postmodernists were very skeptical of the 'mechanical' and 'physical' aspects of the non-conscious substrates emphasized by computationalists. The notion of consciousness was thus bound to re-emerge sooner or later, but in a new key -- no longer dominated by a mechanical worldview, with its unworkable forms of representationalism, atomistic individualism, naive empiricist epistemology, and consequently incommensurable languages describing the subjective and objective dimensions. This paper will describe what I see as a new approach to the philosophy of mind, reopening the question of consciousness, but without the reductionistic and atomistic baggage that modernism would have brought to it.

The new approach has arisen from a specific rejection of the old metaphysical dogmas that originally led to the occlusion of consciousness from philosophy and science: the assumption that subject and object are clearly distinct; that the reality which ultimately must explain mental functioning is at bottom an atomistic-reductionism; that representational conscious activities (thoughts and perceptions) are clearly distinguishable from non-representational ones (feelings and emotions); and, perhaps most important, that all reality is fundamentally reactive and passive rather than active -- i.e., that nothing does anything unless caused to do it by some external force acting on it, that there is no such thing as a pattern of activity which organizes its own substrata rather than the other way around. In short, for modernist metaphysics, there was no important or non-arbitrary distinction between non-living things and living ones (i.e., those which appropriate, rearrange, and reproduce the needed substrata in order to maintain a higher-order pattern of activity); yet the difference between conscious beings and non-conscious ones (e.g., computers) hinges crucially on this distinction.

I shall focus here on three anomalies that arise for modernist approaches to consciousness, necessitating a new approach to consciousness, not merely a deletion of the concepts of consciousness and subjectivity from the philosophical and scientific vocabulary:

(1) Consciousness is an enacting of rather than a passive reaction to the physical events which serve as its substratum; but neither is it the non-physical half of an ontological dualism.

(2) Mechanistic causes at the empirically observable level seem to underexplain consciousness because, as Chalmers (1995) points out, we can easily imagine any proposed empirically observable mechanism as occurring in the absence of consciousness.

(3) Mechanistic causes also seem to overexplain consciousness, in the sense that they provide necessary and sufficient physical antecedents for any given event, so that no causal power is left for consciousness; yet we know that conscious intentions do play a role in bringing about many movements of our bodies. Modernism's best attempt to avoid this anomaly was the thesis of psychophysical identity, which failed because it is impossible to know what a state of consciousness is like merely by knowing everything that can be known empirically about its underlying physical mechanisms.

1. The anomaly of the non-passivity of conscious attention: Newton's analysis of intentional actions.

In the modernist framework as applied to psychology and cognitive neuroscience, consciousness was supposed to be caused by, or to result from, something that happened in the brain. Perceptual consciousness, for example, was supposed to result from stimulation of the occipital lobe and V4 visual areas, which in turn resulted from stimulation of the nervous system by incoming sensory data (i.e., patterns of light). But this "appendage" theory, as Thomas Natsoulas (1994) has called it -- this notion that consciousness is a byproduct of a physical cause and effect mechanism (in which consciousness itself is an effect but does not act as one of the causes) -- has led to certain anomalies. For example, when the occipital lobe is activated by incoming visual data, there is no perceptual consciousness of the object until the parietal and frontal lobes are active (Farah 1989; Luria 1980; Posner 1980, 1990; Posner and Petersen 1990); yet the activation of the parietal and frontal lobes is not caused by the activity of the occipital lobe (Aurell 1989). We know this because of a curious but consistent finding: The parietal lobe, which is almost immediately adjacent to the occipital lobe, and which must be activated in order for a consciousness of visual images to be formed, does not become active until about a third of a second after the occipital lobe is activated when a novel stimulus is presented (Runeson 1974; Srebro 1985; McHugh and Bahill 1985). So the question arises as to why it would take a nerve impulse, which normally travels about 100 miles per hour, a third of a second to travel only a few millimeters. If the imaginative activity of the parietal lobe were really caused by the nerve impulse which travels to it from the occipital lobe, the impulse should be delivered virtually instantaneously. 
Whatever is happening during this third of a second that is also needed in order for consciousness of the object to occur cannot be caused by the passive receiving of the nerve impulse to the parietal lobe from the occipital lobe, which in turn receives it from the incoming stimulus. Thus it appears that the response is not caused by the stimulus.

This paradox arises only if we assume that the parietal lobe (which is active when we are conscious of visual images) can only be activated as a result of prior occipital activity, which in turn results from prior optic stimulation originating from the environment. But recent research shows that this is not the case. Instead, what happens is that, prior to occipital processing of the visual stimulus, the parietal lobe has already been activated by the frontal lobe, which in turn is activated by the midbrain, which is the focus of emotional-motivational activity triggered by thalamic arousal, but only if the stimulus is generally felt as possibly emotionally important for the organism's purposes (Luria 1980; Posner 1990; Damasio 1994). The needs of the organism as a whole must first motivate the asking of questions about what kinds of environmental stimuli might be important for the organism's purposes; at this point, the frontal lobe becomes active. As these questions are formulated with the help of the frontal lobe, the parietal lobe then begins to entertain vague images and/or concepts of the kinds of emotionally important objects that might be present in the environment. If and when this frontal-limbic-parietal activity, once having been developed, finds itself resonating with patterns of activity in the occipital lobe (which reflects sensory stimulation) -- only then does perceptual consciousness occur. The one-third-second delay does not result from any slowing of the incoming nerve impulse as it 'travels' from the occipital lobe to the parietal lobe. The parietal lobe (which is active when we are conscious of visual images) is not activated in response to the occipital lobe's activity at all. 
Instead, the organism must purposely activate the frontal and parietal lobes to 'look for' emotionally important categories of objects which the thalamus has already alerted the organism might be relevant, and this 'looking for' activity has already begun the forming of visual or conceptual imagery (including proprioceptive and sensorimotor imagery) prior to any occipital activity's having any effect on our perceptual consciousness (since at this point the impulse has not yet 'traveled' from the occipital to the parietal lobe). Rather than the frontal-parietal system's being a response to an occipital stimulus, the frontal-parietal activation must already have taken place before perceptual consciousness is possible, and the frontal-parietal pattern is what determines whether any given perceptual input will even register in consciousness, i.e., will be attended to.

The organism must act on its environment in order to be conscious of it; consciousness cannot result from a mere passive reaction to incoming input. Thus the model of the mind as a passive receiver of causal work done by stimulus inputs and other mechanical computations places the cart before the horse. The organism must first purposely act, and only then can consciousness of the environment result. It is this fundamental shift in the direction of causation which is now sometimes referred to as the 'enactive' view of the mind -- a term coined by Varela et al. (1991). Rather than a stimulus' causing a response, it is the response which must occur first, and then act on the incoming afferent signals to produce a stimulus. We might call this enactive approach the current 'Copernican revolution' in cognitive theory and neuroscience.

Perhaps the clearest and most thoroughgoing expression of this kind of theory has been developed by Natika Newton (1982, 1989, 1991, 1992, 1993, 1996). According to Newton, every perceptual consciousness is preceded by an act of imagination, which creates a subjunctive based on a motivated act of action planning. The action motivated by the action planning process creates certain 'expectations' as to environmental feedback, and these expectations, whether fulfilled or not, constitute mental images of a subjunctive nature. Then, if the expectations are fulfilled as expected, the result is not a mere mental image (of a non-present object or state of affairs), but a perceptual image of an object or state of affairs as actually present in the environment. The expectation, however, must precede the effect of the incoming data from the senses on our perceptual consciousness. Subjunctive ideas are thus prior to perceptual input, and action planning guides the process of 'looking for' instantiations of the subjunctive category (e.g., the image) as actually instantiated in the environment.

Newton is a prime example of a vanguard of current neuroscientists who believe that consciousness plays an active, 'pragmatic' part (in the sense used by Dascal 1987) in bringing about many kinds of information processing, and is not just an epiphenomenon or 'appendage' to a basically non-conscious computational process. Building from a traditional foundation of 'Gibsonian affordances' (i.e., that we understand and identify an object by imagining how it would be possible for our bodies to act in relation to the object), Newton uses recent neurological work (for example, Damasio 1994; Posner 1990; Farah 1989; Luria 1980; Cytowic 1993), and a just-emerging knowledge of the workings of mental imagery (as related to the 'mental models' developed by Johnson-Laird 1991) to show how action planning grounds our understanding of objects, and ultimately of language, concepts, and logical relations. An infant identifies objects in terms of whether they 'afford sucking,' 'afford throwing,' etc. Similarly, as adults, when we anticipate how we might act in relation to an object or situation, we execute the rudiments of a subjunctive conceptualization. For example, to anticipate that 'If I throw a ball at something (under appropriate circumstances) it will knock it over,' is very similar (linguistically, neurophysiologically, and phenomenologically) to believing that 'If I were to throw a ball, it would knock something over.' Thus anticipations of the future ground our understanding of subjunctives and thus of abstract concepts. In Newton's approach, the key to this 'foundation of understanding' is the process of action planning. To make this case, Newton relies on extensive neuroscientific evidence (the details of which would be beyond our scope here) -- for example, the finding discussed by Ito (1993) and by Damasio (1994), that the brain mechanisms underlying abstract thought are identical to those underlying action-planning in the context of body movement.

Notice the eschewal in this approach of each of the modernist biases mentioned above:

(1) Because the organism must anticipate actions toward its environment in order for consciousness to occur, consciousness is not merely passively caused by incoming stimuli or unconscious computations performed on incoming stimuli. The body's organization of stimuli occurs prior to the reception of the stimuli, and if the body does not actively seek to appropriate and rearrange the physiological substrata for its own desired patterns of conscious activity, this consciousness can never occur.

Since consciousness is a higher-order process which must actively seek to appropriate and rearrange lower-level processes which are needed as substratum elements for its motivated pattern of activity, such a higher-order process cannot be explained as the causal result of the discrete actions of its own physiological substrates. It would be as misleading to explain consciousness as passively caused by the discrete mechanical interaction of particles of brain matter as it would be to explain a sound wave passing through a wooden door as being caused by the actions of the particles of wood in the door. Instead, it is the sound wave, originating elsewhere, that causes the particles to vibrate in the pattern they do -- a fact we would overlook if we were to content ourselves with explaining the pattern of the wave as being caused by the discrete movements of its substratum elements.

(2) Another modernist assumption that also must be rejected with an enactive approach such as Newton's or Varela's is the notion that consciousness plays no significant role in information processing -- i.e., the epiphenomenalist notion that consciousness is merely the tip of an iceberg which consists of unconscious computational brain processes. Instead, consciousness directs much of this activity, and much of it would never take place without the direction of consciousness; yet it is important that consciousness itself is embodied -- not in computational cerebral processes, but rather in emotional and motivational activities of the whole organism. It is the emotionally motivated process of action planning that directs the focus of conscious attention, not a computer-like computational process.

(3) This implies the rejection of still another set of modernist biases -- the presumption that representational states (thoughts and perceptions) are clearly distinguishable from non-representational ones (feelings and emotions), and the corollary presumption that subject and object are clearly distinct. The emotional purposes of the whole embodied organism direct conscious attention, which in turn influences in a necessary way what we perceive and think. We can be consciously aware of this whole process through 'proprioception,' and much (if not all) rational processing results from what Newton calls 'proprioceptive imagery.' I.e., we proprioceptively imagine what it would be like to throw a ball (when forming a subjunctive concept of such an event), or to move our bodies rhythmically (for example, to the rhythm of a certain pattern of logical inference such as modus tollens or hypothetical syllogism). But proprioceptive imagery is directed toward something that is neither clearly subject nor clearly object -- my embodied self.

An enactive approach to consciousness leads to very different analyses of the relation between physical causation and conscious intention from any that were possible in modernist atomistic-reductionism, which viewed reality as fundamentally reactive rather than consisting of patterns of activity which appropriate their atomistic components. Beyond the one just discussed, the failure of the modernist conception of causation as completely passive led to still another anomaly in the philosophy of mind, whose solution also requires a rejection of this atomistic conception of the nature of causation.

2. The anomaly of physicalistic underexplanation: Chalmers' 'hard problem'

The approach which I have been outlining is also equipped to offer a new perspective on Chalmers' (1995) 'hard problem' of consciousness. The dilemma as formulated by Chalmers is very similar to G.E. Moore's 'open question' argument in ethical theory (Moore 1900-1956). Against any given physical explanation of consciousness, the question can be raised, 'But isn't it conceivable that all the elements in that explanation could occur, resulting in all the same information processing outcomes that would be produced in a conscious process, but in the absence of consciousness?' For example, computationalists have maintained for the past 30 years that consciousness can be explained either as an epiphenomenon of, or as identical with, a digital computer-like process which uses the hardware of the brain to process its software. But we can easily imagine such a computational process as occurring in the absence of consciousness. Therefore, some further explanation is required in order to understand why consciousness does in fact accompany such computational processes in certain cases (for example, in human organisms).

Just as in Moore's 'open question' argument, here too, the dilemma cannot be escaped simply by defining consciousness as such-and-such by arbitrary fiat (any more than we can define 'morally right' by arbitrary fiat as 'productive of pleasure'). For example, we cannot arbitrarily define consciousness as 'a linguistic processing system whose outputs resemble sentences in the English language, and which follows the principles of logic as contained in Copi's logic textbook.' The problem here would be the same as with Moore's open question: We could always ask, 'Yes, but is it not conceivable that a physical system could process information according to the rules of logic and the English language, without being accompanied by consciousness?'

If consciousness is not to be defined by arbitrary fiat, then how do we define what it is that we are trying to explain when we try to explain consciousness? Before asking this question, we must have a notion of what we mean by 'consciousness' as it occurs in our question. And, of course, anyone capable of formulating such a question does know what is meant by consciousness, because this person would have to experience her own consciousness in order to know what question she is trying to formulate (Gendlin 1992). Other than assuming such direct experiencing of one's own consciousness, there would seem to be no way of letting anyone know what we mean by the word 'consciousness,' since the stating of any definition presupposes that the hearer of the definition knows what it is to be conscious of something (in this case, the meanings of the elements in the proposed definition). Part of what we want to address in any explanation of consciousness, then, is the phenomenal experience of consciousness, as opposed to a definition by arbitrary fiat. Correlatively, part of what makes Chalmers' 'open question' so difficult is that, if we were to arbitrarily stipulate some non-phenomenal (i.e., physicalistic) definition of consciousness, it would be easy to imagine that the proposed physical process might have occurred without consciousness as phenomenally experienced, and therefore the latter is not really adequately explained by the physical explanation being proposed; some further explanation seems required as to why this particular physical process could not have occurred without being accompanied by the phenomenal experience of consciousness. (A similar point is made in Ellis 1995: 11-30; and by Goldstein 1994.)

At the same time, if there is to be any hope of seriously addressing Chalmers' open question, the definition of consciousness must not only be framed in phenomenally experienceable terms; it must also be broken down into specific enough elements that these elements can be correlated with physiological substrata which, in the final analysis, will turn out to be unimaginable without being accompanied by the corresponding elements of consciousness. It cannot be enough merely to say that consciousness is simply indefinable except through a direct experience of it. If the dualism that plagued modernism is to be avoided, we must identify elements which, on the one hand, are necessary for the phenomenal experience of consciousness, and on the other hand can be bridged to the empirically observable world.

The enactive approach we have been discussing meets these requirements, because on the one hand it characterizes consciousness not by arbitrary fiat but as phenomenally experienceable, while on the other hand the elements of the description lend themselves to being correlated with empirically observable physiological substrata, so that at the end of the day it should be impossible to imagine this particular combination of physiological substrata as being unaccompanied by its conscious correlates. The enactive view of consciousness can be characterized as follows:

Consciousness requires an interested anticipation of possible sensory and proprioceptive input in such a way that the pattern of the subject's interest determines the modality, the patterns, and the emotional significance of the anticipated input. Specifically, the anticipation takes the form of a sensorimotor, proprioceptive and affective 'image' of a state of affairs which is 'looked for' by the subject. The content of consciousness is vivid to the extent that the activity constitutive of the interest in the future resonates (in terms of holistic patterns of activity) with the activity of incoming (afferent) imagistic data, and with the activation of memories of past imagistic and conceptual data.

The sense in which enactive theorists suggest that consciousness is an 'anticipation' of a possible input is experienced very clearly by subjects in perceptual experiments who are instructed to imagine an object before it appears on a screen, or to continue looking for the object while other objects are being flashed intermittently; the object being imagined or looked for is in fact perceived more readily (Corbetta 1990; Pardo et al. 1990; Logan 1980; Hanze and Hesse 1993; Legrenzi et al. 1993; Rhodes and Tremewan 1993; Lavy and van den Hout 1994). Posner and Rothbart (1992) report that "During vigilant states the posterior attention network can interact more efficiently with the object recognition system of the ventral occipital lobe" (96). This attentional process "increases the probability of being able to detect the relevant signal" (97). To imagine the object is to be on the lookout for it, and vice versa. This is the sense in which to form a mental image of a wall as blue means to 'look for' or to 'anticipate' blue in the wall. If we imagine a pink wall at which we are actually looking as blue, we are putting ourselves into a state of readiness or vigilance to see blue if it should occur. If this were not true, then subjects asked to imagine an object would not see it more readily than those who are not already actively imagining it at the point when it is presented. When we look for something, we prepare ourselves to see what we are looking for. As Merleau-Ponty says, "I give ear, or look, in the expectation of a sensation, and suddenly the sensible takes possession of my ear or my gaze, and I surrender a part of my body, even my whole body, to this particular manner of vibrating and filling space known as blue or red" (1962: 212). And again, "It is necessary to 'look' in order to see" (1962: 232). And "The warmth which I feel when I read the word 'warm' is not an actual warmth. 
It is simply my body which prepares itself for heat and which, so to speak, roughs out its outline" (1962: 236). Helmholtz (1962) makes a similar point which is now widely accepted among neurologists: "We let our eyes traverse all the noteworthy points of the object one after another." I.e., the organism must actively search for information in the environment before that information is consciously seen. Vision is active, not passive.

We have already seen that abstract thought involves anticipation as much as does consciousness of sensory or perceptual imagery. To anticipate that if I throw a ball at something it will knock it over is similar to believing that if I were to throw a ball, it would knock something over. Thus anticipations of the future ground our understanding of subjunctives and thus of abstract concepts at the most basic level of phenomenal experiencing.

By 'interested anticipation,' I mean one which is emotionally motivated. The main feature that distinguishes conscious information processing from the kind of processing that nuts-and-bolts computers accomplish is this emotionally interested anticipation, which computers (and indeed all non-biological systems) lack (Cytowic 1995). We are conscious of incoming afferent data only to the extent that we actively 'pay attention' to them, and this process of directing attention is motivated by the needs of the organism. (From an empirical standpoint, afferent processing -- e.g., in the occipital lobe -- never results in conscious awareness of the object unless accompanied by frontal-limbic activity instigated by midbrain motivational activity -- see Posner 1990; Posner and Rothbart 1992; Damasio 1994; Farah 1989; Aurell 1989; Luria 1980.) In the enactive approach, the primary organismic need that motivates consciousness of objects is the need to anticipate future data which are considered important for the organism's purposes (Dennett 1996).

The above characterization of conscious experience emphasizes that the emotionally motivated anticipation of input leads to 'imagery.' By 'image,' of course, I do not mean a physical replica of some object, but rather the phenomenal sense that one is looking for (or listening for, tasting for, proprioceptively feeling for, etc.) some object or state of affairs that would take the form of an intentional object.

It should be emphasized that this characterization does not define 'emotions' and 'motivations' as necessarily conscious phenomena. Emotions and motivations are characterized by purposive strivings, and there do seem to be non-conscious yet purposive phenomena in nature, especially in biological organisms. For example, the human organism purposely does what is necessary to regulate its heartbeat and blood pressure, yet normally is not conscious of doing so. Merleau-Ponty has defined a 'purposeful organism' as one which changes, replaces, or readjusts the functioning of its own parts according to what is needed to maintain or enhance the existence and functioning of the whole organism.

Notice that any of the elements of the enactive characterization of consciousness, if they were to occur in isolation, could occur on a non-conscious basis (and therefore would be susceptible in principle to being bridged to empirically observable physical processes). For example, we can and do often have interested anticipations of the future without consciousness. Throughout nature we find purposeful activity without conscious awareness of anything. We can also have processing of afferent data without consciousness (i.e., non-conscious 'experience,' as in blindsight). We can have holistic processing without consciousness (as in holograms which are not conscious). We can have non-conscious interests alongside of non-conscious data processing, with no consciousness resulting from the mere additive juxtaposition of these elements. We can have non-conscious anticipations of the future (as in operant conditioning, or as when a computer predicts the future), juxtaposed with non-conscious activations of stored information or of present afferent activity, with no consciousness of the process. Consciousness occurs only when the interest in the future acts in such a way that the pattern of its own activity gives rise to an image (or concept) of a possible or alternative future; and perceptual consciousness occurs when this activity resonates with afferent activity and with activation of imagistic (or in some cases conceptual) memories. I.e., the interest in the future, the forming of the image, and the processing of the sensory (or sensorimotor) data do not merely occur alongside each other in additive juxtaposition, but instead the interest gives rise to the image; at the same time, the image, the interest, and the sensory (or sensorimotor) data all resonate with each other. The degree of resonance among these activities corresponds to the vividness of the consciousness. 
Also, the three elements of consciousness -- interested anticipation, image-formation (of a possible future), and (perhaps 'reentrant') activation of past afferent data (either as sense experience just having occurred, or as memory of sense experience long ago) -- constitute the three temporal moments, past, present, and future, so that every conscious experience seems to be stretched out over a very short but extended interval, rather than encapsulated in an infinitesimal 'present moment.' This quality of being 'stretched out' over the three temporal moments all at once is crucial for understanding the 'ineffable' quality of conscious experience, including its indefiniteness as to temporal modality, the inseparable blending of feeling with the intending of an object, and even a fuzziness in distinguishing subject from object (as when we attribute the phenomenal redness of an object to the object itself, as if the red were pasted to the surface of the object, or when we attribute the mood that an object produces in us to the object itself).

Most of the experienced properties of objects are attributed to the object because of subjective processing (for example, bats apparently 'see' certain sounds), yet these subjective attributes are ineffably blended and blurred with the apparent objectivity of the object. This blending results more basically from the way the organism blends the three temporal moments through anticipation and reentrant signalling. Here again, what is subjective and what is objective about our experience of reality is in principle not a clear or sharp distinction.

The enactive characterization of consciousness, taken from our phenomenal experience of it, can be broken down into elements which themselves can be studied in rational and empirical terms, but which, when they interact in a certain way, cannot be imagined as interacting in that way without also being accompanied by consciousness. These elements, essentially, are (1) an emotional motivation which grounds an interest in anticipating the future; (2) sensory, sensorimotor or proprioceptive imagery activated by this emotional motivation; and (3) a resonating between the activity of emotionally-motivated imagery and the activity stimulated by incoming sensory data and data reactivated through memory. If consciousness is characterizable as a certain kind of interaction of these elements, then the corresponding interaction of the necessary and sufficient patterns of activity in the physiological correlates of these conscious processes will be unimaginable without being accompanied by those conscious processes themselves.

An enactive characterization of consciousness thus makes possible a resolution of the 'hard problem,' because it bridges from the phenomenal level to the empirical-scientific level in such a way that the empirically observable elements could not imaginably relate in just that way without being accompanied by consciousness. But the way of relating at issue here is precisely one in which emotion and motivation actively drive the computational process, rather than arising as a passive reaction to it.

Consciousness, which inevitably includes an emotional element as part of the process of attentive awareness (an emotional element which is constitutive of the very 'felt' nature of conscious as opposed to unconscious processing), is a higher-order process which actively appropriates, replaces, and rearranges the physical substratum elements needed to maintain and enhance the pattern of its own process. It therefore cannot be that the pattern which is consciousness is passively a causal result of the actions of those substratum elements. But this in turn requires rejecting the same modernist assumptions that were called into question above, and for analogous reasons: Consciousness (subjectivity) is not caused by physical processes, but is not separable from them. Neither dualist interactionism nor causal epiphenomenalism can resolve the 'hard problem,' and psychophysical identity requires ignoring the difference between the phenomenal content of experience and its empirically observable correlates. Only an enactive approach can conceive of consciousness as a process which is inseparable from its substrata, because it is the pattern of their activity, yet is not passively caused by the actions of those substrata.

If the process, consciousness, is inseparable from its embodiment, yet its character is not passively caused by the nature of the bodily elements per se, then in more general terms there are processes in nature which actively appropriate their substrata rather than being passive results of them. And this requires rejecting the modernist assumption that natural processes do not act, but only react -- i.e., that nothing ever happens except as a passive reaction to some external force, that all reality is fundamentally passive. (For further discussion of this problem of non-passivity in non-conscious parts of nature, see Ellis, forthcoming.)

3. The anomaly of physical overexplanation

Although Chalmers points out that explanations of empirical properties are not sufficient to explain why the empirical properties are accompanied by phenomenal consciousness, it is equally true that the empirical properties explain too much. I.e., if we accept the notion that one set of neurophysiological properties is the necessary and sufficient cause of some subsequent set of neurophysiological properties, then there can be no causal role for the corresponding conscious intentions. But we can easily observe that a conscious intention does play a causal role, because the conscious decision to raise my hand does play a part in bringing it about that the hand goes up. Given the modernist approach to mechanical explanation, in which the empirically-observed level constitutes a sufficient causal chain, a process, such as consciousness, cannot appropriate and use its own substratum elements, so consciousness remains an irrelevant epiphenomenon which can play no causal role in physiological processes, including the computational processes of the brain. Physical explanations thus explain too much, in the sense that nothing is left to be explained by conscious intentions.

Without an enactive approach, in which consciousness is a process which takes physiological events as its substrata, there can be no solution to this problem of overexplanation. But if consciousness and physiology relate as a higher-order process relates to its own substrata, and such that the higher-order process is not merely caused by or equivalent to its own substratum elements, then physicalistic overexplanation ceases to be a problem. Suppose C1 and C2 are two conscious states, and that P1 and P2 are the physiological correlates of these conscious states.

C1 ----> C2
P1 ----> P2

In the modernist approach, if P1 was necessary and sufficient to bring about P2 (under the given circumstances), then nothing else could be either necessary or sufficient to bring about P2. Thus C1 could have no causal power to bring about P2. So, if C1 was the conscious intention to raise my hand, and P2 was the movement of the hand, it was necessary to say that the intention to raise the hand really played no role in the raising of the hand. Psychophysical identity could solve this problem only by creating a greater one. Of course, if C1 and P1 are the same thing as each other, then C1 and P1 could both be both necessary and sufficient to produce the same outcome. But C1 and P1 are not precisely the same thing as each other, because if they were then complete knowledge of P1 would yield complete knowledge of C1, whereas it doesn't. No amount of empirical knowledge and explanation of a headache can reveal to someone what it feels like to have a headache, unless that observer has also experienced something like a headache in his or her own consciousness.

Nor could causal epiphenomenalism solve the problem. Epiphenomenalism would simply say that C1 is caused by P1, and C2 is caused by P2, and neither C1 nor C2 causes anything. But, in the first place, if P1 causes C1, then P1 and C1 cannot be the same thing as each other; so the question arises as to what sort of entity C1 is if it is to be distinguished from a physical entity. Epiphenomenalism seems inevitably to lead to a metaphysical dualism. Moreover, it does not solve the problem, but only bites the bullet; it does not explain how my intention to raise my hand leads to the raising of the hand, but simply denies that it does lead to it.

Even this consequence could be avoided by an epiphenomenalist theory if it were plausible to posit that there is some little bit of matter in the brain whose only purpose is to serve as the substratum for consciousness, that this little bit of matter is caused to behave in the ways that correspond to conscious experience, and that it does not in turn have any effect on any other physical processes in the brain. But this would be inconsistent with what we have learned about the neurophysiology of consciousness. What the empirical evidence points to is that processing occurs in a conscious way only when it is very globally distributed in the brain. For example, we know that, when impulses caused by optic stimulation set up patterns of activity in the occipital lobe, but without coordinated limbic and frontal-cortex activity, no perceptual consciousness results from the occipital activity (Posner 1990; Damasio 1989; Eslinger and Damasio 1985; Damasio et al 1985; Nauta 1971; Luria 1973, 1980). Similarly, the transition from sleep to waking consciousness requires that the activities of the hypothalamus and cortex achieve a pattern of synchronization or coordination which was not present during sleep (Asimov 1965: 193; Ellis 1986: 46-52). When we are conscious of dream images during sleep, both efferent and afferent activity throughout the brain are detected, whereas during non-dreaming sleep both the afferent activity and some of the efferent activity are comparatively much less pronounced (Winson 1986: 46ff; Restak 1984: 315-333; Richardson 1991; Jouvet 1967). Another example is provided by the 1/3-second time delay from the activation of the occipital lobe (in response to a novel stimulus) to the presence of perceptual consciousness of the object, the latter correlating with coordinated limbic, frontal, parietal, and occipital activity (Aurell 1983, 1984, 1989; Runeson 1974: 14; Srebro 1985: 233-46). 
EEG and other electrical measures show that parietal activation does not occur until 1/3 second after the occipital activity, which by itself does not produce consciousness of the object. The activation does not merely 'travel' from the occipital to the adjacent parietal area; if it did, the distance involved would be traversed in much less time than 1/3 second. Instead, before perceptual consciousness can occur, the limbic system must be aroused, and it in turn must activate the frontal lobe to begin formulating questions about what the nature of the interesting or important environmental stimulus might be, which then activates ideas and/or images in the form of anticipations of possible perceptions with the help of the parietal lobe (Ellis 1995b, Ch. 1; Luria 1980; Posner and Rothbart 1992). Only as a result of this symphonic orchestration of global activity can the activity of the parietal lobe be matched against what is happening in the occipital lobe to see whether the image or idea hypothesized is actually instantiated in the environment. If so, perceptual consciousness of the corresponding object occurs. If not, a mere mental image of the object experienced as non-present occurs in consciousness. In either event, consciousness occurs only when brain activities are globally coordinated. (Interestingly, the subjective sense of a parietal-occipital 'match' in perceptual consciousness or 'mismatch' in the case of a mere mental image is another one of the 'valuations' that Jackendoff describes - the one that allows us to distinguish between a fantasized and a real object.) What these examples and many others suggest is that consciousness requires globally distributed processes in the brain, combining local mechanisms which under different circumstances would be active in various non-conscious processes.

Consciousness, then, cannot be confined to some small bit of matter which does not affect any other brain process involved in cognition. But if the physiological substratum of consciousness does affect further physiological and cognitive functioning, then consciousness affects further physiological and cognitive functioning, unless we assume that consciousness is somehow separable from its physiological substratum - which again would entail a dualism of physical and non-physical occurrences.

Moreover, if there is any local brain area that might be hypothesized to correspond with conscious processes more than any other, it would have to be the frontal lobe. I personally would not hypothesize that the frontal lobe is the main substratum of consciousness, but rather more modestly that it tends to be associated with consciousness because it is the area most responsible for coordinating the global patterns that constitute consciousness. But, even if I am wrong here - even if consciousness occurs 'in' the frontal lobe - this in itself would refute the appendage theory's defense against the charge that the valuations which direct attention are conscious, because the frontal lobe does play an extremely important role in causing the other processes that are necessary to direct attention. Luria (1980), Posner (1990), Posner and Rothbart (1992), and many other neurologists are now convinced that the frontal lobe is the crucial brain component in the process of directing attention to what is important. What makes it do so is that it receives rich input from the limbic system (importantly involved in motivational feelings and other 'valuations'), and then sends signals which coordinate the remainder of the cortex to be consciously aware of the arousing situation and to devise ways to deal with it.

Neither dualism, nor psychophysical identity theory, nor epiphenomenalism works as an explanation of the relation between consciousness and its physiological correlates. The reason is that the modernist concept of atomistic reductionism does not allow a process to affect the behavior of its own substratum elements; it requires instead that a process be caused by the interaction of the discrete movements of those elements, each of which has a sufficient causal explanation of its own, so that the pattern which is consciousness, paradoxically, can have no causal power.

But the enactive approach avoids this problem of causal overexplanation, because it does allow that a process can have causal power. In the case of the conscious states C1 and C2, and their physical correlates, P1 and P2, the enactive approach can allow that P1 is necessary and sufficient for P2 (under the given circumstances), while at the same time maintaining that C1 can also be necessary and/or sufficient for C2 and for P2. The reason is that, if C1 and P1 relate as process to substratum, then C1 and P1 are 'inseparable' from each other in the sense that they are necessary and sufficient for each other. If two events are necessary and sufficient for each other, then even if one does not cause the other, and even if one is not identical with the other, still, one of these events will be necessary and sufficient for whatever the other is necessary and sufficient for. Consider, for example, three dominos lined up in such a way that if domino A falls, it will knock over dominos B and C. Under these given circumstances, B's falling is necessary and sufficient for C's falling, but B's falling is not identical with C's falling, nor is it caused by it, nor does it cause it. Instead, B's falling and C's falling are events which, under the given circumstances, are inseparable from each other. Whatever is necessary and sufficient for one will be necessary and sufficient for the other.
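The inference just sketched can be restated schematically, as a bare formalization of the argument rather than an addition to it. Writing NS(x, y) for 'x is necessary and sufficient for y under the given circumstances' (the predicate NS is my shorthand, not a notation used in the text):

```latex
% Schematic restatement of the resolution of causal overexplanation
% (my formalization; NS(x,y) = "x is necessary and sufficient for y
% under the given circumstances").
\begin{align*}
&(1)\quad \mathrm{NS}(P_1, P_2)
  && \text{substratum-level causal sufficiency} \\
&(2)\quad \mathrm{NS}(C_1, P_1)\ \text{and}\ \mathrm{NS}(P_1, C_1)
  && \text{process and substratum are inseparable} \\
&(3)\quad \mathrm{NS}(C_1, P_2)
  && \text{from (1) and (2)}
\end{align*}
```

As with dominos B and C, step (3) follows without assuming that C1 causes P1, that P1 causes C1, or that C1 is identical with P1.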

The relation between a process and its substratum elements works out in a similar way. Since a process is inseparable, under the given circumstances, from the behavior of its substratum elements, then the process will also be necessary and sufficient for whatever its substratum elements are necessary and sufficient for. Yet this does not necessarily imply that the process is caused by its substratum, or that it is identical with it. Many things are true of a process which are not true of its substratum elements, even taken collectively. For example, a wave on the ocean may travel many miles in a horizontal direction, while its substratum elements, the movements of particles of water, are very small vertical oscillations.

The process-substratum relation in the case of consciousness is different from the relationship between a wave and the physical medium through which the wave passes in one crucial respect. Consciousness, unlike a sound wave or a wave in the ocean, is a purpose-directed process. Merleau-Ponty (1942-1967: 47ff) defines a purposeful activity as one in which the organism's overall pattern of activity acts in such a way as to rearrange and readjust its various parts in order to maintain or enhance the overall pattern. Purely mechanical processes do not seem to behave in this way. A thermostat, while it will adjust its overall pattern to feedback from the environment, does not seem to be a purpose-directed system because, when one of its parts ceases to function or is removed, the thermostat does not act in such a way as to replace the missing part or try to compensate for its absence; it simply quits functioning. The thermostat does not 'care,' in this non-conscious sense of 'care,' whether it achieves its ultimate objective or not. It functions or not purely as an additive juxtaposition of the functioning of its parts.
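The thermostat contrast can be sketched in a few lines of code. This is a minimal sketch of my own (the class and its names are hypothetical, not from the text): the device adjusts its behavior to feedback, but when one part fails it simply stops, rather than reorganizing its remaining parts to preserve the overall function.

```python
# A thermostat as an "additive juxtaposition" of parts (my illustration):
# it responds to feedback, but it does not compensate for a missing part.

class Thermostat:
    def __init__(self, target):
        self.target = target
        self.sensor_ok = True   # one of its parts

    def read_temp(self, ambient):
        if not self.sensor_ok:
            return None         # a missing part is simply missing
        return ambient

    def step(self, ambient):
        temp = self.read_temp(ambient)
        if temp is None:
            return "off"        # no substitute part, no reorganization
        return "heat" if temp < self.target else "idle"

t = Thermostat(target=20)
print(t.step(15))   # "heat": ordinary feedback adjustment
t.sensor_ok = False
print(t.step(15))   # "off": the whole quits when one part quits
```

A purpose-directed system, on Merleau-Ponty's definition, would instead rearrange its remaining functions to keep pursuing the target temperature; nothing in this mechanism does so.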

It becomes increasingly clear, as we study the brain, the ecosystem, and the concept of 'living organisms' in biology, that at least many patterns of activity maintain their organizational structure across replacements of their own substrata. As Merleau-Ponty suggests, an organism will often rearrange the overall configuration of its parts if an imbalance is created in one part which disrupts the functioning of the whole. "'Forms' . . . are defined as total processes whose properties are not the sum of those which the isolated parts would possess. . . . We will say that there is form whenever the properties of a system are modified by every change brought about in a single one of its parts and, on the contrary, are conserved when they all change while maintaining the same relationship among themselves" (Merleau-Ponty 1942-1967: 47). One of Merleau-Ponty's favorite examples of this 'top-down' organizational structure in organisms is the development of the 'pseudo-fovea' in cases of hemianopsia. In these cases, the eyes change the functioning of the cones and rods from their original anatomical programming. In hemianopsia, the subject is rendered blind in half of each retina, so that he now has the use of only two half retinas.

Consequently one would expect that his field of vision would correspond to half of the normal field of vision, right or left according to the case, with a zone of clear peripheral vision. In reality this is not the case at all: the subject has the impression of seeing poorly, but not of being reduced to half a visual field. The organism has adapted itself to the situation created by the illness by reorganizing the functions of the eye. The eyeballs have oscillated in such a way as to present a part of the retina which is intact to the luminous excitations, whether they come from the right or the left; in other words, the preserved retinal sector has established itself in a central position in the orbit instead of remaining affected, as before the illness, by the reception of light rays coming from one half of the field. But the reorganization of muscular functioning, which is comparable to what we encountered in the fixation reflex, would be of no effect if it were not accompanied by a redistribution of functions in the retinal and calcarine elements which certainly seem to correspond point for point to the latter (Merleau-Ponty 1942-1967: 40-41).

Merleau-Ponty also notes the finding by Fuchs (1922) that all the colors are perceived by the new fovea even though it is now situated in a retinal area which in a normal subject would be blind to red and green. "If we adhere to the classical conceptions which relate the perceptual functions of each point of the retina to its anatomical structure -- for example, to the proportion of cones and rods which are located there -- the functional reorganization in hemianopsia is not comprehensible" (41). Here we have an excellent example of Merleau-Ponty's principle that in organisms the whole will readjust the functioning of some of its parts when other parts are disrupted, in order to maintain the original function of the whole. Other examples of self-directed neurophysiological reorganization following localized brain injury or trauma can be found in Restak (1984: 360ff). Kandel and Schwartz (1981) also place great emphasis on the 'plasticity' of the brain in reorganizing itself to accomplish its objectives by getting around disruptions in one way or another. They find, for example, that if brain cells of an embryo are transplanted to a different region of another embryo, they are transformed into cells appropriate to that region. This plasticity in the realizability of the mental functions of living beings has been emphasized by Putnam (1994), Horgan (1992) and Bickle (1992).
Similarly, the organism's desires intend to remove the inevitability of electrochemical imbalance within the organism, not merely by eliminating this or that electrical imbalance (for example, in cases where to do so would only transmit the imbalance from one part of the nervous system to another, or from one bodily system to another), but rather by changing the context which renders the imbalance inevitable -- for example, by spatially removing the entire organism from the disturbing stimulus, by destroying the disturbing stimulus, or by finding or creating a stimulus in relation to which the whole organism's balance can be restored.

Twentieth-century philosophy of mind has made every effort to remain tenaciously bottom-up. Cognitive functions have been explained as 'responses' to incoming 'stimuli,' with the stimuli combining in complex ways to mechanically cause the response. The response is thus a purely passive change, brought about by the stimulus. As in the characteristic twentieth-century approach to natural science, here too the only inertia is an inertia of passivity; nothing would move or change unless acted upon by an outside force.

In order to overcome the problems we have just outlined, an adequate conception of consciousness must reopen these questions with regard to ontology and the theory of causation. We must develop a theory in which purposeful processes are able to appropriate their needed substratum elements, rather than merely being passive epiphenomena of them or ontologically identical with them. This in turn will require the development of a workable account of how it is that certain activities can be 'purposeful' in a scientifically intelligible universe. The twentieth century has simply turned its back on this problem. Purposeful activity is explained away as a purely mechanical process which only appears, anthropomorphically, as if it were purposeful. The standard explanation is that we view a mechanical process as if it were purposeful because we view it as if it were conscious, like ourselves, and we imagine that if we were to engage in that activity, we would be doing so with the consciousness of some purpose in mind. But to characterize a process as purposeful is not to anthropomorphize. The human organism was purposeful before it was conscious. Consciousness is not necessary to purposefulness, even in the human organism. So purposefulness cannot be explained simply as the addition of consciousness to a process which otherwise could be explained simply as one that displays certain 'tendencies' to accomplish certain results, as if the only difference between a purposeful and a non-purposeful process were that, in the latter, there is conscious awareness of the 'tendencies' which would be present in any purely mechanical system.

Developing such a conception will not be easy. The twentieth century has provided few tools or concepts to serve this kind of exploration. But the other alternative seems to be to eschew any hope of studying and understanding consciousness altogether; and that would be too great a sacrifice.


References

Aurell, Carl G. 1989. "Man's triune conscious mind". Perceptual and Motor Skills 68: 747-754.

Chalmers, David. 1995. "Facing up to the problem of consciousness". Journal of Consciousness Studies 2: 5-22.

Corbetta, M., F.M. Miezin, S. Dobmeyer, G.L. Shulman, and S.E. Petersen. 1990. "Selective attention modulates neural processing of shape, color and velocity in humans". Science 248: 1556-1559.

Cytowic, Richard. 1993. The Man Who Tasted Shapes. New York: Warner.

Damasio, Antonio. 1994. Descartes' Error. New York: Putnam.

Dascal, Marcelo. 1987. "Language and reasoning: Sorting out sociopragmatic and psychopragmatic factors". In J.C. Boudreaux, B. W. Hamill, and R. Jernigan (eds), The Role of Language in Problem Solving 2. Elsevier: North-Holland, 183-197.

Davidson, Donald. 1970. "Mental events". In Lawrence Foster and Joe W. Swanson (eds), Experience and Theory. Amherst: University of Massachusetts Press, 79-102.

Dennett, Daniel. 1991. Consciousness Explained. Boston: Little, Brown and Co.

Dennett, Daniel. 1996. Kinds of Minds. New York: Basic Books.

Ellis, Ralph D. 1986. An Ontology of Consciousness. Dordrecht: Kluwer/Martinus Nijhoff.

Ellis, Ralph D. 1990. "Afferent-efferent connections and 'neutrality-modifications' in imaginative and perceptual consciousness". Man and World 23: 23-33.

Ellis, Ralph D. 1991. "A critique of concepts of non-sufficient causation". Philosophical Inquiry 13: 22-42.

Ellis, Ralph D. 1992b. "A thought experiment concerning universal expansion". Philosophia 21: 257-275.

Ellis, Ralph D. 1995. Questioning Consciousness: The Interplay of Imagery, Cognition and Emotion in the Human Brain. Amsterdam: John Benjamins.

Ellis, Ralph D. Forthcoming. "Personalism, purposeful processes, and the contemporary natural and cognitive sciences". Personalist Forum.

Farah, Martha. 1989. "The neural basis of mental imagery". Trends in Neurosciences 12: 395-399.

Fuchs, W. 1922. "Eine Pseudofovea bei Hemianopikern". Psychologische Forschung.

Gendlin, Eugene. 1992. "Thinking beyond patterns". In B. den Ouden and M. Moen (eds), The Presence of Feeling in Thought. New York: Peter Lang.

Goldman, Alvin. 1969. "The compatibility of mechanism and purpose". Philosophical Review 78: 468-482.

Goldstein, Irwin. 1994. "Identifying mental states: a celebrated hypothesis refuted". Australasian Journal of Philosophy 72: 46-62.

Hanze, Martin and Friedrich Hesse. 1993. "Emotional influences on semantic priming". Cognition and Emotion 7: 195-205.

Helmholtz, Hermann. 1962. Helmholtz's Treatise on Physiological Optics, J.P.C. Southall (trans). New York: Dover.

Ito, Masao. 1993. "Movement and thought: Identical control mechanisms by the cerebellum". Trends in Neurosciences 16: 448-450.

Jackendoff, Ray. 1987. Consciousness and the Computational Mind. Cambridge: The MIT Press.

Johnson-Laird, Philip N., and R.M.J. Byrne. 1991. Deduction. Hillsdale, N.J.: Erlbaum.

Kandel, Eric, and James Schwartz. 1981. Principles of Neural Science. New York: Elsevier-North Holland.

Lavy, Edith and Marcel van den Hout. 1994. "Cognitive avoidance and attentional bias: Causal relationships". Cognitive Therapy and Research 18: 179-194.

Legrenzi, P., V. Girotto, and P.N. Johnson-Laird. 1993. "Focussing in reasoning and decision making". Cognition 49: 37-66.

Logan, G.D. 1980. "Attention and automaticity in Stroop and priming tasks: Theory and data". Cognitive Psychology 12: 523-553.

Luria, Alexander R. 1980. Higher Cortical Functions in Man, 2nd ed. New York: Basic Books.

McHugh, D.E. and A.T. Bahill. 1985. "Learning to track predictable target waveforms without a time delay". Investigative Ophthalmology and Visual Science 26: 932-937.

Merleau-Ponty, Maurice. 1942-1967. The Structure of Behavior. A. Fischer (trans). Boston: Beacon; original French edition 1942.

Merleau-Ponty, Maurice. 1962. Phenomenology of Perception. Colin Smith (trans). New York: Humanities Press.

Moore, G.E. 1903. Principia Ethica. Cambridge: Cambridge University Press.

Nagel, Thomas. 1965. "Physicalism". Philosophical Review 74: 339-356.

Natsoulas, Thomas. 1993. "What is wrong with appendage theory of consciousness". Philosophical Psychology 6: 137-154.

Newton, Natika. 1982. "Experience and imagery". Southern Journal of Philosophy 20: 475-487.

Newton, Natika. 1989. "Visualizing is imagining seeing: a reply to White". Analysis 49: 77-81.

Newton, Natika. 1991. "Consciousness, qualia, and reentrant signalling". Behavior and Philosophy 19: 21-41.

Newton, Natika. 1992. "Dennett on intrinsic intentionality". Analysis 52: 18-23.

Newton, Natika. 1993. "The sensorimotor theory of cognition". Pragmatics and Cognition 1: 267-305.

Newton, Natika. 1996. Foundations of Understanding. Amsterdam: John Benjamins.

Ornstein, Robert and Richard Thompson. 1984. The Amazing Brain. Boston: Houghton Mifflin.

Pardo, J.V., P.J. Pardo, K.W. Janer, and M.E. Raichle. 1990. "The anterior cingulate cortex mediates processing selection in the Stroop attentional conflict paradigm". Proceedings of the National Academy of Sciences 87: 256-259.

Posner, Michael I. 1980. "Orienting of attention". Quarterly Journal of Experimental Psychology 32: 3-25.

Posner, Michael I. 1990. "Hierarchical distributed networks in the neuropsychology of selective attention". In A. Caramazza (ed). Cognitive Neuropsychology and Neurolinguistics: Advances in Models of Cognitive Function and Impairment. New York: Plenum, 187-210.

Posner, Michael I. and Mary K. Rothbart. 1992. "Attentional mechanisms and conscious experience". In A.D. Milner and M.D. Rugg (eds), The Neuropsychology of Consciousness. London: Academic Press.

Restak, Richard. 1984. The Brain. New York: Bantam.

Rhodes, Gillian and Tanya Tremewan. 1993. "The Simon then Garfunkel effect: Semantic priming, sensitivity, and the modularity of face recognition". Cognitive Psychology 25: 147-187.

Runeson, Sverker. 1974. "Constant velocity -- not perceived as such". Psychological Research 37: 3-23.

Sellars, Wilfrid. 1965. "The identity approach to the mind-body problem". Review of Metaphysics 18: 430-451.

Srebro, Richard. 1985. "Localization of visually evoked cortical activity in humans". Journal of Physiology 360: 233-246.

Varela, Francisco, Evan Thompson, and Eleanor Rosch. 1991-1993. The Embodied Mind. Cambridge: The MIT Press.

Yarbus, Alfred L. 1967. Eye Movement and Vision. New York: Plenum.

[After Post-Modernism Conference. Copyright 1997.]