
This article was downloaded by: [Adelphi University]

On: 23 August 2014, At: 00:32


Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer
House, 37-41 Mortimer Street, London W1T 3JH, UK

Neuropsychoanalysis: An Interdisciplinary Journal for Psychoanalysis and the Neurosciences

Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/rnpa20

Consciousness: The Organismic Approach

Philip Clapson

P.O. Box 38225, London NW3 5XT, United Kingdom

Published online: 09 Jan 2014.

To cite this article: Philip Clapson (2001) Consciousness: The Organismic Approach, Neuropsychoanalysis: An
Interdisciplinary Journal for Psychoanalysis and the Neurosciences, 3:2, 203-220, DOI: 10.1080/15294145.2001.10773356
To link to this article: http://dx.doi.org/10.1080/15294145.2001.10773356





Consciousness: The Organismic Approach


Philip Clapson (London)

Abstract: In recent years, an approach has been emerging which might be called organismic. It seeks to position consciousness, and thus human experience, in the biological processes of the organism. If accepted, it will fundamentally change our self-understanding and our understanding of organisms in general. The approach is seen in different writers, but not always fully carried through, or a particular emphasis may disguise the full implications. The aim here is not to provide a comparative critique, but to try to stabilize some central principles and appreciate their import, and propose a research program.

Background: The Inner World


The Notion of Soul

To begin with, I want to introduce the idea that, even today, the notion of soul lies behind our thinking, though it does not match the nature of the universe as we understand it. In fact, the term is not often used (by comparison with the self, for example).1 Still it retains a potency that no other term has. It has three important characteristics. These are relatively transcultural.
1. Being. Ignoring the immortal and immaterial aspects (which historically have been of crucial significance), the soul is that by which a person is
understood to be what they are. It is inside the
(living) body and represents the essence of the person's characteristics, and is crucially influential on
their actions and destiny.
Philip Clapson is a philosopher working in London, who has been
developing his theory of the biology of experience over a number of years.
Acknowledgment. I would like to thank Mark Solms for helpful discussions on an early version of this paper.
1 Person and mind are other substitutes, as will be discussed. Note: The argument to follow is not supposing the soul as a homunculus, as in the argument against an inner observer of consciousness, or subject pulling the strings of behavior.

2. Enduring. The soul endures. From birth to death


the soul provides continuity.
3. Experiencing. Ultimately, and perhaps at the surface, it is the soul that experiences (is glad, suffers, etc.).

For us, in principle, the category human being


implies uniqueness, sanctity, and ultimate value. The
notion of soul functions not merely, or even perhaps
mostly, as an explicit category, but as a way of grasping preconceptually this categorial significance. Thus
in Lakoff's terms (1987), it is a prototype, even metonymic. The notion of person has become the political
and social substitute of choice; but person is vague
outside its contextual use.
Since the notion of soul in its metaphysical aspects is commonly unacceptable in modern thinking (outside a specifically religious framework), its use is,
in Lakoff's terms, metaphorical. When it is said that
something touches the soul, it is understood that the
nature of the experience is thereby expressed, but not
explained. I shall return to the point of this later, but
turn now to history.

Descartes

The difficulty Descartes introduced was the division


between a thinking substance, the mind, and the material world, including the human body. Although Descartes did not use the term consciousness, it is
accepted that what he was talking about was what
came to be called consciousness (for a history of the
term, see e.g., Humphrey, 1992).
Truly divisive was Descartes' conception of the
causal relevance of the mind (1985). He expressly understood the mind to be the harborer of experiences
(thoughts, sensations, etc.) which, being considered
and understood by the subject, or I, facilitated the initiation of action. An injured foot causes signals that

make their way to the brain, which, transferred to the
mind, are felt, and this causes, or allows the I (of, or
as, the mind) to initiate remedial and precautionary
action. Descartes' language, as is characteristic of this
kind of discussion, is not precise. Mind, self, soul, and
I are often interchangeably deployed. Descartes was
aware of the brain and that it had great powers; but
the true being of the human was the mind as a distinct
substance with its own characteristics.
There seems to be no quibble that our experience
involves the awareness of the world, and of our
thoughts and feelings. There seems to be, in conscious
experience, something grasped; and indeed the possibility of grasping anything, about the world or ourselves, seems to depend upon the fact of consciousness.

Mind as Us and as Organ

Mind has a hybrid or dimodal existence as thing and operator. While many subsequent writers were not so confident of, even opposed, its separateness from the body, it retained this ambiguous nature. That we experience lent credence both to the idea that it is the mind that experiences, as perceptions, feelings, and thoughts, and also that, as evidenced by the will, it is the I of the mind that acts, referencing thoughts or motives as the cause or justification of those acts.
However, whilst our experiencing can seem to be welded to the world, as in direct perception or decision making and acting, a teeming quantity of thoughts and fancies, feelings and imaginings also occur to us all the time. These seem unrelated to that with which we are involved, and occur without any apparent intention on our part. Thus, whilst there is a way in which the mind is associated with our selves as active agents, representing us as world operator employing our bodily being for its purposes, at the same time, the mind is itself filled with apparently independent activity over which we have no control. This renders unconvincing the idea that "we" (as the mind) are in control of it.
The polar extremes may be seen with the ego-centered philosophy of Fichte (1994), in which the world (as represented in the mind) is derived from the ego; and the philosophy of Schopenhauer (1818), in which the forces of nature expressed as Will determine what the mind undergoes and accomplishes. For Fichte, the individual has absolute responsibility since the mind is derived out of the ego; with Schopenhauer there is no such responsibility since the mind is simply where (or how) the forces of world as Will representationally act themselves out.
For Hume, the I becomes opaque to the mind, for there is no way of looking into the mind to find the I whilst it is looking into itself.2 But where, then, is the I? Kant (1781) attempted to resolve Hume's difficulty by making the I the logical requirement of experience itself (thence Fichte). But he also gave this I a noumenal character to elevate it from the causal forces of the world (thence free will). His solution has not found much favor (thence Schopenhauer).
The mind, characterized thus, is deeply problematic. It seems to defy a simple account, or consistent functional explanation; and besides which: (1) What, ontologically and operationally, exactly is it? (2) How can its operation in the physical universe be understood?
Consciousness and Nonconsciousness, or the


Unconscious
The (iconic) mental categories of the early moderns-thoughts, feelings, perceptions, and so on (the
products and mechanisms of the mind)-were an attempt to individuate and explain our psychological life
(reference is to Locke, Berkeley, Hume). These kinds
of categories still dominate philosophical and psychological discourse. The problem is, where are they? No
one has seen a feeling or a thought, and a look into
the brain will not reveal them. They are, of course,
explanatory categories of behavior, or labels attached
to conscious experience. But they are deployed as if
they really existed, doing the job we take them to do.
It is on this assumption that both Functionalism and
the classical computer model of the mind became dominant in the late twentieth century. Looking back, the
computer model was born of new-gismo enthusiasm
and hope rather than insight. For consciousness per
se, from which the idea of mentality originates, had
to be explicitly ignored. Its actual behavior is not
clean-cut in the way theorists of the mind required.
Moreover, Descartes' view of the mind, centered
upon consciousness and its reflexivity, was explanatorily inadequate because it did not comprehensively account for human activity. By the time Freud developed
his two successive metapsychologies (psychoanalytic
theory), the idea of the unconscious was widely accepted.2 Most damaging for Descartes' position, Freud showed that consciousness was not transparent; an individual's motives could be unknown to themselves, and their explanation of themselves false. Those motives were unconscious.

2 This reading takes into account Hume's dissatisfaction, in the Appendix, with his own account of "Personal Identity" in the text of the Treatise, Part iv, Sec. vi, Hume (1739-1740).
Both philosophy and cognitive psychology have
found Freud's theories difficult because they rob consciousness, or more appositely mentality, of its reasoning essence, and therefore motivational tractability.
Freud was in a line from Schopenhauer through
Nietzsche.
By developing an account of consciousness that
begins in the body and instinctual drives, Freud tried
to bridge between the organic and the mental. In the
first theory he proposed that consciousness was, in a
sense, a filtered operation-only so much was let
through by a censor, the determining factor being social (or self-regarding) acceptability. In the second theory, the ego constantly defends itself from the primal
forces of the id and the prohibitive aims of the superego, which are at odds with its "reality principle." In the second theory, the instinctual depths of the id are necessarily "unavailable" to consciousness.3
Though Freud's theories are seen as problematic,
cognitive psychology has embraced the nonconscious,
for otherwise much of human activity-the process
of thinking, for example, as opposed to the result of
thinking-is inexplicable (cf. LeDoux, 1998; Seager,
1999). In this, psychology mimics Freud's position.
Reference to consciousness per se becomes irrelevant,
for talk of the mental encompasses the conscious and
the seamlessly linked and functionally symbiotic nonconscious. The actuality of consciousness could be discreetly forgotten.
Freud's division of the mental realm, and its internal warfare, represents an inner mental world, as
does cognitive psychology's notion of the supportive
nonconscious to the conscious as mind.

The Mind-Body Problem


This topic lies at the heart of what is to follow, but
not in the terms in which it has received so much
attention: the attempt to reconcile, reduce, or eliminate
one of the disparate entities. The problem is that mentality seems not the same as physicality. For Descartes
3 Freud's early "Project for a Scientific Psychology" (1895) was his subsequently abandoned attempt to create a fully materialistic neural account of human functioning. But he never left behind either the biological approach, or its terminology (Sulloway, 1983). For an overview of Freud's theories see Laplanche and Pontalis (1973).

this was an ontological distinction; for we moderns, what is involved in the distinction remains controversial.
The arguments turn upon a simply expressed
idea: If, for example, I see a bunch of red roses, and
they appeal to me and I decide to buy some, I understand my feelings, thoughts, and actions to arise within
my experiencing. I go up to the vendor and say, "Your
roses are splendid; I'd like a bunch." My apparently
voluntary acting and speaking seem to be caused
within and from my experiencing. But at the same
time I also suppose that my brain, which is in some
way the same as (or generating) my experience, also
carries out operations that result in whatever I do as
organism in the world. The sensory and neural processes of my vision system interact with the neural
processes "underpinning" my desire and judgment
(in, for example, the hypothalamus, hippocampus, and
prefrontal cortex), followed by directed activity in my
motor and vocal systems, thus providing a strictly determined physical account of the seeing and purchasing of the roses. While these two things (experience-brain) must be the same in some way, they do not
appear to be, not least in terms of causal explanation.
And given a view of the world that says that cause
and effect are physically determined, and since the
brain (and the extended nervous system) are the physical domain, the position of my experiencing could be
seen to be superfluous. But it doesn't seem to me to
be superfluous, nor to people at large. Hence the project to solve, or dissolve, the problem.
However, at its heart lies a supposition that, in
the Anglo-American philosophical tradition, and psychology and neuroscience, has hardly registered. If it
is false, then the centuries of spilt ink will have been
the attempt to solve a non-, or misperceived, problem.
The supposition is that, biologically, experience, and
what the brain and nervous system do, are in fact the
same thing. More precisely, it only becomes a necessity to perform the reconciliation, reduction, or elimination if the biological aim of each domain is directed
toward the same end. At least, those are the terms in
which, implicitly, the problem has been posed.4
What terms? The soul notion. Whilst the actuality
is rejected, the operative significance remains. My experience is what I am in mentalist terms, just as neuronal processes are what I am in physical terms. It
seems straightforward to say: My circulating blood is
part of me; my neuronal processes are part of me; and
4 With two causal realms, for example, there is the problem of causal overdetermination.


my thoughts and feelings are part of me. The clue is


the "part of me"; for whilst there is no actual soul,
the "me" substitutes it. "Isn't it all going on inside
me? We're just talking about what is inside the physical boundary."5 But the inclusion of all these elements
(blood, neurons, thoughts) is the soul notion now defined by the physical boundary. Returning to our three
soul characteristics, we see they are operative: (1) Inside the boundary is the being of the individual. (2)
Over a lifetime this boundary, defined by the DNA,
even if bits are lopped off or artificially welded on,
identifies what is the individual as enduring. When
dead, unless cryonically maintained, the boundary
goes. (3) Experiences are inside the boundary. The
individual cannot experience outside themselves, for
even "out-of-body" experiences are deemed to result
from brain states within the boundary. The boundary
defines what are the individual's experiences, even if
not exactly what it is that is experiencing or what
experiencing is.
By these lights, the mind-body problem is indeed
problematic. But if these lights misportray the situation, then the problem isn't there. How could this be?

Bewildering Experience
What is not at issue is that whatever goes on in the
brain and nervous system is operating to effect what
the body does. If we could imagine the body operating
without experience, then we would be dealing with a
"pure" physical entity acting in the world. We would
be dealing with one of the philosophers' favorite characters: the zombie. This entity has sensory organs, an
operative body, a ratiocinating brain full of plans and
programs designed to achieve prespecified and developing goals. It is, in fact, just the same as a human
being except that it does not experience. And for such
philosophers as David Chalmers (1996), therein is the
mystery: Why do humans experience? What is its
point?
I put this slightly differently from the strictly nonreductive thesis of Chalmers and fellow travelers, because the nonreductiveness is really secondary. The
central and positive question is: What is experiencing
for? Its nonreductiveness or potentially epiphenomenal character only become an issue if no biological
purpose for experiencing is identifiable. And the difficulty lies exactly where Descartes posed his solution.
5 About this most writers are absolutely explicit, for example Dennett (1991, 1996) and Damasio (1999); see below.

If there is damage to the foot, the signals travel to the


brain which, transferred to the mind, results in pain,
which alerts the I of the mind (or the mind itself) to
take remedial or preventative action. But our zombie
does not need all this mind stuff because all that is
involved in the process, including the remedial and
preventative action, can be carried out without the intervention of (experienced) feeling. For all that is (really) involved (i.e., at the neuronal reductive level)
is information processing. Thus our experiencing is
apparently superfluous, mysterious, beyond physical
explanation.
Well, for Descartes it was information processing, but it took place in a mental substance divorced from the body, transduced from the body through the pineal gland. The problem for Chalmers et al. is: if it is not divorced from the body, why does it
exist over and above that which is truly necessary for
information processing, the neuronal channels of the
body with their electrochemical transmitters?

Into Biology
Transition to the Organismic Approach-Various
Writers
It is indicative of its power that even those who do
not accept nonreductive positions like Chalmers' can
find themselves snared by the soul notion.
The organismic approach resists what consciousness appears to reveal "to us." It is de facto anti-Cartesian. Why? Because what the organism does it must do without recourse to what we appear to be as "a knowing thing" as consciousness, unless we insist on elevating it above biology (cf. Clark's final chapter, 1997). There is no way of creating a divide between
1997). There is no way of creating a divide between
organism and experience, and granting experience
some grounds the organism does not have, without
accepting the sense, if not the letter, of Descartes' proposal (otherwise, in Ryle's words [1949], the ghost
remains in the machine).
John Searle's (1992) particular physicalism attempts to do so by saying that brains are of such a
biological character that consciousness (whatever that
is deemed to be) just is its result. Thus, mental processes in consciousness-like introspection and transparency-can be understood as a given of the nature
of the emergent domain consciousness is. But, as
many understand it, this is nonexplanatory and/or implausible.



Daniel Dennett, on the other hand, avows a kind of organismic approach. "From the outset I worked from the 'third-person' point of view of science, and took my task to be building ... a physical structure that could be seen to accomplish the puzzling legerdemain of the mind" (1994, pp. 236-237). Dennett's
(varying) positions divide into two groups: The first
concerns intentionality made manifest by "mere machinery"-the decomposed homunculi of mental
tasks, for example (his topic of content).6 The second
relates how the brain actually operates as, or generates, consciousness-his multiple drafts model, for example (his topic of consciousness).
Before looking at Dennett's position, it will help
to be more precise about what is involved in the organismic approach. There are four points:
1. Brains and nervous systems appear to do all the
work in the physical universe of operating the human being.
2. We have (are) experience, which it is understood
results from consciousness, which itself results
from, or is an aspect of, the brain's activity.
3. There is an explanatory set of concepts called psychology (various actually) which account for human behavior, and, to varying degrees, incorporate
the apparent nature of consciousness.
4. There is, in some as yet unspecified way, the question of what the above says about what is going on
in human biology.
The organismic approach must take point 1 not
only as mandatory but also as precedent. Philosophers
of mind (usually) think about the person reversing the
precedent order between points 1 and 2. Dennett explicitly takes the (normally) antiphilosophical route.
But having done that, we return to point 4. What
is the biological position of consciousness, point 2?
For it is the fact of point 2 that generates the possibility
of point 3, psychological explanation.
Dennett's way of dealing with this is twofold.
First, Dennett does not believe in the actuality of point
3, psychological explanation, because as a robust
physicalist, he considers the intentionality of a human
being "as if"; the organism is to be interpreted as if
it believed and desired (for example), whereas it only
seems to because actually it is just (bio-)mechanics
(see his Intentional Stance, 1987).
6 Intentionality, or "aboutness" as is often said now, is the mark of the mental according to Brentano (1874), a term of scholastic origin.


Second, in the case of consciousness, the observer (third-person) heterophenomenological method


interprets the untrustworthy subject's (first person)
phenomenological account. A subject's self-understanding is (anthropologically) edited and contextualized by the observer, and thus given objective
credence. What the first person's phenomenology
seems to be, a happening inside them that is observed
by them, is just the judgment they have and can express; there is no Cartesian Theater, no (added) observer-subject of the phenomenological "scene" over
and above the judgment the subject (can) talk about
(1991).
Moreover, Dennett's view is that there is no self,
but a Center of Narrative Gravity: the things we tend
to say because those are the ways we have (learned)
to be. Externally (just as in the Intentional Stance)
there is an appearance of self because these same or
similar things issue from the same mouth (along with
other behavior). Inside it could never be found.
But in spite of all of this, and contrary to his
avowal, Dennett's position has not left the soul notion
behind. Consider these sentences.
Mental contents become conscious not by entering
some special chamber in the brain, not by being transduced into some privileged and mysterious medium,
but by winning the competition against other mental
contents for domination in the control of behavior
and hence for achieving long-lasting effects-or as
we misleadingly say, "entering memory." And since
we are talkers, and since talking to ourselves is one
of our most influential activities, one of the most effective ways for a mental content to become influential is for it to get into position to drive the language-using parts of the controls [1996, pp. 205-206].

Dennett's view here is that the brain "harbors"


mental contents that are vying to become current consciousness. But the brain has no mental contents.7 Current best theory says the brain is a massive modular neural network, operating by electrochemical transmission. Dennett concurs with this: for mental means (for example) intentional, which for Dennett is "as if." What he really means, presumably, is that neural states vie with each other to become conscious-and
apparently intentional. But why are neural states vying
to become conscious? Answer: to be influential. On
7 Cf. "If your model of how pain is a product of brain activity still has a box in it labeled 'pain,' you haven't yet begun to explain what pain is" (Dennett, 1991, pp. 454-455).

what? Behavior. Why? How? Dennett never convincingly explains.
Since, as is well known, our behavior is extensively controlled by neural processes that are never
conscious, including some decision taking,8 and much
that we are conscious of seems irrelevant to our current behavior,9 and sometimes as consciousness we
say things differently from what we actually intend
and do, and as Freud pointed out we often do things
explaining ourselves for reasons that are not why we
are doing them at all (see Nisbett and Ross, 1980, for
experimental results), what is needed is an explanation
of the significance of those that actually are conscious.
Thus Dennett's account, that our judgments, our narrative, our images control our behavior, simply does not
adequately explain either why they are actually there
as they are-or not, while behavior still goes on. The
soul might necessarily meaningfully control (or Descartes' mind with its clearly thought self-understanding), but this is not what consciousness necessarily
does.10 Moreover, even if, for Dennett, conscious
states are brain states, he is still left with the apparent
duality of domains controlling our behavior: viz, why
does control have to be this emergent consciousness
as and from the physicality?
He says that: "Such questions betray a deep confusion, for they presuppose that what you are is something else, some Cartesian res cogitans in addition to
all this brain-and-body activity" (1997, p. 206). But
that is not the issue. The issue is: Why/how do just
these images, words, feelings, moods, thoughts control
what the organism does? What is the relation between
my seeing the roses and my buying some explained as
brain function? What functional contribution to what
I do (as organism) is it that I have a seeing (or thought
or feeling)-point 4? The soul notion may embrace
domains as me or mine, but it cannot tacitly be assumed to explain their interconnection; it does not.
The neurobiologist, Antonio Damasio, whose
work The Feeling of What Happens (1999) has been
widely discussed, takes an explicitly organismic approach. "Consciousness begins when brains acquire
the power ... of telling a story without words, the
story that there is life ticking away in an organism,
and that the states of the living organism, within body
bounds, are continuously being altered by encounters
8 Experiments of unconsciously influenced decision making (e.g., Damasio, 1999, p. 301).
9 Not only our rich explicit fantasy life, but much else besides.
10 Memory, for example, is procedural as well as factual and personal;
but procedural memory (e.g., learning to ride a bicycle) is not rendered
by conscious conceptual thought.

with objects or events in its environment, or, for that
matter, by thoughts and by internal adjustments of the
life process" (p. 30).
The crucial move Damasio makes is to locate
consciousness as an organismic biological component.11 What that component does, in principle, is portray the encounter between organism and world
(including itself as "inner" world). It is a story of
organismic engagement. So Damasio takes on point 4
directly. Thus he disengages the intuition of consciousness as the viewed scene of the observer, Dennett's Cartesian Theater. Moreover, and more true to
the phenomenology, he does not insist on Dennett's
unconvincing essential tie of consciousness and the
control of behavior.
Damasio takes as a central fact the findings of
Libet, Curtis, Wright, and Pearl (1983). In the last
two decades, Libet's location of the delay between
the nonconscious beginnings of voluntary action and
awareness of same have carried the implication that
consciousness (and thus what we experience) cannot
be a state that has its own powers of initiation or
control of action.12 For consciousness seems to come
at the end of the brain (and nervous system) processing
cycle, when action (for example) has already been initiated; the delay is between 350 and 500 msec.
Damasio's original project (1994) along these
lines was to rehabilitate emotion into an account of
the reasoning processes of the human organism.
Broader now, as the title of this book indicates, he
looks to link the facts of experiencing and the facts of
neurophysiology, thus to position consciousness in the
biology of the organism. The following summarizes
various elements of Damasio's position:
The idea of consciousness as a feeling of knowing is
consistent with the important fact I adduced regarding
the brain structures most closely related to consciousness: such structures, from those that support the
proto-self to those that support second-order mappings, process body signals of one sort or another, from those in the internal milieu to those in the musculoskeletal frame. All of those structures operate with the nonverbal vocabulary of feelings. It is thus plausible that the neural patterns which arise from activity in those structures are the basis for the sort of mental images we call feelings. The secret of making consciousness may well be this: that the plotting of a relationship between any object and the organism becomes the feeling of a feeling [1999, p. 313].

11 One reason for this is that Damasio does not come from, or endorse, the computational view.
12 For example, "The brain evidently 'decides' to initiate, or, at least, prepares to initiate the act at a time before there is any reportable subjective awareness that such a decision has taken place. It is concluded that cerebral activity even of a spontaneously voluntary act can and usually does begin unconsciously" (Libet et al., 1983, p. 640). However, Libet's own explanation of this, that consciousness acts potentially to veto unconsciously initiated action, is dualistic and implausible, since any veto consciousness imposes must itself be unconsciously initiated 350 to 500 msec previously, as commentators have pointed out. Libet's findings actually support our common experience. In fast sporting action "decisions" are taken before awareness of them. And we duck before hearing the thunder, when it is overhead.

In being a "feeling of a feeling," Damasio is attributing consciousness to a second-order neural
state "knowing," or being the feeling of the thereby
known, first-order feeling state. He is not explaining
how this can happen, but hypothesizing both what and
where the mechanisms may be, and their functional
significance. For the "being known" experienced as
consciousness raises it to functional significance by
being the known process of an experiencer. "Consciousness is thus valuable because it introduces a new
means of achieving homeostasis ... [e.g.] handl[ing]
the problem of how an individual organism may cope
with environmental challenges not predicted in its basic design such that conditions fundamental for survival can still be met" (p. 303).
This value, at its most basic level, is that: "The
simple process of feeling begins to give the organism
incentive to heed the results of emoting" (p. 284).
Emoting is not a conscious state but an organismic
state in relation to an object that can become feeling,
and then consciously known when it is second-order.
"Suffering begins with feelings, although it is enhanced by knowing, and the same can be said for joy"
(p. 284). The value of this is: "That the mechanisms
which permit consciousness may have prevailed [in
evolution] because it was useful for organisms to
know of their emotions ... and it became applicable
not just to the emotions but to the many stimuli which
brought them into action" (p. 285).
Now although Damasio's account of consciousness as a portrayal of neural states may be an advance
over Dennett's control account, a problem remains.
And this is in precisely the same place. An emotion
rises from a neural state to (another neural state of)
feeling to (yet another neural state of) a knowing of
feeling. What exactly is being explained thereby?
What domain is inhabited by feeling and the (conscious) feeling of feeling? And why, moreover, should
this be motivational (or an "incentive" in Damasio's terms) for the organism?13 While neural states function, we are simply adding in another stratum of apparentness, that of the mind where all this
representational occurrence is taking place. And why
should this bring evolutionary benefits? That there are
occurrences does not explain why those occurrences
function as claimed. Specifically, a feeling of pain
does not explain its function, since in Descartes' account, the mind has to be assumed for the function to
operate, and so far no fundamental explanation of the
mind exists.
Moreover, as the Libet data suggest, everything
that is involved in the causal process of seeing the
roses and deciding to buy occurs unconsciously. We
are left purely with the results of the process (the neural portrayal) as what is conscious. Consciously I decide nothing. My consciousness simply follows along
behind what my brain is deciding nonconsciously.
"The idea that consciousness is tardy, relative to the
entity that initiates the process of consciousness, is
supported by Benjamin Libet's pioneering experiments," Damasio says (p. 287).
Thus Damasio's account remains on parallel
tracks. The nonreconciliation is caused, we diagnose, by the soul notion; because, in the principle of the
soul, whatever is going on in the organism has to be
in and for the organism qua organism; both neural
states and their conscious portrayal: isolated, solipsistic, self-operative.
Damasio correctly begins by establishing the precedence of point 1 (organism) over 2 (consciousness),
and has an attempt at explanation of point 4 (biology).
But because point 3 (psychology) is still his operative
understanding, he lapses in his further explanation to
a position where points 1 and 2 are equal and irreconcilable.
Indeed, it has been said truly that neuroscience
has not yet done its job: It has no explanatory domain
for the neuro- but simply grafts psychological concepts onto brain locations and their interconnections.
In the words of J. Graham Beaumont (1999): "Neuropsychology is in a conceptual morass. Neuropsychologists seek to study the relationship between brain and
mind, but without ever really addressing the status
of these two constructs, or what potential forms the
relationship between them might take" (p. 527).
The wretched ghost still haunts the feast.
13 That feelings (or sensations) are motivational is a standard premise in neuroscience and neuropsychology.

Some writers, on Libet et al.'s findings, have come to the conclusion that there is something mysterious going on about what is happening as consciousness.14 One such is Guy Claxton. In his paper,
"Whodunnit? Unpicking the 'Seems' of Free Will"
(1999), Claxton attempts to reconcile Libet with the
sense, in consciousness, that we have free will. The
notion of free will (of the mind, self, soul, person,
etc.), it has been long argued, does not coincide with
the determinism that neurophysiology implies.
The sense of free will, the seems of it in Claxton's
terms, is exactly knowledge occurring as consciousness to which Damasio refers. But this "knowledge"
is clearly not factual or justified (in philosophers'
terms). It is simply the that that the process of acting
can incorporate as a portrayal of the requisite neural
states. When I buy the red roses, I seem to be (I have
a sense that I am) acting under my own volition. Dennett's heterophenomenology is apposite because, although I may say I act voluntarily, my subjective experience does not describe the reality in the physical universe; that I describe myself in this manner from my experience is, however, an important fact about how neural states can portray themselves. Claxton
gives an example this way:
Conscious premonitions are attached precisely to actions that look as if they are going to happen - and then sometimes don't. What folk psychology construes as "will power" - with its notorious "fallibility" - turns out to be a property not of some imperfect agent but a design feature of the biocomputer. ... An updating of prediction [as when we appear to "change our mind"] is reframed as an inner battle between conflicting intentions - and as further evidence of the existence of the instigatory self [p. 109].

Claxton supposes that the biocomputer - by which he means what the organism does, most of
which is unconscious-portrays itself in kinds of conscious experience that we interpret psychologically.
These require such "sleight-of-hand" concepts as:
[A] sense of self which includes a kind of dummy instigator claiming authorship [for our actions]. Self-as-instigator is really a simple subroutine, added to the biocomputer, which does not affect the latter's modus operandi at all, but simply takes the glimmerings of a naturally-arising prediction, and instantly generates a "command" to bring about what was going to happen anyway [p. 111; slightly changed for quote].

14 In Continental philosophy, deconstruction of consciousness happened much earlier in the twentieth century, from Husserl to Heidegger and Merleau-Ponty; see more recently Varela, Thompson, and Rosch (1991), particularly chapter 8.

Claxton recruits Libet's findings to mark the differentiation between what the organism does in its modus operandi and what is portrayed as, for example, the "self-as-instigator."15 This is undoubtedly in the organismic camp.16 The features raised, put together
with others in this section, build an emerging picture
which we will address explicitly later in this paper.
But still, of his account, we must ask exactly what
the point of there being both a biocomputer and conscious experience which is deceptive of its actual powers? For it must be the biocomputer that causes the
conscious experience, and if the conscious experience
is, in some sense, a fake or an illusion or deceptive17 - all these words are used in the literature to describe our experience as opposed to some other (e.g., neural) actuality: (1) What is the point of our experience in the first place? (2) Why is experiencing's illusory nature, in this account, simply (in Claxton's words) "comforting"? (To what, the soul?)
Claxton does not address either of these points.
A similar problem arises with the views of Peter
Halligan and David Oakley in their New Scientist article (2000), which refers to (and endorses) Claxton's
paper. Halligan and Oakley confuse a mentalist model of the conscious-unconscious with the model conscious-neural - their "unconscious parts of the brain." There is an assumption that whatever neural
processing is doing, which is of course inaccessible
per se to consciousness, it supports the still justifiable
distinction, which Freud's work exemplifies, between
an unconscious mind full of mechanisms, schemes,
and plans, including popping things into consciousness, and a consciousness ("us"), which gullibly accepts all these pulled strings of the unconscious. They
finish their article thus:
Perhaps by now you will have begun to think of yourself differently, to realise that "you" are not really
in control. Nevertheless, it will be virtually impossible to let go of the myth that the self and free will
are integral functions of consciousness. The myth is
something we are strongly adapted to maintain, and
almost impossible to escape from. Maybe that's not so surprising, because, after all, "we" are part of the illusion [p. 39].

15 He also seems to endorse Libet's own account of the significance of consciousness, which we reject.
16 Although, for similar reasons to Dennett, the computer analogy is not helpful.
17 For example, The User Illusion is the title of Tor Norretranders' book on consciousness (1998).

But, as said of Dennett's probable slip of the pen, no equation of the unconscious and neural can be
made. It is highly probable that neural representation,
as consciousness, gives a vision (image, picture, etc.)
of the world, and feelings and language. But that does
not mean that the brain is the unconscious as a physical implementation of scheming psychological states
manipulating consciousness. The brain, for example,
has no feelings. All that is available to science is our
experience as consciousness, behavior, and brain; all
else is psychology. And psychological explanation, useful in its own way, is unsatisfactory as scientific theory (as Paul Churchland [1981] contended long ago)
and should not be claimed by neuroscience, which is
supposed to be an account of brain function and its
manifestations. This brief survey demonstrates that,
although there has been steady progress on the organismic view, it is still not a clear view.

The Organismic Approach


Preface
The aim, in this section, is to stabilize some central
principles, appreciate their import, and propose a research program.
When Freud derived human psychology from biology he was faced with precisely the same problem
as now. Recent brain science has been informative.
Brain deficit analysis and imaging have specified location and participation in the process of generating psychological states. However, the transitions remain
where they were. How are we to understand the move
between: (1) brain states; (2) conscious experience;
(3) psychological explanation? The initial step is to
disengage entirely from the influence of the soul notion. We do this by concentrating on (4) biological explanation.
The assumptions about consciousness we have
reviewed have not captured its biological function in
a satisfactory way (as proposed at the beginning of
"Into Biology"). If we do find a satisfactory account,
we should genuinely appreciate its evolutionary advantage. But will it then actually be consciousness?
For we must end up with no inexplicable perceptions,
awarenesses, or invocations of knowing. Our account
must be entirely adequate in physical terms, with causality residing in the physical processes. Moreover,

regarding the organism qua organism, the account must enhance our understanding of it, not leave us
baffled. How do we do this?
We will address the task in two ways. First we
will attempt to grasp the phenomenon of consciousness directly to identify a function for it in the next
section. Then we will pursue the argument for that
discovered function. Finally, we will review some of
the ramifications, before proposing a research
program.

The Mind Banished


The first move is to investigate more precisely what
seems to be going on between brain states and consciousness. But now we consider consciousness itself
to be under question: There is a phenomenon but we
are not sure what it is. Where it is the target of our
investigation, therefore, the word will be put in quotes: "consciousness."
We might suppose that if you and I look at a tree,
our "conscious" image of the tree will not be that
different, although our interpretations of and associations with the tree, and the world it is in, and other
current thoughts, will be. This is not to lapse into the
myth of the given; it is to suppose that neural processes
that manufacture the tree's image will not be that different. Again, although we will not have exactly the
same toothache pain, where it is manufactured by similar neural processes, it may feel similar. 18 Otherwise
the aim of neuroscience-to find the locations, interconnections, neural firings, and particular neurotransmitters of conscious or behavioral events-would be
implausible. Besides, some neural activities are known
very precisely in relation to their world targets. Given
this, the notion of the tree being "in mind," or such
phrases, may be considered redundant. We could just
say, with Damasio, that the image is the portrayal of
a brain occurrence. Might this help?
It is widely agreed that conscious experience results from heavy neural preprocessing (see, for example, Damasio, 1999; McCrone, 1999). This both
assembles the image (binding, in the brain, different
elements) and generates memories, ideas, and feelings.
18 This is not to say our pain will be the same for the same injury; cf. Patrick Wall (1999), below, footnote 19.

Combining Damasio's and Libet's findings - our "conscious" state is a portrayal of the active brain structures postdating the response of the organism - we may see that the notion of a mind as intelligibly doing things is redundant, an unhelpful façon de
parler. For example, the notion that the mind associates memories or thoughts with the image of the tree
offers nothing over and above the brain binding its
structure to give whatever occurs. Suppose I think of
another tree I saw, or a girl I once knew under a similar
tree, or a past tree struck by lightning: All that needs
be said is that the brain has bound whatever is apposite
to its operation now. It is occurrent.
Consider further this line of argument. Every neural state that is bound as "consciousness" expresses
(portrays) the neural status of the brain. Therefore we
need not suppose there is a neural connection between,
for example, memories qua memories. With the image
of the tree and the memory of a girl under a similar
tree, my "consciousness" does not thus remember the
girl by (causal) mental association. She is simply a
bound neural state. Of course, the girl may occur, associated with bindings of other neural elements that bespeak "this is the past girl under a similar tree," but may not do so explicitly or immediately or ever. The girl-binding may be a neural process on some kind of neural association, that is, neural connections laid down when the original experiences occurred as neural states, or linked to others for brain-biological functioning. As they occur, part of the way they are expressed as "consciousness" may involve the sense that there is a mentalist association. But as in Claxton's example of the sense
of free will as causal, we would consider this an aspect
of the portrayal process, not indicative of mental powers. The girl may occur as a mere unrecognized
fragment.
Thus the argument would conclude that "consciousness" is not causal upon what happens in the
organism. Neural states cause each other; they cause
behavior; they also cause what is "conscious." But a "consciousness" does not cause another "consciousness." Nor does it cause nonconscious activity of the
brain: that is, the brain does not read its own conscious
states on the supposition that otherwise it would not
know them (as is the case in, for example, Baars's
Work Space Theory of Consciousness [1996]).
This denial of conscious causality is most dramatically understood by pain not being a motivator.19 However the neural states causing the experience of pain achieve their result, the pain itself causes nothing in the organism. For what the organism does - in taking action, for example - it does by silent neural states subsequently represented as "consciousness."

19 That the brain in fact regulates pain felt in appropriate contexts (i.e., that the experience is not a necessity of physically damaged states), is described by Patrick Wall in his book Pain: The Science of Suffering (1999).
But, while this holds the causality of the organism
in the physical world, we are still left with the "conscious" occurrence of our experience. We must conclude, therefore, that our experience must be for some
other purpose than internal causality.

"Consciousness" Is the Medium-Mechanism for the Portrayal of Neural Status for External Communication
And the inference to the best explanation must be that
it is to communicate the status of neural states to conspecifics.
Although "consciousness" and the experience it delivers are neural states, those neural states are directed outwards to the world. This (dis)solves the
causal mind-body problem (though not how "consciousness" is made). It also provides the biological
functional account for our experiencing, that fits with
Libet et al. It answers, too, J. Graham Beaumont's
(1999) concern about the status of the construct
"mind." The answer is: There is no such thing. Psychological explanation may remain in common use,
but it is not neuroscientific.
But now the question is: How can "conscious"
states communicate when they are internal to the organism? This brings us to the difference between consciousness and' 'consciousness." To understand it, we
must appreciate the incredible difficulty of communication between organisms that biology has to solve
without perception, awareness, or knowledge invocation. After all, organisms are just physical objects. Yet
they are brought to common action.
The organismic approach proposes that "consciousness" is a physical (neural) state of representation. It represents to signify. It signifies to communicate
to conspecifics how the organism finds the world, and is reacting to the world.20 Expressing these contents exhausts the function of "consciousness."
Signification is unproblematically physical. But
why does it happen?
For the organism to react to the environment, it
needs to represent aspects of the environment within
its brain so that, via sensory input from the environment, it can select beneficial action by manipulating
20 Dennett (1991, pp. 410-411) refers to something like this as "semiotic materialism," quoting David Lodge.


those representations. They need to bear no relation
to the environment in terms of form or content (for
example, they could just be synaptic weightings in a
neural network), but clearly they must bear a correspondence. This is relatively uncontroversial. But historically there has been an aim to show why these
representations (for "higher beings") involve consciousness; why, as the phrase goes, there is thereby
a world for the organism. We shall call this position
representationalism.
But, suppose the organism needs to communicate
with conspecifics. How will it do it? Uncontroversially, it is supposed, by transmitted codifications (e.g.,
signs, smells, sounds) that fit the representational capability of the reciprocal organism. For we must suppose that, just as the environment is represented in the
organism, so must be the communications of conspecifics. For organism B to react to organism A, B needs
to have some way of recognizing what A is communicating. Since neural structures operate in the brains of
A and B, what is involved in the communication must
be the modification of B's neural states by A's communication.
To react to the environment, the representational
structure in the organism need not match the environment in form and content. And this is true in B for A's
communication. B need not have a representational
structure that "pictures" A or A's environment. All
that matters is that B can read A's "code." Moreover,
in order to be able to respond to the input from A, B
must first integrate its current states and other information it is receiving from the environment. This process may be as complex and intricate as necessary.
Its form and functionality should be as adaptive and
flexible as possible. It can be silent.21
But it is different when considering that A communicating to B involves B communicating to A. For
what concerns the organisms is not how they deal with
the input from each, but the nature of the reciprocating
output. What is now required is a structure that is as
communicable as possible. What is required is consistency and appositeness. What is required is a form
and content that the organisms can have adequately
in common. Moving up the phylogenetic scale, where
ability to manipulate the environment increases as organisms increase in functional capacity, communicative representation likely will involve the form and
content of the environment in which the conspecifics
operate. For this is what they have separately but in common, and need common command of.

21 "Silent," in the text, will mean "not apparent"; that is, not an image or thought or word or feeling.
And there is a requirement, too, that coexisting,
or corelated elements of what is to be conveyed in the
communication should be copresented, or integral, as
the communication. Significant interrelation of the elements also needs to be represented. 22
Now to develop what is involved in this proposal,
we build up the steps:
1. For what the organism takes in and manipulates of the environment and conspecifics, what is
needed is a fast, task-specific yet widely comparative/
significance-balancing processing structure, to facilitate behavioral response to a complex and changing
environment.
2. For communicating to conspecifics, what is
required is a reciprocal, apposite, and consistent codification between the conspecifics.
3. Thus in the communication of A behaving in
the environment which B is to receive, A must assume
in B that what it (A) is communicating can be received
by B, and then reciprocated by some appropriate response.
4. If A is behaving in an environment which will
influence B's behavior-for example if they are hunting lions with a gazelle in view-A must suppose implicitly that B can represent "A's behavior in the
environment" and respond to it.
5. Now consider what happens in B as a result
of receiving the communication of A. B processes the
scene by its complex silent method that does not require representational structures that have the form or
content of the environment. What is involved in its
output to A? Well, B also cannot communicate the
environment in which it responds to A; it also must
assume (implicitly) that A is able to represent "B's
behavior in the environment" in A.
6. So, without any notion of the environment or
behavior or the gazelle or the conspecific, communicative success between the lions must assume all aspects
are reciprocally represented within them by some
means.
7. Thus there are two representational domains
involved in communication, input and output. And
there are two relevant factors: the actual behavior and
gestures of the organisms, and the environment of
their actions.
22 The obvious distinction being drawn is between the massive neural network, parallel processing of the brain, and the (brain) images of "consciousness."

8. B processes input from A by the complex silent method. The "conscious" state in B, as a result,
then entails A's movement, A's gestures, the gazelle
and the environment. That very "consciousness," it
is concluded here, is the response of B's organism to
A's input.
In other words, from the input processing in B,
the "consciousness" of the scene in B becomes the
analogical output to A. It is in a representational form
that fits with A's requirement for adequate communication. This is because it is the analogical recreation
in B of what A is doing in the environment, together
with B's actual reaction - thus output.
To lay it out further: If A moves to the right, B's silent neural states analyze A's action, and then A's move to A's right is created as B's "consciousness." For A (implicitly), this is B's response (i.e., output) to A's move. But in addition if, for hunting success, B also should move in a direction in accord with A's move, B's silent processing of A's move to A's right also causes B's move. Though this is a move per se,
it is also output to A. Thus the input processing locks
each to the other in the hunt program, and the output
signifying that interlock is the reciprocal "conscious"
state (which the other must assume) together with the
behavioral change.
What is explicitly communicated is movement, gesture, and vocalization - what we call the reactive element. What is implicitly communicated is the environment, including the conspecific and gazelle - what we call the contextual element.23 Of course the commonality of context is not precisely the same; it has, as it were, a plasticity within its mutually defined operative locale.
9. Put another way, when you and I are talking
(which lions cannot) we do not have to say to each
other constantly: There is a wall, there is a chair, there
is a table, there is a window. Our organisms are already communicating this (nonconceptual) commonality that we assume by our common image as our
separate "consciousnesses." When my words (the input) are processed by your silent neural states and then
become "conscious," you take them as coming from
me (without considering it), for I am represented as
your "consciousness" too. But when they are your "consciousness":
Already they represent your (organismic) output to me, for already they have associated the reaction you (organism) have to them - how you hear my tone, and your gathering response, for example - which you may communicate.
The contextual commonality is not outside of us in which we dwell; it is what we both project from ourselves as the shared context of our intercommunication.24

23 Here we are treating of the "directly successful" proper function (Millikan, 1984) of "consciousness." Output continues whether there is a conspecific around or not.

Context and reaction as output are the necessary actualities of the biology of "consciousness." This
functions for nonverbal lions too.
10. Moreover, it is evident that when I do point
to the chair, my "conscious" image of the chair that
I am pointing to is part of my gesture. For the brain,
the extended finger and the image to which my organism is directed are a seamlessly linked act. 25 And you
take this to be so, partly explicitly by the extended
finger, my gaze, my words; and also implicitly, for you
too have the image to which I am pointing which I
assume you have as your reciprocation.
11. In this account, the reason for the integral
nature of "consciousness" is evident. For humans,
and presumably some other animals, output must be
comprehensive enough to be an apposite analogical
representation of "conspecific behavior in the environment" (see also footnote 23).
12. The brain does not consider the "conscious"
image it has created and then, from that image, or
using that image as a guide, decide to point, as representationalism supposes. Why would it need to? It is
this interpretation that manufactures the superfluous
character of consciousness. The image has no property
for the brain that exceeds its communicative role (i.e., no intentionality about the world).
13. In the organismic approach, my "consciousness" is actually my organism's analogical output to
you. Although my sense about my "consciousness"
is that it is functioning as input to me or about me, in
fact it is my brain's projection of its status about its
relation to the world, including you. My (misinterpreted) sense of what it is, is part of my organism's
output.

24 Imagination is the brain capacity for the equivalent function without physical copresence.
25 Indeed, considering Libet's findings, the (inaccessible) brain-decision to point subsequently incorporates the image to which the pointing is directed.
26 A comparison can be drawn with the chameleon's skin, but space precludes going further into this.

The Required Shift of Paradigm

What is called "consciousness" is physical analogical signification.26 Its communicative role is for conspecifics, but cross-species communication also can take
place thereby. The assumption from representationalism is that communication just happens somehow (by
brain or mind magic). This is obscure, and still leaves
consciousness mysterious. But, indeed, even if B's brain interpreted what is going on with A, and A's brain what is going on with B nonconsciously, both
brains still must be performing that interpretative task.
The claim here is that the interpretative task, in terms
of output, is the brain activity we refer to as "consciousness." For the function of "consciousness" is
supra-organismic.
But then what we point to as consciousness is not,
in terms of biology, consciousness. The Chambers'
dictionary has conscious as: "having the feeling or
knowledge of something; aware." But biologically,
"consciousness" is not the original location or manner of organismic knowledge: It is the articulation, the
re-representation of the knowledge, causally held in
the silent neural states, that can be communicated.
Indeed, whatever meaning is deemed to be, it results from re-representing neural states for communication to conspecifics. Were we solipsistic organisms,
meaning, in the terms we understand it, need never
arise.
A highly misleading philosophers' phrase is that
phenomena are how objects appear to the mind. The
position here is the contrary. Phenomena are how the
brain projects its status to the world. Without the
mind-which communicates by transmitting over the
(for Descartes) infinite gap between individuals (generating the "Are there other minds?" problem)-one
sees that what is transmitted manifests itself by the
same (or similar) "conscious" occurrence in another
individual, precisely because they have the same (or
similar) neural states. 27
If I say "I see a tree" you (organism) do not hear
me say "I see a tree" but undergo the neural states
generating your "consciousness" as my words "I see
a tree." I do not have access to the manufacture of
my words, nor you of your hearing. But their causal
existence within each of our organisms is compatible
neural states. The spoken and heard words do not assure our (infinitely separate) minds of the communication between us. The neural states those words portray
are the causal actuality our organisms then generate
(as "consciousness") as the fact of communication.
This is the communicative interlock in operation. 28
27 This is not a theory of mind. It is not, for example, like Nicholas Humphrey's (1983) social but dualist view of consciousness as a means of understanding other animals by one's own experience.
28 There is no passive "intake" in consciousness, as the philosophical expression "consciousness of" misleadingly implies (cf. Munz, 1993, p. 68).

Now a riposte to this might be: But what if I'm
alone? I'm not communicating with anyone, so on this
principle my "consciousness" is redundant. But this
is not so. The biological proper function of "consciousness," in Millikan's sense, is to be the portrayal
of neural status, and this will happen regardless of
whether anyone else is a participant in the communicative process (Millikan, 1984). The riposte, again, presupposes our experience is going on "for us," as
mind, as soul.
Suppose one is thinking and has, alone, a new
idea or realization. The organism, at some later time,
may well reuse this idea or realization. But it will
not remember and use that "conscious" state. Silent
neural states will repeat and the idea will recur (possibly including the bound "fact of past occurrence")
and may be useful in some communicative context.
Even if one is only "using the idea" again in some
isolated situation, that the idea occurs does not guide
what the organism does, but is simply available to
be the communication of whatever the organism is
operating as, those silent neural states. To say, as some do, that a verbally expressed thought feeds back per se into neural states is to miss the mechanism, for it preserves the notion that mentality is in the brain.
Indeed, as "conscious" appearance, no mental state
is ever exactly the same, for the brain's states will
never be the same.
A riposte to the banishment of "mind," and all
its terminology, which is so familiar to us, might be
that if one is describing what neural states are doing,
and the construct "mind" is such a description, why
is it not adequate?29 There are three responses. (1)
Mind concepts were invented before more insightful
organismic understanding occurred. De facto, starting from here, we would not presuppose the mind, and
therefore owe it no historical allegiance. (2) The brain
does things that gave rise to the mind story in the
first place-and we need to understand what they are
explicitly, which the current mind story obscures. (3)
We simply will not grasp an organismic understanding, which is biologically of great significance, if we
perpetuate a fiction in the midst of it.
Most disturbing is the understanding that writers
have reached, from Freud onwards, that our existence
is planned and controlled without any conscious
awareness until whatever the brain "decides" to make
conscious. But we now conclude that, even when
"conscious" awareness arrives, it is not awareness in
29 Many writers might take this view, including such diverse figures
as the philosopher Donald Davidson and cognitive scientist Bernard Baars.

the commonly understood sense. For that presumes some fundamental (ontologically primary) link between this very experience as our self and(/to) the
world, time, events, relationships. But the primary link
is in the silent neural states.
So it is that the organismic approach undermines
the soul notion fundamentally. For it concludes that
what is going on in the organism does not correlate
with the "for me" or "as me" sense. What is causal
is unknowable, and what is knowable functions for
the recipient who is not me-or more exactly, to the
organismic community of which I am a member. (It is,
perhaps, Dennett's attempt to save the soul-integrity of
consciousness that undermines his account.)

The Research Program of the Organismic Approach

Since the approach here is novel, any research program requires a fresh start in both understanding and
describing what is going on in the organism, and developing an adequate descriptive vocabulary. There
may be interest in associating aspects of mentalist terminology with the new, but finding the new is the
prime aim. For space reasons, it is not possible here
to lay out formally the structure and content of such
a program. The following is an indication.
The Brain

As the organ responsible for "what we do," "what we think," "how we feel," a number of fundamental
issues about the brain must be pursued. Some of these
lie in empirical science, as in: How does the brain turn
physiology into "conscious" experience? Some lie in
discovering what the brain does in its control activity,
which will be both empirical and according to testable
models. Models exist: The competitive demons idea,
as in Dennett's view of consciousness, as the "winning control of the organism"; Freud's metapsychological models; Baars's Work Space Theory of
Consciousness (1996); and the folk psychological
principles of means/end, belief/desire reasoning.
The problem is that, to varying degrees, these project loaded mentalist assumptions onto the neurophysiology. The task now is to be accurate to the physiology. The rise of the neural network model-which in
an as yet crude way perhaps replicates elements of
brain activity-demonstrates that the physical can
achieve the apparently mental without being primed
somehow with mental functionality. Physical network states can identify objects and perform logical operations without working on any kind of problematic
symbolic content of those entities or processes. This
is all well known.
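Since the point is only gestured at here, a minimal sketch may help (the code, its names, and the training rule below are illustrative assumptions, not anything from the article): a single-layer perceptron comes to compute logical AND purely by numerical weight adjustment; at no stage does any symbolic representation of "AND" exist anywhere in the system.

```python
# Illustrative sketch: a perceptron learns logical AND purely by
# adjusting numerical weights -- no symbolic content is involved.

def train_perceptron(samples, epochs=20, lr=0.1):
    # Weights and bias start at zero; learning is pure arithmetic.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - out
            # Classic perceptron update rule.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# The AND truth table, presented only as numbers.
and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
results = [predict(w, b, x1, x2) for (x1, x2), _ in and_samples]
print(results)  # prints [0, 0, 0, 1] once the weights have converged
```

The network ends up behaving as if it "knew" the logical operation, yet nothing in it stands for "AND"; there are only weights, which is the force of the claim that the physical can achieve the apparently mental without being primed with mental functionality.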
I can only raise two key issues.
1. The death of phosphorescence. The first is a
mistake about what consciousness does, which, despite material here being widely available, has not
been discarded. This is the supposed difference of processing between the conscious and nonconscious; that
when we are conscious "we" are active in a way
that is not the case when nonconscious. For example:
"PET studies, published in the 1990s, were stunning
for the sheer dynamism of the change.... When a task
was novel, wide areas of the brain were lighting
up.... But with just a few minutes' practice, the same
results could be produced with hardly any visible effort at all. A skill that had been learned or explored
in the full limelight of consciousness had become
downloaded to create a low-level action template"
(McCrone, 1999, p. 192).
This description by McCrone of Richard Haier's
experiment on students undergoing a computer game
test (typically) confuses "conscious" apparentness
with causal brain activity. It is reasonable to assume
that the brain must modify itself for the task, and to
begin with processing is wide ranging and complex.
Once modified, it can be replayed without the modification process. But this does not mean that "consciousness" pours its "full limelight" 30 on the
learning process-whatever that may mean. "Consciousness" is the (selected, regulated) portrayal of
what the brain is doing; the brain does what it does by neural processes, which are not, and cannot be, conscious. While
McCrone's glamorizing misdescription is so inscribed
in our vocabulary, a grasp both of "consciousness"
(the brain's communication mechanism), and the neural causation, will remain elusive.
2. Biological translation categories. But clearly,
what causes the organism to act, although describable
in neural terms, must also be describable in terms of
biological categories. For neurophysiology implements biological aims. Freud understood that human
activity realized biological aims, which themselves
were unavailable to consciousness and generalized,
and became specific in the process of realization. This
is how one object could substitute for another in satisfying
some primary biological goal. With a different intent,
but resonant insight, Lakoff (1987) and Johnson
(1987) have found within mental activity and its verbal
30 Ryle's word is phosphorescent (1949).

denotation, primary categories of body-world interaction. Finding the translation categories of brain physiology into the biological aims, portrayed as "consciousness" and carried out in the actions of the
organism, will be a major task.
Now, one might suppose that belief-desire mentalist discourse is such a translation vocabulary. But
belief (for example) is neither a neural nor biological
category. One might say that belief represents a neural
state upon the basis of which the organism acts. Similarly, desire represents a biological category upon
which the organism attempts to alter its state. But the
verbally expressed categorial self-understanding of
beliefs and desires (unless we are going to work downwards from apes to thermostats on the principle that
beliefs and desires are just [bio-]mechanical terms,
as some propose) does not illuminate either what the neurophysiology does or the biological aims of the organism. These are simply not available to belief-desire vocabulary. Common human experience expresses the sense that, despite being able to say what
one is doing, one does not understand it. "I love him,
but I don't know why." "I bought that table; I think
it's perfect-for some reason." The mystery in human
experience is that what we know and express does not
fully explain what is going on with us. But this is not
a mystery of the human soul; it is the divide between
what "consciousness" delivers interpreted in mentalist vocabulary, and us as organism.

Phenomenology

There are various aspects to reinterpreting phenomenology upon biological and neural understanding;
some of this has happened. The brain's operation may
account for aspects of our experience: Brain states occur for a limited time as does attention; immediate
memory can hold up to only seven items on average,
which presumably is a physiological constraint; emotional states coincide with the presence of certain hormones and peptides. These physical-experiential
coincidences (with the brain deficit and imaging analyses) indicate that "conscious experiencing" is physically locatable.
More important for the approach here, however,
is to analyze "consciousness" on the understanding
that what we are dealing with is a biological interpretation of physical states. "Consciousness" portrays
the operation of neurophysiological currency, upon
which already developed structures and states are influential. It does this in terms of how it finds the world (or its own states as memories, concerns, etc.), and
how it is reacting to that finding. It is a twin-poled
expressiveness, as Damasio, and Freud before him,
understood (cf. Freud's two [inner, outer] representational surfaces, as discussed by Mark Solms [1997]).
It is remarkable that evolution should have taken
this path. The hungry infant screams with a "conscious" state of presumably almost complete indeterminacy. The adult has a fine communicable sense of
his or her pangs of hunger together with discriminatory
premonitions of cuisine, and an inventory of appropriate restaurants. "Consciousness" is, as it were, a
biological fabric of interpreted neurophysiology that
during life is woven into communicative possibilities,
particularly with the acquisition of language. It works
because brains are evolutionarily constructed so. In
this sense we are not separate individuals. Although
we may explore the functional details in lower animals
without language-where, for example, it begins in
evolution-we appreciate that language (with little
communicative content) can function because of its
effect on the massive neural structure it activates. 31
This mostly goes unnoticed, being mostly not part of
"consciousness" in the communicators.
Many characteristics of the phenomenology, looked at this way, have simply been passed over by investigation. As in the previous section, only two key issues
can be raised.
1. The biological "sense of." "Consciousness" brings with it its own biological function, the "sense of." Claxton dwells on the sense of free will. But in fact this "sense of" underlies the whole fabric of "conscious" experience, indicating not some actuality of the person's grasp of the world (a mentalist belief category), but merely a preparedness of the organism to continue to the next moment on this indicator (a biological status or action category).
In the case of blindsight sufferers it is said, for
example, that lack of conscious experience limits their
actions. Under experiment, though they are capable of guessing what object is in their blind field from a pre-given list with above-average success (which implies their brains have access to some visual information), they would never act voluntarily in relation to that object because they assert they cannot see it. This
seems to confirm that it is the visual experience itself
that allows a normal individual to act voluntarily
31 Euan MacPhail (1998), for example, is doubtful that our conscious
fabric can be anything like that of other animals despite our sentiments,
for only language, he thinks, enables consciousness of self which he deems
prerequisite. But his view of consciousness is not as communication.

(Marcel, 1988). But the conclusion is not so established.
Blindsight sufferers have brain damage that prevents their organism from registering objects in a way
that, interpreted from the neurophysiology, will appear
"consciously." But this does not mean that it is the
"conscious" experience that enables voluntary action.
It is what, and how, the brain registers (which then appears as "consciousness") that enables voluntary action. With "conscious" experience of seeing comes
the sense that what is seen is there. But this sense
belongs to what "consciousness" delivers, not what
the organism grasps of the object that makes it prepared to act. And the sense arises so that, in the communication mechanism that "consciousness" is, the
expressed assurance of the object being there is made.
If the object is uncertain, through fogginess or obscurity, the organism aims to grasp the object and cannot,
and what is apparent, being ill-defined, may refuse to
be one thing or another. What is expressed as "consciousness" is the obscure and involuntary coming
and going of the nonobject, until suddenly the object
appears with a sense of certainty. "Ah yes, it's an X.
Definitely an X." For the X is "seen." And one would
act on it. But this sense of certainty can be shattered
at the next moment because the brain has gone on
working, quite unbeknownst to what is registered as "consciousness," until there appears before one not
an X but a Y, about which there is again certainty.
There is a clear distinction between what is there
and the biological function "what one is certain of."
What normals confirm over blindsighters is thus not
that experiencing establishes a certainty about the
world (the power of consciousness), but that a neurophysiological trigger that will enable the behavior of
the organism is portrayed. Which is, of course, exactly
what one would expect if not supposing "consciousness" has a transcendental reality-acquiring capacity
(cf. Millikan's attack on meaning rationalism [1984]).
The illusion is, therefore, contra Claxton, not our
experience, but that our experience is itself causal. The
biocomputer is not creating an illusion by the sense
of certainty of a perception: That would only be the
case if "consciousness" aspired to the causality of
the silent neural states. It does not. The sense of certainty is (in this case) to convey that the organism
would act on what is apparent in the perception. Belief
misses this distinction of the biology, as indicated.
2. The death of mentalist reflexivity. Another factor in mentalist phenomenology that misleads is reflexivity. For the way the brain presents its interpretation gives the sense that, out of our conscious states, we can and do reflect on, reconsider, or probe the contents of our own minds. The philosophers' introspection. We do not.
If I have forgotten a name and probe the depths
of my memory, I experience: the state of forgotten,
the state of realizing that I will have to recall because
I have forgotten, the state of attempted recall, and a
vaguer state of trying to "let my mind go blank" so
that the name will emerge. These states are sustained,
rather than being in an exclusive sequence. More accurately, the states not immediate to the moment will
seem to be on a kind of periphery (as vision has a
foveated area, with the periphery blurred or unfocused). And this tells us something about how, in the brain, the assembly of neural states, bound as "consciousness," presents itself: that there is a time/
task-localized-concurrency function (called, to different purpose in the literature, working memory) which
serves to copresent different material that contextualizes extended operation. In thinking of a name, I do
not forget that it is a name I am trying to think of. But
this does not mean that each of these, as experienced,
causally interacts with the others. They are just the
portrayal of the brain's operation, which here is a
whole made up of overlapping segments.
A zombie presumably would not need all this
elaborate presentation for, since "consciousness"
cannot be its means of communication, it (presumably) does not need communicative contextualization
of an extended operation. But a human presents the
context of its state as well as an individual segment
because only thus is it explanatory. Indeed, this very
biological function may become a disadvantage, for
"letting the mind go blank" is the attempt, by the
brain, to prevent the context presentation getting in
the way of the recall task itself.
Copresentation of the segments gives the strong
impression, ensconced in our ideas and language, that
it is our experience that is causal in the interrogation
of our experience or minds. It seems to indicate, for
example, that realizing I have forgotten the name, I
"look inward" for it. But obviously the brain does not "look inward." It tries to engage the right location
in its search method, which it presents as "looking
inward." Perhaps more than any other one topic, analyzing reflexivity will disabuse us of our mentalist interpretation of the nature of "consciousness."

Summary
The aim here has been to identify and outline an understanding of the human organism that can be turned from a set of, as yet, unstructured insights into an
explicit research program. Much that is relevant has
not even entered the discussion.
Humankind is social, but not because individuals
wish to communicate the lonely depths of the soul, or
even the time of day. Humans, like other organisms,
are just physical objects. It is because in human biology the communicative function, "consciousness,"
necessarily causes the organism to share a communicable commonality of experiencing. The appearance
of our perceiving, thinking, and feeling is a biological
function to share organismic status reciprocally with
our fellows as an evolutionary feature which enhances
our survival and reproductive efficiency, particularly
in the achievement of common action. Otherwise, as
we sense, it is not merely pointless but a burden.

References
Baars, B. J. (1996), In the Theatre of Consciousness: The
Workspace of the Mind. New York: Oxford University
Press.
Beaumont, J. G. (1999), Neuropsychology. In: The Blackwell Dictionary of Neuropsychology, ed. J. G. Beaumont,
P. M. Kenealy, & M. J. C. Rogers. Oxford: Blackwell.
Brentano, F. (1874), Psychology from an Empirical Standpoint, tr. A. Rancurello, D. Terrell, & L. McAllister.
London: Routledge.
Chalmers, D. J. (1996), The Conscious Mind. Oxford: Oxford University Press.
Churchland, P. M. (1981), Eliminative materialism and the
propositional attitudes. In: The Nature of Mind, ed. D.
M. Rosenthal. Oxford: Oxford University Press, 1991,
pp. 601-612.
Clark, A. (1997), Being There: Putting Brain, Body and
World Together Again. Cambridge, MA: MIT Press.
Claxton, G. (1999), Whodunnit? Unpicking the "seems"
of free will. In: The Volitional Brain, ed. B. Libet, A.
Freeman, & K. Sutherland. Exeter, U.K.: Academic,
pp. 99-114.
Damasio, A. (1994), Descartes' Error. New York: Putnam.
- - - (1999), The Feeling of What Happens. London:
Heinemann.
Dennett, D. C. (1987), The Intentional Stance. Cambridge,
MA: MIT Press.
- - - (1991), Consciousness Explained. New York: Little Brown.
- - - (1994), Dennett. In: A Companion to the Philosophy
of Mind, ed. S. Guttenplan. Oxford: Blackwell, pp.
236-244.
- - - (1996), Kinds of Minds. London: Weidenfeld &
Nicolson.
Descartes, R. (1985), The Philosophical Writings of Descartes, Vols. 1 & 2, tr. J. Cottingham, R. Stoothoff, & D. Murdoch. Cambridge, U.K.: Cambridge University Press.
Fichte, J. G. (1994), Introductions to the Wissenschaftslehre
(1797-1800), tr. D. Breazeale. Indianapolis: Hackett.
Freud, S. (1895), Project for a Scientific Psychology. Standard Edition, 1:281-391. London: Hogarth Press, 1966.
Halligan, P., & Oakley, D. (2000), Greatest myth of all.
New Scientist, November 18, p. 34.
Hume, D. (1739-1740), A Treatise of Human Nature. London: Fontana.
Humphrey, N. (1983), Consciousness Regained. Oxford:
Oxford University Press.
- - - (1992), A History of the Mind. London: Chatto &
Windus.
Johnson, M. (1987), The Body in the Mind. Chicago: University of Chicago Press.
Kant, I. (1781), Critique ofPure Reason, tr. N. Kemp Smith.
London: Macmillan, 1929.
Lakoff, G. (1987), Women, Fire and Dangerous Things.
Chicago: University of Chicago Press.
Laplanche, J., & Pontalis, J.-B. (1973), The Language of
Psychoanalysis. London: Hogarth Press.
LeDoux, J. (1998), The Emotional Brain. New York: Simon & Schuster.
Libet, B., Gleason, C. A., Wright, E. W., & Pearl, D. K.
(1983), Time of conscious intention to act in relation
to onset of cerebral activity (readiness potential). The
unconscious initiation of a freely voluntary act. Brain,
106:623-642.
MacPhail, E. M. (1998), The Evolution of Consciousness.
Oxford: Oxford University Press.
Marcel, A. J. (1988), Phenomenal experience and functionalism. In: Consciousness and Contemporary Science, ed.
A. J. Marcel & E. Bisiach. Oxford: Clarendon Press,
pp. 121-158.
McCrone, J. (1999), Going Inside. London: Faber & Faber.
Millikan, R. G. (1984), Language, Thought and Other Biological Categories. Cambridge, MA: MIT Press.
Munz, P. (1993), Philosophical Darwinism. London:
Routledge.
Nisbett, R. E., & Ross, L. (1980), Human Inference: Strategies and Shortcomings of Social Judgment. Englewood Cliffs,
NJ: Prentice Hall.
Norretranders, T. (1998), The User Illusion, tr. J. Sydenham. New York: Penguin-Putnam.
Ryle, G. (1949), The Concept of Mind. London: Hutchinson.
Seager, W. (1999), Theories of Consciousness. New
York: Routledge.
Searle, J. (1992), The Rediscovery of the Mind. Cambridge,
MA: MIT Press.
Schopenhauer, A. (1818), The World as Will and Representation, Vols. 1 & 2. New York: Dover, 1966.
Solms, M. (1997), What is consciousness? J. Amer. Psychoanal. Assn., 45:681-703.
Sulloway, F. (1983), Freud: Biologist of the Mind. Cambridge, MA: Harvard University Press.

Varela, F. J., Thompson, E., & Rosch, E. (1991), The Embodied Mind. Cambridge, MA: MIT Press.
Wall, P. (1999), Pain: The Science of Suffering. London:
Weidenfeld & Nicolson.
Wittgenstein, L. (1976), Philosophical Investigations, tr. G.
E. M. Anscombe. Oxford: Basil Blackwell.

Philip Clapson
P. O. Box 38225
London NW3 5XT
United Kingdom
e-mail: philipclapson@yahoo.co.uk
