
Postmodernism

Postmodernism is a complicated term, or set of ideas, one that has only emerged as an area of academic study since the mid-1980s. Postmodernism is hard to define, because it is a concept that appears in a wide variety of disciplines or areas of study, including art, architecture, music, film, literature, sociology, communications, fashion, and technology. It's hard to locate it temporally or historically, because it's not clear exactly when postmodernism begins.

Perhaps the easiest way to start thinking about postmodernism is by thinking about modernism, the movement from which postmodernism seems to grow or emerge. Modernism has two facets, or two modes of definition, both of which are relevant to understanding postmodernism. The first facet or definition of modernism comes from the aesthetic movement broadly labeled "modernism." This movement is roughly coterminous with twentieth-century Western ideas about art (though traces of it in emergent forms can be found in the nineteenth century as well). Modernism, as you probably know, is the movement in visual arts, music, literature, and drama which rejected the old Victorian standards of how art should be made and consumed, and what it should mean. In the period of "high modernism," from around 1910 to 1930, the major figures of modernist literature helped radically to redefine what poetry and fiction could be and do: figures like Woolf, Joyce, Eliot, Pound, Stevens, Proust, Mallarmé, Kafka, and Rilke are considered the founders of twentieth-century modernism.

From a literary perspective, the main characteristics of modernism include:

1. An emphasis on impressionism and subjectivity in writing (and in visual arts as well); an emphasis on HOW seeing (or reading or perception itself) takes place, rather than on WHAT is perceived. An example of this would be stream-of-consciousness writing.

2. A movement away from the apparent objectivity provided by omniscient third-person narrators, fixed narrative points of view, and clear-cut moral positions. Faulkner's multiply-narrated stories are an example of this aspect of modernism.

3. A blurring of distinctions between genres, so that poetry seems more documentary (as in T.S. Eliot or e.e. cummings) and prose seems more poetic (as in Woolf or Joyce).

4. An emphasis on fragmented forms, discontinuous narratives, and random-seeming collages of different materials.

5. A tendency toward reflexivity, or self-consciousness, about the production of the work of art, so that each piece calls attention to its own status as a production, as something constructed and consumed in particular ways.

6. A rejection of elaborate formal aesthetics in favor of minimalist designs (as in the poetry of William Carlos Williams), and a rejection, in large part, of formal aesthetic theories in favor of spontaneity and discovery in creation.

7. A rejection of the distinction between "high" and "low" or popular culture, both in the choice of materials used to produce art and in the methods of displaying, distributing, and consuming art.

Postmodernism, like modernism, follows most of these same ideas, rejecting boundaries between high and low forms of art, rejecting rigid genre distinctions, and emphasizing pastiche, parody, bricolage, irony, and playfulness. Postmodern art (and thought) favors reflexivity and self-consciousness, fragmentation and discontinuity (especially in narrative structures), ambiguity, simultaneity, and an emphasis on the destructured, decentered, dehumanized subject.

But while postmodernism seems very much like modernism in these ways, it differs from modernism in its attitude toward many of these trends. Modernism, for example, tends to present a fragmented view of human subjectivity and history (think of The Waste Land, for instance, or of Woolf's To the Lighthouse), but presents that fragmentation as something tragic, something to be lamented and mourned as a loss. Many modernist works try to uphold the idea that works of art can provide the unity, coherence, and meaning which have been lost in most of modern life; art will do what other human institutions fail to do. Postmodernism, in contrast, doesn't lament the idea of fragmentation, provisionality, or incoherence, but rather celebrates it. The world is meaningless? Let's not pretend that art can make meaning, then; let's just play with nonsense.

Another way of looking at the relation between modernism and postmodernism helps to clarify some of these distinctions. According to Fredric Jameson, modernism and postmodernism are cultural formations which accompany particular stages of capitalism. Jameson outlines three primary phases of capitalism which dictate particular cultural practices (including what kind of art and literature is produced). The first is market capitalism, which occurred from the eighteenth through the late nineteenth centuries in Western Europe, England, and the United States (and all their spheres of influence). This first phase is associated with particular technological developments, namely the steam-driven motor, and with a particular kind of aesthetics, namely realism. The second phase occurred from the late nineteenth century until the mid-twentieth century (about WWII); this phase, monopoly capitalism, is associated with electric and internal combustion motors, and with modernism. The third, the phase we're in now, is multinational or consumer capitalism (with the emphasis placed on marketing, selling, and consuming commodities, not on producing them), associated with nuclear and electronic technologies, and correlated with postmodernism.

Like Jameson's characterization of postmodernism in terms of modes of production and technologies, the second facet, or definition, of postmodernism comes more from history and sociology than from literature or art history. This approach defines postmodernism as the name of an entire social formation, or set of social/historical attitudes; more precisely, this approach contrasts "postmodernity" with "modernity," rather than "postmodernism" with "modernism." What's the difference? "Modernism" generally refers to the broad aesthetic movements of the twentieth century; "modernity" refers to a set of philosophical, political, and ethical ideas which provide the basis for the aesthetic aspect of modernism. "Modernity" is older than "modernism"; the label "modern," first articulated in nineteenth-century sociology, was meant to distinguish the present era from the previous one, which was labeled "antiquity." Scholars are always debating when exactly the "modern" period began, and how to distinguish between what is modern and what is not modern; it seems like the modern period starts earlier and earlier every time historians look at it. But generally, the "modern" era is associated with the European Enlightenment, which begins roughly in the middle of the eighteenth century. (Other historians trace elements of Enlightenment thought back to the Renaissance or earlier, and one could argue that Enlightenment thinking begins with the eighteenth century. I usually date "modern" from 1750, if only because I got my Ph.D. from a program at Stanford called "Modern Thought and Literature," and that program focused on works written after 1750.)

The basic ideas of the Enlightenment are roughly the same as the basic ideas of humanism. Jane Flax's article gives a good summary of these ideas or premises (on p. 41). I'll add a few things to her list.

1. There is a stable, coherent, knowable self. This self is conscious, rational, autonomous, and universal--no physical conditions or differences substantially affect how this self operates.

2. This self knows itself and the world through reason, or rationality, posited as the highest form of mental functioning, and the only objective form.

3. The mode of knowing produced by the objective rational self is "science," which can provide universal truths about the world, regardless of the individual status of the knower.

4. The knowledge produced by science is "truth," and is eternal.

5. The knowledge/truth produced by science (by the rational objective knowing self) will always lead toward progress and perfection. All human institutions and practices can be analyzed by science (reason/objectivity) and improved.

6. Reason is the ultimate judge of what is true, and therefore of what is right, and what is good (what is legal and what is ethical). Freedom consists of obedience to the laws that conform to the knowledge discovered by reason.

7. In a world governed by reason, the true will always be the same as the good and the right (and the beautiful); there can be no conflict between what is true and what is right (etc.).

8. Science thus stands as the paradigm for any and all socially useful forms of knowledge. Science is neutral and objective; scientists, those who produce scientific knowledge through their unbiased rational capacities, must be free to follow the laws of reason, and not be motivated by other concerns (such as money or power).

9. Language, or the mode of expression used in producing and disseminating knowledge, must be rational also. To be rational, language must be transparent; it must function only to represent the real/perceivable world which the rational mind observes. There must be a firm and objective connection between the objects of perception and the words used to name them (between signifier and signified).

These are some of the fundamental premises of humanism, or of modernism. They serve--as you can probably tell--to justify and explain virtually all of our social structures and institutions, including democracy, law, science, ethics, and aesthetics.

Modernity is fundamentally about order: about rationality and rationalization, creating order out of chaos. The assumption is that creating more rationality is conducive to creating more order, and that the more ordered a society is, the better it will function (the more rationally it will function). Because modernity is about the pursuit of ever-increasing levels of order, modern societies are constantly on guard against anything and everything labeled as "disorder," which might disrupt order. Thus modern societies rely on continually establishing a binary opposition between "order" and "disorder," so that they can assert the superiority of "order." But to do this, they have to have things that represent "disorder"--modern societies thus continually have to create/construct "disorder." In western culture, this disorder becomes "the other"--defined in relation to other binary oppositions. Thus anything non-white, non-male, non-heterosexual, non-hygienic, non-rational (etc.) becomes part of "disorder," and has to be eliminated from the ordered, rational modern society.

The ways that modern societies go about creating categories labeled as "order" or "disorder" have to do with the effort to achieve stability. Jean-François Lyotard (the theorist whose works Sarup describes in his article on postmodernism) equates that stability with the idea of "totality," or a totalized system (think here of Derrida's idea of "totality" as the wholeness or completeness of a system). Totality, and stability, and order, Lyotard argues, are maintained in modern societies through the means of "grand narratives" or "master narratives," which are stories a culture tells itself about its practices and beliefs. A "grand narrative" in American culture might be the story that democracy is the most enlightened (rational) form of government, and that democracy can and will lead to universal human happiness. Every belief system or ideology has its grand narratives, according to Lyotard; for Marxism, for instance, the "grand narrative" is the idea that capitalism will collapse in on itself and a utopian socialist world will evolve. You might think of grand narratives as a kind of meta-theory, or meta-ideology, that is, an ideology that explains an ideology (as with Marxism); a story that is told to explain the belief systems that exist.

Lyotard argues that all aspects of modern societies, including science as the primary form of knowledge, depend on these grand narratives. Postmodernism then is the critique of grand narratives, the awareness that such narratives serve to mask the contradictions and instabilities that are inherent in any social organization or practice. In other words, every attempt to create "order" always demands the creation of an equal amount of "disorder," but a "grand narrative" masks the constructedness of these categories by explaining that "disorder" REALLY IS chaotic and bad, and that "order" REALLY IS rational and good. Postmodernism, in rejecting grand narratives, favors "mini-narratives," stories that explain small practices, local events, rather than large-scale universal or global concepts. Postmodern "mini-narratives" are always situational, provisional, contingent, and temporary, making no claim to universality, truth, reason, or stability.

Another aspect of Enlightenment thought--the last of my nine points--is the idea that language is transparent, that words serve only as representations of thoughts or things, and don't have any function beyond that. Modern societies depend on the idea that signifiers always point to signifieds, and that reality resides in signifieds. In postmodernism, however, there are only signifiers. The idea of any stable or permanent reality disappears, and with it the idea of signifieds that signifiers point to. Rather, for postmodern societies, there are only surfaces, without depth; only signifiers, with no signifieds.

Another way of saying this, according to Jean Baudrillard, is that in postmodern society there are no originals, only copies--or what he calls "simulacra." You might think, for example, about painting or sculpture, where there is an original work (by Van Gogh, for instance), and there might also be thousands of copies, but the original is the one with the highest value (particularly monetary value). Contrast that with CDs or music recordings, where there is no "original," as in painting--no recording that is hung on a wall, or kept in a vault; rather, there are only copies, by the millions, that are all the same, and all sold for (approximately) the same amount of money. Another version of Baudrillard's "simulacrum" would be the concept of virtual reality, a reality created by simulation, for which there is no original. This is particularly evident in computer games and simulations--think of Sim City, Sim Ant, etc.

Finally, postmodernism is concerned with questions of the organization of knowledge. In modern societies, knowledge was equated with science, and was contrasted to narrative; science was good knowledge, and narrative was bad, primitive, irrational (and thus associated with women, children, primitives, and insane people). Knowledge, however, was good for its own sake; one gained knowledge, via education, in order to be knowledgeable in general, to become an educated person. This is the ideal of the liberal arts education. In a postmodern society, however, knowledge becomes functional--you learn things, not to know them, but to use that knowledge. As Sarup points out (p. 138), educational policy today puts emphasis on skills and training, rather than on a vague humanist ideal of education in general. This is particularly acute for English majors. "What will you DO with your degree?"

Not only is knowledge in postmodern societies characterized by its utility, but knowledge is also distributed, stored, and arranged differently in postmodern societies than in modern ones. Specifically, the advent of electronic computer technologies has revolutionized the modes of knowledge production, distribution, and consumption in our society (indeed, some might argue that postmodernism is best described by, and correlated with, the emergence of computer technology, starting in the 1960s, as the dominant force in all aspects of social life). In postmodern societies, anything which cannot be translated into a form recognizable and storable by a computer--i.e., anything that's not digitizable--will cease to be knowledge. In this paradigm, the opposite of "knowledge" is not "ignorance," as it is in the modern/humanist paradigm, but rather "noise." Anything that doesn't qualify as a kind of knowledge is "noise," something that is not recognizable as anything within this system. Lyotard says (and this is what Sarup spends a lot of time explaining) that the important question for postmodern societies is who decides what knowledge is (and what "noise" is), and who knows what needs to be decided. Such decisions about knowledge don't involve the old modern/humanist qualifications: for example, to assess knowledge as truth (its technical quality), or as goodness or justice (its ethical quality), or as beauty (its aesthetic quality). Rather, Lyotard argues, knowledge follows the paradigm of a language game, as laid out by Wittgenstein. I won't go into the details of Wittgenstein's ideas of language games; Sarup gives a pretty good explanation of this concept in his article, for those who are interested.

There are lots of questions to be asked about postmodernism, and one of the most important is about the politics involved--or, more simply, is this movement toward fragmentation, provisionality, performance, and instability something good or something bad? There are various answers to that; in our contemporary society, however, the desire to return to the pre-postmodern era (modern/humanist/Enlightenment thinking) tends to get associated with conservative political, religious, and philosophical groups. In fact, one of the consequences of postmodernism seems to be the rise of religious fundamentalism, as a form of resistance to the questioning of the "grand narratives" of religious truth. This is perhaps most obvious (to us in the US, anyway) in Muslim fundamentalism in the Middle East, which bans postmodern books--like Salman Rushdie's The Satanic Verses--because they deconstruct such grand narratives. This association between the rejection of postmodernism and conservatism or fundamentalism may explain in part why the postmodern avowal of fragmentation and multiplicity tends to attract liberals and radicals. This is why, in part, feminist theorists have found postmodernism so attractive, as Sarup, Flax, and Butler all point out.

On another level, however, postmodernism seems to offer some alternatives to joining the global culture of consumption, where commodities and forms of knowledge are offered by forces far beyond any individual's control. These alternatives focus on thinking of any and all action (or social struggle) as necessarily local, limited, and partial--but nonetheless effective. By discarding "grand narratives" (like the liberation of the entire working class) and focusing on specific local goals (such as improved day care centers for working mothers in your own community), postmodernist politics offers a way to theorize local situations as fluid and unpredictable, though influenced by global trends. Hence the motto for postmodern politics might well be "think globally, act locally"--and don't worry about any grand scheme or master plan.

All materials on this site are written by, and remain the property of, Dr. Mary Klages, Associate Professor, English Department, University of Colorado, Boulder. You are welcome to quote from this essay, or to link this page to your own site, with proper attribution. Last revision: April 21, 2003.

The Flax article referred to is Jane Flax, "Postmodernism and Gender Relations in Feminist Theory," in Linda J. Nicholson, ed., Feminism/Postmodernism, Routledge, 1990. The Sarup article referred to is Chapter 6, "Lyotard and Postmodernism," in Madan Sarup, An Introductory Guide to Post-Structuralism and Postmodernism, University of Georgia Press, 1993.

Practicing Post-Modernism:
The Example of John Hawkes
by John M. Unsworth

Contemporary Literature 32.1 (Spring 1991)


"The excitement of contemporary studies is that all of its critical practitioners and most of their subjects are alive and working at the same time. One work influences another, bringing to the field a spirit of competition and cooperation that reaches an intensity rarely found in other disciplines" (x). In these remarks on "contemporary studies," Jerome Klinkowitz takes for granted that contemporary writers and their critics belong to one "discipline," the academic discipline of literary study. This affiliation of criticism and creative writing within a single institutional framework does indeed compound the influence that critic and author have on one another's work, as it multiplies the opportunities and the incentives for cooperation; but rather than simply celebrating this fact, as Klinkowitz does, we ought to inquire into the consequences of the professional interaction and practical interdependence of author and critic, particularly as it affects the creativity of the former and the judgment of the latter. John Hawkes provides an excellent opportunity for such an inquiry, for several reasons. Discovered by Albert Guerard in 1947 and vigorously promoted by him in the years that followed, Hawkes was the first American "post-modern" author to gain notoriety.[1] Writers of Hawkes's generation were, in turn, the first in this country to spend their entire creative lives in the academy: they have used that position with unprecedented success to shape and control critical reception, especially through the mechanism of the interview. At the same time, as Guerard's influence on Hawkes demonstrates, criticism can shape a writer's understanding of what is important in his or her creative work. There are two places to look for evidence of the kind of influence I am discussing: in the author's work and in representations of that work, either by the author or by the critics. In what follows, I will look at a short story by Hawkes which encodes a drama of authorial influence on critical reading, and along with it I will consider a critical essay on the story which enacts the part scripted for the reader in that drama. Thereafter, I will take a broader sampling of Hawkes's critical fortunes, with an eye not only to the migration of descriptive language from author to critic, via the interview,[2] but also to the genesis of that language in the writing of Hawkes's earliest and most influential critic, Albert Guerard. The story and the critical reading I start with were both published in a 1988 anthology called Facing Texts: Encounters Between Contemporary Writers and Critics, edited by Heide Ziegler. This volume deserves comment in its own right, as an emblem of postmodern literary practice. The title of the anthology refers to the fact that it pairs creative

texts by prominent first-generation post-modern authors with critical essays on those texts; what makes the volume emblematic is that the critics were in most cases handpicked by the authors themselves. In fact, as her preface informs us, Ziegler herself was picked by one of those authors: Facing Texts originated in a suggestion made by William Gass to an editor at Duke University Press, that Ziegler should edit a collection of contemporary American fiction. Ziegler says that, when the project was proposed to her, I immediately recognized that in effect I was being offered the opportunity to realize one of my pet ideas: to bring together . . . unpublished pieces by authors as well as critics that would, in a sense, defy the chronological secondariness of critical interpretation. Such a book would make the relationship between author and critic an unmediated encounter, with authors and critics becoming one another's ideal readers. . . . if possible, the pieces offered by the authors should indeed be hitherto unpublished so as to give the critics a sense of the exclusiveness, even privacy of their work and thus convey to them the impression of a close encounter with the respective author. . . . [and] the authors should choose their own critics in order to ensure that the close encounter I had in mind would not, unintentionally, be hostile, and thus destroy the possibility of mutual ideal readership. (ix)[3] In Hawkes's case, Ziegler's solicitude is unnecessary: his contribution to this volume was designed to foster the kind of reading that she desired for it. "The Equestrienne" is a portion of Hawkes's 1985 novella Innocence in Extremis, which is, in turn, an outtake from a novel, Adventures in the Alaskan Skin Trade. A large part of the novel is devoted to relating the misadventures of "Uncle Jake," as recalled by his daughter; relative to that story, Innocence in Extremis is an extended flashback, to a time when Uncle Jake, as a boy, visited his ancestral home in France with his father and family. "The Equestrienne" is one of the three set pieces that make up the novella, but it has been published here without introduction or reference to the context in which it was developed, and it can be read as a free-standing, very short story.[4] In "The Equestrienne," Uncle Jake's French grandfather (referred to exclusively as "the Old Gentleman") stages an exhibition of dressage, on what we are told is one of several "occasions deemed by the Old Gentleman to be specially enjoyable to his assembly of delighted guests" (216). In this, the first of those (three) occasions and the only one presented here, a young cousin of Uncle Jake's performs for an audience consisting of the visitors (including Uncle Jake), members of the household, and some neighbors, all seated in rows of plush Empire chairs arranged in a courtyard of the family chateau. The girl and her horse are the center of attention, but the performance itself is the medium for an interaction between the audience and the Old Gentleman. In this case, the audience in the tale clearly stands for the audience of the tale, and almost from its opening lines the text signals the effect it wants to achieve -- most notably in the

modifiers that cluster around descriptions of the represented audience. As an example, take the passage just quoted: "the days of harmony and pleasure were further enhanced by certain occasions deemed by the Old Gentleman to be specially enjoyable to his assembly of delighted guests." It is the narrator who tells us that days already harmonious and pleasurable were "enhanced" by what is about to be related; and while we might be privy to some delusion in the Old Gentleman when we are told that he "deemed" his entertainment "to be specially enjoyable" to his guests, any distance between his objective and their reaction is collapsed in the very same sentence, when we learn that they are in fact "delighted." Each detail of the performance is similarly described and received. "The gilded frames and red plush cushions of the chairs shone in the agreeable light and . . . moved everyone to exclamations of surprise and keen anticipation." In the world of the text as we are given it, the light is "agreeable," and the audience is unanimous in its expression of "surprise and keen anticipation." Throughout the tale, the reactions of the audience consistently confirm what the narration announces. "Through the gateway rode a young girl on a small and shapely dappled gray horse. Here was a sight to win them all and audibly they sighed and visibly they leaned forward. . . . [an] already grateful audience" (216). There is no point in piling up further examples; suffice it to say that this high pitch of appreciation is insistently sustained, the only two discordant notes resolving into it almost immediately. In the first of these, contemplating his cousin, Uncle Jake thinks "with shame. . . of himself and his shaggy and dumpy pony" (218). In the second, shortly thereafter, his mother whispers to him: "mark my words, dear boy. That child is dangerous." These are important moments, but the importance lies not so much in any pall they cast over the performance as in the evidence they give of its irresistible charm. Uncle Jake's insecurity and his mother's mistrust soon give way to the universal sentiment: Uncle Jake realizes that "he wanted to become [his cousin] and take her splendid place on the gray horse," and even his mother admits, "'she is a beautiful little rider, Jake. You might try to ride as well as she does. It would please your father'" (219). In her essay on the story, Christine Laniel remarks that "The Equestrienne" "focuses on one of the most pervasive metaphors in Hawkes's works, which he analyzes as essential to his fiction writing when he refers to 'horsemanship as an art'" (221-22). Specifically, Laniel is suggesting that Hawkes offers dressage as a metaphor for the artistic use of language. That much can easily be read between the lines from which she quotes, but taken in full these lines also suggest that the same metaphor might be extended to include an association of other kinds of horsemanship with other ways of using language -- after all, the audience is composed of equestrians: Nearly everyone in that audience rode horseback. Most of them were fox hunters. Their lives depended on horses. . . . Yet for all of them their mares and geldings and fillies and stallions were a matter of course like stones in a brook or birds in the boughs. Most of the horses they bred and rode were large, rugged, unruly, brutish beasts of great stamina. The horses raced and hunted, pulled their carriages, carried them ambling through sylvan woods and took them cantering great distances, but little

more. So here in the Old Gentleman's courtyard the spectacle of the young equestrienne and her gray horse schooled only in dressage appealed directly to what they knew and to their own relationships to horse and stable yet gave them all a taste of equestrian refinement that stirred them to surprise and pleasure. They had never thought of horsemanship as an art, but here indeed in the dancing horse they could see full well the refinement of an artist's mind. (218) The thrust of this passage, it seems to me, is first to suggest horsemanship as a figure for the use of language in general, and then to distinguish between the nonutilitarian "refinement" of its use in fiction and the practicality of more quotidian language used with an end in mind, as for example to convey information (in "rugged, unruly, brutish" words "of great stamina" but no elegance). In this scene "artist" and audience share what might be called a professional interest in horses, not unlike the professional interest in language Hawkes shares with his readers; and while it may be the general reader and not the critical one who takes language as "a matter of course," even the most perspicacious fox hunters among us are obviously supposed to be "stirred" to "surprise and pleasure" at Hawkes's demonstration of verbal dressage. In fact, at the conclusion of the performance the story explicitly announces the lesson we are to draw from it: "the audience rose to its feet, still clapping. They exclaimed aloud to each other, while clapping, and smiles vied with smiles and no one had praise enough for the exhibition which had taught them all that artificiality not only enhances natural life but defines it" (220). Hawkes's instruction of the reader is too deliberate to be unintended and too obvious to ignore, so it must be explained. In Laniel's analysis, the author at these moments is "forestalling interpretation by anticipating it. As a consequence, the critic is thwarted in efforts to unveil supposedly hidden significations, which are obtrusively exposed by the writer himself"(222). She regards this aspect of the story as a problem only for a criticism which needs "to unveil supposedly hidden significations"; as we have seen, though, "The Equestrienne" does more than interpret itself: it so relentlessly superintends response that it is likely to frustrate any reader, and not merely a certain sort of critic. But for Laniel at least, the "alluring fascination" (222) of "The Equestrienne" survives in its strategy of "seduction, which implies the obliteration of reality and its transfiguration into pure appearance"(226). That is, although she acknowledges that the story reads its own moral, she still finds Hawkes's presentation of "the artificial" fascinating, because it undertakes "the willful deterioration of language as the vehicle of meaning." This deterioration is said to take place in a series of puns and paradoxes (sister-sinister, mastery-fragility, innocence-corruption, and so forth) and in sentences like the following (which explains the effect of the Old Gentleman's having positioned the girl sidesaddle on her horse, with her legs away from the audience): "The fact that she appeared to have no legs was to the entire ensemble as was the white ribbon affixed to her hat: the incongruity without which the congruous whole could not have achieved such perfection" (217). In this sentence, Laniel says,

we are made to experience both frustration and supreme satisfaction, since the expected word is missing and yet is virtually present, enhanced by the strange, incongruous connections that implicitly suggest it. By establishing the curious relationship of the logically unrelated, by uniting the like with the unlike in sudden and unexpected juxtapositions, the poetic text produces a jarring effect, so that we are left with the notion of a fundamental vacancy, of a basic lack that is the very essence of aesthetic pleasure. (228) Yet the sentence Laniel has chosen not only contains the "missing" word -- "perfection" -- but emphasizes it by placing it in the ultimate position. And in any case, Hawkes's notion of an "incongruity without which the congruous whole could not have achieved such perfection" is more plausible as a model than as an occasion for Laniel's observation that the "jarring effect" brought about by "the curious relationship of the logically unrelated" results in "a fundamental vacancy . . . that is the very essence of aesthetic pleasure." Laniel also tries to restore some ambiguity to the story by arguing that Hawkes's "rhetoric of seduction" is always "reversed into derision, as an insidious vein of self-parody gradually penetrates the text" (222). As she sees it, Hawkes's writing cannot function without initiating its own ironical debunking. The "morality of excess" [Innocence in Extremis 55] that guides the artist in his work also guides Hawkes in his writing, as exemplified by the profusion of superlatives and comparatives in the novella and in all his fiction. But this very excessiveness entails a crescendo, an escalation into more and more incongruous associations, so that his texts are relentlessly undermined by their own grotesque redoubling. (235) Self-parody is indeed an abiding characteristic of Hawkes's writing -- and often its saving grace -- but though the language we have already quoted from "The Equestrienne" does suggest an excessiveness that might easily escalate into self-parody, Laniel herself admits that "during the performance of the equestrienne the burlesque element is extremely slight" (233). Consequently, when she makes the argument that this text undermines itself she is forced to rely entirely on evidence collected from other, later sections of the novella and from the originary novel. Still, even if there is no parody in "The Equestrienne," its absence makes it worth discussing. In general, the significant gap in Hawkes's work is not between appearance and reality but between the serious and the parodic elements that constitute his fiction: the uneasiness of his texts is that while his self-parody seems deliberate, it doesn't ground or control the seriousness with which he presents his primary material. Since the critic is bound to make statements about the text, and since making those statements usually involves taking a position relative to the text by offering a reading, critics have often resolved this conflict in the text by going too far in one direction or the other -- either

affirming the response offered by the text (the more common tactic) or overstating the control exercised by the parodic element. Laniel's piece is unusual in that it does the latter, but in order to make this case she has to read beyond the immediate text. By so doing, she is in effect submitting "The Equestrienne" to the control of a self-parody which develops across other, broader contexts. This move begs the question of whether the parodic strain controls the larger contexts from which she abstracts it. In fact, I would argue, it does not -- the uneasiness simply reasserts itself when we look at Innocence in Extremis or Adventures in the Alaskan Skin Trade as texts in their own right. The significance of Hawkes's unstable self-parody, both with regard to its presence in his other fiction and its absence in the present case, is bound up with the problem of the audience and its response. In order to avoid the problem Laniel has with contextualization, let us look briefly at a discrete work, Travesty, written by Hawkes in the early 1970s. Travesty is the monologue of a man who intends to crash the car in which he, his daughter, and an existentialist poet (the lover of both his wife and daughter) are traveling. Papa, the driver, denies being jealous or having any murderous motive; instead, he tells Henri (the poet) that his plan is to create an "accident" so inexplicable that their deaths will have to be understood as the deliberate execution of an abstract design. Henri is apparently nonplused, since Papa reproaches him for his failure to appreciate the beauty of the thing: "Tonight of all nights why can't you give me one moment of genuine response? Without it, as I have said, our expedition is as wasteful as everything else" (82). The response Papa wants from Henri is specifically an aesthetic one, and he sees it as a mark of Henri's artistic insincerity that he is not able to provide it. But, as the reader well understands, the detachment from self-interest which such a response would require is too much to expect, even from an existentialist. As a monologist, Papa necessarily speaks for Henri, and in a similar way Hawkes, as a writer, speaks for the reader. His conceit is auto-destructive, but self-parody -- a preemptive mode of discourse -- is by definition both exclusive of and also highly attentive to the audience. The element of self-parody in Travesty asserts itself as the difference between the supposed reality of death within the fiction and the reality of death supposed which is the fiction -- Hawkes, in other words, is Henri if he is anyone in this story. But as this equation suggests, the parody does not extend to Papa, and much of what he says is seriously intended, not least his confessed need for a response: Let me admit that it was precisely the fear of committing a final and irrevocable act that plagued my childhood, my youth, my early manhood. . . . And in those years and as a corollary to my preoccupation with the cut string I could not repair, the step I could not retrieve, I was also plagued by what I defined as the fear of no response. . . . If the world did not respond to me totally, immediately, in leaf, street sign, the expression of strangers, then I did not exist. . . . But to be recognized in any way was to be given your selfhood on a plate and to be loved, loved, which is what I most demanded.(84-85)

Self-parody, this suggests, is more than an attempt to forestall a feared lack of response (or an undesirable response); it may also become a way to avoid "committing a final and irrevocable [speech] act." On one level, Hawkes is deadly serious about everything that Papa says; on another, he implicitly denies responsibility for the ideas Papa expresses. At both levels, he precludes response -- within the narrative through the technique of monologue, without it through the technique of self-parody. The effect on the reader is, as Laniel says, often baffling: the proffered position is clearly untenable, and yet the parody does not enable an alternate response because it equally clearly does not control the text. The instability I have been describing might also be regarded as a side effect of characterization. Hawkes is fond of creating figures of the artist, but these figures never completely fill the role in which they are cast; most often they are people who have the sensibility of the artist but who do not actually create art. Cyril in The Blood Oranges, Papa in Travesty, Uncle Jake in Adventures in the Alaskan Skin Trade are all men whose medium is action, not language, and who do not pretend to present the fiction in which their artistry is conveyed to the reader. In Travesty, the distinction would seem to be mooted when narration is placed entirely in the hands of "the man who disciplines the child, carves the roast" (44) -- but in fact it persists, since Papa's "creation," the actual crash, cannot be presented within the narrative structure Hawkes has set up and so is not presented at all. In other words, although Hawkes's novella develops in the space between the disclosure and the enactment of Papa's intentions, the aesthetic Hawkes has embodied in those intentions can be expressed only in words, never in action -- hence the equation of Hawkes with Henri. Seeking to evade both the irrevocable commitment of unfeigned statement and the fear of no response, Hawkes has adopted a narrative perspective that results in a fiction which implies but does not constitute the realization of an aesthetic. If the conflict between a desire to present this aesthetic and the fear that it will be rejected is settled in Travesty by giving the narrative over entirely to statement, in "The Equestrienne" Hawkes experiments with the opposite solution, usurping the response of his audience. Rather than seducing the reader, this makes her superfluous: hence Laniel's frustration at trying to present a reading of the story as given -- something that her recourse to other texts demonstrates she is ultimately unable to do. And like response, the absence of a controlling intelligence is dislocated in "The Equestrienne" from a metatextual position to a thematic one: "All at once and above the dainty clatter of the hooves, they heard the loud and charming tinkling of a music box. 
Heads turned, a new and livelier surprise possessed the audience, the fact that they could not discover the source of the music, which was the essence of artificiality, added greatly to the effect" (219). But even within the story, this absence proves to be more apparent than real: at the end of the girl's exhibition, the Old Gentleman appeared and as one the audience realized that though they had all seen him act the impresario and with his raised hand start the performance, still he had not taken one of the red plush chairs for himself, had not remained with them in the courtyard, had not been a passive witness to his granddaughter's exhibition. He was smiling broadly; he was perspiring; clearly he expected thanks. In all this the truth was evident: that not only had he himself orchestrated the day, but that it was he who had taught the girl dressage, and he who had from a little balcony conducted her performance and determined her every move, and he who had turned the handle of the music box. Never had the old patrician looked younger or more pleased with himself. (220)

The Old Gentleman is not "a passive witness" to the presentation; he is its conductor, and his curtain call might be compared to Hawkes's persistent assertion of the authorial self in his interviews: in both cases, the creator remains behind the scenes during the actual performance but reappears afterward to make sure that its significance is properly understood.

The nature of Hawkes's dilemma and the variety of his attempts to resolve it are characteristically post-modern, in that they demonstrate a very real need to assert critical control over the text, combined with a desire that the reader should be persuaded to a particular aesthetic position. Such desires are not peculiar to post-modern authors, of course: Henry James once admitted to dreaming, "in wanton moods, . . . of some Paradise (for art) where the direct appeal to the intelligence might be legalized" (296). Late in his life, James made that appeal to future readers in his prefaces to the New York edition of his works, but he might well have envied the post-modern author, who can address the contemporary reader through the mechanism of the interview.

Hawkes's inclination to avail himself of opportunities to discuss his work has resulted in quite a substantial body of interviews.[5] In these interviews, Hawkes propounds his aesthetic program, characterizes his fiction, and explains his intentions in specific novels; the images and analogies he uses migrate visibly from the interviews to the criticism and reappear in the questions posed by subsequent interviewers. In this way, the language of Hawkes's self-descriptions comes to dominate the critical reception of his work, functioning -- to borrow an idea from Kenneth Burke -- as a "terministic screen."[6] Hawkes's career also demonstrates, however, the influence of critics on authors: although the authority of this particular terministic screen is derived from Hawkes via the interview, Hawkes himself seems to have derived many of its component terms from Albert Guerard's early analyses of his work. Hawkes has often acknowledged his debt to Guerard, but to fully understand the nature of that debt we need to know something about the history of the relationship between these two men.

Hawkes was not much of a student when he came to Harvard: the semester before he left for the war, he had flunked out.[7] His career as a writer started in Guerard's fiction writing class at Harvard, which he took after returning from service in the Ambulance Corps during World War II. At that time, he had just started working on his first piece of fiction, the novella Charivari, and though manifestly talented, he lacked experience both as a writer and as a reader of modern fiction. Prior to 1947, he had written only some juvenile verse, which he submitted to qualify for Guerard's class; during that class (for which he wrote The Cannibal), Hawkes's "reading of modern experimental literature was largely confined to poetry," according to Guerard (Introduction xn). In a recent interview, Hawkes recalled that when they first met, "Guerard . . . was probably in his early thirties, but to me he was an awesome figure. He was quite formidable, quite authoritarian, extremely knowledgeable, a novelist himself, and he had so suddenly and abruptly praised my fiction at the outset in such a way as to give me real confidence" ("Life" 112). Obviously, in the course of this long friendship Hawkes has had many occasions to express his ideas about fiction, and it is likely that Guerard's published criticism of Hawkes reflects those ideas to some extent. We may even grant that, as Guerard has faded from the forefront of contemporary criticism, and as Hawkes has become firmly established as one of the major talents of his generation, the balance of power in the relationship may have shifted somewhat in recent years. But it is nonetheless clear that Guerard played an influential role in molding Hawkes's understanding of the value of his own fiction. The nature and extent of that influence is clear if we compare a few passages from Guerard's early criticism to Hawkes's subsequent self-evaluations.

It was Guerard who brought Hawkes and James Laughlin together, and when, in 1949, New Directions published Hawkes's first novel (The Cannibal), Guerard provided the introduction. This introduction is the earliest critical analysis of Hawkes's work, and its influence on later Hawkes criticism, including the author's own, is inestimable. In it, Guerard says that "Terror . . . can create its own geography" (xiii) and announces, in terms that persist to this day, that "John Hawkes clearly belongs . . . with the cold immoralists and pure creators who enter sympathetically into all their characters, the saved and the damned alike. . . . even the most contaminate have their dreams of purity which shockingly resemble our own" (xii). Not long thereafter, the Radcliffe News published Hawkes's first interview, entitled "John Hawkes, Author, Calls Guerard's Preface Most Helpful Criticism" (March 17, 1950) -- and so it would seem to have been. Guerard's remarks about sympathy for "the saved and the damned alike" are reflected in Hawkes's earliest published critical writing (1960), in which he talks about the experimental novel as displaying "an attitude that rejects sympathy for the ruined members of our lot, revealing thus the deepest sympathy of all" ("Notes on Violence").[8] As late as 1979, Hawkes still describes himself as being "interested in the truest kind of fictive sympathy, as Albert Guerard, my former teacher and lifelong friend, has put it. To him the purpose of imaginative fiction is to generate sympathy for the saved and damned alike" ("Novelist" 27).[9]

In his 1949 introduction, Guerard confidently compares Hawkes to William Faulkner, Franz Kafka, and Djuna Barnes (although he predicts that Hawkes "will move . . . toward realism"), and he concludes -- on a disciplinary note -- that "How far John Hawkes will go as a writer must obviously depend on how far he consents to impose some page-by-page and chapter-by-chapter consecutive understanding on his astonishing creative energy; on how richly he exploits his ability to achieve truth through distortion; on how well he continues to uncover and use childhood images and fears" (xv).

In an addendum to the introduction, written for The Cannibal's reissue in 1962, Guerard notes that "the predicted movement toward realism has occurred" but reiterates the importance of nightmare and "vivifying distortion" in Hawkes's fiction (xviii). The concepts of distortion and terror, and the paradoxical linkage of purity and contamination, have since become staples in the discussion of Hawkes's work: the Hryciw-Wing bibliography lists at least twenty-one essays with the words "nightmare" or "terror" in the title (beginning with a review by Guerard in 1961), and countless others have incorporated the same idea into their arguments.[10] Guerard's addendum also praises Hawkes for being able "to summon pre-conscious anxieties and longings, to symbolize oral fantasies and castration fears -- to shadow forth, in a word, our underground selves" (xviii).

In his first essay in self-explanation, presented at a symposium on fiction at Wesleyan University in 1962 and published in Massachusetts Review, Hawkes himself states:

The constructed vision, the excitement of the undersea life of the inner man, a language appropriate to the delicate malicious knowledge of us all as poor, forked, corruptible, the feeling of pleasure and pain that comes when something pure and contemptible lodges in the imagination -- I believe in the "singular and terrible attraction" of all this. For me the writer should always serve as his own angleworm -- and the sharper the barb with which he fishes himself out of the blackness, the better. ("Notes on The Wild Goose Chase" 788)

The image of the fishhook is a more memorable formulation of Guerard's claim that Hawkes's fiction has the ability to "shadow forth our underground selves"; certainly it seems, in keeping with the metaphor of which it is a part, to have set itself deep in Hawkes's vision of his own work. In a 1964 interview, one which has remained among the most often cited, Hawkes told John Enck: "my aim has always been . . . never to let the reader (or myself) off the hook, so to speak, never to let him think that the picture is any less black than it is or that there is any easy way out of the nightmare of human existence" ("John Hawkes" 145). In 1971, the piece in which the metaphor originally appeared was reprinted along with Enck's interview in John Graham's Studies in Second Skin (the dedication to which reads: "For Albert Guerard, who led the way" -- Graham is another of Guerard's former students), and in 1975 the image returns in the following exchange with John Kuehl:

Kuehl: You once referred to fishing for yourself.

Hawkes: I said that "the author is his own best angleworm and the sharper the barb with which he fishes himself out of the darkness the better." . . . I mean that the writer who exploits his own psychic life reveals the inner lives of us all, the inner chaos, the negative aspects of the personality in general. . . . our deepest inner lives are largely organized around such impulses, which need to be exposed and understood and used. (Kuehl 164-65)

It is perhaps significant that a few pages later, Hawkes remarks: "For me evil was once a power. Now it's a powerful metaphor" (166).[12]

The "powerful metaphor" of authorship as auto-piscation was also used by Hawkes the year before to open an influential essay called "Notes on Writing a Novel," which was first printed in 1973 in the Brown Alumni Monthly, reprinted the next year in TriQuarterly, and finally revised and collected in a 1983 volume fittingly entitled In Praise of What Persists. In that piece, Hawkes relates the following anecdote: A scholarly, gifted, deeply good-natured friend once remarked that "Notes on Writing a Novel" is a deplorably condescending title. . . . At that moment. . . . I thought of a metaphor with which I'd ended a talk on fiction ten years ago at Boston College, when I said that "for me, the writer of fiction should always serve as his own angleworm, and the sharper the barb with which he fishes himself out of the darkness, the better." But when I proposed "The Writer as Angleworm" as an alternative, my friend pointed out that preciousness is worse than condescension. (109) The "friend" remains unnamed, but it is somehow appropriate that Hawkes has trouble remembering the genesis of his image, mistaking the Wesleyan venue for a Boston College one; in an interview given in 1979 and published in 1983, he makes a similar mistake when Patrick O'Donnell remarks on "the fetus fished out of the flood in The Beetle Leg." Hawkes responds: "Yes. Thinking of that image reminds me of an interview with John Graham where I said that 'the writer should be his own angleworm [etc.].'" By this point Hawkes is not remembering the occasion on which he originally formulated the idea but misremembering one on which he quoted it -- the interview with Enck, published in Graham. Hawkes goes on to dwell on the image at some length, demonstrating that it still informs his understanding of his own work, however vague its origins: It's an interesting paradox: separating the artist from the human personality, the artistic self from the human self, then thinking of the artist's job as one of catching, capturing, snaring, using a very dangerous and unpleasant weapon, a hook, knowing that his subject matter is himself or his own imagination, which he has to find himself and which he catches ruthlessly. It's a very schizophrenic image, full of dangerous, archetypal maneuvers in the deepest darkness within us. ("Life" 123) Hawkes's choice of words is revealing, in that schizophrenia is often linked to the presence of an overpowering authority figure; we have already seen that Hawkes initially regarded Guerard as "an awesome figure . . . quite formidable, quite authoritarian." In a 1971 encounter called "John Hawkes and Albert Guerard in Dialogue, "Hawkes jokes about that "awesome" authority, but with an insistence and intensity that belie his tone. Hawkes: . . . I have long suspected that I'm a fiction created by Albert Guerard. I think I knew from the very first moment we met. (14) when I met him, and for years afterwards, he was, as a teacher, a ruthless authoritarian, a tremendous disciplinarian. About fifteen years ago, I had thought that I had achieved some kind of equality with Albert, at least on a personal level, and had escaped this terrible awe, and the awesome business of father/teacher, but now I've been plunged right back into it. (15)

My writing has been filled with awkwardness. . . . It's always been Albert who has pointed out where the distorting glasses have been taken off, or where the writing was flabby. . . . Guerard: That's fantasy. It's not true at all . . . (25-26) Despite Hawkes's bantering manner (and Guerard's denial), it is obvious that this relationship was an extremely important one for Hawkes, and his gratitude seems more than slightly tinged with the anxiety of influence. This is understandable, in light of the fact that for more than a decade after leaving Guerard's class, Hawkes submitted each of his novels to Guerard before publishing it; and in at least one instance, Guerard seems to have exercised his authority in the form of a veto. As Hawkes tells it, when Guerard read the manuscript for The Lime Twig, "he sent it back saying 'Jack, this is deplorable; it's a good idea, but poorly conceived and written, and you'll have to start over again'" ("Life" 112). After that, it took Hawkes four years to revise the book, and although Guerard continued to exert a shaping influence on Hawkes's career, this was the last time he was given a manuscript for preapproval. Elsewhere in his dialogue with Guerard, Hawkes says, "just as you controlled everything else, you are, as a matter of fact, responsible for my fiction becoming increasingly socalled 'realistic"' (23), but after The Lime Twig this realism coincided with a new emphasis on the comic and a marked uneasiness on the part of Hawkes: Beginning with Second Skin, I was reluctant and partly afraid to ask my mentor for his approval of my work. That was the first manuscript I published without Guerard's pre-reading. I know he likes Second Skin a great deal. . . . [but] I don't think he likes the next two novels all that much; my feeling is that he thinks The Blood Oranges is, in some ways, a falling off. But he liked Travesty a great deal. . . . The reason that we first went to France was because Guerard, himself, is partly French. . . . So France was the world that Guerard represented. ("Life" 113) If Hawkes was conscious of his comic novels as a departure from the kind of writing approved of by his mentor, Travesty (a "French" novel) would seem to have been his gesture of reconciliation. His next book, The Passion Artist, returned to the earlier style and setting and was very favorably reviewed by Guerard. With regard not only to Hawkes's stylistic oscillations but also to the genealogy of his self-understanding, the central issue is the relation of the artist to the contents of his unconscious mind. In exactly this connection, Frederick Busch -- one of John Hawkes's earliest and friendliest critics -- recently wondered whether John Hawkes, studying his life, perhaps studies his art as well. . . . [he] now faces the danger he has faced throughout a distinguished career - - of tapping his usual psychic resources, of using his usual dreams, of relying upon his usual metaphors, and therefore of risking the loss of new language, new fictive worlds.

. . . I go so far as to sorrow over his considerable praise from academics. . . because I fear that they seek to encourage Hawkes to write what is "teachable" and teachably "post-Modern." . . . like every writer who taps his inner imagery, [Hawkes] must determine when he is to avoid his own urgings and the temptation to use what becomes a habitual vocabulary of images. (When People Publish 110)

It is interesting that, in an earlier version of the same essay, Busch's pessimism was decidedly less pronounced:

In Death, Sleep & The Traveler, Hawkes may be thinking about who he is as a writer, what he has done, and what he ought to do. He may, at times, seem to be writing out of a sense of Hawkes. . . . When Hemingway became a student of Hemingway -- To Have and Have Not, as compared to its point of origin, "After the Storm," is a good example -- he failed to measure up to his teacher. While I do not see signs of such a failure in Death, Sleep & The Traveler, I do see Hawkes as engaged in the most profound examination of his own writings; and he is daring to risk being influenced by that seductive writer, John Hawkes. ("Icebergs" 62-63)

Busch's change in tone between 1977 and 1986 suggests that he does feel Hawkes, with the aid of his academic critics, has seduced himself. I would want to add only that Hawkes's "sense of Hawkes" has, from the beginning, been shaped and developed by his most important reader, Guerard. And although influence here reverses the direction it followed in the case of "The Equestrienne," in each case the academic context shared by reader and writer has fostered an extraordinary symbiosis, one which ultimately enervates both criticism and creativity.

In The Romantic Ideology, Jerome McGann says that there is "[an] essential difference which separates the journalistic and polemical criticism whose focus is the present from the scholarly and historical criticism which operates in the present only by facing (and defining) the past" (2-3). To date, much of the criticism of post-modern fiction has indeed been polemical and journalistic and has aimed at reproducing the ideology of the fiction it discusses. But even though no one at present can claim to have the same distance from post-modernism as we have from romanticism, it is still possible to submit post-modern fiction to a criticism that scrutinizes its cultural and institutional determinants. Indeed, as McGann points out, there are good reasons for doing so:

When critics perpetuate and maintain older ideas and attitudes in continuities and processive traditions they typically serve only the most reactionary purposes of their societies, though they may not be aware of this; for the cooptive powers of a vigorous culture like our own are very great. If such powers and their results are not always to be deplored, cooptation must always be a process intolerable to critical consciousness, whose first obligation is to resist incorporation, and whose weapon is analysis. (2)

What was new in 1947 has begun to age, and it is now time to ask what purposes are served by perpetuating the ideas and attitudes identified with it. The problem McGann describes is only exacerbated when author and critic are contemporaries cohabiting in one institution. Under these circumstances, the material inducements to cooperation may well subvert the independence of both parties: each is in the position to augment the prestige of the other, but neither is really in control. As McGann predicts, having been incorporated, each is controlled by the ideology of the institution that creates and confers their prestige, and both end up serving the most reactionary purposes of that institution. Where post-modernism is concerned, the institution is the academy and the ideology is that of professionalism. Others have pointed out before now that academic professionalism is itself at the service of larger cultural mechanisms, and that its most reactionary purpose is to co-opt and sequester intellectual energies -- whether critical or creative -- so that they do not disrupt the smooth operation of those mechanisms.[13]

Earlier I asserted that the post-modernism of Hawkes and his generation is continuous with modernism, but here that assertion needs to be qualified. First-generation post-modernism differs from its predecessor in one crucial way, namely in being institutionalized. Modernism, for the most part, rejected the security of the academy in order to take liberties with the culture; by contrast, post-modernism stands at the embarrassing conjunction of that modernist heritage of alienation and a practical condition of institutional respectability and security. The aesthetic similarities between modernism and post-modernism pale into insignificance next to this situational difference -- and since the aesthetic features of post-modernism serve purposes different from those they served under modernism, our advocacy of those features serves different purposes as well. It may be too late for authors such as Hawkes to alter their course, but it is by no means too soon for the criticism of post-modern fiction to put aside polemic in favor of analysis and begin resisting the urge to cooperate.

Notes
1. According to Michael Koehler, the term "post-modern" was introduced (in English) in the 1940s by Arnold Toynbee, who used it to denominate the entire period from 1875 to the present. Koehler says that Irving Howe may have been the first person to call the literature after modernism "post-modern," in his 1959 essay "Mass Society and Post-Modern Fiction." A good deal of the confusion that has accompanied the use of this term in recent years might be attributed to the failure to acknowledge that there have already been two generations of the postmodern and that, in many ways, the two have little in common. For the sake of clarity, I use the original form of the word (in which the hyphen privileges the modern) to refer to the first of these two generations, which sees itself as extending the project of modernism. In "postmodernism," on the other hand, the hyphen has dropped out and the agglutinated form, in which "post" gets top billing, implies the emergence of a new entity. This form of the word is increasingly common, but I would suggest that rather than being applied indiscriminately it ought to denote specifically that rising generation which conceives of itself as distinct from and often opposed to modernism.

2. Kenneth Burke's idea of the migration of metaphor is relevant here: "In general, primitive magic tended to transfer an animistic perspective to the charting of physical events. And positivistic science, by antithesis, leads to an opposite ideal, the transferring of physicalist perspective to human events. Each is the migration of a metaphor" (Philosophy 147). In the present case, the migration consists in a transfer of an authorial perspective to the criticism of fiction.

3. The other authors in Ziegler's anthology are Robert Coover, Guy Davenport, John Barth, Donald Barthelme, Stanley Elkin, Susan Sontag, Walter Abish, and Joseph McElroy.

4. "The Equestrienne" appears in Facing Texts; Innocence in Extremis was published by Burning Deck in 1985; Adventures in the Alaskan Skin Trade was published in hardcover by Simon and Schuster in 1985 and then, as part of the Contemporary American Fiction series, in paperback by Penguin in 1986.

5. According to Carol A. Hryciw-Wing's recent bibliography, forty-four interviews with Hawkes were published between 1950 and 1985.

6. See Kenneth Burke, "Terministic Screens," chapter 3 of Language as Symbolic Action. Burke says that "even if any given terminology is a reflection of reality, by its very nature as a terminology it must be a selection of reality; and to this extent it must function also as a deflection of reality" (45). He goes on to elaborate the point as follows: "Not only does the nature of our terms affect the nature of our observations, in the sense that the terms direct the attention to one field rather than to another. Also, many of the 'observations' are but implications of the particular terminology in terms of which the observations are made. In brief, much that we take as observations about 'reality' may be but the spinning out of possibilities implicit in our particular choice of terms" (46).

7. To my knowledge, the only personal nightmare ever related by Hawkes (for whom the nightmare has become a trademark) is a recurrent dream "about not passing courses and not graduating from Harvard, in which case I would not have been a teacher, et cetera" (Hawkes and Guerard 21).

8. This brief essay and a story are accompanied by Guerard's "Introduction to the Cambridge Anti-Realists," among which Guerard includes Hawkes.

9. This interview is accompanied by Guerard's review of The Passion Artist.

10. Guerard himself, through all four revisions of his entry on Hawkes in the reference work Contemporary Novelists (1972, 1976, 1982, 1986), has continued to praise Hawkes for his use of "childhood terror, oral fantasies and castration fears, fears of regression and violence, profound sexual disturbances" (395). Not surprisingly, in the 1986 entry Guerard seems somewhat dissatisfied with Adventures in the Alaskan Skin Trade, because it contains so few "archetypal dreams [which] echo powerful dreams in the earlier books"; for Guerard, it is only in these echoes that "the author's true voice is dominant" (397).

11. This piece has been reprinted not only in Graham but also in Klein and in volume 29 of Contemporary Literary Criticism.

12. Kuehl publishes this interview as a chapter in his book on Hawkes -- not an uncommon practice in book-length studies of contemporary authors.

13. For a full-length discussion of professionalism as an ideological tool in the administration of culture, see Larson.

Works Cited
Burke, Kenneth. Language as Symbolic Action: Essays on Life, Literature, and Method. Berkeley: U of California P, 1966.
-------. The Philosophy of Literary Form: Studies in Symbolic Action. 3rd ed. Berkeley: U of California P, 1973.
Busch, Frederick. "Icebergs, Islands, Ships Beneath the Sea." A John Hawkes Symposium: Design and Debris. Ed. Anthony C. Santore and Michael Pocalyko. New York: New Directions, 1977. 50-63.
-------. When People Publish: Essays on Writers and Writing. Iowa City: U of Iowa P, 1986.
Contemporary Literary Criticism. Detroit: Gale, 1984.
Contemporary Novelists. Ed. D. L. Kirkpatrick. 4th ed. New York: St. Martin's, 1986.
-------. Ed. James Vinson. 1st-3rd eds. New York: St. Martin's, 1972, 1976, 1982.
Graham, John, ed. Studies in Second Skin. The Charles E. Merrill Studies. Columbus, OH: Merrill, 1971.
Guerard, Albert. Introduction. The Cannibal. By John Hawkes. 1949. New York: New Directions, 1962. ix-xx.
-------. "The Passion Artist: John Hawkes." Rev. of The Passion Artist, by John Hawkes. New Republic 10 Nov. 1979: 29-30.
Hawkes, John. "The Equestrienne." Facing Texts: Encounters Between Contemporary Writers and Critics. Ed. Heide Ziegler. Durham: Duke UP, 1988. 215-20.
-------. "John Hawkes: An Interview." With John Enck. Wisconsin Studies in Contemporary Literature 6 (1965): 141-55.
-------. "Life and Art: An Interview with John Hawkes." With Patrick O'Donnell. Review of Contemporary Fiction 3.3 (1983): 107-26.
-------. "Notes on The Wild Goose Chase." Massachusetts Review 3 (1962): 784-88.
-------. "Notes on Violence." Audience 7 (1960): 60.
-------. "Notes on Writing a Novel." TriQuarterly 30 (1974): 109-26. Rpt. from Brown Alumni Monthly Jan. 1973: 9-16. Rpt. as "Dark Landscapes." In Praise of What Persists. Ed. Stephen Berg. New York: Harper, 1983. 135-47.
-------. "The Novelist: John Hawkes." With Thomas LeClair. New Republic 10 Nov. 1979: 26-29.
-------. Travesty. New York: New Directions, 1976.
Hawkes, John, and Albert Guerard. "John Hawkes and Albert Guerard in Dialogue." A John Hawkes Symposium: Design and Debris. Ed. Anthony C. Santore and Michael Pocalyko. New York: New Directions, 1977. 14-26.
Hryciw-Wing, Carol A. John Hawkes: A Research Guide. New York: Garland, 1986.
James, Henry. Preface to the New York edition of The Portrait of a Lady. The Art of Criticism: Henry James on the Theory and the Practice of Fiction. Ed. William Veeder and Susan M. Griffin. Chicago: U of Chicago P, 1986. 286-99.
Klein, Marcus, ed. The American Novel Since World War II. Greenwich, CT: Fawcett, 1969.
Klinkowitz, Jerome. "Cross-Currents/Modern Critiques/Third Series." The Fiction of William Gass: The Consolation of Language. By Arthur M. Saltzman. Cross-Currents/Modern Critiques. Carbondale: Southern Illinois UP, 1986. ix-x.
Koehler, Michael. "'Postmodernismus': Ein begriffsgeschichtlicher Uberblick." Amerikastudien 22.1 (1977): 8-18. Unpublished translation by Tom Austenfeld, held in Bowers Library, Wilson Hall, U of Virginia, Charlottesville, VA.
Kuehl, John. John Hawkes and the Craft of Conflict. New Brunswick: Rutgers UP, 1975.
Laniel, Christine. "John Hawkes's Return to the Origin: A Genealogy of the Creative Process." Facing Texts: Encounters Between Contemporary Writers and Critics. Ed. Heide Ziegler. Durham: Duke UP, 1988. 221-46.
Larson, Magali Sarfatti. The Rise of Professionalism: A Sociological Analysis. Berkeley: U of California P, 1977.
McGann, Jerome J. The Romantic Ideology: A Critical Investigation. Chicago: U of Chicago P, 1983.
Ziegler, Heide. Preface and Introduction. Facing Texts: Encounters Between Contemporary Writers and Critics. Durham: Duke UP, 1988. ix-x, 3-13.

HANDOUT ON POSTMODERNISM: LYOTARD AND HABERMAS

What's here

Reading assignments for this session
Definitions for key terms, including "modernism" and "postmodernism"
Background information on Lyotard and Habermas
What next?--suggestions on where to go from here

Assignment for next time

Jean-Francois Lyotard, The Postmodern Condition, introduction and sections 1-3, 9-10: xxiii-xxv, 3-11, 31-41. (on reserve)
Jurgen Habermas, "Modernity--An Incomplete Project," in The Anti-Aesthetic, ed. Hal Foster (Port Townsend, WA: Bay Press, 1983): 3-15. (on reserve)

Definitions (please note the plural)

As you'll soon discover, "postmodernism" has been defined in a number of different ways. And these definitions themselves depend on competing definitions of "modernism" itself. With that in mind, then, let me offer you three important and influential definitions of "modernism," "modernity," and "the modern":

1. "Modernism" is a movement in the arts: The "modernist" movement is often said to have reached its height in the 1910s and '20s, with the achievements of great writers like Joyce, Kafka, and Proust; great painters like Matisse and Picasso; and great composers and musicians like Stravinsky, Berg, and Schoenberg.

2. "The modern age" is a period in Western history: "The modern age" includes both the nineteenth and twentieth centuries. It might be said to begin with the French Revolution of 1789, in which old forms of government are cast off and new social experiments are begun. It might also be said to coincide with the consolidation and global expansion of industrial capitalism.

3. "The modern spirit" is the driving force behind scientific and rationalist thought: "The modern spirit" is what has motivated and inspired Western thinkers for the last 350 years or so. It has led thinkers such as Descartes, Newton, Locke, and Jefferson to make ambitious claims for the power of human reason. Such thinkers have argued that reason makes it possible for human beings to penetrate the mysteries of nature, or to develop forms of government that will ensure human rights and expand human liberties.

Lyotard, selections from The Postmodern Condition (1979)

In Lyotard, there is much that might be confusing. As you read, however, you'll be safe in assuming that Lyotard is working with the third definition of modernity, the one that associates modernism with the Enlightenment. In the postmodern era, Lyotard argues, scientific and rationalist discourses have lost their "legitimacy." To figure out what he means, you'll need to ponder the concept of "legitimacy" and the process of "legitimation." Here are some questions to ask: How do various discourses and disciplines, not to mention particular arguments, "legitimize" themselves? How do they show, demonstrate, prove that they are worthy of serious attention--or that they should be viewed as reliable and indeed authoritative? To what standards, what protocols, what values do and must they appeal?

In addition, you should know a bit more about the "grand narratives" mentioned throughout the text. Think in particular about the narratives most frequently associated with the history of science or, somewhat more generally, with the development or "triumph" of human reason. (For example, consider the narrative that tells us how we emerged from the "Dark Ages," dominated by priests and princes, into a period of "Enlightenment," dominated by scientists and philosophers.) Such narratives, Lyotard says, are crucial to the process of "legitimation" (see xxiii). Do you see why they might be? Do you agree with Lyotard's assertion that there is now an "incredulity" towards, a deep suspicion of, such narratives? Are you yourself suspicious of them?

Habermas, "Modernity--An Incomplete Project" (1981)

Habermas is a German thinker, widely regarded as the most compelling and interesting critic of postmodernism. Although he begins this piece by talking about aesthetic modernity, Habermas (like Lyotard) is really more interested in the third definition of "modernity." Indeed, he is convinced that the scientific and rationalist modern spirit is still worth embracing and defending. In this essay and in many of his other writings, he expresses a great deal of faith in human reason, urging humans not to abandon the modern hope that more careful uses of reason might lead to the eventual betterment of humankind. Indeed, Habermas insists that far from being exhausted, "the modern project" has yet to be fulfilled (see 13). In reading Habermas, try to get a fix on what his political position might be. He has a lot to say about conservatives and neoconservatives. Is he a conservative himself? And why does he identify Derrida and Foucault as "young conservatives"? Do you think he's right to do so?

What next?

Homepage for English 60A
Syllabus for week seven (postmodernism and postcolonialism)
Students' responses to these readings
Links to internet resources on postmodernism

revised September 26, 1997 mail to Tim Spurgin

Postmodernism and Critical Theory


Postmodernism and Critical Theory are broad rubrics for intellectual movements rather than specific theories, but they are essential parts of social semiotic analysis. Postmodernism derives from Post-Structuralism and Deconstructionism, which were initially criticisms of the Structuralist movement of the 1960s. Critical theory derives from neo-Marxism and Feminist theory, extended to include Post-colonial theory and Queer theory.

Structuralism was an off-shoot of general semiotics and formal linguistics and proposed that there were systematic abstract relations-of-relations among the many sign-elements of human culture, whether in kinship relations and village planning (Levi-Strauss), cognitive developmental operations (Piaget), or linguistic phonology (R. Jakobson). These patterns of relations could often be expressed mathematically in terms of abstract group theory and other algebraic structures. It was employed in part also by Barthes (literary semiology) and Lacan (psychodynamics). This view, while basically correct, was too narrow and was criticized for being too static and synchronic (Bourdieu), too ahistorical (Foucault), and too definitive (J. Derrida). Derrida in particular mounted a radical philosophical critique in which he pointed out that the very act of meaning making always presupposes an unanalyzable ground of the possibility of meaning and of sign systems, and that the dialectic of sign and ground produces an inherent instability or indefiniteness in any meaning. These were the Post-Structuralist and Deconstructionist critics.

Post-structuralism was very quickly succeeded by a similar but more general critique of the master cultural narratives, metaphors, and assumptions of European (and European-American) culture (J.-F. Lyotard), including those underlying the previous critical tradition of Marxism and neo-Marxism. This was and is Post-Modernism, though the term is already somewhat unfashionable in France. Post-Modernism made common cause with other new critiques, from non-European cultures, especially post-colonial ones, feminist critiques, and later gay and Queer Theory critiques. Some of these movements also allied themselves with neo-Marxism, as Critical Theory. The present rubric for Postmodern and Critical Theory work is often called Culture Studies.

Perhaps the most characteristic tenet of postmodern critical work is that everything that European philosophy and science have held to be fundamentally true at an abstract or programmatic level (ontology, epistemology, metaphysics, logic) is in fact a contingent, historically specific cultural construction, which has often served the covert function of empowering members of a dominant social caste at the expense of Others. It dismantles the most foundational procedures and assumptions whereby prior European philosophical traditions sought to establish universal truths or principles. It is fundamentally a revolutionary political movement, argued in intellectual terms.

For a rather casual introduction to some of these issues, see Lemke, "Semiotics and the Deconstruction of Conceptual Learning." Some useful sources: M. Foucault, J.-F. Lyotard, J. Baudrillard, M. deCerteau, G. Deleuze, B. Latour, D. Haraway, J. Butler, M. Serres, F. Jameson, H.K. Bhabha.

POSTMODERNISM, PEDAGOGY, AND PHILOSOPHY OF EDUCATION


Clive Beck
Ontario Institute for Studies in Education

INTRODUCTION

In recent years, philosophers of education have been paying a great deal of attention to trends within philosophy which may be loosely referred to as postmodernist. In this paper, I wish to examine some of these trends and note some implications they have both for pedagogy in schools and for teaching and research in philosophy of education. It may be presumptuous of me to talk on this vast topic. But I wish to assure you that I am not doing so just because, as PES President for the year, I have a captive audience. I would have been this presumptuous even if my paper had been refereed! But then, of course, it probably would not have been accepted. Today, then, you are seeing academic freedom at work. I hope the results are better than they often are when academics are given freedom.

I should say at the outset that I am not an expert on postmodernism. However, postmodernist doctrines and practices kept intruding into my life -- especially as an attender of PES conferences and a reader of graduate student theses and course papers -- to the point where I could no longer ignore them. Also, from my little site in the academic world (some might call it a hind-site, but I prefer to see it as a fore-sight) I see enough problems with postmodernism, and enough misplaced criticisms of it, that I am inclined to say to heck with the experts and just wade in, a response which postmodernists, officially at least, must accept, given their avowed rejection of the concept of an expert.

In discussing postmodernism I will, as a non-expert, focus especially on secondary sources, the literature which for me is the most accessible and with which I have been able to become most familiar. I will also give a large amount of attention to one writer, namely Richard Rorty, mainly because among self-proclaimed postmodernists he is one of the more theoretical, which suits my purposes in this paper. Some might say that Rorty's theoretical approach means that he is less of a postmodernist; but to me it means that he is a more open postmodernist, willing to talk about his methodological and substantive assumptions.

Philosophical postmodernism is a development of which one might say that, like many other things, it has done more good than harm -- and it has done an awful lot of harm! As with most philosophical movements, it is perhaps best viewed as a rich quarry in which we can go searching for gems of insight while not feeling obliged to take home all the rubble. In this paper I will be concentrating mainly on the gems, looking at the positive side of postmodernism. This should not be interpreted as indicating that I am a postmodernist; however, given the trenchant criticisms of modernism developed by postmodernism, I would equally not wish to be seen as a modernist.

WHAT IS POSTMODERNISM?

Postmodernism is not just a philosophical movement: it is found also, for example, in architecture, the graphic arts, dance, music, literature, and literary theory.1 As a general cultural phenomenon, it has such features as the challenging of convention, the mixing of styles, tolerance of ambiguity, emphasis on diversity, acceptance (indeed celebration) of innovation and change, and stress on the constructedness of reality. Philosophical postmodernism, in turn, does not represent a single point of view. There are progressive postmodernists and conservative ones,2 postmodernists of resistance and postmodernists of reaction,3 strongly reform-minded postmodernists and others who concentrate on pricking bubbles. There are bleeding hearts and loose cannons. There is constant debate among so-called postmodernists about how a true postmodernist should approach life and inquiry and hence what qualifies as postmodernism.

The names most often associated with postmodernism are those of Jean-Francois Lyotard, Jacques Derrida, Michel Foucault, and Richard Rorty. Theoretical approaches most commonly seen as postmodernist are deconstruction(ism), poststructuralism, and neopragmatism.4 However, a case could be made for adding other names, e.g., Nietzsche, the later Wittgenstein, Winch, Heidegger, Gadamer, and Kuhn; and other theoretical approaches, e.g., perspectivalism, postanalytic philosophy, and hermeneutics. Even the critical theory of Jurgen Habermas, with its affinity with hermeneutics and its communicative ethics, has clear postmodern elements, despite Habermas's insistence that he is furthering the project of modernity rather than rejecting it. I mention all these names and movements not to impress or confuse, but to show the great overlap between different schools of thought and the pervasiveness of the postmodernist outlook. I feel that in discussing postmodernism we have often spent too much time searching for a neat central core. What is needed rather is to expose ourselves to and respond to a whole family of related outlooks and approaches.

Overlap can be found not only between contemporary theoretical approaches but also between these and ones of earlier historical periods. This is the view of Lyotard who, according to John McGowan, holds that postmodern and modern cannot be distinguished from each other temporally -- they exist simultaneously, referring to two different responses to modernity.5 Rorty takes a similar position, questioning whether the shifts associated with postmodernism are more than the latest moments of a historicization of philosophy which has been going on continuously since Hegel.6 Further, Rorty thinks that these changes were pretty well complete in Dewey.7 He does not see Foucault, for example, as any more radical in the postmodern manner than Dewey.

He says: "I do not see any difference between Dewey and Foucault on narrowly philosophical grounds. The only difference I see between them is the presence or lack of social hope which they display."8

There is of course something odd about seeing Hegel, Nietzsche, or even Dewey as postmodernists, given that they wrote within the modern era and in many ways expressed its spirit. Some writers prefer a more chronologically correct definition of postmodernism. John McGowan, for example, sides with Frederic Jameson in expressing the view that postmodernism as a temporal term designates a (very recent) historical period that is to be identified by a set of characteristics that operate across the whole historical terrain.9 However, despite the awkwardness, I prefer to interpret postmodernism as embracing many approaches and insights which were around before the last few decades and even before the present century. Personally, I feel I have been something of a postmodernist most of my life, even before my exposure to postmodernist writings (I can show you chapter and verse if you wish). And in terms of the history of philosophy, I think the notion that these are entirely new developments exaggerates the extent to which human thought and behavior change, and leaves us wondering how people in earlier centuries could have been so dense as to be completely taken in by the ideas of Plato, Descartes, and Kant. Indeed, it is a good question whether these gentlemen were completely taken in by them themselves: as we know, philosophers often get carried away, and then feel compelled to defend what they have said. For these various reasons, then, the view of postmodernism I am employing in this paper is a rather broad one. In opting for breadth, however, I am not alone. Some general philosophers, such as Rorty (as we have seen) and Richard Bernstein, take a similar tack, as do many educational theorists -- for example, Stanley Aronowitz, Henry Giroux, and William Doll.10

AN OUTLOOK INFLUENCED BY POSTMODERNISM

Accounts of postmodernism abound today in the literature of both general philosophy and educational theory.11 Accordingly, I will not here provide a general exposition of postmodernism but rather, after the brief statement of a particular theme, will go straight to an integration of it (usually in a modified form) into my own proposed approach. I hope, however, that such a treatment will, incidentally, help clarify the nature of postmodernism. The understanding of postmodernism I will assume here is a rough composite of ideas from Rorty (especially) and Lyotard, Derrida, and Foucault. It should be stressed, however, that many of these ideas have appeared in other schools of thought, both historical and contemporary, e.g., Marxism, feminism, critical pedagogy.

I have chosen to focus on these particular writers because they provide a convenient point of departure; and also because discussing them helps us come to terms with the dominant philosophical tradition, which we have some responsibility to try to influence. I have called what I am presenting here an "outlook," but that term is rather too cognitive in its connotations. The word "attitude" is sometimes seen as more appropriate for what postmodernists are talking about. The issues in question also have a strong methodological component: they have to do with an approach to inquiry and life in general. One might almost say that what we are concerned with here is a way of life, which includes cognitive, affective, and methodological components.

Reality

Postmodernists have helped us see that reality is more complex than we had imagined. It does not exist objectively, out there, simply to be mirrored by our thoughts. Rather, it is in part a human creation. We mold reality in accordance with our needs, interests, prejudices, and cultural traditions. But reality is not entirely a human construction, "made by us, not given to us,"12 as postmodernists have claimed. Knowledge is the product of an interaction between our ideas about the world and our experience of the world. As E.T. Gendlin says, "the assumption is overstated, that concepts and social forms entirely determine . . . experience. [W]hat the forms work-in, talks back."13 Of course, all experience is influenced by our concepts: we see things -- even physical things -- through cultural lenses. But this influence is not all-controlling; again and again reality surprises us (as modern science has shown) in ways that compel us to modify our ideas.14 We thought the world was flat, for example, but were obliged eventually to change our minds.

This view may appear dangerously close to Kant's notion that knowledge is a product of interaction between mental structures and sense data. However, whereas Kant's mental structures were innate and universal and his sense data natural and pure, I see culture and experience as already deeply infected by each other. They are interdependent, and differ only in degree of determination by human agency. A corollary of this interactive view of reality is that there is no sharp fact-value distinction. All factual statements reflect the values they serve, and all value beliefs are conditioned by factual assumptions. There is again a difference of degree which enables us to talk of facts and values.

But what we call "facts" are only somewhat less value-determined: they are not independent of values. This ties in with Foucault's postmodernist notion that knowledge and power cannot be separated, since knowledge embodies the values of those who are powerful enough to create and disseminate it.15 Foucault has perhaps an overly conspiratorial view of knowledge, but the link with people's interests which he identifies cannot be denied.

Change and Difference

Because reality is in part culture dependent, it changes over time, as cultures do, and varies from community to community. Knowledge is neither eternal nor universal. Once again, however, we should not exaggerate this point, as postmodernists have done. There are enduring interests (Dewey) and tentative frameworks (Charles Taylor) which point to a degree of continuity; and there are some commonalities (again qualified) from culture to culture and probably across the whole human race. To deny continuity and commonality where it in fact exists, as postmodernists tend to do, is just as irrational and unpragmatic as to see knowledge as eternal and universal. It betrays an absolutist attachment to such values as innovation, originality, and diversity. Furthermore, it can have unfortunate practical consequences, since it leaves people without an adequate basis for daily living. It is one thing to reject the idea of a fixed, universal foundation to reality, quite another to claim that no useful guidelines can ever be identified.

Taking note of the postmodernists' cautions, however, we should be careful with generalizations: they can be deceptive. Behind a general formulation such as "all humans are rational" or "people pursue pleasure" there is usually a great diversity of realities and interpretations. We should try to become more aware of this, and also more often explicitly qualify claims with words such as "some," "many," "most," "sometimes," "often." But even qualified generalizations are of great value in everyday life.

Metaphysics

Postmodernism is often seen by its proponents as bringing an end to metaphysics, ontology, epistemology, and so forth, on the ground that these types of discourse assume a fixed, universal reality and method of inquiry. However, in my view it is better to shift to a modified conception of these fields rather than do away with them completely. Precisely because we live in a changing, fragmented, postmodern world, we need whatever stability we can find. And inquiry into general intellectual, moral, and other patterns -- limited and tentative though they may be -- is a legitimate form of metaphysics.

An irony of the postmodernist movement is that, despite itself, it is centrally concerned with what we can say of a general nature about reality. I would even say that it has led to a massive (and salutary) revival of metaphysics. Postmodernists believe they have put an end to metaphysics and have thrown the ladder away after reaching their foundationless perch. But in fact their writings are full of general assumptions about culture, human nature, values, inquiry. As Landon Beyer and Daniel Liston observe, postmodernist analyses are paradoxical, containing standpoints without footings and talking about nothing.16 Not that postmodernists always deny that this is what they do -- Derrida happily admits that he crosses out his own claims; but to admit a fault is different from overcoming it.

The Self

Postmodernism has rightly questioned the idea of a universal, unchanging, unified self or subject which has full knowledge of and control over what it thinks, says, and does. It has shown that the self is strongly influenced by its surrounding culture, changes with that culture, and is fragmented like that culture. To a degree, it is not we who think, speak, and act but the culture which thinks, speaks, and acts through us. In many ways Rorty is correct when he describes the moral self as "a network of beliefs, desires, and emotions with nothing behind it . . . constantly reweaving itself . . . not by reference to general criteria . . . but in the hit-or-miss way in which cells readjust themselves to meet the pressures of the environment."17

It is an exaggeration, however, to maintain that because the self is limited, conditioned, and contingent in this way it has no significance, identity, or capacities. Individuals may be no more important than cultures, but neither are they less so. Individuals are just as unified and characterizable as communities, and they have considerable (though not unlimited) capacity for self-knowledge, self-expression, and self-regulation. There is no basis for emphasizing culture or community to the neglect of individuals. And the same may be said for specific groups within a larger culture: ethnic groups, gender categories, socio-economic classes, and so on. There is a tendency among postmodernists to emphasize these categories to the neglect of individuals. But in fact two individuals of the same national background, ethnicity, gender, religion, or the like may differ greatly. And two individuals who differ in all these respects may turn out to be kindred spirits who can have a close friendship, even a good marriage, and agree on most major matters.

Individuals are only in part identifiable in terms of the various categories to which they belong.

Inquiry

Postmodernist insights require a major shift in our conception of inquiry. No longer should we see ourselves as seeking to uncover a pre-existing reality; rather, we are involved in an interactive process of knowledge creation. We are developing a working understanding of reality and life, one which suits our purposes. And because purposes and context vary from individual to individual and from group to group, what we arrive at is in part autobiographical; it reflects our personal narrative, our particular site in the world.

To some extent, then, we must question the notion of expertise. In particular fields, some people do know more than others; but the difference, insofar as it exists, is usually one of degree. So-called experts are often heavily dependent on non-experts for input if they are to arrive at sound insights; and since each individual or group's needs and circumstances are different, expert knowledge cannot be simply applied; it must be greatly modified for a particular case. The interaction between expert and non-expert, teacher and taught, is often best seen as a dialogue or "conversation" (to use Rorty's term), in which there is mutual influence rather than simple transmission from one to the other. The knowledge arrived at, too, is more ambiguous and unstable than we had previously thought. It refers to probabilities rather than certainties, average effects, better rather than the best; and it is constantly changing as each individual or group gives a particular interpretation to it, reflecting distinctive needs and experiences. And as postmodernists have pointed out, language is well adapted to this constant play of interpretation. Words are not tied to fixed concepts or referents; they depend for their meaning on a whole system of words within which they are embedded, a system which changes over time and varies from one speech community or language user to another.

Inquiry must also be approached pragmatically.18 We should not insist that reality, including human nature, take a certain form but rather accept what emerges. If altruism, for example, has to be based in part on feelings of group solidarity, then we must acknowledge that: there is no point clinging to a rationalistic view of moral motivation that cannot work. Once again, however, we should be careful not to exaggerate these points. Postmodernists have often attacked notions of reason, means-end thinking, theory, teaching. But in fact there is a place for them, in a modified form. We must employ reason as well as feelings, intuitions, direct social influence, and so forth.

We must think in means-end terms to some extent if we are to know what we want in life and how to achieve it. Theory, understood as a loose interconnection of qualified generalizations, is crucial for daily living. Teaching, so long as it is largely dialogical, is both possible and necessary. And so on. All of these can cause problems if they are understood too strictly and taken too seriously; but without them we would quite literally be lost.

We must also qualify the notion of a pragmatic approach to inquiry. While there is no external foundation to reality, no traditional Kantian backup, as Rorty says, there are internal continuities which serve as important reference points. It is possible and necessary, then, to develop theory which explains particular phenomena in terms of these continuities. Postmodernists often display an easy pragmatism which, while claiming to be open and tolerant, is merely superficial, since it fails to develop and use theory of this kind; its doctrines thus become dogmatic assertions, without explanation or justification.

Forms of Scholarship

One of the slogans of postmodernism is that "there is no center," and in particular there is no central tradition of scholarship (namely Eurocentric, middle-class, predominantly male) of which other traditions -- Native American, Afro-American, Islamic, feminist, working class, for example -- are mere colonies. Insofar as we study traditional Western scholarship, we should be wary of its white, middle-class, male bias; and we should (if we belong to one or more other categories) approach it as equals, expecting to contribute as much as we learn. This is in line with the view of knowledge and inquiry noted earlier.

With this approach I am in agreement, but as you might expect I have some provisos. To begin with, we should not exaggerate the extent of the bias (great though it undoubtedly is) in traditional Western scholarship. There is much we can learn from such scholarship (although also much we must reject). This is because the writers in question, though white, middle- or upper-class, and male, were also human beings, struggling with basic issues of how humans are to survive, flourish, and find meaning in life. The bias in favor of particular ethnic, class, and gender interests is only part of the picture. Terms such as "Eurocentric" and "patriarchal" are bandied about too much, as though they described everything that an individual or group does, and as if every error that is made is due to the bias in question. As noted earlier, people of different races, genders, religions, or whatever may have a great deal in common. There is enormous scope for people of different categories to learn from each other's scholarship.

None of this means, however, that we should regard the Western scholarly tradition as the central one to which others merely contribute or add footnotes. Rather, white, middle-class males should just contribute along with everyone else, and any new, common tradition should be pluralistic scholarship, not simply a modification of the mainstream. A key point, in line with my earlier remarks about the self, is that in addition to anti-racist, feminist, anti-agist, etc. scholarship we need individual scholarship: Jane Doe scholarship, José Sanchez scholarship, Shiu Chun Leung scholarship, etc. We have not taken the personal quest of individuals seriously enough: every human being is constantly questioning, observing, theorizing, trying to understand life and make the most of it in his or her own very distinctive situation. The radical democracy of postmodernism leads in this direction, but it gets waylaid because of its excessive preoccupation with cultures and speech communities. Every individual should be seen as the center of a scholarship -- her or his own -- comparing notes on equal terms with other individuals, groups, and traditions.

SOME IMPLICATIONS FOR PEDAGOGY

There are many implications of the foregoing for educational practice, but space permits me only to outline a few of the main ones. To begin with, students in schools from an early age should be helped to see how ideas and institutions are tailored to suit people's values and interests: how, for example, a picture book or novel expresses the distinctive needs and background of the author; or how TV programming promotes life-styles which benefit commercial enterprises; or how the health professions tend to favor males over females; or how the school curriculum reflects the values of certain sectors of society. This need not involve use of technical language, or be particularly confrontational: such study can be a rather straightforward and enjoyable aspect of the school day. But unless we foster this kind of cultural-political understanding, we are supporting our students' continued perception of the world as value-neutral, unproblematic, and unchangeable.

Surprisingly, Rorty questions engaging in this kind of problem posing in schools. He maintains that lower education (primary and secondary) is mostly a matter of socialization, of trying to inculcate a sense of citizenship. It should aim primarily at communicating enough of what is held to be true by the society to which the children belong so that they can function as citizens of that society. Whether it is true or not is none of the educator's business, in his or her professional capacity.19 However, to me this is an extraordinary and inexplicable betrayal of the main thrust of postmodernism. How can a society succeed in constantly breaking the crust of convention, as Rorty advocates,20 when all its school teachers and all its young people up to the age of eighteen are involved in single-minded reinforcement of convention?

And how will this affect the self-image and well-being of young people who, as every parent knows, begin systematically to question our conventions from about the age of two? I agree that schools should teach students about social conventions and institutions, probably more than they do at present; but integral to that teaching should be fundamental evaluation and critique.

At the same time as we encourage the questioning of accepted realities, however, we must help students find foundations for their lives, if of a less permanent kind. Lack of a sense of stability and direction is one of the major problems of contemporary culture and is a factor in today's reactionary trends in religion, politics, education, and other spheres. If we do not acknowledge this need, our anti-foundationalist teaching may backfire and at any rate may cause students (and parents) considerable distress. We should work with students (and parents, as far as possible) in a dialogical manner, identifying outlooks which are an appropriate combination of old and new elements. Students need to find enduring values (e.g., relational, aesthetic, occupational) and ideals (e.g., pluralistic, global, ecological) which do not contradict their experience of reality but at the same time provide an adequate basis for everyday living.

One way of putting this point is to say, as I did in Part III, that metaphysics is important. Schools must encourage and assist students to engage in general theorizing about reality and life. The postmodernist emphasis on concrete, local concerns is important and should be applied in education: school studies are often too abstract and of little apparent relevance. But learning should combine both the concrete and the general. The learning of isolated facts and skills can be equally boring and meaningless. It is often through the drawing of broader connections between phenomena and the exploration of their value implications that learning comes alive. And study of this more theoretical kind is necessary if students are to build up a comprehensive worldview and way of life that will give them the security, direction, and meaning they need.

Another set of implications for schooling has to do with the democratic and dialogical emphasis of postmodernism, its questioning of the motives of authorities and its downplaying of the role of experts. We must think increasingly in terms of teachers and students learning together, rather than the one telling the other how to live in a top-down manner. This is necessary both so that the values and interests of students are taken into account, and so that the wealth of their everyday experience is made available to fellow students and to the teacher. Of course, the extent to which the teacher may be regarded as an expert varies from subject to subject. In science and mathematics, for example, a teacher may well know considerably more than most of the students in the class, while in values and family life this is less obviously the case; and with respect to a particular values topic, e.g., bullying in the school yard, a student may well know more than the teacher.

But even where the teacher does have greater knowledge, we should question excessive use of a teacher-dominated method. Lyotard has pointed out the extent to which students today at the postsecondary level can learn from computerized data banks, which he calls the "Encyclopedia of tomorrow";21 and the same point could be made with respect to the elementary and secondary levels. Increasingly, teachers must help students learn how to learn, using such technology. One great advantage of self-directed inquiry is that through it students are more actively involved in determining what they learn and why, and thus are able to give expression to their distinctive interests and needs.

However, while I support a democratic, dialogical approach in schools, I believe that Lyotard (like another education critic, Ivan Illich, before him) underestimates the importance of the teacher in motivating and facilitating learning. The activity of teachers in structuring school studies and making learning materials available at appropriate points results in students learning a great many things they would not otherwise learn. It is not enough simply to give students learning skills and set them loose: most young people need ongoing encouragement and help in order to learn what they need for life in today's world. Perhaps this is simply due to a shortcoming of contemporary culture: it has made young people too dependent on adult help. Or perhaps it is the result of more basic features of human nature. But whatever the reason, so long as students need external help in order to learn, we are hiding our heads in the sand if we do not provide it. (We, on the other hand, also need help from our students in order to learn).

In democratizing education, then, we should not simply dismantle all structures and hope that something happens, but rather try to create structures that give students the support they need and allow them to make a significant input and have optimal control over their learning. While schooling should as far as possible be dialogical, it should not be a mere pooling of ignorance. To be effective, dialogue requires strong input of many kinds: information, examples, stories, feelings, ideas, theories, worldviews, and so on. The point about a democratic approach is not that structure and content are unnecessary, but that students (and teachers) should have a major say in how their learning is structured and what content is made available to them.

SOME IMPLICATIONS FOR PHILOSOPHY OF EDUCATION

There are many implications of what we have been discussing for philosophy of education, but once again I must be selective. To begin with, students of education, like school students, should be helped to see that knowledge is value dependent, culture dependent, and changeable -- that we are not searching for a fixed, universal philosophy of life and education. At the same time, however, they should be helped to identify continuities and commonalities that give some stability and direction to their lives and to the practice of teaching. One way of achieving the twofold goal of combating foundationalism and yet helping students develop modest foundations for life and education is to study various forms of scholarship -- e.g., anti-racist, feminist, individual, and so on -- as advocated in Part III, above. In this way students will see that theory is necessarily tailored to suit diverse group and individual needs. As I have argued, however, this does not involve denying substantial overlap between different forms of scholarship. Indeed, the exploration of what different categories of people have in common should be a major aspect of educational studies.

The philosophy of education classroom, like the school classroom, should also be strongly democratic and dialogical. In this way the energies of students will be engaged, their values respected, and their insights made available to fellow students and to professors. It is surprising how often professors of education advocate democracy for schools and yet do not practice it with their own students. If we believe in a democratic approach to inquiry we should model it ourselves, so that our students understand what we mean and are given the opportunity to develop a democratic pedagogy which they can in turn employ in schools.

Adopting a genuinely democratic and dialogical approach involves a fundamental re-thinking of the nature of philosophy and of intellectual work in general and of our role as professors. We should not view our research into educational theory as something that can be carried on separately in the mind or in the study and then used as a key to unlock the secrets of education and life. As Rorty says:
the intellectual . . . is just a special case -- just somebody who does with marks and noises what other people do with their spouses and children, their fellow workers, the tools of their trade, the cash accounts of their businesses, the possessions they accumulate in their homes, the music they listen to, the sports they play and watch, or the trees they pass on their way to work.22

Philosophers are simply living life like everyone else, working on the same problems as everyone else, but using a distinctive language (often more distinctive than need be). We should compare notes with others, including our students, not impose our solutions on them.

In this respect, the postmodernist attitude is the same as the hermeneutic attitude, on Gadamer's interpretation. As Dieter Misgeld expounds Gadamer's position:
Hermeneutics . . . is a mode of inquiry that refuses to legitimate any disposition on the side of those inquiring to exempt themselves from what is topical in the inquiry. . . . [I]f inquiry is itself a situated activity, just as much as what one studies, the conduct of life of those inquiring comes to be an issue as does the relation of inquiry to their lives.23

This is not to downplay the importance of theory, as many postmodernists have done. Rather it is to recognize that everyone is constantly theorizing about life -- trying to make sense of it -- including the academically least able student in our class. Our task as professors is not to blind students with our knowledge of the history of philosophy and our command of technical jargon but rather to help them see that they are grappling with the same issues as we are and have been all their lives and to enable them to get into conversation with philosophers, ancient and modern, and other theorists, largely as equals.

However, while our educational theory will always be somewhat self-referential in this way, the broader our base of experience, the more others (including our students) will gain from our theory. We education professors must as much as possible go out into society, homes, schools. As noted earlier, philosophy is not a theoretical key that unlocks practice. Theory must be fundamentally rooted in practical experience if it is to be of value. The common professorial disclaimer that we are not equipped to talk about practical matters appears humble but is in fact arrogant; and it betrays a lack of understanding of theory. If we are not equipped to talk about practice, we are not equipped to talk about theory. We must as far as possible address both theory and practice. That is the most effective way to contribute to education, which is our responsibility. People who specialize mainly in theory or mainly in practice can make a contribution, but normally they would contribute more even in their area of specialization if they did both (in accordance with Buckminster Fuller's principle of synergy). Far from doing a better job by specializing in theory, we almost inevitably do a worse job.

Finally, just as we should encourage our students to dialogue with us and other theorists rather than drinking it in, so we ourselves should be more critical or dialogical in relation to so-called pure philosophers. I feel that, in general, philosophers of education over the past few decades have shown too much deference to pure philosophy. We have tended to quote people such as Austin, Wittgenstein, Heidegger, Habermas, Foucault, Rorty, and so on rather than interrogating them. As you can see from this paper, I believe in taking pure philosophers seriously; but they, like us, make enormous errors. I feel that, in good postmodernist

spirit, we who are in education should develop a positive image of ourselves as sensitive, knowledgeable people, working away in our particular site, interacting with other scholars and learning from them, but having as much to offer as to gain, and as in no way merely applying the findings of pure philosophy. In closing, I would like to pose a question: Am I here today engaging in genuine dialogue (and do I with my students back home?) or am I preaching, imposing, controlling, and so forth, in the manner criticized by postmodernists and by myself in this paper? That is something I want to reflect on more. But part of the answer, I think, lies in how active you are in assessing what I have to say. Part of the key to avoiding authoritarianism and indoctrination in classrooms of school or university is not to have teachers refrain from saying what they think, but rather to have students feeling free and acquiring the skills, emotions, and habits they need to react strongly and honestly to what teachers say. And the same is true here. I have said my piece as forcefully and clearly as I can. Now it is up to you to assess equally forcefully what I have said from the vantage point of your own experience, culture, ideas, interests, needs, values. I am sure my respondents will do that only too soon!

For responses to this essay, see Feinberg and Greene.


1. On this point see Linda Hutcheon, The Politics of Postmodernism (London: Routledge, 1989), 1.
2. See Stanley Aronowitz and Henry Giroux, Postmodern Education (Minneapolis: University of Minnesota Press, 1991), 19, 59.
3. See Carol Nicholson, "Postmodernism, Feminism, and Education: The Need for Solidarity," Educational Theory 40, no. 1 (1990): 43.
4. See Nicholson, 198.
5. John McGowan, Postmodernism and Its Critics (Ithaca: Cornell University Press, 1991), 184.
6. Richard Rorty, "The Dangers of Over-Philosophication - Reply to Arcilla and Nicholson," Educational Theory 40, no. 1 (1990): 43.
7. Rorty, "The Dangers of Over-Philosophication," 43.
8. Rorty, "The Dangers of Over-Philosophication," 44.
9. McGowan, 181. My parentheses.
10. See Aronowitz and Giroux's Postmodern Education and William Doll's A Post-Modern Perspective on Curriculum (New York: Teachers College Press, 1993).
11. Apart from works cited above and below, I would like to mention especially Chris Weedon, Feminist Practice and Poststructuralist Theory (Oxford: Blackwell, 1987).
12. Hutcheon, 2.
13. E.T. Gendlin, "Thinking Beyond Patterns: Body, Language, and Situations," in The Presence of Feeling in Thoughts, ed. B. den Ouden and M. Moen (New York: Peter Lang, 1991), 29.
14. This process of interaction is discussed by Northrop Frye in terms of the tension between centripetal and centrifugal tendencies. See his The Great Code (New York: Harcourt Brace Jovanovich, 1983), 52, 61-62, 217-18; and Words with Power (Penguin, 1990), 37-40.
15. See for example Michel Foucault, The History of Sexuality (New York: Random House/Vintage, 1990/1976), 11-13.
16. Landon E. Beyer and Daniel P. Liston, "Discourse or Moral Action? A Critique of Postmodernism," Educational Theory 42, no. 4 (1992): 383-87.
17. Richard Rorty, "Postmodernist Bourgeois Liberalism," in Hermeneutics and Praxis, ed. Robert Hollinger (Notre Dame, Indiana: University of Notre Dame Press, 1985), 217.
18. For accounts of Rorty's pragmatism, see for example his Objectivity, Relativism, and Truth (Cambridge: Cambridge University Press, 1991), 63-77; and Richard Bernstein's Beyond Objectivism and Relativism (Philadelphia: University of Pennsylvania Press, 1983), 198-207.
19. Rorty, "The Dangers of Over-Philosophication," 41-42.
20. Rorty, "The Dangers of Over-Philosophication," 44.
21. Jean-François Lyotard, The Postmodern Condition, trans. Geoff Bennington and Brian Massumi (Minneapolis: University of Minnesota Press, 1984/1979), 51.
22. Richard Rorty, Contingency, Irony, and Solidarity (Cambridge: Cambridge University Press, 1989), 37.
23. Dieter Misgeld, "On Gadamer's Hermeneutics," in Hermeneutics and Praxis, ed. Robert Hollinger, 162.

POST-MODERNISM

AND THE RECOVERY OF THE PHILOSOPHICAL TRADITION

F. L. Jackson
ljackson@morgan.ucs.mun.ca

Introduction

1 As century and millennium draw to a close the paradoxical thought preoccupying philosophers is whether or how philosophy is at an end. According to the now common opinion - among many academic philosophers, indeed, a certainty - the ideal of a universal knowledge through principles, philosophia, has long since been exposed as spurious, so that no person of right mind would nowadays recognize or indulge in it as a legitimate pursuit. For the new philosophers the fact is that "philosophy" as traditionally understood is a thinking no longer relevant for a post-modern consciousness and world; if it might still have a role it can only be in some radically attenuated sense: as writing its own obituary, clearing away the rubble of its own ruined foundations, speculating as to what it might now mean to live and think post-philosophically.

2 That the philosophical legacy has become moribund would certainly appear confirmed in the universities, where the former queen of the faculties has long been deposed and the view of philosophy as an obsolete discipline is so broadly established that even full professors of philosophy are rendered mute by the question as to why it should even be taught at all, much less what its proper curriculum should be. In the general culture too the appeal to rational grounds is viewed as un-chic, if not indecent; a moralistic presumption prevails that equates the naive appeal to principles with allegiance to established religions: as indicative of an atavistic and reactionary turn of mind. In a culture that tolerates the most capricious and absurd superstitions provided they claim no more than a subjective validity, the achievement and the way of philosophy do not even garner that much respect. The popular view accords more with the judgement of Nietzsche that the philosophical outlook and spirit is not merely misguided, it is perverse.

3 In this light the spirit of the times might, on considerable evidence, be described as aphilosophical through and through. But it can hardly be right to deplore this state of affairs, as traditionalists tend to do, as a kind of Roman degeneration of modern culture into mere thoughtlessness and caprice. For it must also be acknowledged that in consideration of its commitment to subjective freedom and its insistence on open discourse as sine qua non for the acceptance of any moral, intellectual or political position - not to mention the unprecedented numbers of philosophers populating contemporary universities - it could just as well be said that never has there been an age so thoroughly "philosophical" as is our own. Even those writers who would now claim to have at last overcome philosophical culture and its "logocentrism" are far from representing this eventuality as catastrophic; on the contrary, they herald it as the final liberation from an intellectual despotism, the emancipation of thought from all its past delusions.

4 Indeed it is now de rigueur among philosophers themselves to argue that philosophy did in fact end, with Hegel or thereabouts, and that the age when people believed in a universal, absolute knowledge, or that the actual is the rational, is long since over. So has almost everyone from Kierkegaard and Feuerbach to Rorty and Derrida argued (even Auden: "Goodbye, Plato and Hegel/ The shop is closing down..."). Thus it cannot just be a question of philosophy having somehow spontaneously withered away over the past century or so; rather the significant fact is that there has been a deliberate and resolute effort to overthrow it, and that this indeed has been the principal project of philosophy itself in the ultra-modern era. Bewailing the "decline" of philosophy is thus not quite to the point; the real challenge is to understand this ultra-modernist legacy of overthrow and the motives for it.

5 This is not to deny that there is something logically fishy about arguments which claim to set absolute limits to argument or a theory to end all theory, which is what the war over the end of philosophy being waged in the journals is mostly about. No more scrutable are pronouncements that we are now passing from a culture founded on intellectual principle to one that no longer is; especially when this position is argued intellectually. It is no doubt the paradox which prompted Lyotard to warn us that we must not view the "post-modern condition" as the dawn of some new culture to supplant the older modern one, for that would require us to give the "rationale" whereby the first is distinguished from the last, which is to contradict just what the step means to be, namely a stepping beyond all rationale-fixated culture. So if it is to be neither the advent of a new culture nor a lunge into the void, the link of post-modernity to modernity must somehow be maintained in stepping beyond it. Thus his formula: post-modernity is modernity itself in its self-negative extension.[1]

6 Attempts to think the end of philosophy share the same difficulty: how it is thinkable to go beyond philosophical reason or set it in abeyance without resorting to arguments that are again philosophical. It is the Cartesian problem of how one is to think beyond thinking. One way is to construct arguments that can claim to be "persuasive" in some para-logical sense; it has become common practice since Heidegger, Wittgenstein, Foucault et al. to appeal to poetic, linguistic or coercive "reasons" and even to cite these as the real hidden force behind the arguments of philosophy itself.[2] Another way is to abstain from argument altogether, as does Derrida who, when asked what he really means (vouloir dire) in his books and arguments, replies that he means nothing at all,[3] which is the right answer if what one in fact "means to say" is precisely that all meaning is
undecidable.

7 The paradox is nothing new. In various forms it has plagued the whole career of anti-philosophical thinking in its rise to predominance over the past two centuries. If that history may be described as the history of attempts to effect the definitive critique of the philosophical tradition, it is also the history of this paradox and of successive attempts to surmount it. It is the purpose hereinafter to explore this distinctively ultra-philosophical spirit and very broadly to sketch the lines of its development from its early nineteenth century origins to its current post-modern denouement. It will be argued that it is precisely the contradiction entailed in the very idea of a philosophical conquest of philosophy that has rendered all attempts to articulate it ambiguous and deficient, and that it is the same ambiguity that has provided the dialectical engine which has driven each interim stage of the argument beyond itself, forcing its restatement at a further level.

8 For it is only when philosophical movements reach their proper denouement that it first becomes possible to begin to understand and evaluate them within the purview of the wider history of thought. Before that, the dogmatic enthusiasm that is associated with projects still under way and whose aims are as yet unsullied and undoubted makes any real questioning of them virtually impossible. So it has been with Euro-American thought since the eclipse of the great age of modern philosophy in and after Hegel's time, whose logic and limits have only lately begun to come into view. "Ultra-philosophy" would seem the apt term to designate the general form of the thinking peculiar to that era which, in other contexts, is often referred to as "ultra-modernity". The prefix "ultra-" has the convenient double sense of "going-beyond" and "taking-to-the-extreme", and ultra-philosophy stands in just such an ambiguous relation to the philosophy of classical modernity which it would at once overthrow, but also drive to its limit.

9 The common view of post-modern writers that their own perspective owes its origin to very recent insights on the part of a Rorty or Derrida is quite mistaken. The undertaking to emancipate thought from philosophy is already two centuries old and has generated a substantial legacy of its own. The earliest forms of ultra-philosophy are to be found in nineteenth century materialism or evolutionary theory, in Feuerbach's "going-beyond" of Christian theology or Schopenhauer's and Kierkegaard's subordination of speculative reason to specifically contra-rational absolutes. The 20th century saw a return to philosophy in a new key in the form of methodologies whose ostensible aim was the "reform" or "critique" of philosophy, thus again with an essentially ultra-philosophical intent. Its most recent shape is the post-modern scepticism which assumes the whole philosophical legacy to be self-discredited and which would no longer seek to go beyond it or reform it, but remain sceptically poised, as it were, on its nether side.

10 In this its most recent mutation, however, the project of ultra-philosophy has been brought to the brink. In this sceptical form the contradiction inherent from the beginning
in the idea of thinking the end of thought is escalated to suicidal intensity. For what would now be accomplished is no longer just the overthrow of philosophy but also the overthrow of the overthrow, the critique of the critique. What has come to light for post-modernism is that there can be no decisive argument to put an end to thought since all such arguments are but thinking again. The only option, then, is simply to assume outright the nullity of all argument, both philosophical and meta-philosophical, and to sustain this stand through purely sceptical-intellectual activity (which Derrida calls "deconstruction" and Rorty a neo-pragmatic "conversation") engaging extant positions of every kind and seeing them as self-invalidating while conscientiously seeking to remain position-less and inconclusive itself. But with that, the essential project of ultra-philosophy, which was to carry out the final overthrow of the philosophical legacy, is really abandoned. There is now no longer any distinction between what is to be gone beyond and the going beyond it, between the philosophical legacy and its critique. All that remains is philosophy that has become totally and purely academic, a reflection which has no content of its own beyond the endless evocation and subversion of arguments, and which "means to say" nothing beyond this exercise of a wholly negative reason.

11 The career of ultra-modernist thought may accordingly be delineated in three principal phases. The nineteenth century saw the advent of various doctrines that had as their common distinctive theme the dethronement of the spiritual-speculative outlook of the western tradition and its replacement with distinctly counter-speculative forms of world-explanation: a position to be designated hereinafter as "counter-philosophy". At the turn of the century new schools of philosophical inquiry appear which make it their business to disclose and correct, from a second-order, critical standpoint, what are alleged as the fatal fallacies of all western philosophy: thus "meta-philosophy". Finally the limit of ultra-philosophy is reached in the post-modernism which declares both the dogmatic and the critical forms of the opposition to philosophy self-defeating, and proposes instead to expose the whole legacy of reasoned discourse as spurious and annulled in itself: "post-philosophy". Each successive shape of the ultra-modernist thesis has its own distinctive approach to how the end of philosophy is properly to be thought; each has its unique interpretation - and indeed misinterpretation - of what it is in the speculative tradition that must be rejected; and each, in its own way, runs afoul of an ineradicable paradox that plagues every step of the way.

I. Counter-Philosophy: Scientism and Absolutism

12 What it was that originally provoked the ultra-modernist turn in philosophy is a question that already has too many answers. From Feuerbach to the present the account of what in the older speculative tradition demanded its radical repudiation has been stated and restated in so many conflicting ways that to cleave to one or another version would be arbitrarily to fall in with some particular school. For to accept Nietzsche's answer or Ayer's or Dewey's is thereby to reject Kierkegaard's or Marx's or Heidegger's, for these are wholly contrary accounts of the matter which cannot be reconciled. What is more to
the point is to go back to the beginning again to seek to understand the ultra-philosophical project as a whole, as a history, and to consider how the argument takes shape, what conflicts arise and develop in it, and what is its final outcome.

13 The boldest, most straightforward arguments are usually those made at the beginning. The apocalyptic writers of the nineteenth century were the first to challenge the traditional modern-western account of the world and attempt to articulate entirely new perspectives considered appropriate to the emerging ultra-modernist culture with its techno-political humanism and appeal to a radical subjective freedom. There was the sense that history had "broken in two",[4] and that the history of philosophy in particular had reached an epochal impasse in which its limits had been reached and exposed. The ancien régime of thinking reason was summarily jettisoned and new modes of thought proposed whose thrust was distinctively realist, non-conceptual, historical, experiential, humanistic and world-affirmative.

14 This counter-philosophical spirit took two chief forms: the first would abandon speculative thought altogether for a dogmatic rationalism appealing to positive, atheist and materialistic world-explanations - scientism; the second somewhat retained a speculative appearance but such as posited an explicitly counter-rational principle as its object and theme - absolutism. Scientism set in opposition to the spiritual-speculative view of the world - to "metaphysics" - another derived from one or other of the finite sciences, elevated to the rank of a philosophy-surrogate; thus sociology (Comte), politics (Feuerbach), psychology (Mill), biology (Spencer) or physics (Mach). Absolutism would still make its case as philosophy, even as metaphysics, though as inverse metaphysics, centring on a notion of being or "ultimate reality" as in itself irrational, self-oppositional and paradoxical, a perpetually self-reflexive "absolute-finite" in principle destructive of every objective stability. In this is expressed again, in another way, the basic thesis of counter-philosophy: namely that it is the finite self-consciousness and world that is really absolute, a view which Schopenhauer, Stirner, Kierkegaard and Nietzsche all champion.

15 Few any longer question the so-called "scientific view of the world" that once gave fright to kings and popes. Its appeal rests on the claim to have abandoned the vagaries of abstract thinking and to have reestablished science anew on a wholly non-theoretical base, relying exclusively on the brute facts of nature and society as should be obvious to a healthy mind that has given up trying merely to "think" the world and has instead wholly immersed itself in it. Condemning speculative metaphysics as a fraudulent appeal to indemonstrable figments, it promotes in its place a comprehensive, realist, demystified, "metaphysics-free" account of man, nature and history which resolves not to stray from the finite, concrete, immediate and factual human world; an account which, since "positive" and not theoretical,[5] is immune to all theoretical doubts and distortions.

16 It is just this anti-intellectual bias, however, which renders scientism un-scientific in practice. For its appeal to evidence is at bottom dogmatic: some "general fact" is postulated and then ordinary facts conscripted in "confirmation" - thus that the progressive evolution of species is the brute fact of nature is demonstrated by the existence of certain frogs, or the actual policies of Napoleon are evinced as "proof" of the class struggle as the brute fact of history. But such a verification is wholly circular and the notion that there are primordial general facts is in any case clearly a fiction whose real function is to substitute for the appeal to reasoned principle. As the theory of evolution, physicalism, mechanistic psychology, historicism and other such doctrines demonstrate, what scientism actually produces are crypto-metaphysical doctrines[6] which deal in postulates no less figmentary than those they mean to replace. In short, scientific positivism is just metaphysics again in another form, a metaphysics of the finite or factual world posited as absolute, that is, as "unconditionally given".

17 Though with similar roots and intent, absolutism stands utterly opposed to scientific positivism; the two wage continuous war throughout the nineteenth century and beyond. "The Absolute" in its ultra-modern meaning embodies a distinctly contra-speculative reference, a radical affirmation of the finite-as-absolute similar to scientism's, though now from the side of the absolute. Typically characterised as what exists in itself before all consciousness of it, thus in principle opaque and impenetrable to reason, it is only the Absolute's own self-disclosure which makes its apprehension even possible, an apprehension that is for this reason pre-rational or aesthetic. Schopenhauer's Will and the Kierkegaardian inwardness provide early examples of this absolutist reference which appears in other guises throughout the century: "the Unknowable", "the Incomprehensible", "Will to Power" and so forth.[7] Nineteenth century absolutism generated a whole legacy of popular imagery - "ultimate reality" as Life, Self, Cosmos, Energy, the Unconscious etc. - while the literary tradition was also much given over to the same romantic-absolutist language of inscrutables and ineffables.[8] What Heidegger, playing Parmenides to ultra-modernist Milesians, later will simply call Sein, springs from the same ancestry.

18 There thus exists from the beginnings of counter-philosophy a profound revolutionary-reactionary division that stems from a fundamental ambiguity as to how the reality of a wholly finite, natural-historical human existence might be comprehended and affirmed over against the ideality of the world as it is for traditional philosophical thought. At this point the goal is not conceived as one of bringing thinking itself to an end but rather as discovering a distinctly counter-conceptual mode of thinking: thus "science" (in this corrupted sense) or "subjectivity". For in its innermost soul the intent of the ultra-modernist spirit is not to repudiate modernity, but only to overcome what is still mediated in it, to affirm its core principle of a concrete human freedom in the world as an actually or virtually realized condition. And this is precisely its ambiguity: it would go beyond modernity and its tradition and not go beyond it; it would extend it to its most extreme form and yet withdraw from that. Accordingly, both scientific positivism and absolutist nihilism would affirm a finite reason in place of a universal and deny the idea of freedom
for the sake of an actual one conceived in social terms or as a self-affirmative life. A most intense debate develops as to precisely how the "overthrow of idealism" is to be appropriately effected and what a new, ultra-modern thinking-in-the-world would be; whether the revolutionary repudiation of thought altogether or its reactionary reconstitution in a self-negative form; whether simply to step beyond reason or turn it against itself.[9]

19 This intense controversy within counter-philosophy embodies the paradox intrinsic in the ultra-modern ideal of a purely finite reason and freedom and the corresponding overthrow of the philosophical spirit from this radical human standpoint. The counter-philosophers could only solve the dilemma by sundering the classical modern idea of freedom, the unity of reason and being, into its constitutive elements, and playing these off against one another such that what one specifies as the epitome of the metaphysical and abstract the other advances as the essence of the this-worldly and concrete. Thus positivism and its variants abrogate universal being in the name of the world as it is for a finite human reason, while absolutism abrogates universal reason in the name of being as it is for the finite existent. The one indicates as the key metaphysical superstition it would repudiate precisely that which the other affirms as the truth to be rescued from it; and vice versa. The great debate between moralism and romanticism affords the popular paradigm: both affirmed a radical finite freedom as the unity of self-consciousness with nature. But for moralism freedom is preeminently realized in the human-practical overcoming of nature so that nothing is so morally abhorrent as the doctrine that freedom is something instinctive. But for romanticism it is just in the natural immediacy of individual self-feeling that freedom is aesthetically given, and nothing is thought to pervert this instinctive identity of freedom with "life" so much as the divorce of reason from nature which moralism promotes.[10] In this manner the modern principle of the unity of reason and being would be at once subverted and conserved.

20 The same opposition pervades the thinking of the whole era.[11] Strauss and Kierkegaard debated the revolutionary sense of the modern-Christian principle of divine-human identity, the former representing it as commitment to an objective human self-making to which subjective faith is to be given over, the latter as precisely the subjective passion of faith which leaps beyond all humanistic moralism and rationalism. Counter-metaphysics similarly divided into polar arguments of positivism and nihilism: Comte would seek a new ultra-rationalist basis for science and social morality in a being-for-man of the world to which the traditional transhuman visions of philosophy and religion are to be assimilated. But it is just the relentless in-itself-being of reality, the utter unreason of the absolute, which for Schopenhauer annihilated everything that is merely positive or objective in human existence. What is remarkable is how the one view negatively mirrors the other and precisely and utterly abrogates just what the other asserts.

21 Counter-ethical thought had among its chief representatives Feuerbach and Stirner in Germany, Mill and Spencer in England and James and Royce in America. The same
mutually oppositional relation of affirmation/abrogation is manifest. Feuerbach, for example, describes speculative philosophy as intellectualized Christian theology; its image of the God-man prefigures freedom as the finite individual's immediate sense of his own human species-being. The setting aside of the alienated spiritual-intellectual form in which religion and philosophy represent this relation is a political emancipation (Feuerbach: "politics is our religion"[12]) in which a subjective, un-humanized individuality is awakened to the consciousness of its essential humanity. To Stirner, nothing could be more alien than the notion of an objective human essence. The belonging-to-self of individuality is an ethical absolute and everything stands in relation to it as "its own". All objective ethical "causes" dissolve in the infinite reciprocity of Der Einzige und sein Eigentum, of singularity and ownership,[13] and the "spirit", whether of liberalism, humanism or moralism, is only the moribund after-life of a religious-philosophical unfreedom - a "spook".

22 These are the same positions Marx and Nietzsche later refined into doctrines that became enormously consequential for later ultra-modern thought, culture and political life. Both were aware of the limits of earlier counter-philosophical arguments which, continuing to play on the same field they would abandon, were in the end self-defeating. Marx, recognizing Feuerbach as mentor, complained that his overthrow of theology was still theological,[14] while Nietzsche, acknowledging Schopenhauer as teacher, faulted him for refuting morality only to advance a more decadent form of the same.[15] The trouble with the arguments of their predecessors, both concluded, was that their one-sided dismissal of counter-positions had taken insufficient account of the force of those positions, a defect Marx and Nietzsche would remedy by seeking more definitely to identify their own specific counter-thesis and negatively to comprehend it within their argument. Nietzsche's work is wholly addressed to morality, the humanistic will-not-to-will as the antithesis of will-to-power; he questions how it could even arise in the first place - "how the saint is possible" - and how it has come to contaminate the whole of historical culture. Marx on the contrary would account for radical individualism and its anti-humanist ethic, which he saw as thwarting man's natural species-life; he offered a logic of ideological power and class dialectic to explain what he saw as the cruel anomaly of the rise of bourgeois societies founded on a spurious subjective freedom. Thus humanism becomes decadent individualism and individualism alienated humanism.

23 The appeal to totalistic, ideologically inspired theories of human history to justify some one-sided repudiation of western philosophical culture as a whole began in earnest with Marx and Nietzsche and established violent prejudices which provided the fuel, first for the class struggle and then the 20th century wars. For Marx, the revolutionary humanist, the engine that impels history is the contradiction embodied in autocratic individualism; for Nietzsche, the aesthetic autocrat, it is the apotheosis of the life-denying, humanistic spirit. What again is remarkable is the mutually contradictory character of these accounts; how they explicitly cite one another as opposites. But of course the same human history cannot be both the tale of how objective social freedom was ever frustrated by the oppressive power of the absolute individual will, and also the
progressive perversion of authentic subjective life at the hands of a repressive political-technological idealism.

24 It is clear that the polar-opposite Marxist-Nietzschean accounts of the past are pure concoctions whose real purpose is to substantiate arguments which of their nature shun all appeal to rational grounds. It is history that becomes the medium in which the contrariety between positivist and absolutist accounts of freedom is sustained, and in such a manner that each side declares itself to be the liberation from its own counter-thesis, construed as having dominated the human past up to now. What is called history becomes the chronicle of the progress of a spirit each would now overthrow: for Nietzsche the apotheosis of the nihilistic human will against which the new philosopher would now dare reaffirm "Life"; for Marx an epic of ideological oppression on the part of the ruling classes, now at last overthrown. Both the specific account of present cultural crisis and the specific caricature of the past from which it is alleged to spring belong together as reciprocal facets of one argument, whose interest is not really in world history but in reconstructing it to generate counterfoils to what are essentially ultra-modernist positions. It is inevitable that history itself, especially the history of philosophy, is barbarized in the process, and the legacy of this barbarization is everywhere still evident and has indeed become the accepted view of the past.

25 What scientism's atheistic, a-logistic positivism would defend is objective progress toward a fully actual human world, a condition of finite and tangible freedom such as traditional spirituality is said to have written off as impossible and unworthy. Absolutism would similarly affirm a radically finite freedom, the freedom of authentically subjective individual life whose repression is alleged to have constituted the burden and theme of traditional culture. These ultra-modernist forms of extreme humanism and extreme individualism appear to themselves as if pitted against a common enemy, the tradition of reason and its "idealism", but in reality they are pitted against each other and with an intensity which, when translated into political action, was to become fanatical.

26 The response of the philosophical tradition to nineteenth century ultra-modernism was to attempt to erect bulwarks; the later part of the century sees a rash of "neo-idealisms" - neo-Platonism, neo-Thomism, neo-Kantism etc. - which were not true reversions to these earlier positions but exploited them to fashion anti-anti-idealist weapons with which to go to war with materialism. But doing battle on fields and with arms chosen by the enemy, they succeeded only in further distorting the very sources they would invoke - Plato became a Victorian moralist, Hegel a Prussian nationalist or British imperialist. In the subsequent stand-off between ultra-modernism and neo-idealism it became clear to the former that its fuller conquest of the tradition of reason required that the attack be taken into the precincts of philosophy itself, there to repudiate it on its own turf. Accordingly, 20th century critical thought is more than merely counter-philosophical; it carries out its subversion of reason from a standpoint that claims to be at once beyond philosophy and itself philosophical: meta-philosophy.[16]

II. Meta-Philosophy: the 20th Century Schools

27 The general standpoint and presumption of 20th century thought is of self-consciously free, contemporary individuals existing in immediate relation to a finite world they directly know as their own. All notions of reality beyond this world are ruled false or "metaphysical". Scientism and absolutism have by now become so much second nature that the thought-world appears to have entirely receded into the past: a "traditional philosophy" that can no longer be relevant for a confident individuality that has become wholly attached to what is distinctly and concretely there and possible for finite human practice and life. Schools of analytical and existential philosophy arose to articulate this position. They would aggressively seek to occupy the intellectual territory on the hither side of an epochal break with the old world of reason, taken as a fait accompli, and would rise to the adequate thought of a brave new world of subjective freedom which is sustained through the definitive critique of the standpoint of traditional philosophy: definitive since itself philosophical. Philosophy is to carry out its own refutation.

28 From this standpoint the whole of traditional thought is taken as vitiated through its habit of transcending limits now declared insuperable. It has been guilty of ignoring the perspectival limits of consciousness, for example, of thinking beyond time, of "forgetting" the radical finiteness of being, of uncritically accepting non-factual statements as true, of failing to realize "thinking" is only linguistic activity and so on - to all of which offenses traditional philosophy itself would of course readily confess. The new philosophy on the contrary will make no claim to any first-order knowledge; it will constitute itself solely as the second-order reflection whose only aim is to legislate against such transgressions and to get investigations under way designed to expose, arrest, curb and correct the perennial pretensions of rational thought in its misguided aspiration to an impossible universal knowledge.

29 As logic and ontology are foundational in philosophy, the new meta-philosophy initially took shape as attempts to establish a new logic and ontology of the finite to supersede their traditional foundation in thinking reason. Accordingly, a number of nineteenth century experiments in mathematics and psychology paved the way for later reconstructions of logic along essentially extra-logical lines: Brentano, Boole, Frege, Peirce and others. The inward motivation of this revolution was not at all to advance logical science itself but to bring logic as a whole under what are essentially ultra-logical criteria, drawn from mathematics, semiotics or psychology. The analytical and phenomenological schools trace their roots to such meta-logical and meta-ontological "investigations" of the first decades of the 20th century: Husserl's Logical Investigations,
Principia Mathematica, Tractatus Logico-Philosophicus, Being and Time.

30 The aim of the new methods was completely to undermine the traditional philosophy through methodical "clarifications" of all its alleged obfuscations and fallacies. That such a meta-analysis of philosophy is the only legitimate task of philosophy itself was to become the conventional wisdom by mid-century. Ironically, "philosophy" appeared suddenly reborn; for several generations the works of the grand masters of meta-philosophy - Frege, Dewey, Russell, Husserl, Wittgenstein, Heidegger - became virtually scriptural, the required class-texts of vast academic schools whose scholars produced mountains of research aimed at completing the final critique of traditional philosophy. The whole legacy from Plato and Kant was read and taught again on a mass level, not on its own terms, but so as to provide grist for the meta-philosophical mill to grind into fine critical dust, or as a source of interesting themes to be suitably transposed into the new key. Meta-logic and meta-ontology came to dominate academic philosophy through the century; they precisely express the spirit of the ultra-modernist heyday, the era of final solutions, whose art, popular culture and philosophy, no less than its politics, affirmed as absolute the finite will to overthrow all absolutes.

31 The claim of the new analysis to put philosophy on the side of science did not mean philosophy was itself to become science but that since knowledge is assumed exclusively to be the positive-scientific account of the fact-world, the true role of philosophy must be to establish and defend the rules of such scientific verification meta-scientifically.[17] Though Russell's logical atomism looks much like a rehash of classical British empiricism (for simple and complex ideas read atomic or molecular facts; for laws of induction read truth-functions etc.) the difference is that for Russell there no longer are any empirical things-in-themselves; no ideas, no thinking subjects, no reasoned empirical inferences, in short, no philosophical knowledge. There is only the positive "fact-world" and individuals who use language to mirror it. Logic is not thought reflecting on its own inward structure - there are no "thinking beings", only brain-equipped linguistic animals. Logic is meta-logic, the second-order system of rules, themselves wholly factual, for the correct formulation of positive statements. The realm of propositionally pictured fact is for Russell the only real world there is, the radically finite here-and-now world which analytical philosophy would oppose to the thought-world of traditional metaphysics.

32 The commencement is thus decidedly not with any appeal to a rational basis but to a series of dogmas which simply declare how things stand with finite individuals fashioning statements about their equally finite world. Among these dogmas: only the fact-world exists and nothing else does; to "know" is correctly to state facts through propositions; only propositions referring to empirical facts are true or false, all others merely formal or empty expressions; empirical science alone judges as to what the facts are and metaphysical or ethical statements are nonsense; the exclusive business of logic (hence philosophy) is so to clarify the rules of propositional statement that all non-factual claims can finally be put to rest. These same positions are repeated in Wittgenstein's
Tractatus which more decisively makes it the sole business of philosophy, not to frame propositions of its own, but only "to make propositions clear". The work makes a beginning toward ridding the Russellian formulae of their crypto-metaphysical residue, establishing more strictly the rule that of what would lie beyond the facts and their verbal picturing "nothing can be said" and so we should remain silent.

33 Later positivists develop the same emphasis in attempts to formulate a "principle of verification"[18] through whose relentless application every temptation to metaphysical judgement might be arrested. Through a generation of analytical literature, however, the limits of the verification criterion worked their way to the fore: it is impotent respecting scientific generalities like E=mc²; it cannot explain why only physics-like statements are factual without invoking empiricist metaphysics; the ghost of an ultra-factual "world-out-there" always seems presupposed; its restriction of meaningfulness to factual utterance in any case stretches credulity. Moreover, its essential dogmatism is exposed in that its criterion cannot apply to itself without self-destructing. It becomes apparent that the regime that allows only factual statements to be meaningful is itself wholly metaphysical and does not square with the intent of the new philosophy which was to establish a meta-metaphysical beach-head in the everyday world in such a way as to demystify it of all metaphysical prejudices. The need is felt for a less theory-laden approach to analytical investigation such as would comprehend a multitude of meaningful ways in which individuals use language to address and express their immediate world.

34 The later Wittgenstein will thus speak of propositional logic as only one use of language which it is presumptuous to rank above others. His analysis asks that we avoid assumptions as to what may or may not be meaningful or true and which privilege some particular use of language, a step which can only be justified extra-linguistically, that is metaphysically. The more adequate inoculation against metaphysics is the recognition that the problems of traditional philosophy are really linguistic neuroses and bottlenecks and true philosophy the analytical therapy which liberates language from these fixations and shows "the fly the way out of the fly-bottle".[19] Such analysis will avoid explicit counter-metaphysical refutations like that of logical positivism; it will simply unravel the linguistic tangles that constitute the knotty problems and puzzles, including positivist ones, that engender what has been called "philosophy". The standard of normality for this therapy is the everyday, spontaneous use of language as the "common behaviour of mankind".[20] It is no longer a question of uncovering hidden realities or even of comprehending or changing obvious ones; the simple task of a linguistic philosophy is to bear witness to ordinary language-behaviour and thus to "leave everything as it is". Thus would Wittgenstein affirm the preeminence of the immediate, quotidian world of everyday talk over the alleged tortured perspectives of reason. It is no longer a question of a pre-given fact-world pictured in static empirical propositions, but of a contextual, behavioural world of common linguistic usage, seen as absolute since nothing whatever can be uttered or understood except in its terms.

35 Wittgenstein's linguistic positivism sent everyone into the cultural byways looking, "not for the meaning, but the use". With Austin the everyday dictionary and thesaurus were elevated to the rank of philosophical texts. The rule was to treat all instances of linguistic behaviour as differing "language-games", each with its peculiar rules, each appropriate to its context, and none, not even empirical propositions, affording privileged access to extra-linguistic truth. It was now even possible to turn again to religious "god-talk" or ethical or metaphysical pronouncements so long as the same non-committal interest was maintained as would apply to the analysis of the rules of the lingo that builders use on the job; that is, without making any commitment whatever to what the language of theology, ethics or science actually said. This studied reduction of every content to the form of the language used to communicate it became one of the most powerful paradigms of all 20th century academic teaching and research. In philosophy it was thought a great liberation to be released from grappling with first-order problems which one could now feel satisfied were in any case bogus and easily resolved simply by reference to the ordinary language one ordinarily spoke and in which one could presume to be already somewhat expert. Philosophy of language provided a solid, readily available and democratic vantage-point from which almost anyone could effect the summary overthrow of philosophy and be instantly emancipated from all the illusions, as well as the hard labour, of rational thought.

35 Husserl makes essentially the same commencement as Russell with researches into mathematical foundations. His is also a revolt against traditional metaphysics and "unscientific" ways of thinking.[21] He too appeals to the immediate, temporal life-world as it is for existing individuals, with stress on the subjective aspect of its givenness. His understanding of the role of a new logic and ontology is the mirror-complement of Russell's: what is important is not the fact but the facticity of the fact, not the fact-world as objective but as a system of meaning. Scientific philosophy will be the eidetic analysis of the modes of the "being-there" of the world for the "consciousness-of" it, to which access is gained by suspension of every thesis and inference that would go beyond the "things themselves" in their primordial givenness. This epoché sets all appeal to metaphysics, including empiricist metaphysics, in abeyance; in one para-Cartesian stroke the world for thinking reason is summarily suspended and all that remains is phenomenologically to describe the pre-reflexive being-for-consciousness-of-the-world which is thereby revealed.

36 Heidegger's inspiration for Sein und Zeit was the same intentional relation of existential consciousness to its own pre-reflexive world. Dasein, as "the being for whom being itself is a question", is quite the same "I" as Husserl's phenomenological subject though analyzed rather in ontological terms of the modes of this finite-being (being-in-the-world, fallen-ness, being-with, Angst) as also the modes in which being stands related to it (available, useful, present or absent, disclosed, concealed). Time is revealed as the essence of being; it is in its various ecstatic modes that being presents and absents itself. Thus would Heidegger express how things stand for the finite individual who affirms a radically temporal, conditional and contingent world as his own. A whole mid-century
culture of popular existentialism took its cue from this kind of reflection and developed it in all sorts of directions, particularly in the arts.

37 But like Wittgenstein, and for analogous reasons, the later Heidegger drew away from the quasi-psychological approach of phenomenology[22] into a more direct ontological format. For if access to being is sought through analysis of the special case of Dasein, as Sein u. Zeit proposed, it must remain problematical whether what is disclosed thereby applies only to the special case, or to being itself; whether temporality, for example, is a dimension peculiar to human being only or to Being as such and on the whole.[23] Playing Spinoza to Husserl's Descartes, Heidegger gave himself wholly over to the "question of being" as such and to the thinking that might think it in this negative-ontological sense. His later essays are thus occupied with giving an account of being qua being in the classical Thomistic-Aristotelian manner, except that instead of the eternal, unitary, universal categories of being in the traditional account, it now discloses itself through radically contrary, this-worldly categories of particularity, temporality, fatality, difference, contingency, eventuality, fortuity and so on. In short, Heidegger's is an inverse metaphysics, a meta-metaphysics of the finite, which is to say a doctrine of being as time.[24]

38 In so seeking an account of being as it would be for a wholly finite subject and renouncing the conceptual thinking that would "transgress" this limit, Heidegger resorts to more and more recondite neologisms, questionable etymology and unhistorical histories, couched in a counter-conceptual, quasi-theological and poetizing language that speaks in earthy woodland metaphors of paths, turnings, inns, clearings, backtracking, harkening and so forth, just as Nietzsche liked to speak of mountains and clear air. The result is an arcane language that the most practised academics learn to speak only with difficulty, with ceaseless debate over lexical nuances even then. This abstruseness is not just a weakness, however, but deliberate on the part of the author who explicitly pronounces conceptual language to be inappropriate to the standpoint of the finite subjectivity he would establish and articulate. As what he means to say is thus intentionally and in itself contrary to thought and cannot in principle be articulated in any clear way, it can be grasped only aesthetically, or better, subjective-existentially, which is of course the whole point.

39 The aim of the 20th century schools was to avoid the paradox entailed in direct confrontations with philosophical reason by developing meta-philosophical disciplines that could claim to be independent of it while setting its limits and effecting its decisive critique - and doing so philosophically. It would be a thinking-beyond-thinking, a radical thinking; "ultra-philosophy" in the proper sense of the term. In an inverted replay of the stoic and epicurean dogmatisms which sought a philosophical freedom in but not of the world, the meta-philosophers of Language and Existence promised disengagement from the thought-world of traditional morality and metaphysics and triumphant return to the human here-and-now world of positive fact and authentic existence.[25] This they would
accomplish through new ways of thinking that dissociate themselves from the philosophical legacy while remaining critically engaged with it. Linguistic analysis allows weighty issues of philosophy to continue to be addressed, while at the same time assuring a complete and utter detachment from them. So also existential ontology, which represents being as what is forever concealed in every attempt to comprehend it in thought, but which declares itself nonetheless in poetic intuitions which not only supersede thinking but claim to be thinking itself at its deepest and most penetrating.

40 The more these ultra-philosophical programmes came to dominate 20th century inquiry the more professionalised and esoteric they became. From the original revolutionary enthusiasm of a decisive redirecting of thought to the human world and an absolute individual freedom within it, philosophy withdrew into a nether-world of industrious paper-work, of interminable critiquing of critiques and circular interpretation of interpretation addressed to the so-called "literature", that is, chiefly to its own journalistic productions. Drawn into this purely intellectual process, the ordinary issues and ideas that might spontaneously occur to a genuinely philosophical spirit are institutionalized and dissipated in highly specialized forms of argumentation. What passed for the teaching of philosophy became largely a matter of the inculcation of the orthodox watchwords, formulae and conventions required of any who might elect to participate in the esoteric business of academic seminars and research, so that the meta-philosophical schools tended finally to degenerate into a kind of scholasticism.

41 The reason for this lies in the way the ambivalence of the ultra-modernist project recurs in the case of meta-philosophy. Its very idea depends on assuming a double-tiered thinking: a division between a first-order, uncritical thinking that in the case of philosophy spawns illusory knowledge, and a second-order thinking which knows nothing itself but is purely critical. Everything depends on keeping these two strictly separated: second-order critique must not be confused with a first-order knowledge - the axioms of logic are not facts, the epoché is not a psychological event, linguistic analysis is not a Cambridge language-game. And likewise, the basis of first-order knowledge must not be the product of second-order reflection but be given independently - facts are just there, language is ordinary behaviour, the encounter with being is pre-reflexive. Yet the nature and limit of first-order knowledge is precisely what second-order critique claims the right to dictate, though it can never say where it gets its criterion for so doing. If it simply asserts it, that is arbitrary; if it appeals to some theoretical justification it becomes itself a first-order knowledge; if it applies the same criterion to itself - as if linguistic analysis were itself a language-game, or Heideggerean being another way being is present - then it becomes reflexively circular.[26]

42 The fate of meta-philosophy is thus that the need to hold these two sides apart keeps foundering on their incipient reciprocity and vice versa. The objective of a final and decisive meta-philosophical critique fades as argument and meta-argument pass inexorably over into one another and as critique inevitably becomes theory and theory
evokes the need of new critique. This inevitable collapse into a vortex of mutual contradiction may appear to be somewhat arrested by stop-gap measures, such as Gadamer's hermeneutical circle which would artificially hold the moments of this reflexivity apart and set them into an endless series. But as Kant pointed out, a series with no beginning or end has no decidable interim locus either, so that in truth no interpretation of an interpretation can be significant and is in fact meaningless precisely so far as circular. Reflexivity is thus the reef upon which the whole meta-philosophical ideal is bound to founder.[27]

III. Post-Philosophy: the Sceptical Result

43 Post-modernism springs from recognition of the insufficiency of earlier, dogmatic forms of ultra-modernism. Though frequently presented as a new and original view, it does not really take thinking in any new directions but continues the directions of ultra-modern thought a further stage. It sees that liberation is not achieved through meta-arguments that, in seeking to limit the standpoint of philosophical reason, only tacitly recognize it, thereby reinstating the same issues and conundrums of traditional thought in another form. Post-philosophy will go further to affirm the bankruptcy of all principle-centred thought as such, "logocentrism", whether traditional, counter-traditional or meta-traditional. It will no longer even pretend to bring philosophy to an end (though it may abandon it) for that is to assume there is such a thing and that it somehow makes sense to end it; and this is just what must be denied. To aspire to a final solution in philosophy, even one that would eradicate it altogether, is in any case only to establish some further regime in its place.

44 By sceptically suspending, not only first-order thought, but also the search for immaculate second-order critical conceptions, post-philosophy would seem to realize the essential aims of an ultra-modern overthrow without falling into the trap of simply reinstating philosophy again on the other side of the critical boundary-line. For even a negative ontology is still about being, symbolic logic still has rules and axioms, a strict science is envisaged beyond the epoché, and some semiotic theory or other is inevitably invoked in defense of the appeal to a pre-theoretical standard of words. If both counter- and meta-philosophical critiques only resurrect philosophy again, then how might the ultra-modernist project be refashioned such as successfully to accomplish its aim of a radical overthrow of the traditional thinking spirit of philosophy?

45 The post-modern answer is that we ought to resolve to rid ourselves from the very outset of the "prejudice" that there are such things as philosophical positions and arguments and that they make any sense; a prejudice which leads to another, namely that it is up to us to expose them as false by carrying out their decisive critique and declaring
an end to philosophy. The new scepticism will suspend all such assumptions outright; it will not seek to promote any new first-order insights but neither will it advance any new critical methodology, for that too becomes irrelevant once the illusions that there are philosophical positions, correct arguments, true judgements and so on have all been put to rest. Its sole object will be to point out how all discourses of the kind which pretend to a privileged viewpoint from which to execute true judgements of universal accounts of the world are spurious; and not spurious from the point of view of some alternative, more "correct" account, but spurious in themselves. It follows that all attempts to carry out a critique of such accounts participate in that discourse and so are equally to be judged spurious.

46 Post-modern thought thus represents the sceptical turn which no longer seeks either the dogmatic or critical repudiation of philosophy because it has come to the view that all arguments for or against rational foundations are in themselves pointless. If it remains "philosophy" at all it is only as post-philosophy, the reflection which seeks no more than to convince the philosophical legacy of its own self-defeated irrelevance. This it might do in a number of ways: by juxtaposing or recontextualizing fragments of texts drawn from the literature to expose the alleged self-conflictual nature of philosophical arguments: that they flout their own rules, contravene the very axioms they disavow and even conflict with the philosopher's personal character.[28] Or, it might redefine philosophy as nothing more than a type of cultural narrative, specifically "meta-narrative", and then argue on grounds of the relativity of culture the illegitimacy of that genre. Or it might commence with the pragmatic requirements of the extant democratic societies showing how their interests and advancement must take precedence over philosophical rumination which, if it might have once had a value, now only deflects and confuses the commitment of progressive individuals to the open society.[29]

47 Though post-philosophy takes many forms the common theme is sceptical in the broadest sense. For the perspective for which the futility of all reasoned argument has become axiomatic, after all, there can no longer be talk of positions or critical refutations thereof. Adopting no position, philosophical or meta-philosophical, the post-philosopher occupies a "non-locus" on the boundary between philosophy and its negation, from which vantage point to interrogate positions and counter-positions in such a way as will simply allow their self-refuting tendency to do its own work. Thus Derrida:

I keep myself at the limit of philosophical discourse ...for I do not believe in what today is so easily called the death of philosophy...[30] I have attempted to find...a non-site, a non-philosophical site, from which to question philosophy. [This] search for a non-philosophical site does not bespeak an anti-philosophical attitude. My central question is: how can philosophy as such appear to itself as other than itself, so that it can interrogate and reflect upon itself in an original manner.[31]

Rorty uses similar language; he speaks of a rhetoric, "strong poetry" or small-p philosophy whose specific business will be to take large-P Philosophy to task, to force it to give up on itself. Such a thinking which withdraws from itself to interrogate or renounce itself has already abandoned the option of taking a stand within or outside philosophy. It stands aloof from the tendency within philosophy to surmount oppositions, reduce differences to unity and give itself a transcendental content; but it equally disdains to stand outside philosophy passing judgement on this tendency from some other position (science, meta-logic, praxis, poetry or whatever). In the interest of a more complete undermining of thought it lets ambiguities stand, embraces the metaphoric, undecidable character of meaning, and pursues "philosophy" only as the means to an ironic suspension that sets every philosophical issue whatever, including all resolutions thereof, in abeyance.

48 This of course is the classical form of all scepticism. In lieu of categories, axioms or methods its appeal is to tropes, rhetorical devices whose function is not to prove or disprove anything but to effect the epoché which sustains detachment from all reasons and arguments. Derrida's trope is "différance", described as neither concept nor technique but the dynamic that predetermines all meaning as differential/deferential rather than identical/referential. It is advanced as "the common root of the oppositional concepts, sensible-intelligible, intuition-signification, nature-culture" (also word-idea, being-thought, ontic-ontological, writing-speaking etc.). The "logocentric" thinking of philosophy prejudices one term in a dichotomy and represses the other so as to bring it to "presence" and to link it to some fictional "transcendental signified" seen as the object of a fictional intuition of thought - "idea", "being" etc. - which is thereby made immune to ambiguity or controversy. To reverse this metaphysical tendency, as critiques of metaphysics do, simply by affirming the opposite term - matter rather than mind, say - is only logocentrism again since "every transgressive gesture, precisely by giving us a grip on the closure of metaphysics, reencloses us within this closure."[32] To restore the priority of différance, philosophical and meta-philosophical positions are "interrogated" to reveal how metaphoric instability still clings to and corrupts their terminology and unsettles the attempted fixations of meaning by which they would sublate ambiguity only to retain it in covert ways.[33]

49 Rorty is a "positive" sceptic in that the standpoint from which he would subvert and finally abandon philosophy springs from practical considerations, namely, what is necessary to advance the cause of "post-modern bourgeois liberalism".[34] Pragmatism is of course scepticism's other face, its ethical counterpart, as in ancient times. Rorty's is not the approach of the exquisitely erudite European who knows how to make words and texts "tremble" and shatter every meaning into a maelstrom of nuances and conflicting associations. He is the no-nonsense American pragmatist who has learned from James and Dewey how to caricature philosophical verbalisms to make them ridiculous in the eyes of common-sense individuals confident of their objective freedom. Rorty also disdains to debate philosophy on its own terms; rather he would challenge classical philosophical notions of the thinking subject or a reason mirroring nature on ideological grounds rather than in the quasi-metaphysical mode of semiotic analysis. He sees the acceptance of philosophical beliefs as inimical to the openness to practical possibilities that is essential to the advancement of "liberal society", which he describes post-philosophically in terms of actually extant, ethnic-historical collectivities, namely "the rich North Atlantic democracies" whose survival is for him all that matters.[35]

50 This Anglo-American pragmatism contrasts sharply, of course, with the anarchic sensualism of the French post-modernists; but it is evident from the esteem in which they hold one another's work that it is quite the same interest that moves both a Rorty and a Derrida.[36] For Derrida, interrogating the extant legacy of philosophical writing has the end, in the Nietzschean tradition, of an aesthetic suspension of assent to all objective accounts of existence; Rorty's rhetorical interrogation of Philosophy, on the other hand, employs irony, satire and rhetoric to loosen habitual attachments to theoretical abstractions, thereby to strengthen communal solidarity among contingently constituted individuals. The aim and effect is in general the same: the conservation of a radically concrete individual freedom through the deliberate subversion of the abstract perspectives of reason.

51 Common also to Derrida and Rorty is the view that meta-philosophy - the standpoint equally of Wittgenstein and Heidegger - is no longer supportable. Such methods could not complete the decisive overthrow of reason because even though claiming to occupy purely critical and thus "presuppositionless" positions they founded philosophical positions nonetheless: a counter-metaphysics of temporal as opposed to infinite being, a meta-logic of fact opposed to a logic of thought, a transcendental deduction from contingency rather than apperception, a semiotic a priori replacing an epistemological one. This could not arrest and suspend the dominion of philosophical thought but only divide it into two, a traditional and a contemporary philosophy, the western-traditional legacy and its meta-philosophical critique, the corpse and its autopsy. Oppositionally dependent on the very legacy they would overthrow, they were doomed to remain entangled in it, the older tradition persisting in the new meta-philosophical doctrines as their specifically negated content.

52 Post-philosophy would rather accomplish the sceptical neutralization of philosophy, not by direct refutation, but by sceptically-pragmatically construing all its positions to make them appear self-refuting, to generate their own contrariness, or to collapse, as it were, under their own weight. This tactic again shows little interest in, or respect for, the actual history of philosophy, for it gives no credence even to the idea that there is such. This amounts to a licence to manhandle traditional authors and texts. Rorty cites Representation as the ruling myth of philosophy and assimilates virtually the whole of the western tradition to this one idea, by which he understands the invention of fictitious faculties or media (first Thought and more recently Language) whose real purpose is to establish some static perception of things as absolute and permanent; a view anathema to liberals. Derrida rather speaks of a long-standing addiction of philosophy to the idea of Presence, similarly an invention of universal, self-given objects - "nature", "spirit", "being" - whose intent is to enable the denial of what Nietzsche calls "Life" and Derrida the inescapable ambiguity and uncertainty intrinsic to the determination of meaning.

53 Both belabour Descartes, Kant, Heidegger and many others by way of exemplifying these alleged self-contradictory artifices of philosophy, and this without much regard to the actual history of thought, which in fact offers precious little confirmation of such a consistent record of specific delusions and indeed a great deal of plain evidence to the contrary.[37] But for post-philosophy this is not to the point, since it is neither its aim nor intent to be an objective interpretation of philosophical history, the validity of which it in any case roundly denies. Rather, as with ancient scepticism, the sweeping judgements and clever reworking of the arguments of an Aristotle or Hegel or Nietzsche in order to "demonstrate" the alleged self-inconsistency of philosophical positions are sceptical tropes whose sole purpose is to maintain a post-modern detachment from the standpoint of thinking reason, whether in its universal or its historical manifestations.

54 But in post-philosophy the original ultra-modernist paradox, as to how an end to thinking may be thought, is again not really resolved but only brought more vividly to light. For not only is the ambiguity of its outlook patent in the tortuously obscure and deliberately indecisive rhetoric in which it is obliged to couch its thesis, but also in the self-subverting character of the task that it sets itself. For what it attempts is to abjure in principle every appeal to principle; to render the absolute indeterminacy of meaning meaningful; to deny that logic has force and then turn the logic of positions against themselves; to affirm categorically and as a global judgement that no overview is ever possible; and so on.

55 Were post-philosophy indeed to fall victim to the temptation to give itself a definite content (Rorty is often suspected of such for his blatantly liberal assumptions and Derrida for a tendency to relapse into semiotic theory) it would cease to be authentically post-philosophical and become just another meta-narrative - the problem to which Lyotard was sensitive. It is therefore essential to post-modern thinking that it not be "about" anything, or at least not allow itself to say what it is about. The only "content" it has is to be the relentless, subversive, inconclusive reflection carried out on an extant philosophical literature which, paradoxically, it is bound to conserve in order to sustain itself through the continuous deconstruction of it.

56 Through its very project, then, post-philosophy becomes a wholly intellectual activity without result, thematic substance or reference. In it the paradox implied in the attempt to think beyond thinking is no longer merely latent, as in earlier ultra-philosophy; it is this paradox itself in the active form of a self-annihilating thinking. The restrictions it would set on all reasonable argument prevent it from arguing its own case with reason, that is, intelligibly. Perched on a sceptical fence it must withdraw in one moment what it asserts in the next: it says philosophy is about the writing-reading of texts and then again that there are no texts; or that philosophy is an open, deliberately inconclusive conversation and then draws the boldest, dogmatic conclusions about all and sundry. That post-modern writing is given to wilful inconsistency, to ambiguous sleights of language, or has recourse to comic, anarchistic or even pornographic rhetoric expresses the predicament that it may never allow itself to say what it means, identify a theme, or reach a conclusion, for to do that would undermine the purity of the "post-philosophical" non-thinking it would sustain.

Conclusion: The Recovery of Philosophy

57 In post-philosophy ultra-modernist thought reaches both an impasse and a completion. Its project radically to affirm the modern principle of a concrete, here-and-now freedom in contrast with the other-worldliness of the spiritual-speculative tradition is articulated in its most extreme form. In its purely sceptical reflection on the philosophical legacy it is itself the attempted embodiment of the paradoxical idea of a self-annihilating thinking. This is far from saying, however, that it has at last succeeded in finally overthrowing and nullifying thought so that it really is now all over for philosophy. On the contrary, post-philosophy, even more than earlier forms of ultra-philosophy, remains tied to the tradition it disavows. By its own admission it cannot think to bring about the actual end of philosophy, for that would not only be to revert to an ultra-modernist dogmatism whose very difficulties it was meant to overcome, but also to eliminate the very context whose deconstruction alone is what sustains it. And so it can only remain on the sceptical margins and boundaries, a purely suspensive thinking unable either to go beyond philosophy or return to it.

58 If the outcome of ultra-modern thought since Hegel has indeed been the destruction of philosophy, this ought not to be understood as the direct consequence of its arguments but rather as a significant side-effect. While it is true that appreciation of the basic standpoint and argument of the great western philosophical texts has atrophied or been distorted and maligned to the point of extinction, this is not due to the success of scientism or Marxism or analysis or existential ontology or post-modernism in literally disproving, demystifying, repudiating, exposing or disposing of it. Rather it is due to the real history of philosophy, the actual tradition of thought, having been buried and obscured under so many layers of misinterpretation and distortion visited upon it by generations of aggressive ultra-modernist dogma that it has become barely recoverable. For as earlier made out, not only is there a history of the ultra-philosophical argument as such, but also a history of its various reconstructions of the philosophical legacy, reconstructions which had little or nothing to do with that legacy itself or with understanding it on its own terms, but with enlisting it, appropriately misconstrued, in support of one or another version of the argument for a radicalized modernity. As the form of the ultra-philosophical dogma changed, so did the form of the attack on the philosophical tradition, and so also the form of the reconstruction of it.

59 And its point in all this was to retain a relation to philosophical history even while superseding it; to conserve itself as "philosophy" through appeal to negative reconstructions of the whole tradition of reason in lieu of a first-order appeal to it, which, in the interests of the affirmation of a radical subjective freedom and finite humanity, it would avoid. Thus what was unique about the attack on the philosophical tradition which has here been called counter-philosophy is its apocalyptic outlook; its view of being itself the legitimate issue of philosophical history whose final chapter it would write. Thus for nineteenth century scientism the upshot of intellectual history is the final conquest of the liberal-scientific spirit over a pre-enlightened cultural past epitomized in religious and metaphysical superstition. Absolutism on the other hand would find liberation in escape from a dehumanized, reason-ridden past into a present existentialized subjectivity. Both would repudiate philosophical history and give starkly contradictory accounts of it. For their sole interest in history was imaginatively to exploit it as a means of furthering a contemporary confrontation between contrasting views of what ultra-modern liberation means: for one the triumph of humanism and technology over a benighted past, for the other a triumph of subjective life over abstraction and morality. The point is, for all their popular influence, the narratives which Nietzsche, Marx and their contemporaries imposed on the history of western art, religion and philosophy were not only mutually contradictory, they were fictional and ideological, not really "histories" at all. Yet these not only still enjoy a preeminence, but compete with, and have largely supplanted, the comprehension of the authentic western legacy on its own terms.

60 The meta-philosophies of the 20th century are extensions of absolutist and scientistic beginnings but differ in no longer seeing themselves as a culmination of world-philosophical history but as opposing to it entirely new, "contemporary" insights into the foundations of philosophy itself. They would thus seek to occupy an independent ultra-modern standpoint from which to view the arguments of the past in terms of the basic misconceptions on which they were alleged to rest, which would now be their business critically to reexamine and correct. Their approach to traditional philosophy would be to root the whole of it in some alleged specific fallacy - forgetfulness of being, misuse of language, wilful transcendence of fact, a category mistake. Accordingly the great classical works were energetically reviewed, rewritten and retaught from some such perspective - Heideggerean, Rylean, Wittgensteinian - with the result that by mid-century a whole new generation of academic philosophers had become thoroughly imbued with reconstructed interpretations of Plato, Spinoza or Kant which not only openly conflicted with the originals but violently with each other.

61 The history of philosophy was thus the object of a systematic, comprehensive distortion from which it has yet to recover, carried out in order to legitimize contemporary concepts that would lay hold of and express the absolute commitment to a present, con-temporal human self-consciousness and world. The enterprise fell into two general camps, an Anglo-American which positively embraced a behaviouristic anthropology and liberal-technocratic ideals and chiefly enlisted logic and language in its service; and a Continental-European which sought to refuse and stand against just this humanist, technocratic modernity through a cultivated pessimism which would turn philosophical thinking into a kind of ponderous lament that might fill the void created by the loss of a metaphysical tradition.[38]

62 For post-modernism again, the history of thought as a whole is judged no longer meaningful, so that even the distinction between contemporary and traditional philosophy is likewise meaningless. The ruin of the western cultural legacy lies at its feet; it constructs, reconstructs or deconstructs it at its pleasure since the life has gone out of it. If, as Rorty puts it, philosophy may once have been a useful tool for articulating the ideal enhancement of the human condition, in a liberal-technocratic society that is already free it must be abandoned as an outmoded, irrelevant relic. Or, as Derrida suggests, while there can be no desire to rejuvenate a wholly discredited philosophical literature, there might still be a virtue in rummaging about in its rubble to confirm and remind ourselves of our intellectual emancipation from all its reasons and positions.

63 With the idea of a meaningful tradition thus put to rest one way or the other, everything can now be put on one post-modernist plane; Plato can enter into dialogue with Freud, Gide be mated with Hegel, rock poets refute Kant, the western canon be dumped because patriarchal, or Christian theology be daily refuted in undergraduate seminars. In academe a belligerent antipathy to the whole legacy of reason has become pervasive, inclusive not only of historical culture but also of modernity itself. This outlook is sustained through popular declamations against the relevance of the past or against the very idea of reasoned argument having any exclusive rights in the aftermath of the overthrow of intellect; or else through exquisitely convoluted, literary-aesthetic arguments that would so thoroughly fragment and relativize meaning as to prevent any possible recurrence of the dread disease of definite thought.

64 What ultra-modernism would articulate is the extreme ideal of Modernity as fully and literally actual, a concretely present condition in which every reality or value has been thoroughly assimilated to the interests and perspectives of existing individuals who are subjectively convinced of their absolute freedom and of the world as subordinate to that freedom. This human-existential condition it affirms as one already or virtually accomplished, thus such as exists before all mediations of history, culture or thought. For this reason it violently disengages itself from such mediations, even those of its own western-intellectual legacy from which it draws its ideals and its language. To the latter's notion of a reasonable, universal and objective freedom it opposes the contrary extreme of a finite, temporal, pragmatic, contingent and wholly subjective one. But as this latter vision in the end is bound to contradict its own very ideal of a concretely realized human freedom it falls into a scepticism where freedom itself becomes dissipated, confused and degenerate.

65 For in its post-modern form ultra-philosophy has discovered that since it can never complete the intellectual overthrow of reason, its only recourse is sceptically to suspend or abandon it. But in this it forfeits all legitimacy as philosophy and reaches an impasse beyond which, as it itself admits, it is impossible to go. In this sceptical form the ultra-modernist revolt is thus paralysed in its tracks; it can neither establish any position beyond the philosophical tradition nor can it return to it, nor can it give it up. The worlds now confronting each other are no longer some one ultra-modernist doctrine set against another - scientism contra absolutism, liberalism contra existentialism - nor the triumph of "contemporary" over "traditional" philosophy. It is now the philosophical legacy as a whole in its historical integrity on the one hand, and the utterly destroyed, annihilated post-modern account of it on the other. Thus, it can no longer make sense either to remain attached to the ultra-modernist critique or to the one-sided defense of traditional thought as against it. From a viewpoint no longer intimidated by the biases which have dominated the past two centuries it has become feasible to begin to speak of the ultra-philosophical project as having reached its limit, making it possible to recover again the connection between this revolt and the actual philosophical legacy it thought to abandon. The issue thereby shifts to become that of how the western tradition is after all to be reconciled to its ultra-modern critique, or contrariwise, how the ultra-modern demand for a concrete and worldly human freedom is to recover its roots in philosophical world-history. This implies a number of obvious challenges: to reinstate and liberate the authentic philosophical legacy from its ultra-philosophical distortions; to revisit the question as to what inspired the ultra-modernist revolution, what underlies its hostility to the philosophical spirit, and how it came to its present post-modernist impasse; overall to restore the sense of the unity, wholeness, continuity and substance of world-philosophical culture as comprehensive of and moving beyond the now tiresome negativity of the ultra-modernist preoccupation with a history "broken in two".[39]

NOTES

[1] Heidegger in The Question of Being (New Haven 1958) also recognizes that any simple counter-metaphysical "passing over the line" is problematical, though he does not resolve it. Derrida also has argued that, though known as spurious, it is important the classical philosophical arguments not simply be set aside but continue to be taught and studied, since it is only their active deconstruction that sustains a post-philosophical awareness.

[2] Ironically, the traditional logic has long known just these arguments as modes of informal fallacy, e.g. equivocation, amphiboly, ad baculum etc.

[3] Madison, G.B. Working Through Derrida (Evanston, 1993), p.2.

[4] Marx and Nietzsche both independently employ this phrase, though their accounts of the old history ending and the new beginning are the exact inverse of one another. The point is elaborated in my "The Revolutionary Origins of Contemporary Philosophy", Dionysius, ix (1985), 129-171.

[5] The key to "positivism" lies in the claim that objectivity and self-consciousness are not two realities but one, and that this is revealed in the simple intuition of fact. The proof is said to be directly witnessed in the "absolute fact" of self-feeling, the immediate givenness of oneself to oneself. Comte makes self-feeling, as opposed to the dualistic theoretical and practical perspectives, the basis of an identification of the objective with the phenomenal, and Mill and Russell likewise cite "feeling" as the final test of the certainty of fact, in ordinary parlance, the criterion of "obviousness".

[6] Hegel, on the other hand (Enc. 249 Zus.), makes the provocative suggestion that both evolution and emanation (the fundamentalist "creationism") are metaphysical schematizations of nature which begin at one extreme (proto-biological or ultra-biological) and deduce the whole of the order of species from it as an abstract series; neither of which really grasps the dynamic of the totality of nature as one that is objectively concrete.

[7] The absolute is no less Anglo-American than it is European - cf. Spencer, Bradley, Whitehead or Royce. It has its political expression in nineteenth century imperialism with its reverence for Queen or Kaiser as an "absolute individual", Der Allerhöchste.

[8] Hardy's The Dynasts features a whole Greek chorus of "absolute spirits" declaiming about the fatality of human history. Like many of his contemporaries, Yeats had a weakness for occultism and its political expression, the cult of nationalism.

[9] The starkest intellectual forms of the great revolutionary-reactionary debate were those of the nineteenth century, whose paradigms spilled over to fuel 20th century social, cultural and political tensions. The dilemma is still the subject of learned (though tamer) debates among prominent contemporary philosophers: see After Philosophy: End or Transformation? (Cambridge, 1987).

[10] That the notion of an actual freedom underlies the romantic identity of self and reality in self-feeling is exemplified in Nietzsche's definition of will to power as the "instinct to freedom", which he everywhere opposes to the unreality of a merely moral freedom.

[11] A more extended account in Jackson, F.L., "The New Faith: Strauss, Kierkegaard and the Theological Revolution" (Dionysius, xii, 1988) and "The Beginning of the End of Metaphysics" (Dionysius, xv, 1991).

[12] The thesis of his Principles of Philosophy. Feuerbach also describes as his first principle "not the substance of Spinoza...the ego of Kant [or] the absolute spirit of Hegel, but the true ens realissimum - man." (Essence of Christianity, tr. Eliot, p.xxxv).

[13] Stirner, M. The Ego and His Own, tr. Byington (Sun City 1982).

[14] Marx: Theses on Feuerbach, I.

[15] For example, Beyond Good and Evil, ss. 47, 56.

[16] "Meta-philosophy" in its ultra-modern sense is ambiguous, since the "beyond" it refers to is not the "meta-" of traditional "meta-physics", thinking that goes beyond the finite, but the reverse. Meta-philosophy's "beyond" would leave the thought-world behind for a here-and-now defined in specifically counter-metaphysical terms of language, temporality, the fact-world, Dasein etc. Its appropriate image is Nietzsche's Zarathustra, who climbs down the mountain into wisdom. The "going-beyond" is thus really "meta-meta-physical"; a drawing-back from first-order contexts of thinking (ethics, logic, ontology) into a second-order counter-thinking: (meta-ethic, meta-logic, meta-ontology).

[17] Russell's The Scientific Outlook (1931) and Ayer's Language, Truth and Logic (1936, c.2) are typical manifestos of this fundamental collusion of analysis with empirical science.

[18] Given the ultra-rational stand of Carnap and others it would have been more proper to speak of a verification criterion rather than principle; a "criterion" is a dogmatic device, a "principle" implies a reason.

[19] Wittgenstein's specific views on philosophy are found in Philosophical Investigations, ss. 89-133.

[20] Philosophical Investigations, s.206.

[21] Though he thinks of it very differently; see for example Cartesian Meditations (The Hague, 1960) ss. 3-7, where scientific evidence is spoken of, not in Russellian terms of factuality, but in terms of "apodeictic certainty" or "givenness". "The evidence for the factual existence of the world [is] not apodeictic" and is thus to be included in the "Cartesian overthrow" (p.17). Husserl's narrow ties are with nineteenth century psychologism (Brentano) and historicism (Dilthey). Like Russell he betrays a notorious naivety respecting the actual history of philosophy, blaming Hegelian metaphysics, for example, for the degeneration of the idea of a "philosophical science" - a charge that would mystify Hegel, who speaks of little else. But it is not really naivety that renders meta-philosophical accounts of the tradition characteristically cavalier and skewed but the deliberate intent negatively to reconstruct it to suit the ultra-modernist thesis. This is evidenced by the simple fact that the ways in which phenomenology and analytical philosophy understood the history of metaphysics are not just different; they are mirror images of one another.

[22] Phenomenology had indeed a strong impact on 20th century psychology. Not only had Sartre, Jaspers and many others written extensively on psychological subjects, but "phenomenological psychology", owing much to Merleau-Ponty's seminal work, The Phenomenology of Perception, became for a time in universities everywhere the chief rival to the Skinnerian behaviourism which tended to be the model championed in analytical circles.

[23] A Heideggerean account of Heidegger's "turn" is found in Nicholson, G., Illustrations of Being (Toronto 1992), c.4.4.

[24] A clear forerunner of Heideggerean being is "the world as will" as Schopenhauer described it: the wholly inscrutable manifestation of an absolute being-in-self that is directly the annihilation of what is so manifest.

[25] Captivated by the Platonic vision of an oasis of reasonable life removed from the shifting sands of world-bound opinion, stoicism and epicureanism were able to attain to such only in the limited form of an inward, detached self-consciousness to which an ineradicable outwardness and arbitrariness still clung. (See Hegel, Lectures on the History of Philosophy (New York, 1974), v.2, sect. 2.) Ultra-modern dogmatism takes a reverse course. Its vision is a modern-Christian one of a divine-human reconciliation whereby a free, rational, human spirit has reengaged the world to redeem it. But though it is just this concrete spirit that ultra-modernism would get hold of, it only does so in a partial and one-sided way, such as loses hold of the universal aspect of reason and freedom and sinks itself entirely into their finite and existential expressions.

[26] Logical positivism was never able successfully to come to grips with paradoxes of self-reference, to formalize the reference of propositions to what they denote, or to avoid an empiricist metaphysics without lapsing into solipsism. Likewise, in insisting that thinking is just language, later analysis could sustain itself as philosophy only by turning into a metaphysics of words. Again, the ontological reflection that would repudiate any universal account of things by dint of the sheer finitude of existence is forced to exempt its own account from the same ban or else risk collapse into a banal absurdism. And the argument in its later form could never complete its turn to a stable thought of being as time, since this would contradict what being is said to be, namely temporal, self-differential.

[27] An example of the emerging consciousness of this fact is found in Hilary Lawson, Reflexivity, the Post-Modern Predicament (LaSalle 1985).

[28] This appeal ad hominem was one of Nietzsche's favourite tactics: consider his diatribe against Strauss, his essay Contra Wagner, the vitriolic attack on Aquinas in the Genealogy of Morals, or the chapter of Beyond Good and Evil titled "On the Prejudices of Philosophers". A recent post-modern example is Derrida's Glas (U. Nebraska, 1986), in which Hegel's relations with his sister and others are made to appear perversely at odds with his philosophy of the family. Such attacks are of course not "personal" in the strict sense but they reflect a basic ultra-modernist prejudice which refuses to accept that "thought" can have any other meaning beyond the thought of some particular, finite individual.

[29] What Hegel called "objective spirit" becomes in its ultra-modernist reformulation by Marxists, English liberals and American pragmatists the apotheosis of the practical, which assimilates all other dimensions of freedom to itself. With Richard Rorty it assumes a post-modern form which no longer speaks of a "free society" as a desired end-state or achieved revolution, as with earlier liberals or socialists, but only of the contingent commitment of individuals to an undefined social openness.

[30] Derrida, Positions (Chicago 1981), p.6.

[31] In an interview with R. Kearney, in Kearney, Dialogues with Contemporary Continental Thinkers (Manchester UP, 1984), p.98.

[32] Positions, p.12.

[33] Derrida goes to great lengths to repudiate the logic of Aufhebung in an effort to represent the wish to transcend difference as what primarily moves Hegel's logical thought. This runs surprisingly contrary to how Hegel himself represents the dynamic, for example in Encyc., ss. 79-82, or how he treats difference itself in ss. 117-120 and in the Doctrine of Essence of the Science of Logic where, far from "transcending" difference, Hegel demonstrates how, in the concept of Ground, difference and identity are revealed as presupposing one another. Derrida, on the other hand, would fix difference as absolute.

[34] Rorty, R. Objectivity, Relativism and Truth (Cambridge 1991), pp.197-202.

[35] Objectivity, Relativism and Truth, p.15.

[36] Derrida's positive attachments to America are well known; for reciprocity on Rorty's part see for example "Is Derrida a Transcendental Philosopher?", in Madison, G.B., Working Through Derrida (Evanston 1993).

[37] To depict Descartes as a "representationalist" as Rorty does entirely affronts the actual Cartesian argument, which commences precisely with the suspension of representational assumptions in order to proceed from self-conscious thought alone. Similarly, Derrida's quite silly account of Leibniz's logic or religious ambitions, or his free-form Freudian speculations on Hegel's feelings for his sister, are the purest flights of trivializing invention, showing an almost perverse disdain both for the individuals and their arguments. But again, for post-modernists the point is never the historical arguments themselves but only how they may be exploited for the construction of their own sceptical tropes, as post-modern architects freely borrow from the styles of the past without much regard for their original spirit.

[38] Colourful insight into the existential sense of loss of a metaphysical tradition as the root of the Heideggerean account of "denken" is provided by Rorty in "Overcoming the Tradition" (Consequences of Pragmatism, Minnesota, 1982).

[39] The important point is that the ultra-modernist revolt did not "reject" the philosophical tradition; it co-opted and misconstrued it in order to take certain of its key principles to their extreme. It is this legacy of co-optive distortion that has, however, rendered the present time largely incapable of philosophy as traditionally understood since it no longer has a clear idea what it is or was on its own terms. The recovery of the actual western tradition of thought is for this reason a paramount challenge of the times, and a prime objective of Animus, the journal in which the present essay appears.

FINAL DRAFT for World Congress of Philosophy August 13, 1998

Daniel C. Dennett

Postmodernism and Truth(1)

Here is a story you probably haven't heard, about how a team of American researchers inadvertently introduced a virus into a third world country they were studying. They were experts in their field, and they had the best intentions; they thought they were helping the people they were studying, but in fact they had never really seriously considered whether what they were doing might have ill effects. It had not occurred to them that a side-effect of their research might be damaging to the fragile ecology of the country they were studying. The virus they introduced had some dire effects indeed: it raised infant mortality rates, led to a general decline in the health and wellbeing of women and children, and, perhaps worst of all, indirectly undermined the only effective political force for democracy in the country, strengthening the hand of the traditional despot who ruled the nation. These American researchers had something to answer for, surely, but when confronted with the devastation they had wrought, their response was frustrating, to say the least: they still thought that what they were doing was, all things considered, in the interests of the people, and declared that the standards by which this so-called devastation was being measured were simply not appropriate. Their critics, they contended, were trying to impose "Western" standards in a cultural environment that had no use for such standards. In this strange defense they were warmly supported by the country's leaders--not surprisingly--and little was heard--not surprisingly--from those who might have been said, by Western standards, to have suffered as a result of their activities.

These researchers were not biologists intent on introducing new strains of rice, nor were they agri-business chemists testing new pesticides, or doctors trying out vaccines that couldn't legally be tested in the U.S.A. They were postmodernist science critics and other multiculturalists who were arguing, in the course of their professional researches on the culture and traditional "science" of this country, that Western science was just one among many equally valid narratives, not to be "privileged" in its competition with native traditions which other researchers--biologists, chemists, doctors and others--were eager to supplant. The virus they introduced was not a macromolecule but a meme (a replicating idea): the idea that science was a "colonial" imposition, not a worthy substitute for the practices and beliefs that had carried the third-world country to its current condition. And the reason you have not heard of this particular incident is that I made it up, to dramatize the issue and to try to unsettle what seems to be current orthodoxy among the literati about such matters. But it is inspired by real incidents--that is to say, true reports.

Events of just this sort have occurred in India and elsewhere, reported, movingly, by a number of writers, among them:

Meera Nanda, "The Epistemic Charity of the Social Constructivist Critics of Science and Why the Third World Should Refuse the Offer," in N. Koertge, ed., A House Built on Sand: Exposing Postmodernist Myths about Science, Oxford University Press, 1998, pp. 286-311.

Reza Afshari, "An Essay on Islamic Cultural Relativism in the Discourse of Human Rights," in Human Rights Quarterly, 16, 1994, pp.235-76.

Susan Okin, "Is Multiculturalism Bad for Women?" Boston Review, October/November, 1997, pp. 25-28.

Pervez Hoodbhoy, Islam and Science: Religious Orthodoxy and the Battle for Rationality, London and New Jersey, Zed Books Ltd. 1991.

My little fable is also inspired by a wonderful remark of E. O. Wilson, in Atlantic Monthly a few months ago: "Scientists, being held responsible for what they say, have not found postmodernism useful." Actually, of course, we are all held responsible for what we say. The laws of libel and slander, for instance, exempt none of us, but most of us--including scientists in many or even most fields--do not typically make assertions that, independently of libel and slander considerations, might bring harm to others, even indirectly. A handy measure of this fact is the evident ridiculousness we discover in the idea of malpractice insurance for . . . literary critics, philosophers, mathematicians, historians, cosmologists. What on earth could a mathematician or literary critic do, in the course of executing her professional duties, that might need the security blanket of malpractice insurance? She might inadvertently trip a student in the corridor, or drop a book on somebody's head, but aside from such outré side-effects, our activities are paradigmatically innocuous. One would think. But in those fields where the stakes are higher--and more direct--there is a longstanding tradition of being especially cautious, and of taking particular responsibility for ensuring that no harm results (as explicitly honored in the Hippocratic Oath). Engineers, knowing that thousands of people's safety may depend on the bridge they design, engage in focussed exercises with specified constraints designed to determine that, according to all current knowledge, their designs are safe and sound. Even economists--often derided for the risks they take with other people's livelihoods--when they find themselves in positions to endorse specific economic measures considered by government bodies or by their private clients, are known to attempt to put a salutary strain on their underlying assumptions, just to be safe. They are used to asking themselves, and to being expected to ask themselves: "What if I'm wrong?" We others seldom ask ourselves this question, since we have spent our student and professional lives working on topics that are, according both to tradition and common sense, incapable of affecting any lives in ways worth worrying about. If my topic is whether or not Vlastos had the best interpretation of Plato's Parmenides, or how the wool trade affected imagery in Tudor poetry, or what the best version of string theory says about time, or how to recast proofs in topology in some new formalism, if I am wrong, dead wrong, in what I say, the only damage I am likely to do is to my own scholarly reputation. But when we aspire to have a greater impact on the "real" (as opposed to "academic") world--and many philosophers do aspire to this today--we need to adopt the attitudes and habits of these more applied disciplines. We need to hold ourselves responsible for what we say, recognizing that our words, if believed, can have profound effects for good or ill.

When I was a young untenured professor of philosophy, I once received a visit from a colleague from the Comparative Literature Department, an eminent and fashionable literary theorist, who wanted some help from me. I was flattered to be asked, and did my best to oblige, but the drift of his questions about various philosophical topics was strangely perplexing to me. For quite a while we were getting nowhere, until finally he managed to make clear to me what he had come for. He wanted "an epistemology," he said. An epistemology. Every self-respecting literary theorist had to sport an epistemology that season, it seems, and without one he felt naked, so he had come to me for an epistemology to wear--it was the very next fashion, he was sure, and he wanted the dernier cri in epistemologies. It didn't matter to him that it be sound, or defensible, or (as one might as well say) true; it just had to be new and different and stylish. Accessorize, my good fellow, or be overlooked at the party.

At that moment I perceived a gulf between us that I had only dimly seen before. It struck me at first as simply the gulf between being serious and being frivolous. But that initial surge of self-righteousness on my part was, in fact, a naive reaction. My sense of outrage, my sense that my time had been wasted by this man's bizarre project, was in its own way as unsophisticated as the reaction of the first-time theater-goer who leaps on the stage to protect the heroine from the villain. "Don't you understand?" we ask incredulously. "It's make believe. It's art. It isn't supposed to be taken literally!" Put in that context, perhaps this man's quest was not so disreputable after all. I would not have been offended, would I, if a colleague in the Drama Department had come by and asked if he could borrow a few yards of my books to put on the shelves of the set for his production of Tom Stoppard's play, Jumpers. What if anything would be wrong in outfitting this fellow with a snazzy set of outrageous epistemological doctrines with which he could titillate or confound his colleagues?

What would be wrong would be that since this man didn't acknowledge the gulf, didn't even recognize that it existed, my acquiescence in his shopping spree would have contributed to the debasement of a precious commodity, the erosion of a valuable distinction. Many people, including both onlookers and participants, don't see this gulf, or actively deny its existence, and therein lies the problem. The sad fact is that in some intellectual circles, inhabited by some of our more advanced thinkers in the arts and humanities, this attitude passes as a sophisticated appreciation of the futility of proof and the relativity of all knowledge claims. In fact this opinion, far from being sophisticated, is the height of sheltered naiveté, made possible only by flatfooted ignorance of the proven methods of scientific truth-seeking and their power. Like many another naif, these thinkers, reflecting on the manifest inability of their methods of truth-seeking to achieve stable and valuable results, innocently generalize from their own cases and conclude that nobody else knows how to discover the truth either.

Among those who contribute to this problem, I am sorry to say, is my good friend Dick Rorty. Richard Rorty and I have been constructively disagreeing with each other for over a quarter of a century now. Each of us has taught the other a great deal, I believe, in the reciprocal process of chipping away at our residual points of disagreement. I can't name a living philosopher from whom I have learned more. Rorty has opened up the horizons of contemporary philosophy, shrewdly showing us philosophers many things about how our own projects have grown out of the philosophical projects of the distant and recent past, while boldly describing and prescribing future paths for us to take. But there is one point over which he and I do not agree at all--not yet--and that concerns his attempt over the years to show that philosophers' debates about Truth and Reality really do erase the gulf, really do license a slide into some form of relativism. In the end, Rorty tells us, it is all just "conversations," and there are only political or historical or aesthetic grounds for taking one role or another in an ongoing conversation.

Rorty has often tried to enlist me in his campaign, declaring that he could find in my own work one explosive insight or another that would help him with his project of destroying the illusory edifice of objectivity. One of his favorite passages is the one with which I ended my book Consciousness Explained (1991):

It's just a war of metaphors, you say--but metaphors are not "just" metaphors; metaphors are the tools of thought. No one can think about consciousness without them, so it is important to equip yourself with the best set of tools available. Look what we have built with our tools. Could you have imagined it without them? [p.455]

"I wish," Rorty says, "he had taken one step further, and had added that such tools are all that inquiry can ever provide, because inquiry is never 'pure' in the sense of [Bernard] Williams' 'project of pure inquiry.' It is always a matter of getting us something we want." ("Holism, Intrinsicality, Transcendence," in Dahlbom, ed., Dennett and his Critics. 1993.) But I would never take that step, for although metaphors are indeed irreplaceable tools of thought, they are not the only such tools. Microscopes and mathematics and MRI scanners are among the others. Yes, any inquiry is a matter of getting us something we want: the truth about something that matters to us, if all goes as it should.

When philosophers argue about truth, they are arguing about how not to inflate the truth about truth into the Truth about Truth, some absolutistic doctrine that makes indefensible demands on our systems of thought. It is in this regard similar to debates about, say, the reality of time, or the reality of the past. There are some deep, sophisticated, worthy philosophical investigations into whether, properly speaking, the past is real. Opinion is divided, but you entirely misunderstand the point of these disagreements if you suppose that they undercut claims such as the following:

Life first emerged on this planet more than three thousand million years ago.

The Holocaust happened during World War II.

Jack Ruby shot and killed Lee Harvey Oswald at 11:21 am, Dallas time, November 24, 1963.

These are truths about events that really happened. Their denials are falsehoods. No sane philosopher has ever thought otherwise, though in the heat of battle, they have sometimes made claims that could be so interpreted.

Richard Rorty deserves his large and enthralled readership in the arts and humanities, and in the "humanistic" social sciences, but when his readers enthusiastically interpret him as encouraging their postmodernist skepticism about truth, they trundle down paths he himself has refrained from traveling. When I press him on these points, he concedes that there is indeed a useful concept of truth that survives intact after all the corrosive philosophical objections have been duly entered. This serviceable, modest concept of truth, Rorty acknowledges, has its uses: when we want to compare two maps of the countryside for reliability, for instance, or when the issue is whether the accused did or did not commit the crime as charged.

Even Richard Rorty, then, acknowledges the gap, and the importance of the gap, between appearance and reality, between those theatrical exercises that may entertain us without pretence of truth-telling, and those that aim for, and often hit, the truth. He calls it a "vegetarian" concept of truth. Very well, then, let's all be vegetarians about the truth. Scientists never wanted to go the whole hog anyway.

So now, let's ask about the sources or foundations of this mild, uncontroversial, vegetarian concept of truth.

Right now, as I speak, billions of organisms on this planet are engaged in a game of hide and seek. It is not just a game for them. It is a matter of life and death. Getting it right, not making mistakes, has been of paramount importance to every living thing on this planet for more than three billion years, and so these organisms have evolved thousands of different ways of finding out about the world they live in, discriminating friends from foes, meals from mates, and ignoring the rest for the most part. It matters to them that they not be misinformed about these matters--indeed nothing matters more--but they don't, as a rule, appreciate this. They are the beneficiaries of equipment exquisitely designed to get what matters right, but when their equipment malfunctions and gets matters wrong, they have no resources, as a rule, for noticing this, let alone deploring it. They soldier on, unwittingly. The difference between how things seem and how things really are is just as fatal a gap for them as it can be for us, but they are largely oblivious to it. The recognition of the difference between appearance and reality is a human discovery. A few other species--some primates, some cetaceans, maybe even some birds--show signs of appreciating the phenomenon of "false belief"--getting it wrong. They exhibit sensitivity to the errors of others, and perhaps even some sensitivity to their own errors as errors, but they lack the capacity for the reflection required to dwell on this possibility, and so they cannot use this sensitivity in the deliberate design of repairs or improvements of their own seeking gear or hiding gear. That sort of bridging of the gap between appearance and reality is a wrinkle that we human beings alone have mastered.

We are the species that discovered doubt. Is there enough food laid by for winter? Have I miscalculated? Is my mate cheating on me? Should we have moved south? Is it safe to enter this cave? Other creatures are often visibly agitated by their own uncertainties about just such questions, but because they cannot actually ask themselves these questions, they cannot articulate their predicaments for themselves or take steps to improve their grip on the truth. They are stuck in a world of appearances, making the best they can of how things seem and seldom if ever worrying about whether how things seem is how they truly are.

We alone can be wracked with doubt, and we alone have been provoked by that epistemic itch to seek a remedy: better truth-seeking methods. Wanting to keep better track of our food supplies, our territories, our families, our enemies, we discovered the benefits of talking it over with others, asking questions, passing on lore. We invented culture. Then we invented measuring, and arithmetic, and maps, and writing. These communicative and recording innovations come with a built-in ideal: truth. The point of asking questions is to find true answers; the point of measuring is to measure accurately; the point of making maps is to find your way to your destination. There may be an Island of the Colour-blind (allowing Oliver Sacks his usual large dose of poetic license), but no Island of the People Who Do Not Recognize Their Own Children. The Land of the Liars could exist only in philosophers' puzzles; there are no traditions of False Calendar Systems for mis-recording the passage of time. In short, the goal of truth goes without saying, in every human culture.

We human beings use our communicative skills not just for truth-telling, but also for promise-making, threatening, bargaining, story-telling, entertaining, mystifying, inducing hypnotic trances, and just plain kidding around, but prince of these activities is truth-telling, and for this activity we have invented ever better tools. Alongside our tools for agriculture, building, warfare, and transportation, we have created a technology of truth: science. Try to draw a straight line, or a circle, "freehand." Unless you have considerable artistic talent, the result will not be impressive. With a straight edge and a compass, on the other hand, you can practically eliminate the sources of human variability and get a nice clean, objective result, the same every time.

Is the line really straight? How straight is it? In response to these questions, we develop ever finer tests, and then tests of the accuracy of those tests, and so forth, bootstrapping our way to ever greater accuracy and objectivity. Scientists are just as vulnerable to wishful thinking, just as likely to be tempted by base motives, just as venal and gullible and forgetful as the rest of humankind. Scientists don't consider themselves to be saints; they don't even pretend to be priests (who according to tradition are supposed to do a better job than the rest of us at fighting off human temptation and frailty). Scientists take themselves to be just as weak and fallible as anybody else, but recognizing those very sources of error in themselves and in the groups to which they belong, they have devised elaborate systems to tie their own hands, forcibly preventing their frailties and prejudices from infecting their results.

It is not just the implements, the physical tools of the trade, that are designed to be resistant to human error. The organization of methods is also under severe selection pressure for improved reliability and objectivity. The classic example is the double-blind experiment, in which, for instance, neither the human subjects nor the experimenters themselves are permitted to know which subjects get the test drug and which the placebo, so that nobody's subliminal hankerings and hunches can influence the perception of the results. The statistical design of both individual experiments and suites of experiments is then embedded in the larger practice of routine attempts at replication by independent investigators, which is further embedded in a tradition--flawed, but recognized--of publication of both positive and negative results.

What inspires faith in arithmetic is the fact that hundreds of scribblers, working independently on the same problem, will all arrive at the same answer (except for those negligible few whose errors can be found and identified to the mutual satisfaction of all). This unrivalled objectivity is also found in geometry and the other branches of mathematics, which since antiquity have been the very model of certain knowledge set against the world of flux and controversy. In Plato's early dialogue, the Meno, Socrates and the slave boy work out together a special case of the Pythagorean theorem. Plato's example expresses the frank recognition of a standard of truth to be aspired to by all truth-seekers, a standard that has not only never been seriously challenged, but that has been tacitly accepted--indeed heavily relied upon, even in matters of life and death--by the most vigorous opponents of science. (Or do you know a church that keeps track of its flock, and their donations, without benefit of arithmetic?)

Yes, but science almost never looks as uncontroversial, as cut-and-dried, as arithmetic. Indeed rival scientific factions often engage in propaganda battles as ferocious as anything to be found in politics, or even in religious conflict. The fury with which the defenders of scientific orthodoxy often defend their doctrines against the heretics is probably unmatched in other arenas of human rhetorical combat. These competitions for allegiance--and, of course, funding--are designed to capture attention, and being well-designed, they typically succeed. This has the side effect that the warfare on the cutting edge of any science draws attention away from the huge uncontested background, the dull metal heft of the axe that gives the cutting edge its power. What goes without saying, during these heated disagreements, is an organized, encyclopedic collection of agreed-upon, humdrum scientific fact. Robert Proctor usefully draws our attention to a distinction between neutrality and objectivity.(2) Geologists, he notes, know a lot more about oil-bearing shales than about other rocks--for the obvious economic and political reasons--but they do know objectively about oil-bearing shales. And much of what they learn about oil-bearing shales can be generalized to other, less favored rocks. We want science to be objective; we should not want science to be neutral. Biologists know a lot more about the fruit fly, Drosophila, than they do about other insects--not because you can get rich off fruit flies, but because you can get knowledge out of fruit flies more easily than you can get it out of most other species. Biologists also know a lot more about mosquitoes than about other insects, and here it is because mosquitoes are more harmful to people than other species that might be much easier to study. Many are the reasons for concentrating attention in science, and they all conspire to make the paths of investigation far from neutral; they do not, in general, make those paths any less objective. Sometimes, to be sure, one bias or another leads to a violation of the canons of scientific method. Studying the pattern of a disease in men, for instance, while neglecting to gather the data on the same disease in women, is not just not neutral; it is bad science, as indefensible in scientific terms as it is in political terms.

It is true that past scientific orthodoxies have themselves inspired policies that hindsight reveals to be seriously flawed. One can sympathize, for instance, with Ashis Nandy, editor of the passionately anti-scientific anthology Science, Hegemony and Violence: A Requiem for Modernity (Delhi: Oxford Univ. Press, 1988). Having lived through Atoms for Peace, and the Green Revolution, to name two of the most ballyhooed scientific juggernauts that have seriously disrupted third world societies, he sees how "the adaptation in India of decades-old western technologies are advertised and purchased as great leaps forward in science, even when such adaptations turn entire disciplines or areas of knowledge into mere intellectual machines for the adaptation, replication and testing of shop-worn western models which have often been given up in the west itself as too dangerous or as ecologically non-viable" (p. 8). But we should recognize this as a political misuse of science, not as a fundamental flaw in science itself.

The methods of science aren't foolproof, but they are indefinitely perfectible. Just as important: there is a tradition of criticism that enforces improvement whenever and wherever flaws are discovered. The methods of science, like everything else under the sun, are themselves objects of scientific scrutiny, as method becomes methodology, the analysis of methods. Methodology in turn falls under the gaze of epistemology, the investigation of investigation itself--nothing is off limits to scientific questioning. The irony is that these fruits of scientific reflection, showing us the ineliminable smudges of imperfection, are sometimes used by those who are suspicious of science as their grounds for denying it a privileged status in the truth-seeking department--as if the institutions and practices they see competing with it were no worse off in these regards. But where are the examples of religious orthodoxy being simply abandoned in the face of irresistible evidence? Again and again in science, yesterday's heresies have become today's new orthodoxies. No religion exhibits that pattern in its history.

1. Portions of this paper are derived from "Faith in the Truth," my Amnesty Lecture, Oxford, February 17, 1997.
2. Robert Proctor, Value-Free Science?, Harvard Univ. Press, 1991.

REVIEW OF _WHAT'S WRONG WITH POSTMODERNISM?_

by ROBERT C. HOLUB
Department of German, University of California-Berkeley
_Postmodern Culture_ v.2 n.2 (January, 1992)

Norris, Christopher. _What's Wrong With
Postmodernism? Critical Theory and the Ends of Philosophy_. Baltimore: Johns Hopkins UP, 1990. [1] From the outset two features of the title of Christopher Norris's latest book need clarification. First, it is not insignificant that, despite the possibility of an interrogatory "What," the title is not a question, but a declaration. Norris knows what's wrong with postmodernism, and he does not hesitate to impart his diagnosis to the reader. Second, the term "postmodernism" does not match exactly the material he covers. He is actually less concerned with postmodernism as a direction in literature and the arts--its more usual field of meaning--than he is with contemporary theory. The title should be understood, therefore, as an assertion about recent directions in theory, not as a query into artistic practices. And what is most interesting about Norris's survey of the critical terrain is the way in which he divides the turf. Most commentators tend to take a stand either for or against poststructuralism, defined rather generally as anything coming out of France or influenced by the French over the past two decades. By contrast Norris splits French and Francophilic theory into two halves. While he continues to advocate most prominently the work of Jacques Derrida and Paul de Man, he is highly critical of Baudrillard, certain aspects of Jean- Francois Lyotard, and Philippe Lacoue-Labarthe's monograph on Heidegger. Joining these French postmodernists on Norris's roster of adversaries are American neopragmatists, in particular Stanley Fish and Richard Rorty. Making a surprising appearance on the approval list is the German philosopher of communication theory, Jurgen Habermas. Although he devotes a chapter of this book to a reproof of Habermas's remarks on Derrida--a chastisement whose root cause is Habermas's carelessness in attributing to Derrida views held by his less philosophically schooled American epigones--he approves of the broad and critical outline of recent French thought found in Habermas's _Philosophical Discourse of Modernity_ (1985). Since these are anything but natural alliances,

they deserve further attention. Essentially Norris validates those theorists who he feels continue a tradition of enlightenment critique. There is no difficulty in placing Habermas in this camp since he is perhaps the single strongest voice in contemporary theory to openly and directly declare his allegiance to the progressive heritage of modernity. Norris does not discuss his work in any detail, however, except to point out his errors in dealing with Derrida, and his reference to Habermas's notion of universal or formal pragmatics as "transcendental pragmatics" indicates at least a possible confusion of Habermas's current concerns with his abandoned attempt to locate "quasi-transcendental" interests in the late sixties. More difficult to locate in a tradition of enlightened reason are Derrida and de Man. The latter is incorporated into the enlightenment project largely by way of his interest in "aesthetic ideology," which includes a critique of Schiller and of all subsequent misreadings of Kant's aesthetic theory. Derrida is likewise assimilated to the enlightenment paradigm through Kant. In Chapter Five, a consideration of Irene Harvey's _Derrida and the Economy of Difference_ (1986), Norris argues with Harvey (and Rodolphe Gasche) that Derrida is best described as a rigorous Kantian, except that he is "asking what conditions of IMpossibility mark out the limits of Kantian conceptual critique" (200). Indeed, Norris claims that Derrida's is "the most authentically %Kantian% reading of Kant precisely through his willingness to problematise the grounds of reason, truth and knowledge" (199). Norris thus opposes both the facile notion of Derridean deconstruction as the authorizing strategy for "free play" as a free-for-all of meaning, a false lesson learned and propagated by inattentive American disciples, and the equally false understanding of Derrida's work as a dismissal of previous philosophical problems, the tendency found in Fish, Rorty, and French postmodernists such as Baudrillard. Derrida and de Man are for Norris rigorous philosophical minds who question traditional philosophemes and point out their limits. These actions, however, are undertaken in the spirit of Kantian critique, and have nothing to do with the various illicit reductions (of truth to belief, of philosophy to rhetoric, of history to fiction, and of reality to appearance) prevalent in the neopragmatic and the poststructuralist camp.

This is a credible account of contemporary theory. It makes necessary distinctions between Derrida and his American reception and correctly credits de Man with a seriousness of purpose that is not always matched by poststructuralist gamesmanship. It also rightly dismisses the philosophical legitimacy of the "antitheoretical" neopragmatists, who seem to delight more in the sophistry of their own banal arguments than in the pragmatic endeavors they allegedly prefer. What is not very persuasive in Norris's presentation, however, is the contention that the works of Derrida and de Man carry with them a profoundly ethical and political message that can assist us in combating the entrenched conservatism of the Reagan-Bush-Thatcher-Major era. Indeed, it is precisely in the realm of ethics that Derrida and de Man are most open to attack. Derrida's very style of debate has proven a barrier to discussion of philosophical and political issues. Although it would be silly not to grant his theoretical points in the debate with Searle, the manner in which he ridicules his adversary, refusing to clarify Searle's misunderstandings and to confront issues on which they both have something to say, leads to a closing down of discussion. His encounter with Gadamer, a more patient and open interlocutor than Searle, repeats this elusive strategy; one has the impression here as well that Derrida simply does not want to enter into candid and direct debate about his theoretical position. His sarcastic and condescending dismissal of Anne McClintock and Rob Nixon, who criticize Derrida for his analysis of the word "apartheid," provides a more directly political illustration of an arrogance of argumentation that Derrida has come to epitomize. Finally, one could detail--as I do in a forthcoming book (_Crossing Borders_)--the lack of candor in his response to critics of de Man; in this performance from 1989 his dogmatism about his own position, his haughtiness concerning deconstruction, and his unwillingness to counter opponents' legitimate objections were obvious except to deconstructive true believers in what has become (unfortunately) a quasi-religious cult.

The afterword to _Limited Inc._ (1988), the book version containing his essay on Austin and his response to Searle, entitled "Toward An Ethic of Discussion," thus has something of a hollow ring to it. Although Norris uses this afterword as a counter-illustration to the wayward practices of postmodernist thinking, a careful consideration of it would reveal seminal weaknesses in Derrida's ethics and politics. Most blatant perhaps is Derrida's interpretation of his use of the word "police" in his earlier rebuttal of Searle. In the final section of his lengthy response Derrida has written that "there is always a police and a tribunal ready to intervene each time that a rule . . . is invoked in a case involving signatures, events, or contexts." He continues by hypothesizing a situation in which Searle is arrested by the Secret Service in Nixon's White House and taken to a psychiatrist. He asserts that there is a connection "between the notion of responsibility manipulated by the psychiatric expert [the representative of law and of political-linguistic conventions, in the service of the State and its police] and the exclusion of parasitism." He concludes by stating that the entire matter of the police must be reconsidered, "and not merely in a theoretical manner, if one does not want the police to be omnipotent" (_Limited Inc._ 105-6). Searle's practice, the exclusion of parasitism, is thus connected directly with the State and the police, and for good measure Derrida includes a warning about the possible omnipotence of the police. For a reader in 1977, when the debate originally occurred, it would have been difficult not to identify the police and the State with repression; it seemed that Derrida was making an openly political statement. But in 1988 he denies this most obvious reading: His statements "did not aim at condemning a determinate or particularly repressive politics by pointing out the implication of the police and of the tribunal whenever a rule is invoked concerning signatures, events, or contexts. Rather, I sought to recall that in its very generality, which is to say, before all specification, this implication is irreducible" (_Limited Inc._ 134). Derrida is of course correct when he writes in 1988 that there is no society without police and no conceptuality without delimiting (or policing) factors. But there are nonetheless two disturbing aspects of his recent self-interpretation. The first is that Derrida

seeks to control or limit meaning by clarifying his intention from 1977. He tells us how the word "police" "must be understood" (_Limited Inc._ 136). Thus he would appear here to want his intention to govern the entire scene of meaning, a possibility he attributed to Searle and argued explicitly against in 1977. Second, he seems to argue disingenuously in 1988. Although his 1988 argument makes more philosophical sense, the rhetoric of his arguments in 1977 was certainly meant to suggest a political disqualification of Searle's position. One cannot connect the police and the State--traditional buzz words, among the left, for repressive instances---with an adversary's stance, and not expect that connection to be understood as a political attack. That Derrida denies this dimension of his 1977 essay appears simply as dishonesty. But in that same "ethical afterword" Derrida also seals himself off from any political criticism. Deconstruction, he tells us, if it has a political dimension, "is engaged in the writing . . . of a language and of a political practice that can no longer be comprehended, judged, deciphered by these codes [the traditional Western codes of right or left]" (_Limited Inc._ 139). We are left with the conclusion that only deconstruction can comprehend, judge, and decipher what it is doing. Those who stand outside the light of its eternal truth have no right to pass political judgment. If a self-policing notion of deconstruction is thus the upshot of Derrida's "ethic of discussion," then Norris might want to reconsider its political usefulness. The case for de Man's political usefulness is even weaker. It rests, in Norris's view of things, on the notion of "aesthetic ideology." Following de Man's lead, Norris locates "aesthetic ideology" in post-Kantian philosophers who confound the realm of language, conceptual understanding, or linguistic representation with the phenomenal or natural world. No doubt this topos has been consistently thematized in de Man's writings; it accounts for his placement of allegory above symbolism, his critique of romanticisms, and even his objections to literary theories such as Jauss's aesthetics of reception. But the schema of intellectual history propagated by de Man and repeated by Norris is both undifferentiated and ahistorical. Friedrich Schiller, to whom Norris constantly refers as the first "misreader" of Kant and therefore the perpetrator of the original sin of "aesthetic

ideology," certainly differed from the author of the _Critique of Judgment_ on matters of aesthetics. But Schiller's relationship to Kant should not be categorized as a misreading, although Schiller undoubtedly misunderstood various aspects of Kantian thought. Rather, Schiller was trying to go beyond Kant in establishing an objective realm for aesthetic objects. He did this consciously and openly, and his purpose in doing so had to do not only with philosophy, but also with reactions to the French revolution. To wrench Schiller out of his historical moment and make the resulting abstraction responsible for a wayward tradition in aesthetic thought, which encompasses all major tendencies from the Romantics to the New Critics, is to propagate a type of black-and-white portrayal that recalls Heidegger's totalized picture of Western philosophy since the pre-Socratics. Norris criticizes Lacoue-Labarthe for refusing to entertain sociohistorical discussions of Heidegger's work, but he himself consistently steers the reader away from a historical situating of theory that could lead to a more differentiated understanding. Even if we accept the schema informing "aesthetic ideology," however, it is difficult to see why it has to be connected with political critique. It may be true that the organic worldview of Romanticism can lend itself to various political abuses, among them nationalism and fascism. But it can also have affinities with various sorts of ecological consciousness or with a "principled and consistent" socialism that Norris defends in his introduction. Norris offers no argument for political affiliations either. Instead he contends that "collapsing ontological distinctions is an error that all too readily falls in with a mystified conception of Being, nature and truth" (268), and that "there is no great distance" (21) between the notion of an organic state and an authentic nationalism. These juxtapositions masquerading as arguments serve only to discredit anything not associated with de Manian thought, but in their undifferentiated, schematic, and ahistorical formulation they are only persuasive to those already convinced of their correctness. In short, there is no reason--and Norris supplies none--to connect de Man's mode of operation with anything politically progressive, nor any grounds for finding his objects of criticism inherently regressive. It is probably worth

noting that de Man's own theoretical position did not move him toward any great political activity during his three decades of teaching in the United States, and that the short speeches at his funeral (found in _Yale French Studies_ in 1985) contain no references to political inspiration he supplied. Most of the talk about "aesthetic ideology" surfaces only after his wartime journalism came to light, although Norris did develop this line of thought somewhat earlier to defend de Man against political attacks by Frank Lentricchia and Terry Eagleton. The notion that de Man enunciates a coherent and powerfully progressive political program is thus something totally absent from comments about him during his lifetime. Unless we buy Norris's line on de Man, however, his endeavor in the final chapter to save de Man while simultaneously criticizing Lacoue-Labarthe and Heidegger is an empty gesture. While the differences between Heidegger and de Man with regard to National Socialism are not trivial, we should not ignore the obvious similarities. Most notable among these is their postwar attitude of repression and prevarication. Neither man owned up publicly to his actions, and there is much evidence to suggest that de Man misled people with regard to his activities during the war. To suggest, as Norris does, that de Man's postwar writing must be read as a determined effort to resist the effects of the very ideology that had entrapped him is simply not supported by common sense. Antifascist and political essays are not de Man's preferred genre; he produced no body of significant statements on any directly political matter as an academician. Moreover, when political topics suggested themselves he consistently turned away from them. Norris himself points to his essay on Heidegger from 1953 in which the context of Heidegger's interpretations of Holderlin--World War II and national destiny--is written off as a "side issue that would take us away from our topic." The bulk of the writings we have at our disposal indicates that Norris is performing the same function for de Man as Lacoue-Labarthe does for Heidegger. Both claim that the best way to understand the phenomenon to which de Man/Heidegger succumbed is to look at de Man/Heidegger's theory. Norris writes: "What Lacoue-Labarthe cannot for a moment entertain is the idea that Heidegger's philosophical concerns might

not, after all, have come down to him as a legacy of `Western metaphysics' from Plato to Nietzsche, but that they might--on the contrary--be products of his own, deeply mystified and reactionary habits of mind." If we substitute "Norris" for "Lacoue-Labarthe," "de Man" for "Heidegger," "aesthetic ideology" for "Western metaphysics," and "from Schiller to Jauss" for "from Plato to Nietzsche," we can see that the parallelism Norris seeks to escape is unwittingly retained. In this most welcome and perceptive book on contemporary theory Norris thus fails to step back far enough from the critics he has discussed in the past. De Man and Derrida are powerful and interesting voices in theory, and they are certainly a cut above many who would emulate their deconstructive strategies. But their political and ethical valence remains clouded by the undecidabilities of the very practices they exhibit in their writings. There is also a theoretical dimension to their inability to offer a sustained ethical vision. The preference for viewing language as a system rather than as speech acts, for looking at semantics and semiology rather than at pragmatics, for remaining in the realm of virtual language rather than

its actualization in the world--in short, for valorizing everywhere %langue% over %parole%--prevents de Man, Derrida, and Norris as well from theorizing ethics and politics. We only have to look at Derrida's initial remarks on Austin to see why deconstruction has such difficulties in connecting theory and practice. Instead of examining Austin from the potentially radical reorientation that Austin himself offers--language as action--Derrida shifts the discussion back to the "non-semiotic," to the level of linguistic meaning that Austin wanted to leave behind. A similar unwillingness to conceive language pragmatically, as always infused with ethical substance, is evident in Derrida's confrontation with Gadamer. In this regard, as Gadamer points out, Derrida's point of departure is retrograde. Norris's attempt to make the deconstructive strategies of de Man and Derrida the basis for a political opposition is thus a questionable undertaking. In this his most

overtly political volume to date he might have done better to explore more thoroughly those theories that take language-as-action as their starting point.

Practicing Post-Modernism: The Example of John Hawkes
by John M. Unsworth
Contemporary Literature 32.1 (Spring 1991)

"The excitement of contemporary studies is that all of its critical practitioners and most of their subjects are alive and working at the same time. One work influences another, bringing to the field a spirit of competition and cooperation that reaches an intensity rarely found in other disciplines" (x). In these remarks on "contemporary studies," Jerome Klinkowitz takes for granted that contemporary writers and their critics belong to one "discipline," the academic discipline of literary study. This affiliation of criticism and creative writing within a single institutional framework does indeed compound the influence that critic and author have on one another's work, as it multiplies the opportunities and the incentives for cooperation; but rather than simply celebrating this fact, as Klinkowitz does, we ought to inquire into the consequences of the professional interaction and practical interdependence of author and critic, particularly as it affects the creativity of the former and the judgment of the latter.

John Hawkes provides an excellent opportunity for such an inquiry, for several reasons. Discovered by Albert Guerard in 1947 and vigorously promoted by him in the years that followed, Hawkes was the first American "post-modern" author to gain notoriety.[1] Writers of Hawkes's generation were, in turn, the first in this country to spend their entire creative lives in the academy: they have used that position with unprecedented success to shape and control critical reception, especially through the mechanism of the interview. At the same time, as Guerard's influence on Hawkes demonstrates, criticism can shape a writer's understanding of what is important in his or her creative work. There are two places to look for evidence of the kind of influence I am discussing: in the author's work and in representations of that work, either by the author or by the critics. In what follows, I will look at a short story by Hawkes which encodes a drama of authorial influence on critical reading, and along with it I will consider a critical essay on the story which enacts the part scripted for the reader in that drama. Thereafter, I will take a broader sampling of Hawkes's critical fortunes, with an eye not only to the migration of descriptive language from author to critic, via the interview,[2] but also to the genesis of that language in the writing of Hawkes's earliest and most influential critic, Albert Guerard. The story and the critical reading I start with were both published in a 1988 anthology called Facing Texts: Encounters Between Contemporary Writers and Critics, edited by Heide Ziegler. This volume deserves comment in its own right, as an emblem of postmodern literary practice. The title of the anthology refers to the fact that it pairs creative texts by prominent first-generation post-modern authors with critical essays on those texts; what makes the volume emblematic is that the critics were in most cases handpicked by the authors themselves. In fact, as her preface informs us, Ziegler herself was picked by one of those authors: Facing Texts originated in a suggestion made by William Gass to an editor at Duke University Press, that Ziegler should edit a collection of contemporary American fiction. Ziegler says that, when the project was proposed to her, I immediately recognized that in effect I was being offered the opportunity to realize one of my pet ideas: to bring together . . . unpublished pieces by authors as well as critics that would, in a sense, defy the chronological secondariness of critical interpretation. Such a book would make the relationship between author and critic an unmediated encounter, with authors and critics becoming one another's ideal readers. . . . if possible, the pieces offered by the authors should indeed be hitherto unpublished so as to give the critics a sense of the exclusiveness, even privacy of their work and thus convey to them the impression of a close encounter with the respective author. . . . [and] the authors should choose their own critics in order to ensure that the close encounter I had in mind would not, unintentionally, be hostile, and thus destroy the possibility of mutual ideal readership. (ix)[3]

In Hawkes's case, Ziegler's solicitude is unnecessary: his contribution to this volume was designed to foster the kind of reading that she desired for it. "The Equestrienne" is a portion of Hawkes's 1985 novella Innocence in Extremis, which is, in turn, an outtake from a novel, Adventures in the Alaskan Skin Trade. A large part of the novel is devoted to relating the misadventures of "Uncle Jake," as recalled by his daughter; relative to that story, Innocence in Extremis is an extended flashback, to a time when Uncle Jake, as a boy, visited his ancestral home in France with his father and family. "The Equestrienne" is one of the three set pieces that make up the novella, but it has been published here without introduction or reference to the context in which it was developed, and it can be read as a free-standing, very short story.[4] In "The Equestrienne," Uncle Jake's French grandfather (referred to exclusively as "the Old Gentleman") stages an exhibition of dressage, on what we are told is one of several "occasions deemed by the Old Gentleman to be specially enjoyable to his assembly of delighted guests" (216). In this, the first of those (three) occasions and the only one presented here, a young cousin of Uncle Jake's performs for an audience consisting of the visitors (including Uncle Jake), members of the household, and some neighbors, all seated in rows of plush Empire chairs arranged in a courtyard of the family chateau. The girl and her horse are the center of attention, but the performance itself is the medium for an interaction between the audience and the Old Gentleman. In this case, the audience in the tale clearly stands for the audience of the tale, and almost from its opening lines the text signals the effect it wants to achieve -- most notably in the modifiers that cluster around descriptions of the represented audience. As an example, take the passage just quoted: "the days of harmony and pleasure were further enhanced by certain occasions deemed by the Old Gentleman to be specially enjoyable to his assembly of delighted guests." It is the narrator who tells us that days already harmonious and pleasurable were "enhanced" by what is about to be related; and while we might be privy to some delusion in the Old Gentleman when we are told that he "deemed" his entertainment "to be specially enjoyable" to his guests, any distance between his objective and their reaction is collapsed in the very same sentence, when we learn that they are in fact "delighted." Each detail of the performance is similarly described and received. "The gilded frames and red plush cushions of the chairs shone in the agreeable light and . . . moved everyone to exclamations of surprise and keen anticipation." In the world of the text as we are given it, the light is "agreeable," and the audience is unanimous in its expression of "surprise and keen anticipation." Throughout the tale, the reactions of the audience consistently confirm what the narration announces. "Through the gateway rode a young girl on a small and shapely dappled gray horse. Here was a sight to win them all and audibly they sighed and visibly they leaned forward. . . . [an] already grateful audience" (216). There is no point in piling up further examples; suffice it to say that this high pitch of appreciation is insistently sustained, the only two discordant notes resolving into it almost immediately. In the first of these, contemplating his cousin, Uncle Jake thinks "with shame. . . of himself and his shaggy and dumpy pony" (218). 
In the second, shortly

thereafter, his mother whispers to him: "mark my words, dear boy. That child is dangerous." These are important moments, but the importance lies not so much in any pall they cast over the performance as in the evidence they give of its irresistible charm. Uncle Jake's insecurity and his mother's mistrust soon give way to the universal sentiment: Uncle Jake realizes that "he wanted to become [his cousin] and take her splendid place on the gray horse," and even his mother admits, "'she is a beautiful little rider, Jake. You might try to ride as well as she does. It would please your father'" (219). In her essay on the story, Christine Laniel remarks that "The Equestrienne" "focuses on one of the most pervasive metaphors in Hawkes's works, which he analyzes as essential to his fiction writing when he refers to 'horsemanship as an art'" (221-22). Specifically, Laniel is suggesting that Hawkes offers dressage as a metaphor for the artistic use of language. That much can easily be read between the lines from which she quotes, but taken in full these lines also suggest that the same metaphor might be extended to include an association of other kinds of horsemanship with other ways of using language -- after all, the audience is composed of equestrians: Nearly everyone in that audience rode horseback. Most of them were fox hunters. Their lives depended on horses. . . . Yet for all of them their mares and geldings and fillies and stallions were a matter of course like stones in a brook or birds in the boughs. Most of the horses they bred and rode were large, rugged, unruly, brutish beasts of great stamina. The horses raced and hunted, pulled their carriages, carried them ambling through sylvan woods and took them cantering great distances, but little more. So here in the Old Gentleman's courtyard the spectacle of the young equestrienne and her gray horse schooled only in dressage appealed directly to what they knew and to their own relationships to horse and stable yet gave them all a taste of equestrian refinement that stirred them to surprise and pleasure. They had never thought of horsemanship as an art, but here indeed in the dancing horse they could see full well the refinement of an artist's mind. (218) The thrust of this passage, it seems to me, is first to suggest horsemanship as a figure for the use of language in general, and then to distinguish between the nonutilitarian "refinement" of its use in fiction and the practicality of more quotidian language used with an end in mind, as for example to convey information (in "rugged, unruly, brutish" words "of great stamina" but no elegance). In this scene "artist" and audience share what might be called a professional interest in horses, not unlike the professional interest in language Hawkes shares with his readers; and while it may be the general reader and not the critical one who takes language as "a matter of course," even the most perspicacious fox hunters among us are obviously supposed to be "stirred" to "surprise and pleasure" at Hawkes's demonstration of verbal dressage. In fact, at the conclusion of the performance the story explicitly announces the lesson we are to draw from it: "the audience rose to its feet, still clapping. They exclaimed aloud to each other, while clapping, and smiles vied with smiles and no one had praise enough for the exhibition which had taught them all that artificiality not only enhances natural life but defines it" (220). Hawkes's instruction

of the reader is too deliberate to be unintended and too obvious to ignore, so it must be explained. In Laniel's analysis, the author at these moments is "forestalling interpretation by anticipating it. As a consequence, the critic is thwarted in efforts to unveil supposedly hidden significations, which are obtrusively exposed by the writer himself"(222). She regards this aspect of the story as a problem only for a criticism which needs "to unveil supposedly hidden significations"; as we have seen, though, "The Equestrienne" does more than interpret itself: it so relentlessly superintends response that it is likely to frustrate any reader, and not merely a certain sort of critic. But for Laniel at least, the "alluring fascination" (222) of "The Equestrienne" survives in its strategy of "seduction, which implies the obliteration of reality and its transfiguration into pure appearance"(226). That is, although she acknowledges that the story reads its own moral, she still finds Hawkes's presentation of "the artificial" fascinating, because it undertakes "the willful deterioration of language as the vehicle of meaning." This deterioration is said to take place in a series of puns and paradoxes (sister-sinister, mastery-fragility, innocence-corruption, and so forth) and in sentences like the following (which explains the effect of the Old Gentleman's having positioned the girl sidesaddle on her horse, with her legs away from the audience): "The fact that she appeared to have no legs was to the entire ensemble as was the white ribbon affixed to her hat: the incongruity without which the congruous whole could not have achieved such perfection" (217). In this sentence, Laniel says, we are made to experience both frustration and supreme satisfaction, since the expected word is missing and yet is virtually present, enhanced by the strange, incongruous connections that implicitly suggest it. By establishing the curious relationship of the logically unrelated, by uniting the like with the unlike in sudden and unexpected juxtapositions, the poetic text produces a jarring effect, so that we are left with the notion of a fundamental vacancy, of a basic lack that is the very essence of aesthetic pleasure. (228) Yet the sentence Laniel has chosen not only contains the "missing" word -- "perfection" -- but emphasizes it by placing it in the ultimate position. And in any case, Hawkes's notion of an "incongruity without which the congruous whole could not have achieved such perfection" is more plausible as a model than as an occasion for Laniel's observation that the "jarring effect" brought about by "the curious relationship of the logically unrelated" results in "a fundamental vacancy . . . that is the very essence of aesthetic pleasure." Laniel also tries to restore some ambiguity to the story by arguing that Hawkes's "rhetoric of seduction" is always "reversed into derision, as an insidious vein of self-parody gradually penetrates the text" (222). As she sees it, Hawkes's writing cannot function without initiating its own ironical debunking. The "morality of excess" [Innocence in Extremis 55] that

guides the artist in his work also guides Hawkes in his writing, as exemplified by the profusion of superlatives and comparatives in the novella and in all his fiction. But this very excessiveness entails a crescendo, an escalation into more and more incongruous associations, so that his texts are relentlessly undermined by their own grotesque redoubling. (235) Self-parody is indeed an abiding characteristic of Hawkes's writing -- and often its saving grace -- but though the language we have already quoted from "The Equestrienne" does suggest an excessiveness that might easily escalate into self-parody, Laniel herself admits that "during the performance of the equestrienne the burlesque element is extremely slight" (233). Consequently, when she makes the argument that this text undermines itself she is forced to rely entirely on evidence collected from other, later sections of the novella and from the originary novel. Still, even if there is no parody in "The Equestrienne," its absence makes it worth discussing. In general, the significant gap in Hawkes's work is not between appearance and reality but between the serious and the parodic elements that constitute his fiction: the uneasiness of his texts is that while his self-parody seems deliberate, it doesn't ground or control the seriousness with which he presents his primary material. Since the critic is bound to make statements about the text, and since making those statements usually involves taking a position relative to the text by offering a reading, critics have often resolved this conflict in the text by going too far in one direction or the other -- either affirming the response offered by the text (the more common tactic) or overstating the control exercised by the parodic element. Laniel's piece is unusual in that it does the latter, but in order to make this case she has to read beyond the immediate text. By so doing, she is in effect submitting "The Equestrienne" to the control of a self-parody which develops across other, broader contexts. This move begs the question of whether the parodic strain controls the larger contexts from which she abstracts it. In fact, I would argue, it does not -- the uneasiness simply reasserts itself when we look at Innocence in Extremis or Adventures in the Alaskan Skin Trade as texts in their own right. The significance of Hawkes's unstable self-parody, both with regard to its presence in his other fiction and its absence in the present case, is bound up with the problem of the audience and its response. In order to avoid the problem Laniel has with contextualization, let us look briefly at a discrete work, Travesty, written by Hawkes in the early 1970s. Travesty is the monologue of a man who intends to crash the car in which he, his daughter, and an existentialist poet (the lover of both his wife and daughter) are traveling. Papa, the driver, denies being jealous or having any murderous motive; instead, he tells Henri (the poet) that his plan is to create an "accident" so inexplicable that their deaths will have to be understood as the deliberate execution of an abstract design. Henri is apparently nonplused, since Papa reproaches him for his failure to appreciate the beauty of the thing: "Tonight of all nights why can't you give me one moment of genuine response? Without it, as I have said, our expedition is as wasteful as everything else"

(82). The response Papa wants from Henri is specifically an aesthetic one, and he sees it as a mark of Henri's artistic insincerity that he is not able to provide it. But, as the reader well understands, the detachment from self-interest which such a response would require is too much to expect, even from an existentialist. As a monologist, Papa necessarily speaks for Henri, and in a similar way Hawkes, as a writer, speaks for the reader. His conceit is auto-destructive, but self-parody -- a preemptive mode of discourse -- is by definition both exclusive of and also highly attentive to the audience. The element of self-parody in Travesty asserts itself as the difference between the supposed reality of death within the fiction and the reality of death supposed which is the fiction -- Hawkes, in other words, is Henri if he is anyone in this story. But as this equation suggests, the parody does not extend to Papa, and much of what he says is seriously intended, not least his confessed need for a response: Let me admit that it was precisely the fear of committing a final and irrevocable act that plagued my childhood, my youth, my early manhood. . . . And in those years and as a corollary to my preoccupation with the cut string I could not repair, the step I could not retrieve, I was also plagued by what I defined as the fear of no response. . . . If the world did not respond to me totally, immediately, in leaf, street sign, the expression of strangers, then I did not exist. . . . But to be recognized in any way was to be given your selfhood on a plate and to be loved, loved, which is what I most demanded.(84-85) Self-parody, this suggests, is more than an attempt to forestall a feared lack of response (or an undesirable response); it may also become a way to avoid "committing a final and irrevocable [speech] act." On one level, Hawkes is deadly serious about everything that Papa says; on another, he implicitly denies responsibility for the ideas Papa expresses. At both levels, he precludes response -- within the narrative through the technique of monologue, without it through the technique of self-parody. The effect on the reader is, as Laniel says, often baffling: the proffered position is clearly untenable, and yet the parody does not enable an alternate response because it equally clearly does not control the text. The instability I have been describing might also be regarded as a side effect of characterization. Hawkes is fond of creating figures of the artist, but these figures never completely fill the role in which they are cast; most often they are people who have the sensibility of the artist but who do not actually create art. Cyril in The Blood Oranges, Papa in Travesty, Uncle Jake in Adventures in the Alaskan Skin Trade are all men whose medium is action, not language, and who do not pretend to present the fiction in which their artistry is conveyed to the reader. In Travesty, the distinction would seem to be mooted when narration is placed entirely in the hands of "the man who disciplines the child, carves the roast" (44) -- but in fact it persists, since Papa's "creation," the actual crash, cannot be presented within the narrative structure Hawkes has set up and so is not presented at all. In other words, although Hawkes's novella develops in the space between the disclosure and the enactment of Papa's intentions, the aesthetic Hawkes has embodied

in those intentions can be expressed only in words, never in action -- hence the equation of Hawkes with Henri. Seeking to evade both the irrevocable commitment of unfeigned statement and the fear of no response, Hawkes has adopted a narrative perspective that results in a fiction which implies but does not constitute the realization of an aesthetic. If the conflict between a desire to present this aesthetic and the fear that it will be rejected is settled in Travesty by giving the narrative over entirely to statement, in "The Equestrienne" Hawkes experiments with the opposite solution, usurping the response of his audience. Rather than seducing the reader, this makes her superfluous: hence Laniel's frustration at trying to present a reading of the story as given -- something that her recourse to other texts demonstrates she is ultimately unable to do. And like response, the absence of a controlling intelligence is dislocated in "The Equestrienne" from a metatextual position to a thematic one: "All at once and above the dainty clatter of the hooves, they heard the loud and charming tinkling of a music box. Heads turned, a new and livelier surprise possessed the audience, the fact that they could not discover the source of the music, which was the essence of artificiality, added greatly to the effect" (219).But even within the story, this absence proves to be more apparent than real: at the end of the girl's exhibition, the Old Gentleman appeared and as one the audience realized that though they had all seen him act the impresario and with his raised hand start the performance, still he had not taken one of the red plush chairs for himself, had not remained with them in the courtyard, had not been a passive witness to his granddaughter's exhibition. He was smiling broadly; he was perspiring; clearly he expected thanks. In all this the truth was evident: that not only had he himself orchestrated the day, but that it was he who had taught the girl dressage, and he who had from a little balcony conducted her performance and determined her every move, and he who had turned the handle of the music box. Never had the old patrician looked younger or more pleased with himself. (220) The Old Gentleman is not "a passive witness" to the presentation; he is its conductor, and his curtain call might be compared to Hawkes's persistent assertion of the authorial self in his interviews: in both cases, the creator remains behind the scenes during the actual performance but reappears afterward to make sure that its significance is properly understood. The nature of Hawkes's dilemma and the variety of his attempts to resolve it are characteristically post-modern, in that they demonstrate a very real need to assert critical control over the text, combined with a desire that the reader should be persuaded to a particular aesthetic position. Such desires are not peculiar to post-modern authors, of course: Henry James once admitted to dreaming, "in wanton moods, . . . of some Paradise (for art) where the direct appeal to the intelligence might be legalized" (296). Late in his life, James made that appeal to future readers in his prefaces to the New York edition of his works, but he might well have envied the post-modern author, who can address the contemporary reader through the mechanism of the interview.

Hawkes's inclination to avail himself of opportunities to discuss his work has resulted in quite a substantial body of interviews.[5] In these interviews, Hawkes propounds his aesthetic program, characterizes his fiction, and explains his intentions in specific novels; the images and analogies he uses migrate visibly from the interviews to the criticism and reappear in the questions posed by subsequent interviewers. In this way, the language of Hawkes's self-descriptions comes to dominate the critical reception of his work, functioning -- to borrow an idea from Kenneth Burke -- as a "terministic screen."[6] Hawkes's career also demonstrates, however, the influence of critics on authors: although the authority of this particular terministic screen is derived from Hawkes via the interview, Hawkes himself seems to have derived many of its component terms from Albert Guerard's early analyses of his work. Hawkes has often acknowledged his debt to Guerard, but to fully understand the nature of that debt we need to know something about the history of the relationship between these two men. Hawkes was not much of a student when he came to Harvard: the semester before he left for the war, he had flunked out.[7] His career as a writer started in Guerard's fiction writing class at Harvard, which he took after returning from service in the Ambulance Corps during World War II. At that time, he had just started working on his first piece of fiction, the novella Charivari, and though manifestly talented, he lacked experience both as a writer and as a reader of modern fiction. Prior to 1947, he had written only some juvenile verse, which he submitted to qualify for Guerard's class; during that class (for which he wrote The Cannibal,), Hawkes's "reading of modern experimental literature was largely confined to poetry," according to Guerard (Introduction xn). In a recent interview, Hawkes recalled that when they first met, "Guerard . . . was probably in his early thirties, but to me he was an awesome figure. He was quite formidable, quite authoritarian, extremely knowledgeable, a novelist himself, and he had so suddenly and abruptly praised my fiction at the outset in such a way as to give me real confidence" ("Life" 112). Obviously, in the course of this long friendship Hawkes has had many occasions to express his ideas about fiction, and it is likely that Guerard's published criticism of Hawkes reflects those ideas to some extent. We may even grant that, as Guerard has faded from the forefront of contemporary criticism, and as Hawkes has become firmly established as one of the major talents of his generation, the balance of power in the relationship may have shifted somewhat in recent years. But it is nonetheless clear that Guerard played an influential role in molding Hawkes's understanding of the value of his own fiction. The nature and extent of that influence is clear if we compare a few passages from Guerard's early criticism to Hawkes's subsequent self-evaluations. It was Guerard who brought Hawkes and James Laughlin together, and when, in 1949, New Directions published Hawkes's first novel (The Cannibal,), Guerard provided the introduction. This introduction is the earliest critical analysis of Hawkes's work, and its influence on later Hawkes criticism, including the author's own, is inestimable. In it, Guerard says that "Terror . . . can create its own geography" (xiii) and announces, in terms that persist to this day, that "John Hawkes clearly belongs . . . 
with the cold immoralists and pure creators who enter sympathetically into all their characters, the saved and the damned alike. . . . even the most contaminate have their dreams of purity

which shockingly resemble our own" (xii). Not long thereafter, the Radcliffe News published Hawkes's first interview, entitled "John Hawkes, Author, Calls Guerard's Preface Most Helpful Criticism" (March 17, 1950) -- and so it would seem to have been. Guerard's remarks about sympathy for "the saved and the damned alike" are reflected in Hawkes's earliest published critical writing (1960), in which he talks about the experimental novel as displaying "an attitude that rejects sympathy for the ruined members of our lot, revealing thus the deepest sympathy of all" ("Notes on Violence").[8] As late as 1979, Hawkes still describes himself as being "interested in the truest kind of fictive sympathy, as Albert Guerard, my former teacher and lifelong friend, has put it. To him the purpose of imaginative fiction is to generate sympathy for the saved and damned alike" ("Novelist"27).[9] In his 1949 introduction, Guerard confidently compares Hawkes to William Faulkner, Franz Kafka, and Djuna Barnes (although he predicts that Hawkes "will move . . . toward realism"), and he concludes -- on a disciplinary note -- that "How far John Hawkes will go as a writer must obviously depend on how far he consents to impose some page-bypage and chapter-by-chapter consecutive understanding on his astonishing creative energy; on how richly he exploits his ability to achieve truth through distortion; on how well he continues to uncover and use childhood images and fears" (xv). In an addendum to the introduction, written for The Cannibal,'s reissue in 1962, Guerard notes that "the predicted movement toward realism has occurred" but reiterates the importance of nightmare and "vivifying distortion" in Hawkes's fiction (xviii). The concepts of distortion and terror, and the paradoxical linkage of purity and contamination, have since become staples in the discussion of Hawkes's work: the Hryciw-Wing bibliography lists at least twenty-one essays with the words "nightmare" or "terror" in the title (beginning with a review by Guerard in 1961), and countless others have incorporated the same idea into their arguments.[10] Guerard's addendum also praises Hawkes for being able "to summon pre-conscious anxieties and longings, to symbolize oral fantasies and castration fears -- to shadow forth, in a word, our underground selves" (xviii). In his first essay in self-explanation, presented at a symposium on fiction at Wesleyan University in 1962 and published in Massachusetts Review, Hawkes himself states: The constructed vision, the excitement of the undersea life of the inner man, a language appropriate to the delicate malicious knowledge of us all as poor, forked, corruptible, the feeling of pleasure and pain that comes when something pure and contemptible lodges in the imagination -- I believe in the "singular and terrible attraction" of all this. For me the writer should always serve as his own angleworm -- and the sharper the barb with which he fishes himself out of the blackness, the better.("Notes on The Wild Goose Chase" 788) The image of the fishhook is a more memorable formulation of Guerard's claim that Hawkes's fiction has the ability to "shadow forth our underground selves"; certainly it

seems, in keeping with the metaphor of which it is a part, to have set itself deep in Hawkes's vision of his own work. In a 1964 interview, one which has remained among the most often cited, Hawkes told John Enck: "my aim has always been . . . never to let the reader (or myself) off the hook, so to speak, never to let him think that the picture is any less black than it is or that there is any easy way out of the nightmare of human existence" ("John Hawkes" 145). In 1971, the piece in which the metaphor originally appeared was reprinted along with Enck's interview in John Graham's Studies in Second Skin (the dedication to which reads: "For Albert Guerard, who led the way" -- Graham is another of Guerard's former students), and in 1975 the image returns in the following exchange with John Kuehl: Kuehl: You once referred to fishing for yourself. Hawkes: I said that "the author is his own best angleworm and the sharper the barb with which he fishes himself out of the darkness the better.". . . I mean that the writer who exploits his own psychic life reveals the inner lives of us all, the inner chaos, the negative aspects of the personality in general. . . . our deepest inner lives are largely organized around such impulses, which need to be exposed and understood and used. (Kuehl 164-65) It is perhaps significant that a few pages later, Hawkes remarks: "For me evil was once a power. Now it's a powerful metaphor" (166).[12] The "powerful metaphor" of authorship as auto-piscation was also used by Hawkes the year before to open an influential essay called "Notes on Writing a Novel," which was first printed in 1973 in the Brown Alumni Monthly, reprinted the next year in TriQuarterly, and finally revised and collected in a 1983 volume fittingly entitled In Praise of What Persists. In that piece, Hawkes relates the following anecdote: A scholarly, gifted, deeply good-natured friend once remarked that "Notes on Writing a Novel" is a deplorably condescending title. . . . At that moment. . . . I thought of a metaphor with which I'd ended a talk on fiction ten years ago at Boston College, when I said that "for me, the writer of fiction should always serve as his own angleworm, and the sharper the barb with which he fishes himself out of the darkness, the better." But when I proposed "The Writer as Angleworm" as an alternative, my friend pointed out that preciousness is worse than condescension. (109) The "friend" remains unnamed, but it is somehow appropriate that Hawkes has trouble remembering the genesis of his image, mistaking the Wesleyan venue for a Boston College one; in an interview given in 1979 and published in 1983, he makes a similar mistake when Patrick O'Donnell remarks on "the fetus fished out of the flood in The Beetle Leg." Hawkes responds: "Yes. Thinking of that image reminds me of an interview with John Graham where I said that 'the writer should be his own angleworm [etc.].'" By this point Hawkes is not remembering the occasion on which he originally formulated the idea but misremembering one on which he quoted it -- the interview with Enck, published
in Graham. Hawkes goes on to dwell on the image at some length, demonstrating that it still informs his understanding of his own work, however vague its origins:

It's an interesting paradox: separating the artist from the human personality, the artistic self from the human self, then thinking of the artist's job as one of catching, capturing, snaring, using a very dangerous and unpleasant weapon, a hook, knowing that his subject matter is himself or his own imagination, which he has to find himself and which he catches ruthlessly. It's a very schizophrenic image, full of dangerous, archetypal maneuvers in the deepest darkness within us. ("Life" 123)

Hawkes's choice of words is revealing, in that schizophrenia is often linked to the presence of an overpowering authority figure; we have already seen that Hawkes initially regarded Guerard as "an awesome figure . . . quite formidable, quite authoritarian." In a 1971 encounter called "John Hawkes and Albert Guerard in Dialogue," Hawkes jokes about that "awesome" authority, but with an insistence and intensity that belie his tone.

Hawkes: . . . I have long suspected that I'm a fiction created by Albert Guerard. I think I knew from the very first moment we met. (14)

when I met him, and for years afterwards, he was, as a teacher, a ruthless authoritarian, a tremendous disciplinarian. About fifteen years ago, I had thought that I had achieved some kind of equality with Albert, at least on a personal level, and had escaped this terrible awe, and the awesome business of father/teacher, but now I've been plunged right back into it. (15)

My writing has been filled with awkwardness. . . . It's always been Albert who has pointed out where the distorting glasses have been taken off, or where the writing was flabby. . . .

Guerard: That's fantasy. It's not true at all . . . (25-26)

Despite Hawkes's bantering manner (and Guerard's denial), it is obvious that this relationship was an extremely important one for Hawkes, and his gratitude seems more than slightly tinged with the anxiety of influence. This is understandable, in light of the fact that for more than a decade after leaving Guerard's class, Hawkes submitted each of his novels to Guerard before publishing it; and in at least one instance, Guerard seems to have exercised his authority in the form of a veto. As Hawkes tells it, when Guerard read the manuscript for The Lime Twig, "he sent it back saying 'Jack, this is deplorable; it's a good idea, but poorly conceived and written, and you'll have to start over again'" ("Life" 112). After that, it took Hawkes four years to revise the book, and although Guerard continued to exert a shaping influence on Hawkes's career, this was the last time he was given a manuscript for preapproval. Elsewhere in his dialogue with Guerard, Hawkes says, "just as you controlled everything else, you are, as a matter of fact, responsible for my fiction becoming increasingly so-called 'realistic'" (23), but after The Lime Twig this realism coincided with a new emphasis on the comic and a marked uneasiness on the part of Hawkes:

Beginning with Second Skin, I was reluctant and partly afraid to ask my mentor for his approval of my work. That was the first manuscript I published without Guerard's pre-reading. I know he likes Second Skin a great deal. . . . [but] I don't think he likes the next two novels all that much; my feeling is that he thinks The Blood Oranges is, in some ways, a falling off. But he liked Travesty a great deal. . . . The reason that we first went to France was because Guerard, himself, is partly French. . . . So France was the world that Guerard represented. ("Life" 113)

If Hawkes was conscious of his comic novels as a departure from the kind of writing approved of by his mentor, Travesty (a "French" novel) would seem to have been his gesture of reconciliation. His next book, The Passion Artist, returned to the earlier style and setting and was very favorably reviewed by Guerard.

With regard not only to Hawkes's stylistic oscillations but also to the genealogy of his self-understanding, the central issue is the relation of the artist to the contents of his unconscious mind. In exactly this connection, Frederick Busch -- one of John Hawkes's earliest and friendliest critics -- recently wondered whether

John Hawkes, studying his life, perhaps studies his art as well. . . . [he] now faces the danger he has faced throughout a distinguished career -- of tapping his usual psychic resources, of using his usual dreams, of relying upon his usual metaphors, and therefore of risking the loss of new language, new fictive worlds. . . . I go so far as to sorrow over his considerable praise from academics . . . because I fear that they seek to encourage Hawkes to write what is "teachable" and teachably "post-Modern." . . . like every writer who taps his inner imagery, [Hawkes] must determine when he is to avoid his own urgings and the temptation to use what becomes a habitual vocabulary of images. (When People Publish 110)

It is interesting that, in an earlier version of the same essay, Busch's pessimism was decidedly less pronounced:

In Death, Sleep & The Traveler, Hawkes may be thinking about who he is as a writer, what he has done, and what he ought to do. He may, at times, seem to be writing out of a sense of Hawkes. . . . When Hemingway became a student of Hemingway -- To Have and Have Not, as compared to its point of origin, "After the Storm," is a good example -- he failed to measure up to his teacher. While I do not see signs of such a failure in Death, Sleep & The Traveler, I do see Hawkes as engaged in the most profound examination of his own writings; and he is daring to risk being influenced by that seductive writer, John Hawkes. ("Icebergs" 62-63)

Busch's change in tone between 1977 and 1986 suggests that he does feel Hawkes, with the aid of his academic critics, has seduced himself. I would want to add only that Hawkes's "sense of Hawkes" has, from the beginning, been shaped and developed by his most important reader, Guerard. And although influence here reverses the direction it followed in the case of "The Equestrienne," in each case the academic context shared by reader and writer has fostered an extraordinary symbiosis, one which ultimately enervates both criticism and creativity.

In The Romantic Ideology, Jerome McGann says that there is "[an] essential difference which separates the journalistic and polemical criticism whose focus is the present from the scholarly and historical criticism which operates in the present only by facing (and defining) the past" (2-3). To date, much of the criticism of post-modern fiction has indeed been polemical and journalistic and has aimed at reproducing the ideology of the fiction it discusses. But even though no one at present can claim to have the same distance from post-modernism as we have from romanticism, it is still possible to submit post-modern fiction to a criticism that scrutinizes its cultural and institutional determinants. Indeed, as McGann points out, there are good reasons for doing so:

When critics perpetuate and maintain older ideas and attitudes in continuities and processive traditions they typically serve only the most reactionary purposes of their societies, though they may not be aware of this; for the cooptive powers of a vigorous culture like our own are very great. If such powers and their results are not always to be deplored, cooptation must always be a process intolerable to critical consciousness, whose first obligation is to resist incorporation, and whose weapon is analysis. (2)

What was new in 1947 has begun to age, and it is now time to ask what purposes are served by perpetuating the ideas and attitudes identified with it. The problem McGann describes is only exacerbated when author and critic are contemporaries cohabiting in one institution. Under these circumstances, the material inducements to cooperation may well subvert the independence of both parties: each is in a position to augment the prestige of the other, but neither is really in control. As McGann predicts, having been incorporated, each is controlled by the ideology of the institution that creates and confers their prestige, and both end up serving the most reactionary purposes of that institution. Where post-modernism is concerned, the institution is the academy and the ideology is that of professionalism. Others have pointed out before now that academic professionalism is itself at the service of larger cultural mechanisms, and that its most reactionary purpose is to co-opt and sequester intellectual energies -- whether critical or creative -- so that they do not disrupt the smooth operation of those mechanisms.[13]

Earlier I asserted that the post-modernism of Hawkes and his generation is continuous with modernism, but here that assertion needs to be qualified. First-generation post-modernism differs from its predecessor in one crucial way, namely in being institutionalized. Modernism, for the most part, rejected the security of the academy in order to take liberties with the culture; by contrast, post-modernism stands at the
embarrassing conjunction of that modernist heritage of alienation and a practical condition of institutional respectability and security. The aesthetic similarities between modernism and post-modernism pale into insignificance next to this situational difference -- and since the aesthetic features of post-modernism serve purposes different from those they served under modernism, our advocacy of those features serves different purposes as well. It may be too late for authors such as Hawkes to alter their course, but it is by no means too soon for the criticism of post-modern fiction to put aside polemic in favor of analysis and begin resisting the urge to cooperate.

Notes

1. According to Michael Koehler, the term "post-modern" was introduced (in English) in the 1940s by Arnold Toynbee, who used it to denominate the entire period from 1875 to the present. Koehler says that Irving Howe may have been the first person to call the literature after modernism "post-modern," in his 1959 essay "Mass Society and Post-Modern Fiction." A good deal of the confusion that has accompanied the use of this term in recent years might be attributed to the failure to acknowledge that there have already been two generations of the postmodern and that, in many ways, the two have little in common. For the sake of clarity, I use the original form of the word (in which the hyphen privileges the modern) to refer to the first of these two generations, which sees itself as extending the project of modernism. In "postmodernism," on the other hand, the hyphen has dropped out and the agglutinated form, in which "post" gets top billing, implies the emergence of a new entity. This form of the word is increasingly common, but I would suggest that rather than being applied indiscriminately it ought to denote specifically that rising generation which conceives of itself as distinct from and often opposed to modernism.

2. Kenneth Burke's idea of the migration of metaphor is relevant here: "In general, primitive magic tended to transfer an animistic perspective to the charting of physical events. And positivistic science, by antithesis, leads to an opposite ideal, the transferring of physicalist perspective to human events. Each is the migration of a metaphor" (Philosophy 147). In the present case, the migration consists in a transfer of an authorial perspective to the criticism of fiction.

3. The other authors in Ziegler's anthology are Robert Coover, Guy Davenport, John Barth, Donald Barthelme, Stanley Elkin, Susan Sontag, Walter Abish, and Joseph McElroy.

4. "The Equestrienne" appears in Facing Texts; Innocence in Extremis was published by Burning Deck in 1985; Adventures in the Alaskan Skin Trade was published in hardcover by Simon and Schuster in 1985 and then, as part of the Contemporary American Fiction series, in paperback by Penguin in 1986.

5. According to Carol A. Hryciw-Wing's recent bibliography, forty-four interviews with Hawkes were published between 1950 and 1985.

6. See Kenneth Burke, "Terministic Screens," chapter 3 of Language as Symbolic Action. Burke says that "even if any given terminology is a reflection of reality, by its very nature as a terminology it must be a selection of reality; and to this extent it must function also as a deflection of reality" (45). He goes on to
elaborate the point as follows: "Not only does the nature of our terms affect the nature of our observations, in the sense that the terms direct the attention to one field rather than to another. Also, many of the 'observations' are but implications of the particular terminology in terms of which the observations are made. In brief, much that we take as observations about 'reality' may be but the spinning out of possibilities implicit in our particular choice of terms" (46).

7. To my knowledge, the only personal nightmare ever related by Hawkes (for whom the nightmare has become a trademark) is a recurrent dream "about not passing courses and not graduating from Harvard, in which case I would not have been a teacher, et cetera" (Hawkes and Guerard 21).

8. This brief essay and a story are accompanied by Guerard's "Introduction to the Cambridge Anti-Realists," among which Guerard includes Hawkes.

9. This interview is accompanied by Guerard's review of The Passion Artist.

10. Guerard himself, through all four revisions of his entry on Hawkes in the reference work Contemporary Novelists (1972, 1976, 1982, 1986), has continued to praise Hawkes for his use of "childhood terror, oral fantasies and castration fears, fears of regression and violence, profound sexual disturbances" (395). Not surprisingly, in the 1986 entry Guerard seems somewhat dissatisfied with Adventures in the Alaskan Skin Trade, because it contains so few "archetypal dreams [which] echo powerful dreams in the earlier books"; for Guerard, it is only in these echoes that "the author's true voice is dominant" (397).

11. This piece has not only been reprinted in Graham but also in Klein and in volume 29 of Contemporary Literary Criticism.

12. Kuehl publishes this interview as a chapter in his book on Hawkes -- not an uncommon practice in book-length studies of contemporary authors.

13. For a full-length discussion of professionalism as an ideological tool in the administration of culture, see Larson.

Works Cited

Burke, Kenneth. Language as Symbolic Action: Essays on Life, Literature, and Method. Berkeley: U of California P, 1966.

-------. The Philosophy of Literary Form: Studies in Symbolic Action. 3rd ed. Berkeley: U of California P, 1973.

Busch, Frederick. "Icebergs, Islands, Ships Beneath the Sea." A John Hawkes Symposium: Design and Debris. Ed. Anthony C. Santore and Michael Pocalyko. New York: New Directions, 1977. 50-63.

-------. When People Publish: Essays on Writers and Writing. Iowa City: U of Iowa P, 1986.

Contemporary Literary Criticism. Detroit: Gale, 1984.

Contemporary Novelists. Ed. D. L. Kirkpatrick. 4th ed. New York: St. Martin's, 1986.

-------. Ed. James Vinson. 1st-3rd eds. New York: St. Martin's, 1972, 1976, 1982.

Graham, John, ed. Studies in Second Skin. The Charles E. Merrill Studies. Columbus, OH: Merrill, 1971.

Guerard, Albert. Introduction. The Cannibal. By John Hawkes. 1949. New York: New Directions, 1962. ix-xx.

-------. "The Passion Artist: John Hawkes." Rev. of The Passion Artist, by John Hawkes. New Republic 10 Nov. 1979: 29-30.

Hawkes, John. "The Equestrienne." Facing Texts: Encounters Between Contemporary Writers and Critics. Ed. Heide Ziegler. Durham: Duke UP, 1988. 215-20.

-------. "John Hawkes: An Interview." With John Enck. Wisconsin Studies in Contemporary Literature 6 (1965): 141-55.

-------. "Life and Art: An Interview with John Hawkes." With Patrick O'Donnell. Review of Contemporary Fiction 3.3 (1983): 107-26.

-------. "Notes on The Wild Goose Chase." Massachusetts Review 3 (1962): 784-88.

-------. "Notes on Violence." Audience 7 (1960): 60.

-------. "Notes on Writing a Novel." TriQuarterly 30 (1974): 109-26. Rpt. from Brown Alumni Monthly Jan. 1973: 9-16. Rpt. as "Dark Landscapes." In Praise of What Persists. Ed. Stephen Berg. New York: Harper, 1983. 135-47.

-------. "The Novelist: John Hawkes." With Thomas LeClair. New Republic 10 Nov. 1979: 26-29.

-------. Travesty. New York: New Directions, 1976.

Hawkes, John, and Albert Guerard. "John Hawkes and Albert Guerard in Dialogue." A John Hawkes Symposium: Design and Debris. Ed. Anthony C. Santore and Michael Pocalyko. New York: New Directions, 1977. 14-26.

Hryciw-Wing, Carol A. John Hawkes: A Research Guide. New York: Garland, 1986.

James, Henry. Preface to the New York edition of The Portrait of a Lady. The Art of Criticism: Henry James on the Theory and the Practice of Fiction. Ed. William Veeder and Susan M. Griffin. Chicago: U of Chicago P, 1986. 286-99.

Klein, Marcus, ed. The American Novel Since World War II. Greenwich, CT: Fawcett, 1969.

Klinkowitz, Jerome. "Cross-Currents/Modern Critiques/Third Series." The Fiction of William Gass: The Consolation of Language. By Arthur M. Saltzman. Cross-Currents/Modern Critiques. Carbondale: Southern Illinois UP, 1986. ix-x.

Koehler, Michael. "'Postmodernismus': Ein begriffsgeschichtlicher Überblick." Amerikastudien 22.1 (1977): 8-18. Unpublished translation by Tom Austenfeld, held in Bowers Library, Wilson Hall, U of Virginia, Charlottesville, VA.

Kuehl, John. John Hawkes and the Craft of Conflict. New Brunswick: Rutgers UP, 1975.

Laniel, Christine. "John Hawkes's Return to the Origin: A Genealogy of the Creative Process." Facing Texts: Encounters Between Contemporary Writers and Critics. Ed. Heide Ziegler. Durham: Duke UP, 1988. 221-46.

Larson, Magali Sarfatti. The Rise of Professionalism: A Sociological Analysis. Berkeley: U of California P, 1977.

McGann, Jerome J. The Romantic Ideology: A Critical Investigation. Chicago: U of Chicago P, 1983.

Ziegler, Heide. Preface and Introduction. Facing Texts: Encounters Between Contemporary Writers and Critics. Durham: Duke UP, 1988. ix-x, 3-13.

Postponing the Postmodern

By Ben Agger

The term postmodernity has multiple, often contradictory meanings (e.g., see Huyssen 1986; Best and Kellner 1991). It is my argument here that we should treat postmodernity as a utopian category-- as something to be achieved-- and neither as a method of periodization (e.g., Harvey 1989) nor of celebration (e.g., Lyotard 1984; Kroker and Cook 1986). In this sense-- peculiarly, to those who take it for granted that postmodern theory breaks ranks with Marx and Marxism-- postmodernity helps formulate a contemporary version of Marx's own eschatology, which was to lead to socialism and then communism. Saying this does not reduce postmodern theory to Marxism, even assuming that we could settle on a singular version of Marxism. I argue that postmodernism, conceived within the eschatological or "critical" framework of Marxist critical theory, does not betray Marxism but extends Marxism into the late 20th century, formulating postmodernity as the latter-day version of Marx's socialism. In particular, postmodern critical theory is the first narrative to pose a possible utopian future not as a determinate outcome of nature-like social laws but rather as one conceivable discursive accomplishment among many. Shorn of "necessity," postmodernism bridges the global and local, system and action. This is not to suggest that socialism is, or should be, dropped as a political aim, to be hoped for and fought for. It is rather to suggest that socialism as an "imaginary" (vision,
model, blueprint) has lost a great deal of its currency at a time when perestroika is celebrated, albeit falsely, as the triumph of capitalism over socialism and communism. Western Marxists have long understood Soviet command socialism to be a betrayal of Marx's socialist humanism, thus treating the collapse of the Soviet Union not as a referendum on capitalism but as an outcome of state-socialist "contradictions" in an era of "late socialism." Simply because the imaginary of socialism has been tainted by Cold War affiliation to the Soviet experience does not make socialism unworthy as a utopian goal of critical social theory. On the contrary, we need socialism more than ever now that both American and Soviet manifest destinies have been found wanting. Yet we do not have the luxury of simply repeating Marx's 19th century litany of socialism's Aufhebung of capitalism as if the Cold War never happened (and, with it, the many aspects of America's transvaluation of meaningful political discourse into what the Frankfurt School called "affirmative" terminology).

To put this differently, I contend that postmodern theory affords the left a new imaginary with which to revive Marxism at a time when "class struggle" has been assailed from all sides, including by multiculturalists, feminists and people of color. This requires all sorts of theoretical work, demonstrating that one can refashion Marxist categories in light of historical transformations unforeseen by Marx. In that sense, postmodern theory empirically explains changes in capitalism and the world system unforeseen by Marx. Far from opposing or vitiating Marxism, postmodernism is both an empirical revision of Marxism conceived within Marx's original frame of reference and a way of signaling the enduring significance of Marx's vision of utopia-- a society as yet "nowhere." Both of these functions of postmodernism are crucial for critical thinkers and actors who refuse to concede that Marxism has been bypassed by the so-called postmodern.

There is an irony here: Critical (or "left") postmodernists like Jameson, Aronowitz and Harvey use postmodernism as a way of defending the significance of Marx's world-historical eschatology against other postmodernists like Baudrillard who celebrate postmodernity's break with modernity. Postmodernist battles postmodernist over the Marx legacy, one asserting the possibility of a postmodernity that fulfills Marx's dream of a disalienated society and the other offering an existing postmodernity as proof that Marx was wrong all along to posit a disalienated society articulated in terms of classlessness. All of this postmodernist discourse engages the question of what the Frankfurt School and then Mandel called late capitalism, lateness being an issue of historicity resolved differently by theorists who variously defend and abandon Marxism. The fact that "lateness" is an issue at all already vitiates so-called orthodox Marxism, to which western Marxism (Lukacs, Gramsci, Frankfurt School, Sartre, Merleau-Ponty, Beauvoir) was a response. The issue of lateness, as I am calling it, acknowledges that capitalism, if one can even use that theoretical construct in the 1990s, needs to be theorized today in ways that trade on Capital but add to it analyses of culture, gender, race, colonialism and the environment, a project attempted by Habermas under the rubric of new social movements theory.
What endures about Marxism is both Marx's understanding of the contradictory logic of capital and his vision of a disalienated society. Already in _Capital_ Marx clearly
understood that the so-called fetishism of commodities reified human relationships, producing their representation as relationships among things in nature. This nature-like appearance of capitalism allows capitalism to be represented by economists and social theorists as eternal, rational, necessary, obdurate. One could say that what Marx called commodity fetishism was the first "postmodern" understanding of capitalism, and of capitalism's "lateness," in the sense that it recognized that alienation (here, the economic exploitation of labor power) requires a certain discursive formulation (here, alienation's nature-likeness) in order for it to be reproduced in everyday life. In terms of the issues that Marx addressed in _Capital_, workers reproduce capital by failing to understand the historicity of their lives-- the fact that they are oppressed by capital, which has emerged historically and thus can be challenged. Instead, they experience everyday life as immutable. The everydayness of their alienated lives is concealed in the supposed laws of the bourgeois market economy, which discursively produce the illusion of fair exchange promulgated by the labor contract. Both law and economic theory produce and hence protect this discourse of commodity fetishism, which in turn reproduces itself in workers' quiescence and conformity. Marx was postmodern avant la lettre in that he understood how texts like economic theory become lives, hence authoring those lives secretly. He interrogated the increasingly permeable barrier between textuality and materiality that gave rise to what he called commodity fetishism. Marx was of two non-contradictory minds here: He argued that the logic of capital is objective in the sense that workers have only their labor power to sell, thus standing on the brink of destitution. But at the same time the logic of capital is subjective and intersubjective-- discursive-- in the sense that culture must produce the representation of alienated experience and practice as nature-like, hence reproducing nature-like society. The objective, subjective and intersubjective comprise a complex totality that cannot be dissected into base and superstructure or economics and culture. Discourse ("fetishism," in the language of _Capital_) supports capital ("commodity," in _Capital_) by robbing capital of historicity, practice, textuality-discourse. For Marx the solution to this was an unmasking via the critique of ideology. This unmasking revealed the falsehood of certain nature-like representations of capital. Marx was prepostmodern where he undertook this unmasking through a more or less straightforward language of counterfactual critique, laying waste to false consciousness through political education. The Frankfurt School's critical theory, which converges with many postmodern themes (see Jay 1984), no longer took for granted Marx's optimism about how representation could unmask representation on the battleground of competing truth claims. Marcuse in (1964) _One-Dimensional Man_, for example, suggests explicitly that radical discourse is increasingly coopted by an affirmative culture in which dissent becomes lifestyle, hence robbing language of its demystifying and galvanizing power. 
Although not completely resignatory-- not as resignatory as Adorno in (1973) _Negative Dialectics_-- Marcuse in (1972) _Counterrevolution and Revolt_ gave up on the very New Left social movements that he championed in (1969) _Essay on Liberation_ as the harbinger of a new postmodern sensibility capable of formulating qualitatively different discourse/practices of late-capitalist everyday life. Empirically, Marcuse was correct: The New Left and counterculture withered on the vine, failing to resist their own commodification as "the 'sixties" became a growth industry capitalizing
on yuppie nostalgia: 1960s album rock reissued on compact disk and payable in installments. Theoretically, he, Adorno and Horkheimer exaggerated the totalizing tendencies of the culture industry, which was treated too much as a monolith.

Postmodern theory focuses on the crisis of representation as, in effect, the crisis of late capitalism. No longer should we assume that representation (e.g., the nature-like representation of capitalism in bourgeois economic and social theory) can be demystified through discourse that is not fraught with what Derrida calls undecidability. Representation is no longer possible, if it ever was. It never was possible if by representation we mean the positivist reflection of a world "out there." Today, text and world have blurred to the point of virtual indistinguishability. As Adorno reminds us, though, the indistinguishability of concept and thing is never total; there remains an "indissoluble something" which eludes representation and thus makes truth possible. The kernel of non-identity upon which his negative dialectics rests is remarkably similar to Derrida's own stress on the irreducibility of "alterity" to "presence," making deconstruction possible. Deconstruction (e.g., Eagleton 1983) can be viewed as a political practice if we regard Derrida's work as critical social theory, as I (1994) have argued we must.

In these ways, Marx needs to be extended into a period of capitalism's lateness, using postmodern discourse theory as well as critical theory to show how text and world have become terms of each other, thus making demystifying representation next to impossible. But the same postmodernism that makes way for this engagement with lateness also contains an eschatological moment in its implication of a time and place somehow beyond midnight-- the "post"modern. Postmodernity can be viewed as a fulfillment of Marx's dream of disalienation, which he characterized as the end of prehistory (modernity). This is confusing because French theorists like Lyotard have positioned Marx as modernist, thus establishing their own "postmodern" identity as postmarxist-- of course, a political posturing. Marx worked within the framework of modernity, attempting to show its dialectical potential for becoming something other. He did not use the term postmodernity. However, as Berman (1982) has argued convincingly, the passage from The Manifesto about how in capitalism "all that is solid melts into air" is a postmodern phrasing par excellence. Terminology is less important here than meaning. I contend that Marx was trying to tease out the potentials and constraints of modernity through his critique of bourgeois political economy. This augured a break with the modern so dramatic that we scarcely possess the discourse necessary to evoke the disalienated experiences and practices first voiced in Economic and Philosophical Manuscripts and later articulated by the Frankfurt theorists in their attempt to develop an aesthetic of socialism (Benjamin, in his work on Paris; Adorno, in his appraisal of Schoenberg; Marcuse, in his discussions of eros).

In claiming the postmodern for Marxism, I am not reducing theory to a singular political program but rather suggesting that Marx foresaw the need to fulfill the "project of modernity," as Habermas has termed it. He recognized the lateness of capitalist civilization as an opportunity to preserve the best features of that civilization (e.g., what the Frankfurt theorists called enlightenment) while transcending it in favor of something
qualitatively different. Today, this is anathema to those postmodernists who contend that Marxism is politically pernicious, or at least passé, and that Marx embraced modernity uncritically. I think both versions of Marx are wrong; better, they conceal their own interest in producing a neoconservative version of leftism in celebration of the so-called "end of Communism." Even a casual glance at the CIS reveals that its leadership is very far from dealing with the lateness of capitalism as an occasion of emancipation. Instead, state socialism has given way to a helter-skelter venture capitalism in which Russians scramble to survive. In this context, some Russians want to return to Communism.

My eschatological version of postmodernism comes out of left field, especially in the United States, where French theory has been ritualized as downright semio-celebration. The popularity of the "late" Baudrillard, he of (1988) _America_, is a case in point. Where Foucault and Derrida remained committed to a radical interrogation of modernist philosophical and theoretical assumptions, Baudrillard has gone off the political deep end. Whereas his (1981) _For a Critique of the Political Economy of the Sign_ and even (1983) _Simulations_ made some important points about the semiotics of late capitalism, he has dissolved "reality" into the endless play of simulations and thus lost all basis for ideology critique. Baudrillard dissolves material reality into "simulations," thus replacing the political economy of labor power with the political economy of the sign. A critical theory informed by postmodernism need not choose between these two versions of political economy but, with Horkheimer (1972) in his programmatic 1937 essay on "Traditional and Critical Theory," links material and ideal, economics and culture. The Frankfurt School theorists, as exemplified by Horkheimer and Adorno's (1972) argument in _Dialectic of Enlightenment_, identified the "culture industry" as the institutions and discursive practices bridging base and superstructure.

For postmodernism to embrace a leftist eschatology requires that we historicize modernity in such a way that we recognize within it the possibilities of dialectical transcendence-- radical social change. In this sense, postmodernity is a stage of modernity not yet fully developed. There is a strong temptation simply to equate the postmodern with the present, using postmodernity much the way Daniel Bell (1973) used the term postindustrialism to describe a stage of capitalism based on various information technologies unanticipated by Marx. It is important for postmodernity not to function merely sociologically, as a description of the present and imminent future. Typically, this sociologization of postmodernity is celebratory, defending the rupture with modernity as a rupture with political partisanship and ideological contestation (again, resembling Bell's (1960) argument for the "end of ideology"). I contend that postmodernity must remain an eschatological category precisely because the prefix "post" connotes a resolution of modernist "lateness" in the sense of lying-beyond-modernity. There is another, more nihilist reading of lateness: Postmodernity is late-in-the-day, beyond the possibility of radical social change, a reading approximated by Adorno who, as Ryan (1982) recognizes, is remarkably similar to Derrida in his critique of "identity theory."
The problem with an eschatological version of critical theory is that lateness is seen to precede the dawn of a new world, a rhetoric used by Marx and many other utopians.

When the clock strikes midnight, social change is to arrive punctually-- in Marx's terms in _Capital_, suddenly the expropriators are expropriated. But modernity is elastic, containing both the possibility of the Holocaust and of radical social change deserving the prefix "post." The single best reason for calling this radical change postmodern rather than Marxist is that it is incredibly difficult to restore the political currency of Marxist eschatology after perestroika as well as Reagan. Marx was insufficiently dialectical in the way he demarcated the boundary between the modern and postmodern. He failed to see that postmodernity was in fact not a rupture with late modernity-- the clock striking twelve-- but a moment of modernity that exists as a dialectical possibility and not a certainty lying at the end of a lawlike process of socialist unfolding.

Postmodernism is usefully ironic in that it stresses the undecidability of discourse and action while at the same time preserving the possibility of meaning conceived dialectically as an engagement with nothingness, meaninglessness, alterity. Postmodernism need not be nihilist if it is conceived within the frame of reference of modernity as an attempt to "fulfill" modernity, establishing a regime of reason, albeit by resolving modernity's lateness in a way that replaces the diachronic timetable of modernist unfolding with the synchronic historicity of utopia. To put this differently, postmodernism ensures that the left proceeds with no cosmic guarantees about the inevitability of radical social change and thus without a legitimation of its own vanguardist privileges. This is scandalous talk for those convinced that Marxism is yet another modernism failing to address the venality and directionlessness of the present adequately. Admittedly, there are few Marxisms that think beyond the 19th century. More important, there are few Marxisms that make irony a principle of dialectical humility and dialogical democracy. The Parisian existential Marxists, especially Merleau-Ponty, anticipated Derrida in their attempt to reconcile Heidegger and Marx. Better than the new French theorists, Merleau-Ponty, Sartre, Beauvoir and Camus reconciled the existentialist and left projects without sacrificing one to the other. Meaninglessness need not thwart action if we understand that history, however devoid of millennial telos, is still available to human deliberation and design. In this sense, postmodernism resists the nihilist tendencies of its own critique of foundationalism-- of first principles and assumptions that intend to elude interrogation and critique.

The crucial contribution of postmodern theory to the critique of foundationalism and the philosophy of presence, as Derrida termed it, lies in its demystification of totalizing intellectual and political systems that obliterate "alterity" (difference, otherness). In this sense, postmodernism fosters dialogical democracy, recognizing that the good is talk and all talk is good. At the same time, the risks of antifoundationalism are nihilism and relativism that lead to political quiescence. The U.S. reception of postmodernism has stressed this post-political quietism. Indeed, so-called post-Marxism (see Block 1990) extends from postmodernism in this sense. But the American reception of postmodernism has tended to ignore postmodernism's stress on the linkage between discourse and democracy, a linkage that I contend is precisely the opening of Derrida's critique of western logocentrism to radical politics.
Put differently, the American reception of postmodernism suppresses (or simply never learned) the social and intellectual history of French postmodern theory, which emerged out of the 1968 May Movement as a critique of Stalinist and orthodox-Marxist authoritarianism in
preference for a radical micropolitics of everyday life (later to emerge as new social movements theory). Far from turning away from politics, people like Derrida and Foucault viewed their own philosophical work as intensely and obviously political, contributing to the heterodox French left project, especially in ways that embrace the feminist and gay/lesbian movements. This French version of micropolitics derived from antifoundationalism in the sense that the antifoundationalist critique of western logocentrism provides the moral and political paradigm of democratic community. The French theorists attempted to install deconstructive perpetual interrogation, especially the questioning of and debate over first principles, as a basis of community-- of the good, in logocentric parlance! This issued in a radical micropolitics not at all devoid of values but rather founded on the value of dialogue and discourse themselves. In particular, Derrida and Foucault sought to empower those who have historically occupied the "subject positions" of alterity and otherness, enabling them to enter community and thus achieve political and social power.

Deconstruction, as Derrida understood it, is the activity whereby dichotomies are revealed to be hierarchies (e.g., male/female, where to be female means that one is not-male, thus defining women in terms of their "absence" of maleness). Once deconstructively revealed, these hierarchies are to be displaced by the invention of new discourse/practices. A great deal of deconstructive activity has been spent decentering bipolar concepts and practices of gender, a theme central to Derrida's and French feminists' critiques of western phallogocentrism (see Hekman 1990; Agger 1993). The point of deconstructive critique is not purely negative, however, inasmuch as dichotomies/hierarchies, once deconstructed, would then be displaced by new discursive versions. Thus, deconstruction is at once a negative and positive activity, not only demystifying present discursive practices but also attempting to replace them with "different" ones. This does not issue in a new foundationalism, however, if deconstructors live up to their own standards of undecidability. That is, the development of new discursive practices that do not trade on sheer dichotomies (only concealing hierarchies) is not an occasion for denying those new discursive practices deconstructive attention. On the contrary, deconstruction ceaselessly seeks to destroy the hardening (reification, in Marxist terms) of discourse into cant-- and thus power. For example, once left-feminist deconstructors displace the phallogocentric bipolarity of masculinity/femininity in favor of a new trinity of race/class/gender, they risk reifying race/class/gender into an unassailable formulation that does our thinking for us, hence undermining the very "difference" deemed so important by multiculturalists. This is not to decide against multiculturalism, especially where it is grounded in a critique of phallogocentrism, but simply to point out that a good multicultural community would vigilantly protect itself against the reification of its own hallowed concepts that, over time, harden into a code of political correctness.

This talk of the good risks logocentrism-- the bane of postmodernists. But I contend that postmodernism can talk of the good just as Marxism can theorize discourse as a significant political factor in late capitalism. There are many "goods," some less logocentric than others.
So powerful has been the logocentrism of the Greeks, who
sought the good in a cave, that it is very difficult to rescue moral philosophy and morally energized social theory (e.g., Habermas) from its conflation with Platonist logocentrism, which set the standard for later versions. I contend that Marx was the "first" postmodernist in the sense that he criticized logocentrism implicitly where, following Hegel although with a materialist intent, he made historicity thematic as an antidote to metaphysics. In other words, Marx was antifoundationalist, opposing speculative philosophy as empirically and politically arbitrary. By the same token, postmodernism makes way for a discursive formulation of the good that turns textuality into a political language game through which power is transacted. When Derrida says that there is nothing beyond the text, I hear him to say that there are no grounds of judgment outside of judging itself-- a necessarily perspectival, undecidable activity. This does not disqualify judgment but only situates judging in the undecidable discursive activity beyond which there are no aprioristic certainties. In Marxist terms, there is only historicity. Indeed, it is abundantly clear that Derrida has done all sorts of political "judging," on behalf of various French and international new social movements that he supports.

Postmodern discourse theory, in its painstaking attention to the significance of discourse, suggests that we should no longer pose Marxism/postmodernism as disjunctive alternatives. We need a new theoretical mapping that locates Marxism, postmodernism, feminism, environmentalism and anticolonialism on the same cognitive map, as Jameson calls it. To name this overarching cognitive map is somewhat like attempting to map the "outside" of the universe-- a fruitless exercise in metamapping. Perhaps only reflecting my autobibliographical grounding in the Frankfurt School, I would name the "big" map or narrative critical theory, treating Marxism, postmodernism, feminism, environmentalism and anticolonialism as "moments" of critical theory. Although I have much sympathy with Lyotard's aversion to metanarratives as codes of discipline, I am uncomfortable with his dichotomy of big and small narratives: It is increasingly clear that we need both global and local explanations, especially where they are dialectically connected.

Postmodernity, then, is not to be located off the Marxist map, a time after time when leftist eschatological aims no longer apply. Instead, postmodernity is a contemporary formulation of utopia that can only be reached through modernity. It has much the same status as Marx's notion of how socialism would end prehistory. Only with postmodernity will modernity achieve its telos-- dialogical democracy. In appearing to claim postmodernism "for" Marxism I am not making a one-sided appropriation. Marxism is transformed by its engagement with postmodernism and feminism, perhaps beyond recognizability. I contend that it is also revivified now that discursive politics and personal politics matter like never before.

Notes

Originally published: Cultural Studies, Volume 1, pages 37-46. Copyright © 1996 by JAI Press Inc. Re"printed" with Permission. ISBN: 1-55938-951-6

References

Adorno, T.W. 1973. _Negative Dialectics_. New York: Seabury.

Agger, Ben. 1993. _Gender, Culture and Power: Toward a Feminist Postmodern Critical Theory_. Westport, CT: Praeger.

_____. 1994. "Derrida for Sociology? A Comment on Fuchs and Ward." _American Sociological Review_ 59(4): 501-504.

Baudrillard, J. 1981. _For a Critique of the Political Economy of the Sign_. St. Louis: Telos Press.

_____. 1983. _Simulations_. New York: Semiotext(e).

_____. 1988. _America_. New York: Verso.

Bell, D. 1960. _The End of Ideology_. Glencoe, IL: Free Press.

_____. 1973. _The Coming of Post-industrial Society_. New York: Basic.

Berman, M. 1982. _All That is Solid Melts Into Air_. New York: Simon and Schuster.

Best, S. and D. Kellner. 1991. _Postmodern Theory: Critical Interrogations_. New York: Guilford.

Block, F. 1990. _Postindustrial Possibilities_. Berkeley: University of California Press.

Eagleton, T. 1983. _Literary Theory: An Introduction_. Minneapolis: University of Minnesota Press.

Harvey, D. 1989. _The Condition of Postmodernity_. Oxford: Blackwell.

Hekman, S. 1990. _Gender and Knowledge_. Boston: Northeastern University Press.

Horkheimer, M. 1972. _Critical Theory_. New York: Herder and Herder.

Horkheimer, M. and T.W. Adorno. 1972. _Dialectic of Enlightenment_. New York: Herder and Herder.

Huyssen, A. 1986. _After the Great Divide_. Bloomington, IN: Indiana University Press.

Jay, M. 1984. _Adorno_. Cambridge, MA: Harvard University Press.

Kroker, A. and D. Cook. 1986. _The Postmodern Scene_. New York: St. Martin's.

Lyotard, J. 1984. _The Postmodern Condition_. Minneapolis: University of Minnesota Press.

Marcuse, H. 1964. _One-Dimensional Man_. Boston: Beacon.

_____. 1969. _An Essay on Liberation_. Boston: Beacon.

_____. 1972. _Counterrevolution and Revolt_. Boston: Beacon.

Ryan, M. 1982. _Marxism and Deconstruction_. Baltimore: Johns Hopkins University Press.

PREMISE / Volume II, Number 8 / September 27, 1995 / Page 5

Postmodernism[1]
by D. Martin Fields

Under the post-modern onslaught, all boundaries and distinctions rapidly fall. Some of the losses associated with the collapse of traditional distinctions have been trivial, but others have been earthshaking, and there seems to be no way to distinguish between the two in a post-modern context. People no longer know where the lines fall.[2]

"Beauty is in the eye of the beholder." To be sure, many of us have uttered these words at some point in time. But few of us have really thought about the implications of this little phrase. What it tells us is that there exists no objective standard for beauty; what is beautiful and pleasing to the eye depends on the observer. What may be beautiful to you may not be beautiful to someone else, and what one perceives as ugly may be truly exquisite to another. Now someone may say that this is simply nit-picking away at an innocent little phrase that has seen many adolescents through the more insecure years of life. That is probably true. But what this little phrase implies is becoming more and more
the worldview of many in America today; only they are not simply restricting it to beauty. It is becoming more and more common to read that truth, as well as beauty, is also in the eye of the beholder. Truth is not something "objective" that exists apart from us; rather, it is "what works for us." This emerging perspective says that there are no standards or foundations for truth; truth, as it were, is relative to individuals or cultures.

One of the advantages of the ministry I do is that I get to work on the college campus. As I talk to students about Christianity, of all the questions (or subtle objections) I get, the most common is "Why do Christians believe that there is only one truth?" For these folks, it seems reasonable that if someone believes there is a "God," that is fine. In fact, George Barna found that while 62% of all Americans believe that the Bible is totally accurate in all of its teachings, 70% believe that there are no absolutes! Such a lack of foundation among Americans is reflected in the fact that Barna's current book of statistics is entitled Absolute Confusion.[3] These beliefs, while they seem to be so outrageous, reflect the mind-set of many college students, as well as their professors, and are certainly in tune with the spirit of the age in America at large.

A new worldview is emerging, a worldview that supersedes all worldviews. It is called Postmodernism, and it calls into question the traditional notions of truth, structure, and reality. It dislocates any center of discourse to the edges of human preference and subjectivity, and reinforces the belief that absolute truth was once a viable belief, but has turned out to be little more than a passing fad. So where did this come from? For many of us the notion that there is no objective truth seems silly, and yet this notion is becoming more and more entrenched. After all, who would question the scientific truth that light travels at 186,282 mi/sec., or that the law of non-contradiction is a fundamental rule of logic? Better yet, who would question that "2 + 2 = 4," or the interpretation of John 14:6 that says Jesus is the only way to the Father? Answer: more and more people. Those beliefs may be true for some people, but not necessarily all. What we are seeing today is a shift; a shift in worldview. We are seeing the shift from the Modern to the Postmodern. In this essay we will examine the elements that gave rise to postmodernism, look at the essentials of postmodern thought, and examine and critique postmodern thought in light of the Christian worldview.[4]

Background
As many scholars have shown us, a shift in worldview is nothing new. Western thought has managed to move through a plethora of outlooks. As Gene Veith puts it:

One worldview follows another. In the eighteenth century the Enlightenment challenged the Biblical Synthesis that had dominated Western culture. With the nineteenth century came both Romanticism and Scientific Materialism. The twentieth century has given us Marxism and fascism, positivism and existentialism.[5]

And the list goes on. But before we can discuss postmodernism, we need to first take a look at the periods that preceded it: the premodern and the modern.

The premodern, as it is called, is the period in intellectual history that would encompass all thought from the birth of philosophy in Thales, through the Renaissance and the Reformation, up until the dawn of the Enlightenment.[6] Premodernism, like modernism after it, was an important phase of Western culture that cannot be characterized by any one worldview. It was, as Veith puts it, a ". . . complex, dynamic, tension-filled era [which] included mythological paganism and classical rationalism, as well as Biblical revelation."[7] But for all of its diversity, one commonality that most outlooks shared was a strong belief in the supernatural, and that there existed absolute truth. Plato, for example, saw the world as manifesting so much diversity and change that another world, the world of the forms--which exist beyond the senses--must exist to bring coherence and purpose to the world of experience. With the rise of the Medieval period, the Christian worldview came to dominate the majority of the scholarly landscape. God was the foundation of truth, and the purpose of man was to discern his relationship to God. Many of the great Christian theologians, such as Augustine, Aquinas, Pascal, Luther and Calvin, flourished during this period. Belief in absolute truth and the supernatural was non-negotiable. It was the basis for their worldview; the foundation for reality as a universe and not a multiverse.[8]

However, man's inherent desire to be autonomous never faded, and that, coupled with the successes of reason and science, made a shift in worldview inevitable. Man no longer needed to be bound by the superstitious, out-dated beliefs of the past. The modern man did not need the supernatural to guide him; reason and science alone could give him the answers he needed to understand the universe and structure the world. The shift from the premodern to the modern had taken place, and the Enlightenment was the proof.

The Enlightenment was the birth of the "Modern" period in intellectual history. Some historians date this period as beginning with the French Revolution in 1789, and ending with the fall of the Berlin Wall in 1989.[9] While many Enlightenment thinkers did not completely reject belief in God, they banished him to the remotest of the transcendent. If God did exist, he was neither concerned with, nor involved in, his creation. Reason and science were now the objects of worship, and redemption for mankind was to be found in their study and application. Modern worldviews such as positivism sought to unify the sciences, and order human life by finding the basic paradigm to explain human nature.[10] Secular Humanism, with its strong emphasis on the autonomy of the individual and the primacy of the intellect, sought to cure society's ills, such as racism and poverty, by education and technology.[11] Certain knowledge of ourselves and the world was possible, according to modernity, because nature was seen as a closed, static system of natural laws waiting to be discovered. The only differences among the majority of modernity's worldviews were what that truth was, and how it was known.[12] Unlike premodernism before it, modernism, by and large, rejected the supernatural. The rational man did not need to trust in anything beyond logic and normal sense experience.
In Biblical criticism, for example, this was the presupposition for the higher critical
schools of interpretation.[13] Belief in miracles, the incarnation, and other supernatural doctrines was rejected out of hand. Modernist scholars sought to "demythologize" the Bible and free it from the superstitious shackles that had bound it for so long. As Diogenes Allen observes:

In time some went so far as to claim that the Bible was not needed at all. It was useful to the human race in its infancy. But now that we have achieved enlightenment, we can read the book of nature and avoid all the blemishes, distortions, and absurdities that are found in the Bible.[14]

Eventually this new "naturalistic" religion removed God from the picture altogether, and attempted to produce a just and egalitarian social order that would embody reason and social progress.[15]

However, as it turned out, modernity didn't produce the harmony that its prophets predicted. After slavery, two world wars, communism, Nazism and nuclear bombs, people began to question the belief that the pursuits of reason, technology and science would make for a better world. Likewise the notion that nature is inherently orderly, governed by fixed, natural laws, had come under strong scrutiny. In recent years a number of scholars have begun to question the idea of absolutes in science[16] and logic,[17] and have become more convinced that nature seems to be inherently disorderly and elusive.[18] In addition to this, the idea that man is simply an unbiased observer of nature has been criticized:

The mind is not the passive reflector of an external world and intrinsic order, but is active and creative in the process of perception and cognition. Reality is in some sense constructed by the mind, not simply perceived by it, and many such constructions are possible, none necessarily sovereign. (emphasis mine).[19]

Modernity's idea that man is simply a uniform product of nature was dying fast. The presuppositions of modernity meant a reduction of the human condition to logic and scientific method. There was no human spirit; man was simply the result of a chance, random assimilation of atoms, subject to the laws of nature in a closed universe. Freedom was an illusion; determinism was reality, and there was no way to account for the complexity of man's immaterial tendencies other than that it was, somehow, merely a biochemical response.[20]

Modernism had delivered just the opposite of what it promised. Its promises of liberation turned out to be masks for oppression and domination. This has been termed the "dialectic of the Enlightenment."[21] By removing God to the transcendent (and then doing away with Him altogether), and enthroning reason and science, man was now free to do all of the unrestrained evil he was capable of--all in the name of scientific progress. What was intended to liberate man had now become his prison. Modernity, like premodernity before it, was now vulnerable. Reason and technology were not messiahs, and the human spirit was still striving for its freedom and autonomy. The ground was very fertile for a new way of looking at things. Postmodernism was ready to arrive.

It should be said that while premodernity and modernity dominated the intellectual landscape of their time periods, they did not go unchallenged. The roots of modernity can easily be seen in the Renaissance thinkers' desire to return to the humanism of the Greeks, negate the vertical relationship to God, and emphasize the horizontal relationship with man.[22] Likewise, modernity's voices of dissent would become the seeds of the postmodern. Romanticism, with its roots in Kant, rejected the pure rationalism and empiricism of the Enlightenment, and emphasized the power of the imagination and the reality of the transcendental. But the strongest, and perhaps the most influential, reaction to modernism came in existentialism. Here all absolute meaning was called into question. The quest for any ultimate meaning was seen as a fool's errand. Nature was not ultimately ordered, and reason was not a guide. Both of these influential movements are at the roots of postmodernism.[23]

Postmodernism
Stephen Connor says that the "concept of postmodernism cannot be said to have crystallized until about the mid-1970's . . ."[24] Modernity had received some strong criticism, and it was becoming more and more tenable to assert that the postmodern had come to stay, but it took some time before scholarship really jumped on the bandwagon.

At this point it is important to distinguish between postmodern and postmodernism. Postmodern refers to a period of time, whereas postmodernism refers to a distinct ideology. As Veith points out, "If the modern era is over, we are all postmodern, even though we reject the tenets of postmodernism."[25]

So exactly what is postmodernism? The situation is profoundly complex and ambiguous. But basically speaking, postmodernism is anti-foundationalism, or anti-worldview.[26] It denies the existence of any universal truth or standards. Jean-Francois Lyotard, perhaps the most influential writer in postmodern thought, defines postmodernism as "incredulity towards metanarratives."[27] For all intents and purposes, a metanarrative is a worldview: "a network of elementary assumptions . . . in terms of which every aspect of our experience and knowledge is interrelated and interpreted."[28] Metanarratives are, according to postmodernist scholar Patricia Waugh, "Large-scale theoretical interpretations purportedly of universal application."[29]

The postmodernist, it would seem, would tolerate a coherent worldview so long as it is kept from being asserted as universal in its application. This is not the case, though. The goal, so to speak, of postmodernism is to reject not only metanarratives but also the belief in coherence itself. Not only is any worldview which sees itself as foundational for all others oppressive,[30] the belief that one may even have a coherent worldview is rejected as well. Nevertheless, there are many worldviews around today, and the postmodernist finds it to be his responsibility to critique, or "deconstruct" as they call it,[31] such worldviews and "flatten them out," so to speak, so that no one particular approach or belief is more "true" than any other. What constitutes truth, then, is relative to the individual or community holding the belief.

As we have seen, for the postmodern thinker, there are no absolute truths or foundations to work from. Properly speaking, then, postmodernism is not a worldview per se; it does not attempt to construct a model or paradigm that orders reality. Reality eludes attempts at conformity for the postmodernist, and so he deconstructs all attempts at creating such absolute foundations. Modernity and Christianity debated as to which view was true; postmodernism attacks both Christianity and modernity because they claim to be "true." Christianity affirms certain necessary beliefs that must be assumed in order to make sense out of the world (e.g., that the triune God exists, that he is both transcendent and immanent, that the Bible is his Word). Postmodernism rejects the idea that reality makes sense in any absolute fashion, and reduces any construction to personal or cultural bias. Truth is a social construct, pragmatically justified, making it one of many culturally conditioned approaches to the world. Postmodernism, then, is not so much an orthodoxy (a positive belief system or worldview) as it is an orthopraxy (a series of methods for analysis).

In continuing to remove the possibility of any ultimate knowledge, postmodernism confuses the traditional distinction between the subject of knowledge (the knower) and the object of knowledge (the thing being known). Man does not sit back and passively receive knowledge about the world; rather, man's interpretation is, ultimately, the way the world actually is, as it is revealed to him, or to a culture. This confusion of subject and object has earned postmodernism the labels of nihilism and relativism.[32] Logic, science, history, and morality are not universal and absolute; they are the constructs of our own experience and interpretations of that experience.

Why do the postmodernists draw these conclusions? As we saw above, the idea that reality was orderly and that man was simply a passive observer was called into question. Kant's "Copernican Revolution" in philosophy argued that the mind "brings something to the objects it experiences . . . The mind imposes its way of knowing upon its objects."[33] It is the object that conforms to the mind, not the mind to the object. It would seem, then, that reality is what we perceive it to be. Charles Mackenzie observes:

If in knowing an object the human mind virtually creates knowledge, the question has been raised then, What is the external world when it is not being perceived? Kant replied that we cannot know a thing-in-itself (ding an sich). The world, as it exists apart from our experience, is unknowable. (emphasis mine).[34]

As such, reality as it really is, is unknowable. The "thing in itself" cannot be known. The only thing that can be known is our personal experience and our interpretation of that experience. Since each person's experience is all that can be known, it cannot be concluded that man can know anything in any absolute sense. All one has is his own finite, limited experience. Logic, science, history, and ethics are human disciplines that must, and do, reflect human insufficiency and subjectivity.

Another reason the postmodernists draw these conclusions is that the existentialists, with their rejection of rationalism and empiricism, focused philosophy on the human experience, especially as it is communicated through language. Language is the way man expresses these experiences of the world; therefore, to understand the world, as best we can, we must look to what is said about reality.[35]
But subjectivism is all we can have, since the best we can do is experience and interpret what others have experienced and interpreted reality to be, and so the spiral continues downward. Thus, for the postmodernists, any assertion of absolute knowledge is seriously questioned and ultimately rejected. History, therefore, is seen as a series of metaphors rather than an account of events as they actually happened. After all, the one recording the events was writing them down as he saw them; someone else might have seen them differently had he been there. In issues of morality, no one particular view is seen as foundational. Rather, each culture's, and ultimately each individual's, view on ethics is just as valid as the next. This view is the basis for the assumptions of "Multiculturalism" and the "Political Correctness" movement in today's society. Rather than affirming any one morality as absolute, every person's moral persuasion is to be respected no matter what it is, and language must be revised so as not to favor any one outlook and thus offend another.

Critique
To be sure, modernity's assertions that logic and science alone are certain methods for acquiring truth, and that man is a passive "subject" of knowledge, were wrong. One cannot, in light of the developments in the philosophy of logic, science, and ethics, conclude any longer that, humanly speaking, these are unquestionable, uniform disciplines for man to simply fall in line with. This extreme ought to be rejected. But postmodernism, with its rejection of modernity's claims, pushes another extreme. With its quasi-nihilist rejection of any and all forms of "foundations" or "absolute truth," it is itself a position not beyond question. On what basis ought the postmodernist view be taken as true? Is its affirmation that absolute truth is impossible itself absolute? In other words, how can the postmodernist claim that his way of looking at things is "true" without constructing some kind of metanarrative? As James Harris points out:

What if some member of the heteromorphous group insists that Lyotard prove his claims to the satisfaction of the members of the group? And what if the members of that group refuse to admit the reasonableness of Lyotard's claim and treat it as a "paralogical" metanarrative?[36]

All the while denying that it is a worldview, postmodernism is in effect a worldview. It is not just an orthopraxy, for even an orthopraxy must have a view on reality, knowledge, and morality in order to discern and justify its methods. It too is riddled with assumptions, all in need of as much scrutiny and evaluation as any other worldview. While attempting to do away with totalizing discourse and belief, the postmodernist must absolutize his claims to get his system going. This kind of extreme relativism is impossible; it affirms what it denies.

Likewise, if language is all there is to reality, and all interpretation is subjective, then why do postmodernists write books? Why believe that there is any possible way to communicate the ideas of postmodernism? How do we in fact know that the reader's interpretation was the author's intent? This view of language, then, becomes the prison house of postmodern thought. In other words, how does the postmodernist get beyond deconstructing deconstructionism?

On a societal note, postmodernism, while it tries to enhance understanding of the diversity among people, actually creates a new tribalism. Multiculturalism says that the traditional idea of America, for example, as an assimilation of cultures is false. America is not a "melting pot"; it is more like a "salad bowl."[37] So education, morality, politics, etc., are defined by cultural interests. History, for example, is no longer an acquisition of knowledge of past events; rather, it is revised so as to enhance the self-image of a particular group that has been excluded or "oppressed." As Veith observes:

Contemporary scholars seek to dismantle the paradigms of the past and "to bring the marginal into the center" (rewriting history in favor of those who have been excluded from power--women, homosexuals, blacks, Native Americans, and other victims of oppression). Scholars attack received ideas with withering skepticism, while constructing new models as alternatives. Those who celebrate the achievements of Western civilization are accused of narrow-minded "Euro-centrism"; this view is challenged by "Afro-centrism," which exalts Africa as the pinnacle of civilization. Male-dominant thought is replaced by feminist models. "Patriarchal religions" such as Judaism and Christianity are challenged and replaced with matriarchal religions; the influence of the Bible is countered by the influence of "goddess-worship." Homosexuality is no longer considered a psychological problem; rather, homophobia is.[38]

It does not matter what actually happened; that is impossible to know and as such is irrelevant. Accuracy is not the desire of postmodernism; power is. Remember, postmodernism rejects the idea of any universal truth, whether in history or logic.

Perhaps the central prophet of the postmodern condition was Friedrich Nietzsche (1844-1900). Nietzsche anticipated the emerging nihilism in Western culture. Life is absurd, according to Nietzsche. There is no truth, no value, no concern. All that is left is "The Will to Power," which is more than just a will to survive; it is an inner drive to express a vigorous affirmation of all of man's powers.[39] For postmodernism, as for Nietzsche, there is no ultimate meaning, and each individual or group of individuals must exercise their will to overcome the oppression of others.

The irony here is that while postmodernism rejects any ultimate morality, and affirms the primacy of power, it sees oppression as a "bad" thing.[40] The belief that "One ought not oppress others" is itself an ethical judgement. Again, on the postmodernist view of things, why not? Why should there be any attempt to correct the injustices of the past, so to speak? And if morality is ultimately relative to cultures, then what is the basis for the postmodern multiculturalist abhorring all of the things it tries to correct (e.g., oppression, injustice, patriarchal practices, totalizing, absoluteness, etc.)? After all, there are no proscriptions, or metanarratives, to prohibit anyone from doing anything. It is a dialectical tension between reactification and "anything goes."[41] As Gertrude Himmelfarb points out, postmodern multiculturalism has the pernicious effect

to demean and dehumanize the people who are the subjects of history. To pluralize and particularize history to the point where people have no history in common is to deny the common humanity of all people, whatever their sex, race, class, religion.[42]

In effect, everyone is so disconnected, so convinced that their way of looking at things is true for them, that there is no room for common discourse or any reason for understanding differences. In short, postmodern multiculturalism, while trying to raise awareness of diversity, exalts it to the point that, ultimately, it destroys what it sets out to do.

Conclusion: Is Postmodernism all bad?


Irving Kristol, a fellow at the American Enterprise Institute, describes the current time as "a shaking of the foundations of the modern world."[43] Allen says:

A massive intellectual revolution is taking place that is perhaps as great as that which marked off the modern world from the Middle Ages . . . The principles forged during the Enlightenment . . . which formed the foundations of the modern mentality, are crumbling.[44]

The collapse of Enlightenment Humanism is imminent, and the attacks on it come from all angles. From religious conservatives to scientific liberals, the desire to overhaul the presuppositions of modernity is a shared goal, although the motives differ greatly. Christians welcome the opportunity for credible public discourse concerning their faith, and many scientists are eager to see a shift in scientific outlook that will account for the anomalies that modern science has avoided. These are exciting times, times when the church should be alert. In a postmodern world, Christianity is intellectually relevant.[45] With the demise of the absoluteness of human reason and science, the supernatural, that which is not empirical, is once again open to consideration. The marketplace of ideas is wide open, and opportunities abound. It is important that the church understand these important times in which it finds itself.

But in addition to opening the door once again to the Christian faith, postmodernism, with its critical apparatus, has a few lessons for the church to learn. Veith likens the current situation to that of the pagans at the Tower of Babel.[46] Genesis 11:1-9 tells us that at that time everyone, the whole earth, spoke the same language. As some were traveling east, they stopped in the valley of Shinar and decided to make a name for themselves by building a tower that would reach into heaven. When the Lord saw what they were doing he came down and destroyed the heart of their unity: language. As a result they were scattered over the earth, and had no way to communicate.

Modern, and to some extent premodern, man built his tower. Dependence on God was superseded by autonomous man's faith in reason and science. These elements bound man together and he built his tower, removed God (so he thought), and placed himself on the throne to be worshipped. What is interesting is that postmodernism strikes at the very same thing God did: language. Without language, logic and science are meaningless; they have no application. As we have seen, it is each man for himself in his own private world. The arrogant, pseudo-unity that man had claimed to find was now just one of the many ways of looking at things. Logic and science were now relative to cultural interpretation.[47] Like the people at the Tower of Babel, modern man has been fragmented and scattered. There is no center of discourse any longer.

In this light perhaps the most significant contribution of postmodernism is that it reminds us of our finitude. It reminds us that God is creator and we are his creation. It tells us that he must be the beginning of all of our thinking, that apart from him we could know nothing. Veith observes:

Without a belief in God . . . it would be difficult to avoid postmodernist conclusions . . . If there is no transcendent logos, then there can be no absolutes, no meaning apart from human culture, no way out of the prison house of language . . . Postmodernism may represent the dead-end--the implosion, the deconstruction--of attempts to do without God.[48]

It would not simply be difficult; it would be impossible. It is the fear of the Lord that is the beginning of knowledge (Proverbs 1:7), not the conclusion of our investigation. In Christ "are hidden all of the treasures of wisdom and knowledge" (Col. 2:3). This does not mean that we reject disciplines such as logic and science. Rather, we see them as tools to use to better understand God's amazing creation, not as ultimate standards that legislate what is possible and take the place of God's revelation. After all, the "gift of logical reason was given by God to man in order that he might order the revelation of God for himself."[49] Science is simply the study of God's creation, so that we might better understand how to care for it, advance in knowledge, and fulfill the Cultural Mandate.[50]

In the same way, postmodernism reminds us that theology is, like logic, not exhaustive, but a developing science; that there are many approaches to theology, none of which is exhaustive; and that it is the theologian's responsibility to examine carefully all propositions in accordance with God's Word, and press forward to better understand the revelation that God has given. In short, we are to think God's thoughts after him.

For our personal life, postmodernism shows us the futility of autonomy. It forces those of us who know Christ back to the basics of depending on Christ for everything, whether it is salvation or standards. In him we have meaning and purpose for our lives; he is the vine, we are the branches, and apart from him we can do nothing (John 15:5).

To sum it up, postmodernism need not be seen as a mortal enemy. In many ways it drives us back to complete and total dependence on God. It reminds us that he is the foundation for every area of life, whether it be logic or law. It shows us that there exist no neutral, impartial domains that we can lean on in addition to him. Postmodernism points out that we all have presuppositions, and that no one is unbiased. We all bring our assumptions to our experience; each fact about the world is theory-laden. The question then becomes, "Which presuppositions are true?" The answer is clear: the Christian worldview is true. It alone is the escape from subjective nihilism, for it alone provides the necessary foundations to make the facts intelligible. This being the case, the Christian is able to glean what is good from postmodernism, and reject the extremes.

Diogenes Allen argues that "Christian theology has yet to become postmodern."[51] It is still plagued by modern rationalism on one end and premodern fundamentalism on the other.

The modernist theologian continues to jettison doctrines which he deems unscientific or irrational, and the premodern theologian refuses to allow his doctrine to engage the world. A postmodern theology is a worldview theology. It critiques rationalism by demonstrating the impossibility of reason apart from presupposing God; it rejects fideism and seclusion by demonstrating that God is the basis for rationality and that his truth speaks to all of life. Are the Christian churches ready to meet the challenge? As Allen says, "They have within their heritage immensely powerful ideas, not to mention a living Lord."[52] Will the church embrace this and engage the world? It must, sooner or later.

Notes
1. This essay was written by Marty Fields as a contribution to War of the Worldviews. This is the original essay; the forthcoming chapter has been edited for a more popular audience.

2. David F. Wells, God in the Wasteland: The Reality of Truth in a World of Fading Dreams (Grand Rapids: Eerdmans, 1994), p. 48.

3. George Barna, Absolute Confusion: How Our Moral and Spiritual Foundations Are Eroding in This Age of Change (Ventura, CA: Regal Books, 1993).

4. A detailed examination of all of the elements of postmodernism is beyond the scope of this essay. For a more detailed examination of the many issues raised by postmodern thought, see Stephen Best and Douglas Kellner, Postmodern Theory: Critical Interrogations (New York: The Guilford Press, 1991); Stephen Connor, Postmodernist Culture: An Introduction to Theories of the Contemporary (Cambridge, MA: Basil Blackwell, 1989); Jean-Francois Lyotard, The Postmodern Condition: A Report on Knowledge, trans. Geoff Bennington and Brian Massumi (Manchester: Manchester University Press, 1984). From a Christian point of view see Gene Edward Veith, Postmodern Times: A Christian Guide to Contemporary Thought and Culture (Wheaton, IL: Crossway Books, 1994).

5. Gene Edward Veith, Postmodern Times: A Christian Guide to Contemporary Thought and Culture (Wheaton, IL: Crossway Books, 1994), p. 19. This is more fully explained and documented in Richard Tarnas' excellent book The Passion of the Western Mind: Understanding the Ideas That Have Shaped Our World View (New York: Harmony Books, 1991).

6. Diogenes Allen sees theology prior to Hume and Kant as premodern, and the nineteenth-century attempts to deal with the works of Hume and Kant as modern. Interestingly, he sees postmodern theology as being classified into four areas: confessional theology, which was the immediate reaction to nineteenth-century liberalism (he sees Karl Barth as the principal proponent of this view); existentialist-hermeneutical theology, which is seen in the works of Heidegger; deconstructionist theology, which is indebted to the works of Jacques Derrida; and process theology, which is reflected in the works of A. N. Whitehead and Charles Hartshorne. The main body of postmodern thought, however, would be more localized in the traditions of existentialism and deconstructionism. (See Diogenes Allen, Christian Belief in a Postmodern World: The Full Wealth of Conviction (Louisville, KY: Westminster/John Knox, 1989), p. 6.)

7. Veith, Postmodern Times, p. 29.

8. See William H. Halverson, A Concise Introduction to Philosophy, 4th ed. (New York: McGraw-Hill Publishers, 1967 [1981]), p. 413ff. Halverson points out that the study of philosophy encompasses all other disciplines. It seeks to create the concepts which will unify knowledge, and provide a foundation for coherence. The history of philosophy, then, is the search for an adequate worldview which will make sense of our experience of the world. As we will see, this search for the unity of knowledge is abandoned in postmodernism.

9. Thomas C. Oden, Two Worlds: Notes on the Death of Modernity in America and Russia (Downers Grove, IL: InterVarsity Press, 1992), p. 32.

10. For a detailed account of positivism see Michael Corrado, The Analytic Tradition in Philosophy: Background and Issues (Chicago: American Library Association, 1975).

11. See Gary Scott Smith, "Naturalistic Humanism," in Building a Christian Worldview, vol. 1: God, Man, and Knowledge, W. Andrew Hoffecker and Gary Scott Smith, eds. (Phillipsburg, NJ: Presbyterian and Reformed, 1986), pp. 161-181.

12. Some theories of knowledge stressed the importance of coherence when it comes to matters of truth. Truth is what conforms to the laws of logic. This worldview belief is known as Rationalism. For others, emphasis was placed not so much on coherence as on correspondence. What is true is what corresponds to the world we experience. This is known as Empiricism. For a thorough study of these issues see V. James Mannoia, "Rationalism and Empiricism," in Building a Christian Worldview, vol. 1: God, Man, and Knowledge, W. Andrew Hoffecker and Gary Scott Smith, eds. (Phillipsburg, NJ: Presbyterian and Reformed, 1986), pp. 261-277.

13. See E. J. Young, An Introduction to the Old Testament (Grand Rapids: Eerdmans Publishing Co., 1949 [1977]), pp. 123ff. In responding to the charge that Christianity and reason are at odds with each other, Young says, "Christianity and reason, of course, are not enemies, for Christianity is the only reasonable explanation of life, and true reason, which is derived from God, is both humble and receptive." This view of reason will be discussed later.

14. Diogenes Allen, Christian Belief in a Postmodern World, p. 36. Allen cites Kant as one of those who eventually lost faith in the so-called "Book of Nature." This moved Kant to replace it with the moral argument for God's existence in his Critique of Practical Reason.

15. Stephen Best and Douglas Kellner, Postmodern Theory: Critical Interrogations (New York: The Guilford Press, 1991), p. 2.

16. See Thomas Kuhn, The Structure of Scientific Revolutions (Chicago: The University of Chicago Press, 1962 [1970]). Kuhn reviews the history of science and argues that what is called "science" is not an invariant approach to the world. Scientists work in terms of paradigms, or models, as they approach the data. These paradigms, or worldviews, are comprehensive. They are not derived by simply looking at the "facts." Rather, they are networks of presuppositions in terms of which the facts of experience are interpreted. They reflect science's biases concerning the nature of reality and knowledge. As long as a paradigm "works," it is followed, until the anomalies which seem to contradict the existing paradigm accumulate to the point of causing a shift, or as Kuhn called it, a "revolution," in which the old paradigm is abandoned for one which comports with the data more effectively.

17. The denial of any objective and universal logic is seen in Willard Van Orman Quine's influential essay, "Two Dogmas of Empiricism," in From a Logical Point of View (Cambridge, MA: Harvard University Press, 1953 [1980]), pp. 20-46. Quine argues that there are no laws of logic absolutely; there are only laws of logic for a belief system or a language.

18. With the developments in Einstein's relativity, Bohr's quantum mechanics, and Heisenberg's "Uncertainty Principle," the strict Newtonian determinism in physics was called into question. Subatomic particles did not seem to follow the physical patterns of their constructs. At times they behaved like particles, and at other times they behaved like waves. In Kuhnian terms, "normal science" was not able to account for these anomalies. In the words of Sir James Jeans, the physical world of twentieth-century physics did not look so much like a great machine as it did a great thought. (See Richard Tarnas, The Passion of the Western Mind, p. 356.)

19. Tarnas, The Passion of the Western Mind, p. 396. See also Sir James Jeans, Physics and Philosophy (New York: Dover Publications, 1942 [1981]), p. 143. Jeans points out that the theory of relativity makes pure objectivity in science impossible. Since each subject and object are moving, it is impossible to extrapolate any impartial knowledge. The notion that the mind constructs reality is a central tenet of postmodernism, and will be discussed later.

20. This process of the reduction of man is described in excellent detail in William Barrett's Death of the Soul: From Descartes to the Computer (New York: Anchor/Doubleday, 1986).

21. Best and Kellner, Postmodern Theory, p. 3.

22. See W. T. Jones, A History of Western Philosophy, vol. III: Hobbes to Hume (New York: Harcourt Brace Jovanovich, 1952 [1969]), p. 103.

23. It should also be pointed out that, ironically, some of the strongest reinforcement for the postmodern worldview came from Analytic philosophy, the movement which developed positivism. Analytic philosophy stresses the importance of logic and the analysis of language in dealing with philosophical issues. The aforementioned logician Willard Van Orman Quine comes from this tradition. Likewise, perhaps the greatest Analytic philosopher, Ludwig Wittgenstein, in his later works concluded that there exists no objective, "ideal" language upon which all men agree. Rather, "meaning" is relative to a particular community's "language games" (as he called them). There exists no objective basis for why we interpret language the way we do. As he said, "This is simply what I do." (See W. T. Jones, A History of Western Philosophy, vol. V: The Twentieth Century to Wittgenstein and Sartre (New York: Harcourt Brace Jovanovich College Publishers, 1952 [1975]), pp. 367ff.) Lyotard describes the late work of Wittgenstein as an "epilogue to modernity and a prologue to an honorable postmodernity." (See Best and Kellner, Postmodern Theory, p. 168.)

24. Stephen Connor, Postmodernist Culture: An Introduction to Theories of the Contemporary (Cambridge, MA: Basil Blackwell, 1989), p. 6. This is not to be seen as in conflict with Thomas Oden's dating of the modern period. Postmodernism was becoming more and more concrete, but modernism was still flourishing. It was the fall of communism in 1989 that drove the final nail into the coffin of the philosophy of modernity.

25. Veith, Postmodern Times, p. 42.

26. Idem, p. 49.

27. Jean-Francois Lyotard, The Postmodern Condition: A Report on Knowledge, trans. G. Bennington and B. Massumi (Minneapolis, MN: University of Minnesota Press, 1984), pp. xxiv, 34.

28. Gary Demar, Surviving College Successfully (Brentwood, TN: Wolgemuth and Hyatt, 1988), p. 8.

29. Patricia Waugh, Postmodernism: A Reader (London: Edward Arnold Publishers, 1992), p. 1.

30. The idea of oppression drives much of postmodernist discourse. The primacy of autonomy is central. Worldviews are absolute foundations, and are, inevitably, exclusive.

31. Deconstructionism is the preferred method of postmodernism in dealing with ideas and constructs. For the postmodernist, language is the encapsulator of reality. All meaning is seen as socially constructed in language. These assumptions make "meaning" relative to the author and the reader. Any narrative which is viewed as ultimate is deconstructed so as to show the subjectivity of the claim. Veith says this is why postmodernism developed out of literary criticism, and not traditional philosophy. (See Veith, Postmodern Times, p. 51.)

32. Nihilism is the view that human existence is totally and irremediably meaningless, that nothing in the world has any value. This is perhaps seen the clearest in the works of Friedrich Nietzsche. (See Halverson, A Concise Introduction to Philosophy, pp. 448, 457-462.)

33. See Samuel Enoch Stumpf, Socrates to Sartre: A History of Philosophy, 3rd ed. (New York: McGraw-Hill Book Company, 1982), pp. 296-299. This particular position is a type of phenomenalism. All that we can know is our experiences of the objects. Our pre-interpretation of these objects prohibits us from having any pure and unbiased knowledge of the objects themselves. As such, reality can be concluded, as it is by postmodernists, to be a social construct.

34. Charles Mackenzie, "Kant's Copernican Revolution," in Building a Christian Worldview, vol. 1: God, Man, and Knowledge, W. Andrew Hoffecker and Gary Scott Smith, eds. (Phillipsburg, NJ: Presbyterian and Reformed, 1986), p. 284.

35. Martin Heidegger, perhaps the most influential existentialist for postmodern thought, concluded that "reality," or "existence," is so elusive that not even language can name an object. In his essay "A Dialogue on Language," Heidegger argued that there is no way to formulate an adequate language to transcend human subjectivity. All that is left is silence, for no two people can be sure that they have the same thing in mind. Heidegger concluded that "Language is the house of being." Language cannot be separated from reality. As W. T. Jones puts it, "Unless one is content to achieve a mystical contact with reality, one must conclude that the phenomenological route out of the Kantian paradigm has reached a dead end in Heidegger." (A History of Western Philosophy, vol. V: The Twentieth Century to Wittgenstein and Sartre, p. 331.)

36. James F. Harris, Against Relativism: A Philosophical Defense of Method (La Salle, IL: Open Court, 1992), p. 118.

37. I have heard some respond to this analogy by saying that even a salad must have dressing over it!

38. Veith, Postmodern Times, p. 57.

39. Stumpf, Socrates to Sartre, p. 360. Colin Brown states that for Nietzsche, the starting point is the non-existence of God. While he liked the Christian morality, he detested the Christian idea of God. If there is a God, then man cannot be free. The Will to Power demands that man's freedom be ultimate. (See Colin Brown, Philosophy and the Christian Faith (Downers Grove, IL: InterVarsity Press, 1968), pp. 137-141.)

40. See Best and Kellner, Postmodern Theory, p. 292. The authors state that the postmodern critique of "macrotheory" and "traditional politics" turns out to be "just as one-sided and dogmatic as the modern theories they oppose."

41. Connor, Postmodernist Culture, p. 227. As he puts it, "The problem for a postmodern politics, then, is this dual prospect, on the one hand of a transformation of history by a sheer act of imaginative will, and on the other, of an absolute weightlessness, in which anything is imaginatively possible," because nothing really matters. (emphasis mine).

42. Gertrude Himmelfarb, On Looking into the Abyss: Untimely Thoughts on Culture and Society (New York: Alfred A. Knopf, 1994), p. 154.

43. Dennis Farney, "Natural Questions," in The Wall Street Journal, Monday, July 11, 1994, p. A4.

44. Diogenes Allen, Christian Belief in a Postmodern World, p. 2.

45. Idem, p. 5.

46. Veith, Postmodern Times, pp. 20-23.

47. For an extensive account of the insufficiency of autonomous science and logic, and their diversity, see Greg L. Bahnsen, "Science, Subjectivity, and Scripture" (available from Covenant Tape Ministry, 22005 N. Venado Dr., Sun City West, AZ 85375).

48. Veith, Postmodern Times, p. 68.

49. Cornelius Van Til, An Introduction to Systematic Theology (Philadelphia: den Dulk Foundation, 1974), p. 256.

50. Genesis 1:28.

51. Allen, Christian Belief in a Postmodern World, p. 6.

52. Idem, p. 8.

D. Martin Fields is a Church Planter in Wilmington Island, GA. This essay is used with permission. Copyright 1995 by PREMISE. All Rights Reserved.


THE POSTMODERN PARADIGM


Brent G. Wilson
University of Colorado at Denver

To appear in C. R. Dills and A. A. Romiszowski (Eds.), Instructional development paradigms. Englewood Cliffs, NJ: Educational Technology Publications, in press (to be published in March 1997).
Also available at: http://www.cudenver.edu/~bwilson
To order a copy of the forthcoming book, call 1-800-952-BOOK.

Abstract

The constructivist movement is changing the way many of us think about instructional design (ID), but postmodern critics of educational technology are still often seen as too radical, too iconoclastic. Streibel (1986), for example, offers a devastating critique of computers in education that makes many educational technologists feel uncomfortable. Computers are our stock in trade, after all. Other postmodern writers offer critiques of practice, but relatively few directly address the interests of instructional designers. This paper suggests that (1) postmodern perspectives about the world underlie much constructivist writing, and (2) a postmodern stance can offer positive, constructive critiques of ID practice. After a brief introduction to postmodern ideas, a set of recommendations is offered for changing ID practice.

For more than ten years, a small clique of postmodern researchers and theorists has existed within the Association for Educational Communications and Technology (AECT). For years, they behaved like a small, persecuted minority--a "cult" of sorts. They complained that journal editors were biased, ignorant, and unwilling to publish their radical writings. They struggled to have AECT papers and symposia accepted on the program. The main forum for the postmodern clique was an annual "foundations symposium," which year by year found its way onto AECT's program. I have attended these symposia for the last several years, and have noticed two things. First, the crowds are getting bigger and seemingly better informed. Second, I have noticed a change in the presenters. I see less defensiveness and fewer signs of being persecuted. Instead, I see a growing maturity of perspective and a growing confidence that a postmodern perspective has something hopeful and positive to say to our field.

It is in that same spirit of hopefulness and honesty that I approach this chapter. I am not a member of the postmodern clique. I am an instructional designer--a moniker unpopular in many postmodern circles. But I approach the task of articulating postmodernism with a belief that there are some worthwhile ideas here, and that the field of ID can be improved by listening closely to "alternate voices" currently abounding in our field.

Three recent publications symbolize the growing acceptance of postmodern thinking within educational technology:

--Dennis Hlynka and Andrew Yeaman prepared a carefully written two-page digest of postmodern thinking for publication as an ERIC Digest (Hlynka and Yeaman, 1992). This is the first source I would recommend for instructional designers interested in a brief and clear introduction to postmodern thinking.

--In 1992, Educational Technology Publications published a collection of postmodern writing edited by Dennis Hlynka and John Belland, titled Paradigms regained: The uses of illuminative, semiotic, and post-modern criticism as modes of inquiry in educational technology: A book of readings. This book serves as a valuable resource for educational technologists in search of alternative perspectives for interpreting their field.

--The March 1994 issue of Educational Technology was devoted to postmodern topics. The issue again made postmodernism more visible within the educational technology community, but also included some real dialogue, spurred by Barbara Martin's (1994) call for better communications between postmodern critics and the educational technology community.

The purpose of this chapter is to provide a short guided tour of postmodern thinking for practicing instructional designers raised in the "old school" of Gagné, Briggs, and Merrill. I will assume that you have been exposed to some measure of constructivist thinking, yet postmodern philosophy remains a mystery.

To help make a transition to postmodern ways of thinking, the second half of the chapter offers a set of recommendations for doing traditional ID steps in ways more sensitive to postmodern perspectives. Also at the outset, please remember that labels such as "constructivist" or "postmodern" embrace a whole range of ideas and methods. This chapter is my best shot at elucidating postmodern philosophy for an ID audience, yet I approach the task as an admiring outsider, not really an expert. What I can bring to the discussion is my understanding of instructional designers and their preconceptions. The next step for any reader would be to consult original sources--either the educational technologists referred to above, or the postmodern philosophers and critics they rely upon in their writing.

An Introduction to Postmodern Thinking

I have decided that the best way to provide a conceptual overview is to tell a simple story. This story is not true, but it has some truth in it. It is meant to serve as a scaffold for making sense out of the word 'postmodern.'

A Story about Worldviews

The ancient worldview. In many ways, the ancients of Greece and Rome were a lot like us. They faced some of the same questions we face now--namely: How is it that we know things? How can we get at the truth? How is the world made up? The ancients recognized that appearances can be deceiving--that what looks reliable and stable on the surface may actually be in flux and changing. How can we get at the way things really are? To address this problem, the ancients differentiated between the world that we see with our eyes and the "real" world, which was perfect, whole, and divine. The divine, in fact, was what made it possible for us to catch glimpses of the "real," idealized world. Left to our own inclinations, we see imperfection, weakness, and lots of jagged edges. With the help of divine logic and mathematics, the jagged edges become smooth, and the perfect thing-behind-the-thing is made manifest to us. Concepts are divine revelations of the way the world really is--our everyday usage of "ideas" stems from the ideal forms sought by the ancients.

The modern worldview. The ancient view of things dominated our thinking for many years, in fact through the Medieval Era. Beginning with the Renaissance, however, we gradually shifted our focus. Taught to look to God for truth--and for God in the Church and in received texts--many bright thinkers instead started to believe their own eyes and faculties. Rather than God assuming the central role in the universe, man himself became the standard for judging the truth of things. Man's intellect was capable of discerning truth from error. Certain defined methods for discovering truth and evaluating evidence came to be considered reliable and sufficient for gaining access to the "truth." Superstition and tradition were replaced by rationality and the scientific method. Technology and the progress of science would signal a corresponding progress in society, until man perfected himself and controlled nature through his knowledge and tools. Still, philosophers troubled themselves over the same question: how do we know the truth?

Kant realized that we will never really get at the way things really are, but that we can get pretty close--we create schemas in our mind that roughly match up with how things are. The word 'phenomenon' comes from Kant, and means essentially "close to the real thing." Over the years, however, it became clear to philosophers that there remained an insurmountable gulf between ourselves and the truth. We live in a specific time and place, conditioned by a particular culture and set of experiences. Without God to connect us to the truth, how can we get there? How can we transcend our limitations and reach beyond ourselves to the way things really are? These are tough questions that have not gone away through the ages.

The postmodern worldview. 'Postmodernism', as the term implies, is largely a response to modernity. Whereas modernity trusted science to lead us down the road of progress, postmodernism questioned whether science alone could really get us there. Whereas modernity happily created inventions and technologies to improve our lives, postmodernism took a second look and wondered whether our lives were really better for all the gadgets and toys. Postmodernism looked at the culmination of modernity in the 20th century--the results of forces such as nationalism, totalitarianism, technocracy, consumerism, and modern warfare--and said, we can see the efficiency and the improvements, but we can also see the dehumanizing, mechanizing effects in our lives. The Holocaust was efficient, technical, coldly rational. There must be a better way to think about things.

So what about the age-old questions about truth and knowledge? A postmodernist might say, "Truth is what people agree on," or "Truth is what works," or "Hey, there is no Truth, only lots of little 'truths' running around out there!" Postmodernists tend to reject the idealized view of Truth inherited from the ancients and replace it with a dynamic, changing truth bounded by time, space, and perspective. Rather than seeking the unchanging ideal, postmodernists tend to celebrate the dynamic diversity of life.

In their ERIC Digest, Hlynka and Yeaman (1992) outline some key features of postmodern thinking (liberally paraphrased for simplicity):

1. A commitment to plurality of perspectives, meanings, methods, values--everything!

2. A search for and appreciation of double meanings and alternative interpretations, many of them ironic and unintended.

3. A critique or distrust of Big Stories meant to explain everything. This includes grand theories of science, and myths in our religions, nations, cultures, and professions that serve to explain why things are the way they are.

4. An acknowledgment that--because there is a plurality of perspectives and ways of knowing--there are also multiple truths.

In a lovely section, Hlynka and Yeaman (1992) suggest (ironically!) four easy steps to becoming a postmodernist:

1. Consider concepts, ideas and objects as texts. Textual meanings are open to interpretation.

2. Look for binary oppositions in those texts. Some usual oppositions are good/bad, progress/tradition, science/myth, love/hate, man/woman, and truth/fiction.

3. "Deconstruct" the text by showing how the oppositions are not necessarily true.

4. Identify texts which are absent, groups who are not represented, and omissions, which may or may not be deliberate, but are important. (pp. 1-2)

Postmodern thinking grew out of the humanities tradition--philosophy, literary criticism, the arts. This helps to account for some of the misunderstandings that can occur between instructional designers and postmodern critics. As C. P. Snow argued in The Two Cultures (1969), people in science see things very differently than people in the humanities. The field of instructional design, evolving from behavioral psychology, systems technology, and management theory, sees the world through the "scientific" lens, whereas postmodernists tend to see things through a critical, humanities type of lens. The goal of an artist or critic is not so much to explain, predict, and control, but to create, appreciate, and interpret meanings. Over the years, postmodern approaches have expanded to encompass science, feminism, education, and the social sciences, but the orientation remains that of interpretation rather than prediction and control.

An Example of "Deconstruction": Conditions-of-Learning Models

As an illustrative exercise, I have attempted a postmodern deconstruction of traditional ID models. Conditions-of-learning or "CoL" models are the type of models we find in Reigeluth (1983b). Gagné, Briggs, Merrill, and Reigeluth are the classic "CoL" theorists. Wilson and Cole (1991) described the basic conditions-of-learning paradigm:

[CoL] models are based on Robert Gagné's conditions-of-learning paradigm (Gagné, 1966), which in its time was a significant departure from the Skinnerian operant conditioning paradigm dominant among American psychologists. The conditions-of-learning paradigm posits that a graded hierarchy of learning outcomes exists, and for each desired outcome, a set of conditions exists that leads to learning. Instructional design is a matter of clarifying intended learning outcomes, then matching up appropriate instructional strategies. The designer writes behaviorally specific learning objectives, classifies those objectives according to a taxonomy of learning types, then arranges the instructional conditions to fit the current instructional prescriptions. In this way, designers can design instruction to successfully teach a rule, a psychomotor skill, an attitude, or a piece of verbal information.

A related idea within the conditions-of-learning paradigm claims that sequencing of instruction should be based on a hierarchical progression from simple to complex learning outcomes. Gagné developed a technique of constructing learning hierarchies for analyzing skills: A skill is rationally decomposed into parts and sub-parts; then instruction is ordered from simple subskills to the complete skill. Elaboration theory uses content structure (concept, procedure, or principle) as the basis for organizing and sequencing instruction (Reigeluth, Merrill, Wilson, & Spiller, 1980). Both methods depend on task analysis to break down the goals of instruction, then on a method of sequencing proceeding from simple to gradually more complex and complete tasks. (p. 49)

The critique below is an edited revision of an e-mail post I sent to some author-friends who are writing a chapter on "conditions-of-learning" models; hence the especially informal tone. In spite of the informality, however, the concepts are rather abstract and difficult. If this section proves too confusing, please skip to the next section!

Conditions-of-learning (CoL) models rely on a number of assumptions and distinctions, including:

Description versus prescription. The precise stance of CoL models is somewhat ambiguous--are they "scientific" models or are they "engineering" procedures? In some ways CoL models are descriptive--"There are these kinds of learning outcomes, these kinds of strategies"--but descriptive only of highly artificial activities and structures (cf. Simon, 1983). CoL models rest on a loosely defined knowledge base--a little psychology, a little instructional research, a little systems theory, a little information theory. CoL models also serve a prescriptive function for ID, but in a strange sense. Because of their difficulty, they are more than simple "recipes" or "hooks" for the novice to use and then grow out of. They are kind of saying: "Instruction should be like this, so do it this way." To complement instructional systems development (ISD) models--which focus more on procedures and processes--CoL models focus more on the product, saying "Good instruction should look this way; go and do likewise."

Another way of looking at this question is to consider what defines good instruction:

1. Craft/process definition. Instruction made by jumping certain hoops. Instruction made in a certain way--following ISD steps--is good.

2. Empirical definition. Instruction that demonstrably results in targeted learning. This is an assessment-based definition. This is a pragmatic, commonsense approach--if it works, it's good.

3. Analytic/scientific definition. Instruction that has all the desired attributes. This is the product definition of goodness. The product incorporates effective principles, contains certain features, looks a certain way. You can tell by examining the product, rather than the process used to create it or its effect on learners. This approach is most characteristic of CoL models, defining good instruction in terms of its use of certain instructional strategies and components. If the lesson has an advance organizer, clear writing, lots of examples, lots of practice, etc.--then it is good instruction.

Orthogonal independence of content and method. This is analogous to Richard Clark's claims about media and strategy--that they're independent and crossable. I may learn a concept via examples or via a definition or via a bunch of practice. In each case, however, I've learned the same thing--the target concept. An alternative view (that would need some defense) would be that different strategies necessarily lead to qualitatively different outcomes, even if some of the behaviors exhibitable by the person may be the same. CoL models assume that there is a class of methods that fits a class of learning goals, and that I can reliably draw upon one in service of the other. But let's say your goal is to make a "Yale man" out of me. Can I accomplish those learning goals by attending Front Range Community College? Can I generalize the strategy used in one setting and replicate it in another setting? How transferable/generalizable are different "contents" and "methods"?

The "real" status of content and method. Trying to find "content" in the experiences of experts can be as hard as finding "method" in the experiences of teachers and designers. Where precisely is the content? Does it "exist" in the objectives list? In people's heads? Where is the "method"? Do I look it up in Charlie's books (Reigeluth, 1983, 1987)? Both content and method are rooted in the actual experience and practice of people engaged in instructional activities. Yet CoL models tend to treat textbook objectives and strategies as if they had a clear, unproblematic, unambiguous ontological status. I think that the challenge for designers is not so much in following the models properly, but in determining how a model relates to a practical situation. How can you make sense out of a CoL model when you encounter a messy real learning situation?

Instructional theory versus the practice of design. CoL approaches are all built on the conditions-goals-method framework that Charlie Reigeluth articulates in the "Green Book" (Reigeluth, 1983b): Depending on the conditions and your instructional goals, you "select" the appropriate instructional strategy to accomplish those goals. Such a view defines ID as adherence to a set of rules, and places the expertise or knowledge into the textbook--or the rule-based expert system. The advantages of this approach are that the knowledge can be codified, owned, controlled, and communicated unambiguously to others. Technician-level people can even do it, even if they don't really understand what they're doing, just following numbers. What an advance! The down side is that it doesn't work beyond a poor level of "output." Schön (1987) calls this aspect of practice "technical rationality." He doesn't deny its place; all disciplines have a technical component. But he says that's only a starting point for design or for professional practice. Technical rationality is the formal, abstract statement of theory that gets all the attention of the researcher but which utterly fails to "capture" the real expertise of the practitioners' culture. When David Merrill first attempted to convert his theories into expert systems, he found a whole new layer of problems and decisions he had previously ignored.
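To make the rule-following character of the conditions-goals-method framework concrete, here is a minimal, hypothetical sketch of what a naive "select the strategy from the goal type" lookup might look like. It is not drawn from Gagné's, Merrill's, or Reigeluth's actual taxonomies; the outcome types and prescriptions are invented placeholders.

# Hypothetical sketch of a conditions-goals-method lookup: classify the intended
# outcome, then read off a prescribed strategy sequence. The outcome types and
# prescriptions below are illustrative placeholders, not any published CoL taxonomy.

PRESCRIPTIONS = {
    "verbal_information": ["present the facts", "offer a mnemonic", "drill and practice"],
    "concept": ["give a definition", "show examples and non-examples", "practice classifying"],
    "rule": ["state the rule", "demonstrate its application", "practice with feedback"],
    "attitude": ["present a credible model", "show consequences", "reinforce the choice"],
}

def design_lesson(objective, outcome_type):
    """Return a strategy sequence for an objective, given its classified outcome type."""
    if outcome_type not in PRESCRIPTIONS:
        raise ValueError("No prescription for outcome type: " + outcome_type)
    # Note everything the lookup ignores: learner context, culture, motivation,
    # and the designer's own grasp of the subject matter -- the layer of
    # expertise this chapter argues such rule-following cannot capture.
    return [step + " (" + objective + ")" for step in PRESCRIPTIONS[outcome_type]]

if __name__ == "__main__":
    for step in design_lesson("identify a metaphor in a poem", "concept"):
        print(step)

The point of the sketch is not that such a table is wrong, but that everything interesting--deciding whether a lesson really targets a "concept," and adapting the prescription to a messy real situation--happens outside the lookup.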

I am saying that between expert systems and real life, there is yet again a whole huge layer of expertise, and that expert systems are inherently incapable of capturing it. Hence the chasm between theory and practice, between researcher and practitioner. The theorist takes seriously this formalism, this set of algorithmic rules for practice; the practitioner depends on a huge "bank" of additional knowledge and values--including how to use the technical rules--that accounts for successful practice. The situation is similar to research on cognitive strategies. Researchers (Butterfield & Belmont, 1975) found that retarded learners were perfectly capable of mastering the specific strategies--it was in knowing when and where to use those strategies, and how to adapt them to situations, that they failed. Our theories are like the strategy repertoires of retarded learners--of themselves they do not add up to true expertise because they are missing the intangible, unanalyzable ingredients that go into everyday cognition and decision-making.

Of course, the same criticism can be leveled at attempts to define content via standard objectives and task analyses. It can't be done. Over-reliance on objectives and analyses can easily lead to failed instruction for the same reason that dogmatic adherence to CoL models will lead to failed instruction: There's more to it than what's written down in the books. People need to have experiences that place them in positions where they'll learn important things. Who knows exactly what they'll learn, but one thing is for certain: If you sterilize and control the learning environment and teach only your targeted objectives, learners will fail to learn how to be the thing you want them to be. They may learn some things you want them to learn, but they will fail at the role you're asking them to play in a real world of practice.

Design versus implementation. CoL models assume that decisions about intended learning outcomes and instructional strategies can be made in a context removed in time and setting from practice. Winn (1990) developed this argument fairly well. Following traditional ID procedures, designers and subject experts sit together in a room over a table and make decisions about how teachers and students are going to spend their lives. We can make these decisions out of context. Sometimes we may not know that much about the context of use. Marty Tessmer's (Tessmer & Harris, 1990) work on environmental analysis is an attempt to re-introduce some "systems" thinking back into instructional design, recognizing that contexts of use are inextricably related to the design.

In an interesting self-analysis, Clancey (1993) noted that after years of work developing GUIDON and other expert systems for medical problem-solving, virtually no product ever achieved day-to-day use by medical practitioners. He faulted the design team's removal from the context of practice. The design team assumed that practitioners would welcome an expert system into their work; they thought the transition to the field would be relatively unproblematic. They failed to include implementation factors in their design, failed to achieve praxis--the interaction between theory and practice that keeps both fresh. There is a danger that when ID decisions are removed from the context of real instruction, similar problems will occur.

The role of the instructional designer. According to typical ID models, the instructional designer comes to a new subject, gets fed the content by the subject-matter expert (SME), and spits it back out in the form of quality instruction. By contrast, Shulman (1987) found a whole array of different kinds of knowledge that an effective teacher must have in order to teach effectively. There is accumulating research to suggest that teachers who don't know the content inside out don't teach it as well. That's the problem with elementary math: too many elementary teachers are math-phobic, don't really understand the concepts and underlying structure, and hence don't teach it well. It is amazing to me that we expect designers who are neophytes to a subject to somehow design good instruction for it.

Instructional strategies (and types of learning outcomes) "selected" from a pool. Besides the problems of technical rationality stated above, having a finite set of strategies (or objective types) carries a unique danger: that of locking ourselves into set ways of thinking and not being open to innovations or new solutions. Following a CoL model will likely "bias" me toward a certain defined class of strategies or learning outcomes and "blind" me to other possible ways of viewing learning outcomes or strategies. The examples are obvious: CoL models tend to view motivational variables as an "add-on"; they tend to neglect social cognition and cultural variables; they still don't have a good language for metacognitive and problem-solving outcomes. On the strategy side, a variety of constructivist strategies - simulations, games, cognitive tools - were neglected in "classic" CoL models, with updating and revisions currently going on. The point is that traditional CoL models grew out of a particular time and place and its attendant ways of seeing the world. The two Reigeluth books reflect pretty much a 1970s psychology, translated into 1980s instructional theory. Any model or theory reflects a perspective of a defined time and place. In contrast, professional practice is never ending, always changing, just as our views are always changing. In the real world, change is the norm; unfortunately, we don't yet have a mechanism for continually updating our formal theoretical models in the same continuous way.

Of course, none of the assumptions above need be devastating to the use of CoL models. Each carries a set of risks (which I have emphasized above) but also yields a certain economy or efficiency in practice. The cumulative danger, though, is that use of CoL models will result in lowest-common-denominator, mediocre-at-best instruction rather than creative or genuinely good instruction. Certainly, failure even to think about assumptions like these increases the probability that CoL models will be used uncritically and inappropriately.

Postmodern Roots of Constructivism

There may be some confusion as to how postmodernism is different from constructivism - certainly the more common term found in the ID literature. I confess to some confusion myself, and to occasionally mixing up the two terms (see Wilson, Osman-Jouchoux, & Teslow, 1995). I think it helps to clarify the issue to think of postmodernism as an underlying philosophy about the world, and constructivism as a very general theory of

cognition, suggesting how the mind works and how we know things. The roots of many constructivist beliefs about cognition are traceable to postmodern philosophies which depart from the rationalist, objectivist, and technocratic tendencies of "modern" society. Table 1 illustrates this relationship between constructivism and an underlying postmodern philosophy.

Underlying Philosophy: Postmodernism

Postmodern philosophy emphasizes contextual construction of meaning and the validity of multiple perspectives. Key ideas include:
--Knowledge is constructed by people and groups of people;
--Reality is multiperspectival;
--Truth is grounded in everyday life and social relations;
--Life is a text; thinking is an interpretive act;
--Facts and values are inseparable;
--Science and all other human activities are value-laden.

Theory about Cognition: Constructivism (Situated-cognition Flavor)

-Mind is real. Mental events are worthy of study.
-Knowledge is dynamic.
-Meaning is constructed.
-Learning is a natural consequence of performance.
-Reflection/abstraction is critical to expert performance and to becoming an expert.
-Teaching is negotiating construction of meaning.
-Thinking and perception are inseparable.
-Problem solving is central to cognition.
-Perception and understanding are also central to cognition.

Table 1. A situated-cognition flavor of constructivism and its underlying postmodern philosophy.

In truth, not all constructivists are postmodern in their orientation. In psychology, constructivism originally reflected the thinking of people like Piaget and Vygotsky, who were basically modern in orientation. The current instructional models of Spiro, Jonassen, Bereiter, Resnick, Lesgold, etc. - while definitely constructivist - show varying degrees of postmodern influence (although some may be postmodern without realizing it!). It is

possible to have a constructivist view of cognition while still retaining a fairly traditional, modern view of science, method, and technology.

It should also be noted that postmodern thinking can lead to what I consider positive or negative outlooks on life. On the down side, some postmodernist theories can lead to despair, cynicism, moral indifference, wimpishness, and a kind of myopic self-centeredness. At the same time, other theorists are using postmodern ideas to fashion very positive, hopeful, even spiritual, approaches to life (Spretnak, 1991; Tarnas, 1991). My slant on postmodernism in this paper has been positive, as I believe it must be to have an impact on instructional design.

Guidelines for Doing Postmodern ID

In the spirit of subtly changing the meaning of traditional terms, I offer the following laundry list of tips for doing ID with a postmodern twist. The list should provide a clearer idea of how postmodern concepts can infiltrate and change designers' conceptions of their work.

General Methodology

Be willing to break the rules. Theories and models are meant to serve human needs. Wise use of these models implies knowing when and where to use them, and where to change the rules or forget about them altogether.

Place principles above procedures, and people above principles. The skilled designer will find ways to follow the principles underlying the procedures. Procedural models of ID are seen as flexible and changeable. Even key principles should be continually tested against the real needs of the people involved in the project.

Include all interested parties in the design and development process. Incorporate participatory design techniques, with design activity moving out of the "lab" and into the field. Include end users (both teachers and students) as part of the design team. Make sure all interested parties - the "constituencies" - have some kind of voice contributing to the outcome of the project.

Don't believe your own metaphors. Be aware of the pervasive influence that labels and metaphors have on our thinking - e.g., "delivery" of instruction, memory "storage," learning "prerequisites," "systems" design, strategy "selection," instructional "feedback," and learning "environments." While such metaphors are necessary for our thinking, they each carry a certain connotative baggage that may blind us to alternative ways of seeing.

Needs Assessment

Make use of consensus needs assessment strategies, in addition to gap-oriented strategies. Gap models of needs assessment attempt to portray the "ideal" situation and compare it against the present state, leaving a need in the gap between them. The technical fix suggested

by gap models of needs assessment may be appropriate for certain work settings. However, not all instruction is designed to improve performance in a specific work setting. Schools may develop curriculum based on a consensus among very different constituencies; the "ideal" situation may be a political compromise.

Do an "environmental impact" analysis. Gap analyses always need to be supplemented with consideration of the "environmental impact" of proposed fixes. After addressing the targeted needs, what kinds of unintended outcomes may be anticipated?

Resist the temptation to be driven by easily measured and manipulated content. Many important learning outcomes cannot be easily measured. It may or may not be possible to reduce value down to a number. The postmodern designer will be sensitive to subtle yet highly valued outcomes and effects.

Ask: Who makes the rules about what constitutes a need? Are there other perspectives to consider? What (and whose) needs are being neglected? These questions arise out of the postmodern notion that all human activity is ideologically based. The possible political and social consequences of our actions need to inform our decisions.

Goal/Task Analyses

Allow for instructional and learning goals to emerge during instruction. Just as content cannot be fully captured, learning goals cannot be fully pre-specified apart from the actual learning context. See Winn (1990) for a thorough discussion of this issue.

Don't sacrifice educational goals for technical training. Acknowledge that education and training goals arise in every setting. Schools train as well as educate, and workers must be educated - not just trained in skills - to work effectively on the factory floor. The postmodern designer will be especially tuned to the need for educational goals that strengthen conceptual understanding and problem-solving skills in a domain.

Use objectives as heuristics to guide design. There is no special value in operational descriptions of intended learning outcomes; in fact, these may constrain the learners' goals and achievement. Pushing goal statements to behavioral specifications can often be wasted work - or worse, lead to misguided efforts. The "intent" of instruction can be inferred by examining goal statements, learning activities, and assessment methods. Goals and objectives should be specific enough to serve as inputs to the design of assessments and instructional strategies.

Don't expect to "capture" the content in your goal- or task analysis. Content on paper is not the expertise in a practitioner's head (even if you believed expertise resided in someone's head!). The best analysis always falls short of the mark. The only remedy is to design rich learning experiences and interactions in which learners can pick up on their own the content that falls between the gaps of the analysis.

Consider multiple models of expertise. Expertise is usually thought of as having two levels: expert or proficient performance, and novice or initial performance. Of course, a two-level model is insufficient for accurate modeling of student growth over time. A series of qualitative models of expertise may be needed for modeling students' progression in learning critical tasks (Dreyfus & Dreyfus, this volume; White & Frederiksen, 1986). Postmodern theorists would pose an even more radical thought: that expertise does not follow a linear progression of stages, but takes on different forms in different people. Instruction needs to respond to where a learner "is," and support their growth, regardless of their positioning in the expertise "universe."

Give priority to contextualized problem-solving and meaning-constructing learning goals. Instead of rule-following, emphasize problem solving (which incorporates rule-following but is not limited to it). Rules change according to context. But even problem solving is not all there is to cognition; perception is also central. Instead of simple recall and memory tasks, ask learners to practice seeing - making sense out of material and demonstrating their understanding of it (Prawat, 1993).

Define content in multiple ways. Use cases, stories, and patterns in addition to rules, principles, and procedures. Human memory, according to some theorists, is largely story- or narrative-based (Schank, 1991). Other theories, such as situated cognition (Brown, Collins, & Duguid, 1989; Clancey, 1992, 1993) and connectionism (Marshall, 1991), emphasize pattern development and learning from authentic cases. Rich cases, stories, and patterns of performance can be alternative metaphors for finding and representing content. These multiple modes of representation can then find their way into instruction, providing richer, more meaningful experiences for students.

Appreciate the value-ladenness of all analysis. Defining content and goals for learning is a political, ideological enterprise. Valuing one perspective means that other perspectives will be given less value. One approach is given prominence; another is neglected. Somebody wins, and somebody loses. Be sensitive to the value implications of your decisions.

Ask: Who makes the rules about what constitutes a legitimate learning goal? What learning goals are not being analyzed? Whose interests does the project serve? What is the hidden agenda (Noble, 1989)? Twenty-five years ago, a designer using "understand" as a verb in a learning objective would have been laughed out of the office. "Understanding" was fuzzy; it was forbidden. Are there other expressions of learning outcomes that remain taboo? Are there other dimensions of human performance that remain undervalued within ID discourse? The cultural? The spiritual? Good postmodern ID would pursue answers to these questions and be unafraid of reexamining current practice.

Instructional Strategy Development

Distinguish between instructional goals and learners' goals; support learners in pursuing their own goals. Ng and Bereiter (1991) found that students showed signs of

having three kinds of goals: (1) student task-completion goals, or "hoop jumping," (2) instructional goals set by the system, and (3) personal knowledge-building goals set by the student. The three do not always converge. A student motivated by task-completion goals doesn't even consider learning, yet many students' behavior in schools is driven by just such performance requirements. Postmodern instruction would nourish and encourage pursuit of personal knowledge-building goals, while still supporting instructional goals. As Mark Twain put it: "I have never let my schooling interfere with my education."

Appreciate the interdependency of content and method. Traditional design theory treats content and the method for teaching that content as orthogonally independent factors. Postmodern ID says you can't entirely separate the two. When you use a Socratic method, you are teaching something quite different than when you use worksheets and a posttest. Teaching concepts via a rule definition results in something different than teaching the same concepts via rich cases and class discussion. Just as McLuhan discerned the confounding of "media" and "message," so designers must see how learning goals are not uniformly met by interchangeable instructional strategies.

Allow for the "teaching moment." Situations occur within instruction at which the student is primed and ready to learn a significant new insight. Good teachers create conditions under which such moments occur regularly; then they seize the moment and teach the lesson. This kind of flexibility requires a level of spontaneity and responsiveness not usually talked about in ID circles.

Be open to new ways of thinking about education and instruction. The postmodern designer will always feel somewhat ill at ease when "applying" a particular model, even the more progressive models such as cognitive apprenticeship, minimalist training, intentional learning environments, or case- or story-based instruction. Designers should always be playing with models, trying new things out, modifying or adapting methods to suit new circumstances.

Think in terms of designing learning environments and experiences rather than "selecting" instructional strategies. Metaphors are important. Does the designer "select" a strategy or "arrange" a learning experience? Postmodern designers would usually think of instruction in interactive, experiential terms rather than as a simple product or presentation.

Think of instruction as providing tools that teachers and students can use for learning; make these tools user-friendly. This frame of mind is virtually the opposite of "teacher-proofing" instructional materials to assure uniform adherence to designers' use expectations. Instead, teachers and students are encouraged to make creative and intelligent use of instructional tools and resources. In some respects the designer is surrendering control over the use of the product, but in so doing participates more meaningfully in the total design of the experience.

Consider strategies that provide multiple perspectives and encourage the learner to exercise responsibility. Resist the temptation to "pre-package" everything. Let learners generate their own questions and goals, then seek out information and experiences to address those questions. Of course, this runs the risk of not giving the learner enough guidance, or of exposing too much confusion and complexity. Certainly there are times to simplify and reduce complexity; the designer needs to exercise best judgment and find methods for supporting learners in the midst of complexity.

Appreciate the value-ladenness of instructional strategies. Sitting through a school board meeting is enough to convince anyone of this. Instructional strategies grow out of our philosophies of the world and our value systems. Not only the content but also the strategy can be a threat to particular ideological positions or to learner motivation. Good designers will be sensitive to the "fit" between their designs and the situation.

Media Selection

Include media biases as a consideration in media decisions. Different media send different "messages" to an audience, independently of the instructional content. A TV show means something different to a child than another worksheet does. Look for any "hidden curriculum" elements in different media choices. Avoid negative stereotypes and cultural biases. Consider the rhetorical goodness of fit between media choice and overall instructional purposes.

Include media literacy as a planning consideration. Designers should be sensitive to an audience's media sophistication and literacy, paying particular attention to humor, media conventions, and production values.

Student Assessment

Incorporate assessment into the learning experience where possible. Skilled teachers will be assessing students informally all the time. Also, technologies are available for incorporating continuous, "dynamic assessment" into learning materials (Lajoie & Lesgold, 1992). Assessment should be seamlessly integrated into meaningful learning experiences, not tacked on at the end.

Critique and discuss products and performances grounded in authentic contexts, including portfolios, projects, compositions, and performances. Product and performance reviews can complement more traditional measures of knowledge acquisition and understanding (Cates, 1992). Include different perspectives in the critiquing process.

Use informal assessments within classrooms and learning environments. Informal assessments refer primarily to teacher observations of eye contact, body language, facial expressions, and work performance. These observations can complement formal assessments as a basis for instructional adjustments.

Conclusion

If my purpose has been accomplished, you will have gained an appreciation of postmodern ideas and how they can relate to ID practice. As we continue to grow professionally, the same old terms begin to take on different meanings. At the same time, I hope you are cautious and critical in evaluating these ideas. Avoid any bandwagon phenomenon. Test any of the ideas in this chapter or book against the reality of your practice. All theories and ideas need to be put into the service of real-world practice and usability. Remember the postmodern slogan: "Question authority (before they question you!)"

Author Notes

Brent Wilson is an associate professor of instructional technology at the University of Colorado at Denver and may be reached at bwilson@carbon.denver.colorado.edu. Brent's research interests include instructional-design theory, constructivist learning environments, and technology in the classroom. Parts of this chapter are adapted from one published in B. Seels (Ed.), Instructional design fundamentals (Englewood Cliffs, NJ: Educational Technology Publications, 1995). That chapter, "The Impact of Constructivism (and Postmodernism) on ID Fundamentals," was co-authored by Jim Teslow and Rionda Osman-Jouchoux. The former chapter focused on constructivism, whereas here the focus is on the postmodern thinking that underlies much of the constructivist discussion. A special thanks goes to Charles Dills for his encouragement and patience in the production of the manuscript.

References

Brown, J. S., Collins, A., & Duguid, P. (1989, January-February). Situated cognition and the culture of learning. Educational Researcher, 32-42.

Butterfield, E. C., & Belmont, J. M. (1975). Assessing and improving the executive cognitive functions of mentally retarded people. In I. Bialer & M. Sternlicht (Eds.), Psychological issues in mental retardation. Chicago: Aldine-Atherton.

Cates, W. M. (1992, April). Considerations in evaluating metacognition in interactive hypermedia/multimedia instruction. Paper presented at the meeting of the American Educational Research Association, San Francisco.

Clancey, W. J. (1992). Representations of knowing: In defense of cognitive apprenticeship. Journal of Artificial Intelligence in Education, 3(2), 139-168.

Clancey, W. J. (1993). Guidon-Manage revisited: A socio-technical systems approach. Journal of Artificial Intelligence in Education, 4(1), 5-34.

Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53, 445-460.

Dreyfus, H., & Dreyfus, S. (this volume). Chapter in C. Dills & A. Romiszowski (Eds.), Instructional development: The state of the art. Englewood Cliffs, NJ: Educational Technology Publications.

Gagné, R. M. (1966). The conditions of learning (1st ed.). New York: Holt, Rinehart, & Winston.

Hlynka, D., & Belland, J. C. (Eds.). (1991). Paradigms regained: The uses of illuminative, semiotic, and post-modern criticism as modes of inquiry in educational technology: A book of readings. Englewood Cliffs, NJ: Educational Technology Publications.

Hlynka, D., & Yeaman, R. J. (1992, September). Postmodern educational technology. ERIC Digest No. EDO-IR-92-5. Syracuse, NY: ERIC Clearinghouse on Information Resources.

Lajoie, S. P., & Lesgold, A. M. (1992). Dynamic assessment of proficiency for solving procedural knowledge tasks. Educational Technology, 27(3), 365-384.

Marshall, S. P. (1991, December). Schemas in problem solving: An integrated model of learning, memory, and instruction (Final Report for Office of Naval Research Grant No. N00014-89-J-1143). San Diego: Center for Research in Mathematics and Science Education.

Martin, B. L. (1994, March). Commentary on deconstructing modern educational technology. Educational Technology, 64-65.

Ng, E., & Bereiter, C. (1991). Three levels of goal orientation in learning. The Journal of the Learning Sciences, 1(3 & 4), 243-271.

Noble, D. D. (1989). Cockpit cognition: Education, the military and cognitive engineering. AI & Society, 3, 271-297.

Prawat, R. S. (1993). The value of ideas: Problems versus possibilities in learning. Educational Researcher, 22(6), 5-16.

Reigeluth, C. M. (Ed.). (1983a). Instructional-design theories and models: An overview of their current status. Hillsdale, NJ: Erlbaum.

Reigeluth, C. M. (1983b). Instructional design: What is it and why is it? In C. M. Reigeluth (Ed.), Instructional-design theories and models: An overview of their current status. Hillsdale, NJ: Erlbaum.

Reigeluth, C. M. (Ed.). (1987). Instructional theories in action: Lessons illustrating selected theories and models. Hillsdale, NJ: Erlbaum.

Reigeluth, C. M., Merrill, M. D., Wilson, B. G., & Spiller, R. T. (1980). The elaboration theory of instruction: A model for structuring instruction. Instructional Science, 9, 195-219.

Schank, R. (1991). Tell me a story: A new look at real and artificial memory. New York: Simon and Schuster.

Schön, D. (1987). Educating the reflective practitioner. New York: Basic Books.

Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1-22.

Simon, H. (1983). The sciences of the artificial (2nd ed.). Cambridge, MA: MIT Press.

Snow, C. P. (1969). The two cultures and the scientific revolution. London: Cambridge University Press.

Spretnak, C. (1991). States of grace: The recovery of meaning in the postmodern age. San Francisco: HarperSanFrancisco. See especially Appendix A, "The merely relative: A brief survey of deconstructive postmodernism."

Streibel, M. J. (1986). A critical analysis of the use of computers in education. Educational Communications and Technology Journal, 34(3).

Tarnas, R. (1991). The passion of the western mind. New York: Harmony Books. See especially Part VI, "The transformation of the Modern Era."

Tessmer, M., & Harris, D. (1992). Analysing the instructional setting: Environmental analysis. London: Kogan Page.

White, B. Y., & Frederiksen, J. R. (1986). Progressions of quantitative models as a foundation for intelligent learning environments (Technical Report No. 6277). BBN.

Wilson, B. G. (this volume). Constructivism. In C. Dills & A. Romiszowski (Eds.), Instructional development: The state of the art. Englewood Cliffs, NJ: Educational Technology Publications.

Wilson, B., & Cole, P. (1991). A review of cognitive teaching models. Educational Technology Research and Development, 39(4), 47-64.

Wilson, B. G., Osman-Jouchoux, R., & Teslow, J. (1995). The impact of constructivism (and postmodernism) on ID fundamentals. In B. Seels (Ed.), Instructional design fundamentals. Englewood Cliffs, NJ: Educational Technology Publications.

Winn, W. D. (1990). Some implications of cognitive theory for instructional design. Instructional Science, 19, 53-69.
