Given the so-called “crisis” of higher education today, it is hardly a surprise that academic writing remains the preferred target of mainstream and alternative media. The shockwaves that Nicholas Kristof’s recent editorial[1] in the New York Times set off among academics[2] and their critics[3] in February rehearse the usual complaints: scholars spend their days navel-gazing amidst their tomes; humanities disciplines are appendix-like vestiges of a bygone era; academic publishers reward ever-narrower research agendas, as do review committees when the tenure clock strikes midnight. A point of controversy in virtually every critique in this vein is academic language, what Kristof calls “turgid prose” mired in its own obscurity. Lamenting academic prose is now commonplace in media takedowns of academia, thought to betoken a broader crisis looming over North American colleges and universities. Of course, even a cursory look at media discourse in the last decade shows “crisis” to be the preferred term for situations when something goes awry, when our control over nature is exposed to be an elaborate farce, or when we simply have no idea what is going on. What I would like to suggest is that “academese”—the occasionally dense and even obscurantist discourse in the humanities and social sciences—taps into anxieties about language’s ability to signify in a world beset by self-generated “crises.” Many beautiful works of non-fiction have come off university presses in recent years; so have many of the most grating.
This is a wildly speculative idea, one I arrived at after years of reading cultural criticism in periodicals across the ideological spectrum and then set atop my research on apocalyptic prophecy in early modern Iberia. Whether or not academese is a symptom of higher education’s systemic woes is not my judgment to make. For me, what is more interesting than the specifics of arguments like these is why op-ed writers continue to train their rhetorical crosshairs on academia, impugning the value of a liberal education while student debt soars and budget axes dangle precariously over whole departments. Academese fits neatly into such snark, painted as one of many symptoms of an institution increasingly estranged from reality. Academia, of course, is just one of many professions that speaks a particular idiom. Law students plowing through briefs and budding physicians leafing through medical journals encounter a peculiar variation on what they thought was their native tongue, one that often strikes them as vacuous gobbledygook. Even the business world boasts an idiom rife with acronyms and buzzwords that only the initiated can comprehend.
Legalese and its professional cousins notwithstanding, academese and, by extension, higher education, have become favorite whipping boys for today’s social maladies. A college degree is held up as a ticket to a middle-class lifestyle, a reflection of higher education’s drift away from intellectual formation towards vocational preparation. Job-readiness and “ROI” are now the go-to metrics for evaluating degree programs, metrics that are entirely reasonable for anyone deciding whether to shoulder colossal debts and enter the worst labor market in more than a generation. Since private funding now comprises the bulk of public and private university budgets, it certainly makes sense to scrutinize the research that those universities produce. This is where most editorials launch into diatribes against academese, but in so doing they imply that the onus of explaining the world still falls on academic researchers. At least in theory, research is supposed to tell us something about the world with an eye towards a greater truth. Even as the media (and some segments of the public) criticize universities for overcharging and underdelivering, the expectation for academics to conduct actionable research survives.
Practically the only reason offered for why academics continue to dash this expectation has to do with the rigidity of tenure decisions, themselves hemmed in by the ever-narrower specializations needed to publish. While this may be one factor, I have often asked myself whether it is the only one, or even the most important among several. If the economic crisis of 2008 laid bare the flaws of neoliberalism, it also underscored the value of higher education for analyzing and preventing such crises in the future. Yet crisis—broadly conceived as an aberration in a normal state of affairs—has become the ethos of our time rather than an exception to it. In a recent essay on the Great Recession’s cultural fallout, Rosalind Williams proposes that crisis “is no longer a turning point in history but rather an immanent condition of history, part of its ‘normal’ working, indistinguishable from its own aftermath” (30).[4] Crisis, in other words, has become history’s default setting rather than a sign of its breakdown—needling evidence that our collective belief in progress continues to recede, despite promises that enough apps and lean startups can save humanity from itself. By this account, academics and anyone else trying to make sense of things face the chore of discerning patterns in today’s maelstrom of data, events, and disasters.
Naturally, many of the worst crises humanity has witnessed long predate our age. Our toxic political landscape pales in comparison to that of the Roman Empire at its zenith; the Nika Riots in Justinian’s Constantinople led to the slaughter of some 30,000 citizens enraged about the outcome of a chariot race, dwarfing today’s post-game fan hijinks; it is hard to imagine how today’s media would cover the outbreak of the plague in fourteenth-century Europe. What distinguishes our time from these earlier periods is not the presence of crisis, but the absence of an explanatory framework to overlay it. When Herodotus first conceived history as an inquiry that rescues human deeds from oblivion and Aristotle parsed apart history and poetry a hundred years later, both assumed the world to be stable and immutable. History immortalized the virtuous and the daring through enduring memory, even though, as Hannah Arendt observes, the ancient Greeks held that “man, insofar as he is a natural being and belongs to the species of mankind, possesses immortality; through the recurrent cycle of life, nature assures the same kind of being-forever to things that are born and die as to things that are and do not change” (571).[5] Any crisis—assuming that one was thinkable in the first place—was absorbed into this repetitive cycle.
Though divergent from eternal cycling, the Judeo-Christian view of history understood crisis as a temporary setback on an otherwise rectilinear path towards the end of days. Eschatological prophecies, in spite of the horrific images they conjured, also invested history with a purpose that co-opted crises, portraying them as sanctifying punishments from God rather than ruptures in the flow of time. First seeking compatibility with religion in the eighteenth century, then jettisoning it altogether in the nineteenth, those who saw history as progress framed crises as swerves on the road to humanity’s complete domination of nature, the utopian stage-managing of reality. Progress has since lost its status as the singular guiding principle of history, now largely confined to the tech industry; where “larger systems are involved—especially environmental, military, and economic ones—the pattern of contemporary history associated with them is visualized not as a line but as a pattern of crisis centers spreading with no end in sight” (Williams 31). It would seem that progress in this narrower sense empowers its chaotic opposite: the dizzying speed of technological innovation does not simply outpace our ability to comprehend its impact, but belies our claim to mastery over nature when it fails. Crisis, then, is the absence of any explanatory theory of history. It is an abyss that theorists took to calling the “postmodern condition” decades ago, one that defies conventional descriptive codes and, as with all new things, forces a retreat to the literary devices and neologisms that poets, novelists, and, yes, academic researchers are among the first to seize upon.
If crisis is history devoid of an identifiable pattern, forging a vocabulary to talk about it is daunting indeed. In developing her thesis on history as a perpetual apocalypse, Williams alludes to the thought of two writers—Haruki Murakami and Leo Marx—whose ideas address the question of language explicitly. Murakami addresses this dilemma on behalf of fiction writers when speaking of a recent “realignment” in global culture. While in previous decades North American and European readers sought to domesticate his writing with one or another “-ism” and plenty of “logical parsing,” he marvels that in recent years, “people were accepting my stories in toto—stories that are chaotic in many cases, missing logicality at times, and in which the composition of reality has been rearranged.”[6] For Murakami, this marks a shift in mentality that he attributes to the aftermath of 9/11, which, darkly echoing the collapse of the Berlin Wall, partially expunged progress from the Western cultural imaginary. The lack of any historical pattern presupposes a paucity of descriptive language with which to identify one. Leo Marx refers to this paucity as a “semantic void,” that is, “an awareness of certain novel developments in society and culture for which no adequate name had yet become available” (563).[7] In the late nineteenth century, for instance, “technology” emerged as a filler for the semantic void the Industrial Revolution had pried open, standing in for terms like “mechanic arts,” “invention,” or “machinery” that proved unsuitable. This void emerged within a recognizable historical pattern—progress, in this case—that was able to contain it.
Successfully filling a semantic void assumes that circumstances will remain stable enough to allow new descriptive keywords to emerge. What I am wondering is whether the perceived acceleration of history’s tempo frustrates efforts to fill such voids and whether professional “babel” is a linguistic symptom of this frustration. Scrambling to explain ever-changing circumstances is a bit like finding clarity in a blur or perceiving depth in the darkness. If our language obsolesces at the same rate as last year’s favorite gadget, then it is tempting to resign our commission and devote ourselves to learning how to take in each crisis tout court. Humans, however, are storytelling creatures. Our thoughts are structured as elaborate narratives, our most banal language flooded with reified metaphors. Despite postmodernist celebrations of dead meta-narratives, we need frameworks to explain the past, to make sense of the present, and to predict the future in order to plan for it. An identifiable pattern of history is therefore much more than a cerebral luxury. It is, in Williams’s words, “the basis for a sense of predictability in human life” (36). However awkward or slapdash our language is, the world’s demand for it undergirds the expectation that academics continue to conduct research. It may be that crisis prevents us from analyzing matters clearly, especially if the language of analysis has to be invented on the spot with full awareness that its shelf life is limited. Both in its beautiful and less savory forms, academese may be the ideal response after all.
Kristof’s op-ed and others like it gesture towards something far deeper than a desire to bulldoze the ivory tower. Higher education is as vulnerable to economic forces as other sectors, and many of those sectors rely on their own jargon to speak among themselves in the proverbial “real world.” Instead, the cry for America’s “sharpest minds” to participate in the public sphere unearths a yearning for good storytelling, that most human of qualities that brings order to chaos, patterns to randomness, hope to anomie. It does not take a stretch of the imagination for literary critics and historians to see themselves as narrators, spinning stories or supplying the raw materials that go into them. At the same time, fiction writers, social scientists, and policymakers envision possible worlds through narrative based on the data they collect and the solutions they formulate. Though highly specialized, research at its best equips us to craft stories, and with them, to descry historical patterns. Research is repurposed storytelling, its mission to make sense of our increasingly complex world. In this regard, it draws near to how Murakami views literature on our crisis-riddled planet:
[T]he role of a story is to maintain the soundness of the spiritual bridge that has been constructed between the past and the future. New guidelines and morals emerge quite naturally from such an undertaking. For that to happen, we must first breathe deeply of the air of reality, the air of things-as-they-are, and we must stare unsparingly and without prejudice at the way stories are changing inside of us. We must coin new words in tune with the breath of that change.
Murakami’s clarion call to absorb reality before developing stories about it is instructive. It enlists us in an uphill battle against conditions that change far too quickly for our minds to keep pace. Murakami himself remarks that while his writing in recent years has been rocky, it has begun to thrive amidst chaos rather than in spite of it. Academese is the scholarly correlate to the challenge that writers of all stripes now encounter—a gawky yet noble attempt to “coin new words in tune with the breath of that change.” Whether we ever manage to portray this reality is in many ways secondary. Instead, what matters is that after enough fits and starts, we fashion stories that ultimately enable us to embrace our limited knowledge. We may never settle on another all-encompassing pattern of history, but this may be as liberating as it is terrifying. Where we once stood confident that history was going our way, we can now humble ourselves before the unknown. And where we once clung zealously to our theories and our models, we can now take pride in our vulnerability as storytellers laboring to narrate a perpetual crisis.
- 1. Nicholas Kristof, “Professors, We Need You!,” New York Times, 15 February 2014, http://www.nytimes.com/2014/02/16/opinion/sunday/kristof-professors-we-need-you.html.
- 2. Paul Stoller, “The Scholar’s Obligations,” Huffington Post, 25 February 2014, http://www.huffingtonpost.com/paul-stoller/the-scholars-obligations_b_4853777.html.
- 3. Joshua Marshall, “Goodbye to All That—Why I Left the Academic Life,” Talking Points Memo, 24 February 2014, http://talkingpointsmemo.com/edblog/goodbye-to-all-that--2.
- 4. Rosalind Williams, “The Rolling Apocalypse of Contemporary History,” in Aftermath: The Cultures of the Economic Crisis, ed. Manuel Castells, João Caraça, and Gustavo Cardoso (New York: Oxford University Press, 2012), 17-43.
- 5. Hannah Arendt, “The Modern Concept of History,” The Review of Politics 20.4 (1958): 570-90.
- 6. Haruki Murakami, “Reality A and Reality B,” New York Times, 29 November 2010, http://www.nytimes.com/2010/12/02/opinion/global/02iht-GA06-Murakami.html?pagewanted=all.
- 7. Leo Marx, “Technology: The Emergence of a Hazardous Concept,” Technology and Culture 51.3 (2010): 561-77.