Dossier: Postmodernism and the Humanities
July 1, 2019

The Birth, Death, and Rebirth of Postmodernism

No other idea from the humanities had so massive, if murky, an influence.

What was Postmodernism? In the 35 years since Fredric Jameson’s New Left Review essay “Postmodernism, or the Cultural Logic of Late Capitalism” — and the 40 years since the publication of Jean-François Lyotard’s The Postmodern Condition — it’s fair to say that no other idea from the academic humanities has had so vast, if murky, an influence on the broader culture. (Often assumed to be a proponent of postmodernism, Jameson is rather its diagnostician.) Seinfeld was said to be “postmodern,” and so was the architecture of Frank Gehry. So, too, was the “fashionable nonsense” targeted by Alan Sokal in his infamous 1996 Social Text hoax. As the recent “Sokal Squared” hoaxes showed, the specter of postmodernism continues to be a useful cudgel wielded against the university.

Debates about postmodernism have returned, in both vulgar and sophisticated forms — from Jordan Peterson’s crusade against “postmodern neo-Marxism” to Bruno Latour’s defense of climate science to Rita Felski’s probing of “the limits of critique.” And the accelerating media bombardment enabled by our proliferating devices has only intensified what Jameson, all those years ago, called the “transformation of the ‘real’ into so many pseudo-events.”

In that spirit, we asked 10 contributors to reflect on the continuing relevance — or irrelevance — of postmodernism to the academy and the larger culture.

You’re So Paranoid, You Probably Think This Conspiracy Is About You

Moira Weigel

Postmodernism has long been an object of conspiracies. But I had forgotten that, in the middle of his canonical essay “Postmodernism,” Fredric Jameson himself turns to the subject of conspiracy theorists. What’s more, he acknowledges a kinship between their methods and his own.

Such an acknowledgment might sound like precisely the confession that the self-described opponents of postmodernism have been waiting for. Since the Culture Wars of the late 1980s, these opponents have come from many and politically varied quarters, from bow-tied defenders of Great Books courses to class-first Marxists who argue that emphasis on discourse and representation has been distracting and destructive for left politics.

In the era of Donald Trump — and YouTube — the most fevered version of the case against postmodernism has become increasingly visible. That is, the claim that a coalition of critical theorists, poststructuralists, multiculturalists, feminists, queer theorists, and African-American and other “studies” professors have successfully conspired to take over educational institutions, the media, and the U.S. government, and even to establish a new International World Order. (Why those same masterminds so often prove unable to secure the minimal funding needed to keep their departments open, or university presses running, does not come up. Nor do the many intellectual and political differences that we have among ourselves.)

As the “cultural Marxism” conspiracy has migrated from Holocaust-denier conferences and terrorist manifestos to White House memos and the opinion pages of The New York Times, its exponents have continued to deploy a rhetoric of reaction that at once abjures and identifies with the targets of its criticism. The “studies” department politicized culture, so “we” must start a culture war. The Social Justice Warriors have made the transgression of gender norms so normal that “we” must transgress them. Joe Biden doesn’t like you either. Look what you made him do.

Now: The writers who expound the Cultural Marxism story have not necessarily read Jameson. The most popular accounts — books like Pat Buchanan’s The Death of the West (2001) or Andrew Breitbart’s Righteous Indignation (2011) — end in the 1960s, with Herbert Marcuse. In 2017, when Jordan Peterson told an audience at the Manning Centre that “you need to understand postmodernism, because that’s what you’re up against,” he recommended that people buy Explaining Postmodernism, by Stephen Hicks.

The refusal to engage with power makes mainstream critics of postmodernism often sound conspiratorial.

But Jameson is one of relatively few thinkers who has ever actually described himself as a “cultural marxist.” And both the style of thinking that he practices and the “cultural logic” that he describes — his insistence that everything is historical, and therefore political — are what stand accused.

And not just by self-described white nationalists. Since the election of Donald Trump, even reputable cultural critics who identify as liberals and leftists have begun to say that they, too, blame “postmodernism.” The MLA has sponsored panels of earnest self-criticism on the responsibility that literary scholars bear for creating the conditions of “post-truth.”

That is, there has been a growing consensus that postmodernism destroyed a necessary capacity to draw distinctions, creating epistemic conditions in which left and right become indistinguishable. Rereading Jameson, I found conceptual resources for countering such claims. The sense that there is no outside, that everything connects to everything, is one that Jameson really does share with Glenn Beck. But at the heart of his essay on postmodernism, anticipating the charge that he is similar to conspiracists, Jameson articulates the differences between them.

The passage I mean comes at the end of the section on the “technological sublime.” Jameson has made his argument that, whereas Edmund Burke and Immanuel Kant conceptualized the sublime as an encounter with divinity or nature, by the 1980s those forces had been replaced by technology. It’s a concept prescient enough to seem banal, now. If the sublime is that which overwhelms the subject with the finitude of his own reasoning capacities, anyone who has ever attended a Silicon Valley Demo Day or called their health-insurance company to dispute the denial of a claim by “the computer” knows that Big Data does exactly this.

According to Jameson, a new genre of writing and filmmaking he calls “high-tech paranoia” captured this situation best. The narratives he had in mind were stories of conspiracy: “Yet conspiracy theory (and its garish narrative manifestations) must be seen as a degraded attempt — through the figuration of advanced technology — to think the impossible totality of the contemporary world system.” Degraded why? Because while it went through the motions of revealing truth — the shadowy agency behind the global plot, the men in smoky rooms who had made everything happen — conspiracy theory ultimately preserved the very invisibility it thematized.

At the end of “Postmodernism,” Jameson proposes that the task of art and of criticism in his time is to do something that he calls global cognitive mapping: to create “a pedagogical political culture which seeks to endow the individual subject with some new heightened sense of its place in the global system.” Superficially, the work of cognitive mapping resembles conspiracy theorizing. Both abandon “depth reading” in favor of tracing connections. Both aver that there is no “outside.”

The key difference is that the first activity aims to demystify. The frantic activity of Glenn Beck at his chalkboard or Jordan Peterson tweeting about “cultural Marxists” does not ultimately enable the reader or viewer to recognize the forces that keep her in her place. Rather it exaggerates their incomprehensible sublimity. The Big Boss’s Boss stays offscreen — for the sequel.

This refusal to really engage with power is what makes the mainstream critics of postmodernism, cultural Marxism, and allegedly related movements — intersectionality, “safetyism,” etc. — so often sound conspiratorial themselves. While they point to many interrelated cultural and political changes, they provide no concrete account of how these changes happen. They have no theory of power. They like it that way.

It is common, for instance, for contemporary anti-postmodernists to claim that students with “illiberal” ideas have gotten their professors fired, or that a Twitter mob has “censored” a major magazine or book. Sure, student unrest can lead to the dismissal of a teacher, and social-media outrage can be a reason that a publishing conglomerate decides to halt the presses. But to say that the overheated feelings of a few teenagers can make professors destitute — or that snark broadcast to a few dozen followers can oppress a presidential candidate — is to skip over a few key steps.

We cannot actually understand the relationship between students’ embrace of certain ideas and the precarity of their teachers — just for instance — without mapping the academic and corporate bureaucracies, the government-backed lending institutions, that shape their lives. The widespread accusation that “theory” has produced our situation may be so popular precisely because it allows existing power structures to remain as they are. Pay no attention to that debt bureaucracy behind the curtain!

Whether you think exposing power this way is the only legitimate task of art or literature or criticism is another question. But to account for culture in the present will require descending from the online marketplace of ideas into the workshop of their production.

Otherwise, at the end of this thriller, the anti-theorist valued for his “viewpoint diversity” will fly to the next campus, give his next talk on how he is being silenced by teenagers, and collect his $30,000 — and the next installment of the Culture War franchise will remain just as predictable.

Moira Weigel is a postdoctoral fellow at Harvard University.


Tracing Fossils of the Old Beast

Justin E.H. Smith

All things come to an end, not least the coming-to-an-end of things. And so it had to be with the end of modernism, and the couple of decades of reflection and debate on what was to come next. For me, postmodernism is the copy of Jean-François Lyotard’s The Postmodern Condition, which I bought in English translation in 1993. It’s sitting in a cardboard box, its pages slowly yellowing and its cover design receding into something recognizably vintage, in my old mother’s suburban California garage. I stowed it there when I moved to Paris, in 2013. And in the past six years I have seen only fossil traces of the old beast said to have roamed here in earlier times, eating up grand narratives and truth claims like they were nests full of unprotected eggs.

A few living fossils, coelacanth-like, survived from French philosophy’s âge d’or and could still fill lecture halls. But the survivors were mostly known for their non-representativity, in part because they loudly proclaimed it. Alain Badiou, for example, talked about the transcendental forms of love and beauty. Bruno Latour, not long after 2001, began regretting what his own brand of truth-wariness had done to stoke the “truther” conspiracy theories that had quickly spread to the villagers who worked his family’s vineyards, in Bourgogne. And for the most part, as Perry Anderson observed, by the early 21st century French philosophy had gone the way of French cinema: the way of nostalgia, safe formulas, and whatever special effects could be pulled off in the absence of adequate funding.

So much for France, then. In the Anglosphere, new technology was meanwhile ensuring that whatever had been said a few decades prior about simulacra and spectacle may as well have been said not about television, or even the early internet, but about telegraphs and semaphore. I, a clueless normie, first noticed the full extent of the epochal shift in 2015. Old institutions were crumbling — media, entertainment, education, democracy. Along with them, old ideas about what constitutes authority came to seem laughable. An American president was propelled into office through trolling. Mobs emerged to shame and ostracize anyone, no matter how distinguished or eminent, who did not agree with them. The internet, which should long ago have been transformed into a public utility, had instead been transformed into a weapon of war: not just a metaphorical “war of words,” but actual war, an asymmetrical vigilante war (with covert support from a recrudescing superpower) bent upon bringing old institutions to the ground. We are in the midst of this war as I write.

Old institutions were crumbling — media, entertainment, education, democracy.

By the time of this epochal shift, academic postmodernists had become fully identified with their institutions. They might have entered into them in a spirit of play and subversion, but try telling the precariously employed and prospectless grad students that their professors’ health insurance and retirement plans are just so much play.

This shift has also heralded a stark return to old ideas about truth and falsehood, ideas that the postmodernists had been dismissing for my whole conscious life as unsophisticated and passé. No moment for me is more emblematic of this than Avital Ronell’s implosion in the wake of her harassment scandal last year. She thought she could pass it all off as “camp,” that is, an expression of free play unbound by simple rules of right and wrong. Slavoj Žižek and a few other coevals offered up variations on the importance of not rushing to judgment. After all, the heart is a dark forest and those of us on the outside of an emotionally charged affair are in a poor position to judge of its true nature, so be quiet and let my friend Avital keep using NYU office space as her deconstructionist romper room. The young people were not having it.

One of the preferred phrases in internet-based debate is “full stop.” For instance, “Trans women are women, full stop.” Conservatives such as Jordan Peterson, who continue in the habit of believing that the problem with the left is its “postmodernism,” have been so floored by what they take to be the disregard for scientific or transcendental truth in the first part of this sentence that they have completely failed to notice the utterly un-postmodernist spirit of the second part — the “full stop.” But the kids mean what they say, and in this they are a world away from the postmodernists, to whose spirit nothing could have been more contrary than the suggestion that there is a final word, a full stop, on anything. The future, at least the near future, belongs to them. Playtime is over.

Justin E.H. Smith is a professor of history and philosophy of science at the University of Paris Diderot and the author of Irrationality: A History of the Dark Side of Reason (Princeton University Press, 2019).

The End of the World

Mark Greif

“Postmodernism” referred to three different things.

Postmodernism 1 asked a question about art. The question was: Had the story of modern art ended? The answer seemed to be yes. “Modern art,” in this sense, meant a sequence of stepwise advances in style, begun in the 1860s, each novelty making its predecessors obsolete. The story of modernism in art had been told such that it could have an ending, once the possible new steps were reduced to minimal, repetitive, or chance gestures (in the fine arts), and no improvements in function could be conceived (as in architecture). Now, in the 1970s and 1980s, art ended before critics’ eyes. Minimal music. Rephotographed Marlboro Men. Decorative party hats on skyscrapers. Fredric Jameson recorded some of the changes in his great essay “Postmodernism” (1984), as Arthur Danto celebrated others in “The End of Art” (1984) and elsewhere. Artists kept making works, but knew they all came “after.”

Postmodernism 2 asked questions about society. Considering economics, politics, and technology, it asked: Had a “modern” order, begun in the 1500s or the 1700s, ended in the rich nations? Industrial capitalism underwent deindustrialization. The democratic utopias — of bourgeois revolutions for rights and freedoms, and socialist revolutions for equality and plenty — seemed to be running out of steam. Information and computation might liquidate and concentrate all spheres of knowledge and value — or just multiply and clutter them. In different ways, these were the questions asked by Jean-François Lyotard, David Harvey, and Francis Fukuyama in the 1980s and 1990s. But the answer here to “postmodernity” seemed to be: No, modernity hadn’t ended after all; it had accelerated, metastasized, reorganized. On this view, we still dwell in the familiar “modernity” that launched with the Renaissance, scientific revolution, or industrialization, just more pessimistically.

Postmodernism 3 was less coherent. It helped to name a conflict in philosophy. A strange struggle had occurred in English-language university departments after 1945, when philosophy and some social sciences lost touch with Europe and refashioned themselves as progressive and anti-historical imitators of the “hard” sciences. Postwar European social science and philosophy found alternate homes in literature and anthropology departments, as “theory.” Sometimes it was called, by its opponents, “postmodernism.” (Perhaps because “modernism” in thought could mean rigorism, scientific unification and purification. But philosophical “modernism” had also meant relativism, pragmatism, and phenomenology, and the objected-to portions of Foucault, Derrida, Lacan, or Bourdieu preceded them in modernists like Dewey, Boas, Wittgenstein, and Heidegger, not to mention Weber, Freud, and Marx.) During the period from 1970 to 2000, the disciplinary musical chairs upset, or inspired, all sorts of people, quite needlessly. The continuing use of “postmodernism” as a pejorative, today, is a know-nothing usage, and should decline in importance.

“Postmodernism” has become a historical term, naming forms of debate from the last fin-de-siècle. 

Thirty-five years after its best formulations, “postmodernism” has become an essentially historical term. It names forms of debate from the last fin de siècle. All the phenomena underlying the debates still live, however. The major critical statements are each in their own way quite great, and worth reading. Jameson’s contributions, especially, have held up best because of his indestructible Marxist commitment to overlaying the artistic and cultural (his métier) upon the social and economic, even where they mismatch.

My own position is that the explanation for the “postmodern,” and its role as a mysterious vortex for intense energies of the late 20th century, can only be understood within the larger proliferation of “posts” in the period, including also “posthistory” and the “posthuman.” The energizing feature of all these debate-containers, which also named lively fantasies (of intellectual messianism, of world destruction), belongs to that gap you can see between Postmodernism 1 and Postmodernism 2, a magnetic dynamo (for those who flopped between culture and political economy) repeated in many other intellectual locations.

Some forms of structured progress really did end, like modern art. Other forms, equally intellectual yet real, really didn’t end — like capitalism and socialism, even though the U.S.S.R. went kaput. Still others more unpredictably didn’t end — like the world itself, though it sometimes seemed it should have. The expected U.S.-U.S.S.R. nuclear war never happened, despite ample opportunities. Yet the world itself secretly embarked on another possible ending in unforeseen guise, and nuclear terror has been adapted for a slow climate catastrophe already underway. This persistence fits within the modern habit of thought that overvalues the new, feels the accelerated tempo of eventfulness, and can’t quite decide whether it should prefer to be in the last generation, or to abide. Whether, in other words, the pleasure of being the first to name, describe, or even help cause the end of the world and time would be worth the price of undergoing it.

Mark Greif is an associate professor of English at Stanford University.


Postmodernists Didn’t Go Far Enough

Ethan Kleinberg

The Bonaventure Hotel still stands in downtown Los Angeles, 35 years after Fredric Jameson presented it as an exemplar of postmodernism. But walking through the Bonaventure today, one has the inescapable feeling of inhabiting a past promise of what the future was to be. One has a similar feeling re-reading Jameson’s essay “Postmodernism, or the Cultural Logic of Late Capitalism.” The building and the text each gesture to the hopes or fears of an inevitable future that never came to pass — though it is principally fear that has driven attacks on postmodernism, from the time of Jameson’s essay to now.

Jameson’s fear was that postmodern theory disabled Marxist critique and political action. More recent attacks from public intellectuals on the Marxist left conserve Jameson’s claim that postmodernism abandons the concept of “truth,” paralyzing the critic in negation and revolt. These are claims that I reject.

It is wrongheaded to assert that postmodernism abandons “truth.” While the postmodern position does hold truth to be socially constructed, it does not follow that truth does not exist. Instead, postmodernism seeks to understand why and how some truths are accepted while others are not. This is a deeply historical project which requires scholars to engage with the ways that epistemological commitments change at different times and in different places. Far from inciting paralysis, an understanding of historical contingency is liberating because it denaturalizes suppositions previously taken to be foundational or immutable. It opens up the possibility of change.

Surprisingly, aspects of Jameson’s critique have provided a template for critics of postmodernism from the political center and right. Jordan Peterson’s cartoonish characterization of postmodernism as “an infinite number of ways to interpret a finite set of phenomena” and his conclusion that under such a world view “no interpretation can be privileged above another” merely serve as a means of misdirection allowing him to privilege his own odious interpretations. Steven Pinker’s work has more academic credibility, but the family resemblance is visible, as when he asks an imagined postmodernist this rhetorical question: “If truth is just socially constructed, would you say that climate change is a myth?” But to assert that the truth is socially constructed is not tantamount to saying it is impossible to privilege any one truth above others. We can and we do. The potency of the postmodern approach lies in its ability to question why and how such privileging occurs.

The issue has never been that postmodernism has gone too far but that academics have yet to go far enough. Given the precarity of our current political, ecological, and epistemological climate, the time has come for academics to embrace postmodernism — to activate not only its potential for critique but also its power to create space for change.

Ethan Kleinberg is a professor of history and letters at Wesleyan University and editor-in-chief of History and Theory.

Still Modernist, After All These Years

Marjorie Perloff

Contradiction, disruption, dislocation, decentering: In the 1970s and ’80s, I was a confirmed believer in those defining attributes of what was known as postmodernism. From David Antin’s groundbreaking essay “Modernism and Postmodernism: Approaching the Present in American Poetry,” published in the first issue (1972) of boundary 2, to Jean-François Lyotard’s The Postmodern Condition (1979), to Fredric Jameson’s definitive “Postmodernism, or the Cultural Logic of Late Capitalism” (1984), with its elaboration of “the death of the author,” “the waning of affect,” the “new depthlessness,” the blank parody we call “pastiche,” and “the new spatial logic of the simulacrum” (the term simulacrum made famous by Jean Baudrillard), postmodernism was the order of the day.

Ihab Hassan’s elaborate charts, which pitted modernism against postmodernism, using such categories as urbanism versus the global village, elitism versus community and anarchy, and irony versus camp and the absurd, were widely cited as if their neat bifurcation were a fact of life in the second half of the 20th century. Derridean anti-essentialism demanded that we all scoff at the very possibility of transcendental value or external “truth.” The only difference was between those who thought the coming of postmodernism was a good thing — a liberating notion ushering in a new avant-garde — and those who saw it as the darker and inevitable logic of a late and brutal capitalism.

Jameson was of the second camp, I myself of the first. The artists I loved and championed — John Cage, Jasper Johns, Robert Smithson, Laurie Anderson, the Language poets — were doing things that seemed new and exciting; theirs were “breakthrough” performances (and indeed performance art was deemed to be a leading new practice), challenging the “old” modernist ethos, with its “elitist” canon of predominantly white European males. But whether one approved or disapproved of postmodernism, it was, well into the 1990s, considered a valuable period concept.

Then suddenly — or at least it seemed sudden to many of us — postmodernism was finished. The turning point came, I believe, with 9/11, although no one realized it at the time. The attack on the World Trade Center at the threshold of the new century, with its terrible death count, followed by the coming of ISIS and other political upheavals and culminating in the election of Donald Trump, in 2016, made it increasingly difficult to talk of simulacra and multiple truths, of the absurdity of master narratives and the refusal of all categorical imperatives. Derridean différance increasingly gave way to the concept of diversity, which, in practice, means the necessity for previously underrepresented communities to gain recognition. The notion of decentering now came to refer not to ambiguity or undecidability, but to a statistically based inclusiveness.

In the strangest way, Proust and Kafka, Gertrude Stein and Marcel Duchamp, remain unsurpassable.

Indeed, in 2019 the pendulum seems to have swung as far away from “postmodernism” as possible. The sentence “Trump is a racist” is now regularly pronounced on CNN as if it were a simple fact, equivalent to “Trump is 6’3″.” The assumption is that racist means a specific thing and Trump is definitely that thing. Or again, when people today refer to “social justice,” a term postmodernism would have been reluctant to use, they see no need to define the term. No simulacrum here: We all know what social justice would and should look like.

Accordingly, if postmodernism can now refer to anything, it can only be a historical period, from around 1960 to 2000. What, then, about our own moment? Surely there is no use calling ours the post-postmodern, especially since — and this is the real complication — even as postmodernism has gone away, modernism is still very much with us.

Indeed, the art and literature once thought to be “postmodern” (say, the theater of Beckett) now seem more properly “late modern.” And the work of the great modernists — Joyce and Pound, Mahler and Malevich — continues to be hotly debated and discussed, even as their postmodern successors, like William Burroughs and Günter Grass, have largely been eclipsed. It seems we are somehow still in the orbit of that great revolution known as modernism. Indeed, one of the seminal books Fredric Jameson has published in this century is a fat volume called The Modernist Papers (Verso, 2007). In the strangest way, Proust and Kafka, Gertrude Stein and Marcel Duchamp, remain unsurpassable.

Given the digital revolution, given the displacement and migration of people around the globe, given the increasingly urgent call for political transformation and for the end of racial and gender inequality, why is it that, at least in the arts, modernist norms have remained so hard to shake off? I can’t answer that question. But time will surely provide an epithet or two for our own epoch. And it won’t be postmodernism.

Marjorie Perloff is an emerita professor of English at Stanford University.


It’s Back, Baby

Jessica Burstein

Fashion explains everything. Hemlines, hummingbirds, architecture, the structure of scientific revolutions, universities. And I’ve got a party to get to, so I’m going to make this fast.

First, you have to understand that logic is a mug’s game. It works and all that, and I like it that bridges don’t fall down, but the Enlightenment was one among a number of trends. Also, by the way, you are an irrational being — Michael Lewis tells us that you just need a narrative to swallow and you’ll think anything is reasonable. Freud said that too, but that’s another story.

Second point: Academia is not only a very good idea; it is populated by human beings. Most of us were not invited to the prom, so we’re especially pleased to be included in “critical waves.” A critical wave is something that two people said and then there was a special session at the MLA (“Whither Postmodernism?”) where a professor started crying and The Chronicle did a story on it.

Third: Academic trends are just that.

Fourth: The only reason you’re wearing denim this year (you idiot) is because you weren’t last year. Ditto hemlines: Up then means down now. Get it?

Fifth — and here I’m getting indigenous, but my Uber is still three minutes away — the 1990s were investment bankers telling you at cocktail parties it would be great to be a professor since they have summers off, and I was concentrating on not throwing up at my qualifying exams. Modernism was fascism, and then there was the time I cajoled Derrida into telling me what he thought of the new book on Paul de Man. I can’t write what he said, and not just because my French lexicon of obscenities is underpopulated. You (don’t) remember de Man — deconstruction, Yale, and the grad student who found de Man’s newspaper columns in the 1940s collaborationist paper Le Soir? The handsomest man in academia had co-edited a book called Responses — go look it/him up. Belgium was actually in the news. There was, if not a feeding frenzy, at least an animated chow-down in which lots of folks aligned deconstruction with the idea that words meant nothing, that these guys (not sic) said there was no meaning, and so that led to the breakdown of a moral center, and next stop gas chambers.

Academia is not only a very good idea; it is populated by human beings.

Sound familiar? It’s back, baby. And you look fab-u-lous dragged out in that vegan kidskin moral recrimination Mary Quant miniskirt, throwback Armani #MelanieGriffithheaddesk power-suit jacket, and neoliberal kidskin gloves. (By the way, are there any ladies aboard this little bus?) Anyhoo, the hair shirt comes in Gray Beard and Ivy-Hall, but either way it ain’t optional.

Jessica Burstein is an associate professor of English and gender, women, and sexuality at the University of Washington.

Are Postmodernism and #MeToo Incompatible?

Seo-Young Chu

There’s a so-called love scene in the 1982 film Blade Runner — usually considered a classic of postmodern art and a staple of the postmodern curriculum — through which I have often fast-forwarded or dissociated.

In this “love scene,” Rick Deckard (the film’s protagonist, a bounty hunter played by Harrison Ford) attempts to kiss Rachael (a “replicant,” or humanoid artifact played by Sean Young). At first Rachael rejects Deckard’s advances. She struggles against him physically. She tries frantically to leave his apartment. But when he commands, “Say, ‘Kiss me,’” she begins to relent: “Kiss me,” she echoes with reluctance. When Deckard goes on to instruct her, repeatedly, to say, “I want you,” she complies in a barely audible voice. Throughout this exchange, Rachael’s face is alive with fear while Deckard’s face is menacing. Meanwhile, the music — a slow yet vigorous and grotesquely tender diatonic saxophone melody that stands out in what is otherwise an explicitly eerie cyberpunk soundscape — announces that what we are witnessing is a triumphant romantic conquest.

I first saw Blade Runner as a naïve teenager in the 1990s. Even back then, I found the “love scene” disturbing, though I could not articulate exactly why. After my own encounter with sexual violence (I was raped and sexually harassed in 2000 by a man in a position of power), I continued to have trouble articulating why I would fast-forward or dissociate during the “love scene” while rewatching the movie.

The film’s spectacular cityscape still entranced me — despite or perhaps because of its Techno-Oriental aura. Thus I was able to teach and to some extent enjoy Blade Runner, telling students that it was one of my favorite films. It was, I thought, a postmodern cinematic masterpiece in which robot rights become trenchantly available for representation when the replicant Roy (played by Rutger Hauer) utters his dying soliloquy.

Recently, after #MeToo, I taught Blade Runner again. This time one of my students emailed me about the “love scene.” She was upset by it. Upon reading her email, I forced myself to watch the scene in its entirety while struggling not to dissociate. A word like “postmodern” would be obscenely irrelevant to a discussion of my reaction, which was visceral, raw, too real, too authentic, too present, the opposite of absent, the opposite of depthless. It was horrific. The coercion, the obvious unwillingness, the trauma — I don’t think I can teach Blade Runner ever again.

If postmodernism renders the replicant Rachael legible as a glossy simulacrum, then #MeToo renders her brutally legible as a victim of sexual violence.

I will conclude this piece the way I sometimes conclude my classroom lectures: with questions for discussion. If you disagree with my reaction to Blade Runner, how would you support your argument? In what ways, if any, is #MeToo postmodern?

Seo-Young Chu is an associate professor of English at the City University of New York’s Queens College.


The Best Lack All Conviction

David Bromwich

Was there ever a more cheerless name for an art-historical movement? Yet the name fit the mood: Postmodernism wanted to be history before it wagered anything as art. “Neoclassicism” and “the Gothic Revival” went halfway to self-description by declaring that they aimed at a return. Whereas the very word postmodern spoke of exhaustion. It came after a something whose own name meant “of the mode; of the moment.” What could it mean to come after the mode, after the moment?

The idea that the postmodern bears a special relationship to high capitalism owed much to Robert Venturi, Denise Scott Brown, and Steven Izenour’s architectural primer Learning From Las Vegas (MIT Press, 1972). Their celebratory approach nonetheless signaled a radical intention, if you knew how to read the surfaces: “Learning from the existing landscape is a way of being revolutionary for an architect.” Of course it is also a way of being conventional. This bland evasiveness would become a characteristic manner for Pomo theorists.

The very word postmodern spoke of exhaustion. 

The postmodern meant coming after and pointing to your location. Its key term of art was the word “reference,” transformed from a noun to a verb in sentences that would once have used “mention” or “allude to.” See how this design by Venturi references Gothic ornamentality. Or how Scorsese references the iconic taxi scene in On the Waterfront. For that matter, the Pomo word “iconic,” meaning generally known and known to be known, began its climb to pop currency as a synonym for “famous.” It imparted to fame itself a sacral overtone.

In universities, the new language carried its widest appeal in the visual arts, with some spillover in literary theory and a small allowed ration in political theory and philosophy. Another sample from Venturi, Scott Brown, and Izenour:

A choice between Perpendicular and Decorated for [19th-century] English churches reflected theological differences between the Oxford and Cambridge Movements. The hamburger-shaped hamburger stand is a current, more literal, attempt to express function via association but for commercial persuasion rather than theological refinement.

This blurring of lines between theology and commerce, art and commerce, high art and the existing landscape of motels and shopping malls, was a Pomo gesture; the authors preferred “roadside eclecticism” and “representational architecture along highways” to the hostile environment built up by modernists like Le Corbusier and Gropius. What was true in architecture could be extended as a democratizing move to literature, painting, music, film.

Ideas of individual talent or personal style were accordingly called into question or “decentered.” But the retrospective judgment of Jean-François Lyotard on his influential book The Postmodern Condition might be taken to qualify the importance of the result: “I made up stories, I referred to a quantity of books I’d never read, apparently it impressed people.”

Postmodernism is sometimes linked with relativism, a speculative theory no one has ever lived. You can’t believe you are right about the worth of an activity or the truth of a proposition and simultaneously believe a person who thinks the opposite is also right. Still, there may be an affinity here: The genuine disagrees with the fake, but Pomo theory called that binary (among others) into question. Postmodernism was a sophisticated stand-in for a lack of conviction, and it prospered most of all on the nonpolitical left of the 1980s and 1990s.

Fredric Jameson’s New Left Review essay “Postmodernism, or the Cultural Logic of Late Capitalism” was published five years before Francis Fukuyama’s National Interest essay “The End of History?” The politics of the two authors differed markedly, but the spectacle of hypnotic distraction, which Jameson described in ambiguous tones, was the cultural facade of the same weak-spirited luxury whose triumph Fukuyama celebrated after the fall of communism. In the academic mood of the arts today, it would be hard to find anyone with a good word for postmodernism. The pendulum has swung the other way. We are on the brink of a return of the social-realist doctrine of the 1930s. The arts are said to matter chiefly for their service to culture, and culture is understood to be a province of politics.

David Bromwich is a professor of English at Yale University.
