Originally published in The Review of Contemporary Fiction, this essay sketches a loose history of American television, advertising, and U.S. fiction. The title “E Unibus Pluram” means “out of one, many” and is a solipsistic play on the American motto “E Pluribus Unum,” which means “out of many, one.”
While Wallace acknowledges that TV is fun – evident in the fact that television is watched over six hours a day in the average American household – he also claims that fiction writers do not take TV “seriously enough as both a disseminator and a definer of the cultural atmosphere we breathe and process.” He faults TV critics for their inattention to the effects of long-term television viewing on the average American. Critics, he argues, disdain TV’s “vapidity” yet watch with “beady-eyed fascination”; they “simultaneously hate, fear, and need television” – a measure of just how powerful the medium is. Wallace goes on to argue that television has co-opted irony. Ironic ads offer us a way to feel like we’re in on the joke. He writes about a Pepsi commercial that invites “Joe Briefcase” – Wallace’s personification of the individuals sitting alone at home, all together watching TV (E Unibus Pluram) – to feel that he has transcended the masses Pepsi is advertising to. The ironic part is that Joe Briefcase then goes out and buys more Pepsi; he has transcended nothing, and is certainly not above mass consumption.
Wallace notes that “American fiction remains deeply informed by television” and that they interact in a realm of self-conscious irony. He writes about a Fiction of Image as a response to television culture. He describes Mark Leyner’s My Cousin, My Gastroenterologist as a “dazzling televisual parody” and argues that its “sole aim is, finally, to wow, to ensure that the reader is pleased and continues to read” – in short, that the novel’s objective is exactly the same as television’s objective, and that it achieves this objective by flattering the reader for getting the joke, just as ironic TV ads do.
Wallace ends “E Unibus Pluram” by stating (not asking): “Are you immensely pleased.” This could just be a direct echo of an ironic statement from DeLillo’s White Noise that Wallace quotes early on in the essay: “he [Gladney? I don’t have White Noise on hand…] seemed immensely pleased by this”, with “this” referring to the impossibility of getting outside the aura. However, Wallace could also be poking fun at his own attempt to get outside the aura and revealing his self-conscious anxiety with regard to his own artistic aim, which he fears may only be to get the reader to like him. Furthermore, that desire for the reader’s approval can easily be read as Wallace’s desire for his grammarian mother’s approval—hence his constant grandiloquence.
Recent technological advances stretch some of Wallace’s key arguments. For example, while Wallace notes that the VCR “threatens the very viability of commercials,” today we can view nearly all television completely commercial-free by using TiVo or streaming sites. Because of this, the television industry focuses more on product placement, which we can never really escape. It’s not merely that “programs start to resemble commercials,” but that commercials are fundamentally embedded in, and sometimes vital to, the plot-lines of programs. With On Demand, iTunes, torrents, and YouTube, we can now truly, as Wallace predicted, “engineer our own dreams.” A similar development is the prevalence of gadgets and smartphones integral to the narratives of films and shows – the omnipresence of the iPhone.
The key question this essay should bring out is: do the shows we watch imitate life, or does life imitate the shows we watch?
Fiction writers as a species tend to be oglers. They tend to lurk and to stare. The minute fiction writers stop moving, they start lurking, and stare. They are born watchers. They are viewers. They are the ones on the subway about whose nonchalant stare there is something creepy, somehow. Almost predatory. This is because human situations are writers’ food. Fiction writers watch other humans sort of the way gapers slow down for car wrecks: they covet a vision of themselves as witnesses.
But fiction writers as a species also tend to be terribly self-conscious. Even by U.S. standards. Devoting lots of productive time to studying closely how people come across to them, fiction writers also spend lots of less productive time wondering nervously how they come across to other people. How they appear, how they seem, whether their shirttail might be hanging out their fly, whether there’s maybe lipstick on their teeth, whether the people they’re ogling can maybe size them up as somehow creepy, lurkers and starers.
The result is that a surprising majority of fiction writers, born watchers, tend to dislike being objects of people’s attention. Being watched. The exceptions to this rule – Mailer, McInerney, Janowitz – create the misleading impression that lots of belles-lettres types like people’s attention. Most don’t. The few who like attention just naturally get more attention. The rest of us get less, and ogle.
Most of the fiction writers I know are Americans under forty. I don’t know whether fiction writers under forty watch more television than other American species. Statisticians report that television is watched over six hours a day in the average American household. I don’t know any fiction writers who live in average American households. I suspect Louise Erdrich might. Actually I have never seen an average American household. Except on TV.
So right away you can see a couple of things that look potentially great, for U.S. fiction writers, about U.S. television. First, television does a lot of our predatory human research for us. American human beings are a slippery and protean bunch, in real life, as hard to get any kind of univocal handle on as a literary territory that’s gone from Darwinianly naturalistic to cybernetically post-postmodern in eighty years. But television comes equipped with just such a syncretic handle. If we want to know what American normality is – what Americans want to regard as normal – we can trust television. For television’s whole raison is reflecting what people want to see. It’s a mirror. Not the Stendhalian mirror reflecting the blue sky and mud puddle. More like the overlit bathroom mirror before which the teenager monitors his biceps and determines his better profile. This kind of window on nervous American self-perception is just invaluable, fictionwise. And writers can have faith in television. There is a lot of money at stake, after all; and television retains the best demographers applied social science has to offer, and these researchers can determine precisely what Americans in 1990 are, want, see: what we as Audience want to see ourselves as. Television, from the surface on down, is about desire. Fictionally speaking, desire is the sugar in human food.
The second great thing is that television looks to be an absolute godsend for a human subspecies that loves to watch people but hates to be watched itself. For the television screen affords access only one way. A psychic ball-check valve. We can see Them; They can’t see Us. We can relax, unobserved, as we ogle. I happen to believe this is why television also appeals so much to lonely people. To voluntary shut-ins. Every lonely human I know watches way more than the average U.S. six hours a day. The lonely, like the fictional, love one-way watching. For lonely people are usually lonely not because of hideous deformity or odor or obnoxiousness – in fact there exist today social and support groups for persons with precisely these features. Lonely people tend rather to be lonely because they decline to bear the emotional costs associated with being around other humans. They are allergic to people. People affect them too strongly. Let’s call the average U.S. lonely person Joe Briefcase. Joe Briefcase just loathes the strain of the self-consciousness which so oddly seems to appear only when other real human beings are around, staring, their human sense-antennae abristle. Joe B. fears how he might appear to watchers. He sits out the stressful U.S. game of appearance poker.
But lonely people, home, alone, still crave sights and scenes. Hence television. Joe can stare at Them, on the screen; They remain blind to Joe. It’s almost like voyeurism. I happen to know lonely people who regard television as a veritable deus ex machina for voyeurs. And a lot of the criticism, the really rabid criticism less leveled than sprayed at networks, advertisers, and audiences alike, has to do with the charge that television has turned us into a nation of sweaty, slack-jawed voyeurs. This charge turns out to be untrue, but for weird reasons.
What classic voyeurism is is espial: watching people who don’t know you’re there as they go about the mundane but erotically charged little businesses of private life. It’s interesting that so much classic voyeurism involves media of framed glass – windows, telescopes, etc. Maybe the framed glass is why the analogy to television is so tempting. But TV-watching is a different animal from Peeping Tourism. Because the people we’re watching through TV’s framed-glass screen are not really ignorant of the fact that somebody is watching them. In fact a whole lot of somebodies. In fact the people on television know that it is in virtue of this truly huge crowd of ogling somebodies that they are on the screen, engaging in broad non-mundane gestures, at all. Television does not afford true espial because television is performance, spectacle, which by definition requires watchers. We’re not voyeurs here at all. We’re just viewers. We are the Audience, megametrically many, though most often we watch alone. E unibus pluram.(1)
One reason fiction writers seem creepy in person is that by vocation they really are voyeurs. They need that straightforward visual theft of watching somebody without his getting to prepare a special watchable self. The only real illusion in espial is suffered by the voyee, who doesn’t know he’s giving off images and impressions. A problem with so many of us fiction writers under forty using television as a substitute for true espial, however, is that TV “voyeurism” involves a whole gorgeous orgy of illusions for the pseudo-spy, when we watch. Illusion (1) is that we’re voyeurs here at all: the voyees behind the screen’s glass are only pretending ignorance. They know perfectly well we’re out there. And that we’re there is also very much on the minds of those behind the second layer of glass, the lenses and monitors via which technicians and arrangers apply no small ingenuity to hurl the visible images at us. What we see is far from stolen; it’s proffered – illusion (2). And, illusion (3), what we’re seeing through the framed pane isn’t people in real situations that do or even could go on without consciousness of Audience. What young writers are scanning for data on some reality to fictionalize is already composed of fictional characters in highly ritualized narratives. Plus, (4), we’re not really even seeing “characters” at all: it’s not Major Frank Burns, pathetic self-important putz from Fort Wayne, Indiana; it’s Larry Linville of Ojai, California, actor stoic enough to endure thousands of letters (still coming in, even in syndication) from pseudo-voyeurs mistakenly berating him for being a putz. And, if (5) isn’t too out-there for you, it’s ultimately of course not even actors we’re espying, not even people: it’s EM-propelled analog waves and ionized streams and rear-screen chemical reactions throwing off phosphenes in grids of dots not much more lifelike than Seurat’s own impressionistic “statements” on perceptual illusion. Good lord and (6) the dots are coming out of our furniture, all we’re spying on is our own furniture; and our very own chairs and lamps and bookspines sit visible but unseen at our gaze’s frame as we contemplate “Korea” or are “taken live to Amman, Jordan,” or regard the plusher chairs and classier spines of the Huxtable “home” as illusory cues that this is some domestic interior whose membrane we have, slyly, unnoticed, violated. (7) and (8) and illusions ad inf.
Not that realities about actors and phosphenes and furniture are unknown to us. We simply choose to ignore them. For six hours a day. They are part of the belief we suspend. But we’re asked to hoist such a heavy load aloft. Illusions of voyeurism and privileged access require real complicity from viewers. How can we be made so willingly to acquiesce for hours daily to the illusion that the people on the TV don’t know they’re being looked at, to the fantasy that we’re transcending privacy and feeding on unself-conscious human activity? There might be lots of reasons why these unrealities are so swallowable, but a big one is that the performers behind the two layers of glass are – varying degrees of Thespian talent aside – absolute geniuses at seeming unwatched. Now, seeming unwatched in front of a TV camera is a genuine art. Take a look at how civilians act when a TV camera is pointed at them: they simply spaz out, or else go all rigor mortis. Even PR people and politicians are, camera-wise, civilians. And we love to laugh at how stiff and false non-professionals appear, on television. How unnatural. But if you’ve ever once been the object of that terrible blank round glass stare, you know all too well how self-conscious it makes you. A harried guy with earphones and a clipboard tells you to “act natural” as your face begins to leap around on your skull, struggling for a seemingly unwatched expression that feels impossible because “seeming unwatched” is, like the “act natural” which fathered it, oxymoronic. Try driving a golf ball as someone asks you whether you in- or exhale on your backswing, or getting promised lavish rewards if you can avoid thinking of a rhinoceros for ten seconds, and you’ll get some idea of the truly heroic contortions of body and mind that must be required for Don Johnson to act unwatched as he’s watched by a lens that’s an overwhelming emblem of what Emerson, years before TV, called “the gaze of millions.”
Only a certain very rare species of person, for Emerson, is “fit to stand the gaze of millions.” It is not your normal, hard-working, quietly desperate species of American. The man who can stand the megagaze is a walking imago, a certain type of transcendent freak who, for Emerson, “carries the holiday in his eye.”(2) The Emersonian holiday television actors’ eyes carry is the potent illusion of a vacation from self-consciousness. Not worrying about how you come across. A total unallergy to gazes. It is contemporarily heroic. It is frightening and strong. It is also, of course, an act, a counterfeit impression – for you have to be just abnormally self-conscious and self-controlling to appear unwatched before lenses. The self-conscious appearance of unself-consciousness is the grand illusion behind TV’s mirror-hall of illusions; and for us, the Audience, it is both medicine and poison.
For we gaze at these rare, highly trained, seemingly unwatched people for six hours daily. And we love these people. In terms of attributing to them true supernatural assets and desiring to emulate them, we sort of worship them. In a real Joe Briefcase-type world that shifts ever more starkly from some community of relationships to networks of strangers connected by self-interest and contest and image, the people we espy on TV offer us familiarity, community. Intimate friendship. But we split what we see. The characters are our “close friends”; but the performers are beyond strangers, they’re images, demigods, and they move in a different sphere, hang out with and marry only each other, seem even as actors accessible to Audience only via the mediation of tabloids, talk show, EM signal. And yet both actors and characters, so terribly removed and filtered, seem so natural, when we watch.
Given how much we watch and what watching means, it’s inevitable – but toxic – for those of us fictionists or Joe Briefcases who wish to be voyeurs to get the idea that these persons behind the glass, persons who are often the most colorful, attractive, animated, alive people in our daily experience, are also people who are oblivious to the fact that they are watched. It’s toxic for allergic people because it sets up an alienating cycle, and also for writers because it replaces fiction research with a weird kind of fiction consumption. We self-conscious Americans’ oversensitivity to real humans fixes us before the television and its ball-check valve in an attitude of rapt, relaxed reception. We watch various actors play various characters, etc. For 360 minutes per diem, we receive unconscious reinforcement of the deep thesis that the most significant feature of truly alive persons is watchableness, and that genuine human worth is not just identical with but rooted in the phenomenon of watching. And that the single biggest part of real watchableness is seeming to be unaware that there’s any watching going on. Acting natural. The persons we young fiction writers and assorted shut-ins most study, feel for, feel through are, by virtue of a genius for feigned unself-consciousness, fit to stand gazes. And we, trying desperately to be nonchalant, perspire creepily, on the subway.
Weighty existential predicaments aside, there’s no denying that people in the U.S.A. watch so much television because it’s fun. I know I watch for fun, most of the time, and that at least 51 percent of the time I do have fun when I watch. This doesn’t mean I do not take television seriously. One claim of this essay is that the most dangerous thing about television for U.S. fiction writers is that we yield to the temptation not to take television seriously as both a disseminator and a definer of the cultural atmosphere we breathe and process, that many of us are so blinded by constant exposure that we regard TV the way Reagan’s lame FCC chairman Mark Fowler professed to in 1981, as “just another appliance, a toaster with pictures.”(3)
Television nevertheless is just plain pleasurable, though it may seem odd that so much of the pleasure my generation gets from television lies in making fun of it. But you have to remember that younger Americans grew up as much with people’s disdain for TV as we did with TV itself. I knew it was a “vast wasteland” way before I knew who Newton Minow or Mark Fowler were. And it’s just fun to laugh cynically at television – at the way the laughter from sitcoms’ “live studio audience” is always suspiciously constant in pitch and duration, or at the way travel is depicted on The Flintstones by having the exact same cut-rate cartoon tree, rock, and house go by four times. It’s fun, when a withered June Allyson comes on-screen for Depend Adult Undergarments and says “If you have a bladder-control problem, you’re not alone,” to hoot and shout back “Well, chances are you’re alone quite a bit, June!”
Most scholars and critics who write about U.S. popular culture, though, seem both to take TV seriously and to suffer real pain over what they see. There’s this well-known critical litany about television’s vapidity, shallowness, and irrealism. The litany is often far cruder and triter than what the critics complain about, which I think is why most younger viewers find pro criticism of television far less interesting than pro television itself. I found solid examples of what I’m talking about on the first day I even looked. The New York Times Arts & Leisure section for Sunday, 8/05/90, simply bulged with bitter critical derision for TV, and some of the most unhappy articles weren’t about just low-quality programming so much as about how TV’s become this despicable instrument of cultural decay. In a summary review of all 1990’s “crash and burn” summer box-office hits in which “realism … seems to have gone almost entirely out of fashion,” Janet Maslin locates her true anti-reality culprit: “We may be hearing about ‘real life’ on television shows made up of 15-second sound bites (in which ‘real people’ not only speak in brief, neat truisms but actually seem to think that way, perhaps as a result of having watched too much reality-molding television themselves).”(4) And one Stephen Holden, in what starts out as a mean pop music article, knows perfectly well what’s behind what he hates: “Pop music is no longer a world unto itself but an adjunct of television, whose stream of commercial images projects a culture in which everything is for sale and the only things that count are fame, power, and the body beautiful.”(5) This stuff just goes on and on, in the Times. The only Arts & Leisure piece I could find with anything upbeat to say about TV that morning was a breathless article on how lots of Ivy League graduates are now flying straight from school to New York and Los Angeles to become television writers and are clearing well over $200,000 to start and enjoying rapid advancement to harried clip-boarded production status. In this regard, 8/05’s Times is a good example of a strange mix that’s been around for a few years now: weary contempt for television as a creative product and cultural force, combined with beady-eyed fascination about the actual behind-the-glass mechanics of making that product and projecting that force.
Surely we all have friends we just hate to hear talk about TV because they so clearly loathe it – they sneer relentlessly at the hackneyed plots, the unlikely dialogue, the Cheez-Whiz resolutions, the bland condescension of the news anchors, the shrill wheedling of commercials – and yet are just as clearly obsessed with it, somehow need to hate their six hours a day, day in and out. Junior advertising executives, aspiring filmmakers, and graduate-school poets are in my experience especially prone to this condition where they simultaneously hate, fear, and need television, and try to disinfect themselves of whatever so much viewing might do to them by watching TV with weary irony instead of the rapt credulity most of us grew up with. (Note that most fiction writers still tend to go for the rapt credulity.)
But, since the wearily disgusted Times has its own demographic thumb on the pulse of news-readerly taste, it’s safe to conclude that most educated, Times-buying Americans are wearily disgusted by television, have this weird hate-need-fear-6-hrs.-daily gestalt about it. Published TV scholarship sure reflects this mood. And the numbingly dull quality to most “literary” television analyses is due less to the turgid abstraction scholars employ to make television seem an OK object of “aesthetic” inquiry – cf. an ’86 treatise: “The form of my Tuesday evening’s prime-time pleasure is structured by a dialectic of elision and rift among various windows through which … ‘flow’ is more of a circumstance than a product. The real output is the quantum, the smallest maneuverable broadcast bit”(6) – than to the tired, jaded cynicism of television experts who mock and revile the very phenomenon they’ve chosen as scholarly vocation. It’s like people who despise – I mean big-time, long-term despise – their spouses or jobs, but won’t split up or quit. Critical complaint degenerates quickly into plain whining. The fecund question about U.S. television is no longer whether there are some truly nasty problems here but rather what on earth’s to be done about them. On this question pop critics are mute.
In fact it’s in the U.S. arts, particularly in certain strands of contemporary American fiction, that the really interesting questions about end-of-the-century TV – What is it about televisual culture that we so hate? Why are we so immersed in it if we hate it so? What implications are there in our sustained voluntary immersion in stuff we hate? – are being addressed. But they are also, weirdly, being asked and answered by television itself. This is another reason why most TV criticism seems so empty. Television’s managed to become its own most profitable critic.
A.M., 8/05/90, as I was scanning and sneering at the sneering tone of the prenominate Times articles, a syndicated episode of St. Elsewhere was on the TV, cleaning up in a Sunday-morning Boston market otherwise occupied by televangelists, infomercials, and the steroid- and polyurethane-ridden American Gladiators, itself not charmless but definitely a low-dose show. Syndication is another new area of public fascination, not only because huge cable stations like Chicago’s WGN and Atlanta’s WTBS have upped the stakes from local to national, but because syndication is changing the whole creative philosophy of network television. Since it is in syndication deals (where the distributor gets both an up-front fee for a program and a percentage of the ad-slots for his own commercials) that the creators of successful television series realize truly gross profits, many new programs are designed and pitched with both immediate prime-time and down-the-road syndication audiences in mind, and are now informed less by dreams of the ten-year-beloved-TV-institution-type run – Gunsmoke, M*A*S*H – than of a modest three-year run that yields the seventy-eight in-can episodes required for an attractive syndication package. I, like millions of other Americans, know this stuff only because I saw a special three-part report about syndication on Entertainment Tonight, itself the first nationally syndicated “news” program and the first infomercial so popular that TV stations were willing to pay for it.
Sunday syndication is also intriguing because it makes for juxtapositions as eerily apposite as anything French surrealists could contrive. Lovable warlocks on Bewitched and commercially Satanic heavy-metal videos on America’s Top 40 run opposite airbrushed preachers decrying demonism in U.S. culture. Or, better, 8/05’s St. Elsewhere episode 94, originally broadcast in 1988, aired on Boston’s Channel 38 immediately following two back-to-back episodes of The Mary Tyler Moore Show, that icon of seventies pathos. The plots of the two Mary Tyler Moore Shows are unimportant here. But the St. Elsewhere episode that followed them partly concerned a cameo-role mental patient afflicted with the delusional belief that he was Mary Richards from The Mary Tyler Moore Show. He further believed that a fellow cameo-role mental patient was Rhoda, that Dr. Westphall was Mr. Grant, and that Dr. Auschlander was Murray. This psychiatric subplot was a one-shot; it was resolved by episode’s end. The pseudo-Mary (a sad lumpy-looking guy who used to play one of Dr. Hartley’s neurotic clients on the old Bob Newhart Show) rescues the other cameo-role mental patient, whom he believes to be Rhoda and who has been furious in his denials that he is female, much less fictional (and who is himself played by the guy who used to play Mr. Carlin, Dr. Hartley’s most intractable client), from assault by a bit-part hebephrene. In gratitude, Rhoda/Mr. Carlin/mental patient declares that he’ll consent to be Rhoda if that’s what Mary/neurotic client/mental patient wants. At this too-real generosity, the pseudo-Mary’s psychotic break breaks. The sad guy admits to Dr. Auschlander that he’s not Mary Richards. He’s actually just a plain old amnesiac, minus a self, existentially adrift. He has no idea who he is. He’s lonely. He watches a lot of television. He figured it was “better to believe I was a TV character than not to believe I was anybody.” Dr. Auschlander takes the penitent patient for a walk in the wintery Boston air and promises that he, the identityless guy, can someday find out who he really is, provided he can dispense with “the distraction of television.” At this cheery prognosis, the patient removes his own fuzzy winter beret and throws it into the air. The episode ends with a freeze of the aloft hat, leaving at least one viewer credulously rapt.
This would have been just another clever low-concept eighties TV story, where the final cap-tossing and closing credits coyly undercut Dr. Auschlander’s put-down of television, were it not for the countless layers of ironic, involuted TV imagery and data that whirl around this high-concept installment. Because another of this episode’s cameo stars, drifting through a different subplot, is one Betty White, Sue Ann Nivens of the old Mary Tyler Moore Show, here playing a tortured NASA surgeon (don’t ask). It is with almost tragic inevitability, then, that Ms. White, at thirty-two minutes into the episode, meets up with the TV-deluded pseudo-Mary in their respective tortured wanderings through the hospital’s corridors, and that she considers the mental patient’s inevitable joyful cries of “Sue Ann!” with a too-straight face and says he must have her confused with someone else. Of the convolved levels of fantasy and reality and identity here – e.g., patient simultaneously does, does not, and does have Betty White “confused” with Sue Ann Nivens – we needn’t speak in detail: doubtless a Yale Contemporary Culture dissertation is underway on R. D. Laing and just this episode. But the most interesting levels of meaning here lie, and point, behind the lens. For NBC’s St. Elsewhere, like The Mary Tyler Moore Show and The Bob Newhart Show before it, was created, produced, and guided into syndication by MTM Studios, owned by Mary Tyler Moore and overseen by her husband, later NBC Chair Grant Tinker; and St. Elsewhere’s scripts and subplots are story-edited by Mark Tinker, Mary’s stepson and Grant’s heir. The deluded mental patient, an exiled, drifting veteran of one MTM program, reaches piteously out to the exiled, drifting (literally – NASA, for God’s sake) veteran of another MTM production, and her ironic rebuff is scripted by MTM personnel, who accomplish the parodic undercut of MTM’s Dr. Auschlander with the copyrighted MTM hat-gesture of one MTM veteran who’s “deluded” he’s another. Dr. A.’s Fowleresque dismissal of TV as just a “distraction” is less absurd than incoherent. There is nothing but television on this episode; every joke and dramatic surge depends on involution, metatelevision. It is in-joke within in-joke.
So then why do I get it? Because I, the viewer, outside the glass with the rest of the Audience, am nevertheless in on the in-joke. I’ve seen Mary Tyler Moore’s “real” toss of that fuzzy beret so often it’s moved past cliche into nostalgia. I know the mental patient from Bob Newhart, Betty White from everywhere, and I know all sorts of intriguing irrelevant stuff about MTM Studios and syndication from Entertainment Tonight. I, the pseudovoyeur, am indeed “behind the scenes,” for in-joke purposes. But it is not I the spy who have crept inside television’s boundaries. It is vice versa. Television, even the mundane little businesses of its production, has become Our interior. And we seem a jaded, jeering, but willing and knowledgeable Audience. This St. Elsewhere episode was nominated for an Emmy. For best original teleplay.
The best TV of the last five years has been about ironic self-reference like no previous species of postmodern art could have dreamed of. The colors of MTV videos, blue-black and lambently flickered, are the colors of television. Moonlighting’s Bruce and Bueller’s Ferris throw asides to the viewer every bit as bald as the old melodrama villain’s monologued gloat. Segments of the new late-night glitz-news After Hours end with a tease that features harried headphoned guys in the production booth ordering the tease. MTV’s television-trivia game show, the dry-titled Remote Control, got so popular it busted its own MTV-membrane and is in 1990 now syndicated band-wide. The hippest commercials, with stark computerized settings and blank beauties in mirrored shades and plastic slacks genuflecting before various forms of velocity, force, and adrenaline, seem like little more than TV’s vision of how TV offers rescue to those lonely Joe Briefcases passively trapped into watching too much TV.
What explains the pointlessness of most published TV criticism is that television has become immune to charges that it lacks any meaningful connection to the world outside it. It’s not that charges of nonconnection have become untrue. It’s that any such connection has become otiose. Television used to point beyond itself. Those of us born in like the sixties were trained to look where it pointed, usually at versions of “real life” made prettier, sweeter, better by succumbing to a product or temptation. Today’s Audience is way better trained, and TV has discarded what’s not needed. A dog, if you point at something, will look only at your finger.
It’s not like self-reference is new to mass entertainment. How many old radio shows – Jack Benny, Martin and Lewis, Abbott and Costello – were mostly about themselves as shows? “So, Jerry, and you said I couldn’t get a big star like Miss Lucille Ball to be a guest on our show, you little twerp.” Etc. But once television introduces the element of watching, and once it informs an economy and culture like radio never did, the referential stakes go way up. Six hours a day is more time than most people (consciously) do any one thing. How people who absorb such doses understand themselves changes, becomes spectatorial, self-conscious. Because the practice of watching is expansive. Exponential. We spend enough time watching, pretty soon we start watching ourselves watching. We start to “feel” ourselves feeling, yearn to experience “experiences.” And that American subspecies into writing starts writing more and more about….
The emergence of something called metafiction in the American sixties was and is hailed by academic critics as a radical aesthetic, a whole new literary form unshackled from the canonical cinctures of narrative and mimesis and free to plunge into reflexivity and self-conscious meditations on aboutness. Radical it may have been, but thinking that postmodern metafiction evolved unconscious of prior changes in readerly taste is about as innocent as thinking that all those students we saw on television protesting the war in southeast Asia were protesting only because they hated the war. They may have hated the war, but they also wanted to be seen protesting on television. TV was where they’d seen this war, after all. Why wouldn’t they go about hating it on the very medium that made their hate possible? Metafictionists may have had aesthetic theories out the bazoo, but they were also sentient citizens of a community that was exchanging an old idea of itself as a nation of do-ers and be-ers for a new vision of the U.S.A. as an atomized mass of self-conscious watchers and appearers. Metafiction, for its time, was nothing more than a poignant hybrid of its theoretical foe, realism: if realism called it like it saw it, metafiction simply called it as it saw itself seeing itself see it. This high-cultural postmodern genre, in other words, was deeply informed by the emergence of television. And American fiction remains informed by TV … especially those strains of fiction with roots in postmodernism, which even at its rebellious zenith was less a “response to” televisual culture than a kind of abiding-in-TV. Even back then, the borders were starting to come down.
It’s strange that it took television itself so long to wake up to watching’s potent reflexivity. Television shows about television shows were rare for a long time. The Dick Van Dyke Show was prescient, and Mary Moore carried its insight into her own decade-long study in local-market angst. Now, of course, there’s been everything from Murphy Brown to Max Headroom to Entertainment Tonight. And with Letterman, Arsenio, and Leno’s battery of hip, sardonic, this-is-just-TV shticks, the circle back to the days of “So glad to get Miss Ball on our show” has closed and come spiral, television’s power to jettison connection and castrate protest fueled by the same ironic postmodern self-consciousness it first helped fashion.
It’s going to take a while, but I’m going to prove to you that the nexus where television and fiction converse and consort is self-conscious irony. Irony is, of course, a turf fictionists have long worked with zeal. And irony is important for understanding TV because “T.V.,” now that it’s gotten powerful enough to move from acronym to way of life, revolves off just the sorts of absurd contradictions irony’s all about exposing. It is ironic that television is a syncresis that celebrates diversity. That an extremely unattractive self-consciousness is necessary to create TV performers’ illusion of unconscious appeal. That products presented as helping you express individuality can afford to be advertised on television only because they sell to huge hordes. And so on.
Television regards irony the way the educated lonely regard television. Television both fears irony’s capacity to expose, and needs it. It needs irony because television was practically made for irony. For TV is a bisensuous medium. Its displacement of radio wasn’t picture displacing sound; it was picture added. Since the tension between what’s said and what’s seen is irony’s whole sales territory, classic televisual irony works not via the juxtaposition of conflicting pictures or conflicting sounds, but with sights that undercut what’s said. A scholarly article on network news describes a famous interview with a corporate guy from United Fruit on a CBS special about Guatemala: “I sure don’t know of anybody being so-called ‘oppressed,’” the guy in a seventies leisure suit with a tie that looks like an omelette tells Ed Rabel. “I think this is just something that some reporters have thought up.”(7) The whole interview is intercut with commentless pictures of big-bellied kids in Guatemalan slums and union organizers lying there with cut throats.
Television’s classic irony-function came into its own in the summer of 1974, as remorseless lenses opened to view the fertile “credibility gap” between the image of official disclaimer and the reality of high-level shenanigans. A nation was changed, as Audience. If even the president lies to you, whom are you supposed to trust to deliver the real? Television, that summer, presented itself as the earnest, worried eye on the reality behind all images. The irony that television is itself a river of image, however, was apparent even to a twelve-year-old, sitting there, rapt. There seemed to be no way out. Images and ironies all over the place. It’s not a coincidence that Saturday Night Live, that Athens of irreverent cynicism, specializing in parodies of (1) politics and (2) television, premiered the next fall. On television.
I’m worried when I say things like “television fears” and “television presents itself” because, even though it’s an abstraction necessary to discourse, talking about television as if it were an entity can easily slip into the worst sort of anti-TV paranoia, treating of TV as some autonomous diabolical corrupter of personal agency and community gumption. I am anxious to avoid anti-TV paranoia here. Though I’m convinced that television lies, with a potency somewhere between symptom and synecdoche, behind a genuine crisis for U.S. culture and lit today, I don’t share reactionary adults’ vision of TV as some malignancy visited on an innocent populace, sapping IQs and compromising SAT scores while we all sit there on ever fatter bottoms with little mesmerized spirals revolving in our eyes. Because conservative critics like Samuel Huntington and Barbara Tuchman who try to claim that TV’s lowering of our aesthetic standards is responsible for a “contemporary culture taken over by commercialism directed to the mass market and necessarily to mass taste”(8) can be refuted by observing that their propter hoc isn’t even post hoc: by 1830 de Tocqueville had already diagnosed American culture as peculiarly devoted to easy sensation and mass-marketed entertainment, “spectacles vehement and untutored and rude” that aimed “to stir the passions more than to gratify the taste.”(9)
It’s undeniable that television is an example of “low” art, the sort of art that tries too hard to please. Because of the economics of nationally broadcast, advertiser-subsidized entertainment, television’s one goal – never denied by anybody in or around TV since RCA first authorized field tests in 1936 – is to ensure as much watching as possible. TV is the epitome of low art in its desire to appeal to and enjoy the attention of unprecedented numbers of people. But TV is not low because it is vulgar or prurient or stupid. It is often all these things, but this is a logical function of its need to please Audience. And I’m not saying that television is vulgar and dumb because the people who compose Audience are vulgar and dumb. Television is the way it is simply because people tend to be really similar in their vulgar and prurient and stupid interests and wildly different in their refined and moral and intelligent interests. It’s all about syncretic diversity: neither medium nor viewers are responsible for quality.
Still, for the fact that American humans consume vulgar, prurient, stupid stuff at the sobering clip of six hours a day, for this both TV and we need to answer. We are responsible basically because nobody is holding any weapons on us forcing us to spend amounts of time second only to sleep doing something that is, when you come right down to it, not good for us. Sorry to sound judgmental, but there it is: six hours a day is not good.
Television’s biggest minute-by-minute appeal is that it engages without demanding. One can rest while undergoing stimulation. Receive without giving. In this respect, television resembles other things mothers call “special treats” – e.g., candy, or liquor – treats that are basically fine and fun in small amounts but bad for us in large amounts and really bad for us if consumed as any kind of nutritive staple. One can only guess what volume of gin or poundage of Toblerone six hours of special treat a day would convert to.
On the surface of the problem, television is responsible for our rate of its consumption only in that it’s become so terribly successful at its acknowledged job of ensuring prodigious amounts of watching. Its social accountability seems sort of like that of designers of military weapons: unculpable right up until they get a little too good at their job.
But the analogy between television and liquor is best, I think. Because I’m afraid Joe Briefcase is a teleholic. Watching TV can become malignantly addictive. TV may become malignantly addictive only once a certain threshold of quantity is habitually passed, but then the same is true of whiskey. And by “malignant” and “addictive” I again do not mean evil or coercive. An activity is addictive if one’s relationship to it lies on that downward-sloping continuum between liking it a little too much and downright needing it. Many addictions, from exercise to letter-writing, are pretty benign. But something is malignantly addictive if (1) it causes real problems for the addict, and (2) it offers itself as relief from the very problems it causes. A malignant addiction is also distinguished for spreading the problems of the addiction out and in in interference patterns, creating difficulties for relationships, communities, and the addict’s very sense of self and soul. The hyperbole might strain the analogy for you, but concrete illustrations of malignant TV-watching cycles aren’t hard to come by. If it’s true that many Americans are lonely, and if it’s true that many lonely people are prodigious TV-watchers, and if it’s true that lonely people find in television’s 2D images relief from the pain of their reluctance to be around real humans, then it’s also obvious that the more time spent watching TV, the less time spent in the real human world, and the less time spent in the real human world, the harder it becomes not to feel alienated from real humans, solipsistic, lonely. It’s also true that to the extent one begins to view pseudo-relationships with Bud Bundy or Jane Pauley as acceptable alternatives to relationships with real humans, one has commensurately less conscious incentive even to try to connect with real 3D persons, connections that are pretty important to mental health. For Joe Briefcase, as for many addicts, the “special treat” of TV begins to substitute for something nourishing and needed, and the original hunger subsides to a strange objectless unease.
TV-watching as a malignant cycle doesn’t even require special preconditions like writerly self-consciousness or loneliness. Let’s for a second imagine Joe Briefcase as now just average, relatively unlonely, adjusted, married, blessed with 2.5 apple-cheeked issue, normal, home from hard work at 5:30, starting his average six-hour stint. Since Joe B. is average, he’ll shrug at pollsters’ questions and say he most often watches television to “unwind” from those elements of his day and life he finds stressful. It’s tempting to suppose that TV enables this “unwinding” simply because it offers an Auschlanderian distraction, something to divert the mind from quotidian troubles. But would mere distraction ensure continual massive watching? Television offers more than distraction. In lots of ways, television purveys and enables dreams, and most of these dreams involve some sort of transcendence of average daily life. The modes of presentation that work best for TV – stuff like “action,” with shoot-outs and car wrecks, or the rapid-fire “collage” of commercials, news, and music videos, or the “hysteria” of prime-time soap and sitcom with broad gestures, high voices, too much laughter – are unsubtle in their whispers that, somewhere, life is quicker, denser, more interesting, more … well, lively than contemporary life as Joe Briefcase knows and moves through it. This might seem benign until we consider that what average Joe Briefcase does more than almost anything else in contemporary life is watch television, an activity which anyone with an average brain can see does not make for a very dense and lively life. Since television must seek to compel attention by offering a dreamy promise of escape from daily life, and since stats confirm that so grossly much of ordinary U.S. life is watching TV, TV’s whispered promises must somehow undercut television-watching in theory (“Joe, Joe, there’s a world where life is lively, where nobody spends six hours a day unwinding before a piece of furniture”) while reinforcing television-watching in practice (“Joe, Joe, your best and only access to this world is TV”).
Well, Joe Briefcase has an average, workable brain, and deep inside he knows, as we do, that there’s some kind of psychic three-card monte going on in this system of conflicting whispers. But if it’s so bald a delusion, why do we keep watching such doses? Part of the answer – a part which requires discretion lest it slip into anti-TV paranoia – is that the phenomenon of television somehow trains or conditions our viewership. Television has become able not only to ensure that we watch, but to inform our deepest responses to what’s watched. Take jaded TV critics, or our acquaintances who sneer at the numbing sameness of all the television they sit still for. I always want to grab these unhappy guys by the lapels and shake them until their teeth rattle and point to the absence of guns to their heads and ask why the heck they keep watching, then. But the truth is that there’s some complex high-dose psychic transaction between TV and Audience whereby Audience gets trained to respond to and then like and then expect trite, hackneyed, numbing television shows, and to expect them to such an extent that when networks do occasionally abandon time-tested formulas we usually punish them for it by not watching novel forms in sufficient numbers to let them get off the ground. Hence the networks’ bland response to their critics that in the majority of cases – and until the rise of hip metatelevision you could count the exceptions on one hand – “different” or “high-concept” programming simply didn’t get ratings. Quality television cannot stand the gaze of millions, somehow.
Now, it is true that certain PR techniques – e.g., shock, grotesquerie, or irreverence – can ease novel sorts of shows’ rise to demographic viability. Examples here might be the shocking A Current Affair, the grotesque Real People, the irreverent Married, with Children. But these programs, like most of those touted by the industry as “fresh” or “outrageous,” turn out to be just tiny transparent variations on old formulas.
But it’s still not fair to blame television’s shortage of originality on any lack of creativity among network talent. The truth is that we seldom get a chance to know whether anybody behind any TV show is creative, or more accurately that they seldom get a chance to show us. Despite the unquestioned assumption on the part of pop-culture critics that television’s poor Audience, deep down, craves novelty, all available evidence suggests rather that the Audience really craves sameness but thinks, deep down, that it ought to crave novelty. Hence the mixture of devotion and sneer on viewerly faces. Hence also the weird viewer-complicity behind TV’s sham “breakthrough programs”: Joe Briefcase needs that PR-patina of “freshness” and “outrageousness” to quiet his conscience while he goes about getting from television what we’ve all been trained to want from it: some strangely American, profoundly shallow reassurance.
Particularly in the last decade, this tension in the Audience between what we do want and what we think we ought to want has been television’s breath and bread. TV’s self-mocking invitation to itself as indulgence, transgression, a glorious “giving in” (again not foreign to addictive cycles) is one of two ingenious ways it’s consolidated its six-hour hold on my generation’s cojones. The other is postmodern irony. The commercials for Alf’s Boston debut in syndicated package feature the fat, cynical, gloriously decadent puppet (so much like Snoopy, like Garfield, like Bart) advising me to “Eat a whole lot of food and stare at the TV!” His pitch is an ironic permission slip to do what I do best whenever I feel confused and guilty: assume, inside, a sort of fetal position; a pose of passive reception to escape, comfort, reassurance. The cycle is self-nourishing.
Not, again, that this cycle’s root conflict is new. You can trace the opposition between what persons do and ought to desire at least as far back as Plato’s chariot or the Prodigal’s return. But the way entertainments appeal to and work within this conflict has been transformed in a televisual culture. This culture-of-watching’s relation to the cycle of indulgence, guilt, and reassurance has important consequences for U.S. art, and though the parallels are easiest to see w/r/t Warhol’s pop or Elvis’s rock, the most interesting intercourse is between television and American lit.
One of the most recognizable things about this century’s postmodern fiction was the movement’s strategic deployment of pop-cultural references – brand names, celebrities, television programs – in even its loftiest high-art projects. Think of just about any example of avant-garde U.S. fiction in the last twenty-five years, from Slothrop’s passion for Slippery Elm throat lozenges and his weird encounter with Mickey Rooney in Gravity’s Rainbow to “You”’s fetish for the New York Post’s COMA BABY feature in Bright Lights, Big City, to Don DeLillo’s pop-hip characters saying stuff to each other like “Elvis fulfilled the terms of the contract. Excess, deterioration, self-destructiveness, grotesque behavior, a physical bloating and a series of insults to the brain, self-delivered.”
The apotheosis of the pop in postwar art marked a whole new marriage between high and low culture. For the artistic viability of postmodernism is a direct consequence, again, not of any new facts about art, but of facts about the new importance of mass commercial culture. Americans seemed no longer united so much by common feelings as by common images: what binds us became what we stood witness to. No one did or does see this as a good change. In fact, pop-cultural references have become such potent metaphors in U.S. fiction not only because of how united Americans are in our exposure to mass images but also because of our guilty indulgent psychology with respect to that exposure. Put simply, the pop reference works so well in contemporary fiction because (1) we all recognize such a reference, and (2) we’re all a little uneasy about how we all recognize such a reference.
The status of low-cultural images in postmodern and contemporary fiction is very different from their place in postmodernism’s artistic ancestors, the “dirty realism” of a Joyce or the Ur-Dadaism of a Duchamp toilet sculpture. Duchamp’s display of that vulgarest of appliances served an exclusively theoretical end: it was making statements like “The Museum is the Mausoleum is the Men’s Room,” etc. It was an example of what Octavio Paz calls “meta-irony,” an attempt to reveal that categories we divide into superior/arty and inferior/vulgar are in fact so interdependent as to be coextensive. The use of “low” references in today’s literary fiction, on the other hand, serves a less abstract agenda. It is meant (1) to help create a mood of irony and irreverence, (2) to make us uneasy and so “comment” on the vapidity of U.S. culture, and (3) most important, these days, to be just plain realistic.
Pynchon and DeLillo were ahead of their time. Today, the belief that pop images are basically just mimetic devices is one of the attitudes that separates most U.S. fiction writers under forty from the writerly generation that precedes us, reviews us, and designs our grad-school curricula. This generation-gap in conceptions of realism is, again, TV-dependent. The U.S. generation born after 1950 is the first for whom television was something to be lived with instead of just looked at. Our elders regard the set rather as the Flapper did the automobile: a curiosity turned treat turned seduction. For younger writers, TV’s as much a part of reality as Toyotas and gridlock. We literally cannot imagine life without it. We’re not different from our fathers insofar as television presents and defines the contemporary world. But we are different in that we have no memory of a world without such electric definition. This is why the derision so many older fictionists heap on a “Brat Pack” generation they see as insufficiently critical of mass culture is simultaneously apt and misguided. It’s true that there’s something sad about the fact that young lion David Leavitt’s sole description of certain story characters is that their T-shirts have certain brand names on them. But the fact is that, for most of the educated young readership for whom Leavitt writes, members of a generation raised and nourished on messages equating what one consumes with who one is, Leavitt’s descriptions do the job. In our post-’50, inseparable-from-TV association pool, brand loyalty is synecdochic of identity, character.
For those U.S. writers whose ganglia were formed pre-TV, who are big on neither Duchamp nor Paz and lack the oracular foresight of a Pynchon, the mimetic deployment of pop-culture icons seems at best an annoying tic and at worst a dangerous vapidity that compromises fiction’s seriousness by dating it out of the Platonic Always where it ought to reside. In one of the graduate workshops I suffered through, an earnest gray eminence kept trying to convince our class that a literary story or novel always eschews “any feature which serves to date it,” because “serious fiction must be timeless.” When we finally protested that, in his own well-known work, characters moved about in electrically lit rooms, drove cars, spoke not Anglo-Saxon but postwar English, inhabited a North America already separated from Africa by continental drift, he impatiently amended his proscription to those explicit references that would date a story in the frivolous “Now.” When pressed for just what stuff evoked this f.N., he said of course he meant the “trendy mass-popular-media” reference. And here, at just this point, transgenerational discourse broke down. We looked at him blankly. We scratched our little heads. We didn’t get it. This guy and his students just didn’t imagine the “serious” world the same way. His automobiled timeless and our FCC’d own were different.
If you read the big literary supplements, you’ve doubtless seen the intergenerational squabble the prenominate scene explains. The plain fact is that certain key things having to do with fiction production are different for young U.S. writers now. And television is at the vortex of much of the flux. Because younger writers are not only Artists probing for the nobler interstices in what Stanley Cavell calls the reader’s “willingness to be pleased”; we are also, now, self-defined parts of the great U.S. Audience, and have our own aesthetic pleasure-centers; and television has formed and trained us. It won’t do, then, for the literary establishment simply to complain that, for instance, young-written characters don’t have very interesting dialogues with each other, that young writers’ ears seem tinny. Tinny they may be, but the truth is that in younger Americans’ experience, people in the same room don’t do all that much direct conversing with each other. What most of the people I know do is they all sit and face the same direction and stare at the same thing and then structure commercial-length conversations around the sorts of questions myopic car-crash witnesses might ask each other – “Did you just see what I just saw?” And, realism-wise, the paucity of profound conversation in Brat-esque fiction seems to be mimetic of more than just our own generation. Six hours a day, in average households young and old, just how much interfacing can really be going on? So now whose literary aesthetic seems “dated”?
In terms of lit history, it’s important to recognize the distinction between pop and televisual references, on the one hand, and the mere use of TV-like techniques, on the other. The latter have been around in fiction forever. The Voltaire of Candide, for instance, uses a bisensuous irony that would do Ed Rabel proud, having Candide and Pangloss run around smiling and saying “All for the best, the best of all worlds” amid war-dead, pogroms, rampant nastiness. Even the stream-of-consciousness guys who fathered modernism were, on a very high level, constructing the same sorts of illusions about privacy-puncturing and espial on the forbidden that television has found so fecund. And let’s not even talk about Balzac.
It was in post-atomic America that pop influences on lit became something more than technical. About the time television first gasped and sucked air, mass popular U.S. culture became high-art viable as a collection of symbols and myth. The episcopate of this pop-reference movement were the post-Nabokovian black humorists, the metafictionists and assorted franc- and latinophiles only later comprised by “postmodern.” The erudite, sardonic fictions of the black humorists introduced a generation of new fiction writers who saw themselves as avant-avant-garde, not only cosmopolitan and polyglot but also technologically literate, products of more than just one region, heritage, and theory, and citizens of a culture that said its most important stuff about itself via mass media. In this regard I think particularly of the Barth of The End of the Road and The Sot-Weed Factor, the Gaddis of The Recognitions, and the Pynchon of The Crying of Lot 49; but the movement toward treating of the pop as its own reservoir of mythopoeia fast metastasized and has transcended both school and genre. Plucking from my bookshelves almost at random, I find poet James Cummins’s 1986 The Whole Truth, a cycle of sestinas deconstructing Perry Mason. Here’s Robert Coover’s 1977 The Public Burning, in which Eisenhower buggers Nixon on-air, and his 1980 A Political Fable, in which the Cat in the Hat runs for president. I find Max Apple’s 1986 The Propheteers, a novel-length imagining of Walt Disney’s travails. Or part of poet Bill Knott’s 1974 “And Other Travels”:
. . . in my hand a cat o’nine tails on every tip of which was Clearasil I was worried because Dick Clark had told the cameraman not to put the camera on me during the dance parts of the show because my skirts were too tight
which serves as a lovely example because, even though this stanza appears in the poem without anything we’d normally call context or support, it is in fact self-supported by a reference we all, each of us, immediately get, conjuring as it does with Bandstand’s ritualized vanity, teenage insecurity, the management of spontaneous moments. It is the perfect pop image: at once slight and universal, soothing and discomfiting.
Recall that the phenomena of watching and consciousness of watching are by nature expansive. What distinguishes another, later wave of postmodern lit is a further shift, from television images as valid objects of literary allusion, to TV and metawatching as themselves valid subjects. By this I mean certain lit beginning to locate its raison in its commentary on, response to, a U.S. culture more and more of and for watching, illusion, and the video image. This involution of attention was first observable in academic poetry. See for instance Stephen Dobyns’s 1980 “Arrested Saturday Night”:
This is how it happened: Peg and Bob had invited Jack and Roxanne over to their house to watch the TV, and on the big screen they saw Peg and Bob, Jack and Roxanne watching themselves watch themselves on progressively smaller TVs….
or Knott’s 1983 “Crash Course”:
I strap a TV monitor on my chest so that all who approach can see themselves and respond appropriately.
The true prophet of this shift in U.S. fiction, though, was the prenominate Don DeLillo, a long-neglected conceptual novelist who has made signal and image his unifying topoi the way Barth and Pynchon had sculpted in paralysis and paranoia a decade earlier. DeLillo’s 1985 White Noise sounded to fledgling fictionists a kind of televisual clarion-call. Scenelets like the following seemed especially important:
Several days later Murray asked me about a tourist attraction known as the most photographed barn in America. We drove twenty-two miles into the country around Farmington. There were meadows and apple orchards. White fences trailed through the rolling fields. Soon the signs started appearing. THE MOST PHOTOGRAPHED BARN IN AMERICA. We counted five signs before we reached the site. . . . We walked along a cow-path to the slightly elevated spot set aside for viewing and photographing. All the people had cameras; some had tripods, telephoto lenses, filter kits. A man in a booth sold postcards and slides – pictures of the barn taken from the elevated spot. We stood near a grove of trees and watched the photographers. Murray maintained a prolonged silence, occasionally scrawling some notes in a little book.
“No one sees the barn,” he said finally.
A long silence followed.
“Once you’ve seen the signs about the barn, it becomes impossible to see the barn.”
He fell silent once more. People with cameras left the elevated site, replaced at once by others.
“We’re not here to capture an image. We’re here to maintain one. Can you feel it, Jack? An accumulation of nameless energies.”
There was an extended silence. The man in the booth sold postcards and slides.
“Being here is a kind of spiritual surrender. We see only what the others see. The thousands who were here in the past, those who will come in the future. We’ve agreed to be part of a collective perception. This literally colors our vision. A religious experience in a way, like all tourism.”
Another silence ensued.
“They are taking pictures of taking pictures,” he said. (12-13)
I quote this at such length not only because it’s too darn good to ablate, but to draw your attention to two relevant features. The less interesting is the Dobyns-esque message here about the metastasis of watching. For not only are people watching a barn whose only claim to fame is as an object of watching, but the pop-culture scholar Murray is watching people watch a barn, and his friend Jack is watching Murray watch the watching, and we readers are pretty obviously watching Jack the narrator watch Murray watching, etc. If you leave out the reader, there’s a similar regress of recordings of barn and barn-watching.
But more important are the complicated ironies at work in the scene. The scene itself is obviously absurd and absurdist. But most of the writing’s parodic force is directed at Murray, the would-be transcender of spectation. Murray, by watching and analyzing, would try to figure out the hows and whys of giving in to collective visions of mass images that have themselves become mass images only because they’ve been made the objects of collective vision. The narrator’s “extended silence” in response to Murray’s blather speaks volumes. But it’s not to be mistaken for a silence of sympathy with the sheeplike photograph-hungry crowd. These poor Joe Briefcases are no less objects of ridicule for their “scientific” critic himself being ridiculed. The authorial tone throughout is a kind of deadpan sneer. Jack himself is utterly mute – since to speak out loud in the scene would render the narrator part of the farce (instead of a detached, transcendent “observer and recorder”) and so vulnerable to ridicule himself. With his silence, DeLillo’s alter ego Jack eloquently diagnoses the very disease from which he, Murray, barn-watchers, and readers all suffer.
I Do Have a Thesis
I want to convince you that irony, poker-faced silence, and fear of ridicule are distinctive of those features of contemporary U.S. culture (of which cutting-edge fiction is a part) that enjoy any significant relation to the television whose weird pretty hand has my generation by the throat. I’m going to argue that irony and ridicule are entertaining and effective, and that at the same time they are agents of a great despair and stasis in U.S. culture, and that for aspiring fictionists they pose terrifically vexing problems.
My two big premises are that, on the one hand, a certain subgenre of pop-conscious postmodern fiction, written mostly by young Americans, has lately arisen and made a real attempt to transfigure a world of and for appearance, mass appeal, and television; and that, on the other hand, televisual culture has somehow evolved to a point where it seems invulnerable to any such transfiguring assault. TV, in other words, has become able to capture and neutralize any attempt to change or even protest the attitudes of passive unease and cynicism TV requires of Audience in order to be commercially and psychologically viable at doses of several hours per day.
The particular fictional subgenre I have in mind has been called by some editors “post-postmodernism” and by some critics “hyperrealism.” Most of the younger readers and writers I know call it the “fiction of image.” Image-fiction is basically a further involution of the relations between lit and pop that blossomed with the sixties postmodernists. If the postmodern church fathers found pop images valid referents and symbols in fiction, and if in the seventies and early eighties this appeal to the features of mass culture shifted from use to mention, certain avant-gardists starting to treat of pop and TV and watching as themselves fertile subjects, the new fiction of image uses the transient received myths of popular culture as a world in which to imagine fictions about “real,” albeit pop-mediated, public characters. Early uses of imagist tactics can be seen in the DeLillo of Great Jones Street, the Coover of Burning, and in Max Apple, whose seventies short story “The Oranging of America” projected an interior life onto the figure of Howard Johnson.
But in the late eighties, despite publisher unease over the legalities of imagining private lives for public figures, a real bumper crop of this behind-the-glass stuff started appearing, authored largely by writers who didn’t know or cross-fertilize one another. Apple’s Propheteers, Jay Cantor’s Krazy Kat, Coover’s A Night at the Movies, or You Must Remember This, William T. Vollmann’s You Bright and Risen Angels, Stephen Dixon’s Movies: Seventeen Stories, and DeLillo’s own fictional hologram of Oswald in Libra are all notable post-’85 instances. (Observe too that, in another eighties medium, the arty Zelig, Purple Rose of Cairo, and Sex, Lies, and Videotape, plus the low-budget Scanners and Videodrome and Shockers, all began to treat screens as permeable.)
It’s in the last couple of years that the image-fiction scene has really taken off. A. M. Homes’s 1990 The Safety of Objects features a stormy love affair between a boy and a Barbie doll. Vollmann’s 1989 The Rainbow Stories has Sonys as characters in Heideggerian parables. Michael Martone’s 1990 Fort Wayne Is Seventh on Hitler’s List is a tight cycle of stories about the Midwest’s pop-culture giants – James Dean, Colonel Sanders, Dillinger – the whole project of which, spelled out in a preface about image-fiction’s legal woes, involves “questioning the border between fact and fiction when in the presence of fame.” And Mark Leyner’s 1990 campus smash My Cousin, My Gastroenterologist, less a novel than what the book’s jacket-copy describes as “a fiction analogue of the best drug you ever took,” features everything from meditations on the color of Carefree Panty Shields wrappers to “Big Squirrel, the TV kiddie-show host and kung fu mercenary,” to NFL instant replays in an “X-ray vision which shows leaping skeletons in a bluish void surrounded by 75,000 roaring skulls.”
One thing I have to insist you realize about this new subgenre is that it’s distinguished, not just by a certain neo-postmodern technique, but by a genuine socio-artistic agenda. The fiction of image is not just a use or mention of televisual culture but a response to it, an effort to impose some sort of accountability on a state of affairs in which more Americans get their news from television than from newspapers and in which more Americans every evening watch Wheel of Fortune than all three network news programs combined.
And please see that image-fiction, far from being a trendy avant-garde novelty, is almost atavistic. It’s a natural adaptation of the hoary techniques of literary realism to a nineties world whose defining boundaries have been deformed by electric signal. For realistic fiction’s big job used to be to afford easements across borders, to help readers leap over the walls of self and locale and show us unseen or -dreamed-of people and cultures and ways to be. Realism made the strange familiar. Today, when we can eat Tex-Mex with chopsticks while listening to reggae and watching a Soviet-satellite newscast of the Berlin Wall’s fall – i.e., when darn near everything presents itself as familiar – it’s not a surprise that some of today’s most ambitious “realistic” fiction is going about trying to make the familiar strange. In so doing, in demanding fictional access behind lenses and screens and headlines and re-imagining what human life might truly be like over there across the chasms of illusion, mediation, demographics, marketing, image, and appearance, image-fiction is paradoxically trying to restore what’s (mis)taken for “real” to three whole dimensions, to reconstruct a univocally round world out of disparate streams of flat sights.
That’s the good news.
The bad news is that, almost without exception, image-fiction doesn’t satisfy its own agenda. Instead, it most often degenerates into a kind of jeering, surfacy look “behind the scenes” of the very televisual front people already jeer at, and can already get behind the scenes of via Entertainment Tonight and Remote Control.
The reason why today’s imagist fiction isn’t the rescue from a passive, addictive TV-psychology that it tries so hard to be is that most imagist writers render their material with the same tone of irony and self-consciousness that their ancestors, the literary insurgents of Beat and postmodernism, used so effectively to rebel against their own world and context. And the reason why this irreverent postmodern approach fails to help the imagists transfigure TV is simply that TV has beaten the imagists to the punch. The fact is that for at least ten years now television has been ingeniously absorbing, homogenizing, and re-presenting the very cynical postmodern aesthetic that was once the best alternative to the appeal of low, over-easy, mass-marketed narrative. How TV’s done this is blackly fascinating to see.
A quick intermission contra paranoia. By saying that the fiction of image aims to “rescue” us from TV, I again am not suggesting that television has diabolic designs, or wants souls. I’m just referring again to the kind of Audience-conditioning consequent to high doses, a conditioning so subtle it can be observed best obliquely, through examples. If a term like “conditioning” still seems either hyperbolic or empty to you, I’ll ask you to consider for a moment the exemplary issue of prettiness. One of the things that makes the people on TV fit to stand the mega-gaze is that they are, by human standards, really pretty. I suspect that this, like most television conventions, is set up with no motive more sinister than to appeal to the largest possible Audience. Pretty people tend to be more pleasing to look at than non-pretty people. But when we’re talking about television, the combination of sheer Audience size and quiet psychic intercourse between images and oglers starts a cycle that both enhances pretty images’ appeal and erodes us viewers’ own security in the face of gazes. Because of the way human beings relate to narrative, we tend to identify with those characters we find appealing. We try to see ourselves in them. The same I.D.-relation, however, also means that we try to see them in ourselves. When everybody we seek to identify with for six hours a day is pretty, it naturally becomes more important to us to be pretty, to be viewed as pretty. Because prettiness becomes a priority for us, the pretty people on TV become all the more attractive, a cycle which is obviously great for TV. But it’s less great for us civilians, who tend to own mirrors, and who also tend not to be anywhere near as pretty as the images we try to identify with. Not only does this cause some angst personally, but the angst increases because, nationally, everybody else is absorbing six-hour doses and identifying with pretty people and valuing prettiness more, too. This very personal anxiety about our prettiness has become a national phenomenon with national consequences. The whole U.S.A. gets different about things it values and fears. The boom in diet aids, health and fitness clubs, neighborhood tanning parlors, cosmetic surgery, anorexia, bulimia, steroid use among boys, girls throwing acid at each other because one girl’s hair looks more like Farrah Fawcett’s than another’s . . . are these supposed to be unrelated to each other? to the apotheosis of prettiness in a televisual culture?
It’s not paranoid or hysterical to acknowledge that television in large doses affects people’s values and self-esteem in deep ways. That televisual conditioning influences the whole psychology of one’s relation to himself, his mirror, his loved ones, and a world of real people and real gazes. No one’s going to claim that a culture all about watching and appearing is fatally compromised by unreal standards of beauty and fitness. But other facets of TV-training reveal themselves as more rapacious, more serious, than any irreverent fiction writer would want to take seriously.
It’s widely recognized that television, with its horn-rimmed battery of statisticians and pollsters, is awfully good at discerning patterns in the flux of popular ideologies, absorbing them, processing them, and then re-presenting them as persuasions to watch and to buy. Commercials targeted at the eighties’ upscale boomers, for example, are notorious for using processed versions of tunes from the rock culture of the sixties and seventies both to elicit the yearning that accompanies nostalgia and to yoke purchases of products with what for yuppies is a lost era of genuine conviction. Ford sport vans are advertised with “This is the dawning of the age of the Aerostar”; Ford recently litigated with Bette Midler over the theft of her old vocals on “Do You Wanna Dance”; claymation raisins dance to “Heard It Through the Grapevine”; etc. If the commercial reuse of songs and the ideals they used to symbolize seems distasteful, it’s not like pop musicians are paragons of noncommercialism themselves, and anyway nobody ever said selling was pretty. The effects of any instance of TV absorbing and pablumizing cultural tokens seem innocuous. But the recycling of whole cultural trends, and the ideologies that inform them, is a different story.
U.S. pop culture is just like U.S. serious culture in that its central tension has always set the nobility of individualism on one side against the warmth of communal belonging on the other. For its first twenty or so years, it seemed as though television sought to appeal mostly to the group side of the equation. Communities and bonding were extolled on early TV, even though TV itself, and especially its advertising, has from the outset projected itself at the lone viewer, Joe Briefcase, alone. Television commercials always make their appeals to individuals, not groups, a fact that seems curious in light of the unprecedented size of TV’s Audience, until one hears gifted salesmen explain how people are always most vulnerable, hence frightened, hence needy, hence persuadable, when they are approached solo.
Classic television commercials were all about the group. They took the vulnerability of Joe Briefcase, sitting there, watching, lonely, and capitalized on it by linking purchase of a given product with Joe B.’s inclusion in some attractive community. This is why those of us over twenty-one can remember all those interchangeable old commercials featuring groups of pretty people in some ecstatic context having just way more fun than anybody has a license to have, and all united as Happy Group by the conspicuous fact that they’re holding a certain bottle of pop or brand of snack – and the blatant appeal here is that the relevant product can help Joe Briefcase belong. “We’re the Pepsi Generation….”
But since, at latest, the eighties, the individualist side of the great U.S. conversation has held sway in TV advertising. I’m not sure just why or how this happened. There are probably great connections to be traced – with Vietnam, youth cultures, Watergate and recession and the New Right’s rise – but the relevant datum is that a lot of the most effective TV commercials now make their appeal to the lone viewer in a terribly different way. Products are now most often pitched as helping the viewer “express himself,” assert his individuality, “stand out from the crowd.” The first instance I ever saw was a perfume vividly billed in the early eighties as reacting specially with each woman’s “unique body chemistry” and creating “her own individual scent,” the ad depicting a cattle line of languid models waiting cramped and expressionless to get their wrists squirted one at a time, each smelling her moist individual wrist with a kind of biochemical revelation, and then moving off in what a back-pan reveals to be different directions from the squirter (we can ignore the obvious sexual connotations, squirting and all that; some tactics are changeless). Or think of that recent series of over-dreary black-and-white Cherry 7-Up ads where the only characters who get to have color and stand out from their surroundings are the pink people who become pink at the exact moment they imbibe. Examples of stand-apart ads are ubiquitous nightly, now.
Except for being sillier – products billed as distinguishing individuals from crowds sell to huge crowds of individuals – these ads aren’t really any more complicated or subtle than the old join-the-fulfilling-crowd ads that now seem so quaint. But the new stand-out ads’ relation to their chiaroscuro mass of lone viewers is both complex and ingenious. Today’s best ads are still about the group, but they now present the group as something fearsome, something that can swallow you up, erase you, keep you from “being noticed.” But noticed by whom? Crowds are still vitally important in the stand-apart ads’ thesis on identity, but now a given ad’s crowd, far from being more appealing, secure, and alive than the individual, functions as a mass of identical featureless eyes. The crowd is now, paradoxically, both the “herd” in contrast to which the viewer’s distinctive identity is to be defined, and the impassive witnesses whose sight alone can confer distinctive identity. The lone viewer’s isolation in front of his furniture is implicitly applauded – it’s better, realer, these solipsistic ads imply, to fly solo – and yet also implicated as threatening, confusing, since after all Joe Briefcase is not an idiot, sitting here, and knows himself as a viewer to be guilty of the two big sins the ads decry: being a passive watcher (of TV) and being part of a great herd (of TV-watchers and stand-apart-product-buyers). How odd.
The surface of stand-apart ads still presents a relatively unalloyed Buy This Thing, but the deep message of television w/r/t these ads looks to be that Joe Briefcase’s ontological status as just one in a reactive watching mass is in a deep way false, and that true actualization of self would ultimately consist in Joe’s becoming one of the images that are the objects of this great herdlike watching. That is, TV’s real pitch in these commercials is that it’s better to be inside the TV than to be outside, watching.
The lonely grandeur of stand-apart advertising not only sells companies’ products, then. It manages brilliantly to ensure – even in commercials that television gets paid to run – that ultimately TV, and not any specific product or service, will be regarded by Joe B. as the ultimate arbiter of human worth. An oracle, to be consulted a lot. Advertising scholar Mark C. Miller puts it succinctly: “TV has gone beyond the explicit celebration of commodities to the implicit reinforcement of that spectatorial posture which TV requires of us.” Solipsistic ads are another way television ends up pointing at itself, keeping the viewer’s relation to his furniture at once alienated and anaclitic.
Maybe, though, the relation of contemporary viewer to contemporary TV is less a paradigm of infantilism and addiction than it is of the U.S.A.’s familiar relation to all the technology we equate at once with freedom and power and slavery and chaos. For, as with TV, whether we happen personally to love technology, hate it, fear it, or all three, we still look relentlessly to technology for solutions to the very problems technology seems to cause – catalytic converters for smog, S.D.I. for missiles, transplants for assorted rot.
And as with tech, so the gestalt of TV expands to absorb all problems associated with it. The pseudo-communities of prime-time soaps like Knots Landing and thirtysomething are viewer-soothing products of the very medium whose ambivalence about groups helps erode people’s sense of connection. The staccato editing, sound bites, and summary treatment of knotty issues are network news’ accommodation of an Audience whose attention-span and appetite for complexity have atrophied a bit after years of high-dose spectation. Etc.
But TV has tech-bred problems of its own. The advent of cable, often with packages of over forty channels, threatens networks and local affiliates alike. This is particularly true when the viewer is armed with a remote-control gizmo: Joe B. is still getting his six total hours of daily TV, but the amount of his retinal time devoted to any one option shrinks as he remote-scans a much wider band. Worse, the VCR, with its dreaded fast-forward and ZAP functions, threatens the very viability of commercials. Television advertisers’ sensible solution? Make the ads as appealing as the shows. Or at any rate try to keep Joe from disliking the commercials enough so that he’s willing to move his thumb to check out two and a half minutes of Hazel on the Superstation while NBC sells lip balm. Make the ads prettier, livelier, full of enough rapidly juxtaposed visual quanta that Joe’s attention just doesn’t get to wander, even if he remote-kills the volume. As one ad executive underputs it, “Commercials are becoming more like entertaining films.”
There’s an obverse way to make commercials resemble programs: have programs start to resemble commercials. That way the ads seem less like interruptions than like pace-setters, metronomes, commentaries on the shows’ theory. Invent a Miami Vice, where there’s little annoying plot to interrupt an unprecedented emphasis on appearances, visuals, attitude, a certain “look.” Make music videos with the same amphetaminic pace and dreamy archetypal associations as ads – it doesn’t hurt that videos are basically long record commercials anyway. Or introduce the sponsor-supplied “infomercial” that poses, in a light-hearted way, as a soft-news show, like Amazing Discoveries or those Robert Vaughn-hosted hair-loss “reports” that haunt TV’s wee cheap hours.
Still, television and its commercial sponsors had a bigger long-term worry, and that was their shaky détente with the individual viewer’s psyche. Given that television must revolve off antinomies about being and watching, about escape from daily life, the averagely intelligent viewer can’t be all that happy about his daily life of high-dose watching. Joe Briefcase might be happy enough when watching, but it was hard to think he could be too terribly happy about watching so much. Surely, deep down, Joe was uncomfortable with being one part of the biggest crowd in human history watching images that suggest that life’s meaning consists in standing visibly apart from the crowd. TV’s guilt/indulgence/reassurance cycle addresses these concerns on one level. But might there not be some deeper way to keep Joe Briefcase firmly in the crowd of watchers by somehow associating his very viewership with transcendence of watching crowds? But that would be absurd.
I’ve said, so far without support, that what makes television’s hegemony so resistant to critique by the new fiction of image is that TV has co-opted the distinctive forms of the same cynical, irreverent, ironic, absurdist post-WWII literature that the imagists use as touchstones. TV’s own reuse of postmodern cool has actually evolved as a grimly inspired solution to the keep-Joe-at-once-alienated-from-and-part-of-the-million-eyed-crowd problem. The solution entailed a gradual shift from oversincerity to a kind of bad-boy irreverence in the big face TV shows us. This in turn reflected a wider shift in U.S. perceptions of how art was supposed to work, a transition from art’s being a creative instantiation of real values to art’s being a creative instantiation of deviance from bogus values. And this wider shift in its turn paralleled both the development of the postmodern aesthetic and some deep philosophic change in how Americans chose to view concepts like authority, sincerity, and passion in terms of our willingness to be pleased. Not only are sincerity and passion now “out,” TV-wise, but the very idea of pleasure has been undercut. As Mark C. Miller puts it, contemporary television “no longer solicits our rapt absorption or hearty agreement, but – like the ads that subsidize it – actually flatters us for the very boredom and distrust it inspires in us.”
Miller’s 1986 “Deride and Conquer,” the best essay ever written on network advertising, details vividly an example of how TV’s contemporary appeal to the lone viewer works. It concerns a 1985-86 ad that won Clios and still occasionally runs. It’s that Pepsi commercial where a Pepsi sound van pulls up to a packed sweltering beach and the impish young guy in the van activates a lavish PA system and opens up a Pepsi and pours it into a cup up next to the microphone. And the dense glittered sound of much carbonation goes out over the beach’s heat-wrinkled air, and heads turn vanward as if pulled with strings as his gulp and refreshed, spiranty sounds are broadcast; and the final shot reveals that the sound van is also a concession truck, and the whole beach’s pretty population has collapsed to a clamoring mass around the truck, everybody hopping up and down and pleading to be served first, as the camera’s view retreats to overhead and the slogan is flatly intoned: “Pepsi: the Choice of a New Generation.” Really a stunning commercial. But need one point out, as Miller does at length, that the final slogan is here tongue-in-cheek? There’s about as much “choice” at work in this commercial as there was in Pavlov’s bell kennel. In fact the whole thirty-second spot is tongue-in-cheek, ironic, self-mocking. As Miller argues, it’s not really choice that the commercial is “selling” Joe Briefcase on, “but the total negation of choices. Indeed, the product itself is finally incidental to the pitch. The ad does not so much extol Pepsi per se as recommend it by implying that a lot of people have been fooled into buying it. In other words, the point of this successful bit of advertising is that Pepsi has been advertised successfully.”
There are important things to realize here. First, this ad is deeply informed by a fear of remote gizmos, ZAPping, and viewer disdain. An ad about ads, it uses self-reference to seem too hip to hate. It protects itself from the scorn today’s viewing cognoscente feels for both the fast-talking hard-sell ads Dan Aykroyd parodied into oblivion on Saturday Night Live and the quixotic associative ads that linked soda-drinking with romance, prettiness, and group inclusion – ads today’s jaded viewer finds old-fashioned and “manipulative.” In contrast to a blatant Buy This Thing, this Pepsi commercial pitches parody. The ad’s utterly up-front about what TV ads are popularly despised for doing: using primal, flim-flam appeals to sell sugary crud to people whose identity is nothing but mass consumption. This ad manages simultaneously to make fun of itself, Pepsi, advertising, advertisers, and the great U.S. watching/consuming crowd. In fact the ad’s uxorious in its flattery of only one person: the lone viewer, Joe B., who even with an average brain can’t help but discern the ironic contradiction between the “choice” slogan (sound) and the Pavlovian orgy (sight). The commercial invites Joe to “see through” the manipulation the beach’s horde is rabidly buying. The commercial invites complicity between its own witty irony and veteran-viewer Joe’s cynical, nobody’s-fool appreciation of that irony. It invites Joe into an in-joke the Audience is the butt of. It congratulates Joe Briefcase, in other words, on transcending the very crowd that defines him, here. This ad boosted Pepsi’s market share through three sales quarters.
Pepsi’s campaign is not unique. Isuzu Inc. hit pay dirt with its series of “Joe Isuzu” spots, featuring an oily, Satanic-looking salesman who told whoppers about Isuzus’ genuine llama-skin upholstery and ability to run on tap water. Though the ads rarely said much of anything about why Isuzus are in fact good cars, sales and awards accrued. The ads succeeded as parodies of how oily and Satanic car commercials are. They invited viewers to congratulate Isuzu ads for being ironic, to congratulate themselves for getting the joke, and to congratulate Isuzu Inc. for being “fearless” and “irreverent” enough to acknowledge that car ads are ridiculous and that the Audience is dumb to believe them. The ads invite the lone viewer to drive an Isuzu as some sort of anti-advertising statement. The ads successfully associate Isuzu-purchase with fearlessness and irreverence and the capacity to see through deception. You can find successful television ads that mock TV-ad conventions almost anywhere you look, from Sedelmaier’s Federal Express and Wendy’s spots, with their wizened, sped-up burlesques of commercial characters, to those hip Doritos splices of commercial spokesmen and campy old clips of Beaver and Mr. Ed.
Plus you can see this tactic of heaping scorn on pretensions to those old commercial virtues of authority and sincerity – thus (1) shielding the heaper of scorn from scorn and (2) congratulating the patron of scorn for rising above the mass of people who still fall for outmoded pretensions – employed to serious advantage on many of the television programs the commercials support. Show after show, for years now, has been either a self-acknowledged blank, visual, postmodern allusion- and attitude-fest, or, even more common, an uneven battle of wits between some ineffectual spokesman for hollow authority and his precocious children, mordant spouse, or sardonic colleagues. Compare television’s treatment of earnest authority figures on pre-ironic shows – the FBI’s Erskine, Star Trek’s Kirk, Beaver’s Ward, Partridge Family’s Shirley, Five-O’s McGarrett – to TV’s depiction of Al Bundy on Married … with Children, Mr. Owens on Mr. Belvedere, Homer on The Simpsons, Daniels and Hunter on Hill Street Blues, Jason Seaver on Growing Pains, Dr. Craig on St. Elsewhere.
The modern Sitcom, in particular, is almost wholly dependent for laughs and tone on the M*A*S*H-inspired savaging of some buffoonish spokesman for hypocritical, pre-hip values at the hands of bitingly witty insurgents. As Hawkeye savaged Frank and later Charles, so Herb is savaged by Jennifer and Carlson by J. Fever on WKRP, Mr. Keaton by Alex on Family Ties, boss by typing pool on Nine to Five, Seaver by whole family on Pains, Bundy by entire planet on Married … w/ (the ultimate sitcom parody of sitcoms). In fact, just about the only authority figures who retain any credibility on post-eighties shows (besides those like Hill Street’s Furillo and Elsewhere’s Westphal, who are surrounded by such relentless squalor that simply hanging in there week after week makes them heroic) are those upholders of values who can communicate some irony about themselves, make fun of themselves before any merciless group around them can move in for the kill – see Huxtable on Cosby, Belvedere on Belvedere, Twin Peaks’ Special Agent Cooper, Fox TV’s Garry Shandling (the theme to whose show goes “This is the theme to Garry’s show”), and the ironic eighties’ true Angel of Death, D. Letterman.
Its promulgation of cynicism about all authority works to the general advantage of television on a number of levels. First, to the extent that TV can ridicule old-fashioned conventions right off the map, it can create an authority vacuum. And then guess what fills it. The real authority on a world we now view as constructed and not depicted becomes the medium that constructs our worldview. Second, to the extent that TV can refer exclusively to itself and debunk conventional standards as hollow, it is invulnerable to critics’ charges that what’s on is shallow or crass or bad, since any such judgments appeal to conventional, extratelevisual standards about depth, taste, and quality. Too, the ironic tone of TV’s self-reference means that no one can accuse TV of trying to put anything over on anybody: as essayist Lewis Hyde points out, all self-mocking irony is “Sincerity, with a motive.”
And, more to the original point, if television can invite Joe Briefcase into itself via in-gags and irony, it can ease that painful tension between Joe’s need to transcend the crowd and his status as Audience member. For to the extent that TV can flatter Joe about “seeing through” the pretentiousness and hypocrisy of outdated values, it can induce in him precisely the feeling of canny superiority it’s taught him to crave, and can keep him dependent on the cynical TV-watching that alone affords this feeling. And to the extent that it can train viewers to laugh at characters’ unending put-downs of one another, to view ridicule as both the mode of social intercourse and the ultimate art form, television can reinforce its own queer ontology of appearance: the most frightening prospect, for the well-conditioned viewer, becomes leaving oneself open to others’ ridicule by betraying passé expressions of value, emotion, or vulnerability. Other people become judges; the crime is naiveté. The well-trained lonely viewer becomes even more allergic to people. Lonelier. Joe B.’s exhaustive TV-training in how to worry about how he might come across, seem to other eyes, makes riskily genuine human encounters seem even scarier. But televisual irony has the solution (to the problem it’s aggravated): further viewing begins to seem almost like required research, lessons in the blank, bored, too-wise expression that Joe must learn how to wear for tomorrow’s excruciating ride on the brightly lit subway, where crowds of blank, bored-looking people have little to look at but each other.
What does TV’s institutionalization of hip irony have to do with U.S. fiction? Well, for one thing, American literary fiction tends to be about U.S. culture and the people who inhabit it. Culture-wise, shall I spend much of your time pointing out the degree to which televisual values influence the contemporary mood of jaded weltschmerz, self-mocking materialism, blank indifference, and the delusion that cynicism and naiveté are mutually exclusive? Can we deny connections between an unprecedentedly powerful consensual medium that suggests no real difference between image and substance and the rise of Teflon presidencies, the establishment of nationwide tanning and liposuction industries, the popularity of “vogueing” to a bad Marilyn-imitator’s synthesized command to “strike a pose”? Or, in serious contemporary art, that televisual disdain for “hypocritical” retrovalues like originality, depth, and integrity has no truck with those recombinant “appropriation” styles of art and architecture in which past becomes pastiche, or with the tuneless solmization of a Glass or a Reich, or with the self-conscious catatonia of a platoon of Raymond Carver wannabes?
In fact the numb blank bored demeanor – what my best friend calls the “girl-who’s-dancing-with-you-but-would-obviously-rather-be-dancing-with-somebody -else” expression – that has become my generation’s version of cool is all about TV. “Television,” after all, literally means “seeing far”; and our 6 hrs. daily not only helps us feel up-close and personal at like the Pan Am Games or Operation Desert Shield but, obversely, trains us to see real-life personal up-close stuff the same way we relate to the distant and exotic, as if separated from us by physics and glass, extant only as performance, awaiting our cool review. Indifference is actually just the contemporary version of frugality, for U.S. young people: wooed several gorgeous hours a day for nothing but our attention, we regard that attention as our chief commodity, our social capital, and we are loath to fritter it. In the same regard, see that in 1990, flatness, numbness, and cynicism in one’s demeanor are clear ways to transmit the televisual attitude of stand-out transcendence – flatness is a transcendence of melodrama, numbness transcends sentimentality, and cynicism announces that one knows the score, was last naive about something at maybe like age four.
Whether or not 1990s youth culture seems as grim to you as it does to me, surely we can agree that the culture’s TV-defined pop ethic has pulled a marvelous touché on the postmodern aesthetic that originally sought to co-opt and redeem the pop. Television has pulled the old dynamics of reference and redemption inside-out: it is now television that takes elements of the postmodern – the involution, the absurdity, the sardonic fatigue, the iconoclasm and rebellion – and bends them to the ends of spectation and consumption. As early as ’84, critics of capitalism were warning that “What began as a mood of the avant-garde has surged into mass culture.”
But postmodernism didn’t just all of a sudden “surge” into television in 1984. Nor have the vectors of influence between the postmodern and the televisual been one-way. The chief connection between today’s television and today’s fiction is historical. The two share roots. For postmodern fiction – written almost exclusively by young white males – clearly evolved as an intellectual expression of the “rebellious youth culture” of the sixties and early seventies. And since the whole gestalt of youthful U.S. rebellion was made possible by a national medium that erased communicative boundaries between regions and replaced a society segmented by location and ethnicity with what rock music critics have called “a national self-consciousness stratified by generation,” the phenomenon of TV had as much to do with postmodernism’s rebellious irony as it did with peaceniks’ protest rallies.
In fact, by offering young, overeducated fiction writers a comprehensive view of how hypocritically the U.S.A. saw itself circa 1960, early television helped legitimize absurdism and irony as not just literary devices but sensible responses to an unrealistic world. For irony – exploiting gaps between what’s said and what’s meant, between how things try to appear and how they really are – is the time-honored way artists seek to illuminate and explode hypocrisy. And the television of lone-gunman Westerns, paternalistic sitcoms, and jut-jawed law enforcement circa 1960 celebrated a deeply hypocritical American self-image. Miller describes nicely how the 1960s sitcoms, like the Westerns that preceded them, “negated the increasing powerlessness of white-collar males with images of paternal strength and manly individualism. Yet by the time these sit-coms were produced, the world of small business [whose virtues were the Hugh Beaumontish ones of ‘self-possession, probity, and sound judgment’] had long since been . . . superseded by what C. Wright Mills called ‘the managerial demiurge,’ and the virtues personified by . . . Dad were in fact passé.”
In other words, early U.S. TV was a hypocritical apologist for values whose reality had become attenuated in a period of corporate ascendancy, bureaucratic entrenchment, foreign adventurism, racial conflict, secret bombing, assassination, wiretaps, etc. It’s not one bit accidental that postmodern fiction aimed its ironic cross hairs at the banal, the naive, the sentimental and simplistic and conservative, for these qualities were just what sixties TV seemed to celebrate as “American.”
And the rebellious irony in the best postmodern fiction wasn’t only credible as art; it seemed downright socially useful in its capacity for what counterculture critics call “a critical negation that would make it self-evident to everyone that the world is not as it seems.” Kesey’s dark parody of asylums suggested that our arbiters of sanity were maybe crazier than their patients; Pynchon reoriented our view of paranoia from deviant psychic fringe to central thread in the corporo-bureaucratic weave; DeLillo exposed image, signal, data, and tech as agents of spiritual chaos and not social order. Burroughs’s icky explorations of American narcosis exploded hypocrisy; Gaddis’s exposure of abstract capital as dehumanizing exploded hypocrisy; Coover’s repulsive political farces exploded hypocrisy. Irony in sixties art and culture started out the same way youthful rebellion did. It was difficult and painful, and productive – a grim diagnosis of a long-denied disease. The assumptions behind this early postmodern irony, on the other hand, were still frankly idealistic: that etiology and diagnosis pointed toward cure; that revelation of imprisonment yielded freedom.
So then how have irony, irreverence, and rebellion come to be not liberating but enfeebling in the culture today’s avant-garde tries to write about? One clue’s to be found in the fact that irony is still around, bigger than ever after thirty long years as the dominant mode of hip expression. It’s not a mode that wears especially well. As Hyde puts it, “Irony has only emergency use. Carried over time, it is the voice of the trapped who have come to enjoy their cage.” This is because irony, entertaining as it is, serves an exclusively negative function. It’s critical and destructive, a ground-clearing. Surely this is the way our postmodern fathers saw it. But irony’s singularly unuseful when it comes to constructing anything to replace the hypocrisies it debunks. This is why Hyde seems right about persistent irony being tiresome. It is unmeaty. Even gifted ironists work best in sound bites. I find them sort of wickedly fun to listen to at parties, but I always walk away feeling like I’ve had several radical surgical procedures. And as for actually driving cross-country with a gifted ironist, or sitting through a 300-page novel full of nothing but trendy sardonic exhaustion, one ends up feeling not only empty but somehow … oppressed.
Think, if you will for a moment, of Third World rebels and coups. Rebels are great at exposing and overthrowing corrupt hypocritical regimes, but seem noticeably less great at the mundane, non-negative tasks of then establishing a superior governing alternative. Victorious rebels, in fact, seem best at using their tough cynical rebel skills to avoid being rebelled against themselves – in other words they just become better tyrants.
And make no mistake: irony tyrannizes us. The reason why our pervasive cultural irony is at once so powerful and so unsatisfying is that an ironist is impossible to pin down. All irony is a variation on a sort of existential poker-face. All U.S. irony is based on an implicit “I don’t really mean what I say.” So what does irony as a cultural norm mean to say? That it’s impossible to mean what you say? That maybe it’s too bad it’s impossible, but wake up and smell the coffee already? Most likely, I think, today’s irony ends up saying: “How very banal to ask what I mean.” Anyone with the heretical gall to ask an ironist what he actually stands for ends up looking like a hysteric or a prig. And herein lies the oppressiveness of institutionalized irony, the too-successful rebel: the ability to interdict the question without attending to its content is tyranny. It is the new junta, using the very tool that exposed its enemy to insulate itself.
This is why our educated teleholic friends’ use of weary cynicism to try to seem superior to TV is so pathetic. And this is why the fiction-writing citizen of our televisual culture is in such deep doo-doo. What do you do when postmodern rebellion becomes a pop-cultural institution? For this of course is the second clue to why avant-garde irony and rebellion have become dilute and malign. They have been absorbed, emptied, and redeployed by the very televisual establishment they had originally set themselves athwart.
Not that television is culpable for true evil, here. Just for immoderate success. This is, after all, what TV does: it discerns, decocts, and represents what it thinks U.S. culture wants to see and hear about itself. No one and everyone is at fault for the fact that television started gleaning rebellion and cynicism as the hip, upscale, baby-boomer imago populi. But the harvest has been dark: the forms of our best rebellious art have become mere gestures, shticks, not only sterile but perversely enslaving. How can even the idea of rebellion against corporate culture stay meaningful when Chrysler Inc. advertises trucks by invoking “The Dodge Rebellion”? How is one to be a bona fide iconoclast when Burger King sells onion rings with “Sometimes You Gotta Break the Rules”? How can a new image-fiction writer hope to make people more critical of televisual culture by parodying television as a self-serving commercial enterprise when Pepsi and Isuzu and Fed Ex parodies of self-serving commercials are already big business? It’s almost a history lesson: I’m starting to see just why turn-of-the-century America’s biggest fear was of anarchists and anarchy. For if anarchy actually wins, if rulelessness becomes the rule, then protest and change become not just impossible but incoherent. It’d be like casting ballots for Stalin: how do you vote for no more voting?
So here’s the stumper for the 1990 U.S. fictionist who both breathes our cultural atmosphere and sees himself heir to whatever was neat and valuable in postmodern lit. How to rebel against TV’s aesthetic of rebellion? How to snap readers awake to the fact that our TV-culture has become a cynical, narcissistic, essentially empty phenomenon, when television regularly celebrates just these features in itself and its viewers? These are the very questions DeLillo’s poor schmuck of a popologist was asking back in ’85 about America, that most photographed of barns:
“What was the barn like before it was photographed?” he said. “What did it look like, how was it different from other barns, how was it similar to other barns? We can’t answer these questions because we’ve read the signs, seen the people snapping the pictures. We can’t get outside the aura. We’re part of the aura. We’re here, we’re now.”
He seemed immensely pleased by this.
End of the End of the Line
What responses to television’s commercialization of the modes of literary protest seem possible, then, today? One obvious option is for the fiction writer to become reactionary, fundamentalist. Declare contemporary television evil and contemporary culture evil and turn one’s back on the whole Spandexed mess and genuflect instead to good old pre-sixties Hugh Beaumontish virtues and literal readings of the Testaments and be pro-Life, anti-Fluoride, antediluvian. The problem with this is that Americans who’ve opted for this tack seem to have one eyebrow straight across their forehead and knuckles that drag on the ground and just seem like an excellent crowd to want to transcend. Besides, the rise of Reagan/Bush showed that hypocritical nostalgia for a kinder, gentler, more Christian pseudo-past is no less susceptible to manipulation in the interests of corporate commercialism and PR image. Most of us will still take nihilism over neanderthalism.
Another option is to adopt a somewhat more enlightened political conservatism that exempts viewer and networks alike from any complicity in the bitter stasis of televisual culture, and instead blames all TV-related problems on certain correctable defects in broadcasting technology. Enter media futurologist George Gilder, a Hudson Institute Senior Fellow and author of 1990’s Life after Television: The Coming Transformation of Media and American Life. The single most fascinating thing about Life after Television is that it’s a book with commercials. Published in something called “The Larger Agenda Series” by a “Whittle Direct Books” in Federal Express Inc.’s Knoxville headquarters, the book sells for only $11.00 hard, including postage, is big and thin enough to look great on executive coffee tables, and has really pretty full-page ads for Federal Express on every fifth page. The book’s also largely a work of fiction, plus is a heart-rending dramatization of why anti-TV conservatives, motivated by simple convictions like “Television is at heart a totalitarian medium” whose “system is an alien and corrosive force in democratic capitalism,” are going to be of little help with our ultraradical TV problems, attached as conservative intellectuals still are to their twin tired remedies for all U.S. ills: the beliefs that (1) the discerning consumer instincts of the little guy would correct all imbalances if only big systems would quit stifling his freedom to choose, and that (2) tech-bred problems can be resolved technologically.
Gilder’s basic report and forecast run thus: television as we know and suffer it is “a technology with supreme powers but deadly flaws.” The really fatal flaw is that the whole structure of television programming, broadcasting, and reception is still informed by the technological limitations of the old vacuum tubes that first enabled TV. The “expense and complexity of these tubes used in television sets meant that most of the processing of signals would have to be done at the” networks, a state of affairs that “dictated that television would be a top-down system – in electronic terms, a ‘master-slave’ architecture. A few broadcasting centers would originate programs for millions of passive receivers, or ‘dumb terminals.’” By the time the transistor (which does essentially what vacuum tubes do but in less space at lower cost) found commercial applications, the top-down TV system was already entrenched and petrified, dooming viewers to docile reception of programs they were dependent on a very few networks to provide, and creating a “psychology of the masses” in which a trio of programming alternatives aimed to appeal to millions and millions of Joe B.s. The passive plight of the viewer was aggravated by the fact that the EM pulses used to broadcast TV signals are analog waves. Analogs were once the required medium, since “with little storage or processing available at the set, the signals … would have to be directly displayable waves,” and “analog waves directly simulate sound, brightness, and color.” But analog waves can’t be saved or edited by their recipient. They’re too much like life: there in gorgeous toto one instant and then gone. What the poor TV viewer gets is only what he sees. With cultural consequences Gilder describes in apocalyptic detail. Even High Definition Television (HDTV), touted by the industry as the next big advance in entertainment-furniture, will, according to Gilder, be just the same vacuous emperor in a snazzier suit.
But in 1990, TV, still clinging to the crowd-binding and hierarchical technologies of yesterdecade, is for Gilder now doomed by the advances in microchip and fiber-optic technology of the last couple years. The user-friendly microchip, which consolidates the activities of millions of transistors on one 49¢ wafer, and whose capacities will get even more attractive as controlled-electron conduction approaches the geodesic paradigm of efficiency, will allow receivers – TV sets – to do much of the image-processing that has hitherto been done “for” the viewer by the broadcaster. In another happy development, transporting images through glass fibers rather than the EM spectrum will allow people’s TV sets to be hooked up with each other in a kind of interactive net instead of all feeding passively at the transmitting teat of a single broadcaster. And fiber-optic transmissions have the further advantage that they conduct characters of information digitally. Since “digital signals have an advantage over analog signals in that they can be stored and manipulated without deterioration,” as well as being as crisp and interferenceless as quality CDs, they’ll allow the microchip’d television receiver (and thus the TV viewer) to enjoy much of the discretion over selection, manipulation, and recombination of video images that is now restricted to the director’s booth.
For Gilder, the new piece of furniture that will free Joe Briefcase from passive dependence on his furniture will be “the telecomputer, a personal computer adapted for video processing and connected by fiber-optic threads to other telecomputers around the world.” The fibrous TC “will forever break the broadcast bottleneck” of television’s one-active-many-passive structure of image-propagation. Now everybody’ll get to be his own harried guy with headphones and clipboard. In the new millennium, U.S. television will finally become ideally, GOPishly democratic: egalitarian, interactive, and “profitable without being exploitative.”
Boy, does Gilder know his “Larger Agenda” audience. You can just see saliva overflowing lower lips in boardrooms as Gilder forecasts that the consumer’s whole complicated fuzzy inconveniently transient world will become broadcastable, manipulable, storable, and viewable in the comfort of his own condo. “With artful programming of telecomputers, you could spend a day interacting on the screen with Henry Kissinger, Kim Basinger, or Billy Graham.” Rather ghastly interactions to contemplate, but then in Gilderland to each his own: “Celebrities could produce and sell their own software. You could view the Super Bowl from any point in the stadium you choose, or soar above the basket with Michael Jordan. Visit your family on the other side of the world with moving pictures hardly distinguishable from real-life images. Give a birthday party for Grandma in her nursing home in Florida, bringing her descendants from all over the country to the foot of her bed in living color.”
And not just warm 2D images of family: any experience will be transferable to image and marketable, manipulable, consumable. People will be able to “go comfortably sight-seeing from their living room through high-resolution screens, visiting Third-World countries without having to worry about air fares or exchange rates … you could fly an airplane over the Alps or climb Mount Everest – all on a powerful high-resolution display.”
We will, in short, be able to engineer our own dreams.
In sum, then, a conservative tech writer offers a really attractive way of looking at viewer passivity and TV’s institutionalization of irony, narcissism, nihilism, stasis. It’s not our fault! It’s outmoded technology’s fault! If TV-dissemination were up to date, it would be impossible for it to “institutionalize” anything through its demonic “mass psychology”! Let’s let Joe B., the little lonely guy, be his own manipulator of video-bits! Once all experience is finally reduced to marketable image, once the receiving user of user-friendly receivers can choose freely, Americanly, from an Americanly infinite variety of moving images hardly distinguishable from real-life images, and can then choose further just how he wishes to store, enhance, edit, recombine, and present those images to himself, in the privacy of his very own home and skull, TV’s ironic, totalitarian grip on the American psychic cojones will be broken!
Note that Gilder’s semiconducted vision of a free, orderly video future is way more upbeat than postmodernism’s old view of image and data. The seminal novels of Pynchon and DeLillo revolve metaphorically off the concept of interference: the more connections, the more chaos, and the harder it is to cull any meaning from the seas of signal. Gilder would call their gloom outmoded, their metaphor infected with the deficiencies of the transistor: “In all networks of wires and switches, except for those on the microchip, complexity tends to grow exponentially as the number of interconnections rises, [but] in the silicon maze of microchip technology . . . efficiency, not complexity, grows as the square of the number of interconnections to be organized.” Rather than a vacuous TV-culture smothering in cruddy images, Gilder foresees a TC-culture redeemed by a whole lot more to choose from and a whole lot more control over what you choose to . . . umm . . . see? pseudo-experience? dream?
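(Taking Gilder’s quoted growth rates at face value, the arithmetic behind his cheer is easy to tabulate. A toy comparison in Python – the numbers and labels below are mine, not his:)

    # Gilder's contrast, at face value: off the chip, complexity grows
    # exponentially with the number of interconnections; on the chip,
    # efficiency grows as the square.
    for n in (2, 4, 8, 16, 32):
        print(n, 2 ** n, n ** 2)   # interconnections, wire-and-switch complexity, microchip efficiency

By n = 32 the exponential term is past four billion while the square-law term is a tame 1,024; that gap is pretty much the entire quantitative case for his optimism.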
It’d be unrealistic to think that expanded choices alone could resolve our televisual bind. The advent of cable upped choices from four or five to forty-plus synchronic alternatives, with little apparent loosening of television’s grip on mass attitudes and aesthetics. It seems rather that Gilder sees the nineties’ impending breakthrough as U.S. viewers’ graduation from passive reception of facsimiles of experience to active manipulation of facsimiles of experience.
It’s worth questioning Gilder’s definition of televisual “passivity,” though. His new tech would indeed end “the passivity of mere reception.” But the passivity of Audience, the acquiescence inherent in a whole culture of and about watching, looks unaffected by TCs.
The appeal of watching television has always involved fantasy. Contemporary TV, I’ve claimed, has gotten vastly better at enabling the viewer’s fantasy that he can transcend the limitations of individual human experience, that he can be inside the set, imago’d, “anyone, anywhere.” Since the limitations of being one human being involve certain restrictions on the number of different experiences possible to us in a given period of time, it’s arguable that the biggest TV-tech “advances” of recent years have done little but abet this fantasy of escape from the defining limits of being human. Cable expands our choices of evening realities; hand-held gizmos let us leap instantly from one to another; VCRs let us commit experiences to an eidetic memory that permits re-experience at any time without loss or alteration. These advances sold briskly and upped average viewing-doses, but they sure haven’t made U.S. televisual culture any less passive or cynical.
The downside of TV’s big fantasy is that it’s just a fantasy. As a special treat, my escape from the limits of genuine experience is neato. As my steady diet, though, it can’t help but render my own reality less attractive (because in it I’m just one Dave, with limits and restrictions all over the place), render me less fit to make the most of it (because I spend all my time pretending I’m not in it), and render me dependent on the device that affords escape from just what my escapism makes unpleasant.
It’s tough to see how Gilder’s soteriological vision of having more “control” over the arrangement of high-quality fantasy-bits is going to ease either the dependency that is part of my relation to TV or the impotent irony I must use to pretend I’m not dependent. Whether passive or active as viewer, I must still cynically pretend, because I’m still dependent, because my real dependency here is not on the single show or few networks any more than the hophead’s is on the Turkish florist or the Marseilles refiner. My real dependency is on the fantasies and the images that enable them, and thus on any technology that can make images fantastic. Make no mistake. We are dependent on image-technology; and the better the tech, the harder we’re hooked.
The paradox in Gilder’s rosy forecast is the same as in all forms of artificial enhancement. The more enhancing the mediation – see for instance binoculars, amplifiers, graphic equalizers, or “high-resolution pictures hardly distinguishable from real-life images” – the more direct, vivid, and real the experience seems, which is to say the more direct, vivid, and real the fantasy and dependence are.
An exponential surge in the mass of televisual images, and a commensurate increase in my ability to cut, paste, magnify, and combine them to suit my own fancy, can do nothing but render my interactive TC a more powerful enhancer and enabler of fantasy, my attraction to that fantasy stronger, the real experiences of which my TC offers more engaging and controllable simulacra paler and more frustrating to deal with, and me just a whole lot more dependent on my furniture. Jacking the number of choices and options up with better tech will remedy exactly nothing, so long as no sources of insight on comparative worth, no guides to why and how to choose among experiences, fantasies, beliefs, and predilections, are permitted serious consideration in U.S. culture. Insights and guides to human value used to be among literature’s jobs, didn’t they? But then who’s going to want to take such stuff seriously in ecstatic post-TV life, with Kim Basinger waiting to be interacted with?
My God, I’ve just reread my heartfelt criticisms of Gilder. That he is naive. That he is an apologist for cynical corporate self-interest. That his book has commercials. That under its futuristic novelty is just the same old American same-old that got us into this televisual mess. That Gilder vastly underestimates the intractability of the mess. Its hopelessness. Our fatigue. My attitude, reading Gilder, is sardonic, aloof, jaded. My reading of Gilder is televisual. I am in the aura.
Well, but at least Gilder is unironic. In this respect he’s like a cool summer breeze compared to Mark Leyner, the young New Jersey writer whose 1990 My Cousin, My Gastroenterologist is the biggest thing for campus hipsters since The Dharma Bums. Leyner’s ironic cyberpunk novel exemplifies a third kind of literary response to our problem. For of course young U.S. writers can “resolve” the problem of being trapped in the televisual aura the same way French poststructuralists “resolve” their being enmeshed in the logos. We can solve the problem by celebrating it. Transcend feelings of mass-defined angst by genuflecting to them. We can be reverently ironic.
My Cousin, My Gastroenterologist is new not so much in kind as in degree. It is a methedrine compound of pop pastiche, offhand high tech, and dazzling televisual parody, formed with surreal juxtapositions and grammarless monologues and flash-cut editing, and framed with a relentless irony designed to make its frantic tone seem irreverent instead of repulsive. You want sendups of commercial culture?
I had just been fired from McDonald’s for refusing to wear a kilt during production launch week for their new McHaggis sandwich. (18)
he picks up a copy of das plumpe denken new england’s most disreputable german-language newsmagazine blast in egg cream factory kills philatelist he turns the page radioactive glow-in-the-dark semen found in canada he turns the page modern-day hottentots carry young in resealable sandwich bags he turns the page wayne newton calls mother’s womb single-occupancy garden of eden morgan fairchild calls sally struthers loni anderson.(37)
what color is your mozzarella? i asked the waitress it’s pink – it’s the same color as the top of a mennen lady speed stick dispenser, y’know that color? no, maam I said it’s the same color they use for the gillette daisy disposable razors for women . . . y’know that color? nope well, it’s the same pink as pepto-bismol, y’know that color oh yeah, i said, well do you have spaghetti? (144)
You want mordant sendups of television?
Muriel got the TV Guide, flipped to Tuesday 8 p.m., and read aloud: . . . There’s a show called “A Tumult of Pubic Hair and Bobbing Flaccid Penises as Sweaty Naked Chubby Men Run From the Sauna Screaming Snake! Snake! . . . It also stars Brian Keith, Buddy Ebsen, Nipsey Russell, and Lesley Ann Warren. (98-99)
You like mocking self-reference? The novel’s whole last chapter is a parody of its own “About the Author” page. Or maybe you’re into hip identitylessness?
Grandma rolled up a magazine and hit Buzz on the side of the head. . . . Buzz’s mask was knocked loose. There was no skin beneath that mask. There were two white eyeballs protruding on stems from a mass of oozing blood-red musculature. (98)
I can’t tell if she’s human or a fifth-generation gynemorphic android and I don’t care. (6)
Parodic meditations on the boundaryless flux of televisual monoculture?
I’m stirring a pitcher of Tanqueray martinis with one hand and sliding a tray of frozen clams oreganata into the oven with my foot. God, these methedrine suppositories that Yogi Vithaldas gave me are good! As I iron a pair of tennis shorts I dictate a haiku into the tape recorder and then . . . do three minutes on the speedbag before making an origami praying mantis and then reading an article in High Fidelity magazine as I stir the coq au vin. (49)
The decay of both the limits and the integrity of the single human self?
There was a woman with the shrunken, wrinkled face of an eighty- or ninety-year-old. And this withered hag, this apparent octogenarian, had the body of a male Olympic swimmer. The long lean sinewy arms, the powerful V-shaped upper torso, without a single ounce of fat. . . . (120)
to install your replacement head place the head assembly on neck housing and insert guide pins through mounting holes . . . if, after installing new head, you are unable to discern the contradictions in capitalist modes of production, you have either installed your head improperly or head is defective (142-43)
In fact, one of My Cousin, My Gastroenterologist’s unifying obsessions is this latter juxtaposition of parts of selves, people and machines, human subjects and discrete objects. Leyner’s fiction is, in this regard, an eloquent reply to Gilder’s prediction that our TV-culture problems can be resolved by the dismantling of images into discrete chunks we can recombine as we fancy. Leyner’s world is a Gilder-esque dystopia. The passivity and schizoid decay still endure for Leyner in his characters’ reception of images and waves of data. The ability to combine them only adds a layer of disorientation: when all experience can be deconstructed and reconfigured, there become simply too many choices. And in the absence of any credible, noncommercial guides for living, the freedom to choose is about as “liberating” as a bad acid trip: each quantum is as good as the next, and the only standard of an assembly’s quality is its weirdness, incongruity, its ability to stand out from a crowd of other image-constructs and wow some Audience.
Leyner’s novel, in its amphetaminic eagerness to wow the reader, marks the far dark frontier of the fiction of image – literature’s absorption of not just the icons, techniques, and phenomena of television, but of television’s whole objective. My Cousin, My Gastroenterologist’s sole aim is, finally, to wow, to ensure that the reader is pleased and continues to read. The book does this by (1) flattering the reader with appeals to his erudite postmodern weltschmerz, and (2) relentlessly reminding the reader that the author is smart and funny. The book itself is extremely funny, but it’s not funny the way funny stories are funny. It’s not that funny things happen here; it’s that funny things are self-consciously imagined and pointed out, like the comedian’s stock “You ever notice how. . . ?” or “Ever wonder what would happen if. . . ?”
Actually, Leyner’s whole high-imagist style most often resembles a kind of lapidary stand-up comedy:
Suddenly Bob couldn’t speak properly. He had suffered some form of spontaneous aphasia. But it wasn’t total aphasia. He could speak, but only in a staccato telegraphic style. Here’s how he described driving through the Midwest on Interstate 80: “Corn corn corn corn Stuckeys. Corn corn corn corn Stuckeys.” (20)
there’s a bar on the highway which caters almost exclusively to authority figures and the only drink it serves is lite beer and the only food it serves is surf and turf and the place is filled with cops and state troopers and gym teachers and green berets and toll attendants and game wardens and crossing guards and umpires. (89-90)
Leyner’s fictional response to television is less a novel than a piece of witty, erudite, extremely high-quality prose television. Velocity and vividness – the wow – replace the literary hmm of actual development. People flicker in and out; events are garishly there and then gone and never referred to. There’s a brashly irreverent rejection of “outmoded” concepts like integrated plot or enduring character. Instead there’s a series of dazzlingly creative parodic vignettes, designed to appeal to the forty-five seconds of near-Zen concentration we call the TV attention span. Unifying the vignettes in the absence of plot are moods – antic anxiety, the over-stimulated stasis of too many choices and no chooser’s manual, irreverent brashness toward televisual reality – and, after the manner of pop films, music videos, dreams, and television programs, recurring “key images” – here exotic drugs, exotic technology, exotic food, exotic bowel dysfunctions. It’s no accident that My Cousin, My Gastroenterologist’s central preoccupation is with digestion and elimination. Its mocking challenge to the reader is the same as television’s flood of realities and choices: ABSORB ME – PROVE YOU’RE CONSUMER ENOUGH.
Leyner’s work, the best image-fiction yet, is both amazing and forgettable, wonderful and oddly hollow. I’m finishing up by talking about it at length because, in its masterful reabsorption of the very features TV had absorbed from postmodern lit, it seems as of now the ultimate union of U.S. television and fiction. It seems also to limn the qualities of image-fiction itself in stark relief: the best stuff the subgenre’s produced to date is hilarious, upsetting, sophisticated, and extremely shallow – and just plain doomed by its desire to ridicule a TV-culture whose ironic mockery of itself and all “outdated” value absorbs all ridicule. Leyner’s attempt to “respond” to television via ironic genuflection is all too easily subsumed into the tired televisual ritual of mock worship.
Entirely possible that my plangent cries about the impossibility of rebelling against an aura that promotes and attenuates all rebellion say more about my residency inside that aura, my own lack of vision, than they do about any exhaustion of U.S. fiction’s possibilities. The next real literary “rebels” in this country might well emerge as some weird bunch of “anti-rebels,” born oglers who dare to back away from ironic watching, who have the childish gall actually to endorse single-entendre values. Who treat old untrendy human troubles and emotions in U.S. life with reverence and conviction. Who eschew self-consciousness and fatigue. These anti-rebels would be outdated, of course, before they even started. Too sincere. Clearly repressed. Backward, quaint, naive, anachronistic. Maybe that’ll be the point, why they’ll be the next real rebels. Real rebels, as far as I can see, risk things. Risk disapproval. The old postmodern insurgents risked the gasp and squeal: shock, disgust, outrage, censorship, accusations of socialism, anarchism, nihilism. The new rebels might be the ones willing to risk the yawn, the rolled eyes, the cool smile, the nudged ribs, the parody of gifted ironists, the “How banal.” Accusations of sentimentality, melodrama. Credulity. Willingness to be suckered by a world of lurkers and starers who fear gaze and ridicule above imprisonment without law. Who knows. Today’s most engaged young fiction does seem like some kind of line’s end’s end. I guess that means we all get to draw our own conclusions. Have to. Are you immensely pleased.
NOTES FOR EDITOR, WHICH EDITOR, FOR REASONS KNOWN ONLY TO HIM, WANTS TO RUN W/ESSAY.
(1) This, and thus the title, is from a toss-off in Michael Sorkin’s “Faking It,” published in Todd Gitlin, ed., Watching Television, Pantheon, 1987.
(2) Quoted by Stanley Cavell in Pursuits of Happiness, Harvard U. Press, 1981, epigraph.
(3) Bernard Nossiter, “The FCC’s Big Giveaway Show,” The Nation, 10/26/85, p. 402.
(4) Janet Maslin, “It’s Tough for Movies to Get Real,” NYT Arts & Leisure, 8/5/90, p. 9.
(5) Stephen Holden, “Strike the Pose: When Music Is Skin-Deep,” ibid., p. 1.
(6) Michael Sorkin, p. 163.
(7) Daniel Hallin, “We Keep America on Top of the World,” in Gitlin anthology.
(8) Barbara Tuchman, “The Decline of Quality,” NYT Magazine, 11/2/80.
(9) Alexis de Tocqueville, Democracy in America, Vintage, 1945, pp. 57 and 73.
(10) Don DeLillo, White Noise, Viking, 1985, p. 72.
(11) Octavio Paz, Children of the Mire, Harvard U. Press, 1974, pp. 103-18.
(12) Bill Knott, “And Other Travels,” in Love Poems to Myself, Book One, Barn Dream Press, 1974.
(13) Stephen Dobyns, “Arrested Saturday Night,” in Heat Death, McClelland and Stewart, 1980.
(14) Bill Knott, “Crash Course,” in Becos, Vintage, 1983.
(15) Michael Martone, Fort Wayne Is Seventh on Hitler’s List, Indiana U. Press, 1990, p. ix.
(16) Mark Leyner, My Cousin, My Gastroenterologist, Harmony/Crown, 1990, p. 82.
(17) Miller, “Deride and Conquer,” in Gitlin anthology.
(18) At Foote, Cone and Belding, quoted by Miller (somewhere I can’t find in notes).
(19) There’s a similar point made about Miami Vice in Todd Gitlin’s “We Build Excitement,” in his anthology.
(20) Miller, p. 194.
(21) Miller, p. 187.
(22) Miller’s “Deride” has a similar analysis of sitcoms (in fact my whole discussion of TV irony leans heavily on Gitlin’s, Sorkin’s, and Miller’s essays in Gitlin’s anthology), but anyway w/r/t sitcoms Miller is talking about some weird Freudian patricide in how TV comedy views The Father – strange but very cool.
(23) Miller’s “Deride” makes pretty much this same point about Cosby.
(24) Lewis Hyde, “Alcohol and Poetry: John Berryman and the Booze Talking,” American Poetry Review, reprinted in the Pushcart Prize anthology for ’87.
(25) I liberated this from somewhere in Watching Television; can’t find just where.
(26) Fredric Jameson, “Postmodernism, or the Cultural Logic of Late Capitalism,” New Left Review 146, Summer ’84, pp. 60-66.
(27) Pat Aufderheide, “The Look of the Sound,” in Gitlin anthology, p. 113.
(28) Miller, p. 199.