When John Landis’ An American Werewolf In London was released in 1981 and David Cronenberg’s The Fly followed five years later, what initially grabbed people’s attention were the Academy Award-winning creature effects by Rick Baker (whose work on the former prompted AMPAS to create the Best Makeup category) and Chris Walas (whose other reward for his part in The Fly’s success was the opportunity to direct its less memorable sequel). Charged with updating horror icons of the ’40s and ’50s for savvy moviegoers primed to be dazzled by state-of-the-art special effects, Baker and Walas stepped up to the plate and delivered unforgettable monsters and set-pieces that have earned a permanent place in movie history. Even today, their work continues to impress, evincing a staying power that modern digital effects have a hard time sustaining. But what keeps viewers coming back to these two films time and again are the human love stories that run parallel with the bloody carnage and acid-spewing monster-men.
While both films have fantastic premises—namely, that a man bitten by a werewolf will become one himself and a man who teleports himself with a housefly will become a man-sized fly—they’re grounded in the mundane and the everyday. Take the London flat of nurse Alex Price (Jenny Agutter) in American Werewolf, which reveals a lot about her that she might otherwise be reluctant—or unable—to say out loud. Having taken a fancy to him, Alex brings home American backpacker (and unwitting werewolf) David Kessler (David Naughton) after he’s discharged from the hospital where she’s been tending to him while he recovered from an animal attack on the Yorkshire moors. Warning him not to expect too much since she’s “just a working girl,” she shows him around her cramped quarters, which art director Leslie Dilley and his crew make look as lived-in as possible without being too cluttered. The living room has plenty of throw pillows, photographs on the mantel, and shelves crammed with books and knickknacks, while the bathroom has little plants in individual pots. The hallway leading to her bedroom even has a single flower in a box mounted on the wall, which David’s zombie friend Jack (Griffin Dunne) playfully smells when he pays them a visit later that night. The very first thing the viewer sees when Alex lets David in, though, is a Casablanca poster hanging on the wall, closely followed by one for Gone With The Wind. And briefly glimpsed in her kitchen is an enlarged photograph of Humphrey Bogart. Even if she’s being truthful when she says she’s “not in the habit of bringing home stray, young American men,” it’s clear she has a thing for American culture (see also: the Disney paraphernalia scattered about) and a romantic streak a mile wide.
Poltergeist is by no means the greatest horror movie ever made, but its first 45 minutes offers one of the best sustained stretches of scares, jokes, and subtle subversion in the history of the genre. Long before the suburban Freeling family initiates Poltergeist’s plot by asking paranormal investigators about the disappearance of their daughter Carol Anne, the film spends over a third of its running time jumping from everyday annoyances to supernatural disturbances, depicting modern life as damned at its root. Television remotes malfunction. A pet bird dies. A 12-pack of beer explodes. Construction workers leer at a teenage girl. Chairs spontaneously move. A gnarled tree smashes through the Freelings’ upscale Orange County home. The “why” of all of this is put off for as long as possible, as the movie takes its time to generate an environment of queasy unease, punctuated by puckish social satire and moments of outright fear.
Here’s the question, though: Who deserves the credit for Poltergeist’s brilliant opening act? Director Tobe Hooper? Or co-writer/producer Steven Spielberg?
Poltergeist was the best and worst thing to happen to Hooper. The movie marked his graduation—however briefly—from low-budget drive-in fare to Hollywood blockbusters. It was such a success that he soon signed a contract with Cannon Films for three more mid-budget pictures. But the first two projects released under the deal didn’t resemble either Poltergeist or Hooper’s 1974 cult sensation The Texas Chain Saw Massacre, which left critics, producers, and audiences confused as to what kind of filmmaker he intended to be. Meanwhile, rumors persisted that Spielberg had directed most of Poltergeist himself, because Hooper was too indecisive on the set. Then in 1986, Hooper made the commercially disappointing The Texas Chainsaw Massacre 2, and that was pretty much it for him as any kind of artistic force in shock-cinema.
It shouldn’t have gone this way. In the 1970s, Spielberg and his “New Hollywood” compadres—Martin Scorsese, Brian De Palma, George Lucas, and their mutual mentor Francis Ford Coppola—stitched together what they’d learned from classic American studio films, European art-house fare, and disreputable B-pictures, and then threaded the resultant patchwork with their personal concerns and stylistic flourishes. They brought artistic credibility to pop and pulp, and were richly rewarded for it.
Meanwhile, in their own tributary of the mainstream, directors like Hooper, George Romero, and John Carpenter were shadowing Spielberg’s gang, with less fanfare. Each of these three filmmakers made a massive hit—Chain Saw, Night Of The Living Dead, and Halloween—and in the decade that followed, each more or less stayed true to their origins in tawdry exploitation. While making shoestring monster movies and offbeat adventures, they brought just as personal a stamp to their work as Scorsese did, with their own unique combinations of high- and low-art influences.
Carpenter and Romero have largely gotten their due from critics and scholars, however belatedly. Hooper’s legacy has been less secure. The Texas Chain Saw Massacre is in the cinematic pantheon, but beyond that, only Poltergeist gets much attention from non-connoisseurs… and that’s primarily because of the Spielberg connection. Yet for 12 years, from Chain Saw to Chainsaw 2, Hooper directed eight movies that collectively are as daring and visionary as Romero and Carpenter’s output in the same era. These films are often heavily flawed, but charged with a fervid intensity, and guided by a purposeful mind.
Hooper grew up in Austin, Texas, and attended UT in the early 1960s, becoming one of the first students in the college’s fledgling film program. He drifted into a career as a cameraman-for-hire, working mainly for the local public television station. By the end of the decade, he’d begun shooting an experimental “head movie” called Eggshells, which is more a demo reel of his technical know-how than it is a proper film. During the production, Hooper befriended writer Kim Henkel, who shared his passion for cinema, his ambition to work in Hollywood, and his certainty that their ticket to the big time wouldn’t be another Eggshells.
Seeing Night Of The Living Dead—or, more importantly, seeing how much money a micro-budget movie from Pittsburgh could rake in—Hooper and Henkel realized they could get noticed with a horror film without having to leave Austin. Inspired by the pervasive violence of early 1970s culture and the primal terror of Hansel & Gretel, the pair whipped up The Texas Chain Saw Massacre, a gonzo slasher with darkly comic elements and a disturbing docu-realistic grime. Shot over the course of one sweltering Round Rock summer in 1973, the movie was a miserable experience for the cast, whose treatment at the hands of the crew and their fellow actors bordered on the actionably abusive. The shoot wasn’t much happier for Hooper, Henkel, and their talented cinematographer Daniel Pearl, who were under-prepared, and dealing with financial backers who didn’t understand why they were adding sick humor and fancy camera moves to a movie that would sell based on its title alone.
When a movie star stands on the red carpet at his latest premiere and tells an interviewer, “We made this one for the fans,” what does he mean? It sounds on the surface like a vaguely populist “Give the people what they want!” sort of statement, that the filmmakers had the hardcore supporters of the franchise in mind—rather than the studio, or the critics, or the overseas markets—when they made the movie.
The idea of “fan service”—whether it’s putting in inside jokes and other Easter eggs that only the faithful will understand, or structuring entire films around beloved storylines and characters—is a powerful one in modern risk-averse Hollywood. Look at the Marvel movies, which have been wildly successful by putting the comic book company in charge of the franchise. Characters and storylines adhere to the mythology that the fans already know; the movies themselves are treated like comic book runs, the latest chapters in a sprawling, never-ending story arc.
There have been some fine Marvel movies. But how much creative freedom can a director have in making a movie that’s always the middle chapter in a story written by the higher-ups, one that fans already know by rote? Each movie just pushes the engine forward a little more, and no director can stray too far from the tracks without potentially derailing the entire franchise and ticking off fans.
But that’s not the only way to run a franchise. Look at the Mission: Impossible series, which, beginning with Brian De Palma’s first entry in 1996, stands as a bold rebuke to the idea that the fans of a franchise are its best caretakers. A popular series in its time, the original TV show from the late ’60s and early ’70s had not aged well, and was revived only once, as a middling reboot with an aging Peter Graves in the late 1980s. Paramount Pictures had held the film rights for years, but by the time Tom Cruise got involved, as the first project for his new production company, if people remembered the show at all, they remembered it first for Lalo Schifrin’s iconic theme music.
Which meant it was ripe for Cruise to turn it into his own action-adventure star vehicle. And that it was also ripe for De Palma, never shy about putting his influences and obsessions into his films, to put his stamp on it as a director. That there wasn’t much of an ardent fan base ended up working in the movie’s favor, because Mission: Impossible was a devalued franchise; the audience was too young to be fans of the show, so nobody would be upset if anyone tinkered with it.
And boy, did De Palma tinker. Signed on by Cruise before a screenplay was finished—he would bring in screenwriters Steven Zaillian, David Koepp, and Robert Towne to work on it, sometimes on competing versions—De Palma likely had a freer hand to orchestrate the results than if he had worked from a finished screenplay. The result has his fingerprints all over it: Whip off the lifelike summer blockbuster mask, and there’s a Brian De Palma film underneath.
Tom Clancy was not supposed to be famous. Born in 1947, he grew up middle class in Baltimore and got an English degree from Loyola, where he joined the Army ROTC but was barred from military service because of his poor vision. After school, he got married and worked at an insurance agency that had been founded by his wife’s grandfather, and that would have been enough—would have been a fine life, indistinguishable from a million other lives—except he decided to try his hand at writing novels in his spare time. He submitted his first manuscript to the Naval Institute Press, which probably seemed like a good fit given his story’s military setting and its long stretches of dense technical descriptions. Editors persuaded him to cut a hundred pages of jargon, and that was that: released in 1984, The Hunt for Red October became a runaway hit, earning a jacket blurb from Ronald Reagan on its way to selling 45,000 copies in its first six months and more than 3 million by the time of Clancy’s death in 2013. His name became a brand for a certain kind of story—military espionage potboiler, with an emphasis on technological accuracy and a penchant for verisimilitude—and by the end of his life, his net worth was pegged at $300 million.
You do not get to make that kind of money without Hollywood noticing, and sure enough, the entertainment industry came calling not long after The Hunt for Red October made Clancy’s name a household one. The film rights were optioned in 1985, and the movie made its way to theaters in 1990. (Clancy wasn’t a fan of this or any of the movies made from his books.) This was the start of a film franchise that would continue with adaptations of additional Clancy books, in 1992’s Patriot Games and 1994’s Clear and Present Danger, at which point it essentially ended. Two attempts to reboot the series—2002’s The Sum of All Fears and 2014’s Jack Ryan: Shadow Recruit—flamed out for various reasons, but those later films are also easy to ignore because they took place in different fictional universes, severed from the continuity and consequences of the earlier installments. The first three films made from Clancy books form a trilogy that revolves around the travails of Jack Ryan, a CIA analyst who finds himself drawn into increasingly dangerous situations (politically and physically) in the field, but they’re also much more. They’re fascinating aesthetic and political documents, charting America’s (and Hollywood’s) transition from the Cold War years into the morass of the war on terror. Clancy’s work came to be emblematic of a certain kind of Boomer mentality: militaristic but wary of the system, interventionist but iffy on the outcome. His breakthrough novel and subsequent successes mined the Cold War for story and emotion, but the Soviet Union disbanded not long after his career got started, leaving a vacuum into which no other clear villain could ever successfully be placed. To watch the films now, more than twenty years after the last installment, is to skip like a stone across the surface of the cultural paranoia that rippled in the wake of perestroika.
They’re metaphors for themselves in many regards, and viewed through the right lens, they cohere into a single image of what it was like to live through the time of their making.
///
The first film is, inarguably, the best. Released in March 1990, The Hunt for Red October is a taut chase movie and stellar action film that never sacrifices intelligence for excitement. Alec Baldwin stars as Jack Ryan, a CIA analyst recruited to suss out the truth behind a special kind of Soviet submarine with silent-propulsion technology. The submarine in question is the Red October, piloted by Captain Marko Ramius (Sean Connery), who, disillusioned with the Soviet navy, plans to defect to the United States with a few loyal officers. His Soviet commanders, learning of this, not only begin chasing Ramius, but also tell the American National Security Adviser that Ramius plans to launch nuclear warheads into the States, thereby roping their adversaries into the hunt. Ryan is the only one who suspects Ramius’ true plan, so he sets out to intercept him before anybody else can.
There’s a beautiful simplicity in that structure, as every scene and subplot is tied to the central question: Who will catch the sub? John McTiernan’s skill as a director of movies like this can’t be overstated—prior to this he’d helmed Predator and Die Hard, refining the rulebook on 1980s Hollywood action filmmaking—but he also moves effortlessly between sharply choreographed, fantastically directed action sequences and equally effective moments of character development and contemplation. There are assured, believable performances here, including Baldwin’s affable Jack Ryan, who’s scrambling to keep up with the situation at hand; Scott Glenn’s Bart Mancuso, the commander of a U.S. sub that’s pursuing Ramius; James Earl Jones’s Admiral Greer, the wise mentor who brings Jack into the situation; and Sam Neill’s Vasily Borodin, the second-in-command on the Red October who just wants to escape to a peaceful life in America. Adapted by Larry Ferguson and Donald Stewart, the screenplay strips the story to its essence, blending sociopolitical brinksmanship with thriller tropes in a way that’s consistently entertaining, even today. It was also a financial success, meaning a sequel was inevitable.
During the last months of his life, President Franklin Delano Roosevelt ordered that the mantelpiece in the White House’s state dining room be inscribed with John Adams’ prayer: “I pray Heaven to bestow the best of Blessings on this House and on all that shall hereafter inhabit it. May none but honest and wise Men ever rule under this roof.”
Adams, the second president but the first to inhabit the “Executive Mansion,” as the White House was more often called in those days, couldn’t have known how much urgency we might attach to such a prayer in an age of terrorism, global warming, and nuclear weapons. Nor could he have anticipated the way that the passage of time might alter and sometimes outright distort our perception of presidential honesty and wisdom; our definitions of both might be radically different from his. Our films on presidential politics are a snapshot of our hopes and fears, a way to work out our anxieties through fiction. Two films that emerged in reaction to the election of 1960—Advise & Consent, directed by Otto Preminger, and The Best Man, written by Gore Vidal—reflect those anxieties as acutely as any ever made.
Like Freud’s cigar, sometimes a film is just a film, of course, and not every presidential portrayal on celluloid betrays a hidden wish or worry—maybe Bill Pullman’s fighter-flying Thomas Whitmore in Independence Day (1996) and Harrison Ford’s combat-veteran James Marshall in Air Force One (1997), both chief executives who take matters into their own hands, have a subtext in the partisan gridlock of the Clinton years, but more likely they’re just action heroes going with the flow in outlandish films. Sometimes, though, the relationship is on the nose, such as in the 1933 fascist fantasy Gabriel Over the White House, which appeared at roughly the nadir of the Great Depression. Walter Huston plays a lackadaisical playboy president who suffers a near-fatal accident and is reborn as a Mussolini-like figure who solves the country’s problems through sheer force of will (and guns). At about the same time, there was also The Phantom President, a Rodgers and Hart musical, in which no less than the Yankee Doodle Dandy himself, George M. Cohan, reminds convention delegates about to nominate him for president:
"My friends, this land is sad today, / It faces want and dearth. / But government of the people, / By the people, for the people, / Shall not perish from the earth." The chorus answers, "Hey, hey, hey, that's a new thought."
On the calmer end of the spectrum, at a time when Roosevelt was saying he was less concerned with being a great president than with not being the last president, the years 1930-1940 brought no fewer than three major films about Abraham Lincoln (D.W. Griffith's Abraham Lincoln, starring Walter Huston; John Ford's Young Mr. Lincoln, with Henry Fonda portraying Lincoln as a lawyer; and John Cromwell's Abe Lincoln in Illinois, with Raymond Massey as the titular character), each bearing the reassuring message that when the Republic was last under threat, a hero arose to restore order.
The United States was a far more stable and prosperous proposition during the 1960 presidential campaign season, but the choice between Democratic Senator John F. Kennedy and incumbent Vice-President Richard M. Nixon made it a change election nonetheless. The previous three presidents had been born between 1882 and 1890. Kennedy, 43 years old, or Nixon, 47, would be the first president in United States history born in the 20th century. Whether one agreed or disagreed with the policies of Roosevelt, Harry Truman, or Dwight Eisenhower, in a very real sense the reassuring grandpas who had steered America through the frightening progression of the Depression, World War II, and the Cold War were going away for good.
There were good reasons to doubt both men. If Kennedy won, he would be the youngest elected president in history. While his panache made an appealing contrast with the dowdy Eisenhower, it was also a reminder that he was inexperienced, with an indifferent record in the Senate. Old New Deal Democrats, including Eleanor Roosevelt, doubted Kennedy's bona fides. As Arthur Schlesinger Jr. put it, Kennedy seemed "too cool and ambitious, too bored by the conditional reflexes of stereotyped liberalism, too much a young man in a hurry. He did not respond in anticipated ways and phrases and wore no liberal heart on his sleeve." In particular, the Kennedys had failed to repudiate the Red-baiting demagoguery of Senator Joe McCarthy and had, in fact, supported him.
Kennedy was also burdened by inherited doubts. He was a Catholic, and anti-Catholic prejudice was still strong in the country; it had helped defeat Democrat Al Smith in the election of 1928. There was also the looming presence of his father, Joseph Kennedy, a wealthy, unscrupulous climber who had sunk his own presidential ambitions by advising appeasement of Hitler while serving as ambassador to Great Britain at the outset of World War II.
As a Congressman, Senator, and Vice-President who had successfully debated Nikita Khrushchev and stepped in as pinch-president when Eisenhower was ill, Nixon had experience in spades, not that the old general acknowledged it. (Asked at a press conference to name an example of "a major idea of [Nixon's] that you had adopted," the president replied, "If you give me a week, I might think of one. I don't remember.") That experience, though, contained a fair share of disqualifiers. He had pioneered McCarthy's tactics, first in his campaigns for the House and Senate, then in the divisive Alger Hiss affair. Due to the revelation of a campaign slush fund, which he combated with the infamous "Checkers" speech, he seemed to many not just an unscrupulous careerist, but also an example of tawdry, down-market venality. "No class" was Kennedy's two-word dismissal, an assessment that was echoed in campaign signs that asked, "Would you buy a used car from this man?" Anticipating Donald Trump, Nixon had tried to rebrand himself so often that one commentator said the question was not whether there was a new Nixon or an old Nixon, but "whether there is anything that can be called the real Nixon, new or old." (James T. Patterson, Grand Expectations, 435.)
Tensions around the election were unsurprisingly stoked by the candidates. Kennedy's major theme was the anodyne "It's time for America to get moving again," but in his first televised debate with Nixon, he began by questioning, Lincoln-style, whether the world could continue to exist half slave and half free, asking, "Can freedom be maintained under the most severe attack it has ever known?" It was as if the Russians were about to march down Pennsylvania Avenue and paint the White House red.
Unsurprisingly, with the old guard fading away and the new guard doing what it could to shatter any sense of serenity, public uncertainty about the election expressed itself in polemical art that asked tough questions about the integrity of the American political system and the quality of the men that system produces. Among the first was the novel Advise and Consent by Allen Drury, which appeared in 1959. A huge bestseller and inexplicable winner of the Pulitzer Prize, it became a Broadway play the next year, with the film version, directed by Preminger, finished in time for Oscar season in 1961 but held back for contractual reasons until June 1962. Vidal's play The Best Man premiered on Broadway on March 31, 1960. The film adaptation, directed by Franklin Schaffner (who had directed the Broadway version of Advise), appeared on April 5, 1964. Significantly, both struggle to find an ending that does not duck the questions their stories pose, and both fail. Over 50 years later, with a presidential election of our own in the offing, we are still asking those questions.
*
In Advise & Consent, an unnamed, ailing president (Franchot Tone) nominates controversial candidate Robert Leffingwell (Henry Fonda) to succeed the recently deceased Secretary of State. Leffingwell is an intellectual who disdains kneejerk anti-Soviet policies. The former head of two federal agencies, he has made powerful enemies among the senators who must confirm his appointment, chief among them the senior senator from South Carolina, Seab Cooley (Charles Laughton, visibly ailing in his final role). Leffingwell also has an obsessive advocate in the sneering, peace-at-any-price junior senator from Wyoming, Fred Van Ackerman (George Grizzard). Caught between them is Brigham Anderson (Don Murray), senior senator from Utah, who will chair the subcommittee assigned to conduct the confirmation hearings. A family man with a pretty wife and a young daughter, Anderson is hiding a secret that could influence his vote if one side or the other were to get hold of it.