Sunday, March 27, 2016


The Limits of Absurdity by Robert Zaretsky

Robert Zaretsky Los Angeles Review of Books

Seventy years ago this month, Albert Camus arrived in New York City. It was the first, and last, visit the author of The Stranger made to the United States. Camus spent most of his time in New York — a city, he confessed, that defeated his understanding. His experience was, in a word, absurd. To mark the visit’s anniversary, the literary estate of Albert Camus has organized a series of readings, performances, and discussions across the city. The actor Viggo Mortensen, singer and songwriter Patti Smith, folk singer Eric Andersen, and scholars Morris Dickstein and Alice Kaplan are among the artists and writers who will participate. One of the events will be at the Midtown branch of the New York Public Library, where at 6:30 p.m., April 14, LARB’s history editor, Robert Zaretsky, will join New Yorker writer Adam Gopnik for a public conversation on the place of Camus’s life and work in the 21st century.

¤

ON MARCH 25, 1946, the French anthropologist Claude Lévi-Strauss, having left the rainforests of Brazil for the concrete canyons of New York City, confronted a social structure as complex and harsh as any he had studied in the field. Moonlighting as the French Embassy’s cultural attaché, Lévi-Strauss received an unexpected visit from a group of French passengers who had just arrived on an American freighter, the Oregon. Immigration officials had detained one of them because he refused to give the names of friends who belonged to the Communist Party. Lévi-Strauss dispatched a colleague to the docks, and the French visitor, frazzled and frustrated, was finally released.

With this faintly absurd event began Albert Camus’s only visit to America.

Camus was no ordinary tourist. France’s Ministry of Foreign Affairs had sent him as an official representative of the recently liberated country. Who better to speak to American audiences about France’s experience of occupation and liberation? By 1944 and the liberation of Paris, the young French-Algerian writer was not just the author of The Stranger and The Myth of Sisyphus, both published to critical acclaim in occupied Paris. He was also the editor of Combat, the most influential underground paper of the French Resistance. With a suddenness that both touched and troubled him, Camus had become the one marketable export left to a bloodied and brutalized country: the French intellectual for whom ideas were a matter of life and death.

His friend, Jean-Paul Sartre, had preceded him to New York in 1945. Playing the role of existentialism’s John the Baptist, Sartre spoke at great length about Camus to a reporter from, of all places, the American edition of Vogue. Praising the new literature that had taken root in the liberated soil of France, Sartre declared, “its best representative is Albert Camus, who is thirty years old.” Under the pressure of occupation and resistance, Sartre observed, the metaphysical absurdity that marked Camus’s novel and essay had morphed into a form of political absurdity. While this explained the pessimistic nature of Camus’s work, it also represented an antidote to despair. “It was when he lost all hope that man found himself, for he knew then that he could rely only upon himself,” Sartre explained. “The constant presence of death, the perpetual threat of torture, made such writers as Camus measure the powers and the limits of man.”

It also made Camus measure the powers and limits of celebrity. When he followed Sartre’s opening act the following year, he seemed groomed for the starring role. Lithe and likable, smart and smooth, Camus struck more than one observer as a Gallic Bogart — a comparison that delighted many French, none more so than Camus. But hailing from a country ravaged by material scarcity, he was hardly dressed for the part. When The New Yorker journalist A. J. Liebling called on Camus the day after his arrival, he was astonished by the Frenchman’s “absurd suit,” whose lapels and cut, he wrote, seemed to predate the Great Crash.

But Liebling was also seduced by the visitor’s warmth and humor. They talked about the war just ended — Liebling had been in France for its liberation — and the peace just begun, about Paris and New York. Inevitably, Liebling quizzed Camus about his own work, bringing up the conclusion of The Myth of Sisyphus, which offers the image of the Greek hero condemned to roll a boulder up a mountainside for all eternity. “For a man arrived at such a grim conclusion,” Liebling noted in his “Talk of the Town” article, “M. Camus seemed unduly cheerful.” When Liebling asked why, Camus replied: “Just because you have pessimistic thoughts, you don’t have to act pessimistic. One has to pass the time somehow. Look at Don Juan.”

At interview’s end, we must imagine Liebling happy. And little imagination is needed to see his happiness when he was asked to play the master of ceremonies two days later at the McMillin Theater (since renamed the Miller Theatre), where Camus was scheduled to give a talk. Though he shared the marquee with two other French writers — one was Jean Bruller, who under the penname Vercors published the great resistance work The Silence of the Sea — Camus was the evening’s star. More than 1,200 people flocked to the hall; more than a few, one suspects, had a shaky knowledge of French but nevertheless grasped the event’s significance. Though he had seemed relaxed to Liebling two days earlier, Camus now felt the air of expectation filling the auditorium. He paused, if only for a moment, overcome by stage fright.

But this small crisis gave way to a larger crisis, one affecting us all. In “La crise de l’homme,” or “The Crisis of Man” — the title he gave to his talk — Camus set himself an enormous task. To a packed auditorium of mostly young Americans, he sought to convey the character and consequences of events that, while scarcely touching his audience, had ravaged Europe. “The men and women of my generation,” he began,

were born just before or during World War One, reached adolescence in time for the Great Depression, and turned 20 years old when Hitler took power. To complete our education, we were offered the Spanish Civil War, Munich, and another world war followed by defeat, occupation, and resistance.

This upbringing, Camus drily concluded, made for an “interesting generation.” Faced by the absurdity of these events, his generation had to find reasons to resist and rebel. Where, though, could they be found? Neither religion nor politics offered guidance, while traditional morality was a “monstrous hypocrisy.” On a continent swept by mass murder and terrorism, a world awash in nihilism, Camus’s generation was thrust into the most terrifying of contradictions. “We hated both war and violence, yet we had to accept both one and the other.” They confronted, in short, the crisis of man.

By way of illustration, Camus offered vignettes of life under the Nazi occupation. There is the concierge at a Gestapo building somewhere in Europe who enters an apartment where two men, suspected of resistance activity, are shackled and bloodied. When one of them asks for help, she replies with pride: “I never mind the business of my building’s residents.” And there is one of Camus’s fellow résistants, his head bandaged, led into an SS officer’s room where, the day before, his ear had been ripped off. The officer, who had overseen the torture, asks sympathetically: “Tell me, how are your ears?” Finally, there is a Greek mother who learns that the German soldiers who took her three sons hostage are about to shoot them. When she pleads with the officer to spare them, he offers a compromise: one son can be freed, but the mother must choose which one. She points to the eldest, thus condemning her two other sons to death.

Camus tells his listeners that he did not choose these stories to shock them, but because they illustrate the crisis of man the world now confronted. The crisis, quite simply, is the fruit of a world where torture is not only practiced but evokes little more than indifference among the torturers and acceptance among its witnesses. When our response to the killing or torturing of a fellow human being is anything other than horror and outrage; when we consider the deliberate infliction of pain as no more disturbing than standing in line for our daily food rations; when we have reached this point, we must accept that the world will not improve simply because Hitler is gone. Scanning the hall, Camus declared: “We are all of us responsible and we are duty-bound to seek the causes of the terrifying evil that still gnaws at the soul of Europe.”

Against this bleak diagnosis of our condition, Camus offered a prescription nearly as bleak. While there was no reason for hope, this was not a reason to despair. We can solve this crisis, he announced, only “with the values we still have at hand — in a word, the awareness of the absurdity of our lives.” This was not metaphysical grandstanding, or the sort of language that sounds better in French than English. Camus was instead invoking the philosophical and moral themes that shaped the worlds of The Stranger and The Myth of Sisyphus. While neither had yet been translated into English, many in the audience must have been aware of the opening lines to the latter book. “There is just one truly important philosophical question: suicide. To decide whether life is worth living is to answer the fundamental question of philosophy. Everything else is child’s play; we must first of all answer the question.” This question confronts us the day that, finding ourselves in “a universe suddenly divested of illusions and light,” we nevertheless insist on meaning. If our “irrational and wild longing for clarity” is met by “the unreasonable silence of the world,” Camus wonders, is suicide the only reasonable response? Is it possible, he demanded, “to live without appeal”?

But the audience perhaps did not know that Camus had since moved beyond these early works. Though the books had been critically acclaimed when they were published in 1942, Camus saw that events had outstripped the meaning of Meursault’s and Sisyphus’s solitary and singular rebellions. The time had come to reassess the limits of absurdity. What would the world make, he asked in his journal, of a thinker who announced: “Up to now I was going in the wrong direction. I am going to begin all over”? With relief, he realized that it didn’t matter.

This was not all that Camus now grasped. Absurdity, he saw, “teaches nothing.” Instead of taking this diagnosis as a fatality — instead of looking only at ourselves, as do Sisyphus and Meursault — we must look to others. We are, in the end, condemned to live together in a precarious, unsettling world. “The misery and greatness of this world: it offers no truths, but only objects for love,” he wrote in the journal. “Absurdity is king, but love saves us from it.” Love saves us from absurdity.

Four years later, when he stepped onto the stage at McMillin, Camus had thus left absurdity behind. What lay in front of him was rebellion. He had already completed The Plague — published the following year — and had begun the essay that would become The Rebel. In his talk, he sketched the themes that would inform the book. In a world shorn of sense, he stated, too many people had concluded that whoever succeeded was right, and whatever was right was measured by success. For those who resisted this conclusion, for those unwilling to live in a world of victims and torturers, neither faith nor philosophy offered a resource. Instead, the only source of justification “was in the very act of rebellion.” What we fought for, Camus concluded, “was something common not just to us, but to all human beings. Namely, that man still had meaning.”

But what could such a claim about meaning, well, mean in America? In this happy country, Camus suggested, his audience could not “see, or see only dimly,” what Europe had undergone. But they needed to know that there were men and women “who had seen this evil for years, still feel it in their flesh and see it in the faces of those they love.” They are now rising up, he warned, “in a terrifying rebellion which risks carrying everything away.” But these rebels, Camus would subsequently make clear, form a particular breed. They reject not just metaphysical but also political absurdity: namely, a state’s insistence on giving meaning to the unjustifiable suffering it inflicts on its citizens. These rebels not only say “no” to an unspeaking universe but also “no” to an unjust ruler. They do not try to conquer but simply to confront a meaningless world and those who deny their humanity.

Most critically, however, the rebel imposes a limit on her own self. Rebellion is a defensive, not an offensive, act; it is equipoise, not a mad charge against an opponent, and it demands attending to one’s own humanity as well as that of others. Just as the absurd never authorizes despair, much less nihilism, a tyrant’s acts never authorize one to become tyrannical in turn. The rebel, by embracing a “philosophy of limits,” does not deny her master as a fellow human being; she denies him only as her master, and she resists the inevitable temptation to dehumanize her former oppressor. In the end, rebellion “aspires to the relative and supposes a limit at which the community of man is established.”

At the end of his address, Camus invited the audience to join this rising tide of rebellion. “As for the American youths listening to us,” he announced, this generation of European rebels “respect the humanity that animates you and the freedom and happiness reflected in your faces. They expect from you what they expect from all people of good will: a loyal contribution to the dialogue they wish to establish in this world.” No sooner had he finished speaking than a Columbia official revealed that a burglar had broken into the ticket office and stolen the evening’s earnings, all of which had been earmarked for orphanages in France. From the audience a voice suggested that everyone pay again on their way out; by the time the last person left the hall, there was more money in the register than the first time around. Justin O’Brien, a French professor at Columbia and Camus’s American translator, attributed this happy outcome to Camus’s “persuasive words.”

If these words did leave a mark on Americans 70 years ago, can they still do so? What, if anything, would Camus think about the absurdity of our current political climate? It is impossible to say, of course, but what is fairly certain is that he would be as perplexed today by Americans as he was in 1946. The audience’s reaction to the theft at Columbia impressed him deeply; in his journal he noted that it typified “American generosity.” What he found best in us, he wrote, was our “spontaneous friendliness and warmth.”

But other traits impressed him less. While he praised our generosity, he worried about our naïveté. Though he lauded our role in liberating France, he lambasted our decision to liberate the atom. Attracted by our hospitality, he was repelled by our superficiality. “The secret to conversation here,” he believed, “is to talk in order to say nothing.” In the one published work based on his visit to the US, a short essay titled “The Rains of New York,” he confessed that after several weeks in the city, he

still [knew] nothing about New York, whether one moves about among madmen here or among the most reasonable people in the world; whether life is as easy as all America says, or whether it is as empty here as it sometimes seems; […] whether it serves any purpose that the circus in Madison Square Garden puts on ten simultaneous performances in four different rings, so that you are interested in all of them and can watch none of them.

Camus’s confusion over Americans has become our confusion, while his observations cut as deeply today as they did 70 years ago. The 40 rings of our political circus, with its mixture of madness and reason, the easiness of the talk about torture and the emptiness of analyses by our media: all of this carries echoes from that night at the McMillin Theater. This confusion will not end soon, but what must begin, as Camus told his audience, is weighing our words. We need, he declared, “to call things by their proper names and understand that we murder millions of human beings when we allow ourselves to think certain thoughts.”

Tuesday, March 22, 2016

The Interpretation of Screams

A new critical biography of filmmaker David Lynch

A. S. HAMRAH


In 1967, when he was a student at the Pennsylvania Academy of the Fine Arts in Philadelphia, David Lynch, the future director of Blue Velvet and Twin Peaks, made a mixed-media sculpture, a Rube Goldberg device that, as Dennis Lim describes it in his thorough, compact, and illuminating new book on Lynch, “required dropping a ball bearing down a ramp that would, through a daisy chain of switches and triggers, strike a match, light a firecracker, and cause a sculpted female figure’s mouth to open, at which point a red bulb inside would light up, the firecracker would go off, and the sound of a scream would emerge.”
This combination of the board game Operation with a blow-up doll and Samuel Beckett’s Not I (the play featuring only a woman’s mouth), plus the sounds of a gunshot and a scream, anticipated the components of the work Lynch has spent the subsequent fifty years making. In film and television, his distinctive oeuvre has obsessed cinephiles, fans of the outré, and film academics, giving rise to the adjective “Lynchian,” a word, as Lim points out, that many have tried to define but that the culture at large has decided means “weird.” Lim boils the Lynchian down to “abysmal terror, piercing beauty, convulsive sorrow.” Lynch’s movies, he writes, “give form to the submerged traumas and desires of our age.”
A nicotine fiend and a coffee addict who mixes existential dread with sadomasochism in all-American settings, Lynch is that rare director who makes subversive films without a chip on his shoulder, seemingly without any will to provocation. He is at home with his neuroses and obsessions. His secret is that he proceeds as though he is acting from the most impossible condition of all: normalcy. While directors like David Fincher and Lars von Trier explore similar terrain with grim determination, only Lynch enters nightmare worlds like the Eagle Scout he was, as inquisitive about the depths of human psychology as he is about bugs and twigs.
“There is goodness in blue skies and flowers, but another force—a wild pain and decay—also accompanies everything,” Lynch has said. “There’s this beautiful world and you just look a little bit closer, and it’s all red ants.” Like the ones on the severed ear in Blue Velvet. Lim connects Lynch to the dark forces that drive the American psyche, the same ones D. H. Lawrence analyzed in his Studies in Classic American Literature, and there is more than a touch of “Young Goodman Brown” in Lynch’s homespun American surrealism. Like the character in Hawthorne’s story, Lynch is drawn to the woods at night, where ordinary people confront the demonic. The Black Lodge in Twin Peaks houses America’s violent soul.
This view was ingrained in Lynch from the start. His father, a research scientist with the US Forest Service, wrote a doctoral thesis called “Effects of Stocking on Site Measurement and Yield of Second-Growth Ponderosa Pine in the Inland Empire,” a title seeded with Lynchian allusion. When a painter acquaintance gave the teenage Lynch a copy of Robert Henri’s book The Art Spirit, a primer from the Ashcan School that describes art as “our greatest happiness” and encourages its readers (assuming they are male) to “do some great work, Son!,” Lynch’s path was set.
Lynch grew up in Leave It to Beaver–like towns, pinging happily between the Pacific Northwest, Idaho, and Virginia. While his upbringing implanted half his mental landscape, it did not prepare him for Philadelphia, which completed the picture. Lynch was uneasy in northeastern cities after visiting his grandparents in Brooklyn, where he was worried by a tenant in his grandfather’s kitchenless building frying an egg on an iron, and recoiled at the hellishness of the subway.
When he arrived in Philadelphia for art school, urban blight overtook him. Lynch moved into an industrial district, a grimy anti-Oz of crumbling buildings and smokestacks. He spent his time there in the neighborhood diner, a hangout for morgue attendants who let him visit their slabs. Soon he was married with a child, working as a printer and living in a cheap house in a bad neighborhood.
“There were places there that had been allowed to decay,” Lynch recalls, “where there was so much fear and crime that just for a moment there was an opening to another world.” He does not mean the world of poverty he was ensconced in so much as the interior world of psychological trauma poverty underscores. In Eraserhead, his first feature, Lynch surrealized that world, aestheticizing it across mental galaxies and inside bodies, heads, and planets. Was that only possible in a place that seemed beyond hope and ungentrifiable—a galaxy far away from us now?
David Lynch, Blue Velvet, 1986.
MGM
Lynch has called Eraserhead “the real Philadelphia Story.” Completed over a painstaking six years after he was admitted to the American Film Institute in Los Angeles, this “Dream of Dark and Troubling Things” (as the poster’s tagline had it) was greeted by Variety when it premiered in 1977 with the headline “Dismal American Film Institute Exercise in Gore; Commercial Prospects Nil.” Handled expertly by an independent distributor on the midnight-movie circuit during what Lim calls “year zero for punk,” Eraserhead proved Variety wrong. Lynch’s ugly baby thrived in a film economy separate from blockbusters like Star Wars, which debuted the same year.
George Lucas, in fact, later asked Lynch to direct Return of the Jedi. Of the roads not taken in film history, that was a highway Lynch was wise to avoid. He worked for hire on The Elephant Man and Dune before he fully abandoned notions of a conventional Hollywood career to make Blue Velvet in 1986. The Canadian filmmaker Guy Maddin calls it “the last real earthquake to hit cinema.”
Arriving at the height of the Reagan era, Blue Velvet rattled audiences, almost molesting them. It had the same effect on Film Studies, where it became an object of attraction and repulsion, calling across a void to film theorists of all stripes during the time of Back to the Future and Pretty in Pink, colorful pop films of youthful self-discovery lacking Blue Velvet’s sense of menace. Lynch’s film shared Back to the Future’s Freudian plot and Pretty in Pink’s New Wave veneer, but his style of deadpan terror was, to say the least, different. Fredric Jameson, in Postmodernism, or, The Cultural Logic of Late Capitalism, was notably discombobulated, seeing the film as an insidious form of “postnostalgia,” in which “evil has finally become an image.”
Blue Velvet, in addition to providing material for poststructural analysis, is one of those films in which every line is memorable. Lynch’s work as a screenwriter is often overlooked in writing on him, but Lim, the director of programming at the Film Society of Lincoln Center, does it justice. Isabella Rossellini was Lynch’s significant other at the time, and as the hostage-chanteuse Dorothy Vallens, her line “You put your disease in me” sums up what the film did to people who saw it when it came out in theaters, or later on VHS tapes. Home viewings, Lim points out, in which Lynch’s dark vision of small-town America could be watched again and again, paved the way for what Lynch did next.
In 1990, Twin Peaks extended the Lynchian into prime-time network television. Lim describes the series as “a mass-culture text that called for communal decoding,” and its debut drew 35 million viewers, a third of the potential TV audience. Connoisseur magazine called it “The Series That Will Change TV Forever,” and it did, albeit very slowly. Quality television and its signature genre, the “dead girl” mystery, owe their existence to the story of Special Agent Dale Cooper’s investigation of Laura Palmer’s murder. Now that Lynch is bringing Twin Peaks back on Showtime, he has come full circle, and it’s fitting to note that what was once OK for broadcast TV is now only acceptable on premium cable.
After a few years in the pop-culture wilderness, Lynch made Lost Highway in 1997, the first movie of his so-called LA Trilogy, which includes 2001’s Mulholland Dr. and Inland Empire, the 2006 opus starring Laura Dern he shot on pro-am digital video. While Mulholland Dr. began life as another series for ABC, Lynch was by then able to secure financing from European sources. Moving beyond the constraints of Hollywood financing and network notes gave him the freedom he needed to contemplate his experiences in Southern California his own way. In those films, Lynch took Los Angeles, Hollywood, and screen acting as his subjects.
Lim points out that these films emerged from the period “of Timothy McVeigh and the Unabomber, the Branch Davidians and Heaven’s Gate cults, the televised trials of O. J. Simpson and the Menendez brothers,” which created “a potent incubator for apocalyptic thoughts.” Lynch’s Los Angeles films, in which identity dissolves and congeals into murder, reflect the California sun back on itself. Despite the swimming pools and the modernist houses, he and his characters, living amid palm trees instead of pines, are not out of the woods. By the time of Inland Empire, which was partially shot in Poland and is three hours long, the Lynchian Möbius strip had enfolded a new psychic continent.
Those three films, masterworks of anguish and splintered dreams, may be his last. Before his imminent return to TV, Lynch seemed to have given up feature filmmaking for the self-banishment of the Transcendental Meditation foundation he started to promote world peace. Whether it’s DV, TV, or TM, Lynch has earned the right to do what he wants. If Inland Empire’s closer, an all-female group dance scored to Nina Simone’s “Sinnerman,” is the last scene of his last movie, it will be the semi-happy end to a career in film that began with a man and his hairdo trudging through a wasteland.
A. S. Hamrah is the film critic for n+1 and writes for a variety of publications including Cineaste and The Baffler.

Saturday, March 12, 2016

You've no idea...

how delightful sex can be until you've had afternoon sex in a bright sunny bedroom in the 4th Arrondissement, Paris, France!


Friday, March 04, 2016

Hard-boiled, hard-edged and Hollywood

OLIVER HARRIS
Times Literary Supplement
THE LEGENDARY DETECTIVE

The private eye in fact and fiction

In 1972 Fidel Castro’s Ministry of the Interior announced a competition to develop the crime genre in Cuba. They wanted stories that would deter anti-social behaviour, promote vigilance and establish heroes so principled they didn’t even swear. The contest attracted no entries. The Ministry’s miscalculation reflects the complexity of the genre’s appeal. Does crime fiction, as some have argued, serve as a prop to the status quo, reinforcing the law and even, via the palliative presence of a detective, helping accommodate us to social injustice? Or is it quite the opposite: a means of critique, shining a light into otherwise unexplored corners? In truth, the genre thrives on this duality. As the failure of the Cuban competition suggests, didacticism is a turn-off. What the putative crime writers of Havana wanted was to explore corruption. What readers wanted, then as now, was not a morality tale but stories of jaded men and women playing by their own rules.

No figure embodies this ambivalent appeal as effectively as the private eye. The legendary PI emerges via true-crime tales and pulp fictions to supplant the cowboy as modern hero: a romantic loner mistrusted by both police and crooks, playing both ends against the middle. It is this rugged individualist that John Walton’s eye-opening research tears to pieces. In The Legendary Detective: The private eye in fact and fiction, Walton asks how the American detective of collective memory arises out of one of the country’s most controversial and partisan industries.

While the relatively centralized states of nineteenth-century Britain and France were developing their own municipal detective agencies, the American investigator was born of a legal vacuum. The first generation belonged to the railway age, a response to the absence of federal policing across state lines. The second generation got rich servicing the era’s corporate monopolies. Most PIs were salarymen, and their employers were bound to big business. By the end of the century, the Pinkerton National Detective Agency employed more than 1,000 operatives in industrial espionage alone, predominantly targeting subversion among their clients’ workforces. When, in 1892, the Carnegie Steel Company experienced a strike against wage reductions, Pinkerton mustered a force of 300 guards. The law forbade the transport of private armed forces across state lines so the Pinkerton men came on one train, their guns and ammunition on another. Four workers died in the ensuing stand-off. During the Cripple Creek Miners’ Strike of 1903, Pinkerton agents bombed the Independence Railroad depot, killing thirteen strikers before attempting to pin the blame on the miners themselves. At the core of this violence was what the La Follette Committee, set up in 1936 to investigate the dark side of industrial relations, called the munitions dealer–detective agency connection. Arms manufacturers such as Federal Laboratories Inc shared friends, board members and anti-union sentiments with the investigative agencies they served. Federal supplied the tear gas and Thompson sub-machine guns.

Walton asks how we got from this to Dick Tracy’s Secret Detecto Kit, given out free with Quaker’s cereal in the 1930s. He discusses the PI in relation to the sociologist Everett Hughes’s concept of “good people and dirty work”: society’s means of dealing with repugnant yet apparently necessary activities in its midst (Hughes applied the concept to concentration camp guards and lynch mobs). The idea of necessarily “dirty” jobs lets a community convert polite silence into morally digestible legend. The fact that PI agencies arose alongside a booming popular culture industry allowed this process to flourish. Just as self-justifying memoirs by investigators inspired the pulps, so the heroism of the fictional PI fed back into field reports and PR puffs. Exercises in publicity such as Allan Pinkerton’s Criminal Reminiscences and Detective Sketches (1878) and The Masked War (1913) by his rival agency’s boss William Burns put the emphasis squarely on fighting dastardly criminals, despite the minimal involvement of their agencies in such low-profit work. Walton pursues the idea of fictionalization as legitimation, digging deep into previously untapped archives to unearth the reality of an industry that was happy to be preserved in myth.

One Pinkerton operative who achieved an enviable reputation for strike-breaking and surveillance was Dashiell Hammett. If his first novel, Red Harvest (1929), achieves exceptional richness of detail in its presentation of labour conflict and local corruption in “Poisonville”, it may be due to his personal involvement in the 1920 miners’ struggle on which it’s based. As in The Maltese Falcon, appearing in the same year, Hammett’s first-hand experience of political sleaze, industrial violence and the everyday routine of an agent allowed for a realism that brought hard-boiled fiction to new heights.

Nathan Ward’s brief but dogged new study of Hammett’s life, The Lost Detective, focuses on the early years, in a quest to show how the writer emerged from the detective. Hammett joined Pinkerton in 1915, aged twenty-one. His first published work appeared in 1922, the year he left the world of real investigation, but Pinkerton had given him the street knowledge needed to stand out amid a host of less credible competitors. Less expectedly, it also gave him a training in prose style: reports for clients had to meet expectations of concision and unsentimental “objectivity”. Hammett, who had left school at fourteen, took pride in his finesse, accepting that his skill in writing enhanced his in-house reputation. This crisp, objective approach gave his novels a cinematic style it would take cinema itself a decade to catch up with. It also allows Ward to position Hammett in a broader narrative of American literature, drawing a parallel with the influence of another exacting style sheet – that of the Kansas City Star – on a young Ernest Hemingway’s stripped-back prose. Both backgrounds provided tools for the construction of a self-consciously hard-edged literature. In his famous essay of 1944, “The Simple Art of Murder”, Raymond Chandler went so far as to align Hammett with Whitman in the larger triumph of American letters against artifice, a “revolutionary debunking of both the language and material of fiction”.

With Hammett’s own reports missing from the Pinkerton archives (possibly suggesting the sensitivity of the work he was engaged in), much of the connection between his literary life and his earlier employment rests on anecdote. The anecdotes that his family and friends remember him repeating carry an element of guilt mixed with tough-guy pride, most significantly one involving Hammett being offered $5,000 to kill a left-wing agitator. The novels, accordingly, gain a hint of exculpation.

In Walton’s account, the myth of the PI crystallized when its creators found the right balance of cynicism and moral sensibility. This moment can be identified more precisely with Hammett’s creation of Sam Spade in The Maltese Falcon. Spade defined the PI as lone operator, in contrast to Hammett’s previous fictional detective, the Continental Op, whose very name is lost to the agency he serves. Spade runs his own show, allowing him a marginally less complicit passage through a society corrupt on all sides. In his detachment, he exemplifies what Ross Macdonald called the investigator as a “poor man’s sociologist”. But Spade’s appeal, and the appeal of the genre at its best, goes beyond sociology to deliver something akin to a mass-market existentialism: the claustrophobic thrill of characters fighting their way through a world without values of its own. As an alibi for a less than reflective profession, it proved unbeatable.

By the time Spade was immortalized on screen by Humphrey Bogart in 1941, the age of the PI was waning: labour work was now against most agency policy, and crime investigation was handled by others. But on screen private eyes were about to come into their own. Two attempts at adapting The Maltese Falcon in the more comic mode of earlier detective films had flopped, but when John Huston placed Bogart opposite Mary Astor’s Brigid O’Shaughnessy and turned up the angst and eroticism, a cinematic style was born.

If hard-boiled novels found a new glamour in grit, film noir perfected the look. The essays in Kiss the Blood off My Hands seek fresh angles on a genre that has attracted so much scholarship that the academic field has its own worn tropes: German Expressionism, post-war ambience, gender politics. Several essays in Robert Miklitsch’s edited collection advance the study of film noir by attending to previously neglected aspects of style: sound, for example (Krin Gabbard looks at love songs; Neil Verma places the genre back in the lost context of the period’s hugely popular radio dramas, an old technology recovered via digital means). In a similar vein, Vivian Sobchack considers the use of back projections to convey the impression of driving: a cost-cutting measure turned to its own expressive ends in her chosen example, Detour (1945), where these juddering views can convey inner torment as effectively as any retrospective voiceover.

Other essays revisit and qualify the received truths of noir studies, none quite as extensively received as the co-dependency of PI and femme fatale. Debate has centred on whether these icons of dangerous feminine sexuality simply revisit old misogynist fantasies, or if they might communicate a potentially liberating new power. Both Philippa Gates and Julie Grossman take a more nuanced approach. Gates points towards the frequency with which female leads take on the role of hard-boiled investigator themselves – see, for example, Jill (Betty Grable) in I Wake Up Screaming (1941), or Ann Hamilton (Katharine Hepburn) in Undercurrent (1946). Grossman, meanwhile, argues that women’s concerns are fundamental to the genre, not just its shadowy margin. If films such as Otto Preminger’s Laura (1944) and Nicholas Ray’s In a Lonely Place (1950) dramatize the oppressiveness of a dominant masculine culture, it may be connected to the fact that they are based on novels by women (by Vera Caspary and Dorothy Hughes respectively). Indeed, the influence of women on the noir world, as both writers and audience, is evident in how the studios themselves perceived the films, frequently classed as “melodramas” rather than noir, and even seen in close relation to melodrama’s subgenre, the “woman’s picture”.

This touches on a fundamental problem for any historian of noir: it never actually existed. It was French critics, cut off from American cinema during the war, who first coined the phrase when five years’ worth of Hollywood crashed into Paris in one moody and exhilarating haul. In fact, cinematic effects that the French found so expressive of malaise could often be the result of wartime constraints (dim lighting helped cloak cheap sets; elliptical transitions were sometimes the consequence of hastily written scripts being shot on the fly). But the success of the phrase suggests the cineastes were justified in recognizing a new desire on the part of American filmmakers to find innovative means for the expression of social and psychological unease.

If we are searching for the source of this unease we need look no further than Hollywood itself. It is no coincidence, Mark Osteen argues in his essay “A Little Larceny: Labour, leisure and loyalty in the ’50s noir heist film”, that the first true heist movies appeared in 1950, between the first and second rounds of hearings by the House Un-American Activities Committee. Liberal Hollywood figures such as Huston had tried to fight back with the Committee for the First Amendment, but resistance crumbled as the blacklist grew. In The Asphalt Jungle (1950), Huston tells the story of a robbery gang imploding under the pressure of mutual suspicion and multiple betrayals. In Osteen’s reading, the gang expresses the failed ideals of solidarity. But it can also serve a reverse purpose, he argues, portraying the limitations of corporations themselves. In the criminal mastermind’s selection of participants for their specialized skills (safecracker, driver, thug, etc.) and impeccable attention to timing in the heist’s execution, Osteen sees a critique of Fordist factories and new scientific management techniques for optimizing workers’ performance. Time and again, the heist should go like clockwork, but human frailty intervenes.

At a time when direct political criticism was taboo, these were the means by which it could survive. But how much of this judgement did the audiences of the 1940s and 50s discern? In their genre-defining Panorama du film noir américain 1941–53 (1955), Raymond Borde and Étienne Chaumeton depicted their chosen canon as casting a disapproving gaze over contemporary American society. But they were conscious of the ideological complexity involved when an imperial power invades your cinemas so seductively. Like its hard-boiled predecessors, film noir set urban decay alongside unprecedented wealth, exposed the greed and violence behind both, and at its heart it placed the private eye, an everyman defined by a job he finds distasteful but is bound to by financial necessity. Here was modern life laid bare, and it had never looked so alluring. Sometimes realism is the best propaganda.

Oliver Harris is the author of the Nick Belsey series of novels. The latest one, The House of Fame, is due to be published next month.