Saturday, June 29, 2019

Fifty Years After Stonewall


Coming Out, and Rising Up, in the Fifty Years After Stonewall

By Masha Gessen | The New Yorker

A day or two after the Stonewall Riots, in June, 1969, Virginia Apuzzo, a twenty-eight-year-old nun and college lecturer, went down to Sheridan Square looking to shed her shame. The shame had been with her since she was ten and proposed to a girl who was perhaps a year older, a neighbor in an apartment building in the Bronx. “She called me ‘crazy’ and I felt red-faced, shriveled,” Apuzzo wrote in an e-mail to me. Now, down in the Village, where queer people stood up to police during a raid of a gay bar, “the energy and atmosphere gave me my first sense of Pride, of having the right to breathe out loud.” She quit the convent and joined the newly created gay-liberation movement. 

Just six years later, when Richard Burns, a student at Hamilton College, came out, gay-liberation groups had proliferated. There was a tiny one on campus, and Burns joined immediately. Gay life, gay activism, and gay pride were no longer inconceivable—they were even, with some trepidation, occasionally written about in the newspapers. What remained inconceivable was a positive image of gay life and love in the mainstream media: in those rare instances when homosexuals appeared in movies or on television, they were pathetic, ridiculous, or frightening. So for Burns, when he became an activist, the goal was “being able to come out, find people to love, and safety,” and this seemed revolutionary, because nothing of the sort was reflected in popular culture. 

Seven years after this, when I came out, as a fifteen-year-old in Boston, the image of gay people in the mainstream media was not much better, and was about to get a lot worse: homosexuality would soon be perceived as firmly linked to a terrifying, deadly new disease, at first called Gay-Related Immune Deficiency. But queer role models were available to me. Boston was an epicenter of gay and lesbian activism, whose own epicenter was the weekly Gay Community News, the only national gay newsweekly. G.C.N. held envelope-stuffing parties every Friday. I went there to volunteer, eat pizza, and silently gawk at brilliant gay activists. Burns, who had served as managing editor at G.C.N., was one of them. In 1982, he was a law student at Northeastern University, as was a lesbian activist named Urvashi Vaid. (They met on their first day at law school, in 1980.) I thought they were gods. We are acquainted socially now, and are even members of the same book group, but I don’t think they know that I had a one-sided relationship with them back then. I wouldn’t be who I am if it hadn’t been for them and about a dozen other queer people I met in the nineteen-eighties—including Apuzzo, who was then the head of the National Gay Task Force. (The organization changed its name to the National Gay and Lesbian Task Force in 1985 and is now the National L.G.B.T.Q. Task Force.) I have probably conducted more than twenty thousand interviews in my life, but I remember the first time I interviewed Apuzzo over the phone, in the late eighties. She said that, at that stage in her life, she was interested in working politically with young people, and this set my heart aflutter. 

The writer Andrew Solomon has proposed a distinction between “vertical” and “horizontal” identities. The former, like religion, ethnicity, and hereditary conditions, may be passed on from generation to generation. The latter separates children from their parents but connects them to people outside their family of birth. This type of identity may be sexuality, a condition like deafness in the child of hearing parents, or any number of other identities that parents—and, often, other older relatives and community members—cannot model. Like most queer people, I had to look outside my birth family (and run away as fast and as far as possible from the Russian-immigrant community) in order to understand who I could be and how I could live. Many of the people who were my unwitting guides have died, several of them of AIDS. But on the eve of Stonewall 50, I got in touch with the people who are still here. I asked them to think back to the time when they became activists and recall what they imagined then that the future might be like—and compare that to now. 

Burns, who is now sixty-four, told me over the phone that, as he recalled, “The holy grail was nondiscrimination laws—then we’d be done. There was no understanding yet of the duality between legal equality and lived equality.” The gay movement faced a backlash and setbacks before it made any measurable gains. In the late nineteen-seventies, a Christian singer named Anita Bryant toured the country campaigning against “homosexual propaganda” and spreading the fear of predatory homosexual pedophiles invading American schools. In the nineteen-eighties, AIDS began killing gay men and turned the living into pariahs. In 1986, the Supreme Court upheld the Georgia sodomy law in Bowers v. Hardwick, a case that began when the police entered an Atlanta bedroom, found two men in bed together, and arrested them. The court ruled that the police had been right to do so, and that the right to privacy was reserved for heterosexuals. (I didn’t have to look that up: thirty-three years later, I remember the names of the participants, including the lawyers, the details of the case, and the sickening feeling in the pit of my stomach when the decision came down.) 

In the minds of gay people there could be no doubt that we lived in a country that despised us. But, Burns pointed out, the gay community mobilized in response to every calamity. The first gay and lesbian march on Washington, in 1979, marked the beginning of a national movement. AIDS showed more-assimilated gay men that they were not protected, and brought them to activism. In 1987, ACT UP was born in a speech by the writer Larry Kramer that he delivered at the New York Lesbian and Gay Community Center, where Burns had begun his twenty-two-year tenure as executive director. The second march on Washington, in October of that year, took ACT UP national. Every time something terrible happened, Burns said, the movement got stronger: more people joined in response to the 1998 murder of Matthew Shepard, and after a 2000 Supreme Court decision that upheld the Boy Scouts’ right to discriminate on the basis of sexual orientation. 

The tide took a long time to turn. In 2003, in Lawrence v. Texas, the Supreme Court reversed the Bowers decision, striking down remaining sodomy laws throughout the country. The following year, Massachusetts became the first state where same-sex marriage was legal. Burns was not an early proponent of the marriage-equality fight. His roots were in gay liberation, which had its roots in the great social movements of the nineteen-sixties. “We didn’t want to get married—we wanted to dismantle the nuclear family. We didn’t want to serve in the military—we wanted to disband the military.” Now, he said, he is a “very happy beneficiary” of the expansion of the right to marry. He and his partner were married, in 2015 (Burns had to check the inside of his wedding band to make sure that he remembered the date correctly); the comedian Kate Clinton, who is Vaid’s partner, officiated. The biggest reasons to get married, Burns said, were “age and health concerns.” And love, he then added. 

Back in the day, the scholar and activist Karla Jay told me on the phone, “We had much more radical hope. Our hope was that society would have changed in a dramatic way. I’m not talking about the legalization of marriage—I’m talking about a society in which sexual orientation wouldn’t matter, because people would view others as simply human beings.” It was a vision of transformation and convergence more than acceptance and integration. Jay was a member of the Gay Liberation Front, an organization born of the Stonewall riots, and its first female chair. A favorite slogan was “We’ll never go straight until you go gay.” At a time when holding hands with your lover in public was unthinkable and same-sex dancing was illegal, Jay said, “we imagined decriminalization rather than legalization.” They wanted all love and sex to be deregulated, not to have same-sex love included in the regulatory system of the state. But when the larger world suddenly opened the door to same-sex couples, inviting them into the system, the movement giddily marched through that door, abandoning its earlier dreams and leaving behind many of its most marginalized members. “Transgender women of color, poor elders, homeless youth, people who don’t want to live in a couple, single people, asexual people, threesomes, communes—all those people have been thrown under the bus,” Jay said. 

To teen-age me, Jay showed that you could be it all: a street activist and an academic, butch, and unapologetically focussed on queer issues. Jay, who is seventy-two, was a professor at Pace University for thirty-nine years and has written and edited several key books on queer history and theory. She has maintained her radical views. But in 2004, when Massachusetts legalized same-sex marriage, Jay and her partner travelled to Northampton and got married. They had four witnesses, a blueberry pie with two plastic bride figurines on top, and, she said, “thirteen hundred rights that we got, but I don’t feel good about that.” Still, they did it for the legal protections marriage offers. “If there is anything I can do to spare the woman I love any pain if anything should happen to me, I will do it,” Jay said. “I will walk on coals.” 

This weekend, Jay will celebrate the fiftieth anniversary of Stonewall at a private reunion of G.L.F. to be held at the Center, following a panel with some other former members of the organization. She expects fireworks—it was always a contentious group. “Some of the people I struggled with are still among my closest friends,” she said, “because they love the world the same way I love the world.” 

About six months into the G.L.F.’s existence, a group of dissidents split off from the organization, which they felt was spending too much time on too many different social-justice issues, and formed the Gay Activists Alliance, to concentrate on gay liberation. In the winter of 1971, a textile designer named Jonathan Ned Katz went to a G.A.A. meeting. “I was terrified,” he told me on the phone. “I knew it was going to change my life, but I didn’t know how.” Katz joined the media committee, which convened to discuss “how we were going to get our new consciousness out to the public.” Katz, who grew up in a radical-left household in New York, with a father who had taught him about the history of race relations in the United States, had written two documentary radio plays about slavery. “And I thought, There must be gay history!” Katz wrote a documentary play—agitprop is how he thought of it—called “Coming Out!,” which also served as his coming out to his mother, via an ad for the production in the Village Voice. The play led to a book, “Gay American History,” and a now nearly fifty-year career as a historian of homosexuality in America. 

A decade after Katz went to his first G.A.A. meeting, I was lurking in the library stacks at Brookline High School, reading the thousand-page “Gay American History” in single-afternoon portions. It was the only book in the library that had the word “gay” in the title. This book meant everything to me: it was proof that, as a queer person, I had a legacy, and it was also a way to connect to people who shared it. I don’t know how many other kids read the book in the stacks—it was well worn, despite having shrink wrap over its red glossy cover—but only one girl had had the courage to check the book out of the library. She had done it repeatedly; her name was written on the lending card over and over. I tried to find her, but she had dropped out of school. I did meet her eventually, after I had also dropped out and she and I were working as bicycle messengers at the same gay-run company. 

When he first started going to demonstrations, Katz told me, he had no hope for effecting change. “I did it because it was nice to be with other people—it felt good,” he said. Then again, he had felt hopeless when he was attending peace marches in the early sixties, before there was a mass antiwar movement—whereupon he stopped going, because his body didn’t seem to matter so much. The gay movement grew and changed, too. “I underestimated the ability of capitalism to take in certain rights movements, to view them as new consumer groups to market to,” he said. “I’m not against it; it’s just not the vision of human liberation that a lot of us had.” That vision was of a full realization of democracy, complete with equal resources and opportunity. 

That said, Katz, who is eighty-one, wanted me to know that he has had just the best almost-fifty years and is still full of joy. “It’s great that I no longer have to live in shame and go to my therapist and say, ‘I want to be a heterosexual.’ I have so much fun discovering gay history. I’m Jonathan Ned Katz, detective historian! I wouldn’t have become that if I hadn’t gone to that G.A.A. meeting.” 

When Urvashi Vaid became an activist in the nineteen-seventies—first in social-justice and anti-racism movements, then in the feminist movement, and finally in the queer movement—she thought that victory would come when women’s liberation was achieved, and that this victory was inevitable. “We were going to change sexual hierarchies, we would end capitalism,” she told me on the phone. “I thought we would all be living in communes, and we would create new economic structures.” With every decade of activism, she has tempered her goals. Vaid, who has served as executive director of the National Gay and Lesbian Task Force, worked in the foundation world and in academia, and now runs her own consulting group. There has been less policy change and less accumulation of political power than she had expected: somehow, the math, which showed that women and progressives were in the majority, has not resulted in a redistribution of political power. But there have been unexpected gains, too—many religious communities have come to accept homosexuality, for example. And there is marriage.

“If you had told twenty-year-old me that I would be married, I would have laughed in your face,” Vaid, who is sixty, told me. “Marriage is property; property is theft!” She and Kate Clinton have been married for five years, together for thirty-one. “It’s an amazing thing,” she said, of marriage. “It transformed outlaws into in-laws. It domesticated us, which is a complication.” And yet, she believes, same-sex couples are also changing the institution of marriage, simply because “we don’t do gender roles the same way. We just don’t.”

It wasn’t the focus on marriage that left the more marginalized members of the L.G.B.T. community behind, she believes—it was the lack of focus on economic justice, gender, and race. But while that may not be changing in the mainstream L.G.B.T. movement, something else has happened that excites Vaid far more than marriage. “Queer people are the leaders of the progressive movement,” she said. “I always expected it, and now it’s happened.” Queer African-American women founded the Black Lives Matter movement; queer organizers are central in the fight for immigrants’ rights; and two of the country’s largest labor unions, the Service Employees International Union and the American Federation of Teachers, are led by lesbians. “The left didn’t get race, didn’t get gender, didn’t get sexuality in the seventies, eighties, and nineties—and then it changed.”

After I dropped out of high school, I spent a year living on Beacon Hill, one of Boston’s gay neighborhoods. Everyone knew one another—it was an intricately interconnected multigenerational community, mostly of gay men, but with a sizable lesbian minority. These were the lesbians who liked to dance and who idolized Bette Midler; the other lesbians, who hosted potlucks and listened to women’s music, lived across the river, in Cambridge and Somerville. Everyone I knew had a story of rupture with their family. Many of my friends had run away from home—a couple were teen-agers, like me, and several were in their twenties but had been on their own for a decade. Older men gave shelter to younger men and boys. My girlfriend had got away from her family by enlisting; she was now going to college on the G.I. Bill. I moved to New York to go to college, at Cooper Union. In the spring of my freshman year, I read about a gay high school opening in the city. There were twenty students, the article said, mostly effeminate boys and butch girls who had trouble fitting in at their old schools. I remember feeling like I had missed out—I wanted to go back to high school with other queer people. At Cooper Union, I didn’t know any students who were queer, though I did run into my sculpture professor at the lesbian club the Cubbyhole almost every weekend.

Even though I never went to Harvey Milk High School or the Institute for the Protection of Lesbian and Gay Youth (now the Hetrick-Martin Institute), a resource center for homeless gay and lesbian youth that preceded the school, I felt somehow connected to them. I was still the same age as the teen-agers who were there; back in Boston, I’d left friends who had been turned away by their families. The founders of these institutions showed that we could take care of our own—something that we would see on a much greater scale during the AIDS crisis.

Joyce Hunter, a co-founder of the school and a founding member of the institute, was herself once a high-school dropout estranged from her family. When she went to her first women’s dance, run by the G.A.A., in 1971, she was married to a man, working at a factory, and raising two kids. When she became an activist, her biggest concern was getting to keep her kids—at the time, many women lost custody once they came out as lesbians. Hunter’s kids stayed with her. She went back to school and then started her own school. Now eighty, Hunter is living in Sunnyside, Queens, with her partner of thirty-nine years. They married two years ago. “My grandkids married us,” Hunter told me. “We decided we needed to, because of our age and financial issues. But we are lucky because we have families that love us.”

The Harvey Milk School, on Astor Place, is now one of the many small high schools run by the Department of Education. “But for kids who don’t live in Manhattan, it’s still hit or miss,” Hunter said. “They may see gay people on TV, but they are still on Staten Island, where they are different.” Still, Hunter is probably the most hopeful of the people I interviewed for this article. It was always at the core of her work with young people—she created places where they could feel hope, and observing them gave her hope. “I feel strongly about the hope,” she told me.

That feeling I first experienced at eighteen, when I read about the queer high school—the feeling of missing out on something that should have been my life—was a preview of witnessing social change while aging. The writer Sarah Schulman told me that she felt it in 2003, when the Supreme Court struck down sodomy laws. “I knew that I would never personally benefit from that decision,” she told me on the phone. She didn’t mean that she feared being prosecuted for homosexual sex—it was the court’s acknowledgment of the basic humanity of queer people that came too late for her. “I had been devastated by familial homophobia. I had a totally marginalized career. This was never going to go away.”

I met Schulman in 1990, the year her novel “People in Trouble” was published, and I walked around the city with lines from the novel playing in my head. Schulman, who was thirty-two then, was everything I wanted to be: an activist, a writer published by a big New York house, a popular speaker. She seemed like the trailblazer of the big gay writing breakthrough. In retrospect, she said, this was the period when publishing discovered niche marketing. Barnes & Noble now had a gay section, and this was where Schulman’s and other gay-themed books were shelved. They went from being writers to being “gay writers.”


The problem with being a “gay writer” is the assumption that your writing is interesting only to gay people. Schulman said that, when she went to her high-school reunion, she could instantly tell who among her old classmates was gay: they had read her books, while the straight attendees asked her what she did. Schulman has received enough conventional honors—she is a distinguished professor in the CUNY system, a fellow at the New York Institute for the Humanities, and currently a fellow at the MacDowell Colony—to have a sense of how much more recognition she would have had if she were not perceived as solely a lesbian writer. “I am never called upon as a public intellectual, except on gay topics,” she told me. “Only recently have straight people started to read my work.” The New Yorker wrote about her 1990 novel in 2017. I asked Schulman if she has a revenge fantasy, or at least a fantasy of finally getting the accolades she deserves. She said that, if it came, it would probably feel like it was too late, like the Supreme Court decision, a bittersweet and insufficient resolution to a lifetime on the margins.

When we spoke, Schulman was at MacDowell, revising a draft of her monumental history of ACT UP, an organization in which she was active and the memory of which she has maintained by working on a large oral-history project and a documentary film, “United in Anger,” which came out in 2012. A lot of the media coverage of the fiftieth anniversary of Stonewall has seemed to obscure the AIDS epidemic, to mention it as almost a footnote, or at least just one chapter in a story of many losses and more gains. For queer people of my generation and older, the epidemic shaped our understanding of life itself. When we talk about AIDS, it seems, we can talk about nothing else, because nothing else looms as large. It seems that there may be no way to talk about AIDS in a way that reflects its impact on the gay community, the L.G.B.T. movement, and, indeed, American culture and politics. For younger generations, though, and for writers and historians crafting a longer narrative for the current festivities, the AIDS epidemic has become a self-contained historical episode.

In fact, some forty million people in the world are living with H.I.V., more than a million of them in the United States. In 2017—the last year for which statistics were available—almost a million people in the world died of AIDS. In the United States, tens of thousands of people become infected with H.I.V. every year; most of them are gay men, and most of these men are either African-American or Latino. This is happening twenty-three years after researchers announced that AIDS would now be a manageable chronic disease. It is true that current treatments help a majority of people with H.I.V. who take them, but millions of people who need it do not have access to treatment, and many more don’t even know that they are infected. It is also true that H.I.V. transmission can now be effectively prevented—both by treating people who are infected, until there is no virus in their bodily fluids, and by taking anti-H.I.V. medication preventively. But a majority of Americans who would benefit from preventive treatment do not receive it. Still, the conventional narrative is that AIDS is a plague that began and ended a very long time ago.

For me, Larry Kramer, the writer who founded ACT UP, is a model for turning one’s fears and feelings of helplessness into anger and action. He was the first to raise an alarm, in the gay media, in 1982, about gay men dying of a mysterious disease. Gay Men’s Health Crisis, a service group devoted to taking care of sick and dying men, started in Kramer’s living room. In 1987, he gave a speech that managed to be desperate and inspiring at the same time. It started ACT UP. Now Kramer, who is eighty-four, once again finds himself unable to think about anything but AIDS—and his own anger.

“AIDS activism and writing about it have been my whole life,” Kramer wrote, in an e-mail. “The future I hoped for was its elimination. Instead it has only got worse. We have lost the war against AIDS and are continuing to do so. It’s all getting much much worse for us and this won’t stop.” He added, “Too many people hate us and are accelerating this hate.”

Political activism, if it is successful, benefits younger generations more than the people who created the change. Kate Bornstein, the writer and performance artist, told me that she expects the gender revolution that she has been working toward to happen in my lifetime, but not in hers. She is seventy-one. In 1991, she was the first person I saw speak about the possibility of being neither a man nor a woman. The idea was fairly new to her, too. She had begun transitioning from male to female in 1984. Her therapist suggested that she get out and meet people by joining an activist group, and she went to the National Gay Task Force, which was then a small office in New York where people were planning for the 1987 march on Washington. Bornstein thought of herself as a transsexual lesbian woman and, she said, “I just wanted to disappear into the lesbian community. It was warm: they had potlucks and game nights. It was family.” But the AIDS crisis ended the cozy lesbian life that had welcomed her in Philadelphia and San Francisco, Bornstein noted. “Potlucks and game nights became phone trees and ‘Who can take care of Don.’ ”

The idea of a transsexual lesbian was novel, and Bornstein’s new friends asked a lot of questions. Mostly they wondered how someone who had not been socialized as a girl and a young woman could become a woman. The questioning was not hostile. “They asked me, ‘How?,’ ‘What?,’ ‘Explain, please.’ And the more I was explaining, the more I realized, You are right, I’m not”—not a woman, a lesbian woman. At the same time, Bornstein knew very well that she wasn’t a man. “And that was the scariest time,” she said. “That left me with no ground under my feet.” To find a way to talk about herself, Bornstein started writing and performing her thinking about gender. (She now identifies as nonbinary.)

There was, in fact, a history to what Bornstein was living, writing, and acting. Before Stonewall, queer people were subverting gender roles, playing with gender roles, and camping it up. (The documentary “Before Stonewall,” restored and rereleased this month, provides more than an hour of delightful footage of largely this.) At Stonewall, said Bornstein, “there were people who were nonbinary. There were transsexual women. They were not closeted or middle class.” But she was not there. “I was a middle-class white Jewish closeted transsexual.” It was the year Bornstein graduated from college. Fifteen years later, she said, “I grabbed onto Stonewall’s coattails.”

Some of the people I interviewed are marching this weekend—a few with the official Stonewall 50 march, but more with the Queer Liberation March, which will have no police protection and no corporate sponsors. Schulman is sitting the festivities out at MacDowell, and Bornstein is staying home because she is not well enough to march. I will be joining both marches with my daughter and her friend, two teen-agers who are growing up in a world that is not quite what I imagined: I thought we would have retired the concept of sexual orientation by now and would have made a bigger dent in gender. Still, this world is much better than the one I came out in—for now.

“We have achieved so much more legal equality than I ever thought we would in my lifetime,” Burns said. “And now a lot of that is under attack. I always assumed, naïvely, that, when we made progress, it would be permanent. We are always going to have to fight to keep the gains we have made.” In the insanity of the Trumpian news cycle, the threats are barely registering: the Administration is packing the federal courts; proposed regulations in Health and Human Services and Housing and Urban Development would legalize discrimination in housing and health care on the basis of sexual orientation and gender identity. “This will have devastating consequences, especially for transgender people, who are the canary in the coal mine,” Burns said.

“I am ambivalent about the fiftieth anniversary,” Apuzzo wrote to me. “Seeing so many young people being so relaxed in their sexuality, so free to express their right to be who they are—I celebrate that enthusiastically! I celebrate the fact that we have learned that ‘getting government off our backs’ isn’t enough by a long shot. In fact, we had to learn how to make government responsive to its people, and that we must demand the right to be involved in the decisions that affect our lives. I can celebrate the fact that, for the most part, we have gone from a political ‘issue’ to a constituency.” But, she continued, “In these days, when the ground is always shifting, when truth is obscured by so many layers of lies and the very institutions utilized to get us to a fiftieth anniversary are being undermined and their power diminished, what’s the plan?”

Several of the people I interviewed for this article admitted that they have only recently stopped assuming that progress was linear and irreversible. Indeed, Bornstein pointed out, queer people have been here before. “A similar moment happened in Weimar Germany in the thirties,” she said. “Gender and sexuality exploded. The arts were amazing. And fascism was on the rise. And then fascism won.”

But this time, she said, things will be different, because changes in the legal status of L.G.B.T. people and the perception of gender and sexuality are a global phenomenon. She is convinced that change will continue. The next frontier is a total reconfiguring of gender, and it will be like nothing we’ve ever seen.

From Audase: The truth of human sexuality is not going away. Every great idea waxes and wanes. Gay rights might be challenged, but they won't be defeated. We might see a few years of revisionism and retrenchment, but there's no going back. Politics is often progress and then a step back.

Thursday, June 20, 2019

The Most Irrational Animals




Why Humans Are The Most Irrational Animals
Our complex imagination makes us more irrational than other creatures.

Bence Nanay | Professor of Philosophy at the University of Antwerp and Senior Research Associate at Peterhouse, Cambridge University

It is easy to make fun of the Aristotelian idea that humans are rational animals. In fact, a bit too easy. Just look at the politicians we elect. Not so rational. Or look at all the well-demonstrated biases of decision-making, from confirmation bias to availability bias. Thinking of humans as deeply irrational has an illustrious history, from Francis Bacon through Nietzsche to Oscar Wilde, who, as so often, came up with the bon mot that sums it all up: "Man is a rational animal who always loses his temper when he is called upon to act in accordance with the dictates of reason." 

My aim is to argue that humans are, in fact, not more rational, but less rational than other animals. Aristotle talked about rationality as the distinguishing feature of humans compared to other animals. I think we can use irrationality as a distinguishing feature. It’s not just that humans are irrational animals; humans are more irrational than any other animal. 

This is not a completely new line either, although the point has often been made merely as a provocative overstatement. In fact, according to the standard account of biases, irrationality (in the guise of biases) is explained by simpler cognitive mechanisms taking over. And these simpler mechanisms are exactly the ones we share with animals. So if human irrationality is explained by animal cognitive mechanisms, then humans will not come out as less rational than animals. 

I have a different argument, one that focuses on the importance of imagination in our mental life. I have argued elsewhere that imagination plays a crucial role in making most of our important decisions. Think back to some of the big decisions you have made over the years. Break up with your partner or not? Which college to choose? Go to grad school or not? Which job offer to take? Which house to bid on? And so on. My guess is that you made all of these decisions by imagining yourself in one of the two situations and then imagining yourself in the other and then comparing the two. 

"When we make these grand decisions, we imagine what we imagine to be our future selves in imagined alternative scenarios. Imagination is used three times. And none of these uses of imagination is particularly rational" 

Here is an example from my own past. After college, I was accepted to grad programs in the US and in the UK. I thought, probably correctly, that this choice would have a major impact on my life course and was really struggling with this. I could narrow down the US options to what seemed the best (within the US) and I did the same for the UK options. But deciding between becoming American and becoming British was just too difficult. I imagined myself in Britain, at fancy college dinners, wearing a gown and sipping port. And I imagined myself in California diners in flip flops with an Oreo shake in hand. Not an easy comparison. 

The point is that I really had very little idea about just what situations I would find myself in. So I actually imagined myself in imagined situations – ones that were more informed by films I had seen than by reality. But imaginative episodes of this kind are even more complicated. Imagination is used not even twice, but three times. Let’s suppose I am making this decision now. Who am I imagining in that California diner? My current self will never be there, so imagining my current self would not be particularly helpful. It is my future self who has the chance to hang out in California, but the problem is that we don’t have any firm information about what our future selves will be like. So it is really my imagined future self who I should imagine. 

So things are a bit complicated then: when we make these grand decisions, we imagine what we imagine to be our future selves in imagined alternative scenarios. Imagination is used three times. And none of these uses of imagination is particularly rational. Imagining our future selves is especially unreliable, as we systematically underestimate how much we will change in the future – see the psychological phenomenon of the End of History Illusion. 

And the scenarios we imagine ourselves in have very little to do with the actual situations we would find ourselves in. I spent relatively little time in Cambridge sporting a gown and really almost no time in Californian diners in flip flops (the Oreo shake is another question…). The general point is that because of the irrationality of imagination, we make irrational decisions. And while animal decisions are not maximally rational either, they are at least not (or not to the same extent) influenced by the irrationality of imagination (as it doesn’t seem likely that animals would be capable of this triple embedded imaginative episode). In this sense humans really are more irrational than animals. 

Is this all bad? I don’t think so. The first reason for this is that it is a bit difficult to tell what would be the rational decision in these scenarios. Take again the third use of imagination in decision making: the one concerning the self. It is not really possible to imagine your future self that will be in California or Cambridge because your future self will largely be formed in response to the decision that you’re about to make. So you are not in a position to rationally imagine your future self in order to make a decision, because your future self is a result of this very decision. 

The second reason is that irrationality may not be such a terrible thing. And the irrationality of imagination may actually be a liberating mental facility. As Fernando Pessoa said, “Because I am nothing, I can imagine myself to be anything. If I were somebody, I wouldn’t be able to. An assistant book-keeper can imagine himself to be a Roman emperor; the King of England can’t do that, because the King of England has lost the ability in his dreams to be any other king than the one he is. His reality limits what he can feel.” 



Bence Nanay 

Tuesday, June 18, 2019

The Trump administration's dangerous fever dream about Iran


The Trump administration's dangerous fever dream about Iran

Michael H Fuchs | The Guardian


The Iran debate in Washington is increasingly divorced from reality – and that should worry us
 

National Security Adviser John Bolton 


The Trump administration is caught in a fever dream about Iran, and the fever is becoming dangerous.

In the wake of attacks on oil tankers in the Gulf of Oman – which the US blames on Iran, though questions remain about the attack – the discourse on Iran being pushed by the administration and others is reaching fever pitch. Senator Tom Cotton has called for a “retaliatory military strike”. The New York Times columnist Bret Stephens says the US should threaten to sink Iran’s navy.

This, of course, is not surprising to anyone who has watched the Iran debate in Washington. It’s divorced from reality.

Everyone knows that Iran’s government is dangerous. It represses the Iranian people. It sponsors terrorism across the Middle East. The question is not whether Iran is bad – the question is what the best strategy is to deal with the threats.

Toward the end of George W Bush’s administration and during Barack Obama’s terms, the priority was stopping Iran’s nuclear program. After years of a carefully coordinated global sanctions campaign, the Obama administration secured a deal that stopped Iran’s nuclear program. The International Atomic Energy Agency and even the Trump administration certified that the deal was working. Before, during, and after the deal, the US has countered Iranian terrorism and bolstered Israel’s defenses.

None of it has been good enough for those trapped in the fever dream. The fever dream convinces policymakers to cozy up to regimes from Saudi Arabia to the United Arab Emirates that are themselves awful. It rationalizes support for a devastating war in Yemen. It caused the State Department to fund, until recently, a group that attacked journalists and experts for not being sufficiently anti-Iran. Locked in the fever dream’s grip for years, John Bolton in 2015 called for bombing Iran.

All of this is in lieu of a realistic set of goals or a strategy for actually achieving them. It’s a fever dream, and only those with the fever seem to be able to make sense of the dream. US allies from Europe to Asia are frustrated with how much sway the fever dream has over US policy; they wonder what could possibly drive the US to abandon such a successful deal.

It reminds one of the absurdity and recklessness of those who talked about bombing North Korea in 2017 – which could have started a nuclear war – and how completely they ignored the catastrophic consequences.

Most worrying, it reminds one of the run-up to the war in Iraq in 2003 – how the Bush administration lied about Iraq’s possession of WMD and links to terrorism, and how Americans were told the invasion would be a “cakewalk” and that America would be greeted as a liberator. Reports of a rare, recent high-level meeting on Iran at the CIA before the latest round of warnings over Iran should concern those who remember the politicization of intelligence before the Iraq war.

The result of it all is a heightened risk of conflict. And though the Trump administration claims it does not want war, secretary of state Mike Pompeo reportedly said that the 2001 Authorization to Use Military Force (AUMF) – passed in order to fight those involved in 9/11 – could authorize a war with Iran (which the AUMF does not).

America needs to have a real conversation about Iran. We need to make clear that Iran is not ten feet tall.

We can’t let the fever dream drive US policy in the Middle East. Israel has the region’s strongest military, and with continued US support it can defend itself. America will continue to counter Iranian-sponsored terrorism. America does not need to blindly support countries such as Saudi Arabia and the UAE in the name of countering Iran. These governments repress their own people, stoke regional conflicts, and have funded terrorism and radical indoctrination abroad. When the United States partners with Saudi Arabia by supporting a war in Yemen – supposedly to counter Iran – that creates a humanitarian disaster, America has lost sight of its interests.

To develop an effective strategy, America needs to put itself in Iran’s shoes. After the US invaded two of Iran’s neighbors – Afghanistan and Iraq – tensions got even worse with Iran’s pursuit of a nuclear weapon and deadly fights between Iranians and American soldiers in Iraq. America believes Iran’s aggression has intensified while Iran believes America is surrounding Iran.

Lost in the fever dream is the need for a dialogue with Tehran. A strategy of pressure alongside dialogue produced the nuclear deal. The US should talk to Iran about the future of Afghanistan and Iraq. And there is a broader dialogue to be had – alongside pressure – about regional security.

Iran could do more to push a crisis to the brink as well. With Trump ripping up the nuclear deal, Iran is saying it will enrich uranium beyond the limits of the nuclear deal. If it was responsible for the oil tanker attack, more destabilizing acts could follow. That could lead to a vicious cycle that raises the chances of war.

We need to wake up from the fever dream and have a real debate about Iran, develop a real strategy for Iran, and start a real dialogue with Iran.

Michael H Fuchs is a senior fellow at the Center for American Progress, and a former deputy assistant secretary of state for east Asian and Pacific affairs

Wednesday, June 05, 2019

The Man Who Told America the Truth About D-Day


The Man Who Told America the Truth About D-Day


By David Chrisinger
NY Times


Most of the men in the first wave never stood a chance.

In the predawn darkness of June 6, 1944, thousands of American soldiers crawled down swaying cargo nets and thudded into steel landing craft bound for the Normandy coast. Their senses were soon choked with the smells of wet canvas gear, seawater and acrid clouds of powder from the huge naval guns firing just over their heads. As the landing craft drew close to shore, the deafening roar stopped, quickly replaced by German artillery rounds crashing into the water all around them. The flesh under the men’s sea-soaked uniforms prickled. They waited, like trapped mice, barely daring to breathe. 

A blanket of smoke hid the heavily defended bluffs above the strip of sand code-named Omaha Beach. Concentrated in concrete pillboxes, nearly 2,000 German defenders lay in wait. The landing ramps slapped down into the surf, and a catastrophic hail of gunfire erupted from the bluffs. The ensuing slaughter was merciless. 

But Allied troops kept landing, wave after wave, and by midday they had crossed the 300 yards of sandy killing ground, scaled the bluffs and overpowered the German defenses. By the end of the day, the beaches had been secured and the heaviest fighting had moved at least a mile inland. In the biggest and most complicated amphibious operation in military history, it wasn’t bombs, artillery or tanks that overwhelmed the Germans; it was men — many of them boys, really — slogging up the beaches and crawling over the corpses of their friends that won the Allies a toehold at the western edge of Europe. 

That victory was a decisive leap toward defeating Hitler’s Germany and winning the Second World War. It also changed the way America’s most famous and beloved war correspondent reported what he saw. In June 1944, Ernie Pyle, a 43-year-old journalist from rural Indiana, was as ubiquitous in the everyday lives of millions of Americans as Walter Cronkite would be during the Vietnam War. What Pyle witnessed on the Normandy coast triggered a sort of journalistic conversion for him: Soon his readers — a broad section of the American public — were digesting columns that brought them more of the war’s pain, costs and losses. Before D-Day, Pyle’s dispatches from the front were full of gritty details of the troops’ daily struggles but served up with healthy doses of optimism and a reliable habit of looking away from the more horrifying aspects of war. Pyle was not a propagandist, but his columns seemed to offer the reader an unspoken agreement that they would not have to look too closely at the deaths, blood and corpses that are the reality of battle. Later, Pyle was more stark and honest. 

For days after the landing, no one back home in the States had any real sense of what was happening, how the invasion was progressing or how many Americans were being killed. 

It is nearly impossible to imagine today, but there were no photographs flashed instantly to the news media. No more than 30 reporters were allowed to cover the initial assault. The few who landed with the troops were hampered by the danger and chaos of battle, and then by censorship and long delays in wire transmission. The first newspaper articles were all based on military news releases written by officers sitting in London. It wasn’t until Pyle’s first dispatch was published that many Americans started to get a sense of the vast scale and devastating costs of the D-Day invasion, chronicled for them by a reporter who had already won their trust and affection. 

Before World War II, Pyle spent five years crisscrossing the United States — and much of the Western Hemisphere — in trains, planes and a Dodge convertible coupe with his wife, Jerry, reporting on the ordinary people he met in his travels. He wrote daily, and his columns, enough to fill volumes, were syndicated for publication in local papers around the country. These weren’t hard-news articles; they were human-interest stories that chronicled Americans during the Great Depression. Pyle told stories about life on the road, little oddities and small, heart-lifting triumphs and the misery that afflicted the drought-stricken Dust Bowl regions of the Great Plains. 

Pyle honed a sincere and colloquial style of writing that made readers feel as if they were listening to a good friend share an insight or something he noticed that day. When the United States entered World War II, Pyle took that same technique — familiar, open, attuned to the daily struggles of ordinary people — and applied it to covering battles and bombings. Venturing overseas with American forces in 1942, Pyle reported the war through the eyes of the regular infantrymen on the front lines. He wrote about the food, the weather and the despair of living in slit trenches during the rainy late winter of 1943. He asked the soldiers their names and their hometown addresses, which he routinely included in his articles. Soon millions of readers were following Pyle’s daily column in about 400 daily and 300 weekly newspapers across the United States. In May 1944, Pyle was notified that he had been awarded the Pulitzer Prize for his dispatches. 

On D-Day, as the invasion force fought for the beach, Pyle was trapped just offshore, on a ship transporting tanks. He had boarded with a kit bag heavy with liquor bottles, some good-luck talismans and a Remington portable typewriter. As eager as he was to witness the landing, Pyle wasn’t allowed to go ashore at Omaha Beach until the morning after. For a couple of hours that day, he walked alone on the beach, along the ragged line where the ocean meets the sand, with his eyes trained downward. Weighing just over 100 pounds, Pyle resembled “a short scarecrow with too much feet,” as one Army historian described him.

Puffing on cigarettes and probably drinking a fair amount, Pyle spent the following days pecking away on his typewriter. His readers needed his words to make sense of what “our boys” were enduring in France. After he had written enough material for a few columns, he wondered if his plain-spoken prose would be enough to help anyone back home understand what it was to be contaminated with so much death. 

Pyle’s first column about the D-Day landings, published on June 12, 1944, gave his readers an honest accounting of how daunting the invasion had been — and what a miracle it was that the Allies had taken the beaches at all. “The advantages were all theirs,” Pyle said of the German defenders: concrete gun emplacements and hidden machine-gun nests “with crossfire taking in every inch of the beach,” immense V-shaped ditches, buried mines, barbed wire, “whole fields of evil devices under the water to catch our boats” and “four men on shore for every three men we had approaching the shore.” “And yet,” Pyle concluded, “we got on.” 

Pyle’s intent with this first column seems to have been simple: to elicit appreciation for the huge achievement and gratitude for “those both dead and alive” who had clawed their way up the beaches and taken down the enemy. 

This kind of dispatch was well-trod ground for Pyle, whose wartime columns tended to omit certain facts on the ground and reassure readers back home that the Allies were on the path to eventual victory. Tell the truth of it but offer reassurance too. Pyle used this same strategy when he began covering the war in 1940, and it served him well when he followed inexperienced American troops into ground combat in North Africa in 1942 and 1943, only to see them battered by the German army. After 1,600 men were killed or wounded by Germans in a trap at Sidi Bou Zid in Tunisia, Pyle described the withdrawal of the remaining American forces as “a majestic thing.” Describing the fast-moving convoys of trucks and tanks, he wrote, “it was carried out so calmly and methodically” that it “was hard to realize, being a part of it, that it was a retreat.” He didn’t mention the 100 American tanks that were destroyed, or the loss in confidence the rank-and-file soldiers were feeling toward their command. Though he didn’t entirely whitewash the American defeat, which he called “damned humiliating,” Pyle’s artful narrative lent purpose and dignity to events that perhaps should have been probed more critically. 

Pyle’s second report from the Normandy beaches, published 10 days after D-Day, was markedly different from anything he had ever previously filed. “It was a lovely day for strolling along the seashore,” he wrote, reeling the reader in with a cheerful opening. “Men were sleeping on the sand, some of them sleeping forever. Men were floating in the water, but they didn’t know they were in the water, for they were dead.” Pyle cataloged the vast wreckage of military materiel, the “scores of tanks and trucks and boats” resting at the bottom of the Channel, jeeps “burned to a dull gray” and half-tracks blasted “into a shambles by a single shell hit.” Some reassurances followed to soften the unvarnished fact — the losses were an acceptable price for the victory, Pyle said — but he hadn’t shied away from showing his readers the corpses and “the awful waste and destruction of war.” Pyle was working up to something he hadn’t done before. 

The next day, June 17, newspapers across the country published Pyle’s third column describing the D-Day beachhead. By allowing the objects he saw in the sand to tell an eloquent story of loss, Pyle showed his readers the true cost of the fighting, without explicitly describing the blood and mangled bodies. “It extends in a thin little line, just like a high-water mark, for miles along the beach,” Pyle wrote about the detritus of the battle. “Here in a jumbled row for mile on mile are soldiers’ packs. Here are socks and shoe polish, sewing kits, diaries, Bibles and hand grenades. Here are the latest letters from home. . . . Here are toothbrushes and razors, and snapshots of families back home staring up at you from the sand. Here are pocketbooks, metal mirrors, extra trousers and bloody, abandoned shoes.” 

Pyle often included himself in his stories, addressing his readers directly and letting them see him in the scene, a reassuring presence who was keeping his eye on things for them, reducing sprawling events to their digestible essentials. But here Pyle depicted himself as stunned and confused — a dazed witness to gambles and losses on a scale that nobody could comprehend. “I picked up a pocket Bible with a soldier’s name in it, and put it in my jacket,” he wrote. “I carried it half a mile or so and then put it back down on the beach. I don’t know why I picked it up, or why I put it back down.” 

By the end of the column, Pyle’s readers were confronted with outright horror: “As I plowed out over the wet sand of the beach,” Pyle wrote, “I walked around what seemed to be a couple of pieces of driftwood sticking out of the sand. But they weren’t driftwood. They were a soldier’s two feet. He was completely covered by the shifting sands except for his feet. The toes of his G.I. shoes pointed toward the land he had come so far to see, and which he saw so briefly.” 

Omaha Beach had some of the fiercest fighting of the invasion. Pyle came ashore here the next day and walked alone on the beach. 

This was a different Ernie Pyle from the one millions of Americans knew from the newspapers that kept them company at the breakfast table or on the train home in the evening. If his reporting before D-Day was aimed at comforting the disturbed readers back home with optimism and tales of the soldiers’ endurance, his reporting from the beaches of Normandy was aimed at disturbing the comfortable. 

To his own surprise, his dispatches about D-Day’s losses were not met with rejection or censorship. In addition to the newspapers that ran his columns, Life magazine requested permission to run an excerpt, and radio programs quoted Pyle in commercials imploring listeners to buy war bonds. In Washington, two of the columns were reprinted in the official Congressional Record. “It’s getting so you can’t pick up any damned publication at all without seeing you mentioned,” Lee Miller, Pyle’s editor, wrote to the reporter on June 19, 1944. 

Until D-Day, war had largely been an exhilarating experience for Pyle, terrible but often uplifting. Ten days after the landings, the awfulness of all the death he was witnessing in the “thousands of little skirmishes” in the hedgerow country of Normandy was carving away at his mental state. He reported having knots in his stomach from “constant tenseness and lack of sleep.” In a letter back home, he confided that he had to “continually fight an inner depression over the ghastliness of it all.” “Sometimes,” he wrote to Miller on June 29, “I get so obsessed with the tragedy and horror of seeing dead men that I can hardly stand it. But I guess there’s nothing to do but keep going.” 

Less than two weeks after witnessing the jubilant liberation of Paris, Pyle wrote his final column from Europe. “I’m leaving,” he told his readers. “ ‘I’ve had it,’ as they say in the Army. I have had all I can take for a while.” After spending 29 months overseas, writing around 700,000 words about the war and surviving nearly a year at the front lines, Pyle confided that his spirit was faltering and confused. “I do hate terribly to leave right now, but I have given out,” he wrote. “I’ve been immersed in it too long. The hurt has finally become too great.” 

Pyle returned home to New Mexico. After a few months back in the United States, overwhelmed by mountains of mail, invasions of his privacy and his wife’s attempted suicide, Pyle’s dread of war was outweighed by his unease in civilian life. Life on the front line was simpler. Pyle missed it. Shortly before Christmas 1944, he began making final preparations to report to the Pacific, where American forces were “island hopping” their way toward Japan. 

The grim view of the war that overtook Pyle in Normandy — the sense that perhaps the losses were simply beyond bearing — seemed to follow Pyle to the Pacific, but it showed up differently in his reporting there. Interviewing bomber pilots on islands far from the fighting and sailors on Navy ships who seemed safe and comfortable compared with infantrymen on the front lines, Pyle felt that he was seeing a softer, easier war, and he let it show. “The days are warm and on our established island bases the food is good and the mail service is fast and there’s little danger from the enemy,” he wrote in a column titled “Europe This Is Not.” Worried that he wasn’t doing his part for the war effort, Pyle arranged to go with the Marines when they landed on Okinawa, where the fighting was expected to be intense. It was no D-Day — the Japanese had retreated inland, and Pyle was amazed to see a beach landing with no carnage — but the Marines soon found themselves mired in bitter fighting for every hill and cave. On April 18, 1945, 20 days before the war in Europe ended, Pyle was shot through the left temple by a Japanese machine-gunner and died instantly in a ditch on the tiny island of Ie Shima, off the northwest coast of Okinawa. 


Before Pyle’s body was buried under a crude marker in the 77th Division’s cemetery, a draft of a column he was writing was discovered in his pocket. It was not so much a dispatch as it was a meditation on the end of the war. “Last summer,” Pyle said, “I wrote that I hoped the end of the war could be a gigantic relief, but not an elation. In the joyousness of high spirits it is so easy for us to forget the dead.” That was a relief that he knew was simply unavailable to many and a forgetting that shouldn’t be allowed to any. 

The draft went on: “There are so many of the living who have had burned into their brains forever the unnatural sight of cold dead men scattered over the hillsides and in the ditches. . . . Dead men in such familiar promiscuity that they become monotonous. Dead men in such monstrous infinity that you come almost to hate them. Those are the things that you at home need not even try to understand. To you at home they are columns of figures, or he is a near one who went away and just didn’t come back. You didn’t see him lying so grotesque and pasty beside the gravel road in France. We saw him, saw him by the multiple thousands. That’s the difference.” 

During his four years as a war correspondent, Pyle was embraced by enlisted men, officers and a huge civilian public as a voice who spoke for the common infantryman. With his trauma in France, he had become one of them. After sharing so much of their experience, he understood how gravely war can alter the people who have to see it and fight it and live it. He knew that the survivors can come home with damage that is profound, painful and long-lasting. It was a truth that he found hard or even impossible to communicate to the readers back home — and it is a truth that is still difficult and troubling now, 75 years after D-Day. 

We accept that our wars are different now — more scattered, seemingly never-ending, against a more diffuse and elusive enemy — but those wars are still presented with the promise that we are fighting for our way of life or the survival of our values, and that we’ll enjoy greater peace and security when those wars are won. War reporting has become more honest and unsparing about tallying the death toll — at least on our side — but politicians making the case for deployments and invasions still don’t invite the public in advance to decide whether the promised benefits will be worth the losses. 

Seeing and reporting the vast losses on the beach at Normandy and watching war’s meat grinder in action in the vicious battles that followed, Pyle was evidently forced to recalculate the arithmetic of victories and losses. By the time he was killed, 10 months later and on the opposite side of the world, the lesson seemed to have solidified for him. Not even the war ending, not even victory — which his previous reporting usually kept in sight as the great goal of the war — would be able to bring back all the people killed or counteract the damage done to the survivors. Pyle had written about battles and war in a way that promised hope. By the time victory was actually in sight, he had come to feel that there was no way the war could be a story with a happy ending.