Saturday, March 21, 2020

U.S. economy deteriorating faster than anticipated as 80 million Americans are forced to stay at home 


Already, it is clear that the initial economic decline will be sharper and more painful than during the 2008 financial crisis 

David J. Lynch and Heather Long, Washington Post

The U.S. economy is deteriorating more quickly than was expected just days ago as extraordinary measures designed to curb the coronavirus keep 84 million Americans penned in their homes and cause the near-total shutdown of most businesses. 

In a single 24-hour period, governors of three of the largest states — California, New York and Illinois — ordered residents to stay home except to buy food and medicine, while the governor of Pennsylvania ordered the closure of nonessential businesses. Across the globe, health officials are struggling to cope with the growing number of patients, with the World Health Organization noting that while it required three months to reach 100,000 cases, it took only 12 days to hit another 100,000. 

The resulting economic meltdown, which is sending several million workers streaming into the unemployment line, is outpacing the federal government’s efforts to respond. As the Senate on Friday raced to complete work on a financial rescue package, the White House and key lawmakers were dramatically expanding its scope, pushing the legislation far beyond the original $1 trillion price tag.

With each day, an unprecedented stoppage gathers force as restaurants, movie theaters, sports arenas and offices close to shield themselves from the disease. Already, it is clear that the initial economic decline will be sharper and more painful than during the 2008 financial crisis. 

Next week, the Labor Department will likely report that roughly 3 million Americans have filed first-time claims for unemployment assistance, more than four times the record high set in the depths of the 1982 recession, according to Bank of America Merrill Lynch. That is just the start of a surge that could send the jobless rate spiking to 20 percent from today’s 3.5 percent, a JPMorgan Chase economist told clients on a conference call Friday. 

Estimates of the pandemic’s overall cost are staggering. Bridgewater Associates, a hedge fund manager, says the economy will shrink over the next three months at an annual rate of 30 percent. Goldman Sachs pegs the drop at 24 percent. JPMorgan Chase says 14 percent. 
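
What does “an annual rate of 30 percent” mean for a single quarter? Annualized figures compound a three-month change over a full year, so the implied quarterly drop is much smaller than the headline number. Here is a minimal sketch of that conversion, assuming the forecasts follow the standard compounding convention used for annualized U.S. GDP figures (the article does not spell out the convention, so that is an assumption on my part):

    # Convert an annualized rate of contraction into the implied single-quarter decline,
    # assuming the standard compounding convention for annualized U.S. GDP figures.
    def quarterly_decline(annual_rate):
        return 1 - (1 - annual_rate) ** 0.25

    for firm, rate in [("Bridgewater", 0.30), ("Goldman Sachs", 0.24), ("JPMorgan Chase", 0.14)]:
        print(f"{firm}: about {quarterly_decline(rate):.1%} smaller by the end of the quarter")
    # Bridgewater: about 8.5%; Goldman Sachs: about 6.6%; JPMorgan Chase: about 3.7%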

“We are looking at something quite grave,” said economist Janet L. Yellen, the former Federal Reserve chair. “If businesses suffer such serious losses and are forced to fire workers and have their firms go into bankruptcy, it may not be easy to pull out of that.” 

Little more than seven months before the presidential election, President Trump already is looking past the crisis and promising a swift recovery. “We’re going to be a rocket ship as soon as this thing gets solved,” he said Thursday. “… We think it’s going to come back really fast.” 

Most economists expect the economy to begin climbing out of its deep hole in the second half of this year. But those forecasts depend upon the pandemic being brought under control and the United States and other governments enacting policies that prevent lasting harm to factories and financial arteries. Even if all that happens, the economy will be smaller at the end of this year than it was at the beginning, according to Bridgewater, Goldman and JPMorgan. 

The truth is no one knows what will happen months from now. No one on Wall Street or in Washington has any experience dealing with the kind of complex threat that has suddenly materialized to upend American life — a global health scare that is strangling the economy and disrupting financial markets. 

Individual workers and their families — many only recently recovered from the economic cataclysm of 2008 and 2009 — are already feeling the effects. The unexpected economic shock has put millions of Americans living on the precipice of ruin. 

In a Fed survey last year, 39 percent of Americans said they would be unable to handle an unexpected $400 expense. 

Lyndsy Hartmann knew something was wrong last weekend when she went to her job at a spa company in Charlottesville. Hartmann, 34, normally handles 150 calls a day from people interested in booking spa treatments and massages. But on Saturday and Sunday, she received a total of six. 

Her boss laid off most of the staff Monday morning. Tuesday, it was Hartmann’s turn. 

Her husband, who works as a personal chef, also has lost his job. He had a side gig driving for Uber, but the college students who were his main customers disappeared earlier this month when the virus forced the University of Virginia to close. Now, the couple is worried about paying their $1,300 apartment rent. 

“I don’t have savings. My rent is so high that I live paycheck to paycheck,” Hartmann said. “Everyone is talking about how the coronavirus is affecting restaurants, but it’s affecting hotels, spas and hospitality in general. I don’t know how I’m going to pay rent and pay my bills." 

The situation is much the same in Mount Clemens, a suburb of Detroit. Steve Manzo is a cook and bartender at an Irish pub who began the week anticipating double pay for St. Patrick’s Day. Instead, he lost his job Monday and spent hours on the state’s unemployment website trying to apply for government aid that will make up less than half of his typical $60,000 annual income. 

Like many, Manzo lives paycheck to paycheck and is worried he might not even have a restaurant to return to if the pandemic endures. 

“I keep looking out my window and everything looks normal, but it doesn’t feel normal,” Manzo said. “I live downtown with bars and restaurants, and nobody is here.” 

Even Yellen, 73, has not escaped the anxiety. She is hunkered down in her Washington home running economic forecasts that all look grim. Her son, who flew home from London on Saturday and was caught in the mass of people at passport control at Dulles International Airport, is self-quarantining in the family’s basement to avoid endangering his mother. Yellen, one of the world’s top economists, leaves plates of food for him at the top of the basement stairs. 

Senate consideration of a financial rescue of more than $1 trillion came amid mounting signs of economic distress. In a joint letter, United Airlines chief executive Oscar Munoz and union officials warned of imminent layoffs and urged the airline’s nearly 100,000 employees to lobby Congress for help. 

“While many in Washington, D.C. now realize the gravity of this situation, time is running out,” the letter said. “… To be specific, if Congress doesn’t act on sufficient government support by the end of March, our company will begin to take the necessary steps to reduce our payroll in line with the 60% schedule reduction we announced for April. May’s schedule is likely to be cut even further.” 

Lawmakers’ willingness to countenance massive spending and corporate bailouts in response was one sign that Americans increasingly recognize the battle against the coronavirus involves a stark choice between health and wealth. 

U.S. officials are deliberately engineering a sharp recession to blunt the spread of a fatal disease for which there is no cure, definitive treatment or vaccine. Governors in Florida, New Jersey and Nevada on Friday all took or indicated they would take significant action to limit activity outside the home. 

The only way to interrupt the viral blitzkrieg is to implement social distancing. But keeping friends and co-workers apart is a recipe for crippling the consumer spending that drives 70 percent of the $21 trillion economy. 

Alexis J. Walker, 40, earned about $60,000 last year with a series of jobs — working a part-time job at an after-school program in Philadelphia, which ended a week ago; substitute teaching; working as a nanny; DJ-ing at parties; and selling homemade shea butter — and occasionally renting her apartment on Airbnb. 

Last weekend, Walker had $500 worth of babysitting and DJ gigs lined up, but all her customers canceled, along with her entire lineup of Airbnb guests this spring. Her landlord is letting her delay her rent payment, but she knows it will eventually come due. 

“Those bills still have to be paid. My debts keep accruing,” said Walker, who hopes to establish a virtual after-school program to keep some money coming in. “What will send this country into a depression is all the bills people will have to pay in two months.” 

Dean Baker, senior economist at the Center for Economic and Policy Research, said a depression would involve a more protracted period of unusually low activity than what seems likely. 

“Everything depends on how long we are effectively locked down,” he said. “If it’s just two to three weeks, then it need not be this bad. But if we locked down for most of the quarter, then we are looking at a really bad story.” 

The sudden turnabout in U.S. economic fortunes is without historic parallel. As 2020 began, the U.S. economy had been expanding without interruption since the middle of 2009. The jobless rate was near a half-century low, and the stock market was headed toward a record high. 

Now, the economy is screeching to a halt and the stock market is in free fall. On Thursday, the Big Three automakers said they would close their factories through March 30. Real estate agents have canceled open houses. Marriott, the largest hotel company in the world, is closing its hotels and furloughing thousands of workers. 

In a video message to employees, CEO Arne Sorenson said he would not take any salary for the rest of the year. The pandemic “is like nothing we’ve ever seen before,” he said. 

The sudden loss of hotel business exceeded the combined effects of the Great Recession and the aftermath of the 9/11 terrorist attacks. On average, hotel revenue is down 75 percent, which requires draconian retrenchment. 

“I can tell you that I have never had a more difficult moment than this one,” said an emotional Sorenson. “There is simply nothing worse than telling highly valued associates, people who are the very heart of this company, that their roles are being impacted by events completely out of their control.” 

On Wall Street, the bloodletting has continued almost without interruption. After a brief respite Thursday, stocks sank again on Friday. The Dow Jones industrial average closed at 19,173.98, down more than 913 points, or 4.6 percent. Since reaching an all-time high on Feb. 12, the Dow has lost 35 percent of its value. 
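
As a rough consistency check, the percentages can be recovered from the point figures in the preceding paragraph alone; the prior close and the February 12 peak below are back-calculated from those figures rather than quoted from market data, and the 913.21-point drop is an assumed exact value behind “more than 913 points”:

    # Back out the prior close and the implied Feb. 12 peak from the figures reported above.
    close = 19_173.98          # Friday's close
    points_lost = 913.21       # assumed exact value behind "more than 913 points"
    prior_close = close + points_lost
    print(f"One-day drop: {points_lost / prior_close:.2%}")    # ~4.55%, in line with the reported figure
    implied_peak = close / (1 - 0.35)                          # from "lost 35 percent of its value"
    print(f"Implied Feb. 12 peak: about {implied_peak:,.0f}")  # roughly 29,500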

“Get out, get out while you can before they shutter the whole darn United States,” Chris Rupkey, chief financial economist for MUFG, wrote in a note to clients. “A stunning reversal of fortune for the best economy in history to the worst economy in history in not even two months. The fastest recession in history. With no one spending a dime, it will stay that way a long, long time.” 

For small-business owners, there is palpable fear of bankruptcy. 

Bryan DeHenau already has filed for bankruptcy once before, when the construction industry crashed amid the Great Recession. In the years since then, he’s built up a roofing business in hopes of developing a more recession-proof income stream. 

Six weeks ago, he said, he had work “coming out the ying-yang.” Now business is down 50 percent and he worries about obtaining enough roof shingles for his remaining customers or dealing with a potential lockdown if Michigan follows the lead of other states. 

“We’re just trying to manage day to day. It’s all you can do,” DeHenau said. “But if everybody is laid off, they aren’t going to be buying a roof.”

Friday, March 13, 2020

Reality Arrives to the Trump Era

By Andrew Sullivan, New York Magazine


Plagues routinely start with denial. In his great novel, The Plague, Albert Camus describes a scene at the very beginning, after several rats in a town started dying identical deaths: 

‘These rats, now?’ the magistrate began. [Doctor] Rieux made a brief movement in the direction of the train, then turned back toward the exit. ‘The rats?’ he said. ‘It’s nothing.’ The only impression of that moment which, afterwards, he could recall was the passing of a railroad man with a box full of dead rats under his arm. 

This is not to excuse the negligence of the Trump administration and the CDC. But it helps explain it. Plagues are such an enormous disruption of regular life that it is always hard to accept that we are engulfed in one. This is why plagues, of course, always tend to have the advantage over people. Soon enough, however, the direness of the situation began to set in: 

In a very few days the number of cases had risen by leaps and bounds, and it became evident to all observers of this strange malady that a real epidemic had set in … Everybody knows that pestilences have a way of recurring in the world; yet somehow we find it hard to believe in ones that crash down on our heads from a blue sky … In this respect our townsfolk were like everybody else, wrapped up in themselves … They went on doing business, arranged for journeys, and formed views. How should they have given a thought to anything like plague, which rules out any future, cancels journeys, silences the exchange of views. 

Those of us who have already been through a plague experience in our lives know this all too well. As the clear signs of a new and deadly epidemic began to emerge among gay men in the early 1980s, most people ignored or downplayed or even joked about it, and many of those most at risk shut their eyes. 

Randy Shilts, in his epic tale of this nightmare, And the Band Played On, relays the first guidance from the American Association of Physicians for Human Rights: “Sensitive to concerns that the group not be ‘sex-negative,’ the guidelines assured gay men that there was nothing wrong with having sex, but they should check their partners for KS lesions, swollen lymph nodes, and overt symptoms of AIDS.” Even the Gay Men’s Health Crisis in New York — an activist group formed to confront the reality of this new plague — put “the accumulated wisdom of homosexual physicians in one phrase: ‘Have as much sex as you want, but with fewer people and HEALTHY people.’” Even though it was by then clear that asymptomatic carriers were just as capable of transmitting the virus, denial was too strong. 

In San Francisco in early 1983, epidemiologists’ estimates bore a curious resemblance to the CDC’s numbers now. After the first quarter’s AIDS incidence report came out, Shilts writes: 

Dr. Andrew Moss concluded that ‘in some cohorts of gay men in San Francisco, AIDS incidence rates in the thirty and forty year old groups are now of the order of 1 to 2 percent.’ Only later would studies show that by this time in 1983, the 62 percent of gay men who still engaged in risky behavior had at least a 25 percent chance of being intimate with someone infected with the new virus. 

So the estimate was off by a factor of ten or more, and that pattern of official undercounting informed my decision to self-isolate a week ago. 

Bathhouses — which facilitated even higher rates of transmission — stayed open. The gay press needed the ads from the bathhouses, and the bathhouses were profitable; and the liberationist culture that had only recently emerged simply could not concede that liberation, in this instance, was laced with death. 

The same denialism can be seen in Camus: 

That the regulations now in force were inadequate was lamentably clear … The only hope was that the outbreak would die a natural death; it certainly wouldn’t be arrested by the measures the authorities had so far devised … There was enough for immediate requirements, but not enough if the epidemic were to spread. 

Which is the case with ICU beds right now in the U.S. Even when the deaths mounted in The Plague, the public resisted facing the reality: 

Our townsfolk apparently found it hard to grasp what was happening to them. There were feelings all could share, such as fear and separation, but personal interests, too, continued to occupy the foreground of their thoughts. … It was only as time passed and the steady rise in the death-rate could not be ignored that public opinion became alive to the truth … These figures, anyhow, spoke for themselves. Yet they were still not sensational enough to prevent our townsfolk, perturbed though they were, from persisting in the idea that what was happening was a sort of accident, disagreeable enough, but certainly of a temporary order. 

“A lot of people think that goes away in April, with the heat,” President Trump said on February 10. “It’s going to disappear one day, it’s like a miracle,” he said over two weeks later. “It will go away, just stay calm,” he insisted as recently as this past Tuesday. Many of his supporters declared the epidemic a hoax, or insisted it was nothing more than the regular flu — even though it is estimated to be at least ten times as lethal. Yes, these denialist declarations are driven by tribal politics. But they exist beyond the Trump cult, and are also propelled by the ancient human resistance to accepting that our normal lives are over, that we live in a new paradigm, and there is no escaping it. 

It’s like watching a movie when the screen suddenly and unaccountably slips out of focus, or keeps freezing for a few seconds, and you wait for the reel to be corrected, or get back to where it was, but it doesn’t. After a while, you begin to realize that this is the movie, that you will have to learn to watch it in a new way, and that waiting for a return to normal is a delusion — a very human delusion, but false nonetheless. 

It is rare that the authorities act swiftly enough and drastically enough to stop a plague from growing. Even with the difficult-to-catch HIV retrovirus, by the time it was very clear that the best course of action was no sex or very safe sex, the die was cast. Plagues are dynamic things and are fueled by complacency. With this coronavirus, which is far, far easier to catch, we had obvious warning signs from China, but assumed a travel ban would keep the U.S. safe. We had a chance to roll out WHO testing kits, to ensure that if there were an outbreak in the U.S., it could be contained. But the Trump administration decided to produce an American version of the test, which was screwed up by errors, delaying it for weeks. And so we had no real grip on the spread or incidence of the virus, which is asymptomatic in most cases to begin with. We had no idea where it was, and we still don’t. It might have been possible to contain the illness even a few weeks ago. But we were flying as blind as the authorities in 1918 — even with 21st century technology. So now we have a pandemic that can only be managed rather than stopped. 

And this is not entirely a function of the Trump administration’s incompetence. Look at Italy. What’s needed is a set of draconian measures at a time when the epidemic is still small, and normal life is in full swing. But in a period when relative normalcy still prevails, such draconian measures will inevitably seem completely panicky to most, slowing economic activity and growth and making a government instantly unpopular. In Western democracies, this makes a plague far harder to stop. Appeasement of plagues, like appeasement of dictators, never works. 

President Trump is not the only complacent figure. In Britain and Germany, Prime Minister Boris Johnson and Chancellor Angela Merkel have all but resigned themselves to an inevitable culling of the population, and have imposed few draconian measures by fiat. Only yesterday, Johnson was unwilling to shut down soccer matches — before the soccer authorities decided to do it on their own. 

And if you want to see a classic example of how a virus spreads, just look at the House of Commons, where the entire political class crams into a tiny space cheek by jowl — even after several members of Parliament, and the Health Secretary, have already fallen ill. Watch this video of a health minister coughing and spluttering at the despatch box. It’s madness. But the alternatives — a suspension of Parliament; measures to end all public gatherings, restaurants and bars, and theater productions; mandatory self-quarantining for everyone, sick or well, for a couple of weeks — seemed utterly bonkers even a few days ago. But they would have helped a lot a month ago. 

With Trump, we have a deeper crisis, of course. Trump is incapable of admitting error, numb to any form of empathy, narcissistic even in a communal crisis, and immune to any kind of realism. He simply cannot tell anyone bad news. And he cannot keep a story straight, which is essential for public health. His only means of communication is deceptive salesmanship. He defunded the federal body designed to tackle such emergencies, and his Cabinet is packed with incompetence, corruption, and fealty. He cannot summon trust among at least half the country, and he has willfully destroyed confidence in the public institutions we desperately need to get through this. 

In this, he is a typical man-at-the-bar pontificator, or shock-jock tweeter, whose strange theories are matched only by his own refusal to be tested for the virus, even though we now know he has been exposed. He is in charge of public health but can still blithely say something completely untrue — like everyone coming into the U.S. is being tested, or that anyone who wants a coronavirus test can get one, to give two damning examples. Rather than concede a failure, Trump will always lie. He is utterly unfit to be president, and always has been. We had a chance to remove him from office before a catastrophe struck, but the Senate kept him in power. This is their responsibility too. 

It’s still unfair to blame all this on one man, when we have all been complacent because we are human, and the way we have responded is almost exactly how almost every community in the past has responded as plagues set in. But from here on out, we have to grapple with the fact that we are on our own. Trump is singularly incapable of addressing this credibly or effectively, with anything like the right mix of realism and hope the crisis demands. 

He is immune to data, resistant to any facts that might suggest his own administration’s failure, and his prime-time address was deeply unsettling and off-kilter. We have been so, so lucky to have avoided a major crisis for the last three years, but our luck has now run out. We can rarely halt a plague, but we can manage one with the least human collateral damage. It seems to me that we may be headed, instead, for another 1918, mitigated only by antibiotics to deal with the bacterial infections that a century ago piggybacked on viral infections and multiplied the victims. 

The only thing we now know for certain is that a description of this era as surreal is now out of date. At some point, reality was bound to step in, of course, and it’s been quite amazing how long we have been able to postpone it. But this is now as real as it gets. And it is just the beginning. 

The Right Look 

If I were to point out a possible future for a sane post-Trump conservatism, I’d point to one Rishi Sunak. I don’t blame you for not knowing his name yet, but he recently became Boris Johnson’s chancellor of the exchequer, responsible for Britain’s finances and economy, the second most powerful figure in the British government. He’s the grandson of Indian immigrants to the U.K., went to Oxford and Stanford, and became a hedge-funder. First elected to Parliament in 2015, his handsome smile has been matched only by the perfect fit of his suits. He’s said to have a personal fortune of $250 million, helped by marrying the daughter of one of the richest men in India. A member of a thriving minority group in the U.K., he’s a Hindu, known for lavish parties on his Yorkshire estate, and called by the locals the Maharajah of the Dales. And he isn’t yet 40 years old. 

But what really marks him is his abandonment of Thatcherism. He unveiled a budget earlier this week that borrowed at levels no Tory has supported in decades. He pledged a massive infrastructure investment of up to $750 billion over five years. He made no attempt to lower the debt, the key aim of the Cameron and May governments before Johnson. The Financial Times noted that “the youthful chancellor delivered not just an end to austerity. He marked a shift from the low-tax, low-spending Conservatism of Margaret Thatcher towards a new rightwing model, embracing higher spending funded by borrowing. The goal of returning national debt to levels seen before the financial crisis has been jettisoned.” At the same time, “investment in roads, rail, housing, broadband and capital projects as a proportion of the economy will rise to levels not seen since the 1970s.” 

Toryism is an intrinsically adaptive political instinct. The Conservative Party shifts right and left as time passes, always eager to hold power, and rarely as ideological as during the Thatcher era. Sunak is also an immigrant success story, the son of small business owners, and a supporter of Brexit. In one stroke, he defies the idea that a post-Brexit U.K. will be reactionary or racist. He co-authored a think-tank study, “A Portrait of Modern Britain,” on British racial minorities and how to empower them. And he is proving how much easier it is for the right to turn left on economics than for the left to turn right on culture. 

Yes, the future fiscal outlook is grim — Sunak spent the first third of his budget speech on measures to contain the coronavirus — and disentangling from the E.U. is bound to be economically hazardous. But as a package for a contemporary conservatism, he’s hard to beat. 

A Welcome Concession by the New York Times 

It took them many months, but it’s a good thing that the editor, Jake Silverstein, and primary author, Nikole Hannah-Jones, of the New York Times’ 1619 Project have finally conceded that they did make a mistake in claiming that the retention of slavery was a primary reason for the American revolution. Or in Hannah-Jones’s even stronger words: “We may never have revolted against Britain if the founders had not understood that slavery empowered them to do so; nor if they had not believed that independence was required in order to ensure that slavery would continue.” 

A special issue of the magazine last year had been devoted to affirming that the true origin of the United States was 1619, when slaves were first brought to these shores, rather than 1776, when the colonies won independence from Britain. It was a project to place white supremacy, rather than individual liberty, as the core meaning of America, and to show how little had changed in this respect over the centuries. It claimed that it was telling the truth about America for the first time. It was designed to reveal, in Hannah-Jones’s words, “our true identity as a country and who we really are.” 

Silverstein’s concession is a marked shift from his position back in December, when he was adamant that he would not concede anything to the many historians who had criticized the project, especially over Hannah-Jones’s assertion about slavery’s centrality as a motivation for the Revolution. He wrote: 

“We disagree with [the historians’] claim that our project contains significant factual errors and is driven by ideology rather than historical understanding. While we welcome criticism, we don’t believe that the request for corrections to The 1619 Project is warranted … I think it would be useful for readers to hear why we believe that Nikole Hannah-Jones’s claim that ‘one of the primary reasons the colonists decided to declare their independence from Britain was because they wanted to protect the institution of slavery’ is grounded in the historical record.” 

Silverstein went on to note outrage among some southerners when an American slave, brought back to England, was granted freedom once he got there, and when the British offered freedom to slaves who joined the British army. And this was one real aspect of the broader complaints of the colonists. 

But now, Silverstein has had to “recognize that our original language could be read to suggest that protecting slavery was a primary motivation for all of the colonists. The passage has been changed to make clear that this was a primary motivation for some of the colonists.” But the larger claim that the desire to retain slavery was the motivation of the Founders themselves remains uncorrected in the essay. Even now, Silverstein insists that this is merely a “clarification,” and not a “correction” — and it feels to me like a compromise in an internal New York Times’ argument. Hannah-Jones, for her part, defends herself by saying that “writing sweeping passages of history is not easy, and sometimes there is a tension between journalistic inclinations & historical ones.” 

All of this is welcome, and Hannah-Jones and Silverstein did the right thing. I hope more such “clarifications” follow. But it seems to me that the real tension here was not between journalistic inclinations and history but between ideological inclinations and history. The entire point of the 1619 Project, after all, was to “reframe” American history, to make 1619 its core beginning. And it was to buttress that argument that Hannah-Jones and Silverstein wildly overstated the salience of white supremacy to American independence. 

And look, educating people about the brutal horrors of the slavery regime, as uncovered by recent historians, and the staggering cognitive dissonance and hypocrisy of many of the Founding Fathers is only a good thing. The hideous reality of slavery has been euphemized too long. But the upping of the ideological ante, the decision to call the issue a “project,” the placing of slavery at the center of the revolution, and the intent to deploy it as simple, incontrovertible, historical truth to schoolkids takes things much further. 

It is, in fact, history as filtered through the ideology of critical race theory, which regards the entire American experiment as an exercise in racial domination, deliberately masked by rhetoric about human freedom and equality. And that’s fine in the pages of, say, The Nation or Jacobin — explicitly political journals of opinion and polemic. But the paper of record and the Pulitzer Board, both of which sponsored and promoted the issue, are surely different. They aspire to factual, honest journalism — not ideological reframing, repackaged as empirical reality. They imply a liberal view of the world, in which the race of authors is far less important than the cogency of what they have to say, in which history is not predetermined by analyses of “structural oppression,” but by fact and contingency. The Times is supposed to be more about empiricism than activism. And to be fair, there was a lot of good journalism in the issue — it was just skewed and distorted to fit into a single argument about America’s racist “DNA.” 

Maybe that old empirical era is over — as the activist staffers impose their ideological project on the old guard. Or maybe this “clarification” is a sign that certain standards have not entirely been abolished, and that liberalism itself has not been thrown completely out of the Gray Lady’s windows. Maybe this final, reluctant concession is a sign that someone at the Times realized what they had done, and pushed to undo just a little bit of the distortion. Which is, in my mind, a reason for hope.

Sunday, March 01, 2020

How the Horrific 1918 Flu Spread Across America



The toll of history’s worst epidemic surpasses all the military deaths in World War I and World War II combined. And it may have begun in the United States 

An emergency hospital at Camp Funston, Kansas, 1918. “Of the 12 men who slept in my squad room, 7 were ill at one time,” a soldier recalled. (New Contributed Photographs Collection / Otis Historical Archives / National Museum of Health and Medicine) 

By John M. Barry, Smithsonian Magazine

Haskell County, Kansas, lies in the southwest corner of the state, near Oklahoma and Colorado. In 1918 sod houses were still common, barely distinguishable from the treeless, dry prairie they were dug out of. It had been cattle country—a now bankrupt ranch once handled 30,000 head—but Haskell farmers also raised hogs, which is one possible clue to the origin of the crisis that would terrorize the world that year. Another clue is that the county sits on a major migratory flyway for 17 bird species, including sandhill cranes and mallards. Scientists today understand that bird influenza viruses, like human influenza viruses, can also infect hogs, and when a bird virus and a human virus infect the same pig cell, their different genes can be shuffled and exchanged like playing cards, resulting in a new, perhaps especially lethal, virus. 

We cannot say for certain that that happened in 1918 in Haskell County, but we do know that an influenza outbreak struck in January, an outbreak so severe that, although influenza was not then a “reportable” disease, a local physician named Loring Miner—a large and imposing man, gruff, a player in local politics, who became a doctor before the acceptance of the germ theory of disease but whose intellectual curiosity had kept him abreast of scientific developments—went to the trouble of alerting the U.S. Public Health Service. The report itself no longer exists, but it stands as the first recorded notice anywhere in the world of unusual influenza activity that year. The local newspaper, the Santa Fe Monitor, confirms that something odd was happening around that time: “Mrs. Eva Van Alstine is sick with pneumonia...Ralph Lindeman is still quite sick...Homer Moody has been reported quite sick...Pete Hesser’s three children have pneumonia...Mrs J.S. Cox is very weak yet...Ralph McConnell has been quite sick this week...Mertin, the young son of Ernest Elliot, is sick with pneumonia...Most everybody over the country is having lagrippe or pneumonia.” 

Several Haskell men who had been exposed to influenza went to Camp Funston, in central Kansas. Days later, on March 4, the first soldier known to have influenza reported ill. The huge Army base was training men for combat in World War I, and within two weeks 1,100 soldiers were admitted to the hospital, with thousands more sick in barracks. Thirty-eight died. Then, infected soldiers likely carried influenza from Funston to other Army camps in the States—24 of 36 large camps had outbreaks—sickening tens of thousands, before carrying the disease overseas. Meanwhile, the disease spread into U.S. civilian communities. 

The influenza virus mutates rapidly, changing enough that the human immune system has difficulty recognizing and attacking it even from one season to the next. A pandemic occurs when an entirely new and virulent influenza virus, which the immune system has not previously seen, enters the population and spreads worldwide. Ordinary seasonal influenza viruses normally bind only to cells in the upper respiratory tract—the nose and throat—which is why they transmit easily. The 1918 pandemic virus infected cells in the upper respiratory tract, transmitting easily, but also deep in the lungs, damaging tissue and often leading to viral as well as bacterial pneumonias. 

Although some researchers argue that the 1918 pandemic began elsewhere, in France in 1916 or China and Vietnam in 1917, many other studies indicate a U.S. origin. The Australian immunologist and Nobel laureate Macfarlane Burnet, who spent most of his career studying influenza, concluded the evidence was “strongly suggestive” that the disease started in the United States and spread to France with “the arrival of American troops.” Camp Funston had long been considered the site where the pandemic started until my historical research, published in 2004, pointed to an earlier outbreak in Haskell County. 

Wherever it began, the pandemic lasted just 15 months but was the deadliest disease outbreak in human history, killing between 50 million and 100 million people worldwide, according to the most widely cited analysis. An exact global number is unlikely ever to be determined, given the lack of suitable records in much of the world at that time. But it’s clear the pandemic killed more people in a year than AIDS has killed in 40 years, more than the bubonic plague killed in a century. 

The impact of the pandemic on the United States is sobering to contemplate: Some 670,000 Americans died. 

In 1918, medicine had barely become modern; some scientists still believed “miasma” accounted for influenza’s spread. With medicine’s advances since then, laypeople have become rather complacent about influenza. Today we worry about Ebola or Zika or MERS or other exotic pathogens, not a disease often confused with the common cold. This is a mistake. 

We are arguably as vulnerable—or more vulnerable—to another pandemic as we were in 1918. Today top public health experts routinely rank influenza as potentially the most dangerous “emerging” health threat we face. Earlier this year, upon leaving his post as head of the Centers for Disease Control and Prevention, Tom Frieden was asked what scared him the most, what kept him up at night. “The biggest concern is always for an influenza pandemic...[It] really is the worst-case scenario.” So the tragic events of 100 years ago have a surprising urgency—especially since the most crucial lessons to be learned from the disaster have yet to be absorbed. 

********** 

Initially the 1918 pandemic set off few alarms, chiefly because in most places it rarely killed, despite the enormous numbers of people infected. Doctors in the British Grand Fleet, for example, admitted 10,313 sailors to sick bay in May and June, but only 4 died. It had hit both warring armies in France in April, but troops dismissed it as “three-day fever.” The only attention it got came when it swept through Spain, and sickened the king; the press in Spain, which was not at war, wrote at length about the disease, unlike the censored press in warring countries, including the United States. Hence it became known as “Spanish flu.” By June influenza reached from Algeria to New Zealand. Still, a 1927 study concluded, “In many parts of the world the first wave either was so faint as to be hardly perceptible or was altogether lacking...and was everywhere of a mild form.” Some experts argued that it was too mild to be influenza. 

Yet there were warnings, ominous ones. Though few died in the spring, those who did were often healthy young adults—people whom influenza rarely kills. Here and there, local outbreaks were not so mild. At one French Army post of 1,018 soldiers, 688 were hospitalized and 49 died—5 percent of that population of young men, dead. And some deaths in the first wave were overlooked because they were misdiagnosed, often as meningitis. A puzzled Chicago pathologist observed lung tissue heavy with fluid and “full of hemorrhages” and asked another expert if it represented “a new disease.” 

By July it didn’t seem to matter. As a U.S. Army medical bulletin reported from France, the “epidemic is about at an end...and has been throughout of a benign type.” A British medical journal stated flatly that influenza “has completely disappeared.” 

In fact, it was more like a great tsunami that initially pulls water away from the shore—only to return in a towering, overwhelming surge. In August, the affliction resurfaced in Switzerland in a form so virulent that a U.S. Navy intelligence officer, in a report stamped “Secret and Confidential,” warned “that the disease now epidemic throughout Switzerland is what is commonly known as the black plague, although it is designated as Spanish sickness and grip.” 

The second wave had begun. 

********** 

The hospital at Camp Devens, an Army training base 35 miles from Boston that teemed with 45,000 soldiers, could accommodate 1,200 patients. On September 1, it held 84. 

On September 7, a soldier sent to the hospital delirious and screaming when touched was diagnosed with meningitis. The next day a dozen more men from his company were diagnosed with meningitis. But as more men fell ill, physicians changed the diagnosis to influenza. Suddenly, an Army report noted, “the influenza...occurred as an explosion.” 

At the outbreak’s peak, 1,543 soldiers reported ill with influenza in a single day. Now, with hospital facilities overwhelmed, with doctors and nurses sick, with too few cafeteria workers to feed patients and staff, the hospital ceased accepting patients, no matter how ill, leaving thousands more sick and dying in barracks. 

Roy Grist, a physician at the hospital, wrote a colleague, “These men start with what appears to be an ordinary attack of LaGrippe or Influenza, and when brought to the Hosp. they very rapidly develop the most vicious type of Pneumonia that has ever been seen. Two hours after admission they have the Mahogany spots over the cheek bones, and a few hours later you can begin to see the Cyanosis”—the term refers to a person turning blue from lack of oxygen—“extending from their ears and spreading all over the face....It is only a matter of a few hours then until death comes...It is horrible....We have been averaging about 100 deaths per day...For several days there were no coffins and the bodies piled up something fierce...” 

Devens, and the Boston area, was the first place in the Americas hit by the pandemic’s second wave. Before it ended, influenza was everywhere, from ice-bound Alaska to steaming Africa. And this time it was lethal. 

********** 

The killing created its own horrors. Governments aggravated them, partly because of the war. For instance, the U.S. military took roughly half of all physicians under 45—and most of the best ones. 

What proved even more deadly was the government policy toward the truth. When the United States entered the war, Woodrow Wilson demanded that “the spirit of ruthless brutality...enter into the very fibre of national life.” So he created the Committee on Public Information, which was inspired by an adviser who wrote, “Truth and falsehood are arbitrary terms....The force of an idea lies in its inspirational value. It matters very little if it is true or false.” 

At Wilson’s urging, Congress passed the Sedition Act, making it punishable with 20 years in prison to “utter, print, write or publish any disloyal, profane, scurrilous, or abusive language about the form of government of the United States...or to urge, incite, or advocate any curtailment of production in this country of any thing or things...necessary or essential to the prosecution of the war.” Government posters and advertisements urged people to report to the Justice Department anyone “who spreads pessimistic stories...cries for peace, or belittles our effort to win the war.” 

Against this background, while influenza bled into American life, public health officials, determined to keep morale up, began to lie. 

Early in September, a Navy ship from Boston carried influenza to Philadelphia, where the disease erupted in the Navy Yard. The city’s public health director, Wilmer Krusen, declared that he would “confine this disease to its present limits, and in this we are sure to be successful. No fatalities have been recorded. No concern whatever is felt.” 

The next day two sailors died of influenza. Krusen stated they died of “old-fashioned influenza or grip,” not Spanish flu. Another health official declared, “From now on the disease will decrease.” 

The next day 14 sailors died—and the first civilian. Each day the disease accelerated. Each day newspapers assured readers that influenza posed no danger. Krusen assured the city he would “nip the epidemic in the bud.” 

By September 26, influenza had spread across the country, and so many military training camps were beginning to look like Devens that the Army canceled its nationwide draft call. 

Philadelphia had scheduled a big Liberty Loan parade for September 28. Doctors urged Krusen to cancel it, fearful that hundreds of thousands jamming the route, crushing against each other for a better view, would spread disease. They convinced reporters to write stories about the danger. But editors refused to run them, and refused to print letters from doctors. The largest parade in Philadelphia’s history proceeded on schedule. 

The incubation period of influenza is two to three days. Two days after the parade, Krusen conceded that the epidemic “now present in the civilian population was...assuming the type found in” Army camps. Still, he cautioned not to be “panic stricken over exaggerated reports.” 

He needn’t have worried about exaggeration; the newspapers were on his side. “Scientific Nursing Halting Epidemic,” an Inquirer headline blared. In truth, nurses had no impact because none were available: Out of 3,100 urgent requests for nurses submitted to one dispatcher, only 193 were provided. Krusen finally and belatedly ordered all schools closed and banned all public gatherings—yet a newspaper nonsensically said the order was not “a public health measure” and “there is no cause for panic or alarm.” 

There was plenty of cause. At its worst, the epidemic in Philadelphia would kill 759 people...in one day. Priests drove horse-drawn carts down city streets, calling upon residents to bring out their dead; many were buried in mass graves. More than 12,000 Philadelphians died—nearly all of them in six weeks. 

Across the country, public officials were lying. U.S. Surgeon General Rupert Blue said, “There is no cause for alarm if precautions are observed.” New York City’s public health director declared “other bronchial diseases and not the so-called Spanish influenza...[caused] the illness of the majority of persons who were reported ill with influenza.” The Los Angeles public health chief said, “If ordinary precautions are observed there is no cause for alarm.” 

For an example of the press’s failure, consider Arkansas. Over a four-day period in October, the hospital at Camp Pike admitted 8,000 soldiers. Francis Blake, a member of the Army’s special pneumonia unit, described the scene: “Every corridor and there are miles of them with double rows of cots ...with influenza patients...There is only death and destruction.” Yet seven miles away in Little Rock, a headline in the Gazette pretended yawns: “Spanish influenza is plain la grippe—same old fever and chills.” 

People knew this was not the same old thing, though. They knew because the numbers were staggering—in San Antonio, 53 percent of the population got sick with influenza. They knew because victims could die within hours of the first symptoms—horrific symptoms, not just aches and cyanosis but also a foamy blood coughed up from the lungs, and bleeding from the nose, ears and even eyes. And people knew because towns and cities ran out of coffins. 

People could believe nothing they were being told, so they feared everything, particularly the unknown. How long would it last? How many would it kill? Who would it kill? With the truth buried, morale collapsed. Society itself began to disintegrate. 

In most disasters, people come together, help each other, as we saw recently with Hurricanes Harvey and Irma. But in 1918, without leadership, without the truth, trust evaporated. And people looked after only themselves. 

In Philadelphia, the head of Emergency Aid pleaded, “All who are free from the care of the sick at home... report as early as possible...on emergency work.” But volunteers did not come. The Bureau of Child Hygiene begged people to take in—just temporarily—children whose parents were dying or dead; few replied. Emergency Aid again pleaded, “We simply must have more volunteer helpers....These people are almost all at the point of death. Won’t you...come to our help?” Still nothing. Finally, Emergency Aid’s director turned bitter and contemptuous: “Hundreds of women...had delightful dreams of themselves in the roles of angels of mercy...Nothing seems to rouse them now...There are families in which the children are actually starving because there is no one to give them food. The death rate is so high and they still hold back.” 

Philadelphia’s misery was not unique. In Luce County, Michigan, a couple and three children were all sick together, but, a Red Cross worker reported, “Not one of the neighbors would come in and help. I ...telephoned the woman’s sister. She came and tapped on the window, but refused to talk to me until she had gotten a safe distance away.” In New Haven, Connecticut, John Delano recalled, “Normally when someone was sick in those days [people] would bring food over to other families but...Nobody was coming in, nobody would bring food in, nobody came to visit.” In Perry County, Kentucky, the Red Cross chapter chairman begged for help, pleaded that there were “hundreds of cases...[of] people starving to death not from lack of food but because the well were panic stricken and would not go near the sick.” 

In Goldsboro, North Carolina, Dan Tonkel recalled, “We were actually almost afraid to breathe...You were afraid even to go out...The fear was so great people were actually afraid to leave their homes...afraid to talk to one another.” In Washington, D.C., William Sardo said, “It kept people apart...You had no school life, you had no church life, you had nothing...It completely destroyed all family and community life...The terrifying aspect was when each day dawned you didn’t know whether you would be there when the sun set that day.” 

An internal American Red Cross report concluded, “A fear and panic of the influenza, akin to the terror of the Middle Ages regarding the Black Plague, [has] been prevalent in many parts of the country.” 

Fear emptied places of employment, emptied cities. Shipbuilding workers throughout the Northeast were told they were as important to the war effort as soldiers at the front. Yet at the L.H. Shattuck Co. only 54 percent of its workers showed up; at the George A. Gilchrist yard only 45 percent did; at Freeport Shipbuilding only 43 percent; at Groton Iron Works, 41 percent. 

Fear emptied the streets, too. A medical student working in an emergency hospital in Philadelphia, one of the nation’s largest cities, encountered so few cars on the road he took to counting them. One night, driving the 12 miles home, he saw not a single car. “The life of the city had almost stopped,” he said. 

On the other side of the globe, in Wellington, New Zealand, another man stepped outside his emergency hospital and found the same thing: “I stood in the middle of Wellington City at 2 P.M. on a weekday afternoon, and there was not a soul to be seen; no trams running; no shops open, and the only traffic was a van with a white sheet tied to the side with a big red cross painted on it, serving as an ambulance or hearse. It was really a city of the dead.” 

Victor Vaughan, formerly the dean of the University of Michigan’s Medical School, was not a man to resort to hyperbole. Now the head of the Army’s communicable disease division, he jotted down his private fear: “If the epidemic continues its mathematical rate of acceleration, civilization could easily disappear...from the face of the earth within a matter of a few more weeks.” 

********** 

Then, as suddenly as it came, influenza seemed to disappear. It had burned through the available fuel in a given community. An undercurrent of unease remained, but aided by the euphoria accompanying the end of the war, traffic returned to streets, schools and businesses reopened, society returned to normal. 

A third wave followed in January 1919, ending in the spring. This was lethal by any standard except the second wave’s, and one particular case would have an exceptional impact on history. 

On April 3, 1919, during the Versailles Peace Conference, Woodrow Wilson collapsed. His sudden weakness and severe confusion halfway through that conference—widely commented upon—very possibly contributed to his abandoning his principles. The result was the disastrous peace treaty, which would later contribute to the start of World War II. Some historians have attributed Wilson’s confusion to a minor stroke. In fact, he had a 103 degree temperature, intense coughing fits, diarrhea and other serious symptoms. A stroke explains none of the symptoms. Influenza, which was then widespread in Paris and killed a young aide to Wilson, explains all of them—including his confusion. Experts would later agree that many patients afflicted by the pandemic influenza had cognitive or psychological symptoms. As an authoritative 1927 medical review concluded, “There is no doubt that the neuropsychiatric effects of influenza are profound...hardly second to its effect on the respiratory system.” 

After that third wave, the 1918 virus did not go away, but it did lose its extraordinary lethality, partly because many human immune systems now recognized it and partly because it lost the ability to easily invade the lungs. No longer a bloodthirsty murderer, it evolved into a seasonal influenza. 

Scientists and other experts are still asking questions about the virus and the devastation it caused, including why the second wave was so much more lethal than the first. Researchers aren’t certain, and some argue that the first wave was caused by an ordinary seasonal influenza virus that was different from the pandemic virus; but the evidence seems overwhelming that the pandemic virus had both a mild and virulent form, causing mild as well as severe spring outbreaks, and then, for reasons that remain unclear, the virulent form of the virus became more common in the fall. 

Another question concerns who died. Even though the death toll was historic, most people who were infected by the pandemic virus survived; in the developed world, the overall mortality was about 2 percent. In the less developed world, mortality was worse. In Mexico, estimates of the dead range from 2.3 to 4 percent of the entire population. Much of Russia and Iran saw 7 percent of the population die. In the Fiji Islands 14 percent of the population died—in 16 days. One-third of the population of Labrador died. In small native villages in Alaska and Gambia, everyone died, probably because all got sick simultaneously and no one could provide care, could not even give people water, and perhaps because, with so much death around them, those who might have survived did not fight. 

The age of the victims was also striking. Normally, elderly people account for the overwhelming number of influenza deaths; in 1918, that was reversed, with young adults killed in the highest numbers. This effect was heightened within certain subgroups. For instance, a Metropolitan Life Insurance Company study of people aged 25 to 45 found that 3.26 percent of all industrial workers and 6 percent of all coal miners died. Other studies found that for pregnant women, fatality rates ranged from 23 percent to 71 percent. 

Why did so many young adults die? As it happens, young adults have the strongest immune systems, which attacked the virus with every weapon possible—including chemicals called cytokines and other microbe-fighting toxins—and the battlefield was the lung. These “cytokine storms” further damaged the patient’s own tissue. The destruction, according to the noted influenza expert Edwin Kilbourne, resembled nothing so much as the lesions from breathing poison gas. 

********** 

Seasonal influenza is bad enough. Over the past four decades it has killed 3,000 to 48,000 Americans annually, depending on the dominant virus strains in circulation, among other things. And more deadly possibilities loom. 

In recent years, two different bird influenza viruses have been infecting people directly: the H5N1 strain has struck in many nations, while H7N9 is still limited to China (see “The Birth of a Killer”). All told, these two avian influenza viruses had killed 1,032 out of the 2,439 people infected as of this past July—a staggering mortality rate. Scientists say that both virus strains, so far, bind only to cells deep in the lung and do not pass from person to person. If either one acquires the ability to infect the upper respiratory tract, through mutation or by swapping genes with an existing human virus, a deadly pandemic is possible. 
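
To put “staggering” in perspective, the figures just quoted work out to a case fatality rate above 40 percent among known infections, more than an order of magnitude above even the roughly 2 percent that the 1918 virus killed in the developed world; the comparison below uses only the numbers cited in this article:

    # Case fatality rate implied by the avian influenza figures cited above,
    # set against the ~2 percent 1918 figure for the developed world.
    avian_deaths, avian_cases = 1_032, 2_439
    print(f"H5N1/H7N9 case fatality rate among known cases: {avian_deaths / avian_cases:.0%}")  # ~42%
    print("1918 pandemic, developed world: ~2% of those infected died")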

Prompted by the re-emergence of avian influenza, governments, NGOs and major businesses around the world have poured resources into preparing for a pandemic. Because of my history of the 1918 pandemic, The Great Influenza, I was asked to participate in some of those efforts. 

Public health experts agree that the highest priority is to develop a “universal vaccine” that confers immunity against virtually all influenza viruses likely to infect humans (see “How to Stop a Lethal Virus”). Without such a vaccine, if a new pandemic virus surfaces, we will have to produce a vaccine specifically for it; doing so will take months and the vaccine may offer only marginal protection. 

Another key step to improving pandemic readiness is to expand research on antiviral drugs; none is highly effective against influenza, and some strains have apparently acquired resistance to the antiviral drug Tamiflu. 

Then there are the less glamorous measures, known as nonpharmaceutical interventions: hand-washing, telecommuting, covering coughs, staying home when sick instead of going to work and, if the pandemic is severe enough, widespread school closings and possibly more extreme controls. The hope is that “layering” such actions one atop another will reduce the impact of an outbreak on public health and on resources in today’s just-in-time economy. But the effectiveness of such interventions will depend on public compliance, and the public will have to trust what it is being told. 

That is why, in my view, the most important lesson from 1918 is to tell the truth. Though that idea is incorporated into every preparedness plan I know of, its actual implementation will depend on the character and leadership of the people in charge when a crisis erupts. 

I recall participating in a pandemic “war game” in Los Angeles involving area public health officials. Before the exercise began, I gave a talk about what happened in 1918, how society broke down, and emphasized that to retain the public’s trust, authorities had to be candid. “You don’t manage the truth,” I said. “You tell the truth.” Everyone shook their heads in agreement. 

Next, the people running the game revealed the day’s challenge to the participants: A severe pandemic influenza virus was spreading around the world. It had not officially reached California, but a suspected case—the severity of the symptoms made it seem so—had just surfaced in Los Angeles. The news media had learned of it and were demanding a press conference. 

The participant with the first move was a top-ranking public health official. What did he do? He declined to hold a press conference, and instead just released a statement: More tests are required. The patient might not have pandemic influenza. There is no reason for concern. 

I was stunned. This official had not actually told a lie, but he had deliberately minimized the danger; whether or not this particular patient had the disease, a pandemic was coming. The official’s unwillingness to answer questions from the press or even acknowledge the pandemic’s inevitability meant that citizens would look elsewhere for answers, and probably find a lot of bad ones. Instead of taking the lead in providing credible information he instantly fell behind the pace of events. He would find it almost impossible to get ahead of them again. He had, in short, shirked his duty to the public, risking countless lives. 

And that was only a game.