In my childhood home, a modest, low-slung rectangle in eastern Washington, my mother was a bedroom away from me when she experienced her last moment. I remember standing in front of her, just after, feeling that I was watching a show or a movie, that this up-close experience was somehow false.
I had never seen death in person before. I had, however, seen it frequently on my phone’s screen, on my laptop, on TV, in movie theatres. So what was I looking at here? At my mother’s bedside, having never had the chance to confront serious loss in any substantive way, I was without comparison. In the following weeks, I struggled to reconcile what I’d seen with the world beyond our home. Looking around, it sometimes seemed that loss and grief hardly existed at all.
Today, in the U.S. and the U.K., death is largely banished from the visual landscape. A century ago, approximately eighty-five per cent of Brits died at home; these days, it’s closer to twenty-five per cent, and around thirty per cent in America. Many of those deaths have moved to the hospital, an often sterile environment where, as during the pandemic, loved ones are sometimes restricted from visiting. When individual bodies show up in newspapers, magazines, and social media, they tend to be exoticized, people not like us. When they are familiar, they have “their faces turned away,” as Susan Sontag wrote; their identity is eroded, reduced, until they are more concept than person. We see this form of not quite death so often that one can be forgiven for mistaking, as I did, the curated depiction for the actual event.
And then there is the stigma of grief—the idea, now rampant in American life, of closure. Most people are loath to linger in loss. We are expected to get back to work, back to normal. According to a recent survey, U.S. companies offer, on average, five days of bereavement leave, a remarkably brief amount of time to grapple with a death. (For the death of a “close friend/chosen family,” the number drops to a single day.) Typical mourning rites can seem to take closure to an extreme: at a funeral, loved ones may surround and console you for an afternoon, but we have few widespread customs that continue in the aftermath. This is in stark contrast to practices elsewhere—the Day of the Dead in Mexico; the Japanese Buddhist festival of Obon, which honors ancestral spirits—that prepare grievers to carry a loss for their entire lives.
In America, the appeal of closure may be traced to “On Death and Dying,” the 1969 best-seller, by the Swiss-American psychiatrist Elisabeth Kübler-Ross, that outlined the “five stages” of grief, ending with acceptance. Kübler-Ross has been widely misread by the public: her original research was on how people coped with the prospect of their own death, not with the loss of another. As the social scientist Pauline Boss has pointed out, closure is a construct, something that can never fully be attained; even if we grieve in stages, there is no prescription for how to grieve, much less for how to neatly overcome a loss. Boss suggests that closure’s popularity is a product of America’s “mastery-oriented culture,” in which “we believe in fixing things, finding cures.” With my own grief, too, I imagined a solution. I wanted to mourn quietly, persistently, toward a goal, until the pain, even the death itself, was nearly forgotten.
Loss wasn’t always obscured or seen as a trial to overcome. Throughout the eighteenth century, in much of Western Europe, death was witnessed directly and with little fanfare, according to the French historian Philippe Ariès. Ariès was well known for “Western Attitudes Toward Death: From the Middle Ages to the Present,” his 1974 history of how the social construction of death changed over time. Observing an era in which mortality rates were much higher, he identified four distinguishing characteristics. The dying person was typically in his own bed. He usually had some awareness of his situation; he “presided over it and knew its protocol.” His family, sometimes even his neighbors, would join him at his bedside. And, while he was dying, emotions were relatively measured, the death being expected, to some degree already mourned, and broadly understood as part of the flow of time.
Although Ariès has been criticized, sometimes fairly, for an overreliance on literary sources and an idealization of the past, his core conclusion holds true: there was a social regularity—and nearness—to death that’s largely foreign to many today. (Ariès used the term “tamed death,” nodding to how mortality was at the forefront of public consciousness.) Even the trappings of mourning evinced this openness. In the eighteenth and nineteenth centuries, grieving women generally wore heavy black outfits that included veils and bonnets; sometimes there were necklaces, or bits of jewelry that contained the hair of the deceased. Both male and female mourners often used special stationery with black borders for correspondence. (Over time, the borders would narrow, to show readers that the bereaved party was slowly recovering.) And “death portraits,” although creepy to contemporary eyes, were popular memorials, further elevating death’s presence in the cultural psyche.
In the nineteen-hundreds, though, our relationship to grief seemed to change, transforming from a public, integrated phenomenon to a personal and repressed one. Some of this may have been prompted by the First and Second World Wars, which resulted in such multitudes of dead—men whose bodies were often unrecoverable—that the old rituals were no longer tenable. Other reasons were political, serving the needs of power. During the First World War, for instance, American suffragists marched against the prospect of U.S. involvement, noting the immense loss of life and the struggle it would create for women left alone at home or widowed. The protest’s goal, per one suffragette, was to stretch “out hands of sympathy across the sea to the women and children who suffer and to the men who are forced into the ranks to die.” In the heat of August, 1914, women paraded through Manhattan in traditional black mourning clothes.
President Woodrow Wilson had run on an isolationist platform, but by 1917 the United States had joined the fray, and such demonstrations threatened his agenda. In 1918, conscious of the public’s perception of the war, he wrote to Anna Howard Shaw, the former president of the National American Woman Suffrage Association, asking that the suffragettes encourage women across the country to reframe their mourning as patriotism. Instead of mourning clothes, he suggested, women could wear badges bearing white stars, which “upon the occurrence of a death be changed into stars of gold.” At the time, the Nineteenth Amendment hung in the balance, and Shaw, who understood the importance of Wilson’s support, obliged, asking her followers to dial back their public grief and change their dress. “Instead of giving way to depression, it is our duty to display the same courage and spirit that they do,” she said. “If they can die nobly, we must show that we can live nobly.” On July 7, 1918, the Times ran an article entitled “Insignia, Not Black Gowns, as War Mourning: Women of America Asked to Forego Gloomy Evidences of Grief.” (The article was pinned between two stories about the terrors of the war: “Mustard Gas Warfare” and “Need of Still Larger Armies.”) The Nineteenth Amendment passed the next year, with Wilson’s endorsement.
Across the Atlantic, Freud was rethinking mourning as a private pursuit. Perhaps grief was actually a form of “work,” he wrote in “Mourning and Melancholia”—and only upon that work’s completion could the ego become “free and uninhibited again.” Death continued to recede from the public square: Walter Benjamin, in his 1936 essay “The Storyteller,” notes how it had been relegated to the corridors of the hospital, where the ill and dying were “stowed away.” Silence, individualism, and stoicism became valorized, and talk of death and grief no longer belonged in daily interactions. “Should they speak of the loss, or no?” the anthropologist Geoffrey Gorer wondered in his 1965 book “Death, Grief, and Mourning in Contemporary Britain.” “Will the mourner welcome expressions of sympathy, or prefer a pretence that nothing has really happened?” In his book, which drew from a survey of about sixteen hundred British citizens, Gorer suggested that people who chose pretense were less likely to sleep well and have strong social connections.
Gorer, like Ariès, attributed this shift to “the pursuit of happiness” having been “turned into an obligation”: the challenging aspects of life were now framed as individual burdens, rather than shared setbacks. The quest for happiness has long been baked into the American psyche, but one can see its distortion in quasi-therapeutic concepts such as “putting yourself first” and “emotional bandwidth”—the notion that an uncomfortable emotion is an undesirable one, and that we should set firm limits on certain discussions of hardship, even with intimate friends. Add to that “self-care”—arguably the greatest marketing success of the twenty-first century, in which consumption is repackaged as a path toward well-being—and Ariès’s claim that we live in the era of “forbidden death” continues to resonate. “The choking back of sorrow, the forbidding of its public manifestation, the obligation to suffer alone and secretly, has aggravated the trauma stemming from the loss of a dear one,” Ariès wrote, citing Gorer. “A single person is missing for you, and the whole world is empty. But one no longer has the right to say so aloud.”
After my mother’s memorial, after we scattered her ashes, I decided to run a marathon. I was still looking for proxies for grief, situations where an external accomplishment could solve my inner turmoil. Needless to say, it didn’t work. Not the running, not the hiking, not the strength-training regimen. Grief was a different beast, one that couldn’t be overcome through will power alone.
The historian Michel Vovelle challenged Ariès’s idea that “forbidden death” defined the West’s attitude toward loss, or that death had even become taboo by the mid-twentieth century. Vovelle believed that the historian’s job wasn’t merely to look at shifts in the past. “Why not look for these turning points in the present?” he wrote. Indeed, to look at the current moment is to see an unusual evolution, in which grief’s privatization has given way to the blossoming of a new hybrid form.
On social media, one often finds public grief that’s rooted in private interests. When a statesman or a celebrity passes away, or when videos of a distant tragedy circulate, expressions of mourning can sometimes seem to be a mix of sincerity and performance, an opportunity less to confront death than to strategically display one’s sympathies. Corporations issue statements of solidarity which are, at bottom, advertisements. (After the Boston Marathon bombing, the food site Epicurious tweeted, “In honor of Boston and New England, may we suggest: whole-grain cranberry scones!”) Crystal Abidin, an ethnographer of Internet culture, calls this phenomenon “publicity grieving”; it returns grief to the public square, but in strange, vaguely unnerving forms. When millennials began taking “funeral selfies” around 2013, the trend sparked a minor media frenzy, eliciting think pieces and advice articles, including one from a casket-making company.
The exploitative aspect of publicity grieving is obvious. Still, it’s notable that collective mourning is once again part of the texture of daily life. The sociologist Margaret Gibson is clear-eyed about the turn—death mediated by the Internet, she notes, is not the same as death being intimately known and accepted—but she also recognizes the ways in which grief has been normalized, its effects allowed to emerge once more in social interaction. One of her studies focussed on YouTube bereavement vlogs—videos, posted by young people in the days and months after they’d lost a parent, in which they forge apparently genuine bonds with the strangers watching, sharing their pain and showing how “mourning continues across a lifetime.” Elsewhere, initiatives such as The Dinner Party, a predominantly online meetup for people who have experienced a variety of losses, provide a kind of “second space” for grief, somewhere between “normal” life and the formalized privacy of a therapist’s office. Even the funeral-selfie-takers seem—to me, at least—to possess motives more benevolent than voyeuristic self-promotion. Perhaps they wanted to share their sense of loss, but were unsure how to do so, in person, without feeling like they were an encumbrance. A frivolous form of photography may not seem commensurate with the gravity of death, but approaching the subject with some amount of levity and candor may be precisely what we need.
A decade on, I’m still figuring out my own grief. After completing the Paris Marathon, soon after my mom died, I didn’t run for several years. Lately, I’ve taken it up again, cutting curling circles through the park near my home. The point I’ve begun to look forward to is no longer the finish line, but the moment when I begin to hit a psychic and physiological wall. In the past, I might have stopped, gone home, downed some Gatorade. It was painful. Now I’ve found some satisfaction in the unease, in living within the feeling rather than blasting past it. I see that my feet continue to move, that my breath persists. Sometimes it overwhelms me, but then I look up and see, all around the park, others running, just as winded as I am, experiencing something of the same. ♦
Source: newyorker.com