nothing like summer in the city


That feeling you get around 5:00 Friday afternoon, when it’s so hot you can see it in the air, and there are actually empty parking spots in the Plant Hall parking lot, and for some reason you’re still at work.

(My Pathways to Honors students have to keep a photoblog this semester, and I’ve promised to keep one too, in the spirit of learning and camaraderie. I’ll be posting my entries here under the tags “HON101” and “photoblog.”)

Posted in HON101 | Tagged , | Leave a comment

Notes on Feeling Good, August 2016

Television commercials are generally unbearable, but one in particular has been provoking me to throw things at my TV lately. “What’s on my to-do list today?” it begins, sans-serif black capital letters against a white background, clean and modern-looking and vaguely Silicon Valley-esque, an aesthetic one associates with Apple products. A series of smiling, mostly young-ish, ethnically diverse people share their answers, their names and job titles superimposed on the screen. “Mapping the oceans,” answers one John Blum, Ph.D., Earth Scientist; “Protecting biodiversity,” responds Rina Batra, Chemical Engineer; “Defeating malaria,” declares Deena Buford, M.D., Physician; “Improving energy efficiency,” says Briana Smith, Civil Engineer; “Reducing energy poverty,” begins Artis Brown, Civil Engineer, and Kirti Parmar, Computer Engineer, finishes his sentence, “in the developing world.” And so on, concluding with a smiling Stacey Wilson, Chemical Engineer: “And you thought we just made gas!”

Guess what, guys! It’s an ExxonMobil commercial. Its closing tagline: “powering the world responsibly.” As if that weren’t transparently bullshit, given that Exxon has known about and been covering up climate change for decades: funding climate-skeptic groups, lobbying against environmental regulations, and exerting a major and detrimental influence over US politics. “Powering the world responsibly” is exactly what ExxonMobil hasn’t been doing for the past 30-50 years.

What really worries me about this ad, though, is its cynical appeal to American liberals via what we might call Diversity™.[1] This ad is for us, y’all. It doesn’t look like Paul Ryan’s widely mocked, blindingly white Republican intern selfie. It’s careful to include people of color, women scientists, people with accents. Hey, everyone, it seems to say, Exxon cares about diversity. It cares about you, women in STEM. It’s modern! Please do not look over there at the terrifying drought and water shortages threatening people in southern African countries; ignore the South and Central American women and children affected by Zika; and for heaven’s sake, don’t think about the water wars that are likely to keep the Middle East ensnared in violence for decades to come. Climate change probably isn’t real. Look here instead, at the gentle, non-threatening multiculturalism of our commercial. Let it softly wash over you, and feel good about filling up your tank.

I’ve been seeing this ad during the Olympics, and I hate to say it, but it’s perfectly tailored to that time slot. Let me preface my killjoy argument by admitting that I’ve been enjoying the hell out of the Olympics. I’ve been genuinely moved many times: Simone Biles’s all-around win, Katie Ledecky’s record-smashing 800m swim, Wayde van Niekerk winning gold and setting a world record from lane 8 in the men’s 400m run, Simone Manuel’s gold medal in the 100m freestyle, Usain Bolt being Usain Bolt once again. It’s inspiring, and I think it really matters that kids can watch Biles, Laurie Hernandez, Gabby Douglas, Manuel, Ledecky, van Niekerk, and so many other great athletes, and see themselves. That wasn’t the case in the era of Jim Crow and apartheid, and it’s worth celebrating.


Amid all the feel-good stories, the athletes who have overcome so many hurdles to achieve something great, and the beautiful diversity of human beings, I don’t want to forget that the Olympics aren’t great for everybody. They haven’t been great for the people displaced from their homes in Rio to make room for the Olympic park. They haven’t been great for Rio’s poorer residents, whose needs have been ignored in favor of Olympics-related building and transportation projects. And they haven’t been great for Brazil’s economy; the state of Rio de Janeiro is now broke, despite the routine, routinely false promises that hosting the Olympics would bring in revenue. As Joe Nocera puts it, “after the Games end on Aug. 21, almost three weeks after they begin, most of us will move on. The people of Rio will be left to pick up the pieces.”

It would not be fair to say that the Olympics are equivalent to that awful Exxon commercial in using sunny diversity optics to distract attention from their slow violence against poor and marginalized people. But it wouldn’t be realistic to completely deny any connection either. Diversity can be co-opted, made into Diversity™; it can be used as a shield or a mask, covering for real harm to real people, as we smile and feel good about ourselves and how far we have come.

(Small letters because this part makes me nervous: I’m talking about ExxonMobil and the Olympics right now, but I’m also talking about the Democratic Party. Watching the DNC last month, too, I felt genuinely moved by the diversity of the speakers and the vision of a better America it conjured. Khizr Khan’s speech made me cry; Hillary Clinton’s shout-outs to disability rights made me pump my fist; Tim Kaine’s speech enveloped me like a warm, fuzzy sweater. And I also felt prickles of suspicion. Is any of this real, or is it just a way to lull progressives into complacency so that we’ll vote for Clinton and let the Democrats carry on business as usual? Are we really going to reform our racist criminal justice system, for example, or are we just going to buy the police body cams and then ignore their footage? Are we actually going to take in a significant number of refugees, or nah? To what extent is the Democratic Party invested in change, and to what extent is it invested in telling people, “Look over here at these balloons, not over there at that bad stuff”? I honestly don’t know.)

Representation matters; athletic achievement matters; the chance, every two-to-four years, to feel like a world citizen matters. The good feeling people get from inclusiveness matters, especially in a year when exclusion and hate have become planks of the Republican presidential campaign. I think, in fact, that these things matter too much to be turned into cheap marketing gimmicks, instruments deployed to pull the wool over our eyes as capital preys on the most vulnerable world citizens and the environments they inhabit.

[1] My thinking about diversity optics as at once necessary and insufficient is informed by a number of scholars writing about diversity in university settings, notably Sara Ahmed and Sofia Samatar. Samatar writes, “University life demands that academics of color commodify themselves as symbols of diversity—in fact, as diversity itself, since diversity, in this context, is located entirely in the realm of the symbolic.” A similar logic, of course, is at work outside the university too, in politics, commerce, sports, and other arenas.


1816/2016: Cruel Summers

The month of May in 1816 was marked by unseasonably cold weather in North America and Europe. Snow still lingered on the ground in Quebec and New York, and cold fronts on May 14 and 28 brought frost from Canada to Tennessee and Virginia, killing many of the early crops. Across the Atlantic, the cold and snow of winter continued, pummeling Percy Shelley and Mary Wollstonecraft Godwin’s carriage as they traveled to Lake Geneva, where Godwin (later Mary Shelley) would write the first draft of Frankenstein.

No one knew it yet, but the chilly May weather was not just a fluke. The pattern would continue through the coming months, drenching Britain and Europe in rain, leaving North America cold and parched, and destroying crops on both continents. 1816 would go down in history as “the year without a summer,” or, as the hardy New Englanders would come to call it, “Eighteen Hundred and Froze to Death.”[1]


The month of May in 2016 has been characterized by dangerous climatic activity of other kinds. India is in the midst of a heat wave which has broken records, melted roads, and killed hundreds of people. Water is in short supply, and little relief can be expected until the monsoon season begins in June. Halfway across the globe in Alberta, Canada, wildfires have forced tens of thousands to evacuate their homes. Thousands of buildings have been destroyed, and the air quality of the region has become dangerously smoky. An unusually warm and dry winter this year made Alberta particularly vulnerable to a large, fast-moving blaze that swept through about 1.25 million acres in three weeks.

Meanwhile, NOAA reports that monthly global temperatures continue to soar: this April was the hottest April since records began in 1880, and the twelfth month in a row to break the record. 2016 just might go down in history as the year without a winter.


In 1816, the bizarre weather hit the poor hardest. Frost and excessive rain in England ruined large percentages of the crops, leading to soaring grain prices and civil unrest. In Ireland, a failed harvest and typhus epidemic that lasted well into 1817 killed tens of thousands. In Switzerland, thousands of famished peasants took to the highways in search of sustenance. In the United States, a steady stream of New Englanders headed south and west in search of a fairer climate. A season that was unpleasant for the landlord class and political elites was disastrous for the poor farmers whose crops failed, and for the laborers who could no longer afford bread and milk.


Unsurprisingly, the same is true today: climate change hits the global poor the hardest. In India this year, the heat wave has driven thousands of subsistence farmers to the cities in search of alternative work. In Louisiana, the Native American inhabitants of Isle de Jean Charles are being relocated by the Department of Housing and Urban Development. More than 90% of the island’s land mass has been submerged or swept away since 1955, and the remaining occupants have been called the first American climate refugees. Experts see Isle de Jean Charles as a bellwether for much larger populations of future climate refugees, possibly including residents of South Florida. In Zimbabwe, extreme drought has caused maize shortages and an increased need for food aid. Ethiopia’s year-long drought has been followed by flood-producing rains, leading to widespread food insecurity and displacing thousands.

Meanwhile, Republican presumptive presidential nominee and self-professed “not a big believer in man-made climate change” Donald Trump has applied to build a sea wall to protect his County Clare, Ireland, golf course. His company cited sea level rise due to climate change in its permit request. So I guess climate change affects the rich too.


Commentators in 1816 didn’t know what caused the cold, unfruitful summer. Some blamed sun spots; others blamed deforestation; still others believed the world was ending. What they didn’t realize was that a major volcanic eruption in Indonesia the prior year was the true cause of the strange weather. In April of 1815, Mount Tambora unleashed a violent cascade of lava, ash, and dust. It flattened the village at its base, rained ash on villages a hundred miles away, and left much of Indonesia a desolate wasteland. Those who survived the eruption were faced with famine, poisoned water, and disease in its aftermath, and nearly 90,000 people died.

The eruption released 55 million tons of sulfur dioxide into the atmosphere, where a chemical reaction converted it into droplets of sulfuric acid. Within a few weeks, a thin cloud of sulfuric particles covered the whole earth. There the particles lingered, reflecting solar energy back into space, blanketing a cooling globe for many months to come.


We know what’s causing the heat and extreme weather today. It’s an El Niño year, yes. It’s also the carbon dioxide emissions from our cars, and the methane emissions from the livestock we eat, and the nitrous oxide emissions from our agricultural fertilizers, and the coal and oil and natural gas we burn for energy, and the vast expanses where carbon-eating forests used to be before we cut them down. It’s not a mystery this time.


Mary Shelley’s Frankenstein is a novel about ethical responsibility and unintended consequences, consequences which fall hardest on innocents like Victor Frankenstein’s little brother William or the family’s servant Justine. It’s about Victor’s (largely failed) attempts to take responsibility for his creation, and to be responsible to his creation, the monster who is at once victim and perpetrator. The novel’s final, climactic moments famously take place in the icy waters of the Arctic, where the life ebbs out of Frankenstein and the creature visits his corpse aboard a ship bound for the North Pole.

When Captain Walton, the novel’s frame narrator, finds the creature hovering over Frankenstein’s lifeless body, mourning his creator, he says, “Wretch! … it is well that you come here to whine over the desolation that you have made. You throw a torch into a pile of buildings; and when they are consumed you sit among the ruins and lament the fall.” It is an accusation against the monster, but it might equally well apply to Frankenstein himself, whose actions repeatedly amount to careless destruction and fruitless lamentation.

It might apply to us too, as we mourn with the victims of droughts and wildfires yet continue to throw torches into the atmosphere. In her article “Fort McMurray and the Fires of Climate Change,” Elizabeth Kolbert points out that it might seem unseemly to bring up environmental politics in the midst of human tragedy. “But to fail to acknowledge the connection is to risk another kind of offense,” she writes:

We are all consumers of oil, not to mention coal and natural gas, which means that we’ve all contributed to the latest inferno. We need to own up to our responsibility, and then we need to do something about it. The fire next time is one that we’ve been warned about, and that we’ve all had a hand in starting.

Like Frankenstein, we can’t absolve ourselves of responsibility for the monster we’ve collectively created. From one catastrophic year to another, a message resonates: it’s long past time to be responsible to and for each other.


[1] All information about the Year Without a Summer drawn from William K. Klingaman and Nicholas P. Klingaman’s The Year Without Summer: 1816 and the Volcano that Darkened the World and Changed History (New York: St. Martin’s Griffin, 2013).


The Great British Bake-Off and National Identity

Like so many of my American compatriots, I’ve become obsessed with The Great British Bake-Off. Americans like it because it is so unlike our own reality competition shows. No GBBO contestant has ever looked into a camera and deadpanned, “I’m not here to make friends.” The bakers don’t scheme against each other or peacock for the cameras; they’re all charmingly self-effacing and nice. And judges Mary Berry and Paul Hollywood are no Simon Cowells; they are honest but diplomatic in their assessments of the bakers’ creations. “It’s a shame,” is Paul’s trademark comment when a bake turns out badly. Mary’s is, “It’s a bit dry,” or “It’s a bit underbaked.” These comments are enough to elicit tears and a sheepish response: “I can’t believe I’m crying over pastry,” etc.

GBBO confirms American stereotypes about the British: they are nicer than us, less brash, less melodramatic. They have a sense of decorum, and delightful accents. One of my favorite moments in the show is a challenge in which the bakers are instructed to bake American-style pies. Over the course of the challenge, it’s gradually revealed that none of the bakers or the judges actually likes American pies very much. They find them too cloyingly sweet. That’s us; we dump vast amounts of sugar into everything, while our British counterparts are the definition of restraint.

Another aspect of GBBO’s appeal is its inclusive vision of British national identity. Its contestants represent a multicultural Britain, and its dishes fuse together English traditions with international flavors. My favorite baker, Chetna, is Indian-British, and her recipes often feature Indian spices. Paul dubbed her “the queen of flavor.” Contestants of South Asian descent have made a major mark on the show, from Chetna to Ali, whose family is from Pakistan, to recent winner Nadiya, who is Muslim and whose parents are from Bangladesh. The show represents a progressive vision of Britishness, embracing not only people of color, but also gay contestants (including two of the three finalists in season three). Host Sue Perkins’s lesbian chic style is neither overtly commented on nor hidden away, and that’s characteristic of the show’s attitude toward diversity. No one makes a fuss about it; it’s just quietly there.

Probably the most overtly political moment on the show came when Scottish finalist James created “United Chiffon Cakes”: five cakes, one for each of the UK’s four countries and one for the nation-state united. It was 2012, and the Scottish referendum on independence (which would fail in 2014) was in the works. As far as political statements go, James’s better-together cakes were decidedly mild. No one is waving signs or shouting about the Tories.

That hasn’t stopped critics from decrying the Bake-Off’s “political correctness,” however. The backlash against Nadiya’s win, for example, was both swift and stupid. GBBO may be a paradise of multiculturalism, but Great Britain is not, and xenophobia and Islamophobia are powerful forces in its cultural politics.

It would be easy to dismiss GBBO’s “big tent” vision of the nation as naïve and idealistic, given the racism and discrimination that continue to divide the UK. It’s hard for me, as a scholar of British literature, to look at the beautiful country estates that host the competition each season without remembering the ruthless empire that funded so many of those estates. And it’s hard to hear Mel Giedroyc, Perkins’s co-host, talk about the history of sugar in England and how it became widely available in the nineteenth century without thinking of the slave labor in the US South and the West Indies that made sugar cheap. GBBO’s light-hearted history lessons have to occlude a lot of brutality in order to be made suitable for a primetime TV hit.

But there’s another way to think of the Bake-Off’s tent. In “Of Other Spaces,” philosopher Michel Foucault argues that modern society is composed not just of the mapped and rationalized spaces that make up most of our daily life—the street, the home, the office, the café, etc. It also contains pockets of strangeness, “something like counter-sites, a kind of effectively enacted utopia in which the real sites, all the other real sites that can be found within the culture, are simultaneously represented, contested, and inverted.” These spaces he calls heterotopias. They are not utopias because they are real sites, geographically locatable. But there is something unreal, or “mythic,” about them, which links them to utopia.

For Foucault, heterotopias are sites in which several imaginary spaces can overlap. The theater is a classic example: a stage made of wood is also a series of fictional places. Foucault says that “the oldest example of these heterotopias that take the form of contradictory sites is the garden.” Ancient Persian gardens consisted of “four parts representing the four parts of the world.” They were simultaneously themselves and microcosms of the whole world.

Some heterotopias, Foucault explains, are ephemeral. They are “linked… to time in its most flowing, transitory, precarious aspect, to time in the mode of the festival.” Fairgrounds and vacation villages which lie empty most of the time become abuzz with activity on special days, days set aside from the ordinary work calendar. They exist outside our rationalized everyday lives, lived according to clock time or railway time. And thus they expose these lives, making them available for scrutiny.

I think of the baking tent as a heterotopia. It’s not merely a place where twelve people come together to bake. It’s also a microcosm of Great Britain, not as it really exists, but as it might have existed if the violent history of the British Empire could be erased. It’s a tiny slice of utopia, where people of all backgrounds can befriend each other, where meritocracy can exist without greedy, grasping competitiveness, and where the pain of real history can be numbed. I think it would be uncharitable to say this vision is simply faked for the cameras, when the contestants all speak positively of the friendships they made on the show. But it would not be quite right to say it’s real, either, when Patisserie Week or Sweet Dough Week must inevitably come to an end, and bakers must return to their “real lives” in a nation where anti-immigrant, anti-Black, and even anti-gay sentiments still abound.

It’s an imaginary UK, and it might be easy to dismiss it as feel-good sentimentalism rather than progressive politics. But maybe it is a necessary heterotopia, one that exposes the real space of the nation around it by way of its difference.


29 Days of Blogging: be a poetic supporter

I thought a fun challenge for February would be to write something for the blog every day. This is significantly more ambitious than my January challenge–no drinking for a month–but in the spirit of continued self-improvement, I begin.

I am faculty advisor for a new student club, the Poets’ Society, which had its first meeting tonight. I do not myself write poetry, but as they (should have) said in Grease, “If you can’t be a poet, be a poetic supporter.” Young people who love poetry must be treasured and encouraged. They are an endangered species. To add numbers to their first meeting, I participated in their first writing exercise: fifteen minutes to write a poem about yourself, which we then traded and read aloud. Let’s call mine a prose poem, which I present for your edification:

Three scars. One on my chin. I was two and, reportedly, insistent on zipping up my windbreaker myself. One overly exuberant motion later and I was whisked to the emergency room to get six stitches. One on my left knee. I was eleven and Rollerblading around the block with Apollo, a border collie/hog mix, when he saw a squirrel. Gravel in my skin and blood weeping down the front of my leg, I Rollerbladed all the way home. One on my left foot. A shard of glass from a broken lightbulb, acquired last fall. I had just watched a documentary on antibiotic-resistant bacteria and couldn’t stop picturing the tiny tubules colonizing my wound, like lichen on tree roots. I put on a band-aid and a shoe and went to a meeting, and had bled through by the time the meeting was over. I did not go to the hospital.

In the documentary a man went to India and got hit by a train or lorry of some sort. He was rushed to a hospital, where he fell in and out of fever. They held him down, screaming, as a doctor cut his leg off. But they could not stop the infection. He was airlifted back to Seattle, and doctors worked around the clock to save his life as drug after drug failed. He survived, but they warned: even now, the drug-resistant microbes could still be in his body, lurking, waiting for the right time.


King Lear or Peyton Manning?


Do these sentences refer to the aged, confused former superstar wandering the cold and windswept heaths of England, or the one wandering the cold and windswept field in Denver? You be the judge.

  1. You see how full of changes his age is; the observation we have made of it hath not been little.
  2. An old man in football years who can’t shake off a frosty wind like he did in the old days.
  3. His body is broken. He is playing on a bad left foot and has no mobility. He has lost his ability to throw the ball down the field. He is not a better choice than backup Brock Osweiler to start the game, although Osweiler is dealing with a right knee injury.
  4. Idle old man, that still would manage those authorities that he hath given away! Now, by my life, old fools are babes again; and must be used with cheques as flatteries.
  5. Instead, [they] seem dedicated to waiting around for a guy who’s never coming back. They’re selling a ghost when the real thing is infinitely more interesting and directly in front of them.
  6. Sir, I thought it fit to send the old and miserable king to some retention and appointed guard.
  7. He was left for dead, buried by doomsayers who considered only a worn birth certificate and the ravages of Father Time, the treacherous adversary who delights in intercepting any Hall of Fame quarterback’s journey to Canton and celebrating a demoralizing pick-six.
  8. This night, wherein the cub-drawn bear would couch, the lion and the belly-pinched wolf keep their fur dry, unbonneted he runs, and bids what will take all.
  9. The End comes for all, and the end had come for [him].
  10. Come not between the dragon and his wrath.
  11. On this cold, windblown mountain night an old man needed to savor the win that would not come easy. He didn’t know if it would be his last.
  12. The weight of this sad time we must obey; speak what we feel, not what we ought to say. The oldest hath borne most: we that are young shall never see so much, nor live so long.



Consider the dog:

a noble beast;

a creature of quiet dignity;

a stalwart companion;

elegant in manner;

fastidious in habit;

fierce in battle;

majestic in repose;

a loyal defender of hearth and home.


My Year in Reading 2015: Submerged Histories

[Spoilers for The Buried Giant, Boy, Snow, Bird, and My Brilliant Friend ahead]

Three of the best books I read this year were novels that dwell on submerged histories of violence. One, Kazuo Ishiguro’s The Buried Giant, unfolds on a misty medieval English landscape whose inhabitants live in a spellbinding haze of forgetfulness. Gradually, Ishiguro reveals that the epidemic of forgetfulness emanates from the breath of a dragon, tasked by the sorcerer Merlin to dissolve people’s memories of King Arthur’s brutal war against the Saxons. The novel ends with the dragon’s death at the hands of a Saxon warrior, and thus with the rumblings of a coming battle. Britons and Saxons who once lived together in an uneasy peace will, when their histories are restored, turn again to war. “Who knows what old hatreds will loosen across the land now?” asks one character. Another responds, “The giant, once well buried, now stirs.”

The Buried Giant didn’t get great reviews, but it’s the kind of book that burrows into your subconscious and resurfaces long after you’ve read it. I read it in March but think of it often. Like all fantasy, it’s really about us. Most of the time, we forget that the land on which we dwell is haunted by violent histories—war, slavery, genocide, colonialism, theft. It’s especially easy to forget if your ancestors were the ones on King Arthur’s side. Ishiguro’s novel doesn’t offer any easy platitudes or lessons. Forgetting the past means living in a tenuous truce that occasionally erupts into a violence we don’t really understand. Remembering the past is clarifying but painful, as old injustices never rectified come to light. Justice and peace seem as elusive in The Buried Giant as they are in real life.

Helen Oyeyemi’s Boy, Snow, Bird, a fairy tale set in 1950s New England, is about the uncovering of old family secrets—both the characters’ and the nation’s. It’s startling and impressive that a British author could write such a good American novel, one that captures the US’s historical consciousness so idiosyncratically and yet so perfectly. Boy, Snow, Bird is a story about passing in the mostly-white (fictional) town of Flax Hill, Massachusetts, and about the buried cruelties that lie under the town’s idyllic surface. Fairy tales lend themselves to psychoanalytic readings—they are so often about the return of the repressed, how it bubbles up into one’s consciousness in a drama of revelation. Oyeyemi’s adaptation of the Snow White story expertly deploys this psychoanalytic effect, weaving it together with American history to show us what rose-tinted memories of “the good old days” must disavow—race, racism, even one’s own family.

I don’t want to speak more specifically about the plot of Boy, Snow, Bird, because it’s easily spoilable and I think you should read it—it’s truly great. But I do want to say a bit about why it’s important to me. One of my (many) teaching failures this year occurred with a writing assignment that asks students to reflect critically on an episode in their own lives. One of the prompts for this assignment, building off some short stories we had read, encouraged students to write about a time when they first became aware of their own gender/race/class identity. What I found with this assignment was a common pattern among white students: They had grown up in a mostly-white neighborhood or town like Flax Hill, and thus didn’t “discover” racism until (middle school/high school/college/moving/take your pick). Fortunately, in their first encounter with “diversity,” they were never racist, but, they explain, they became more open-minded by learning or living with people different from themselves. The problem, as I came to see it, was that in these personal narratives, the white neighborhood always represents a pre-racial innocence, and the students never ask how their neighborhood came to be so white. To them it seems natural. I think this pattern is my own fault, not the students’. It’s unrealistic to expect that white college freshmen would know much about redlining, white flight, or mortgage discrimination unless I teach it to them. One of the many great things about Boy, Snow, Bird, though, is that it shatters the American illusion of pre-racial innocence by showing what must be excluded to create the illusion.

In Elena Ferrante’s My Brilliant Friend, the first part in a tetralogy about two girls growing up in Naples at midcentury, one of the key coming-of-age moments occurs when the girls, as teenagers, learn about Italy’s fascist history. As children, they know that their neighborhood is full of enmity and violence, but these shadows are curiously unmoored from history. “I don’t recall having ever thought that the life we had there was particularly bad,” Lenù explains. “Life was like that, that’s all, we grew up with the duty to make it difficult for others before they made it difficult for us.” As teenagers, however, Lenù and her best friend Lila befriend a communist, Pasquale, who tells them things about the past—“Fascism, Nazism, the war, the Allies, the monarchy, the republic”—that they have never been taught before. It is a revelation, one worth quoting at length:

[Lila] said that we didn’t know anything, either as children or now, that we were therefore not in a position to understand anything, that everything in the neighborhood, every stone or piece of wood, everything, anything you could name, was already there before us, but we had grown up without realizing it, without ever even thinking about it. Not just us. Her father pretended that there had been nothing before. Her mother did the same, my mother, my father, even Rino. And yet Stefano’s grocery store before had been the carpenter shop of Alfredo Peluso, Pasquale’s father. And yet Don Achille’s money had been made before. And the Solaras’ money as well. She had tested this out on her father and mother. They didn’t know anything, they wouldn’t talk about anything. Not Fascism, not the king. No injustice, no oppression, no exploitation. They hated Don Achille and were afraid of the Solaras. But they overlooked it and went to spend their money both at Don Achille’s son’s and at the Solaras’, and sent us, too. And they voted for the Fascists, for the monarchists, as the Solaras wanted them to. And they thought that what had happened before was past and, in order to live quietly, they placed a stone on top of it, and so, without knowing it, they continued it, they were immersed in the things of before, and we kept them inside us, too.

The history of violence and corruption inheres in the very material of the neighborhood—the stones, the wood, the grocery store, and most importantly, the money of its richest denizens. Faced with the choice that The Buried Giant poses, forgetfulness and quiet versus remembrance and vengeance, the parents choose to forget. But, as Ferrante shows, to forget is not to erase the past but to inadvertently continue it.

My Brilliant Friend ends with a wedding that brings together former enemies but fails to achieve genuine reconciliation among them. I have not read the sequels yet, but I suspect that hostilities will continue between Lila and the Solaras. After all, Ferrante has not yet resolved Lila’s insight that her neighbors’ wealth (and her eventual husband’s wealth too) comes from the postwar black market. She has simply placed a stone on top of it.

These novels’ understandings of history speak powerfully to me today. History is the thing we keep trying to bury that keeps resurfacing. I see this not just in the fiction I read this year, but also in the essays that stuck with me the most. In one of these essays Eula Biss, drawing on Ta-Nehisi Coates, imagines the history of white Americans as a “forgotten debt.” It’s partly a metaphor for the intangible privileges of whiteness, and partly a literal reference to how money itself is tied up in racist histories:

Once you’ve been living in a house for a while, you tend to begin to believe that it’s yours, even though you don’t own it yet. When those of us who are convinced of our own whiteness deny our debt, this may be an inevitable result of having lived for so long in a house bought on credit but never paid off. We ourselves have never owned slaves, we insist, and we never say the n-word. ‘‘It is as though we have run up a credit-card bill,’’ Coates writes of Americans, ‘‘and, having pledged to charge no more, remain befuddled that the balance does not disappear.’’

My reading this year reminded me, in a hundred little ways, that history shapes my life, and never more powerfully than when I forget it and feel at home in the present.


Five Ways of Looking at a Trope

[This is a piece I wrote for a seminar at the Modernist Studies Association Conference this month, “Catastrophe and the Limits of Genre.”]

One: a 1924 Eugène Atget photograph, Rue de la Montagne-Sainte-Geneviève, a Parisian street scene with the Panthéon dim in the background, beautifully lit yet eerily depopulated. Atget’s documentary photographs became famous when they attracted the attention of the Surrealist Man Ray. Today, we are more likely to rediscover them via Walter Benjamin, who cites Atget approvingly for creating a kind of photography in which “the human being withdraws,” and with it, the cult value of art. Referencing Atget’s “photographs of deserted Paris streets,” Benjamin writes, “It has justly been said that he photographed them like scenes of crimes. A crime scene, too, is deserted; it is photographed for the purpose of establishing evidence. With Atget, photographic records begin to be evidence in the historical trial.”[1] To Benjamin and to viewers today, the empty streets whisper of some crime, some catastrophe that has interrupted business as usual.


Two: a photograph by Ryan Spencer, from his 2015 collaboration with the writer Leslie Jamison, Such Mean Estate. Spencer collects still frames from apocalyptic films, rendered in black and white, uncaptioned, and small on the page, dwarfed by the white space surrounding them. This photo resembles Atget’s in composition; it is another deserted cityscape in which “the human being withdraws.” Dark wreckage occupies the foreground, while skyscrapers loom in the lighter, cloudy background, as the Panthéon does in the deep space of Atget’s image.

[Photograph: Ryan Spencer, “buy large”]

In “Catechism,” the essay accompanying Spencer’s photographs, Leslie Jamison begins with a question and answer which refer to this image and others:

What does the sky hold?

Too many birds. Broken freeways. The frail limbs of a charred forest. Blindness if you stare straight at the sun. Helicopters swarming the sky like mosquitoes, then smoked propellers falling past the sign reading BUY LARGE. We did.[2]

Despite the 2015 date on the essay, everything about this screams “modernism” to me. The question and answer form, which might once have evoked only the Catholic catechism, now inevitably reminds me of Joyce’s “Ithaca”; the sentence fragments, meanwhile, recall Imagist poetry. Jamison’s catalogue of images—the birds, the freeways, the tree limbs—both enacts and resists synecdoche. The images gesture toward larger narratives, perhaps the apocalyptic films from which Spencer’s photographs are drawn; but they also refuse narrative itself. The end of the world, Jamison’s prose suggests, is not a storyline but a collection of genre tropes, decontextualized, arranged to elicit the creeping horror and guilt that are perhaps the most interesting things about apocalyptic films. The BUY LARGE sign in this photograph prompts, for Jamison, the same kind of dawning recognition that Atget’s street scenes did for Benjamin: a crime has been committed here.

Three: an excerpt from Emily St. John Mandel’s 2014 apocalyptic novel Station Eleven. Weeks after a virulent flu outbreak has killed 99% of the world’s population, a lone survivor named Jeevan leaves his fortress-like Toronto apartment to seek a new life. He walks out into a dark, still, uncanny city:

The world had emptied out since he’d last seen it. There was no movement on the plaza or on the street, or on the distant expressway. A smell of smoke in the air, with a chemical tinge that spoke of burning offices and house fires. But most striking was the absolute absence of electric light. Once, in his early twenties, he’d been walking up Yonge Street around eleven p.m. and every light on the street had blinked out. For an instant the city had vanished around him, and then the lights were back so quickly that it was like a hallucination, everyone on the street asking their companions if they’d seen it too—“Was it just me?”—and at the time he’d been chilled by the suggestion of a dark city. It was as frightening as he would have imagined.[3]

Jeevan has a bright flash of memory in the dark city, and the memory itself is like a photo-negative of the current moment: a flash of darkness in an electrically brightened city, the inverse of a camera’s flash in a dark room. Toronto is now a ghost town, haunted by the people who used to inhabit it and the lights that used to illuminate it. The city’s darkness also makes me think of urban blackouts during World War II, a darkness that promised safety but could not always deliver it.

Four: the end of Katherine Anne Porter’s 1937 short novel Pale Horse, Pale Rider, a story about the 1918 flu pandemic. Miranda, the protagonist, contracts the virus and slips in and out of delirium. When she recovers, she finds that “more than a month” has passed, the war has ended, and her lover has died from the flu. Upon leaving the hospital, she finds her city barren and quiet: “No more war, no more plague, only the dazed silence that follows the ceasing of the heavy guns; noiseless houses with the shades drawn, empty streets, the dead cold light of tomorrow. Now there would be time for everything.”[4] “Time for everything” might include time for recrimination. Feelings of guilt infuse Miranda’s fever dreams, and if the pandemic is a natural disaster or act of God, the war is not. The silent streets remain haunted by the absent presence of the “heavy guns.” This, too, feels like the scene of a crime.

Five: Sometimes the deserted, broken city is not a trope at all, but a reality. Bomber planes were first used during World War I, and aerial bombardment soon became a crucial military strategy. By the end of World War II, few major European cities remained untouched by bombing campaigns. Warsaw, Rotterdam, London, Berlin, Dresden, Helsinki, Rome—none emerged unscathed. Air raids also devastated cities in Japan, including Tokyo, and in 1945 US bomber planes dropped their deadliest cargo of all, the nuclear bombs that destroyed Hiroshima and Nagasaki. Hundreds of thousands of people were killed during WWII air raids, and countless others displaced. Cathedrals, palaces, bridges, houses, offices—it is hard to grasp how many crumbled or burned. This photograph from the British Imperial War Museums was taken in London in 1917, at the height of World War I. One imagines that the photographer wanted to inspire hope by showing St. Paul’s, intact and towering in the background, a faint symbol of British resilience in relief against the lifeless debris of the foreground.


© IWM (HO 81)

What these five images suggest to me is a continuity between modernist culture and the contemporary apocalyptic genre, the texts that express our end-of-the-world fears and our guilt about global war and global warming. This genre does not just look toward an apocalyptic future; it also looks back to the apocalyptic past of the modernist era, when people lived many of the catastrophes we fear. Sometimes writers and filmmakers make these intuitive connections explicit. Station Eleven, for example, mentions the 1918 flu as a precursor to the current outbreak; so do Steven Soderbergh’s film Contagion and Marc Forster’s World War Z. Everyone writing about political upheaval today seems contractually required to quote Yeats’s “The Second Coming.” A Google News search for “the center cannot hold” returns 2530 results, including articles about the Israeli elections, the British elections, Obama’s foreign policy, Syrian refugees in Europe, and ISIS. A search for “mere anarchy is loosed upon the world” returns 257 results, including articles about the Greek economic crisis, Canada’s objections to the niqab, the ascendance of Donald Trump, and the surprising playoff bid of the Chicago Cubs. Other times, the connections between modernist and contemporary apocalyptic texts are less obvious, visible only in a faint echo or a graphic match like the photographs above.

Can modernism help us better understand the apocalyptic mood today? Can a more sustained attention to modernism’s apocalyptic visions help us think about our collective guilt over the war, environmental ruin, and predatory capitalism that we’ve seen over the past decade or two? Can it help us recognize the ways our culture has confused the desire to build a new, better world with the wish to see the old one burn? These are some of the questions I hope to investigate in a future research project. I suspect that modernism can offer us a critical distance from which we can look differently at the apocalyptic genre today, a genre which condemns us to watch the world end over and over again, in a thousand different ways, yet cannot seem to move us to change.

[1] Walter Benjamin, “The Work of Art in the Age of Its Technological Reproducibility: Second Version,” in The Work of Art in the Age of Its Technological Reproducibility and Other Writings on Media, ed. Michael W. Jennings, Brigid Doherty, and Thomas Y. Levin (Cambridge, MA: Harvard University Press, 2008), 27.

[2] Leslie Jamison, “Catechism,” in Such Mean Estate by Ryan Spencer (Brooklyn: PowerHouse Books, 2015), n.p.

[3] Emily St. John Mandel, Station Eleven (New York: Alfred A. Knopf, 2014), 431-2.

[4] Katherine Anne Porter, Pale Horse, Pale Rider, in The Collected Stories of Katherine Anne Porter (San Diego: Harcourt Brace & Company, 1972), 317.



There are course outcomes, and then there are course outcomes. There are the official outcomes developed for assessment purposes and dutifully reproduced on my Writing and Research syllabi. Students should be able to find, read, use, and properly cite scholarly sources; they should be able to develop a semester-long research project, from proposal to annotated bibliography to final paper; they should be able to present the results of their research in multiple modes. On some days, I think that if I can help most of my students get to this point, I’ve done my job. Other days, I think these outcomes are too modest; I dream bigger.

It’s about a month into my new experimental Writing and Research course, “Art at the End of the World.” We’ve read and discussed Emily St. John Mandel’s Station Eleven (well, most of us have); we’ve watched some Key & Peele sketches to learn about the conventions of the apocalyptic genre; and this past week we filled out the empty spaces in the course schedule with student-assigned readings that correspond to their own individual research projects. Well, “readings”—mainly movies and TV shows. The students weren’t too keen on assigning themselves actual reading. I think they may find that watching two films a week is more taxing than reading a short story or two, but maybe that’s only true for me.

I designed the course with blank spaces in the schedule because I wanted the students to be more invested in the objects we study and the objects they research. In previous semesters, I chose all of the readings, and students had to pick one of my assigned texts to develop a research project around. There was still some flexibility, but not, I thought, quite enough. The result was that some students got really excited about their research projects, while others merely went through the motions. Plus, some students stopped doing the assigned reading, and class discussions foundered. I hoped that giving students more say in the course material would make more of them care. (Not being completely naïve, I do recognize that this approach is still not going to get 100% buy-in—a small handful of students will not prepare for class even if the preparation consists of watching World War Z. We’re going for improvement, not perfection.)

It wasn’t a surprise that the students picked mainly blockbuster movies—Armageddon, I Am Legend, Contagion—to assign themselves. They’re fun to watch. Popular culture studies is in my view the perfect gateway to humanities scholarship for undergraduates. The primary sources are accessible and appealing, while the secondary sources show the dazzling possibilities of careful analysis. It’s often an exciting moment when students realize that the movie or television show they thought was just cotton candy for the brain is actually brimming with meaning. (Students from previous classes told me it was a mind-blowing moment when they learned that the X-Men series is often considered an allegory for LGBT rights.) If you want to better understand the culture you live in, it’s hard to think of a better way to do it than to analyze its most popular and profitable artifacts.

There are some drawbacks to this approach, however. Most humanities courses, I think, have a desired outcome that is not measurable and hence mostly unwritten: to broaden students’ minds by exposing them to works that are beautiful, challenging, and out of the mainstream. Even as a bookish and nerdy undergraduate, I would probably never have picked up Joyce’s Ulysses on my own. And yet struggling through it in a class was life-changing—it’s what sent me to graduate school to study modernism.

Left to our own devices, we often prefer stories that are palatable, relatable, and not too challenging to our own worldview. Rebecca Mead writes, for example, that the critical tendency to praise works for being “relatable” is a “scourge,” a sign of lazy media consumption and self-absorption. When one uses relatability as the main criterion for evaluating a story, Mead explains,

The reader or viewer remains passive in the face of the book or movie or play: she expects the work to be done for her. If the concept of identification suggested that an individual experiences a work as a mirror in which he might recognize himself, the notion of relatability implies that the work in question serves like a selfie: a flattering confirmation of an individual’s solipsism.

Ruth Graham’s clickbaity screed against adults reading YA fiction was motivated, I think, by a similar insight. I don’t agree that it’s embarrassing to enjoy the easy pleasures of young adult fiction, but surely Graham is onto something when she claims that “mature readers also find satisfaction of a more intricate kind in stories that confound and discomfit, and in reading about people with whom they can’t empathize at all.” And maybe college humanities courses are one place in which that more mature satisfaction should be cultivated.

My own intellectual coming-of-age (to put it pretentiously) owes a lot to Horkheimer and Adorno’s “The Culture Industry,” a scathing Marxist critique of popular culture for deadening the minds of the masses and making us content to be passive consumers and wage slaves. And it owes an equal amount to the old-school story of modernism as a backlash against that easily digestible yet pernicious mass culture. Nowadays, modernist scholars have usefully complicated that story and deconstructed the modernism/mass culture binary. But it’s worth keeping in mind the way we can’t seem to stop having the same debate. Poptimism vs. adult taste, fan culture vs. aesthetic diversity, time-worn vs. just-in-time-to-cash-in books… there’s a serious concern that, alongside the “intoxicating sense of illegitimate freedom”[1] accompanying the collapse of the distinction between high and mass culture, something legitimate and precious is also lost.

I do worry about the tyranny of democratic taste, about the quiet student whose desire to study something old or difficult or strange gets drowned out when the majority of his or her classmates vote to watch an action thriller with Matt Damon instead. Perhaps that’s why I couldn’t bring myself to do an entirely peer-driven course as Lee Skallerup Bessette did, why I couldn’t relinquish the last bit of control, and why I assigned Station Eleven. At once an instance of beautiful writing, a celebration of the art that inheres in graphic novels and Star Trek and other pop culture objects, and a defense of enduring classics like King Lear, Station Eleven will, I hope, keep those little, dissenting voices audible in our classroom. And I hope it will contribute to that secret, unmeasurable course outcome: persuading students that good literature, even when difficult or time-consuming, is worth their time.

[1] A line I borrow from, where else, Virginia Woolf’s “The Mark on the Wall,” a difficult modernist short story that everyone should read.
