A while back, I left formal academia. In the post where I announced that decision, I promised that I’d start writing here more frequently again, since I’d be free from the crushing pressure to self-censor that I’d been feeling as a professional academic. Obviously, that hasn’t happened. For a variety of reasons, the pressure to self-censor is still on. More importantly, I’ve taken a step back from blogging and writing in order to get my head on straight in a time when everything seems slanted, and a little silence can be almost infinitely more valuable than additional noise.
Don’t get me wrong: this is hard for me. I like to hear myself talk, to fill the room with my ideas. I’m no wallflower. I have strongly held opinions, and like every opinionated person, I sincerely think others should drop their benighted, stupid, misinformed opinions and pick up mine. Mine are the right ones, after all.
But the past two years have driven home just how wrong “experts” can be about their pet topics. And a lot of what I once believed has burned away like prairie grass in a wildfire. Once you have your core beliefs and convictions challenged that dramatically, it becomes, or should become, harder to advocate your views with the same naïve bluster that you once did. After all, you were wrong at least once. Badly.
So, yes, there’s a certain mature caution, maybe, a canny circumspection in the face of dramatically mutating, wildly out-of-control current events, that partly explains why I’ve been uncharacteristically quiet since escaping from academia’s burning tower. But it’s more complex than that, too.
Let me talk about three good reasons to be quiet.
1. It Turned Out That I Was Way More Burned Out Than I Thought
Academic life was extraordinarily draining. I was in the lab or office until midnight so many evenings that I became practically vampiric, biking home through the empty city night after night, always alone. Holidays and weekends were an opportunity to do real work instead of the routine crap and meetings that filled up the week. On vacations, I thought constantly about research.
Many academics will be nodding in recognition. When you actually do take off time to do something purposely rejuvenating — hiking a New Hampshire mountain, say, or indulging in an afternoon trip to the beach — you can’t relax and enjoy it. You feel guilty. More than anything, you worry about getting a bad reputation. You suspect, deep down, that if you really put aside work for one afternoon, if you really just forget yourself and enjoy a few hours, others will conclude that you’re not dedicated, not 100% all-in. Mentors and peers will start seeing you as just another normal person, a standard human who needs breaks and weekends. Opportunities will swiftly dry up.
Being a successful academic, then, rhymes in some ways with being an entry-level analyst at a Manhattan investment bank. Your life is the work. Everyone you know — everyone who matters, everyone who might be able to help you move up in your career — seems to work 80, 90, 100 hours a week. Family and personal life don’t just take a back seat; they’re distractions, dead weights that can only hold you back from success.
The difference is that, in academia, compensation for all those sweaty hours doesn’t come in the form of sweet salary packages and bonuses, Upper East Side condos, or Breckenridge ski vacations. Instead, the academic currency par excellence is reputation. Prestige. Respect.
Your life in finance is worthwhile if you finally make senior VP and pull down a million a year. In academia, you’ve made it when everyone is citing your works, when people organize conferences and workshops to discuss your ideas, when peers write a festschrift for you or a new theorem or concept enters the canon with your name on it — the Haber-Bosch process, a Markov chain. The Wood Theorem.*
This is what makes the whole thing so stressful. You’re busting your hump not for the cold, vulgar utility of cash, but for the intangible payout of joining some eternal hierarchy of Smart People. This community is necessarily Platonic because nearly all of its members are dead. So what you’re really aiming at — amid all the accolades, the honorary chairs, the citations — is immortality.
I wouldn’t be the first to point out that the pettiness and tempest-in-a-teacup character of much of academic life is due to the fact that the squabbles aren’t about money or property, but about immaterial tokens of status and prestige. As the old saw goes, the reason academic politics are so bitter is because the stakes are so low.
But those stakes feel pretty high. Finance bros may be trying to get stupidly rich, but academics are trying to justify their very presence on earth. Maybe neither of these goals is especially edifying, but one hangs a much heavier existential burden on the fragile peg of your career than the other.
Add to all this the ludicrous odds of finding an academic job in today’s market — the higher-ed labor market being essentially a pyramid scheme, dependent on sucking in ever-growing new legions of grad students to keep the whole top-heavy illusion afloat — and you have a nasty recipe for burnout.
So it was that I found myself, after stepping away from academia, unexpectedly sleeping like I’d been hit by a train. I’d wake up mid-morning, eat breakfast, read for maybe an hour, and then crawl like a stricken animal to the couch. There, I’d pass out again until late afternoon, early evening.
I’m not talking about pleasant little afternoon naps, either. I was so tired every day, so comprehensively psychologically exhausted, that sleep would steal over me like a blow to the temple. Every day, rain or shine, right at 2:00 pm. There was no choice in it. It was like a temporary foray into narcolepsy.
This went on for more than a month.
Even after I was finally able to stay awake through the day again, the idea of writing sometimes almost made my stomach turn.
That’s what burnout is like. You become almost physically allergic to what you once loved.
2. I Tried to Put Sanity First
Before I earned my PhD in 2016, there was a time when it seemed that I could have seamlessly made a lateral jump from academia to opinion journalism or full-time blogging. I have a breezy, journalistic writing style.** I write on topics that interest people. A few famous writers shared or reblogged my posts. The doors to the Big Time seemed open, if just a crack.
But I didn’t push on those doors. Instead, just as my blogging “career” was taking off, I redirected my attention, focusing my efforts more and more into research, into academia. I wanted to join that eternal academic pantheon I just mentioned. Blogging became more and more a pesky side gig instead of my favorite pastime. I had a dissertation to write, after all. That didn’t leave much time for eating or showering, much less blogging. I never really returned to posting at the same weekly rate again.
Maybe I was scared of success. I saw a promising future and shrank back like J. Alfred Prufrock, not daring to eat a peach. A kind of self-sabotage, perhaps.
But whatever the reason, I’m relieved now that I didn’t try to make a go of it as a writer or opinion-slinger in the days before Donald Trump and Covid barreled onto the scene, or even since then. That’s because nearly everyone who works in journalism or blogging has gone cracked. If I’d been professionally embedded in that world, I’d have gone cracked right along with them.
I really mean it. Around 2017, bloggers and journalists whose work I used to follow and admire started to sound hysterical, frantic, even frothing at the mouth. Essays and articles seemed to pulse with a new kind of barely restrained, animal hatred. Many writers came across as caring less and less about the world beyond their newsroom or the community of opinion-writers, and more and more about their own shouting in-group. They were becoming dangerously solipsistic, curved inward on themselves in a kind of deranged, postmodern exercise in professional self-referentiality. The world of blogging and journalism was — is — a toxically unhealthy environment.
It didn’t help that the social media platform Twitter had long since become the de facto public square for anyone who wrote for a living. As we all know, Twitter is possessed by demons. It’s also the most de-ritualized human social space ever created.
This is a problem because human culture needs ritual to survive. Even mundane social conventions such as handshakes and shared meals add layers of soothing protocol to social encounters. These buffering rituals assure everyone involved that, whatever their inner views about immigration or Brexit or the objectively terrible quality of recent Marvel movies,† they can at least agree on something important: how to behave, how to treat each other.
But as the anthropologist Roy Rappaport pointed out, the effects of language often run directly counter to those of ritual. Words enable us to simply tell each other our hidden opinions, our likes and dislikes, our preferences and aversions. Words, unlike ritual, can transpose our inner beliefs and opinions — and our lies — directly into another person’s mind. And so words easily divide and break down where ritual builds and glues together.
Because it turns out that no two people really agree on anything. So when social exchanges consist entirely of personal opinions, the only rational response, naturally, is to loathe each other.
In the character-limited venue of Twitter, though, almost the only thing we can do is spout off our opinions. It’s a dangerous situation. Imagine a marriage in which both spouses said aloud whatever came into their heads:
“Why yes, you do look fat in that dress.”
“That waiter with the George Clooney chin is so hot.”
“I really resent your parents.”
On top of this, imagine that this couple never did anything conventionally nice for each other, withholding all the small gestures — bringing each other tea, going on monthly dates, offering back rubs — that could tangibly reinforce their commitment. The marriage would be 100% verbal, opinion-slinging self-expression.
The whole thing would last about four days. The divorce papers might be served via cruise missile.
This is exactly what Twitter is like, except that instead of being limited to one miserable couple it encompasses virtually all of the overeducated New Yorkers and Washingtonians who make the critical decisions and tell the stories that determine the fate of our country. Ha ha!
Even as a very part-time blogger, I found myself getting swept up in the urge to hate all the idiots out there, to revile them and their dastardly opinions. My wife got worried about me. I had to prioritize sanity. I learned to unplug.
I did this by pursuing things I liked to do. I was an English major in undergrad. Stories and literature were my first love. So instead of getting trapped like a fly in the enormous, sticky Venus flytrap of the blogosphere, I spent a lot of 2021 delving back into Shakespeare, reading Tolkien and Welsh folklore, gobbling up great poets from Homer and Virgil to Eliot and Hopkins and Robert Hayden.‡
The only metaphor that captures how immeasurably delightful this was for me is a desert plant desperately soaking up water after a decade’s drought. In the wake of many years of splitting the world apart, carving it up with theory and data and causal analysis — living in a world of objects — I spent much of 2021 immersing myself in narratives and art, putting the universe together again.
I’m glad I prioritized sanity. I didn’t get much blogging done this year, but I did become disentangled, earnestly if not yet fully, from the whirling chaos of reactivity and virulence that has become the siren song of our public culture.
I count that as a win.
This leads us to the third reason for my temporary retreat from public writing.
3. We Need More Signal, Less Noise
Signal-to-noise ratio is a basic concept in information theory. A signal is information. Noise is all the irrelevant data that makes it hard to extract that information.
Imagine that you’re driving through the desert listening to a country station hissing with static (and maybe the occasional ghostly whispers of voices from other, nearby frequencies). The song is the signal. The static is the noise. The signal-to-noise ratio simply refers to how clear the signal is: how easily you can hear the song.
Public life today has a very low signal-to-noise ratio. Lots of zeroes after the decimal point. There’s so much content, and so little clarity about what all that content means, that we’re all basically living in information soup, all the time.
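The arithmetic behind the metaphor is simple. Here is a toy sketch in Python, with invented power values purely for illustration:

```python
import math

def snr(signal_power, noise_power):
    """Plain signal-to-noise ratio: P_signal / P_noise."""
    return signal_power / noise_power

def snr_db(signal_power, noise_power):
    """The same ratio on the decibel scale: 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(snr(signal_power, noise_power))

# A clear broadcast: the song is 100x stronger than the static.
print(snr(100.0, 1.0))    # 100.0 (or 20 dB): easy listening

# Information soup: the static swamps the song.
print(snr(1.0, 10000.0))  # 0.0001: lots of zeroes after the decimal point
```

The first case is the country station coming in clean; the second is the static winning.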
When my wife and I gave up reading news for Lent this year (seriously — I couldn’t possibly recommend this more highly), I was gobsmacked by how much lighter I felt, how much less stressed.
The noise had stopped.
I didn’t feel that I was missing much, either, because by the middle of 2020, the news sources I’d once trusted had come to seem utterly untrustworthy. Before June of last year, I really believed that I could go to, say, CNN.com to get a version of the actual truth, albeit maybe a little endearingly biased. I thought my job was just to filter out the inevitable spin, extracting the objective facts. After all, I had a civic duty to be informed.
Those days are over, sadly. I now realize that much of the news, no matter where you look, is propaganda for someone.°
A propaganda-based information ecology offers so much noise, and so little signal, that at the beginning of 2021 I knew I needed to get my bearings. I needed to get de-enmeshed so that I could make contact again with what I know to be real: my wife and family, my own body, our church, the great poetry that makes me so grateful, despite everything, to have this bastard tongue we call English as my first language.
I knew I probably wouldn’t be able to add much signal to the overwhelming noise of our current blogosphere. But I could dampen the noise in my own life. That is to say, the numerator in the signal-to-noise ratio wasn’t under my control, at least not yet.
But the denominator was.
So I won’t make any promises today about blogging, here or elsewhere. A day will come when I’ll be writing regularly again, maybe even full-time. But in the meantime, the posts will keep trickling out. I still have much to say. In the near future, I’ll cover some cool papers I’ve recently published.
Meanwhile, go for a walk. Be in the real world. Read a book. Stay sane.
It’s the best way to fight back.
Happy New Year.
*The Wood Theorem is still pending.
**Though a dissertation and two postdocs have been hell on my writing chops. All sorts of obnoxious jargon slithered into my popular writing after I got serious about being an academic pro. I now understand, at a deep level, the poet W.H. Auden’s incisive warning: “Thou shalt not commit a social science.”
†I, personally, gave up on Marvel movies after Endgame. That was when I realized that nearly all modern blockbuster movies are written not by mature grownups with a rich appreciation of the classic rules and constraints of storytelling, but by privileged kids who, sadly, lack the real-life experience necessary for writing believable, compelling characters and stories. (Exceptions: Guardians of the Galaxy Vols. I and II, Black Panther.) When Dune came out last month, I watched it twice in the theater simply to revel in the remarkable experience of being treated like an adult.
‡Hayden’s famous short poem “Those Winter Sundays,” about the ambivalent self-sacrifices of his foster father, is one of the most wrenching portrayals ever put to paper of the tension between gratitude to and resentment toward fatally flawed elders. “What did I know, what did I know/of love’s austere and lonely offices?” Ah, what do we know, indeed.
° I know these are choppy waters, but the protests and riots last summer — or, rather, the way the media covered (or ignored) them — changed my perspective in a major way.
I followed a number of accounts on Twitter that were on the ground in Minneapolis, in New York, in Seattle during those events. I also had friends in some of these places. The protests that the news media made out to be “mostly peaceful” were some of the most destructive events to ever happen in these cities. (Indeed, all told, insurance estimates place them as likely the most destructive and costly riots in American history.)
But many people have no idea that these things even happened, because they were relying on name-brand media, just like I did until not very long ago. During the heady 2020 protests, legacy outlets focused relentlessly on police misbehavior (which did happen) while almost completely avoiding covering anything that could cast the protests in a bad light.
To hear these news sources tell it, Minneapolis was just a law-abiding city in which peaceable protesters were suddenly terrorized by police without cause, not a major metropolitan area in which, in the space of one weekend, rioters burned down hundreds of stores (many or most of which were owned by people of color), damaged thousands of buildings, torched an entire police station, and — oh yeah — burned an entire six-story affordable housing development to the ground, leaving a smoldering cityscape behind them.
On the phone the following week, one friend who lived in the neighborhood told me, “It looks like Sarajevo out there.”
The protests were surely grounded in a noble aspiration: to heal our nation of racism and to establish more just practices of policing in the wake of a number of high-profile deaths of black Americans at the hands of cops. But months of Covid lockdown combined with opportunistic, professional-class radicalism set the stage for something much more sinister — and the news media almost completely ignored it.
Maybe they didn’t want to tarnish the image of Black Lives Matter or the protest movement. Maybe they feared that widespread negative reaction to the protests would help Donald Trump in the upcoming election.
In either case, the news media buried one of the biggest stories of 2020 for political purposes. A commenter on the raconteur journalist Michael Tracey’s Substack newsletter sums up the furious sense of betrayal this evoked:
I live in Minneapolis, less than a mile from where George Floyd was killed and to say that the coverage of this city has been skewed would be a gross understatement. The way people in the press and online talk about what has happened in this city is infuriating to the point of tears for me.…As the riots ensued last year, neighbors geared up each night, connecting hoses to put out fires and band together to fend off looters and arsonists (calling the cops, fire fighters, or ambulance wasn’t an option since there was no one available to answer the call). During the day apartments and businesses boarded up and spray painted messages on the boards to make their case to the mob (“minority owned business” or a common one “children live upstairs”).
So, returning to my previous point, I no longer know where to get basic information. Sure, you can get bits and pieces of the truth by following rogue Twitter accounts, but those always turn out to be batty in their own way. A few months after you gratefully follow them for telling the truth about one thing, they’re selling wild conspiracy theories about something else.
Fox News and other right-wing outlets covered the riots with glee, but they play to their audience in just the same way as the left-wing outlets do. We now know, for instance, that many right-wing commentators knew how bad the January 6th Capitol riots were, but downplayed them on the air — again, for political purposes.
The difference is that I didn’t trust Fox News to begin with. Now I don’t trust the respectable legacy outlets, either.
Welcome to the 2020s!