Analog Future

The future is local.

I mean local in several senses of the word. The future will be local, first, in the sense that the things you do there will happen somewhere close to you instead of on a computer somewhere in Atlanta or San Francisco or Dublin. It will also be local in the sense that the majority of things you will make and do there will likely be stored on your own computer, perched on your tabletop, shelved on your bookshelf, built on your workbench, cooked in your kitchen, and so on, rather than somewhere else. You will own them. Related to this, the future will be local, finally, in the sense that you will share things there with local people whom you actually know, rather than with digital representations of people in chat rooms or on headsets. You will likely post the things you make on your own website, print them in your own zine, sell them in your own community. The internet is not dead, but its role as the primary force shaping our lives is coming to an end.

When I say “the internet,” I don’t mean the technical stack. I’m not referring to the network of networked computers communicating with one another using various protocols. Instead, I refer to the “phenomenological internet” of “the more familiar sites of daily use by billions of people” that Justin E.H. Smith defines in his book, The Internet Is Not What You Think It Is. Smith writes,

“Animals are a tiny sliver of life on earth, yet they are preeminently what we mean when we talk about life on earth; social media are a tiny sliver of the internet, yet they are what we mean when we speak of the internet, as they are where the life is on the internet.”

To this definition I would add another category, however: the streaming media provider. When we speak of the internet, we also speak of Netflix, Hulu, Amazon, Disney Plus, and so on. These multi-billion dollar corporations draw on the rhetoric of “the internet” to position themselves as scrappy upstarts opposing the staid traditional media providers, such as film studios and television networks. Viewers have largely accepted this position and view these services as outposts of the internet on their television screens.

Prediction is a mug’s game, so think of this as a prescription instead of a prediction. There are several related trends converging over the next several years that are likely to drive people away from the comfy little burrows they’ve carved out of the internet by forking over $5 or $7.99 or $14.99 or a steady stream of personally identifiable data every month. Together, these trends map the contours of serious contradictions between abundance and desire, on the one hand, and humans and machines on the other, which strike at the heart of the internet as we have understood it since around 2004. The dialectic emerging from these contradictions will drive new user behaviors in the next decade.

The first trend is the grinding ennui which has resulted from the relentless production of entertainment and cultural commodities for consumption on the internet. Reduced in the past several years to a sort of semi-nutritive paste called “content,” art and entertainment are losing their capacity to relieve and enrich us and now increasingly amplify the isolation and pessimism of life online.

A seemingly infinite stream of money dedicated to the production of entertainment on the internet has resulted in an ocean of unremarkable “content” that does little more than hold your attention long enough to satisfy the adware algorithm or build a platform big enough to stage the next bit of content in the franchise and queue up the next marketing event. Outside of their algorithmically contoured bubbles of fandom, there is little difference between Marvel and Star Wars or DC or YouTube creators or Twitch streamers or podcasts. Netflix shows and Amazon Prime shows and Hulu shows and HBO Max shows and Paramount Plus shows and Peacock shows and so on are indistinguishable blips in time, forgotten as quickly as they are consumed. Books scroll by on Kindle screens or drop serially onto shelves. Photographs and artwork slide past on Instagram, meriting a second’s or perhaps a moment’s notice before disappearing into the infinite past. Pop music percolates through TikTok, moves week-by-week downward on officially curated playlists, radiates out into commercials, and then disappears, poof, as rapidly as it came, displaced by the next. Independent music on the internet–even on platforms nominally controlled by the artists, like Bandcamp or SoundCloud–exists in much the same sort of vacuum as it always has. The internet promised an efflorescence of color and creativity. What it gave us instead was a flat, white light that grows dimmer over time as the algorithms which shape it converge on a single point of optimization.

The top 5 most-viewed links on Facebook in the last quarter

Because the vast majority of the “content” is indistinguishably boring, the second trend is tightly related to the first. Social media is dying. Many platforms, Facebook front and center, are already dead, gliding still on accumulated momentum but inevitably bound to stop. As recently as 2016, we believed that Facebook could change the world. In recent quarters, however, the most viewed content on the behemoth platform has either been a scam or originated somewhere else. The top 5 most-viewed links in the second quarter of this year, according to Facebook, consisted of a link to TikTok, two spam pages, and two news stories from NBC and ABC on the Uvalde school shooting. The TikTok link led the second-place spam page by a huge margin. Facebook is not a healthy business. Ryan Broderick recently summed up the situation with Facebook admirably on his excellent “Garbage Day” Substack. “Facebook, as a product, is over,” Broderick writes. “Meta knows it. Facebook’s creators know it. Possibly even Facebook’s users. But no one has anywhere else to really go.”

People who rely on social media to promote and build businesses are beginning to note a general decline as well. According to a poll detailed in a recent article on “creatives” frustrated with social media, 82% believe that “engagement” has declined since they started using social media. “I’ve given up on Instagram,” one freelance artist noted. “I wasn’t even sure it was making a difference with getting more work. And I seem to be doing okay without it.”

Facebook and Instagram are in rapid decline, but what about TikTok, YouTube, Reddit, Twitter, and others? A third problem, more profound than the others, faces these platforms: there are no more users to gain. Two decades into the social media era, the market is highly segmented. New platforms like TikTok will continue to emerge, but their growth will climb rapidly to a plateau. The decades-long push for growth that fueled platforms like Facebook and Twitter through the 2000s and 2010s dovetailed with the proliferation of smartphones. Now that the smartphone market is saturated, social media companies can no longer look forward to a constantly expanding frontier of new users to sign up.

Relying on content algorithms to retain existing users or coax back those who have already left, platforms accelerate the ennui of optimization. This leaves precious little room for new types of content or new talents to emerge. Still, people will entertain each other. Those who create art will seek approval and criticism. Others will seek out new and exciting art and entertainment to enjoy. When there is no room on social media to put these groups of people together, they will find each other in new (old) ways: on the street.

You may have recently heard that machines are going to solve the problem of creating new and engaging content for people to consume on the internet. AI models like DALL-E, Stable Diffusion, GPT-3, various deepfake models for video, and others use the oceans of existing images, text, audio, and video to generate new content on demand. Some of these models, such as Nvidia’s StyleGAN, are capable of producing content indistinguishable from reality. Artists are beginning to win prizes with AI-generated work. AI-generated actors are appearing in media speaking languages they don’t know, wearing bodies decades younger than the ones they inhabit in reality. GPT-3 is a “shockingly good” text generator which prompted the author of a breathless article in this month’s Atlantic to swoon. “Miracles can be perplexing,” Stephen Marche writes in the article, “and artificial intelligence is a very new miracle…. [An] encounter with the superhuman is at hand.”

Some critics of these AI models argue that they will prompt a crisis of misinformation. Deepfakes may convince people that the President of the United States declared war on an adversary, for example, or a deepfake porno video could ruin a young person’s life. These are valid concerns. More overheated critics suggest that AI may one day surpass human intelligence and may, therefore, hold power over its creators the way masters hold power over pets. Setting aside the Social Darwinist overtones of this argument—that “intelligence,” exemplified by the mastery of texts, translates automatically to power—machine learning algorithms are limited by the same content challenges facing social media. AI may create absorbing new universes of art and sound and video, but it can only generate content based on the existing corpus, and it can only distribute that content on existing networks. People have to create new texts for AI to master. The willingness of a continuous army of new users to generate these texts and upload them to the phenomenological internet of social media and streaming video, where they can be easily aggregated and made accessible to machine learning models using APIs, is declining. The same types of algorithms that prompted Stephen Marche to proclaim a New Miracle in The Atlantic are driving the most successful corporations in history right off a cliff as I write this.

These critiques of AI-generated content assume that people will continue to scroll social media and engage with the things they see there in ways similar to their behavior over the past decade. In this model, to review, users scroll through an endless stream of content. When they see posts that inspire or provoke, impress or irritate, they are encouraged to like, comment, and share these posts with their friends and followers. The content may be endless, but the people on both sides of the transaction are the most important elements in the decision to like, comment, or share. Users are impressed or provoked not by the content itself but by the connection it represents with other people. They respond and share this content performatively, acting as a bridge or critic between the people who created the content–and what they represent–and their friends and followers. If you remove enough of the people, all of the content loses its value.

At a more fundamental level, people are the appeal of any creative work. Art without an artist is a bit like clouds or leaves: these may be beautifully or even suggestively arranged, but they offer no insight on what it means to be human. GPT-3 may tell a story, but it does so mimetically, arranging words in a pattern resembling something that should please a human reader. You may level the same criticism at your least-favorite author, but at least they would be insulted. GPT-3 will never feel anything.

AI-generated content will neither solve the content problem for platforms nor prompt a further crisis of misinformation and confusion for users. AI content will be the nail in social media’s coffin.

As a result of these interlocking trends–the crushing ennui of “content,” the decay of social media, the dearth of new smartphone users, and the incompatibility of AI-generated art with human needs–“culture” is likely to depart the algorithmic grooves of the internet, sprout new wings offline, and take flight for new territory. Perhaps, once it is established there, the internet will catch up again. Perhaps then software will try, once again, to eat the world. This time it has failed.

What Am I Doing Here?

1.

Outside my window there is a shaft of sunlight streaking across the fence. The fence is dark brown, a color that hasn’t been popular since the 1970s, and the local paint store keeps a formula in the notebook just for our condo association to paint the fences and trim. They call the color “Westwood Brown,” and if we ever decide to change the color scheme here they will probably hesitate to pull the color out of the book because it is like a collector’s item now. It is a story they can tell to new employees. Even so, if you catch it when the light is just right, like right now, Westwood Brown can transport you to a different time–the Halston era, the epoch of the land yacht Coupe de Ville. It is the golden hour and the light points, like a celestial digit, straight at the spreading petals of the brilliant flame-colored bromeliad my wife planted along the fence. I think, I should take a picture of that.

I am unsure if it is me thinking about the picture, or if it is Instagram thinking it for me.

This week I quit Twitter. It’s not like I had much of a presence there, so we will be fine without each other. I had something like 150 followers, followed around 1,200 accounts, and scrolled over there occasionally when the rest of my addictive scroll-holes were drying up for the day. Lately I had been playing a simple game every time I opened the app. If the first post was someone: 1.) wailing about the hypocrisy of the other side in the culture war, 2.) flexing their success, or 3.) piling onto the controversy du jour, I would close the app and move on. I have had very little reason to scroll beyond the first screen in the past few weeks.  

I’ve quit Twitter and deleted Facebook before, but I was encouraged to close my Twitter account this time by the spate of articles that popped up this week frankly raising the question: what are we doing here? Quinta Brunson’s thoughts on minding your own damn business afford the best example, but the algorithms must have registered some little chemical twist in my pituitary because I kept running across pieces, like this profile of Twitter-person Yashar Ali, or this interview with someone who quit social media, that chip away at the foundation of social media’s necessity by questioning whether we need it or whether the people we see there are as significant as they appear. 

The cynical among you are likely to say that these are dumb questions; that of course social media people are unworthy of our attention, and of course we don’t need it. I think making this claim is a bit like playing a character, though: that of the discerning sage, you probably think, or the intelligent free-thinker standing on a stage opposite the vapid follower and the bankrupt influencer. Which of you is Malvolio and which is Toby Belch will depend upon the attitude of the viewer, however. The postures and costumes are different, but from far enough away the result is the same. They strike us now as just a couple of old assholes, rendered immortally luminous by a poetic genius of world-historical importance. On the internet, there is no poetry to illuminate them. Both characters are constricted by straitjackets of bullshit, and both of them seem terribly unhappy.

That brings me in a roundabout way to the question I sat down to ask in the first place. What am I doing here? I’m pretty sure I’ve asked this question before, but I don’t want to go back and check. That would be depressing, like reading an old poetry notebook or flipping through an old diary from high school. I’ll reframe the question this time instead and try to write my way to an answer. Is what I’m doing here, whatever it is, balancing out some of that unhappiness? Is that even possible?  

2.

Lately people have been writing about how much they miss the old internet. The new internet is too vanilla, they argue, too boring and perversely commercial, like a shopping mall. I am normally inclined to agree, but the other afternoon I woke up from a nap and read about alternate reality games and obscure social media characters on Garbage Day. Perhaps it was the residual triazolam and nitrous oxide left over in my system from oral surgery that morning, but I had trouble making sense of it in the same way that I once stumbled gape-mouthed across arcane forums and exotic communities like a rube from the meandering suburbs of America Online. The old internet, with all of its nonsense, all of its randomness and quirky passion, is still with us. I’m looking at you, Malvolio, for the next line. Of course it is, you might say. It’s all just people.

But did the old internet make people happy? The internet seems always to have been a contested space, a rambling assemblage of insider communities whose best days are very recently gone. I remember my first connection, a dial-up hotline to AOL across endless air-conditioned days and nights in the summer of 2001. I was late to the party, five years behind my more affluent peers, and anxious to make up for it in sheer eagerness. I joined email lists, clicked through webrings, posted to forums. I had been led to believe, probably by bewildered news anchors or breathless magazine articles, that the web was a brand new thing. What I found instead was a bunch of conversations with no beginning and no end. It seemed like the authors of my favorite web pages had all recently stepped away from their keyboards. Communities were cantankerous places, hostile to newcomers, where old fires never seemed to burn themselves out. There was the Eternal September of 1993, for example, named for the time when all the noobs from AOL flooded Usenet and never left. At least that’s how the old Usenet admins have it. For my part, I started moderating a mailing list and stepped right into the middle of flame wars dating back years. Old-timers had to bring me up to speed on the old arguments so I would know how to intervene. New trolls arrived weekly. Maybe it was a bad list, but anyone who’s spent time moderating an online community knows the struggle. 

I think of this when I wonder why we all flocked to MySpace, and then Facebook, so rapturously. The old internet was a place made of text, and consequently a place where strangers spread their shit all over you. You couldn’t avoid it. If you wanted to communicate, you had to work at it. You had to learn the jargon, nod along with inside jokes you didn’t always find funny, assimilate opinions you couldn’t publicly examine. MySpace and Facebook were made of pictures, blissfully free of ideas that couldn’t be communicated at a glance. Facebook remains so. Your mere existence, your possession of a face, is the only cost of admission.  

3.

Neither the old internet nor the new, then, has made us any happier than we were in the time before. Rather than happiness, what we’ve gained is access to a sort of dark power bubbling in the river of instant and endless information flowing through the pipes. We use this information to construct stories, and it is through these stories that we harness the power to do work in the world. I can’t help but feel as though we’ve designed the internet to give us information which reinforces the worst stories we tell about ourselves. These are stories of growth and progress in which we appear better and smarter than anyone who lived before us. These are also stories that make us feel empowered as individuals, unique among lesser peers. Meanwhile the algorithms conditioning the flow of information enable us, in a vicious feedback loop, to look away from the stories we’d rather not tell–stories of stasis or declension, of similarity and solidarity.

There is another force flowing in those dark waters as well: the emotive power of suggestive blank spaces. Memes are like atoms, rich in potential and eager to freely associate, but information devoid of context is like a shadow divorced from its object. We can only guess at its sources and meanings. This is precisely the type of information the internet is engineered to deliver, however. You ask Google a question and it gives you an answer. 

Google’s brand is based on authority and correctness, at least. The rest of the internet is built to keep you engaged. If content exists in a sort of Darwinian state of nature, the most successful information is that which makes you feel. Reddit and the chans are factories producing and serving up the most emotive content. You scroll TikTok or YouTube and the algorithm serves you videos according to their likelihood of keeping you engaged. Twitter and Facebook serve up bite-size nuggets of emotion on the feed. News editors engineer headlines to galvanize you to action. It is not that microblogs, articles, memes, pictures, blurbs, and short videos are incapable of rich contextualization; it is that the creators focused on context are not as successful as those focused on engagement.    

Part of what I am trying to do here is counter these tendencies. I am drawn to the stories we prefer not to tell. As a historian, for example, I am fascinated as much by continuity as by change over time. No matter the era in which they lived, informants in the archives shared their similarities with us as readily as they disclosed their differences. Our motivations echo theirs. Many of our creations fall short of theirs. Like strangers on the internet, they rub their shit all over us too. We would do well to wallow in it, though, because we cannot engineer a new world as easily as we can engineer a new user experience. Our culture is built from theirs. We live in the cities and towns they built. We speak their language, worship their gods, read their books. Rather than seeing ourselves as disruptors or innovators, then, we might benefit from seeing ourselves as cautious trustees of that world: careful fiduciaries focused on moving slowly and maintaining things rather than moving fast and breaking them.

This perspective need not be conservative. The narrative of constant growth and improvement is the guiding myth of capitalism, after all. I think building an alternative grounded in context, focused on capturing the prosaic or humanizing the proletarian (to the best of my meager ability, at least), can make us feel a little more anchored in the swift currents of a society built to pick our pockets and power over us by maintaining a constant state of instability. If this approach can make the internet a little bit of a happier place, then maybe I’m doing some good here.

4. 

I failed to mention at the beginning of this essay that I am not necessarily a reliable narrator. 

I actually don’t know whether I’m achieving any of the lofty goals I just described. Perhaps, like an artist’s statement, everything I said up there illuminates the principles organizing my work. I like the sound of that but I’m not sure it’s true. The truth is that I really just work on ideas that I like without worrying about how they fit into some schema. If I admit this, however, then my work in this essay isn’t done and it remains for me to answer: does doing this make me happy? Now I can’t hide behind a shield of analysis. This just got scary.    

Writing for me is a form of exorcism. If I go more than a few days without writing something, anything, I can feel a sort of dark pit forming somewhere inside me. I’ve come to realize that this darkness is death stalking me, as it stalks all of us, from somewhere just outside of my peripheral vision. The longer I go without writing–or, to a lesser extent, creating other things like images or music–the closer it gets, until the feeling of hopelessness is almost unbearable. This sounds like an acute illness, I understand, but each of us is striving to overcome this darkness in our own way every moment we are alive. Some of us achieve it through devotion to family. Others achieve it through friends or work; some achieve it with drugs. Writing is what works best for me.

Every word written is written for an audience. This is obviously true for articles and essays like this one, for books, and so on, but even a journal is just a book we write for our future selves. With this in mind, a few years ago I thought: if I’m writing simply to stay alive, why not publish it? It takes so many rejection letters to get an acceptance, and I don’t have time (this line of thinking goes) to develop a whole submission and tracking process in addition to working, studying, trying to shift gears and be creative, and then somehow writing and finishing some harebrained idea in the first place.

The narrative of progress through technology is here to make me feel good about this. With the rise of Substack and the constant firehose of essays like “No One Will Read Your Book” or “10 Awful Truths About Publishing” or this NY Times article which found that 98% of books released last year sold fewer than 5,000 copies, there is no better time than now to rethink publishing altogether. There are currently 7,614 markets and agents listed on Duotrope. Many of the “lit mags” on the web and indexed by Duotrope are labors of love undertaken by one or two individuals. They went out and bought a domain and built a WordPress site, just like I’ve done here. What’s the difference?

Ask anyone who writes and they will point out the problem right away. There are few things more pathetic than a self-publisher. Publishing my own work here helps to exorcise that darkness, but it will never feel good enough. The point of publication–especially now that we all have the resources to publish whatever we want–is that someone else thinks what you have to say is worth amplifying. Publishing your own work is like an admission of inadequacy. It feels like saying, I don’t think this is good enough to publish or, even worse, nobody else thinks this is good enough to publish.

This is an indoctrinated opinion. Just like those stories of endless progress that troubled me a few hundred words ago, this opinion serves a social purpose. It is likely that how you see that purpose depends on your bedrock identity. Perhaps you see publication as a form of competition that elevates the best writing over the mediocre, driving everyone to greater heights of achievement along the way. Or maybe you see it as a limiting mechanism that functions–either intentionally or not–to push out voices that challenge dominant opinion. 

Since I am now in the personal and supposedly truthful part of this essay, I should say that my own view changes based on how strongly I feel about my own abilities. When I am down, probably after a rejection or two, my opinion of publishing is decidedly Jacobin: down with the editors! At other times, probably when I’ve had something published or just written something that I feel good about, I’m as sanguine about the market as Adam Smith. 

Perhaps all of my opinions are like this and the problem I am struggling to write around is that we are supposed to be consistent. Do the algorithms know I am as changeable as the wind? Do they take advantage of the distance between my variable opinions and desires and how consistent I think they are? We value being but we are all, always, merely becoming. 

This is the real strength of a blog. A book is firm, stolid like our opinions are supposed to be. A book, we might say, is being. The internet is fluid, as variable as the flood of emotions and opinions shaping our daily experience. It is a space characterized by constant becoming, by revising, rethinking. Conversations here neither begin nor end. We’re all just passing by one another, sharing ideas inscribed with light and stored as charges on a magnetic array somewhere far away.

This website is my bridge between becoming and being.

This website is a bridge between the old web and the new.   

This website won’t make anyone happier, but it’s not for lack of trying. 

Water Oak

Quercus nigra

There is a tree in the small stand of forest where I take my lunch at work. It has grown from two woody solitudes, twisted in convergent forms like twins battling for supremacy of the same body. One of the twins has emerged triumphant since the plant took root, standing tall–as tall as a water oak can stand–above the other, which is bent toward its mightier sibling, rotting at the top, acceding to the victory of its twin. It is a tree like other trees. It does not tower. It has no lore. It lives upon its own insistence, feeding on what sunlight it can gather from its prosaic patch of earth sandwiched between the silent waste of the government parking lot and the incessant, hissing excess of Interstate 10. Today I and the twins will commune, like yesterday, reflecting upon our own insistent will to live. Feeding in silence.

I only notice the tree because it is nearest to my picnic table. There are no charismatic grandfathers or grandmothers in my lunch-wood, no booming fauna, no roaring water. Places like this are where most Americans experience nature. In my part of the country, these places are often gray and brown, bark and mud. Dominion of the water oaks.

A water oak is like a chameleon: adaptable, unpredictable. A prolific nineteenth-century observer of trees wrote of the water oak: “There is no oak in the United States of which the foliage is so variable and so different from that of the tree, on the young stocks and on the sprouts from an old trunk or from the base of a limb that has been lopped.” It favors wetlands but can grow indifferently on compact or sandy uplands. It is semi-evergreen in the South, taking on a showy yellow for a week or so before dropping its leaves according to its own schedule and sometimes not at all. It grows in polluted cities with poor soil and drainage as readily as it will grow in old fields or in the rich muck along the edge of wetlands. Water oaks don’t much mind drought–contrary to their name–but don’t particularly like strong storms, which can blow down their fragile trunks. One of these likely put an end to the weaker twin of my lunch-wood’s tree. Water oak flowers, last but not at all least, are brown like their fruit, which stains sidewalks and parking lots a deep tannic hue. In this way, then, water oaks connect my asphalt milieu to the impossibly murky rivers which cut their quiet way through the red clay far away from my little Southern city. Town and country, strong and weak, wet and dry: they cannot be reduced.

As they connect town and country for me, so, too, do they connect present with past. Like so many of the people I have known, water oaks are short and tough but prone to tragic deterioration. They die young, hollowed out by the age of forty, subject to every one of the world’s whims. Bits and pieces of the trees lie on the ground, bearing mute gray-and-brown testimony to past trauma. Lightning-scarred, savaged by birds and rodents, worsened and weakened by neighbors, seasons, companions, they fall and die by the age of fifty. I can’t help but think of my cousin when I imagine the tragedy of the water oak. The water oak is yours, Billy Yetman.

Happy Tastes What?

Dairy Queen says Happy Tastes Good.

I have nothing with which to refute the argument, but it unsettles me anyway. Somewhere deep down, it feels like we’ve come too far by now to accept such a bald, simple proposition. It was something that may have been true for our parents once, but not us, and most definitely not now. Heavens no. Those days are gone.

Associate Freely

I don’t know how to explain this cynicism, so I’ll try something easier instead. I feel confident asserting that “happy,” for us, now, must taste just a little bit real, a little bit off, to feel true.

To explain why, perhaps we could look at the spiraling dread that lurks beneath everything we read–surfacing at strange times, like in this seemingly innocuous pop culture article about why The Simpsons is no longer relatable that Digg thought I would enjoy reading to kick off my week. (They were wrong). Maybe it has something to do with the toxic dialectic of revulsion and self-reflection inspired by reality television shows like 1000-lb Sisters, in which two morbidly overweight sisters struggle with their emotions about food and exercise, or Shipping Wars, in which truck drivers underbid each other to win the privilege of hauling “unusual items” across the country. Maybe it’s Flip or Flop, where the money shot isn’t an orgasm but the itemized accounting, at the end of each episode, of the profits and losses on houses none of us will ever be able to afford. This is entertainment for a brutalized populace, trained to cheer for the profits of the haves and despise the failures of the have-nots.

Maybe most of all it is because the line between entertainment content and lived experience is finally thin enough to have disappeared. We’re all content-creators now. We’re out here taking part in the flame war between Wendy’s and Burger King, infinite-scrolling and hashtagging our opinions for strangers. Your favorite creators–from presidents and royals all the way down to the TikTokker working at McDonald’s around the corner–are just as worried about maintaining their para-social relationship with you as they are focused on getting the camera angle and sound just right on that next shot. “Happy” is only happy when it is widely shared, but most of us must accept that the things we share will cast but a tiny ripple on the surface of existence. For the rest, the wider their “Happy” is shared the more must they face the fact that there are many millions for whom happiness is unattainable, millions more for whom it is impossible. Simple pleasures, networked and amplified, amount to something a great deal more complex.

With all that in mind, here are two alternative slogans.

  • “Happy Tastes Pretty Good, but Don’t Get Used to it.” — I like this one because it is also very simple. It tells me what to think and it contains a call to action. Don’t get used to it. I can handle that.
  • “Happiness Tastes OK. Expect clouds this evening.” — I like this one because it is useful. It also contains a weather report, which does more for me than the many hours of house hunting/flipping content I’ve watched on languid weekend days. After all, information is power.

McDonald’s: Communion

A large Diet Coke and a medium fry.

It is the second day of January and I am on the fifth day of a “vegan cleanse.” I’m not quite sure what I am cleansing myself of. I agreed to the cleanse because it felt, somehow, like cleansing myself of the worst parts of American capitalism. Those parts that reduce thinking creatures to so many dollars realized per dollar spent, no matter the cost in misery, in health, or whatever other problems we imagine we could escape if we just had enough money. Enough money to be somewhere else when the knacker and the butcher come for us, too.

I said yes to the cleanse, but here I am: in the drive-thru line on a rain-soaked Saturday afternoon, waiting impatiently for the line to move up so I can find something, anything, without animals or their “products” in it to go with my drink. I still think like an omnivore, so I consider for a moment picking up a sundae as a surprise for my wife at home. It takes me a beat too long to realize why that is a bad idea, why it might be a different sort of surprise than what I had in mind. It’s got to be fries, I learn. That’s all there is.

There is something vaguely shameful about McDonald’s. Even if you don’t mind the cruel calculus of fast food, if you love the idea of an international supply chain delivering dripping death from the killing floor to the PlayPlace in a neighborhood near you, you’re still likely to feel a little shame waiting in the line at McDonald’s. You’d rather be somewhere else. You probably wouldn’t want a friend driving by to see you there. Perhaps our willingness to wallow in this shame, time and again, for food that nobody seems to actually like, is evidence that we’re all in an abusive relationship with the golden arches—but that is a thought to unravel some other day. Right now my order is up at the window.

I take a sip and pull around the building. My dog pokes her little shaggy gray head into the space between the driver and passenger seats, her eyes boring a hole in the bag of fries in the passenger seat. For her this is what it’s all about, really: food, always. I share a fry with her and think, in that warm glow of peace and well-being unique to salty, fried foods, that it’s much the same for us. Down there at the bottom of everything it’s always about food. Our lust for travel, our finest memories and innermost desires: food. Everything else, from sex to symphony orchestras, is a distant runner-up.

It is in this salty afterglow that I reflect, too, on the good radiating from this building. The teenagers working their first jobs behind the counter. The managers—if they’re good ones—imparting a new category of knowledge to their young charges. The meals on the table, directly and indirectly, for which this place is responsible. I think about the homeless men and women inside, safe from the rain and cold in one of the few places that won’t turn them out. I think about my father-in-law, for whom McDonald’s is somehow a wonderful meal, and the complicated generational differences underlying my inability to understand his opinions about this and everything. I think about my mother, who passed a tiny portion of the damage our culture has done to women down to me in the form of a deep and abiding taste for Diet Coke.

This place smooths, in a small way, the jarring gaps of age, of race, of gender and wealth and sensibility and morality that divide us all. Maybe that good flowing out of the front door offsets some of the bad flowing in through the back. I take a sip. I pass a fry to my dog. She takes it, ever so gently, from my hand and retires to the back seat. She’ll be back for more before I can finish my own. I eat another fry, savoring the salt, the hint of oil, and think, soon the rain will stop. Out in some featureless place, where I don’t think it ever rains, McDonald’s will continue turning cows into dollars. Tomorrow my cleanse will continue. But today we have shared some fries, we two, and that is all we really need.

Muir’s Notebooks: Thinking and Working in the Age of Distraction

Not long after the guns of the Civil War fell cold in the 1860s, John Muir opened a notebook and inscribed his name on the frontispiece. “John Muir, Earth-Planet, Universe,” he wrote, situating himself as firmly as any of us may hope to do. And then he started walking, a thousand miles or so, to the Gulf of Mexico. Muir set out on the first of September 1867 on the “wildest, leafiest, and least trodden way I could find,” and his excitement was palpable when he reached Florida six weeks later. “To-day, at last, I reached Florida,” he wrote in his journal on October 15th, “the so-called ‘Land of Flowers’ that I had so long waited for, wondering if after all my longing and prayers would be in vain, and I should die without a glimpse of the flowery Canaan. But here it is, at the distance of a few yards!”

A characteristic Florida view.

Muir undoubtedly walked a long way from Indianapolis to Georgia, but he cheated his way into Florida, booking overnight passage on a steamboat from Savannah to Fernandina. Perhaps that’s why he felt so down and out after an easy half-day and night of conversation and loafing aboard the steamer Sylvan Shore. “In visiting Florida in dreams,” he wrote, “I always came suddenly on a close forest of trees, every one in flower, and bent-down and entangled to network by luxuriant, bright-blooming vines, and over all a flood of bright sunlight. But such was not the gate by which I entered the promised land.” What he found, instead, was a tangle of marsh and swamp, a hopelessly flat vista broken only with “groves here and there, green and unflowered.” Dropped unceremoniously on this inauspicious shore, without even breakfast to ease his way into the new world, Muir was overwhelmed. The peninsula was “so watery and vine-tied,” he reported, “that pathless wanderings are not easily possible in any direction.” He made his way south from the gloomy coast down the railroad tracks, “gazing into the mysterious forest, Nature’s Own.” Everything was new. “It is impossible,” he wrote of the forest along the tracks, to convey “the dimmest picture of plant grandeur so redundant, unfathomable.” Sometimes I feel the same way, though I’ve lived here longer than Muir had been alive when he walked down the lonely rail line trying to make sense of the place.

I picked up Muir’s book recounting the journey a hundred and fifty years later because part of that very long walk took place in Florida, and I am filling up my own notebooks here on Earth-Planet, Universe with the starry-eyed hope that another book about Florida may one day emerge from their pages. Unlike Muir, though, I can draw on an infinite library of books, videos, field guides, and brochures to reduce the unfathomable grandeur of Muir’s nineteenth-century gaze to the qualified certainty of my twenty-first-century gaze. On a different shelf in my office, for example, I can pull down the Guide to the Natural Communities of Florida. I can leaf through the 81 varieties of land cover the authors have identified in the state until I find the one that Muir was likely to have found along his lonely railroad track: Mesic Hammock. “The shrubby understory may be dense or open, tall or short,” the Guide reports, “and is typically composed of a mix of saw palmetto (Serenoa repens), American beautyberry (Callicarpa americana), American holly (Ilex opaca),” and so on. Maybe I can pull down the field guide to plants and trees, then; or, perhaps, just type their names into the Google search bar on my phone and find out just about anything we know about these thorny, prickly plants with just a few taps.

Callicarpa americana

The sort of deep botanical knowledge Google offers to any armchair naturalist today is what Muir hoped to gain as he explored the little-traveled paths of the South. He set out to find it by tramping through the vines, turning over the ground cover, taking notes, making impressions of leaves and flowers. With only hardbound botanical guides to aid his memory—paperback books then only existed as pamphlets and dime novels, not scientific guides—we can imagine the kind of notes that Muir would need to take to remember it all. Most of all, he had to know how to look, how to take in enough information about a plant shaded by drooping beautyberry branches or hidden beneath the cutting blades of a saw palmetto a few feet off of the trail to describe it later or look it up if he didn’t know what it was. Muir did not have the luxury of a camera in his pocket, connected to an electric warren of machines making inferences from the collective learning of scientists and thousands of amateur naturalists to identify the plant instantly. Muir had to live with it for a while, turning it over and over in his mind until he could write it down. He had to bring some knowledge to the field with him, to know the important parts to remember. Muir had to work for it. 

I’ve used apps to identify plants, and they are wonderful. You snap a picture of a flower, or a whorl of leaves, press submit, and like magic a selection of possible candidates appears. It only takes a moment more of reading and looking to positively identify the plant before your eyes. There is no need to walk the laborious path down a dichotomous key—a series of this-or-that questions people use to identify plants and trees in the field—or stumble through the obscure chapters of a specialized field guide. If a naturalist today can download identifying data to their phone, and if they bring a battery backup or two into the field, the old field guide is as obsolete as the buggy whip. Problem solved, right?
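A dichotomous key, under the hood, is just a binary decision tree: each couplet is a this-or-that question, and each leaf is a name. Here is a minimal sketch in Python; the questions are invented for illustration (a real key runs dozens of couplets deep), and the species are the ones already mentioned above.

```python
# A toy dichotomous key as a binary decision tree. Each Couplet poses a
# this-or-that question; each leaf is a species name. The questions are
# invented for illustration, not taken from a real key.

class Couplet:
    def __init__(self, question, yes, no):
        self.question = question  # the this-or-that question
        self.yes = yes            # next Couplet, or a species name, if yes
        self.no = no              # next Couplet, or a species name, if no

key = Couplet(
    "Are the leaves fan-shaped blades on a spiny stalk?",
    "saw palmetto (Serenoa repens)",
    Couplet(
        "Are the fruits bright purple and clustered along the stem?",
        "American beautyberry (Callicarpa americana)",
        Couplet(
            "Are the leaf margins armed with spines?",
            "American holly (Ilex opaca)",
            "water oak (Quercus nigra)",
        ),
    ),
)

def identify(node):
    """Walk the key, asking the observer one question at a time."""
    while isinstance(node, Couplet):
        answer = input(node.question + " [y/n] ").strip().lower()
        node = node.yes if answer.startswith("y") else node.no
    return node

if __name__ == "__main__":
    print("Identification:", identify(key))
```

The structure Muir carried in his head is the same one the apps now traverse for us, silently and in milliseconds.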

The internet, and by extension our whole lives now, thrives on this promise of problems solved. The old “fixed that for you” meme sums up the mindset, but you have to go a step beyond the meme’s use in the culture wars (the internet’s stock-in-trade, after all) to get there. If you don’t know it, here’s the culture war setup. Somebody posts an opinion you don’t like on the internet. You strike through the words you don’t like and replace them with other words that you do like. Then you post the altered text in the comments of the original under the simple heading, “FTFY.” For example, if you wrote a tweet that said, “I love Twix!,” some wag might respond: “FTFY: I love Twix Reese’s!”–with “Twix” struck through. Though your interlocutor would be wrong—Twix is undoubtedly the superior candy—unfortunately the stakes are often much higher. For a while, FTFY was the perfect clap-back to a Trump tweet or a Reddit post. Like all things on the internet, however, FTFY’s popularity is fading away by sheer dint of use. Here’s an example I found on Google in case you are reading this after the meme has completely disappeared.

FTFY
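In case that screenshot, too, eventually disappears, the whole move reduces to a bit of string surgery. A toy sketch in Python, using the Unicode combining long stroke as a stand-in for the strikethrough markup that plain text lacks:

```python
# A toy FTFY generator. U+0336 (combining long stroke overlay) fakes
# strikethrough in plain text; real posts would use platform markup.

def strike(text):
    # Overlay each character with a combining long stroke.
    return "".join(ch + "\u0336" for ch in text)

def ftfy(post, replacements):
    # Strike out each offending word and splice in the "correction."
    for old, new in replacements.items():
        post = post.replace(old, strike(old) + " " + new)
    return "FTFY: " + post

print(ftfy("I love Twix!", {"Twix": "Reese's"}))
# FTFY: I love T̶w̶i̶x̶ Reese's!
```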

FTFY is a successful meme because it works on two levels. The first is merely discursive: here is an alternative point of view. If you go back and read one of the breathless essays, from before 4chan and Trump, on the democratic promise of the internet, you’ll see a lot of this. The internet is a place for people to express their opinions, and isn’t that good? Mark Zuckerberg still relies on this discursive level to justify Facebook. “Giving everyone a voice empowers the powerless,” he told a room full of people at Georgetown University last year, who, for some reason, did not burst into uproarious laughter, “and pushes society to be better over time.” If this were the end of communication—I speak, you listen; you speak, I listen—then Zuckerberg would be right and FTFY would be innocuous. The second level of meaning is why anyone uses the meme in the first place, though.

The second level is philosophical: here is a self-evidently correct point of view which shows that you are wrong and I am right. Someone using FTFY intends to point at differences of opinion and erase them at the same time. This creates a sort of nervous thrill in the reader, who revels in the shame of the erased whether they agree with them or not. It has no effect on the author beyond alienation, but the point is not to persuade anyway. It is to profit, in the social and psychological sense, by signaling one’s virtue in exchange for internet points. Rinse and repeat.

Facebook, Reddit, Twitter, and others turn shitposters’ play points into real dollars and power through the intentionally obscured work of software algorithms. Thanks to this perverse alchemy, which converts mouse movements and button-presses into trillion-dollar fortunes, social media excels at delivering us to these impasses of opinion, where we can only point and gasp at hypocrisy for the benefit of those who agree with us. We call this free speech, but it feels like something else, like a sad video game we play on our phones in bed until we fall asleep and the screen slowly goes black. FTFY.

Software’s been Fixing That For You since the 1950s. It started off slowly, the awkward preserve of reclusive engineers, but–I don’t have to tell you this, you already know–grew in scale and intensity like a wild avalanche until now, when it holds the power, depending on which expert is holding forth, to either destroy life on the planet or usher in a new era free of death, pain, and inequity. This bestows upon software the elemental power of nuclear fission. Until recently, we’ve accepted it without nearly as much hand-wringing. Is it too late?

The world-eating logic that propels software’s growth is “efficiency.” This is the Fix in FTFY. In his recent book, Coders, Clive Thompson describes the “primal urge to kill inefficiency” that drives software developers. “Nearly every [coder]” he interviewed for the book, Thompson writes, “found deep, almost soulful pleasure in taking something inefficient and ratcheting it up a notch.” I understand this urge. At work I have spent the same hours I would have spent downloading and renaming files writing a script to download and rename them instead. I’ve coded macros to make it easier to populate fields on contract templates instead of confronting the banality of existence by editing Microsoft Word documents manually. Because of this urge, coders and capitalists argue, nearly everything we do is more efficient today as a result of software than it was ten years ago. As 5G transmitters make their way to cell towers around the world, the same argument goes, nearly everything we do tomorrow will be more efficient than it is today. We accept this, the way we accept new clothes or new toys.
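For the record, the kind of script I mean looks something like the sketch below. The URL list and the naming scheme are placeholders, not the actual ones from work, but the shape of the chore is the same.

```python
# A sketch of the download-and-rename chore, automated. The URLs and
# the naming scheme are placeholders standing in for the real job.
import urllib.request
from pathlib import Path

urls = [
    "https://example.com/files/report-latest.pdf",
    "https://example.com/files/report-prior.pdf",
]

dest = Path("downloads")
dest.mkdir(exist_ok=True)

for i, url in enumerate(urls, start=1):
    # Rename on the way down: a predictable scheme beats whatever
    # arbitrary name the server hands out.
    target = dest / f"contract_{i:03d}.pdf"
    urllib.request.urlretrieve(url, str(target))
    print(f"saved {url} -> {target}")
```

The hours spent writing it roughly equal the hours it saves, which is rather the point.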

Onerous luggage

We shun or diminish the things that software displaces. Landline phones are not merely obsolete, for example. They are laughably so. The checkbook register my teachers labored to teach me in school simply vanished some time around 2005. I left $2,000 worth of CDs sitting next to a dumpster when I moved away from my hometown in 2008 because I had ripped them all to my computer and had an iPod. (I would later deeply regret this decision). Typewriters are a cute hobby for rich actors, rather than tools so vital that Hunter S. Thompson carted his IBM Selectric II from hotel to hotel on benders for forty years. Rejecting these things feels as much like a social gesture as a personal one. Who wants to be seen writing a check at the store? Who wants to talk on a landline phone?

Shunning inefficiency strengthens our commitment to software. This brings me back to Muir’s notebook. Muir had to see, to remember, to write once in his notebook and then write again to turn those notes into something useful. Seeing and remembering, rather than taking a picture: inefficient. Looking things up in a book when he returned from the field: inefficient. Taking notes on paper: inefficient. And yet I find when I go out into the woods with my phone, tablet, or computer and do what Muir did I see very little and remember even less. I write nothing; and nothing useful, beyond a beautiful afternoon and a vague green memory, comes of it. 

This is mostly my fault. I could use these powerful tools, I guess, to cash in on efficiency and make something even better. But I don’t. Instead, I get distracted. I pull out my phone to take a picture and find that I have an email. I scroll Twitter for a moment, then Reddit, until I am drawn completely into the digital worlds on my screen, shifting from one screen to the next until I manage, like a drunk driver swerving back into his lane, to pull my eyes away. There is a moment of disorientation as I confront the world once again. I have to struggle to regain the revery that drove me to reach for the phone in the first place. This part is not completely my fault. The dopamine-driven design language that drives us to distraction is well known. If I manage to overcome this pattern somehow and actually take the picture, it goes to Google Photos, one of several thousand pictures in the database that I will never seriously think about again. When I take notebooks into the woods, with pen and pencil and guide book, I do remember. I see and think and make things that feel useful.  

More than merely remembering what I’ve seen, working without computer vision helps me see and learn more than I did before I put pencil to paper. Because I am a historian, always looking backward, my mind turns once again to old books and ideas. I am reminded of the nineteenth-century art critic, writer, and all-around polymath John Ruskin. Ruskin understood the power of intentional sight–the practiced vision aided by the trained eye of an artist–as a key to deeper understanding. “Let two persons go out for a walk,” he wrote in one thought experiment; “the one a good sketcher, the other having no taste of the kind.” Though walking down the same “green lane,” he continued, the two would see it completely differently. The non-sketcher would “see a lane and trees; he will perceive the trees to be green, though he will think nothing about it; he will see that the sun shines, and that it has a cheerful effect, but not that the trees make the lane shady and cool….” 

What of the sketcher? “His eye is accustomed to search into the cause of beauty and penetrate the minutest parts of loveliness,” Ruskin explained. “He looks up and observes how the showery and subdivided sunshine comes sprinkled down among the gleaming leaves overhead,” for example. There would be “a hundred varied colors, the old and gnarled wood…covered with the brightness; … the jewel brightness of the emerald moss; …the variegated and fantastic lichens,” and so on. This, I argue, is the vision of the unaided eye in the twenty-first century. Unencumbered by the machines that reduce our experience to arrays of data, we can see the world in new and more meaningful ways.

In a widely anthologized tract, the Renaissance artist Leon Battista Alberti wrote, “you can conceive of almost nothing so precious which is not made far richer and much more beautiful by association with painting.” I think Ruskin would have agreed. He jump-started his career through a full-throated endorsement of the Romantic master J.M.W. Turner. The artist’s deft, rapid brushwork conveys the depth of painterly vision. “Indistinctness is my forte,” Turner rather famously wrote; but I see something rather more distinct than mere visual reproduction at work here. I digress.

More than a renowned art critic, Ruskin was an influential social reformer who believed that adult education, especially education in art, could relieve some of the alienation and misery suffered by workers who spent the majority of their lives operating machines. Workers in Ruskin’s era struggled for the 40-hour work week, deploying the strike, the ballot, and the bomb for the right to enjoy more of their own time. Twenty years after his death, workers throughout the industrialized world seized the time to pursue the sort of self-improvement that Ruskin longed for them to enjoy. Because we can only believe in what Milan Kundera called the “Grand March” of history–that things are better today than they were yesterday, ever onward–we forget the flush of literacy, creativity, and prosperity that blossomed with the passage of the eight-hour workday. Some thirty years later, my grandfather still enjoyed the sort of self-actuated existence Ruskin advocated. 

Pop managed a water filter warehouse in Jacksonville, Florida, for thirty years after recovering from a gruesome leg injury he sustained in North Africa in 1944. At night, when my dad was a child, Pop took a radio repair correspondence course. He never finished high school but devoured books nonetheless, especially anything he could get on Nazism. He had a doorstop copy of Shirer’s Rise and Fall of the Third Reich on his living room chair. He took subscriptions to magazines, Popular Mechanics alongside the Saturday Evening Post–nothing highbrow but dog-eared anyway–and read the newspaper religiously. There wasn’t much television to watch. Father and son built models together. They went fishing.

It was not a golden time by any means. Pop was a brooding, difficult man. He kept a bottle of gin hidden in the yard. He nursed grudges and pouted over a spare dinner of Great Northern beans. He dealt silently with a gnawing pain from the war in North Africa, it seems, until he couldn’t hold it in, dressing up in his army uniform one time in the depths of a quietly furious drunk and threatening to leave the family. I don’t imagine he read his books and magazines when the black dog drove him to the bottle, but I hope he could take comfort in ideas nonetheless. My dad does. He chased away the lumber yard blues on Sunday night watching Nature on PBS and reading Kerouac on the side of the couch illuminated by the warm light from the kitchen. He executed masterful oil paintings on the kitchen table, weeknight after weeknight, amassing a room full of work that would make the neighbors gasp with delight at the jewel box in the back bedroom of the unassuming apartment upstairs. He passed some of this down to me, in turn, though I will never have the talent or the patience he poured into his work. I hope Pop gave that to us.

Pop was not alone in his evening pursuits, but it is hard to imagine a similar man pursuing the same interests today. In 2018 the Washington Post, interpreting survey results from the Bureau of Labor Statistics, reported that the share of Americans who read for pleasure had reached an all-time low, falling more than 30 percent since 2004. The share of adults who had not read a single book in a given year nearly tripled between 1978 and 2014. It is tempting to blame the internet and smartphones for this decline, but it began in the 1980s, according to the Post. Screens account for this change. Television, firstly and mostly, but computers, too, and now phones and tablets. I have stared at a screen for ten hours today. There are still at least two hours of screen time left before I will lovingly set my phone in its cradle by the bed and fall asleep. I am not wringing my hands over the death of books. Ours is a highly literate era, awash in information. Drowning in text. I am wringing my hands over what seems like the dearth of deep thought, the kind of careful thinking that comes from reading without distraction, from looking without mediation, from quiet.

After a week tramping across the flat pine woods and swamps of North Florida, John Muir found himself in Cedar Key, a sleepy village on the coast which feels almost as remote today as it must have felt in the 1860s. “For nineteen years my vision was bounded by forests,” he wrote, “but to-day, emerging from a multitude of tropical plants, I beheld the Gulf of Mexico stretching away unbounded, except by the sky.” Then as now, however, Cedar Key was the end of the road. With no boats in the harbor and apparently little desire to move on to points further down the peninsula–and vanishingly few they would have been–Muir decided to take a job at a local sawmill and save money for passage on a timber ship bound for Galveston which was due to arrive in a couple of weeks. He worked a day in the mill, but “the next day… felt a strange dullness and headache while I was botanizing along the coast.” Nearly overcome with exhaustion and an overwhelming desire to eat a lemon, he stumbled back to the mill, passing out a few times along the way, and collapsed there into a malarial fever. “I had night sweats,” he wrote, “and my legs became like… clay on account of dropsy.” Muir, who had been uncertain whether he would even stay in town, instead spent three months convalescing in the sawmill keeper’s house at the end of the world in Cedar Key.

The fishing village a few years later.

Once he was strong enough to leave the house, the young naturalist made his faltering way down to the shore. “During my long stay here as a convalescent,” he recalled in his memoir, A Thousand-Mile Walk to the Gulf, “I used to lie on my back for whole days beneath the ample arms of… great trees, listening to the winds and the birds.” I have spent long days and nights in the hospital. It is nearly impossible to imagine even a half-day in a recovery room without the option of scrolling the internet, watching TV, playing a video game. I suppose, therefore, that I am thankful for software. It fixed boredom for me.

But still, Muir’s description of Cedar Key is warm, wistful. It is easy to imagine that these fever days spent listening to the waves and thinking about plants and birds and life beneath the spreading live oak boughs on the desolate gulf coast of Florida contributed in a significant way to who he was about to become. Just a few months later, Muir was in California whooping with delight in the Yosemite Valley. It was there that he became Yosemite’s Muir, the preservationist sage of the Sierra Club and father of modern environmentalism. But perhaps we should rename a little stretch of the quiet wooded shore in Cedar Key the Muir Woods, too. The time Muir spent there in forced meditation seems to have shaped the man, if only slightly, as the forces of wind and water in their slight but constant way shaped El Capitan. There was nothing to fix.

Chicago in the Rear-View

Well, it’s all over now save for the thinking. Of course, what is anything but thinking? Walking, eating, even breathing, are just thinking in motion. Travel is no different. The sights, sounds, smells, and feelings we seek by traveling are, at bottom, just another way of thinking through the world. The trip I’ve been thinking about for the last four months is finally over and I’m still unpacking it all, but I’m interested today in how we come to think about places in the first place. 

If you played Sim City 2000, you might remember a little easter egg in the game. Here’s how it worked: build a library in your city, click on the building to view details, and then click the button marked “Ruminate.” The game would then open a window containing an essay on cities by Neil Gaiman. I suggest playing the game in DOSBox and reading the essay in context, but you can read the short piece here if you don’t have the time. I first read Gaiman’s essay when I was about ten years old, and I’m convinced that it shaped the way I think about cities from the very beginning, because Sim City was the first tool I ever used to think about what a city is and how it works–and Gaiman’s essay tied it all together. Software can move you like that.

Ruminating in Sim City 2000

“Cities are not people,” Gaiman writes, “but, like people, cities have their own personalities.” When I think of Chicago I imagine a vast, brown machine straddling Lake Michigan, churning incessantly. A pulsating, breathing hybrid being made from people and steel and brick and concrete. On my second night in the city, now 950 miles away, I called it a “grand steamroller of a city… an unstoppable machine looming over the Great Lake,” and at the end of the trip I felt the same way. Of course I’m not alone in this characterization. Carl Sandburg famously described the “City of the Big Shoulders,” the “Hog Butcher for the World,/ Tool Maker, Stacker of Wheat,/ Player with Railroads and the Nation’s Freight Handler.” Anthony Bourdain called Chicago a “completely non-neurotic, ever-moving, big hearted but cold blooded machine with millions of moving parts… that will…roll over you without remorse.” Following Sandburg, we are inclined to see these millions of people, living like anywhere else, as some sort of thing, some lovable but impersonal monster chewing up corn and spitting out steel. Why? Maybe we give each other this idea of what a place is, and we travel to reinforce it. Maybe we travel because of it. Or, maybe Gaiman was right. Maybe Chicago really is a sprawling machine made of people. 

Bathymetric Map of Lake Michigan

On my first day in the city I was riding the Red Line train south into the Loop and it struck me as odd that all of this should be happening just a few thousand feet away from the cold, quiet depths of wild Lake Michigan. While the train raged through a tunnel, an image popped into my head of a smallmouth bass, ensconced in silence and ever-so-still, suspended in the water just a few hundred yards away from this roaring, clanging madness. In my imagination, a single little bubble escapes the fish’s slowly opening mouth. It meanders to the surface, where it contributes an immeasurably tiny voice to the symphony of noise swirling in the air around the city. It is amazing that these two things–electric locomotive and smallmouth bass–should exist in such proximity to one another, and it raises the question: is the fish part of the machine, too? Tennyson argued in his way that nature is “red in tooth and claw,” but in this place the traveler cannot help but feel that the order is reversed. The city is wild; the lake civilized. It’s all a matter of perspective, yes, but the frigid calm of the lake’s depths seems to offer a poignant counter-argument to the City for living in this part of the world. The fish does not move unless it must. The people living in the city are always moving, bundled against the killing cold. Maybe this is why the city seems like a thing unnatural: it moves when it should be still.

1733 Map of Chicago (Source)

The cold is unmistakable. The wind, infamous. It gets dark at 4:30 in the afternoon during this time of year. When I was there it was foggy and wet, muddy from the first snowfall. As the sun slid beneath the horizon and the long, cold night closed in, I thought too about how miserable Chicago must have been for the people who lived there hundreds of years ago. “Cities exist in location,” Gaiman says, “and they exist in time. Cities accumulate their personalities as time goes by.” Huddled against the cold, counting the days until the spring, Chicago’s early people–Native progenitors and European usurpers alike–must have cultivated a biting sense of humor and a firm work ethic to survive here. Joking to blunt the sharp edges of the cold and shorten winter’s long nights, then working feverishly in the warmer months to survive the cold again. The first Europeans came to know Chicago as a place to cross the river: once, twice, three times you could portage the Rivière Chicagou on this 1733 map. The city, as Gaiman suggests, accumulates its character across time and space. You stamp your feet when you’re cold. In Chicago, you cross the river. Over time, millions of people found their way to the portage. They stamp their feet to stay warm. They cross the river. They do it over and over again until they start to look like millions of moving parts and the city takes on a life of its own.

Images like these are the things we use to understand cities. I’m no closer to understanding Chicago today than when I boarded the plane to visit, but neither was Sandburg when he wrote:

“The bronze General Grant riding a bronze horse in Lincoln Park
Shrivels in the sun by day when the motor cars whirr by in long processions going somewhere to
keep appointment for dinner and matinées and buying and selling”

The city is what we project upon it. It is then what we project upon the projections. Add image upon image, time upon place, and the palimpsest can take on a life of its own, like Sandburg’s General Grant in the remainder of the canto:

“Though in the dusk and nightfall when high waves are piling
On the slabs of the promenade along the lake shore near by
I have seen the general dare the combers come closer
And make to ride his bronze horse out into the hoofs and guns of the storm.”

Chicago, I will miss you.

Grant Monument

Willie Taggart and the Hubble Constant Tension

There’s a problem with the universe.

Some people here in my hometown feel that way because the Seminoles are losing football games, but I’m talking about the real universe, the one out there. Right now, scientists who study the universe are puzzling over the answer to a simple equation: H₀ = v/d. This equation has been with us since 1929, when Edwin Hubble discovered that all galaxies, in all directions, appear to be moving away from us here on Earth. Plug in a couple of measurements, he found–the velocity at which one of these galaxies is moving away from the Earth, v, and the retreating galaxy’s distance from the Earth, d–and you should come up with a single number. This number describes the universe’s rate of expansion, and it’s called the Hubble Constant in honor of its discoverer.
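The arithmetic itself is simple enough to sketch in a few lines of Python. The galaxy below is hypothetical; the velocity and distance are invented for illustration, not real measurements:

```python
# Hubble's relation, H0 = v / d, applied to one hypothetical galaxy.
# The numbers are invented for illustration, not real measurements.

v = 1500.0  # recession velocity, in km/s
d = 21.4    # distance from Earth, in megaparsecs (Mpc)

H0 = v / d  # expansion rate, in km/s per Mpc
print(f"H0 = {H0:.1f} km/s/Mpc")  # -> H0 = 70.1 km/s/Mpc
```

The division is the easy part. As the rest of this story shows, the hard part is measuring v and d precisely in the first place.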

Edwin Hubble. From Wikipedia.

The Hubble Constant is important because it lies at the heart of how we understand the history of the universe. Using this measurement, physicists have inferred that the universe has progressed through three eras of expansion. First, they argue, it expanded very rapidly. This was the time around the Big Bang, billions of years ago, when the universe exploded into existence. Over time, the gravitational pull of a strange substance called dark matter slowed this initial expansion, but in the current era, a strange force called dark energy is speeding the expansion again. As you might guess, this model only works if the Constant is actually constant. For something that’s supposed to be a constant, however, the Hubble Constant has changed an awful lot.

When Hubble plugged in his numbers in 1929, for example, he came up with a number around 500 km/s/Mpc. That is around 95 miles per second per million light-years, I think, but scientists were never happy with that number anyway. It took a long time for physicists to reach consensus, in fact, but by the 1970s rival camps of researchers had at least settled into a pattern, with measurements of the “constant” ranging between 55 and 100 km/s/Mpc. Not only was the Hubble Constant inconstant, it was also controversial. When a group of physicists tried to solve the problem once and for all at a conference in Aspen in 1985, one of the participants recalled, “there really was no way to get the old timers to work with the young turks.” The controversy would continue.
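If you want to check that conversion yourself, the arithmetic is below, assuming one megaparsec is about 3.26 million light-years and one mile is about 1.61 kilometers:

```python
# Convert Hubble's 1929 value from km/s/Mpc to miles per second
# per million light-years.
H0 = 500.0             # km/s/Mpc

KM_PER_MILE = 1.609344
MLY_PER_MPC = 3.26156  # million light-years in one megaparsec

print(f"{H0 / KM_PER_MILE / MLY_PER_MPC:.1f}")  # -> 95.3
```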

Thirteen of the Aspen group realized that the problem was really about calibrating instruments to come up with a good measurement. When you try to observe something with precision from millions of light-years away, it turns out, there are a lot of things that can go wrong. I understand this. I tried to take pictures of the moon with my smartphone mounted to a telescope just last month and ended the night with two blurry photos and a bad mood. Even with the best equipment in the world and a PhD in astrophysics, trying to measure the radial velocity of a distant galaxy is really hard.

Astronomy is hard. Credit: me.

Enter the Hubble Space Telescope. If you were alive in the 1990s, you probably heard about the Hubble on the news or read about it in the newspaper, because it was originally launched with a flaw in its enormous mirror and had to be fixed by spacewalking astronauts in 1993. Once the extremely expensive mirror was corrected, the orbiting telescope completely changed the field of astronomy. Solving the Hubble Constant dilemma was one of its goals and, with fresh new glass, it worked. By the late nineties, scientists using the Hubble had narrowed the uncertainty in measurements of the Constant to less than 10 percent. With the Hubble data, finally, scientists could tell us how quickly the universe was expanding and, therefore, how old it is. Cue the celebration.

An image of the stellar system Eta Carinae taken by the Hubble Space Telescope. More information is here.

This celebration is coming to a halt. Over the past few years, physicists have been looking at new corners of the universe, using different tools and methods, and coming up with different values for the Constant depending on where they look. After measuring the light from exploding stars, one group argues that the value is 73. Two other groups argue that the number is 67, based on measurements of the cosmic microwave background, or 70, based on analysis of the light from red giant stars. Last month, another group of physicists published a paper using gravitational lenses to argue that the value is actually 77. That nice model of the universe, with rapid expansion, slowing, and speeding back up? It doesn’t work if the Hubble Constant is different in different places. Physicists are struggling now to come up with a new explanation for how the universe works.
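To feel the stakes, consider the “Hubble time,” 1/H₀: a back-of-the-envelope estimate of the universe’s age that pretends the expansion rate never changed. It deliberately ignores the slowing and speeding described above, but it shows how much a few km/s/Mpc matter:

```python
# "Hubble time" (1 / H0): a crude age estimate that pretends the
# universe has always expanded at today's rate.
KM_PER_MPC = 3.086e19        # kilometers in one megaparsec
SECONDS_PER_GYR = 3.156e16   # seconds in a billion years

for H0 in (67, 70, 73, 77):  # the competing values, in km/s/Mpc
    age_gyr = KM_PER_MPC / H0 / SECONDS_PER_GYR
    print(f"H0 = {H0}: ~{age_gyr:.1f} billion years")
```

The spread between 67 and 77 amounts to almost two billion years of cosmic history.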

Tell me if this sounds familiar. Some time after World War II, everyone agreed on a norm. In this case, it was a Hubble Constant somewhere between 55 and 100; but you can imagine, if you like, that the norm might be something like, Presidents release their tax returns. Everyone used that norm to build a really strong and comfortable system. In this case, that system was the standard model of cosmology; but you can imagine, again, that it might be something like, postwar American prosperity. Then, new tools gave everyone a lot more information, and the norm started to break down. People started putting that information together in new ways, and, for them, the norm–the Constant–should be 73, or 67, or 77. The evidence supports their claims. It’s all just a matter of where they choose to look and what they choose to measure.

This reminds me of Willie Taggart, the unfortunate head football coach at Florida State University. Some of the facts about Coach Taggart are hard to dispute. Taggart was hired in late 2017, after the ignominious departure of the last coach, Jimbo Fisher. Taggart inherited a football team that was in trouble. Fisher had not recruited well in his last season. Many of the team’s most talented players had graduated in the two years before. Taggart promised success through simplicity nonetheless. He achieved a record of just 9 wins and 12 losses before he was fired by the university on November 3, 2019.

This is where we are right now: awash in so much information, so many points of view, so much evidence, that just about everything can be argued coherently. While Taggart was here in Tallahassee–I assume that he’s gone now, just to get away from the smoldering indignity of it all–I read and heard and made so many arguments about the coach that I can’t remember them all. I heard he’s getting better, just look at the NC State game, and I thought: sure, yeah, that makes sense. But then I read, he’s not getting any better, just look at all of the penalties, and I thought, too: well, shucks, that also makes sense. I scrolled past similarly conflicting opinions in my Twitter and Facebook feeds every Saturday, sparred over inconsistent and opposite viewpoints with friends on Monday, and most of the arguments on either side made sense.

It could be that I’m just a bad reader of football opinion, but I’m not alone. The whole city this autumn was wracked by two poles of opinion. One: you can’t fire a coach until he’s had at least three seasons to turn things around. Two: if the team’s not getting better they need to move on now. These tribal pole stars were like Hubble Constants of 67 and 77. Both were plausible and well-supported by people who knew what they were talking about, and both were true if you accepted the evidence. They were also complete opposites.

This is where football starts to diverge from the Hubble analogy. Conflicting information in science leads to new science, but conflicting information in culture leads to politics. “Any time there’s a discrepancy, some kind of anomaly,” physicist Katherine Mack told the Washington Post for a story about the Constant, “we all get very excited.” I don’t think anyone is excited about politics anymore. Unfortunately, the politics of the Taggart situation are unavoidable. Like pretty much everything else in America, Taggart’s story boils down to the bone stock of the culture war: race and class.

The internet is made up of little building blocks of information. Like Legos, these blocks can be taken apart and put back together into just about anything you want. While there are more communities on the web than any of us can imagine, it seems like most people in these communities use politics to help them understand how they should put information together. College football communities are no different. Ever since Taggart arrived in Tallahassee, the culture warriors who care about FSU football started taking information about his tenure apart and putting it together in shapes that fit their worldview. By the end of his first season, an upset fan posted a lynching meme over the caption, “Believe in something. Even if it means sacrificing your rep.” The university condemned the post and everyone moved on, but the subtext was now out in the open. The stakes for Taggart were higher, and the obstacles more formidable, than they would have been for a white coach.

Ask any of Coach Taggart’s detractors, and they’ll tell you: they’re not racists. But, racists or not, race was such an important part of the Taggart story that we can’t ignore it. Earlier this year I found myself in the middle of an extremely minor Twitter skirmish between Taggart supporters and Taggart detractors arguing over a stunt involving a lemonade stand and a prominent Booster. The Booster told me the whole thing was a joke and saw himself out of the conversation, but critics and supporters kept coming in for a little while longer. It was hard to generalize about the coach’s supporters, but his critics were easier to pin down. Sunglasses, always. An exhausting barrage of exclamation points. Somewhere the phrase, “I support THE PRESIDENT.” Even a rebel flag. Some of the accounts have been deleted since then, for some later infraction or indecency. A true badge of honor.

Sounds about right.

The Taggart saga was only a minor flash point in the broader culture war consuming all of us, but viewing it through that lens makes it easier to understand how it could take over a town. Provocative critics pushed their arguments to the very edge of acceptability and, at least once, well beyond. The rest of us could only fume, could only point angry fingers at racism indirectly, like trying to look at a dim star you can only see when you look away. These provocateurs used football to talk about race. When they pulled it off just right, they could push a little further. When they failed, they could scurry behind a screen of deniability. It was all a joke. Chill out. It’s only football. Perhaps the rest of us, to be fair, are using race to talk about football. It’s deadly serious. Wake up. It’s more than a game. Either way, there’s so much information out there that we can choose which Lego set we want to play with. Nobody is happy with this state of affairs.

In the 1960s, a philosopher named Thomas Kuhn argued that scientific knowledge doesn’t grow in a straight line, steadily advancing as scientists dream up new experiments to test new hypotheses. Instead, Kuhn argued, science progresses through occasional, groundbreaking paradigm shifts, followed by long periods of “normal” work inside the paradigm. Kuhn’s argument was a breakthrough in 1962–a paradigm shift in its own right–that has come under fire, like pretty much all ideas, in the six decades since then; but much of it still rings true. Especially this, especially now: all of that “normal” work in science eventually piles up so much chaos inside the theory, so many unanswered questions, that only a whole new theory can clear the board. This is why physicists are excited by the Hubble Constant tension. They can see new science just over the horizon. If we use football as a looking-glass on society, should we be excited too? Is there a new politics just over the horizon? Coach Taggart’s brief experience in the capital city suggests not. We’re long overdue for a paradigm shift.

Books and the Geology of the Soul

When I was a young man of about twenty-one, working at a door shop in Jacksonville, I spent a series of cold, foggy mornings reading On the Road on my way to work. It was my second time through the book, but the first time I really got what Kerouac was trying to do. I was haunted by the people and the time, haunted by the palpable vibrancy of Kerouac’s telling of that madcap journey, in a way that colored my whole perception of my hometown and myself. Those sodium-lit winter mornings haunt me still. They remind me of what a book can do, how it can shape us. Let me explain.

Jacksonville is an old railroad town. You come across it sometimes in old novels when people are taking the train South with a Capital S. James Bond passes through town on the old Seaboard Silver Meteor in Live and Let Die, for instance, where he grabs breakfast in a greasy spoon around the corner from the station before climbing back aboard and moving on. These travelers never stick around, but it was a grand train station–which is still there, incidentally, only today they have gun and exotic bird shows in a convention space that used to shelter weary travelers and anxious lovers. An old steam locomotive sits behind a fence outside, holding out brute silent testimony to the grand past for anyone who will stop and listen. Hundreds of thousands of people drive by the locomotive every day as they make their way in and out of town from the surrounding sprawl.

I was one of them when I worked about a mile away from the Convention Center. I had about an hour-long bus ride from my apartment way out on the southern fringe of the city to the steps of the Convention Center–look, I know this geography doesn’t mean much to you, but it pleases me to tell it–where I would jump off the bus and double back toward Stockton Street through a tunnel that ran up underneath the Interstate 95 overpass. I had an hour with the book, therefore, in the peculiarly calm atmosphere of a city bus rumbling across town in the dark of a cold winter morning. Putting the book away and climbing off of the bus was like sliding out of a warm blanket, gasping as the icy air blasting across the St. Johns River slapped me in the face. People don’t think Florida is cold. Those people have not walked a North Florida mile to work at Dark Thirty o’clock on a January morning.

I wanted to be the kind of guy with a book hanging out of my back pocket, so there it was; and I remember, for some reason, taking it out of my pocket and reading as I walked. Now, to get to work from the Convention Center, I had to walk over about five railroad crossings that cut through the old half-assed industrial zone where our shop was located. And, without a doubt, there would be a train on one of those crossings. Probably parked. I could wait for the train to shudder to life and clear the tracks eventually, which would be smart, but forbearance is for the aged. So I put the book away and faced hard reality once again to climb up over the behemoth flatbed cars and continue the journey with the book slapping around in the back pocket of my hand-me-down blue work pants. My pants were spotted with caulk and dyed a dull white by fiberglass dust blowing out of door machines, and I knew I was climbing over the same rails and walking by the same warehouses that stood when Kerouac wrote and it was somehow profoundly meaningful for me one morning in particular, I remember, as the sun came up over the end of the road and rails.

Something about the trains; the morning night world; the bus and the tunnel; the alternating warmth and cold, dark and light; the variety of light, from harsh fluorescent to dim, orange sodium; the blue-gray dawn; the contrast between Sal Paradise’s free-wheeling life on the road and my routine working in a wood shop for eight dollars an hour; something about all of that came together like a fine recipe that somehow contributed to who I am today. I’m sure any reader could share a similar story of strange alchemy, of a worn paperback working with time and place to shape them into something else. It is a beautiful geology of the soul.

Punk Postures: Overlooking Punk’s Real Lessons

It didn’t take an advanced degree in the humanities for me to realize that punk rock is a crock of shit. Just a sprained ankle. It wasn’t sour grapes, either, but dedication that broke punk’s fuck-it spirit wide open for me. For years I carried a big Fender bass amp–my only amp at the time, my precious–up and down the stairs to my Dad’s apartment. Every Friday and Sunday in 9th grade when it was time to visit Dad or go back to Mom’s, I dragged the damn thing like an Acme contraption from a Looney Tunes cartoon up those fourteen steep concrete slabs, heaving and cursing the whole way. Later, when I lived in the apartment and played in bands regularly, I lugged that heavy bastard up and down the stairs every other day like a religious ritual. It never got easier, and sometimes it lugged me down instead. But it was a price I had to pay. Asking Dad for a ride, schlepping the giant heavy box, looking like a fool, tumbling down the last three steps and limping for days afterward: far from anarchy, this was work.

The Fender BXR 200

Then there were the hours upon hours I had to spend practicing–an even more exhausting worship ritual than the semi-daily ritual of labor. Play the song; try the technique, again and again; pray the muscle memory remains the next day but keep trying anyway; do it over and over.

All of that work is why I can relate to the gracefully aging punks in The Guardian’s recent where-are-they-now profile, “Never Mind the Bus Pass.” Former Alien Kulture bassist Aussaf Abbas, now 55, knew, for example, that punk rock couldn’t pay the bills, so he went to work as an investment banker and has since “met prime ministers and finance ministers and CEOs of major corporations.” “This was unbelievable for an immigrant kid,” he insists, “who grew up in Brixton in a single-parent family.” One-time Au Pairs singer and guitarist Lesley Woods took a similar path to affluence. “After the band folded,” she explains, “my brain was quite scrambled and I needed to get my mind back, so I thought I’d do something really difficult and started studying law.” While she still “mucks about” with music, her work as a barrister is so intense that she only has time for a few recordings and “the odd performance.” Others in the profile tell similar stories.

The standard punk posture insists, outraged: Abbas, Woods, and their peers are sellouts, shills for neoliberalism. But for most people, music offers work–vast, endless vistas of work–with little more than a token spiritual reward at the end of the day. Investment banking may not be the best solution to the problem, but neither is Higher Education, which attracts an army of refugees from dive bars and touring vans every year. Everyone must negotiate neoliberalism on its own terms, and what choice does anyone have? Only those privileged with money or parents with money, a great deal of luck, or generous friends really have a chance to earn anything more than a few dollars and a few Facebook followers at the end of the night. Most need all of these simply to live as a musician. Punk’s outrage and anarchy rely on an ocean of privilege, then. For the innumerable devotees whose parents and friends can’t or won’t support years of work without reward, punk rock’s promise of DIY catharsis is merely palliative. The truth beneath the posture gleams like a shiny nickel reflecting the inverse of the American dream: work all you want, kid. It ain’t enough.

To make matters worse, the posture has only ever been clear in retrospect. In an extraordinary piece in The Baffler last winter, “Punk Crock,” Eugenia Williamson wonders: “As punk pushes into its fourth decade, its rules, aesthetic, and parameters are still murky at best. Does punk retain any meaning at all?” Despite the claims of passionate devotees–like the Noisey Facebook commenter she quotes who argues that “the complex ideology of punk goes way beyond the genre of music–it’s also about not giving a fuck and doing exactly what is authentic to you”–punk is hidebound by an inherent logic based on fictions of lost purity and dying scenes. Beneath the aesthetic, her article suggests, punk was never really there. Its earliest adherents lived like the coke-fueled arena rockers they despised; their descendants have “not only voted for Rand Paul but [are] raising children in a McMansion funded by festival dates.” So much for anarchy.


My own decade-long encounter with cathartic do-it-yourself anarchy was far from revolutionary. I repeated the upstairs and downstairs rituals of labor and repetition later, for example, when I was still a dropout working at Walmart by day and playing the bass by night in a band that pretended it could barely bash out the chords to “Blitzkrieg Bop.” That had been the appeal of the band, actually, when I tried out: their unapologetic badness. They had posted a to-hell-with-it ad for a bassist on Craigslist citing their inability to play but their desire to try anyway. I replied. Everyone was better than they had claimed, of course. They had performed the rituals too. So within a week I was lugging another huge bass amp up three flights of stairs twice a week to the drummer’s apartment across the street from the University of Florida. We called ourselves “Surprise Blowjob”–SBJ for short–and played a few shows over the course of a sweltering Gainesville summer before going our separate ways. We joked about “punx” with one breath and rented a practice space with the other; paid for recording with one hand (well, one of us did: Thank you, Ryan, if you read this) and burned our own CDs with the other. We booked shows when we weren’t practicing; drove to Jacksonville to play for 15 people. I designed stickers and merch; stayed up a few nights after work to design a website. And then it was over. A new semester, job hunts, and grad school were looming for the students in the band. They left. Other bands were calling me. Like the individual rituals of labor and repetition, the group ritual of band-building has to be repeated like a rosary. Friends and strangers come and go from the devotee’s life.

SBJ sticker page

I packed and unpacked, carried and set up my amplifiers, my gig bags, my cables and pedals, in and out, up and down, through every change. I understand now that these rituals of individual dedication and group support were the only authenticity punk rock could offer. Everything else is just an argument about aesthetics.