Friction: MFA at Work

Technology is supposed to make things better. Lately it seems as though, almost day by day, the tools and systems that surround us are growing more complex and less useful. Here is an example.

The mobile phone on my desk at work flashes a notification about once a week. “Update Apple ID Settings,” the notification advises me, because “some account services will not be available until you sign in again.” I click continue and a new screen appears, entitled “Apple ID for your organization.” The screen instructs me to continue to a web address where I may sign in to my account. I tap the screen to activate a large blue button labeled “Continue,” and a browser page showing my workplace’s login screen appears. I enter my password–encrypted and saved on the phone, thankfully–and a new screen appears presenting me with the option to verify my identity through a phone call or a text message. I select phone call, because I am unable to receive text messages on this phone. If I did happen to select text verification, here is what would happen: the screen would change again, displaying a message over a set of floating periods indicating that the verification server is awaiting my confirmation text message. Nothing would happen, however, and I would need to begin the process again.

A moment after selecting phone verification, the phone rings. I answer and an automated voice speaks:

“This is the Microsoft sign-in verification system,” the voice says. “If you are trying to sign in, press the pound key.”

I tap the small window at the top of the screen representing the call in progress. This leads to another screen, where I must tap the “Handset” area to open a virtual representation of an old phone handset. I then tap the area of the glass screen corresponding to the pound key.

“Your sign-in was successfully verified,” the voice responds. “Good-bye.” Only then does the blazing red notification bubble disappear.

The entire interaction takes less than thirty seconds. It is irritating in the moment, but the process is easy enough that I don’t have to think much about it once I get started. If I refused to complete it, however, after a while the software on my phone would stop working. First, I would lose the features furthest from the core of the phone. Apps that change often–productivity apps like Excel or OneNote, for example–would go first, blocked by a verification server requiring the newest version to operate. Next, I might start to lose access to some of the manufacturer’s frequently updated software, like Maps and Photos. Finally, given enough time and system updates, even the most basic features like mail and text messages, and then the phone itself, would stop working, rendering the $1,000 computer less useful than a concrete block until I completed the ritual of verification.

A Note on the Disappearing Internet

A while ago, I wrote that the future is local. File this quick note in the same folder.

Tonight I was trying to locate a handy graph showing trends in the construction of shopping malls in the twentieth century to supplement a travel essay I’m working on. I know I’ve seen charts, tables, timelines, and maps showing exactly what I needed, so I thought it would be trivial to find one on Google. Turns out it was easy to find secondary content describing what I wanted, but the primary sources were long gone from the internet. Here’s a great example.

In May 2014, The Washington Post ran a story about the death of American shopping malls. After the usual rambling wind-up to the ad break, the article got to the point: an animated map designed by an Arizona State grad student tracking the construction of malls across space and time in the twentieth century. “Over a century,” Post columnist Emily Badger wrote, “the animation gives a good sense of how malls crept across the map at first, then came to dominate it in the second half of the 20th century.” That is exactly what I wanted! I scrolled up and down the page, looking for a map with “dots… colored by the number of stores in each mall,” but it was nowhere to be found. I clicked a link to the source: nothing. MapStory.org appears to have gone offline sometime in the summer of 2020. Increasingly dismayed, I went back to Google and searched again. An Archinect article, published a few hours after the Post column, embedded the map directly. All that remains now is a blank box. Business Insider was a few days late to the party, but it was the same story there: a blank box where the map used to be.

As a last resort, I turned to the Wayback Machine at the Internet Archive. An archived version of a web app like MapStory is never ideal and only rarely works. Sure enough, the archived version of the mall map is just text gore. I’m afraid Sravani Vadlamani’s map is gone, and probably gone forever.
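For the record, this is the kind of last-ditch check I mean. The Internet Archive publishes a small “availability” endpoint that reports whether it holds a snapshot of a given URL; here is a minimal Python sketch of the lookup. The MapStory address below is a stand-in for whatever dead link you happen to be mourning, not a URL recovered from the Post article.

```python
# Ask the Wayback Machine whether it has a snapshot of a URL near a given date.
import json
import urllib.parse
import urllib.request

def closest_snapshot(url, timestamp="20140501"):
    """Return the archived snapshot URL closest to `timestamp`, or None."""
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    api = "https://archive.org/wayback/available?" + query
    with urllib.request.urlopen(api) as response:
        data = json.load(response)
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None

print(closest_snapshot("mapstory.org"))  # None means even the Archive has nothing usable
```

When the answer comes back empty, or points at a page of text gore, you are out of luck.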

As corporations merge and downsize; as executives and product managers make changes to content retention strategies; as technical standards and fashions in code change over time; and as server upgrades, data loss, simple bit rot, and other forms of entropy accumulate, more and more of these primary sources are going to disappear. In the best-case scenario, dedicated archivists might be able to stay ahead of the chaos and preserve a majority of the information we see every day. Because the last ten years or more of the internet is largely hidden behind the walls of social media, however, the odds that this scenario will prevail are vanishingly small. We should be prepared for a much worse situation: if we don’t make a local copy of the things we see on the internet, they probably won’t be there when we come back.
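Making a local copy does not have to be elaborate. Here is the smallest possible sketch of the habit in Python: save the raw HTML of a page you care about, stamped with the date you saved it. The folder and filename scheme are only illustrations, and this grabs the page itself rather than its images or scripts; a tool like wget with its page-requisites option does a more faithful job.

```python
# Save a dated local copy of a page's HTML before it disappears.
import datetime
import pathlib
import urllib.parse
import urllib.request

def save_local_copy(url, folder="web_archive"):
    stamp = datetime.date.today().isoformat()
    name = urllib.parse.quote(url, safe="")            # filesystem-safe filename
    path = pathlib.Path(folder) / (stamp + "_" + name + ".html")
    path.parent.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url) as response:
        path.write_bytes(response.read())
    return path

save_local_copy("https://example.com/some-page-you-would-hate-to-lose")
```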

As an historian, I am troubled by the potential consequences of this fragility. “Darkness” did not prevail in the so-called dark ages of the past because people were less intelligent, inventive, or ambitious than their ancestors. The “darkness” seems to have existed only in retrospect, when later generations recognized a rupture in information between one age and the next. Burning libraries is one way to cause such a rupture. Perhaps networked computers serving dynamically generated content is another. Let us hope not.

Old Friends: Payphones of Tallahassee

Maybe you remember what it felt like, what it sounded like, to use one of these. I remember the dusty plastic cover on the heavy phone book dangling beneath the box. I remember the slight delay between picking up the receiver and hearing the dial tone down the line. I remember the automated voice insisting on more coins in the machine. I remember the road noise, the ringing phone on the other end of the line, the throat clearing anticipation. Most of those sensations are gone, but a few of the old workhorses remain, including this battered old friend rotting away at a gas station just below the campus of Florida A&M University.

Inspired by 2600 Magazine’s longtime obsession with these beautiful, hackable old devices, I keep an eye open for them and try to grab pictures when I can.

There was no dial tone when I placed the dangling speaker to my ear and picked up the other end, but I did hear a strange clicking sound. That may have been the sound of wires striking metal, or the death rattle of the ancient and destroyed mechanism.

The Vibe Shift

You’ve probably heard of the vibe shift.

The vibe shift is whatever you want it to be.

The vibe shift is the death of the unitary internet.

The vibe shift is the re-emergence of local, regional, national constellations of power and culture separate from the astroturfed greenery of the web.

The vibe shift is a return to ‘zines, books, movies, maybe even magazines and newspapers, because the web was once an escape from work and all the responsibilities of “real life” and now it has come to replace them.

Lately I have been leaving my phone in the car when I go places. These insidious toys entered our lives with a simple question: “What if I need it?” I cannot recall a single situation in the past decade when I truly needed a mobile phone. Instead I have begun to ask myself, “What if I don’t need it?” What if a mobile surveillance and distraction device is actually the last thing I need to carry with me?

Muir’s Notebooks: Thinking and Working in the Age of Distraction

Not long after the guns of the Civil War fell cold in the 1860s, John Muir opened a notebook and inscribed his name on the frontispiece. “John Muir, Earth-Planet, Universe,” he wrote, situating himself as firmly as any of us may hope to do. And then he started walking, a thousand miles or so, to the Gulf of Mexico. He set out on the first of September 1867 on the “wildest, leafiest, and least trodden way I could find,” and his excitement was palpable when he reached Florida six weeks later. “To-day, at last, I reached Florida,” he wrote in his journal on October 15th, “the so-called ‘Land of Flowers’ that I had so long waited for, wondering if after all my longing and prayers would be in vain, and I should die without a glimpse of the flowery Canaan. But here it is, at the distance of a few yards!”

A characteristic Florida view.

Muir undoubtedly walked a long way from Indianapolis to Georgia, but he cheated his way into Florida, booking overnight passage on a steamboat from Savannah to Fernandina. Perhaps that’s why he felt so down and out after an easy half-day and night of conversation and loafing aboard the steamer Sylvan Shore. “In visiting Florida in dreams,” he wrote, “I always came suddenly on a close forest of trees, every one in flower, and bent-down and entangled to network by luxuriant, bright-blooming vines, and over all a flood of bright sunlight. But such was not the gate by which I entered the promised land.” What he found, instead, was a tangle of marsh and swamp, a hopelessly flat vista broken only with “groves here and there, green and unflowered.” Dropped unceremoniously on this inauspicious shore, without even breakfast to ease his way into the new world, Muir was overwhelmed. The peninsula was “so watery and vine-tied,” he reported, “that pathless wanderings are not easily possible in any direction.” He made his way south from the gloomy coast down the railroad tracks, “gazing into the mysterious forest, Nature’s Own.” Everything was new. “It is impossible,” he wrote of the forest along the tracks, “the dimmest picture of plant grandeur so redundant, unfathomable.” Sometimes I feel the same way, though I’ve lived here longer than Muir had been alive when he walked down the lonely rail line trying to make sense of the place.

I picked up Muir’s book recounting the journey a hundred and fifty years later because part of that very long walk took place in Florida, and I am filling up my own notebooks here on Earth-Planet, Universe with the starry-eyed hope that another book about Florida may one day emerge from their pages. Unlike Muir, though, I can draw on an infinite library of books, videos, field guides, and brochures to reduce the unfathomable grandeur of Muir’s nineteenth-century gaze to the qualified certainty of my twenty-first-century gaze. On a different shelf in my office, for example, I can pull down the Guide to the Natural Communities of Florida. I can leaf through the 81 varieties of land cover the authors have identified in the state until I find the one that Muir was likely to have found along his lonely railroad track: Mesic Hammock. “The shrubby understory may be dense or open, tall or short,” the Guide reports, “and is typically composed of a mix of saw palmetto (Serenoa repens), American beautyberry (Callicarpa americana), American holly (Ilex opaca),” and so on. Maybe I can pull down the field guide to plants and trees, then; or, perhaps, just type their names into the Google search bar on my phone and find out just about anything we know about these thorny, prickly plants with just a few taps.

Callicarpa americana

The sort of deep botanical knowledge Google offers to any armchair naturalist today is what Muir hoped to gain as he explored the little-traveled paths of the South. He set out to find it by tramping through the vines, turning over the ground cover, taking notes, making impressions of leaves and flowers. With only hardbound botanical guides to aid his memory—paperback books then only existed as pamphlets and dime novels, not scientific guides—we can imagine the kind of notes that Muir would need to take to remember it all. Most of all, he had to know how to look, how to take in enough information about a plant shaded by drooping beautyberry branches or hidden beneath the cutting blades of a saw palmetto a few feet off of the trail to describe it later or look it up if he didn’t know what it was. Muir did not have the luxury of a camera in his pocket, connected to an electric warren of machines making inferences from the collective learning of scientists and thousands of amateur naturalists to identify the plant instantly. Muir had to live with it for a while, turning it over and over in his mind until he could write it down. He had to bring some knowledge to the field with him, to know the important parts to remember. Muir had to work for it. 

I’ve used apps to identify plants, and they are wonderful. You snap a picture of a flower, or a whorl of leaves, press submit, and like magic a selection of possible candidates appears. It only takes a moment more of reading and looking to positively identify the plant before your eyes. There is no need to walk the laborious path down a dichotomous key—a series of this-or-that questions people use to identify plants and trees in the field—or stumble through the obscure chapters of a specialized field guide. If a naturalist today can download identifying data to their phone, and if they bring a battery backup or two into the field, the old field guide is as obsolete as the buggy whip. Problem solved, right?
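For anyone who has never walked one, a dichotomous key is just a branching chain of those this-or-that questions with a name at the end of every branch. The toy Python sketch below shows the shape of the thing; the questions are invented for illustration and borrow the plants named in the Guide above, so it is not an excerpt from any real key.

```python
# A toy dichotomous key: each step is a yes/no question leading to another step or a name.
from dataclasses import dataclass
from typing import Union

@dataclass
class Step:
    question: str
    yes: Union["Step", str]   # where a "yes" answer leads: another step or a species name
    no: Union["Step", str]    # where a "no" answer leads

toy_key = Step(
    "Are the leaves fan-shaped, on stiff, saw-toothed stems?",
    yes="saw palmetto (Serenoa repens)",
    no=Step(
        "Are the berries bright purple and clustered tightly around the stem?",
        yes="American beautyberry (Callicarpa americana)",
        no="American holly (Ilex opaca)",
    ),
)

def identify(node):
    """Walk the key, asking each question at the prompt, until a name is reached."""
    while isinstance(node, Step):
        answer = input(node.question + " [y/n] ").strip().lower()
        node = node.yes if answer.startswith("y") else node.no
    return node

# print(identify(toy_key))
```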

The internet, and by extension our whole lives now, thrives on this promise of problems solved. The old “fixed that for you” meme sums up the mindset, but you have to go a step beyond the meme’s use in the culture wars (the internet’s stock-in-trade, after all) to get there. If you don’t know it, here’s the culture war setup. Somebody posts an opinion you don’t like on the internet. You strike through words in the post and replace them with other words that you do like. Then you post the altered text in the comments of the original under the simple heading, “FTFY.” For example, if you wrote a tweet that said, “I love Twix!,” some wag might respond: “FTFY: I love Twix Reese’s!” Your interlocutor would be wrong (Twix is undoubtedly the superior candy), but the stakes are often much higher. For a while, FTFY was the perfect clap-back to a Trump tweet or a Reddit post. Like all things on the internet, however, FTFY’s popularity is fading away by sheer dint of use. Here’s an example I found on Google in case you are reading this after the meme has completely disappeared.

FTFY

FTFY is a successful meme because it works on two levels. The first is merely discursive: here is an alternative point of view. If you go back and read one of the breathless essays, from before 4chan and Trump, on the democratic promise of the internet, you’ll see a lot of this. The internet is a place for people to express their opinions, and isn’t that good? Mark Zuckerberg still relies on this discursive level to justify Facebook. “Giving everyone a voice empowers the powerless,” he told a room full of people at Georgetown University last year, who, for some reason, did not burst into uproarious laughter, “and pushes society to be better over time.” If this were the end of communication—I speak, you listen; you speak, I listen—then Zuckerberg would be right and FTFY would be innocuous. The second level of meaning is why anyone uses the meme in the first place, though.

The second level is philosophical: here is a self-evidently correct point of view which shows that you are wrong and I am right. Someone using FTFY intends to point at differences of opinion and erase them at the same time. This creates a sort of nervous thrill in the reader, who revels in the shame of the erased whether they agree with them or not. It has no effect on the author beyond alienation, but the point is not to persuade anyway. It is to profit, in the social and psychological sense, by signaling one’s virtue in exchange for internet points. Rinse and repeat.

Facebook, Reddit, Twitter, and others turn shitposters’ play points into real dollars and power through the intentionally-obscured work of software algorithms. Thanks to this perverse alchemy, which converts mouse movements and button-presses into trillion-dollar fortunes, social media excels at delivering us to these impasses of opinion, where we can only point and gasp at hypocrisy for the benefit of those who agree with us. We call this free speech, but it feels like something else, like a sad video game we play on our phones in bed until we fall asleep and the screen slowly goes black. FTFY.

Software’s been Fixing That For You since the 1950s. It started off slowly, the awkward preserve of reclusive engineers, but–I don’t have to tell you this, you already know–grew in scale and intensity like a wild avalanche until now, when it holds the power, depending on which expert is holding forth, to either destroy life on the planet or usher in a new era free of death, pain, and inequity. This bestows upon software the elemental power of nuclear fission, yet until recently we accepted it with far less hand-wringing. Is it too late to start?

The world-eating logic that propels software’s growth is “efficiency.” This is the Fix in FTFY. In his recent book, Coders, Clive Thompson describes the “primal urge to kill inefficiency” that drives software developers. “Nearly every [coder]” he interviewed for the book, Thompson writes, “found deep, almost soulful pleasure in taking something inefficient and ratcheting it up a notch.” I understand this urge. At work I have spent the same hours I would have spent downloading and renaming files writing a script to download and rename them instead. I’ve coded macros to make it easier to populate fields on contract templates instead of confronting the banality of existence by editing Microsoft Word documents manually. Because of this urge, coders and capitalists argue, nearly everything we do is more efficient today, thanks to software, than it was ten years ago. As 5G transmitters make their way to cell towers around the world, the same argument goes, nearly everything we do tomorrow will be more efficient than it is today. We accept this, the way we accept new clothes or new toys.
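That download-and-rename script, for what it is worth, barely qualifies as programming. A sketch of the kind of thing I mean, in Python, looks like this; the URLs and naming pattern are made-up stand-ins, not the actual files from work.

```python
# Fetch a list of files and give each one a tidy name on the way down.
import pathlib
import urllib.request

documents = [
    ("https://example.com/reports/q1.pdf", "quarterly-report-Q1.pdf"),
    ("https://example.com/reports/q2.pdf", "quarterly-report-Q2.pdf"),
]

outdir = pathlib.Path("downloads")
outdir.mkdir(exist_ok=True)

for url, new_name in documents:
    target = outdir / new_name
    urllib.request.urlretrieve(url, target)   # download straight to the renamed file
    print("saved", url, "->", target)
```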

Onerous luggage

We shun or diminish the things that software displaces. Landline phones are not merely obsolete, for example. They are laughably so. The checkbook register my teachers labored to teach me to balance in school simply vanished some time around 2005. I left $2,000 worth of CDs sitting next to a dumpster when I moved away from my hometown in 2008 because I had ripped them all to my computer and had an iPod. (I would later deeply regret this decision.) Typewriters are a cute hobby for rich actors, rather than tools so vital that Hunter S. Thompson carted his IBM Selectric II from hotel to hotel on benders for forty years. Rejecting these things feels as much like a social gesture as a personal one. Who wants to be seen writing a check at the store? Who wants to talk on a landline phone?

Shunning inefficiency strengthens our commitment to software. This brings me back to Muir’s notebook. Muir had to see, to remember, to write once in his notebook and then write again to turn those notes into something useful. Seeing and remembering, rather than taking a picture: inefficient. Looking things up in a book when he returned from the field: inefficient. Taking notes on paper: inefficient. And yet I find that when I go out into the woods with my phone, tablet, or computer and try to do what Muir did, I see very little and remember even less. I write nothing; and nothing useful, beyond a beautiful afternoon and a vague green memory, comes of it.

This is mostly my fault. I could use these powerful tools, I guess, to cash in on efficiency and make something even better. But I don’t. Instead, I get distracted. I pull out my phone to take a picture and find that I have an email. I scroll Twitter for a moment, then Reddit, until I am drawn completely into the digital worlds on my screen, shifting from one screen to the next until I manage, like a drunk driver swerving back into his lane, to pull my eyes away. There is a moment of disorientation as I confront the world once again. I have to struggle to regain the revery that drove me to reach for the phone in the first place. This part is not completely my fault. The dopamine-driven design language that drives us to distraction is well known. If I manage to overcome this pattern somehow and actually take the picture, it goes to Google Photos, one of several thousand pictures in the database that I will never seriously think about again. When I take notebooks into the woods, with pen and pencil and guide book, I do remember. I see and think and make things that feel useful.  

More than merely remembering what I’ve seen, working without computer vision helps me see and learn more than I did before I put pencil to paper. Because I am a historian, always looking backward, my mind turns once again to old books and ideas. I am reminded of the nineteenth-century art critic, writer, and all-around polymath John Ruskin. Ruskin understood the power of intentional sight–the practiced vision aided by the trained eye of an artist–as a key to deeper understanding. “Let two persons go out for a walk,” he wrote in one thought experiment; “the one a good sketcher, the other having no taste of the kind.” Though walking down the same “green lane,” he continued, the two would see it completely differently. The non-sketcher would “see a lane and trees; he will perceive the trees to be green, though he will think nothing about it; he will see that the sun shines, and that it has a cheerful effect, but not that the trees make the lane shady and cool….” 

What of the sketcher? “His eye is accustomed to search into the cause of beauty and penetrate the minutest parts of loveliness,” Ruskin explained. “He looks up and observes how the showery and subdivided sunshine comes sprinkled down among the gleaming leaves overhead,” for example. There would be “a hundred varied colors, the old and gnarled wood…covered with the brightness; … the jewel brightness of the emerald moss; …the variegated and fantastic lichens,” and so on. This, I argue, is the vision of the unaided eye in the twenty-first century. Unencumbered by the machines that reduce our experience to arrays of data, we can see it in new and more meaningful ways. 

In a widely anthologized tract, the renaissance artist Leon Battista Alberti wrote, “you can conceive of almost nothing so precious which is not made far richer and much more beautiful by association with painting.” I think Ruskin would have agreed. He jump-started his career through a full-throated endorsement of the Romantic master J.M.W. Turner. The artist’s deft, rapid brushwork, as seen in the detail taken from one of his paintings above, conveys the depth of painterly vision. “Indistinctness is my forte,” Turner rather famously wrote; but I see something rather more distinct than mere visual reproduction at work here. I digress.

More than a renowned art critic, Ruskin was an influential social reformer who believed that adult education, especially education in art, could relieve some of the alienation and misery suffered by workers who spent the majority of their lives operating machines. Workers in Ruskin’s era struggled for the 40-hour work week, deploying the strike, the ballot, and the bomb for the right to enjoy more of their own time. Twenty years after his death, workers throughout the industrialized world seized the time to pursue the sort of self-improvement that Ruskin longed for them to enjoy. Because we can only believe in what Milan Kundera called the “Grand March” of history–that things are better today than they were yesterday, ever onward–we forget the flush of literacy, creativity, and prosperity that blossomed with the passage of the eight-hour workday. Some thirty years later, my grandfather still enjoyed the sort of self-actuated existence Ruskin advocated. 

Pop managed a water filter warehouse in Jacksonville, Florida for thirty years after recovering from a gruesome leg injury he sustained in North Africa in 1944. At night, when my dad was a child, Pop took a radio repair correspondence course. He never finished high school but devoured books nonetheless, especially interested in anything he could get on Nazism. He had a doorstop copy of Shirer’s Rise and Fall of the Third Reich on his living room chair. He took subscriptions to magazines, Popular Mechanics alongside the Saturday Evening Post–nothing highbrow but dog-eared anyway–and read the newspaper religiously. There wasn’t much television to watch. Father and son built models together. They went fishing. 

It was not a golden time by any means. Pop was a brooding, difficult man. He kept a bottle of gin hidden in the yard. He nursed grudges and pouted over a spare dinner of Great Northern beans. He dealt silently with a gnawing pain from the war in North Africa, it seems, until he couldn’t hold it in, dressing up in his army uniform one time in the depths of a quietly furious drunk and threatening to leave the family. I don’t imagine he read his books and magazines when the black dog drove him to the bottle, but I hope he could take comfort in ideas nonetheless. My dad does. He chased away the lumber yard blues on Sunday night watching Nature on PBS and reading Kerouac on the side of the couch illuminated by the warm light from the kitchen. He executed masterful oil paintings on the kitchen table, weeknight after weeknight, amassing a room full of work that would make the neighbors gasp with delight at the jewel box in the back bedroom of the unassuming apartment upstairs. He passed some of this down to me, in turn, though I will never have the talent or the patience he poured into his work. I hope Pop gave that to us.

Pop was not alone in his evening pursuits, but it is hard to imagine a similar man pursuing the same interests today. In 2018 the Washington Post, interpreting survey results from the Bureau of Labor Statistics, reported that the share of Americans who read for pleasure had reached an all-time low, falling more than 30 percent since 2004. The share of adults who had not read a single book in a given year nearly tripled between 1978 and 2014. It is tempting to blame the internet and smartphones for this decline, but it began in the 1980s, according to the Post. Screens account for this change. Television, firstly and mostly, but computers, too, and now phones and tablets. I have stared at a screen for ten hours today. There are still at least two hours of screen time left before I lovingly set my phone in its cradle by the bed and fall asleep. I am not wringing my hands over the death of books. Ours is a highly literate era, awash in information. Drowning in text. I am wringing my hands over what seems like the dearth of deep thought, the kind of careful thinking that comes from reading without distraction, from looking without mediation, from quiet.

After a week tramping across the flat pine woods and swamps of North Florida, John Muir found himself in Cedar Key, a sleepy village on the coast which feels almost as remote today as it must have felt in the 1860s. “For nineteen years my vision was bounded by forests,” he wrote, “but to-day, emerging from a multitude of tropical plants, I beheld the Gulf of Mexico stretching away unbounded, except by the sky.” Then as now, however, Cedar Key was the end of the road. With no boats in the harbor and apparently little desire to move on to points further down the peninsula–and vanishingly few they would have been–Muir decided to take a job at a local sawmill and save money for passage on a timber ship bound for Galveston which was due to arrive in a couple weeks. He worked a day in the mill, but “the next day… felt a strange dullness and headache while I was botanizing along the coast.” Nearly overcome with exhaustion and an overwhelming desire to eat a lemon, he stumbled back to the mill, passing out a few times along the way, where he collapsed into a malarial fever. “I had night sweats,” he wrote, “and my legs became like… clay on account of dropsy.” Uncertain whether he would even stay in town when he arrived, Muir instead spent three months convalescing in the sawmill keeper’s house at the end of the world in Cedar Key. 

The fishing village a few years later.

Once he was strong enough to leave the house, the young naturalist made his faltering way down to the shore. “During my long stay here as a convalescent,” he recalled in the memoir, “I used to lie on my back for whole days beneath the ample arms of… great trees, listening to the winds and the birds.” I have spent long days and nights in the hospital. It is nearly impossible to imagine even a half-day in a recovery room without the option of scrolling the internet, watching TV, playing a video game. I suppose, therefore, that I am thankful for software. It fixed boredom for me. 

But still, Muir’s description of Cedar Key is warm, reminiscent. It is easy to imagine that these fever days spent listening to the waves and thinking about plants and birds and life beneath the spreading Live Oak boughs on the desolate gulf coast of Florida contributed in a significant way to who he was about to become. Just a few months later, Muir was in California whooping with delight in the Yosemite Valley. It was there that he became Yosemite’s Muir, the preservationist sage of the Sierra Club and father of modern environmentalism. But perhaps we should rename a little stretch of the quiet wooded shore in Cedar Key the Muir Woods, too. The time Muir spent there in forced meditation seems to have shaped the man, if only slightly, as the forces of wind and water in their slight but constant way shaped El Capitan. There was nothing to fix.

The Daisy

I have a little yellow Daisy on my front porch that I’ve been trying to keep alive for a few weeks. I’m not doing a very good job of it. The flower lives life on a strange carnival ride of forgetfulness, swinging wildly from healthy, golden vitality to a state of near-death droopiness over the course of each week, and I feel bad for it. I feel guilty just long enough to give it another starvation ration of water from a measuring cup out of the kitchen, that is, and then forget about it for another few days. I really have no idea what I’m doing.

I feel the same way about Twitter. Like the flower on my porch, I abandon it to time and entropy six days out of the week and overwater it on the seventh. It also gives me anxiety. I think, “I need to be better at Twitter.” I wonder, like most everyone with a high opinion of their own voice, “how can I get more engagement?” Just last night, I fell asleep thinking about how social media is a tool that I haven’t learned to properly use. Like an awl, maybe, which I also don’t know how to use. 

Unlike the flower, which needs me to live, Twitter gives me no reason to feel this way. It is probably better off without me. But I can’t help it. I fret over it, like the Tamagotchi I got in a kid’s meal from KFC in 1996 that quietly beeped its way into my psyche until I pulled the battery with a pang of guilt and a whoosh of relief two weeks later. It’s the same mechanic: press one button to clear a need, another to build a relationship.

I need to figure out why the internet makes me feel this way.   

Video Game Spaces: Halo

“When we gazed upon all this splendour at once, we scarcely knew what to think, and we doubted whether all that we beheld was real.”  

Bernal Diaz del Castillo, The Conquest of Mexico and New Spain

You land on a strange “installation” and there are a few moments of silence for you to take in this unique world. After a quick look around at the clearing where your shuttle crash-landed, you make your way across a narrow bridge high above a bubbling stream. To the right, the stream cascades down a well-beaten course cut through a precipitous rocky valley. To the left, this. This expanse of land, water and sky slicing the inky vastness of space. There is a dialectic of sublime beauty and precarious terror in this space. You feel as though you could peer into the cumulus distance for hours, exclaiming at the wonder of it all like Bernal Diaz del Castillo and his murderous crew of invaders. They felt as alien in Tenochtitlan as you feel in this place. The only choice, though, is to pass through an evergreen grove up the rough path leading toward the source of the stream. There is gunwork ahead, unfortunately. 

What I want to suggest, in this and future posts about video game spaces, is that games are a new design commons — a new public architecture that we should take just as seriously as we do “real” spaces. We inhabit games for longer periods of time than we inhabit most public spaces. I’ve spent more time running around the archives level on GoldenEye than I have spent in church; far more time driving around the virtual streets of San Andreas than riding the subway in New York City. All of these spaces were shaped by human hands and minds for humans to inhabit. 

I also hope to think through some of the design problems inherent in games. These are not democratic spaces, for example, and they are not free in any sense of the word. Burning electricity instead of calories, too, may not be sustainable for our bodies or the planet. As in Halo, violence is the dark centerpiece of most video game spaces, as well. What cultural work are these costly, undemocratic, and violent realms performing? Are we designing and inhabiting beautiful hellscapes? 

I’ll share spaces in games here when the inspiration strikes. I hope you can use them to question your assumptions about architecture, landscape, and industrial design, as I am. At the very least, I hope that you can appreciate their beauty and the skill that goes into designing and building them. 

Problem 1,364,872 with Facebook and Data

So, full disclosure: I’ve deleted my Facebook account twice in the past 6 years. Last month I came back again after about six months away with a shamefaced grin. It made me sad to think about all of the people I know sharing their lives with one another, without me. If I’m unwilling to let those connections go, then I can’t opt out of Facebook.

But today I was reading some fantastic reporting in the New York Times about the company’s response to its many crises, and what troubled me was not necessarily that it knows everything about me–which it does–but that the best way to monetize that information is to zero in on the weaknesses: the points of ignorance, credulity, impulse, and reaction. It’s a vast database of personal pressure points the platform presses all day long.

Google has the same information, Apple has a lot of it, Samsung now owns a lot of my pressure points, and all of the apps, trackers, and aggregators on my phone, iPad, computers, and web browsers that don’t come from those companies know a lot about me. There’s no way to opt out and no clear way forward for any of us. Is there some combination of open source and paid platforms, along with encryption and data security practices that will save us?