Hope, Horror, Heartache: A Review of Outer Wilds

There’s a part of me that doesn’t want you to read this review of Outer Wilds. It’s not because I didn’t enjoy the game—in fact, it’s one of the best I’ve ever played—nor is it because I’m shy about sharing my experience with it. It’s just that if you haven’t played this game yet, the best thing I could do is just to encourage you to play it—and say nothing after that.

Here’s everything I knew about Outer Wilds before I played it on Twitch over the course of a few weekends. It was a game about space exploration. My friends who have great taste in games were in love with it. And it was 80% off during a Steam sale, which is akin to a siren’s song for Steam users. Other than that, I had as much knowledge as a newborn baby. It’s that state of ignorance that I want to preserve for you, because you deserve to feel this game without the numbing awareness of spoilers.

The good news is, the game’s story is almost impossible to spoil. Discovery is the heart of Outer Wilds, and the narrative can only be pieced together by forging a path for yourself. But even speaking about the emotional impact of this game runs the risk of spoiling something. Suffice it to say, the game runs the full gamut of wonder, terror, panic, humor, and heartache. You’ll want to play it with the pliability of clay, formless at the start, shaped and forged into something unique by the end. The game in turn will reward you with exploration as linear or tangential as you want it to be—for better or worse.

What was beautiful about playing the game on Twitch was the Outer Wilds fans coming out of the woodwork to watch the stream. None of them wanted to drop hints or spoilers. They just wanted to gather around the campfire, whistle an encouraging tune, and roast a couple of marshmallows—while they got to relive the game through someone else’s eyes. I want to do the same for you.

So, that’s it. That’s my review of the game.

I’ll only say this: I’m at a place in life where I’m trying to figure out what my next big exploration is—career, living arrangements, relationships, everything. I picked up the game with a sense of mild curiosity. I put it down awash in the bittersweet hope of new beginnings. This game reminded me that exploration is not an abandonment of the past. It’s a way of honoring it, while embracing a future that can’t be realized living in predictability and comfort. There are marvelous planets to visit, and they’re all within reach.

Play the game, if you feel ready to explore. When you do, I’ll be at the campfire waiting. And I’ll bring the marshmallows.

The Gatekeepers Are Not On Our Side

Sometimes I encounter a photograph from American history that burns itself into my brain, like the floaters you get from staring at the sun too long.

To me, these images represent the best and worst of our national spirit. They’re often more obscure than, say, Alfred Eisenstaedt’s V-J Day in Times Square or Dorothea Lange’s Migrant Mother, but they are no less iconic or haunting.

For instance, I love the photo of Margaret Hamilton, a pioneer of programming, standing beside the towering stack of code in 1969 that brought NASA astronauts to the moon. It’s a reminder that, while glory often goes to the explorers, it’s the shipbuilders, who exist on the margins, that make their journeys possible.

Similarly, I’ll never forget the first time I saw the 1892 photo of two Michigan Carbon Works men standing on top of a mountain of bison skulls. Few photos do a better job representing the heedless slaughter of westward expansion, the callousness with which we as a country starved and displaced indigenous Americans.

There’s another photo that’s entered my head canon recently. Like you, I’m trapped in the amber of the moment it represents, so I don’t know if historians will find it as prophetic as I do. But for me, it’s the bursting bulb at the end of our thermometer—an indication that our nation’s fever has gone through the roof.

On January 20th, 2025, Donald Trump was inaugurated as the 47th president of the United States. In the front row of the ceremony, photographed by Julia Demaree Nikhinson, were Mark Zuckerberg (CEO of Meta), Jeff Bezos (CEO of Amazon), Sundar Pichai (CEO of Google), and Elon Musk (owner of X). Elsewhere off-camera were Tim Cook (CEO of Apple) and Shou Zi Chew (CEO of TikTok), padding out the numbers with their modest billions. Altogether their net worth is roughly a trillion dollars, rivaling the wealth of all but about 40 countries.

These are the Technocrats, the robber barons of the technological era. Their social media platforms, web hosting services, and streaming channels hold our attention in captivity. Less innovators than conquistadors, they have subdued every competitor and cornered every market. And here they were, taut smiles as they stood shoulder to shoulder with their last remaining rivals.

It was unclear to me if they were at the inauguration to be his courtiers or his puppet masters. But as they stood in places more prominent than Trump’s own cabinet picks, one thing was certain—a fact we have always known but have chosen to ignore for half a century.

The gatekeepers are not on our side.


A couple of years ago, I was reading Dan Simmons’s novel The Fall of Hyperion, in which human civilization has expanded into the stars. People traverse the galaxy through portals called farcasters, which make interplanetary travel as easy as stepping through a door frame.

But these portals come with a price. They were created by the TechnoCore, a collective of sentient AI programs that had seceded from human servitude. In theory, the farcasters were their peace offering, but later we learn they were a fishing lure. It turns out, each time a person steps through a farcaster, the TechnoCore is hijacking their neurological hardware, using it to accelerate its own processing ability in its quest to build the Ultimate Intelligence.

When I read The Fall of Hyperion, I couldn’t help but think about the persistent paranoia we’ve had about the internet since its birth. I was ten years old when I first logged onto Prodigy, one of the original dial-up ISPs. At the time, it was magic. I could enter a forum and chat with strangers on other continents. I could orchestrate and share my own MIDI files like a schlocky Millennial Mozart. I could publish stories and essays and poems instantly, the envy of any author in history before me, only my works of great literature came with tacky Looney Tunes animated GIFs in the background. And when I wanted to, I could get lost on an endless sea of knowledge, adrift in the luminous synapses, no map or compass required.

But there were always sea monsters lurking beneath the surface: scams, viruses, stalkers, advertisements, and (most significantly) the specter of mass surveillance. It was a truth that would infect the zeitgeist of ’90s science fiction—one that did, I think, inspire the idea of farcasters and the TechnoCore in Dan Simmons’s novels. Famously, The Matrix—probably the most formative sci-fi movie of my lifetime—portrayed online life as a pacifying ruse injected into the brains of humans, who lived their entire unconscious lives in uterine pods so AI machines could harvest their bio-electricity. As long as the internet has existed, we have feared the tendrils of surveillance capitalism that might invade our psyches, even as we gladly welcomed them.

And here in this photo are leviathans of the internet—wealthy beyond imagination, powerful beyond comprehension—the eldritch beasts who pluck at the tapestries of our shared digital reality, cresting from the depths to congratulate a president who has promised them a new Gilded Age.


It’s not hard to imagine why Zuckerberg, Bezos, Musk, and the others were at the inauguration. Rich men want to get richer, and there’s no quicker way to wealth than to align yourself with power. But the presence of these rich men in particular, the technocrats, is foreboding.

I’ve believed for years now that information technology has nowhere else to grow. The fevered pursuit of things like virtual reality and generative AI are nothing more than smoke screens, distracting investors for a couple more fiscal cycles while tech companies scramble behind the curtains to build anything new and useful. It’s the constant bait-and-switch of an industry—of an economy—built on the concept of infinite growth. And they’re running out of parlor tricks.

The gatekeepers weep, for there are no more worlds to conquer. They’ve sold every farcaster they can in the private sector. The only solution now is the public sector: stripping agencies of regulatory power, and siphoning taxes through government contracts, so they can build on what were once untouchable planets. Together they will glean unspeakable amounts of data from us, as they literally try to build their own Ultimate Intelligence—until, presumably, they reach a point of infinite value derived from zero labor. Capitalism distilled to its purest form.

I don’t know if Nikhinson’s photo will end up in the history books. But I hope it marks a moment for us, to realize that our relationship to infotech must change, before the monsters behind it eat us alive.

A Defense—or an Elegy—for the Em Dash

(Photo by satwik arora on Unsplash)

LinkedIn is a realm of nightmares.

Soundcloud rappers turned sales gurus, spitting bars about bagging SQLs. Content strategists lauding the realism of generative AI while posting videos of writhing, twitching humanoids defying Euclid’s geometry as they attempt to pet a cat. Newlywed husbands cherishing their seaside nuptials—by musing about how it all relates to B2B marketing.

I work in B2B marketing—no, I’ve never married on the beach—and visiting LinkedIn is an unsettling necessity of my job. In the absence of agrarian tasks, it’s my version of shoveling manure. My nostrils are numb at this point to the piles of thought leadership plopping into my feed each morning. But a couple of weeks ago, I saw a post that finally broke my spirit.

Someone was slandering the em dash.

For reference, the em dash is the longest of the dashes (—) and my favorite bit of punctuation. A hyphen (-) is useful for compounding words, sure. An en dash (–) is pragmatic for showing ranges, of course. But the em dash? The em dash is rebellious and decadent. It’s the instigator of tangents, non sequiturs, and absurdities. And unlike parentheses, the em dash regards a derailment of thought not as an optional aside, but as an indispensable zigzag on an otherwise dull track of semantics. It can even replace the comma for a twist of rhetoric that demands more flourish. It happily indulges every new phrase, no matter how errant or wild. The em dash is a stalwart friend to those of us whose brains can’t think in straight lines.

And according to this LinkedIn poster, it now belonged to robots.

In the post, she said that generative AI models often use the em dash because it’s reflective of existing style guides, whereas most human authors won’t use it, mainly because the keystrokes required to make it are cumbersome. For her, an em dash was like a Voight-Kampff test, betraying that the writer behind the words was more robotic than organic. The implication was that, if someone wanted to avoid this perception, they might want to avoid the em dash entirely.

Now, I’m being hyperbolic when I say this is slander, but as a frequent user of the em dash, it necessarily puts me on the defensive. I am not a robot, at least not to my knowledge. I could easily be accused of a glitched awkwardness at business meetings and cocktail parties, but I assure you, that’s the result of pure, bio-based neurology. If anything, I’d like to think it makes me more human.

But written text is a mode of communication that I can’t afford for people to misinterpret as robotic. For one thing, it’s my job as a marketer to forge an emotional connection between customers and a brand, something that I hopefully do with integrity—and that I’ve done with no small number of em dashes in my copywriting. If customers smell the insincerity of a large language model, whether or not I’ve used one, then the bond with the brand is broken, and I’ll need to write a new resume. (Let’s hope no one thinks that’s the result of an LLM, either.)

More importantly, though, writing is my preferred conduit for words in general. As someone who has issues regulating his attention, I find that writing and editing give me a fenced area to wrangle and domesticate my thoughts—whereas spoken words feel like a herd of wild horses, and I’m supposed to somehow lasso them with Silly String. If people no longer trust the origin of written words, then I’ve lost the craft by which I’ve always felt my thoughts could best be understood.

The em dash is a casualty of the generative AI era, but it’s not the most consequential one, despite my affection for it. The greater casualty is written words in general. This ancient vessel, the word, was designed to float meaning from one brain to another. Porous though it is, it’s still the best method we have for transferring ideas from person to person without those ideas capsizing entirely. And even if those words can be strung together into untruths, you could be reasonably certain before the advent of genAI that those words had embarked from the port of a human mind.

Not anymore. In every medium from printed books to instant messaging, the existence of genAI has drained the perceived veracity of words. There’s no longer any assurance that the voice speaking to you is human rather than a tortured amalgamation of a million different voices, fashioned by an unthinking algorithm into a vaguely uncanny echo of one. This unsettling reality leaves us whipping out our magnifying glasses—sleuthing for clues like em dashes—vainly hoping to snoop out the robots so we can maintain the internet as a place of real connection.

I won’t stop using the em dash, no matter how ridiculous the keystrokes. And I won’t stop using my own neural processors to write, no matter how imperfect the results. I’m not a Luddite about LLMs: so long as the results can be reverse-engineered, there is great potential in LLMs as an assistive tool. But to delegate the writing process entirely to them is to deprive ourselves of the reason writing exists in the first place. Writing is a tool that makes us think deeper, dream bigger, reason harder, and feel stronger. It forges minds in fire, and it blazes trails between them.

The more we trust LLMs with our writing, the more we stand to lose, and not just em dashes. We will blunt the tool we’ve used for millennia to make us more human.

And the more like hell LinkedIn will become.

Bluesky Isn’t What You Think It Is

(Photo by Michal Mrozek on Unsplash)

In the waning days of the Twitter brand, co-founder Jack Dorsey was talking about his social media platform like Victor Frankenstein being tortured by his run-amok monster.

His relationship with Twitter had all the makings of a Romantic-era horror. He had stitched the platform together from a lifelong passion for instant messaging, and since then it had earned him the ire and scrutiny of presidents, Senate committees, and activist investors.

Now a certain sink-wielding oligarch was about to drain the platform of a few billion dollars of brand equity, and maybe in Dorsey’s mind, this was exactly the self-immolation the monster needed to undergo.

“In principle, I don’t believe anyone should own or run Twitter,” he said in a tweet on April 25, 2022. “It wants to be a public good at the protocol level, not a company.” Selling Twitter to SNL’s go-to Wario impersonator was “solving for the problem of it being a company,” and given the result, I can only assume he wanted that solution to be zero.

Dorsey’s altruism is mostly a self-assured myth. No doubt he saw the potential for public good in Twitter, but he was also talking about profit models from the get-go, praising the benevolence of his beast while locking it away in a tower made of venture capital and ad revenue. Despite this paradox, he was right: Twitter, up until recently, had always behaved like a protocol. Governments, corporations, activists, comedians—everyone treated it as if it were a utility. But the purchase of the platform by the meme-slinging techno-brat made it clear that Twitter was no longer a public good but a privately owned plaything.

So, is Bluesky here to save the day? By now, you’re familiar with the platform that siphoned escapees from Twitter to reach 20 million users in a relatively short time. Dorsey himself was part of its development (before inexplicably paying fealty to his original beast once again). In appearance, Bluesky behaves like a Twitter clone similar to Threads. But underneath it is the dormant dream of a decentralized social media experience, one where users have more ownership of their online presence.

So, can it separate platform from protocol as Dorsey sporadically envisioned, so each person can decide what kind of monster their social media is going to be? Maybe, but it has some adoption hurdles.

“The deep goal of bluesky is to decentralize the social internet so that every individual controls their experience of it rather than having it be controlled by 5 random billionaires. Everyone thinks they signed up for a demuskified twitter…we actually signed an exciting and bizarre experiment.”

Hank Green (@hankgreen.bsky.social), December 3, 2024

The first hurdle is getting people to understand what a protocol even is. I’ve been on the internet since the early ’90s, and I’m still educating myself on the nomenclature. Granted, most of us are familiar with interacting with protocols. If you’re reading this blog, you’re using the Hypertext Transfer Protocol with ease. And if you have an email address, congratulations: you’re already surfing on a wave of protocols (SMTP to send, IMAP or POP to receive) that let your email talk to everyone else’s, no matter where their email is hosted.

This is what the AT Protocol, the beating heart inside the rib cage of Bluesky, is trying to emulate for social media. Much like your email address, your social media profile can be hosted by a provider of your choice. In this case, Bluesky would serve as a client in the same way Outlook allows you to send and receive emails. It brings to life the idea of a decentralized social network, one where profiles are a multitude of funky houses connected by common streets, rather than one-bedroom apartments owned by a single mercurial landlord.
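If you want to see that decoupling for yourself, here’s a rough sketch (in Python, standard library only) that resolves a Bluesky handle to its permanent identifier, a DID, and then asks the public PLC directory which server actually hosts the account. Treat it as an illustration, not gospel: it assumes the common did:plc identity type and the public endpoints at public.api.bsky.app and plc.directory, and the handle is just a placeholder.

```python
# A rough sketch of AT Protocol identity resolution, not the official SDK.
import json
import urllib.request

HANDLE = "example.bsky.social"  # hypothetical handle; swap in a real one

def fetch_json(url: str) -> dict:
    """GET a URL and parse the JSON response."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Step 1: resolve the human-readable handle to a DID, the permanent
# identifier that survives moves between hosting providers.
resolved = fetch_json(
    "https://public.api.bsky.app/xrpc/com.atproto.identity.resolveHandle"
    f"?handle={HANDLE}"
)
did = resolved["did"]

# Step 2: fetch the DID document from the public PLC directory. It names
# the personal data server (PDS) that actually hosts the account, i.e. the
# part a user could, in principle, point at a provider of their choice.
did_doc = fetch_json(f"https://plc.directory/{did}")
pds = next(
    svc["serviceEndpoint"]
    for svc in did_doc.get("service", [])
    if svc.get("id", "").endswith("#atproto_pds")
)

print(f"{HANDLE} -> {did}, hosted at {pds}")
```

The handle is just a signpost; the DID is the house key. That’s the whole trick that makes moving between providers possible without losing your identity.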

Being able to host your own social media account has clear advantages. You can own, archive, and transfer your data more easily, and your profile is not as captive to the whims of ultra-capitalists. Right now, changing your Bluesky username to a self-designated domain is easy enough. But as of this writing, fully hosting your own Bluesky account requires a degree of tech savvy.
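The domain-as-username trick, for what it’s worth, works much like email’s DNS records: you publish a TXT record that ties your domain to your account’s DID (there’s also a variant that serves the DID from an HTTPS well-known file). Here’s a rough sketch of that check, assuming the third-party dnspython package; the domain and DID below are hypothetical placeholders.

```python
# A rough sketch of the DNS check behind custom-domain handles.
# Requires the third-party dnspython package (pip install dnspython).
import dns.resolver

DOMAIN = "example.com"               # the domain you want as your handle
EXPECTED_DID = "did:plc:abc123xyz"   # hypothetical DID of your account

# Bluesky looks for a TXT record at _atproto.<domain> whose value names
# the DID of the account claiming the handle, e.g. "did=did:plc:abc123xyz".
answers = dns.resolver.resolve(f"_atproto.{DOMAIN}", "TXT")
records = [b"".join(rdata.strings).decode() for rdata in answers]

if f"did={EXPECTED_DID}" in records:
    print(f"@{DOMAIN} verifies as a handle for {EXPECTED_DID}")
else:
    print(f"No matching record; found: {records}")
```

That little bit of DNS bookkeeping is the easy part; hosting the account itself is a taller order.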

This is the second big adoption hurdle to making the Bluesky dream a reality. Currently, most users are hosted by Bluesky itself. To me this isn’t a huge problem; it’s analogous to people starting accounts with WordPress.com versus hosting the WordPress CMS on their own. But like WordPress, I can’t imagine many people going through the rigamarole of hosting their own Bluesky account unless GoDaddy and the like provide similar managed services. If decentralization is truly going to be the next phase of social media, it needs to be easier to understand and accessible to people with less technical prowess.

I’m hoping for Bluesky’s success. I want an open social media protocol that gives me greater control of the content I make and consume, one that is less susceptible to Cory Doctorow’s enshittification principle. Right now on Bluesky I am witnessing more of the wacky, irascible energy I remember from Twitter’s earliest days. But it needs to become more than just an X escape hatch. I’m hoping that adoption of the AT Protocol will parlay that energy into a social media environment where curiosity, depth, and joy are easier to sustain.

Time will tell, but if we split the monster into pieces, maybe it can be more easily tamed.

Does the Jaguar Rebrand Matter?

Nothing gets the internet angrier than a brand changing its logo, even if the brand never mattered to them in the first place.

A couple of weeks ago, one of my Discord mods posted a link to Jaguar’s new logo design, hoping to get my reaction. The original Jaguar logo featured an illustration of its eponymous jungle cat with a sleek all-caps word mark in a futurist, wide-width typeface. The new logo has driven its cat to extinction, replacing it with a yin-yang double-J monogram paired with a minimalist, mixed-caps word mark. At first glance, it evokes the era of info-tech more than it does the age of 20th century luxury cars – a clear attempt to shed the stymied aura of old money and invite a new generation of wealth behind the velvet rope.

Like most reactions on the internet to a rebrand, mine was dependably knee-jerk and cynical. “It’s bad,” I said, “and people will forget they ever cared three months from now.” Time will tell if that second part is true, but I quickly walked back the first part of my keyboard curmudgeon statement in favor of something more nuanced.

For one thing, nothing in brand design is good or bad, at least not on a universal scale. Sure, it can fail to meet some practical requirements, like being too detailed for manufacturing or too indistinct for market positioning. But the scales that people use to judge a logo are weighted by culture, experience, and taste – and these vary wildly between individuals. The best one can do is fashion a logo that feels true to the brand’s narrative and tweak it to suit the palate of the target customer.

I will say, the new logo does embody the story of “exuberant modernism” that Jaguar is proclaiming through this rebrand campaign. The subtle defiance of capitalization norms, the occasional diagonal slashes on otherwise right-angled tails, even the absence of the jaguar illustration itself – all of these feel like decisions made to buck tradition with newfound creative energy.

Is this the right move for Jaguar? My guess is it couldn’t hurt. Like those of most luxury brands, Jaguar’s sales have slumped considerably since the pandemic, so it behooves them to at least paint their brand with a fresh coat of innovation, if only for the sake of cosmetics. At least it’ll dominate a PR cycle in time for holiday shopping.

But on the whole, the change leaves me with aggressively shrugged shoulders. For one thing, this ubiquitous move towards bland sans-serifs is just boring. It seems to have started with Silicon Valley juggernauts shaving their logos down to something digestible on a smartphone screen, and every other industry has felt compelled to follow suit. Maybe it’s the canary in the coal mine of an economy so dominated by tech and finance that every logo feels like it could be for a startup SaaS company.

For another, poaching the illustrated jaguar in favor of a monogram feels like a lateral move at best. I can see the monogram functioning well as an app icon or a hood decal. But there’s another shape that would’ve fit those functions equally well: the silhouette of, you know, a jaguar.

But the biggest reason for my blasé shrug is simply this: for most people, a luxury brand is not a purchase but an aspiration. It’s a thing only a lucky few will own, and the rest of us serve only to reinforce its psychological value by salivating over it. The new logo deprives the brand of its head-turning feline iconography, draining it of the signaled status its drivers wish to convey. And in the end, none of this means anything. Because I’m the proud owner of a 2018 Toyota Camry.

Anyway, see you next time we’re upset about a brand neither of us can afford.