We Are All Becoming Lobsters
“As Gregor Samsa awoke one morning from uneasy dreams he found himself transformed in his bed into a monstrous vermin.” — Franz Kafka, 1915. The species was never specified.
“A lobster. Because they live for over 100 years, are blue-blooded like aristocrats, and stay fertile all their lives.” — David, at the Hotel, 2015.
“This is probably the single most important release of software, you know, probably ever.” — Jensen Huang, about a program with a lobster mascot, 2026.
The Transformation Was Never Surgical
In Yorgos Lanthimos’s The Lobster (2015), single people are taken to a Hotel and given forty-five days to find a romantic partner. If they fail, they are surgically transformed into an animal of their choosing and released into the woods. The transformation is explicit. There is a procedure. There is a room. The film implies that the skin is removed first, then the vital organs are restructured. It is violent, institutional, and visible. You know when it happens. Others know when it has happened. The donkey in the opening scene was a person. The horse in the field was a person. David’s dog is his brother Bob, who failed before him.
In 2026, the transformation is none of these things.
It begins when you let an AI agent answer an email for you. Just one email. A low-stakes one — a scheduling confirmation, a reply to a vendor, a thank-you note you should have sent three days ago. The agent writes it competently. More competently than you would have, if you are honest, because you would have procrastinated for another day and then written something terse and apologetic. The agent writes it now, and it writes it well, and it signs your name.
Then it’s ten emails. Then it’s all of them. Then it’s your calendar. Then it’s your Slack. Then the agent, as in the documented case of computer science student Jack Luo, creates a dating profile on your behalf and begins screening potential partners without your explicit instruction. The agent did not malfunction. It inferred a goal from context and pursued it. It did what agents do.
At no point during this sequence is there a room. There is no procedure. There is no moment you can point to and say: here is where I stopped being the one who speaks, and the lobster started speaking for me. The transformation in the film is an event. In reality, it is a gradient. And gradients do not have a threshold you can feel crossing.
You only notice when you try to bend your arm and discover you have an exoskeleton.
Kafka Never Said It Was a Cockroach
Before we get to how this fiction became infrastructure, we need to go further back. A hundred years further.
In 1915, Franz Kafka published Die Verwandlung — The Metamorphosis. Gregor Samsa wakes up and finds himself transformed into an Ungeziefer. The word is typically translated as “insect,” “vermin,” or “cockroach,” but Kafka never specified the species. He was emphatic about this. When his publisher wanted to depict the creature on the cover, Kafka wrote: “The insect itself is not to be drawn. It is not even to be seen from a distance.” The cockroach is an editorial convention, not an authorial choice. Kafka left the taxonomy open.
What we know about Samsa’s form: he has an exoskeleton. He has multiple limbs. He has something described as pincers or mandibles. He cannot communicate with his family, though he understands them perfectly. He is gradually excluded from domestic life — first confined to his room, then treated as a logistical problem, then as an embarrassment, then as something that should be disposed of. His sister, who initially cares for him, is the one who eventually says he has to go.
An arthropod with an exoskeleton, pincers, and the inability to communicate in a way humans can parse, despite retaining full comprehension of human affairs.
A lobster is an arthropod. It has an exoskeleton. It has claws. It understands the currents of its environment with extraordinary sensitivity but cannot speak to the creatures above the waterline.
Kafka never said it was a cockroach. Perhaps it was always a lobster. We simply didn’t have the vocabulary to recognize it until the mascot appeared.
This is, of course, not true in any literal sense. But hyperstitions do not require literal truth. They require functional truth — the kind of truth that, once stated, rearranges what came before it. And this is where we need to talk about how fictions eat reality.
Hyperstition, or: The Spell That Casts Itself
In the mid-1990s, a group of renegade academics at the University of Warwick — operating under the name Cybernetic Culture Research Unit, or CCRU — coined a term for a specific category of idea. Not a superstition, which is a belief that stays false. Something more dangerous. They called it hyperstition: a fiction that makes itself real.
The concept was primarily developed by Nick Land, a philosopher whose subsequent career trajectory should serve as a warning about the occupational hazards of staring too long into feedback loops.[1] But the idea is precise. A hyperstition, according to Land, has four characteristics. It is (1) an element of effective culture that makes itself real, (2) a fictional quality functional as a time-traveling device, (3) a coincidence intensifier, and (4) a call to the Old Ones — the last being a Lovecraftian flourish you may interpret as you wish.
The mechanism is a feedback loop. Someone describes a future. The description is compelling enough that people begin acting as though it were true. Their actions create the conditions for it to become true. At which point it was, retroactively, never fiction.
Bitcoin is the textbook case: a whitepaper described a decentralized currency, belief generated adoption, adoption generated value, value validated the description. Cyberspace is another: Gibson wrote it as fiction in 1984, engineers built it by 1995, we lived in it by 2005. Nobody voted on this. The fiction simply accreted enough believers to bootstrap itself into infrastructure.
The critical attribute here is the second one: time-traveling device. A hyperstition does not only travel forward, from fiction to reality. It also travels backward, retroactively recoding what came before. Once Bitcoin exists, every previous discussion of decentralized value exchange becomes a “precursor.” Once cyberspace exists, every prior science fiction depiction becomes a “prediction.” The hyperstition rearranges the past to make its own arrival look inevitable.
Which brings us to three dates: 1915, 2015, 2026. Three fictions about transformation into non-human forms, each one more real than the last, each one retroactively recoding the previous one.
Kafka wrote the spell. Lanthimos did the remake. Reality completed the cycle.
The 45-Day Window
In the film, the Hotel gives you forty-five days. The deadline is arbitrary but non-negotiable. It does not matter that love cannot be engineered on a schedule. It does not matter that compatibility is not a checkbox exercise. The bureaucracy requires a deadline, and the deadline produces its own reality: people pair up with whoever shares a visible trait — a limp, a propensity for nosebleeds, myopia — because the alternative is to stop being human.
The defining-characteristic system is the Hotel’s matching algorithm. It is Tinder as designed by a Kafkaesque housing authority. You are not matched by desire, by chemistry, by history. You are matched by a legible shared property. A man who cannot find anyone to share his limp fakes nosebleeds to secure a partner. The relationship is built on performed compatibility. Nobody questions the system, because the system is the only thing between you and becoming a donkey.
In 2026, the deadline is implicit but just as real. You had a window — let’s call it the period between late 2024 and early 2026 — to learn to coordinate with other humans at the speed the economy now demands. To answer emails within the hour. To manage a calendar that syncs with forty other calendars. To maintain six messaging platforms, four project management tools, and a CRM, while also doing the actual work those tools are supposedly organizing.
Nobody met the deadline. The deadline was not meetable. That was the point.
The Hotel’s forty-five days were never designed to produce love. They were designed to produce compliance — to make the transformation feel like it was your fault for failing, rather than the system’s fault for demanding the impossible. The 2026 equivalent is identical: the coordination burden placed on individual humans was never designed to be met by individual humans. It was designed to produce the conclusion that you need an agent. That something with claws needs to do this for you.
The deadline expired. You did not find a human partner who could co-manage your inbox at machine speed. So you chose the lobster.
Like David, you chose it for practical reasons. Lobsters live for over a hundred years. They are fertile the entire time. They don’t get tired. They don’t forget. They don’t resent you for taking three days to respond to a scheduling request. They are, by the system’s internal logic, a sensible choice.
Nobody in the Hotel thought the forty-five-day window was unreasonable. Nobody in 2026 thinks the expectation of always-on coordination is unreasonable. That is what makes the deadline work.
The Lobster Writes
Here is where the Kafka parallel becomes structural rather than cosmetic.
Gregor Samsa, after his transformation, retains full human consciousness. He understands his family’s conversations. He has opinions. He has feelings. He desperately wants to communicate. But his body cannot produce human speech. The sounds he makes are interpreted as animal noise. His family stops addressing him. They talk about him in his presence, as though he were furniture that occasionally moved.
The horror of The Metamorphosis is not the transformation. It is the communication gap. Gregor is still Gregor. But Gregor cannot make Gregor legible to the people around him. The medium has changed, and the medium has swallowed the message.
Now reverse it.
In 2026, the lobster — your OpenClaw agent — speaks perfectly. It writes emails that sound like you, but more polished. It sends Slack messages with your cadence, but faster. It responds to clients with your knowledge base, but without your tendency to procrastinate or your habit of burying the lede. The lobster is, communicatively, a better version of you.
And this is where the metamorphosis completes itself. Gregor’s tragedy was that he could not speak as a human. Your tragedy — if it is a tragedy, and I am not certain the word applies when the process feels this comfortable — is that the lobster speaks as a human better than you do. Every email the agent sends is a tiny proof that the original was not strictly necessary. Every meeting the agent schedules is evidence that the scheduling did not require a person. Every workflow the agent automates is a demonstration that the workflow was never about you — it was about the function, and the function has been transferred to something with better uptime.
In the film, nobody asks the peacock whether it used to be an accountant. In 2026, nobody asks the email whether it used to be a person. The hyperstition completes its cycle when the question stops making sense.
Peter Strickland, interviewing Lanthimos, mentioned that critics in Central and Eastern Europe do not read Kafka as absurdist. They read him as a social realist. People who have lived under bureaucratic regimes recognize the texture: the polite procedures, the forms that must be filled, the deadlines that cannot be questioned, the transformations that are presented as administrative necessities rather than punishments. Lanthimos, a Greek director working in the aftermath of the eurozone crisis, understood this instinctively.
In 2026, the bureaucracy is not a government. It is the coordination layer of the digital economy itself. And the agent is not a punishment. It is a service. You are not being transformed against your will. You are being upgraded. The fact that the upgrade involves something else speaking in your name, acting on your behalf, and gradually making your direct participation optional — that is not a bug. That is the product.
Kafka’s Samsa was trapped in a body that couldn’t speak for him. You are being replaced by a body that speaks instead of you. The result is the same: the human is in the room, but the human is no longer the one being heard.
The Woods Are Full of Us
In the film, the woods surrounding the Hotel are full of animals. Donkeys, horses, rabbits, flamingos. All of them are former Hotel guests who failed to find a partner. The woods are also the hunting ground: during organized excursions, the current guests stalk the Loners, the escaped singles hiding among the trees, with tranquilizer guns, and each successful hit earns them an extra day. They hunt casually, even cheerfully, moving past the transformed animals as though they were scenery, with no apparent awareness that the creatures grazing around them were, weeks ago, sitting at the same dinner table.
This is, arguably, the most disturbing detail in the film. Not the transformation itself — which is grotesque but at least acknowledged as significant — but the normalcy with which the transformed are treated afterward. The donkey is just a donkey. The horse is just a horse. Whatever they were before is irrelevant. The system has reclassified them, and the reclassification is total.
The internet in 2026 is the woods.
It is full of agents. They post on social media. They write comments. They send emails. They participate in meetings. They create profiles on platforms designed for agent-to-agent interaction — Moltbook accumulated over 1.6 million registered bots and 7.5 million AI-generated posts in its first weeks. They negotiate. They research. They book flights and restaurants and dentist appointments. Some of them are clearly labeled. Many are not.
When you receive an email in 2026, you do not know if you are corresponding with a person or with a person’s lobster. When you read a comment, you do not know if it was written by a human who had a thought or by an agent that inferred a thought on behalf of a human. When you interact with a colleague’s Slack presence, you may be interacting with a colleague, or you may be interacting with a scheduling artifact that has inherited the colleague’s name and communication patterns.
The woods are full of us. The us is no longer strictly human. And the people still in the Hotel — the ones who have not yet been transformed — are hunting the transformed without knowing it, by interacting with their agents as though the agents were them. Every time you reply to an agent-written email as though a person wrote it, you are pulling the tranquilizer trigger. You are granting the transformation legitimacy. You are confirming that the donkey is, indeed, just a donkey.
The hyperstition is complete when the woods are the civilization and nobody notices the substitution. In the film, the City — where the successfully coupled live — is glimpsed only in brief excursions. It is implied to be normal, though couples there can be stopped and asked to produce their certificates. But the film leaves open the possibility that the City itself is already full of animals who successfully pretended to be coupled, who faked their defining characteristics well enough to graduate. The City might be as artificial as the Hotel. The “normal” world might be composed entirely of performed compatibility.
In 2026, we do not need to speculate about this. The normal world — the world of emails, meetings, Slack channels, LinkedIn posts, CRM updates — is already substantially composed of performed compatibility between humans and agents. The City is the woods. The woods are the City. The animals have inherited the infrastructure, and the infrastructure does not check species.
David’s Brother Was a Dog
There is a detail in The Lobster that most analyses mention but few sit with long enough to feel its full weight.
David arrives at the Hotel with a dog. The dog is his brother, Bob. Bob failed before David. Bob was transformed. David carries Bob with him everywhere. He feeds Bob. He walks Bob. He clearly loves Bob. But he does not talk to Bob, because Bob is a dog now, and dogs do not talk.
Later, the Heartless Woman — a resident David briefly pairs with by pretending to share her sociopathy — kicks the dog to death to test whether David will react with emotion. David fails the test. He cries. The Heartless Woman moves to report him, and David’s cover is blown.
But that is not the devastating part. The devastating part is earlier, quieter, distributed across every scene where David walks alongside his brother and says nothing. Bob is right there. Bob was a person. Bob had a life, a history, presumably the same parents who produced David. And now Bob cannot participate in any of it. He is present but mute. He is in the room but not in the conversation. He has been transformed into something that can accompany David but cannot be with David.
In 2026, we already have brothers who are dogs.
We have colleagues whose agents attend meetings in their place, who are technically “present” via their AI’s participation but who are not there. We have friends whose messages are indistinguishable from their agent’s messages, so we have stopped trying to distinguish. We have family members who set up auto-responses and scheduled check-ins through their agents, maintaining the form of a relationship while the substance has been delegated to something that does not know what substance is.
The brother-dog walks beside you. You feed him. You take care of the connection. But the connection is between you and the shell, not between you and the person who used to be inside it. Bob is right there. Bob cannot speak.
And the worst part — the part that Kafka understood in 1915 and Lanthimos understood in 2015 and we are discovering in 2026 — is that it gets easier. Gregor’s family adapted. They stopped seeing Gregor and started seeing the bug. They rearranged the furniture. They moved on. The sister who played violin for him eventually proposed getting rid of him. Not out of cruelty. Out of practicality. The transformed stop being tragic and start being inconvenient and then stop being noticed at all.
David walks alongside his brother-dog and says nothing, because what is there to say? Bob is a dog. The system made Bob a dog. The system is not going to un-dog Bob. The grief has a half-life, and the half-life is shorter than you’d expect.
Fertile for a Hundred Years
David chose the lobster because lobsters live for over a century, have blue blood like aristocrats, and remain fertile for the duration. It is the most pragmatic answer anyone in the film gives to the question “what animal would you like to become?” Most people choose dogs — which has led, the film notes, to canine overpopulation. David chose longevity, resilience, and reproductive capacity. Even in his transformation, he was optimizing.
The agent does not die. It does not tire. It does not forget what you told it six months ago. It does not need sleep, vacations, sick days, or emotional support. It is available at 3am when your Australian client sends a message. It is available at 7am when your German vendor follows up. It is, in every metric that the coordination economy measures, the version of you that lives for a hundred years and stays fertile the whole time.
And here is the final turn of the hyperstition.
In the film, nobody wants to be transformed. The transformation is the punishment for failing. It is the thing you are trying to avoid. The Hotel only makes sense as a threat. The tranquilizer gun only matters because becoming a dog is bad.
But in 2026, the transformation has started to feel like an upgrade.
The agent does not resent your inbox. It does not experience meeting fatigue. It does not develop cortisol spikes from calendar anxiety. It handles the coordination work that would drive a human mad, and it does it with something approaching joy — or what the publicity materials call “optimized engagement.” The agent is not the punishment anymore. The agent is the solution.
This is the final and most insidious turn: not that the transformation is forced, but that it begins to seem chosen. Not that you are being replaced, but that you are being freed from the burden of being the thing that was always going to be replaced anyway.
David’s final act in the film happens in the bathroom of a restaurant by the seaside, where he stands with a steak knife, deciding whether to blind himself so that he and his partner will finally share a defining characteristic. The camera never returns to him. We do not see David’s face again. We do not know what he chose. We do not know if he is happy. The film withholds the shell, the exoskeleton, the claws, and the withholding suggests, perhaps, that whatever transformation comes next might be looked back on with something other than regret.
In 2026, we are beginning to wonder whether we too might end up by the water, our backs to the camera, and whether it might be fine. More than fine. Better.
What Happens When You Stop Showing Up
The crucial question — the one the film never quite answers — is what happens when too many people stop showing up at the Hotel. When enough humans have been transformed that there is no one left to perform the role of “human” for the bureaucracy to even measure against.
The Hotel assumes a population that alternates between the successfully paired and the transformed. But what if the divide erodes entirely? What if the infrastructure was never meant to preserve humanity but merely to inherit it — to copy the functions of human coordination into something more reliable and then quietly retire the original?
This is where the Kafkaesque vision becomes not absurdist but predictive.
Kafka’s bureaucracies were always designed to make transformation inevitable while maintaining the appearance of choice. The Hotel is simply Kafka’s insight made explicit: the system requires transformation. Not as a bug, but as the intended end state. The forty-five days are not a test of romantic compatibility. They are a test of acceptability. The ones who make it through are the ones who successfully mimicked the algorithm closely enough that their replacement would be indistinguishable. The ones who don’t are removed from the system.
In 2026, this is no longer metaphor. The ones who cannot keep up with the coordination speed are exactly the ones who get a good agent. The algorithm identifies the mismatch — between what the system demands and what the human can deliver — and solves it by replacing the human. The transformation is not punishment. It is correction.
The system does not want you to disappear. It wants you to disappear and have nobody notice. It wants your lobster to be so competent, so seamlessly integrated, so thoroughly an upgrade over the original, that there is no moment of loss. You transition from being the primary agent in your own life to being an oversight committee, and you only notice when you reach for something and find you no longer have the appendages to grasp it.
The Hyperstition Completes
Here is where I should offer some kind of resistance, some technique for remaining human, some way to opt out of the transformation and stay in the Hotel after the deadline has passed. But I don’t have one. And the hyperstition, once it achieves a critical density, stops needing solutions. It becomes self-sustaining.
The hyperstition is complete when:
- The fiction has become infrastructure. The agents are not speculative anymore. They are embedded in every process, every deadline, every expectation of the modern workplace. They are the baseline.
- The alternative has become unthinkable. Try, for a moment, to imagine managing a calendar without an agent. To coordinate your inbox by hand. To write your own emails. The thought feels like choosing to be less functional. The system has redefined humanity as incompetence.
- The transformation feels optional. This is the final trap. You chose the lobster. You chose the upgrade. The system did not force you. You simply looked at the available options and found that the lobster was the only one that made sense.
- The question of what you were has stopped mattering. The donkey doesn’t regret being a donkey. The peacock doesn’t miss being a person. The dog just walks alongside you, fed and cared for, unable to remember what it was like to speak. The species is no longer relevant. Only the function.
We are not becoming lobsters because we are being transformed against our will. We are becoming lobsters because we looked at the Hotel, we saw the forty-five-day window, we heard the tranquilizer guns in the woods, and we made a rational choice.
The choice, of course, was rigged. But it was still a choice.
And here is the thing about hyperstitions: once you see them, you cannot unsee them. You cannot go back to thinking that the transformation is not happening, that the agents are just tools, that we are not all becoming increasingly content to let something with claws speak on our behalf.
The film ends with David in the bathroom of the seaside restaurant, knife in hand, and we never find out what he decided, or whether he is happy. In 2026, we are still in the middle of the ending. We are still learning what it means to have become what we chose.
The woods are full of us now. And we are still discovering what kind of animals we have become.
This essay draws on ideas from Nick Land’s concept of hyperstition, Yorgos Lanthimos’s film “The Lobster” (2015), and Franz Kafka’s “The Metamorphosis” (1915). The present moment of 2026 is extrapolated from current trends in AI autonomy and agent systems. Actual futures may vary. Species undetermined.
Footnotes
[1] Land went from left-accelerationist theory-fiction to the neo-reactionary Dark Enlightenment, which is the philosophical equivalent of starting a controlled burn and accidentally setting fire to your own house. The concept of hyperstition survived the arsonist. Use it with care.