Dark Markets

The Future as an Emergency (Part 1)


Effective Altruism and Techno-Authoritarianism

David Z. Morris
Apr 27, 2025

Hello and welcome to your premium weekend Dark Markets read - one of the final few draft excerpts from my forthcoming book before I send the whole mess over to the crew at Repeater to turn into an Actual Book. After that, the deep dives here will return to a broader diversity of topics.

Since finishing my first draft of the book in March, I took a brief hiatus before the emotionally wrenching process of editing, aka Confronting My Own Mediocrity. Luckily it hasn’t been quite as bleak as it might have been, and in fact I’ve homed in on one key theme that was latent in those early drafts.

I’ve decided to move some things around and add yet more to this already-hefty tome in the form of a new early chapter focused on the strangely widespread sense of panic and emergency among Effective Altruists and Rationalists. Most clearly expressed through EA’s constant injunction to “maximize” and Rationalist fears of “AI Doom,” the sense of emergency among adherents of these techno-utopian ideas led to a culture of rampant overwork and stimulant abuse that squashed moral self-reflection and indisputably smoothed the path to FTX’s collapse. In more extreme cases, emergency thinking led adherents to mania, psychosis, and murder.

It is through the lens of Emergency that we can see how Sam Bankman-Fried’s crimes presaged the catastrophe unfolding now that techno-utopians Peter Thiel and Elon Musk have become power brokers in the U.S. and beyond. Musk’s “chainsaw” (of which Bankman-Fried recently expressed approval) and Donald Trump’s fascist deportation regimes are both justified by the logic of a looming future emergency that demands uncritical, extralegal action in the present - exactly the logic that underpinned Bankman-Fried’s crimes.


Smokin’ that shit that predicts the future. "Priestess of Delphi" by John Collier, 1891. Public Domain, courtesy Worldhistory.org.

In the weeks before Sam Bankman-Fried’s sentencing, allies sent dozens of letters to Judge Kaplan pleading for clemency. Few personal friends of Sam’s took the trouble, whoever they might have been - most of the letters came from his immediate family members, former FTX employees, or recipients of grants from Bankman-Fried’s charitable pass-throughs.

One of the most interesting came from George Lerner, who was Bankman-Fried’s personal psychiatrist before being hired as a staff counselor at FTX.1

“I am reminded of a conversation I had with Sam sometime in the Spring of 2022,” Lerner wrote. “I had been concerned about Sam’s sleep, diet, and physical activity. I told him … that his focus on work at the expense of personal health could well shorten his life. He sat for a few seconds and shocked me with his response. He asked me if it would impact the next five years of his life, and said he was not concerned about his health past that. He explained that due to circumstances, he was in a unique place to help others right now by earning to give, and that opportunity would probably not last more than five years or so. Thus, he didn’t care what happened to him after that.”

Bankman-Fried’s sense of a limited horizon is at first hard to square with both his grand personal ambitions, and with the emergent “longtermist” vision of the Effective Altruism movement. From its roots advocating here-and-now philanthropic efforts like fighting malaria in the developing world, EA had increasingly come to concern itself with the far future of the entire human race, and with longshot “extinction risks,” such as asteroid strikes, that threatened to wipe out all of humanity - an emphasis laid out in movement cofounder and Oxford prof Toby Ord’s hulking tome, “The Precipice.”

But this seeming contradiction, between the long view and present self-sacrificing panic, is in fact a revelation: a skeleton key that unlocks the subtle perversity of Bankman-Fried’s ideological universe. Effective Altruism, in part through its commingling with the pre-existing Rationalist movement in the Bay Area, had increasingly come to view risk as statistically manageable - to regard the future, in a strong sense, as predictable through mathematics and semi-formal logic.

In turn, this meant that EAs and Rationalists viewed their own present actions in terms of their echoing impacts on the future - including when they failed to take the mathematically correct actions, causing, or failing to prevent, future harms. This radical utilitarian logic meant that every present action had close to infinite future impact - and that the faster they acted in the present, the greater compounding effect their tiniest decisions had on the future. Every breath they took, in their minds, could either save millions of future humans - or doom them.
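The arithmetic behind this mindset can be made concrete with a toy calculation. The numbers below are purely hypothetical illustrations of the expected-value reasoning described above, not figures from any EA source: multiply an astronomically large assumed future population by even a vanishingly small probability of affecting it, and the “expected impact” of a single present action dwarfs any ordinary act of charity.

```python
# Toy illustration of longtermist expected-value arithmetic.
# All numbers are hypothetical, chosen only to show how the math
# makes tiny present actions appear to carry enormous future stakes.

future_people = 10**16          # assumed far-future human population
p_averted = 10**-9              # assumed tiny chance one action averts extinction

# Expected lives saved = population at stake x probability of making a difference
expected_lives_saved = future_people * p_averted

print(f"Expected lives saved: {expected_lives_saved:,.0f}")  # ~10 million
```

Under this logic, the hypothetical numbers barely matter: as long as the assumed future population is large enough, almost any nonzero probability yields an expected impact in the millions, which is exactly how every breath could seem to save or doom multitudes.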

Sam was not the only young person in the movement to destroy themselves on the rocks of this future emergency. For Bankman-Fried, the belief that, at 31 years old, he had only five years left to “help others” provides a proximate motive and rationale for his crimes. This brief window for maximizing his utilitarian impact was a license to cut corners on conventional morality and law. It was material, mathematical predictability that made the future rigidly continuous with the present, and justified using customer property as leverage so that his inevitable wins would arrive sooner, and do more good.

Of course, Bankman-Fried’s sense of a sharp expiration date may have had additional sources, which Lerner does not share. Perhaps it reflected some subtle self-awareness that FTX was not long for this world. Perhaps it reflected the more ambient sense of climate, economic, and social doom in which so many of us luxuriate - as EA’s larger logic does, despite its perverse evasions on matters of economy and climate. Alongside those more conventional dooms, Bankman-Fried likely shared another growing Effective Altruist anxiety: that Artificial Intelligence was on the verge of a Terminator-like annihilation of mankind.

This sense of the future as a present emergency led to decisions that ironically privileged present wins over long-term consequences. Within FTX and Alameda, for instance, the use of prescription stimulants for performance enhancement was reportedly common, a chemical lever for acceleration that instead fueled the impulsive decision-making that brought the whole thing down. A culture of sleep deprivation, symbolized by Bankman-Fried’s own tendency to collapse on a beanbag rather than wasting time walking to an actual bed, would have further weakened FTX staffers’ cognitive abilities - and specifically their willpower for pushing back when Bankman-Fried directed them to commit fraud.

These details, however, wouldn’t be out of place in many technology firms staffed by young people - a manic workaholism is taken for granted in much of tech. Former Facebook executive Sarah Wynn-Williams recently offered a relevant insight into one source of this culture: Facebook consciously overworked employees so they didn’t have time to ask too many questions about the broader structure and practices of the company. Forced overwork is frequently deployed in other “high control” contexts - including cults like the Unification Church and Jonestown.

The sense of emergency had stranger and more pernicious effects in other branches of the techno-utopian network. This has included numerous reports of mental breakdowns and even suicides among members of groups like the Center for Applied Rationality, often driven by intense feelings of guilt, inadequacy, and paranoia that one is not doing enough. Bankman-Fried was not the only adherent to feel this present emergency justified criminal behavior: in one particularly extreme case, a group of sleep-deprived EA affiliates allegedly committed a wave of brutal homicides in their quest to make the world a better place.

Read More: DeepSeek and the AI Murder Cult

This self-imposed mania developed within Effective Altruism over time. The movement began as a largely laudable and common-sense set of insights about the relative privilege and responsibility of the wealthy residents of the developed world towards the global poor. But over the short span of about a decade, EA became entwined with other lines of thinking, including longtermism, extinction risk, and most toxically, a movement known as Rationalism. While EA’s roots lay in Australia and the U.K., Rationalism was watered and grown in the soil (or rather, the money) of Sam Bankman-Fried’s Bay Area home. Rationalism is a fairly formal and spectacularly well-funded movement, with a web of institutes and recognized leaders.

Bankman-Fried, leveraging stolen customer money, would become one of those funders.

But first, he had to get rich.

“The Pond is Quite Shallow”: Bankman-Fried Meets Effective Altruism

That was the advice Bankman-Fried received from Will MacAskill, one of Effective Altruism’s two main figureheads, after a talk MacAskill gave in the fall of 2012 in Cambridge, Massachusetts. Bankman-Fried was studying physics as an undergraduate at MIT, and MacAskill had contacted Sam “completely out of the blue,” according to Michael Lewis (Lewis 48-50) - or, somewhat more likely, because of posts Sam had made on utilitarian message boards2.

© 2025 David Z. Morris