

Yes, promising “returns like a good year on the stock market, but no risk” usually says Ponzi.
When the forensic accountants go through OpenAI’s books in 2027 or 2028 I would like to see whether anyone but staff and suppliers made money from it.


The subtext sounds like “we guarantee your returns, then go public. If we go bankrupt you get the retail investors’ money, if we become the next Google you get your own private island.” All you have to do is trust Sam Altman and (breaks out in hysterical laughter).
Do they mean 17.5% a year? My balanced bond-equity portfolio made 14-15% annual returns over the past three years by the radical method of buying “shares of companies that make profits” and “bonds backed by my local and national government.” (Update: I actually made about 12% a year, because I backed out of American stocks years ago, but the blandest 60% stock, 40% bond index fund in my country returned that 14-15% a year after expenses.)
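For concreteness, here is what those rates compound to (a minimal sketch in Python; the 17.5%, 14.5%, and 12% figures are the ones quoted above, and the three- and ten-year horizons are illustrative):

```python
# Compare the compounded growth of the annual returns mentioned above.
# The rates come from the comment; the horizons are illustrative.

def grow(rate: float, years: int) -> float:
    """Value of 1 unit compounded at `rate` per year for `years` years."""
    return (1 + rate) ** years

rates = [
    ("promised 'no risk'", 0.175),
    ("60/40 index fund", 0.145),
    ("my actual portfolio", 0.12),
]
for label, rate in rates:
    print(f"{label:>20}: {grow(rate, 3):.2f}x over 3y, {grow(rate, 10):.2f}x over 10y")
```

Over three years that is roughly 1.62x versus 1.50x versus 1.40x: the “guaranteed” figure barely beats the blandest index fund.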


My first degree was a professional degree, so after college I went out and got a paid job doing that, using the experience I had developed in paid summer jobs. Even when I was young I think I would have said no to Leverage Research.


Back and forth a few years ago on the SlateStarCodex subreddit, roughly:
Scott Alexander: Bay Area rationality is wonderful, we have foundations and group homes and jolly social activities and a Solstice ritual and even “Reciprocity and Propinquity: two different rationalist dating/matchmaking services”
Rando:
I don’t know, I live in a nice community in a different city where people I know have lots of Shabbat dinners, choirs, board game nights, discussions, etc. Zero people I know have joined a cult. One person I know developed psychosis, but she had a family history of psychosis, started having symptoms in early adulthood, and pretty quickly went on antipsychotics and got a lot better.
Is it just that California attracts weird shit and if you put people in California, whatever they’re already doing will get culty?
Alexander: base rates! how do your demographics compare to ours?
Rando:
Probably similar size and age? Nearly everyone I knew has parents who are teachers/lawyers/doctors/therapists/etc, so I guess upper middle class according to that book you wrote about a while ago.
It’s not like everyone’s doing great; lots of people have depression and anxiety and probably smoke more weed than is good for them. Most of those people have had those problems since adolescence.
But our rates of weird problems, like multiple people with overlapping psychoses tied to some guy, are low.


Gleiberman’s paper on the longtermist foundations of the Effective Altruism movement is great!
I read a post by someone leaving LessWrong-the-site who said that from now on he would only donate to Aubrey de Grey because obviously we are so close to curing aging. Found it: http://lesswrong.com/lw/m81/leaving_lesswrong_for_a_more_rational_life/


An Aella-curious blogger in SoCal has noticed something:
But what I find more interesting than broadly “weird sex” is the specific interest in BDSM, kink and particularly full-contact CNC; a relatively common fantasy in individuals, but one I’ve never seen such widespread community interest in outside the Bay Area.
Kink and power-play are practices of manufactured risk, with CNC clocking in at a more intense point on the same spectrum. The idea that many of these people are devoting their 9-5s and beyond to eliminating the ultimate consequence (death), only to go home and collectively play-pretend violence (scaffolded with extensive rules and consent forms) is fascinating, and, to me, makes complete sense.
The rationalist interest in manufacturing risk is the direct byproduct of their commitment to flushing it out.
The blogger attended Aella’s SlutCon. I don’t know if she knows that many of our friends have problems with consent as most of us understand it (their understanding is more “if they are old enough to sign the contract, and they sign, that is on them”).


Zoe Curzi reported that IFS was used within Leverage Research and that after she escaped the cult, she used Internal Family Systems therapy to heal and accept that Leverage did not offer unique insights.


“I Always Do What Teddy Says,” Harry Harrison, 1965 (Internet Archive)


Hofstadter and Hesse seem to be namechecked on LW much more often than Leo Strauss. I wonder if Scott Alexander talks about Strauss over coffee if he trusts you, because so much of what our friends do is supposed to fool the vulgar masses while the wise smile and know the hidden truth.
I wonder if the real secret to Vassar’s influence is that he influenced the leaders of Bay Area LW like Alexander, Anissimov, Constantin, Zvi Mowshowitz, and Salamon.


Thanks! I don’t get the impression that Michael Vassar posts or publishes much under his own name; he seems to prefer cornering susceptible people at events and then having private conversations and correspondence with the ones who respond in a promising way. The clearest description of his jailbreaking that I have read is by Scott Alexander in a back-and-forth with Jessica Taylor (and we know Scott Alexander tries to hide some of the beliefs he cares the most about).
In a LessWrong thread people just point to a deleted Twitter account and some YouTube videos by Vassar.
RationalWiki briefly mentions earlier woo about brain hemispheres.


That specific instance of Archive Today seems to have been taken over by activists who edit their copies of some pages and have carried out a DDoS attack (although all I know comes from social media posts and news stories). https://www.avclub.com/archiveis-under-fbi-investigation


Well, I think the Buddhist idea that the self is an illusion goes back 2500 years or more, but Douglas Richard Hofstadter might have introduced nerdy American sci-fi fans to the idea.


The Cut seems to like articles on cults and abuse within small groups: they have an article on the Zizians, and one on a Neo-Tantric sex group where Aella would feel at home.


I heard somewhere that “there is no unitary self” can be a Buddhist teaching, and TPOT draws on Western Buddhism. There is work to be done figuring out where they got their eclectic mix of techniques and terminology.


Has anyone heard of the Internal Family Systems Model? One of the CFAR founders said he relied on it when designing self-help workshops. IFS encourages you to see yourself as a system of entities and talk to them separately, which reminds me of Ziz Lasota’s two-hemispheres theory and Michael Vassar’s jailbreaking.


Forming a single legal entity would have made it hard to protect the other projects if the CFAR side had lost a lawsuit over abuse of a minor at a CFAR event, or if Lightcone had lost a judgment over taking money from FTX and had to sell the Rose Garden property. I know these people don’t do “fear of frequent consequences of ordinary human weaknesses,” but that is a big risk.
I also wonder who served as treasurer and bookkeeper for each project. If one person served both projects, he or she could have caused all kinds of trouble, even if there were separate bank accounts.


Back in 2019, Ben Pace of Lightcone said that CFAR and Lightcone were one legal entity with two non-overlapping boards. Did CFAR + Lightcone really spend $22 million on real estate in Berkeley without spending a few grand to create a separate nonprofit and separate the finances? In 2024, CFAR still had the real estate and the mortgage on its books. https://www.lesswrong.com/posts/eR7Su77N2nK3e5YRZ/the-lesswrong-team-is-now-lightcone-infrastructure-come-work-3
I have never opened a US business bank account, but I would think it hard to keep the accounts separate if one organization has no independent legal existence, and transactions in the millions or tens of millions tempt even the most righteous person to stick his fingers in the till.


If Duncan Sabien or Eliezer Yudkowsky admitted what they were doing, they would have to take on more responsibility. If they moved from chanting “we have noticed the skulls” to expecting every serious member to be able to describe times that nerdy subcultures they were not part of went wrong, they would have to give up most of what they do.


CFAR seems to have pivoted back to focusing on the workshops. Their winter 2025/2026 fundraiser raised only $10k against a goal of $125k. The curriculum sounds very New Age:
If you’ve been to a CFAR workshop in the ~2015-2020 era, you should expect that current ones: … Have roughly 1/3rd new content, mostly aimed at practical ways to be less “seeing like a state” when applying rationality techniques, and to be more “a proud gardener of the living processes inside you / a free person with increasing powers of authorship.” (We’ve been calling this thread “honoring who-ness.”)
No masks in their photo of a workshop posted February 2025 (2024 was a pretty bad year for airborne infections where I live, and alienated educated young people are more likely to wear respirators than normies, so I would expect to see someone in that room wearing an N95 or a Flo). If building warm and nurturing relationships is important, then it helps to be able to eat together and see each other’s faces. The venue is about a 90-minute drive from Oakland, CA (the East Bay).
This paragraph leapt out at me:
On Day 4 of the four day workshop, we spent three and a half hours on an activity called Questing, in which participants took turns being the “hero” (who worked on whatever they liked) and the “sidekick” (who assisted at the hero’s direction) for ~10 minute chunks. This activity was extremely well-liked (did best of all activities on our survey; many said many great things about it).
If you read that and say “doesn’t that sound like Effort Exchange in the Dragon Army Barracks?” you should go home and rethink the regrettable things you learn on the Internet. I look forward to reading the book on LessWrong, the splinter sects, and just how much they had in common after a hard day gardening in a post-apocalyptic wasteland.
Before FTX collapsed, my model of LW was something like cryptozoology enthusiasts who trade posts and sometimes meet at a con; now it’s more like Scientology. Early Scientology offered a community and a path to self-improvement.
Sounds like someone in a jurisdiction that requires all-party consent to record a conversation could sue them.