Scott Alexander Siskind — of the long-running blog Slate Star Codex and the newer Substack newsletter Astral Codex Ten — cannot be trusted as a source or a commentator. His rhetorical style implies intellectual seriousness but proves, on close examination, to be a thin façade; more importantly, he carries water for far right ideas which are transparently wrong both factually and morally.
The core critique
Elizabeth Sandifer says:
Here’s what any good overview of Scott Alexander is going to need to say.
- He emerged out of Eliezer Yudkowsky’s LessWrong community. This community consists almost entirely of hobbyists attempting to reinvent academic disciplines from scratch. The results occasionally have moments where they frame interesting questions or perspectives that might not occur to an expert. The signal-to-noise ratio, however, is extremely low. Most of the time they just fall blindly into well-documented errors and refuse to admit it.
- Alexander belongs to a particular subgroup of that community who has gotten seduced by pseudoscientific ideas about the genetics of race. These ideas should be dismissed the same way climate denial is. Alexander is good at looking like the wide-eyed, innocent speculator who’s merely asking questions. Maybe he actually is. In practice, that community’s tolerance for racist pseudoscience is routinely exploited by white supremacists.
- This is extra specifically true for Alexander himself, whose blog has created a community that is extremely fertile grounds for white nationalist recruitment. Alexander cannot plausibly claim ignorance of this. If you’re writing any sort of overview that does not have a very clear view of those three facts, you’re probably going to end up directing attention and curiosity towards Alexander’s community and in doing so aiding white nationalist recruitment.
Sandifer wrote a book on that circle of people, Neoreaction: A Basilisk, which I recommend as both entertaining and insightful in understanding neo-reactionaries (“NRx”). I maintain my own index of resources about NRx; they are a far right movement distinct from fascism but no less evil in their opposition to democracy and equality.
Alexander’s writing is dangerously credulous about far right ideas. One need not know more than that.
But a decent respect to the opinions of mankind requires that I should declare the causes for concluding that Alexander’s thinking is so dangerously bad, that I should examine why Sandifer and I think he might be a useful idiot for reactionaries rather than really one of them, and that I should examine the implications.
Rhetorical & intellectual sloppiness
Alexander admits that he does not write carefully.
It takes me a couple of hours to write a post.
I work a forty hour week, so having a couple of hours each week to write posts isn’t really a problem. In my own life, I've noticed that time is almost never a real constraint on anything, and whenever I think it is, what I mean is “I have really low energy and I want some time to rest before doing the next thing”. But writing posts doesn’t really take that much energy, so I am okay with it.
Also, I have no social life and pretty much avoid all my friends and never talk to anybody, which is helpful.
I guess I don’t really understand why it takes so many people so long to write. They seem to be able to talk instantaneously, and writing isn’t that different from speech. Why can’t they just say what they want to say, but instead of speaking it aloud, write it down?
Sandifer does a deep dive into the implications of his resulting rhetorical style in her essay The Beigeness, or How to Kill People with Bad Writing: The Scott Alexander Method:
My contention is that Siskind’s prose — which I view as representative of a larger style — works through a sort of logorrheic beigeness. Siskind is good at giving readers the sense that they are being intelligent — that they are thinking about serious issues at considerable length. In practice, he says… not quite nothing, but very little, at least on a moment to moment basis. Instead he engages in a litany of small bullshits — shoddy arguments that at their best compound into banality, but at their worst compound into something deeply destructive, all made over such length that smoking guns are hard to find, which is of course the point.
I have quibbles with some of the particular criticisms Sandifer makes in the full post, but her central point is correct: Alexander’s writing demonstrates a kind of anti-rigor in a form which obscures its worst implications.
Examples of faux intellectual rigor
Kevin Drum looks closely at Alexander criticizing the FDA and discovers that he is intellectually incoherent:
I mentioned in a section of my recent post, “Sympathy For The Devil”, that I think the FDA as an agency is often quite good. They’re smart, caring people, and they usually carry out their mandate well — so well that the few exceptions, like aducanumab, are highly newsworthy. I have no objection to Dr. Gura’s mostly-positive portrayal of them.
This bears no resemblance — none — to [Alexander’s] diatribe in Part 1:
Every single hour of every single day the FDA does things exactly this stupid and destructive. [⋯] I am a doctor and sometimes I have to deal with the Schmoe’s Syndromes of the world and every f@$king time there is some story about the FDA doing something exactly this awful and counterproductive.
I have no idea how you can write “they usually carry out their mandate well” in one place and then, in your main post, just go ahead and repeat your original belief — backed by an example you know is wrong — that the FDA does stupid and destructive things on practically a daily basis.
Alexandros Marinos, whom I read as engaged-with-but-skeptical-of the “Rationalist” community, says:
At this point I treat Scott Alexander’s writing as an infohazard. Unless you are willing to check his facts and citations, it is probably inadvisable to read his material, as it is constructed to build a compelling narrative.
But watch the lemmings line up and jump off a cliff, obviously taking Scott Alexander, who has already admitted to falsely accusing multiple scientists, at his word.
Another Thursday link, by Tyler Cowen, March 28, 2024, in Current Affairs, Medicine:
Scott Alexander rules against Lab Leak as most likely. Of course this is not the final word, but it is fair to say that Lab Leak is not the go-to hypothesis here. Analytical throughout. And Nabeel comments.
Unless and until Scott Alexander commits to adopting a robust editorial process where blatant errors that are reported to him are corrected promptly, his work should be read as fiction “based on a real story, sorta”.
I guess I hate posting vague things so I will provide some concrete, black-and-white evidence. Here is part of an email I sent to Scott, notifying him that two funnel plots that he used in his response to me were only showing a subset of the data they claimed to show:
Hi Scott,
I fell ill between my last email and when you published your response, so I didn’t have time to send you my response to your query on TOGETHER. In any case, see below for
- some obvious corrections to your final piece
- a part of what I was going to write to you about TOGETHER
- the response I owed to you about the meta-regression issue you raised
1/ I think it’s my duty to let you know that both the funnel plots you presented are mislabeled. The first one includes only 14 early treatment studies (not all 48 ivmmeta studies with mortality results) and the second one includes only peer-reviewed studies and not all 95 studies. What’s more, there’s no information about how the funnel plots were computed, making them of little use as actual funnel plots to make any conclusions from. I obviously have a long list of concerns, but I don’t expect you’re interested in me giving you the detailed fact-check of the piece. The above are obvious issues that you should know about. I don’t expect a response on these issues, I’m just making sure you’ve been informed.
This was an important correction to his piece as the errors I pointed out were load-bearing in the argument he was attempting to make. I chose to inform him about corrections that were straightforward so I could confirm for myself, again, that he was not interested in facts.
Scott responded to a different part of the email, but, as I expected, he ignored this part, which I had placed first.
So let's check if the errors have been fixed now, over a year later.
The piece is here: Astral Codex Ten | Response To Alexandros Contra Me On Ivermectin. If you read it, please know it contains many false representations and outright errors. But let's focus on the specific black-and-white issue I reported to him in Feb 2023.
Surprise! The blatant errors are still there, and still falsely described as representing more data than they actually do.
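(An aside for the uninitiated, my gloss rather than Marinos’s: a funnel plot scatters each study’s effect estimate against its precision, so that gaps or asymmetry hint at publication bias; that is why a plot silently restricted to a subset of the studies it claims to show is useless for drawing conclusions. A minimal sketch on entirely hypothetical, simulated data:)

```python
# Sketch of a funnel plot on simulated data — illustrative only,
# not the ivmmeta data under dispute.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
true_effect = 0.0                      # log odds ratio under "no effect"
se = rng.uniform(0.05, 0.6, size=48)   # hypothetical standard errors, one per study
effects = rng.normal(true_effect, se)  # each estimate scatters by its own SE

plt.scatter(effects, se)
plt.gca().invert_yaxis()               # convention: most precise studies at the top
plt.axvline(true_effect, linestyle="--")
plt.xlabel("Effect estimate (log odds ratio)")
plt.ylabel("Standard error")
plt.title("Funnel plot, simulated: asymmetry would suggest publication bias")
plt.show()
```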
In the same piece he admits, though he downplays, that he slandered a group of scientists out of Israel by accusing them of academic fraud. In fact, they had done one of the most credible and straightforward RCTs on Ivermectin.
You can find Alexandros’ full critique here. His main concerns are:
- I claimed that the primary outcome results were hidden, probably because they were negative. In fact, they were positive, and very clearly listed exactly where they should be in the abstract and results section.
- That makes my dismissing their secondary outcomes as “the kind of thing you do when your primary outcome is boring and you’re angry” incorrect and offensive. The correct thought process is that their primary outcome was positive, and their secondary outcome was also positive, which they correctly mention.
- Gideon Meyerowitz-Katz objected to the researchers changing the (previously preregistered) procedure partway through. But the researchers had good reasons for doing that, they got the IRB’s permission, and they couldn’t have been cherry-picking, because they hadn’t seen results yet and didn’t know whether this would make ivermectin look more vs. less effective.
- Gideon (correctly) phrased this as a non-sinister albeit potentially weird misstep by the study authors, but in trying to summarize Gideon, I (incorrectly) phrased it as a sinister attempt to inflate results.
This correction took me almost a year to extract.
On Scott’s mistakes page he lists 57. That looks like good practice, right? He even credits me for identifying several of those. Except of course what he doesn't tell you is that while I was doing that work he was busy tarring me as a conspiracy theorist.
My list of errors and misrepresentations in the one piece I analyzed alone is 98. Over 40 of them are black-and-white factual errors like copying the wrong number from a paper. The errors, except, I think, one, all worked against ivermectin and in favor of the thesis he was pushing.
And while doing this work, I had to endure serious personal attacks both from him, and from his followers who took his signal and ran with it, while I was demanding that he fix his very serious, sometimes defamatory, errors. I did that work because I believed his claims that he was interested in truth, and felt that, when he was faced with the scope and scale of the errors he made, he would have the decency to retract the piece, as correcting all the errors would render it nonsensical.
What I got back was the piece linked above which is a passive-aggressive exercise in cherrypicking, minimizing, admitting what he could not avoid, and obfuscating. Attempting to shore up his original position, he even included new claims about the core topic that had even more errors in them (as you can see above, and that is a very small sample).
I hope now you understand why I said that Scott Alexander's writing should be read as an infohazard, fictional accounts blended with real ones, presented as science. And also why I said that he has no interest in correcting even black-and-white errors shown to him.
In a later Twitter exchange about this, Marinos concludes:
Scott is not a reliable narrator but he is an excellent author, and therefore finding yourself convinced by what he writes is both to be expected and not to be trusted.
Beyond this, I also have observed that he is obsessed with finding social ways to triangulate to the truth by seeing how different people react to each other etc, and what he fails to understand, which seems to be common with rationalists, is that there is no substitute for diving into the primary data.
This critique extends into Alexander’s community.
Rationalists used to believe that they can use their methods to outthink the world.
Now they use the methods to shame others for daring to believe they can outthink the world.
A cult of aggressive midwittery.
This confusion and mendacity in addressing tricky questions extends to being badly wrong about easy questions — logically, factually, and morally.
Very bad ideas
Credulity about NRx
Consider a relatively mild example from Reddit in 2019. For the uninitiated, The Cathedral is NRx’ers’ name for the conspiracy of leftists which they imagine holds an iron grip on all meaningful institutions.
When the nrxers talk about the Cathedral, I find it tempting — sure, they flirt with conspiracy theory, but it seems they’re at least good conspiracy theories, in the sense that they explain a real phenomenon. Kennedy assassination conspiracy theories have their flaws, but one of their strong points is that Kennedy is in fact dead. If you’re coming up with a conspiracy theory to explain why people are biased in favor of capitalism, that seems almost like coming up with an Obama assassination conspiracy theory — not only are conspiracy theories bad, but this one doesn’t even explain a real fact.
Trump got elected after promising tariffs and immigration restrictions that no business or plutocrat wanted. Bernie Sanders was on top of the prediction market for next Dem nominee as of last week (today it’s Biden, but Sanders is close behind). The richest people in the world spend most of their time constantly apologizing to everyone for trumped up charges, and loudly (one might say fearfully) confessing they don’t deserve their wealth. This just really doesn’t seem like the world where capitalism is in control of the narrative, unless it’s doing some weird judo I’ve never heard communists try to explain.
So Alexander says here:
Though it is silly of reactionaries to see a conspiracy, capitalism has lost control of society, as NRx’ers say. And where is the evidence? In critiques of capitalism? Nope. In discussion of alternatives? Nope. But some policies do not perfectly suit the liking of rich people!
This is detached from logic and reality.
Then Alexander doubles down and gets weirder.
The latest studies suggest that the rich do not get their policy preferences enacted any more than any other class (a study came out before showing the opposite, but seems to have been wrong). I’m not sure what else you mean by “capital really is in power”, other than that rich people can buy yachts or something.
I’m tempted to take an extreme contrarian position that everything interesting happens in a parallel status economy. The money economy isn’t “in power”, it’s a (weak) brake on power, or a force orthogonal to power which is helpful in not concentrating power 100%. That's why overthrowing capitalism keeps producing authoritarians.
Where is the “yes let’s overturn capitalism” side of the debate represented? Certainly not in the editorial line of any major newspaper, TV station or radio station.
I mean, it’s better represented than libertarianism. Yes, the Overton Window goes between “slightly more capitalism” and “slightly less capitalism”, but the “slightly less capitalism” side always seems to have the upper hand. I agree the war of ideas isn't yet a total massacre, I’m just saying the anti-capitalist side always seems to be winning, and the pro-capitalist side on the defensive. Propaganda victory exerts a weak pressure on reality, it doesn’t shift it all at once.
So Alexander says here:
Since capitalism is not a locus of power, calls for Slightly Less Capitalism will eventually develop into a massacre.
Preposterous.
On SSC, he offers other “insight” he would harvest from NRx:
I’ve said many times that I think the Reactionaries have some good ideas, but the narrative in which they place them turns me off (I feel the same way about Communists, feminists, libertarians, et al). Even though I like both basic income guarantees and eugenics, I don’t think these are two things that go well together — making the income conditional upon sterilization is a little too close to coercion for my purposes. Still, probably better than what we have right now.
We must skip over how ripe it is to put neo-reactionaries, libertarians, feminists, and communists in the same category to pay attention to …
Eugenics?!?
Among the good ideas to draw from reactionaries Alexander finds … eugenics? Yikes.
With his rejection of “coercion” Alexander has reassured us that he would not march “undesirables” into death factories at rifle-point. This is cold comfort as one untangles what he does mean.
He brings up eugenics in more than just this bizarre aside, so we can try to make sense of things. On his LiveJournal back in 2012 he asked …
So if you had to design a eugenics program, how would you do it? Be creative.
I’m asking because I’m working on writing about a fictional society that practices eugenics. I want them to be interesting and sympathetic, and not to immediately pattern-match to a dystopia that kills everyone who doesn’t look exactly alike.
To be generous: contrarian science-fictional world-building as a whetstone for thinking about principles has a noble tradition which includes satire, cautionary thought experiments, and visualizing dystopian outcomes. But one must be wary with topics like eugenics where bad actors speaking in bad faith do a lot of Just Asking Questions as a veil over the monstrous answers they have in mind. Alexander is not treading nearly carefully enough.
The discussion community does not respond with the mortified “whut?!?” they should. Instead, one commenter replies with …
Paying undesirables to be sterilised is happening! There’s a charity that pays drug addicts £200 to be snipped: Project Prevention. Seems like a good idea to me.
How about a benign eugenics regime that is about preserving diversity of human mental types while minimising disease? Everyone is profiled, and nerdy Aspergers types are encouraged to mate with empathisers rather than other nerds, ensuring that they don’t make full-on autistic babies. Some of the funniest, most creative people I know are definitely touched by the spectrum and have fully autistic relatives in some cases, so the old-fashioned eugenics response of sterilising everyone who is even vaguely autistic would destroy a lot of human capital.
In general, a eugenics regime that isn’t pushing toward a single human ideal, but is aware of the value of diversity, could be sympathetic. Maybe go the other way and have them maintain castes of specially bred ultra-systematisers, ultra-empathisers, synaesthetes, etc. The key to avoiding a retread of Brave New World or Morlocks/Eloi is that the castes are not ranked, and everything is done to make each caste happy. There would have to be safeguards to stop the empathisers manipulating everything for their own benefit — what would those be? At some point, are the castes reproductively isolated? What if there is some slow-motion catastrophe where humans will have to be very different a few generations hence — maybe it becomes obvious that climate change will collapse advanced civilisation and humans have to rebuild from hunter-gatherer level, so it becomes necessary to breed robust humans who’ll survive a population bottleneck ...
Alexander as squid314 responds to this with none of the pointed questions one should ask. (“Castes? Human capital?! Arranged mating?!? Undesirables!?!”) Instead, he is enthusiastic about this creepy real-world organization:
I ... actually think I am probably going to donate to that charity next time I get money. Though I’d feel better if it was something more reversible.
Ew.
And. Of course. Yes, this does go where one expects talk of eugenics to go …
Racist pseudoscience about IQ
On SSC in 2016, Alexander praised Charles Murray, co-author of the notoriously racist bullshit The Bell Curve.
The only public figure I can think of in the southeast quadrant [of an imagined political compass for poverty policy] with me is Charles Murray. Neither he nor I would dare reduce all class differences to heredity, and he in particular has some very sophisticated theories about class and culture.
That is not the only time:
my impression of Murray is positive [⋯] One hopes Charles Murray pursues what he thinks is true, and any offense caused is unintentional
Neither post directly supports Murray’s racism. Alexander at least poses as someone who rejects it, linking in 2017 to an article which offers “five modest proposals” …
- The idea that some people are inferior to other people is abhorrent.
- The mainstream scientific consensus is that genetic differences between people (within ancestrally homogeneous populations) do predict individual differences in traits and outcomes (e.g., abstract reasoning, conscientiousness, academic achievement, job performance) that are highly valued in our post-industrial, capitalist society.
- Acknowledging the evidence for #2 is perfectly compatible with belief #1.
- The belief that one can and should assign merit and superiority on the basis of people’s genes grew out of racist and classist ideologies that were already sorting people as inferior and superior.
- Instead of accepting the eugenic interpretation of what genetic research means, and then pushing back against the research itself, people — especially people with egalitarian and progressive values — should stop implicitly assuming that genes==inherent merit.
… saying of the article’s proposed description of this position as that of the “hereditarian left”:
This seems like as close to a useful self-identifier as I’m going to get.
This comment suggests that Alexander accepts Murray’s deceitful pretense “golly, I’m just saying that since different people inherit different intellectual talents & temperament, we have to address that with policy somehow”. But if one attempts to go where angels fear to tread and undertake a hard-headed, big-hearted, politically-egalitarian examination of the implications of our differing genetic endowments, one should stay the heck away from Murray, whose claims about research in this area are overwhelmingly nonsense, transparently offered in bad faith. So even if Alexander is not knowingly promoting Murray’s most evil and disingenuous arguments, taking Murray as credible demonstrates catastrophically bad judgement.
Nor is this the only time Alexander credentials “scientific racists”. In the course of addressing evil crackpot pseudoscience about intelligence, Kiera Havens’ Medium post Oroborous racks up pointers to these and other examples of how Alexander is deeply entangled with that movement.
Siskind chose to deliberately hide his affinity for race science from his writings on his popular blogs, SlateStarCodex and AstralCodexTen. When 2014 emails in which he detailed his strategy for mainstreaming hereditarianism came to light, Siskind (with all the confidence of a toddler emphatically declaring through crumbs and chocolate that THEY did not eat that cookie) posted a categorical denial on another one of his websites, raikoth.net [a reference to the utopian society Siskind spent years developing]. The same website linked to an account used six months prior to solicit resources on Ashkenazi IQ to improve the arguments on “my side”. The recommended items later emerge as a lengthy post on SlateStarCodex where he finds the same discredited theory Pinker promoted “really compelling”.
[⋯]
Siskind defends the genetic basis of IQ in 2016 and 2021, often citing Plomin (who was wrong in many different ways).
[⋯]
A now deleted 2017 comment has him argue that the science isn’t settled and skull-measuring is actually a scientifically rigorous way to determine cognitive ability. When challenged on the welcoming atmosphere he is creating for ‘race science’ and its proponents (also in 2017), Siskind says that people on Twitter seem to think Emil Kirkegaard is okay, a claim that Kirkegaard later uses to convince himself he’s not a crackpot. To put a finer point on this one — Kirkegaard is *also* installed at [self-described scientific racist] Richard Lynn’s publishing mill [the Ulster Institute for Social Research], started his own self-published, self-reviewed journal to publish folks like Willoughby and Fuerst (and host their conversations), and as part of his truly enormous body of work to promote scientific racism, spent years seeding Wikipedia with hereditarian talking points.
As Havens points out, support for these guys rarely turns up in SSC essays themselves (which is how I missed it for a long time). But a sharp eye can find it. On SSC in 2020 Alexander said:
Normally this would be a hidden thread, but I wanted to signal boost this request for help by Professor Steve Hsu, vice president of research at Michigan State University. Hsu is a friend of the blog and was a guest speaker at one of our recent online meetups – some of you might also have gotten a chance to meet him at a Berkeley meetup last year. He and his blog Information Processing have also been instrumental in helping me and thousands of other people better understand genetics and neuroscience. If you’ve met him, you know he is incredibly kind, patient, and willing to go to great lengths to help improve people’s scientific understanding.
Hsu is unmistakably aligned with Holocaust deniers, white nationalists, and racist pseudoscientists. Not maybe kinda. Not by coy implication. Directly. He is not someone one should support, have in one’s community, or point to for help “understanding genetics and neuroscience”. Alexander’s support for Hsu is unforgivable.
Alexander also financially sponsors Quillette, another bad actor so pernicious that I keep a page about them.
Quillette is an instrument for credibility-washing evil far right pseudo-intellectual bullshit.
I am sympathetic to people who get fooled by an article from them. They publish a lot of genuinely intriguing contrarian articles, often by left-leaning commentators, to create an impression that they are a venue for smart, serious, adventurous ideas. But this is a ploy, to create a good impression so that one becomes open to entertaining one of their articles arguing Oh So Reasonably for [⋯] racist & sexist pseudoscience, nonsense about “censorship” by the Left, and even doxxing “antifa” journalists knowing that the violent fascist cult Atomwaffen used their article as a “Kill List”.
It would be bad enough if Alexander shared an article from them, or a pointer to them. But he gives them money. Unforgivable.
Support for an NRx-ish abuser
GorillasAreForEating on Reddit has a damning timeline of rape-y Rationalist community culture which has a lot about Brent “ialdabaoth” Dill, whom Alexander supported in his community in a demonstration of astonishingly bad judgement.
A 2001 LiveJournal post from Dill says:
I firmly believe that it’s nearly every man’s dream, somewhere deep inside, to have a harem of beautiful women that he objectively owns. Whether it is or not, it’s certainly MY dream.
I keep enough company with libertines to wholeheartedly defend unwholesome daydreams, polyamory, and kink as good clean fun for consenting adults, but I have to recommend against clicking through the “read more” of that post; it made me want to bathe in bleach, and not just because it describes then-28-year-old Dill as delighting in sexually dominating a sixteen-year-old.
I cannot guess whether Alexander ever saw that post, but it informs reading Dill’s comments on SSC, where he was a frequent commenter by 2014. In April Dill said:
I suppose ‘reactionary’ is just the closest affiliation I can latch onto; my actual worldview is a weird sort of nihilistic, depersonalized, ultra-authoritarian fascism straight out of 1984, so it’s kinda hard to find people to flag tribal affiliation towards. 🙁
Alexander did respond to that comment “I am hard to creep out, and you are creeping me out”, but Dill remained welcome in SSC comments. Commenting on an August post Dill said:
manipulation is my only natural skill, and the one that I’ve honed the most. (Remember, narcissistic upbringing; probably a lot of unpleasantly narcissistic tendencies in myself as well.)
I completely get and agree with Neoreaction, my only objection is about scale. In the world I want to live in, I am a Sovereign King of my own household, where the only options are Obedience and Exit.
I lived with a harem of attractive, submissive women who called me their ‘Master’ and pretty much voluntarily structured their lives around making me happy.
A month after those comments, Alexander did a SSC post forwarding a lengthy plea for financial and other support for Dill saying, “If you read the comments at SSC, you’ll recognize him as a contributor of rare honesty and insight.”
Four years later, multiple community accounts of Dill’s abusiveness surfaced.
Alexander justifies his legitimization of NRx
In 2021, Topher T Brennan shared a 2014 email Alexander sent him defending his thinking and motives in addressing reactionaries’ ideas.
I’ve decided to say “screw it” and post receipts showing that Scott Siskind (the guy behind Slate Star Codex) isn’t being honest about his history with the far-right.
The context is that I’d been publicly critical of the rationalist community’s relationship with a branch of the online far right that called themselves “neoreactionaries”, and Scott (a vague internet acquaintance at the time) basically responded by saying, “oh, I agree the people you’re thinking of don't have much of value to say” but offered to point me to supposedly “better” examples of neoreactionary thought. This is what he sent me—something I was very much not expecting. (And no, he did not first say “can I tell you something in confidence?” or anything like that.)
Posting this now because Scott is accusing Cade Metz [author of the NYT article Silicon Valley’s Safe Space: Slate Star Codex] of dishonesty and a lot of people are jumping on that to smear Metz and the NYT. The thing is, Metz never said Scott endorsed the far-right or anything like that — just that the Slate Star Codex community was far more welcoming to the far-right than to so-called “SJWs”. That’s a simple fact that has been a matter of public record for years. Scott and his defenders say it’s dishonest to point that out because it might lead people to infer Scott is far more sympathetic to the far-right than he’s admitted publicly. But the inference is correct.
I feel a certain hesitation about re-sharing a message Alexander sent in confidence, but many Alexander critics reference it so the cat is already out of the bag … and it is too illuminating to ignore.
Some context for the uninitiated:
- “HBD” stands for “human biodiversity”, a term used by people promoting intellectually dishonest racist pseudoscience about how different “subgroups” of humanity are different from each other, focusing of course on differences in intelligence & temperament
- “LW” is short for LessWrong, the Rationalist forum focused on the ideas of the weird crank Eliezer Yudkowsky whom Sandifer criticizes in the references at the top of this post
- Robin Hanson is a creepy crank prominent in the Rationalist community
- “Moldbug” is the nom de guerre of NRx leader Curtis Yarvin
- RationalWiki is a debunking wiki maintained by skeptics (despite the name, it is not a product of the capital-R Rationalist community) — a useful place to start when looking for resources debunking bad ideas & bad actors, and you’ll notice that this post points to their article on Alexander early on
I said a while ago I would collect lists of importantly correct neoreactionary stuff to convince you I’m not wrong to waste time with neoreactionaries. I would have preferred to collect stuff for a little longer, but since it's blown up now, let me make the strongest argument I can at this point:
1. HBD is probably partially correct or at least very non-provably not-correct.
https://occidentalascent.wordpress.com/2012/06/10/the-facts-that-need-to-be-explained/
http://isteve.blogspot.com/2013/12/survey-of-psychometricians-finds-isteve.html
This then spreads into a vast variety of interesting but less-well-supported HBD-type hypotheses which should probably be more strongly investigated if we accept some of the bigger ones are correct. See eg http://hbdchick.wordpress.com/2012/11/08/theorie/ or http://en.wikipedia.org/wiki/Albion%27s_Seed.
(I will appreciate if you NEVER TELL ANYONE I SAID THIS, not even in confidence. And by “appreciate”, I mean that if you ever do, I’ll probably either leave the Internet forever or seek some sort of horrible revenge.)
2. The public response to this is abysmally horrible.
See for example Konk’s comment http://lesswrong.com/r/discussion/lw/jpj/open_thread_for_february_1824_2014/ala7 which I downvoted because I don’t want it on LW, but which is nevertheless correct and important.
See also http://radishmag.wordpress.com/2014/02/02/crazy-talk/
3. Reactionaries are almost the only people discussing the object-level problem AND the only people discussing the meta-level problem.
Many of their insights seem important. At the risk (well, certainty) of confusing reactionary insights with insights I learned about through Reactionaries, see:
http://cthulharchist.tumblr.com/post/76667928971/when-i-was-a-revolutionary-marxist-we-were-all-in
http://foseti.wordpress.com/2013/10/23/review-of-exodus-by-paul-collier/
4. These things are actually important
I suspect that race issues helped lead to the discrediting of IQ tests which helped lead to college degrees as the sole determinant of worth which helped lead to everyone having to go to a four-year college which helped lead to massive debt crises, poverty, and social immobility (I am assuming you can fill in the holes in this argument).
I think they’re correct that “you are racist and sexist” is a very strong club used to bludgeon any group that strays too far from the mainstream — like Silicon Valley tech culture, libertarians, computer scientists, atheists, rationalists, et cetera. For complicated reasons these groups are disproportionately white and male, meaning that they have to spend an annoying amount of time and energy apologizing for this. I’m not sure how much this retards their growth, but my highball estimate is “a lot”.
5. They are correct about a bunch of scattered other things
the superiority of corporal punishment to our current punishment system (google "all too humane" in http://slatestarcodex.com/2013/03/03/reactionary-philosophy-in-an-enormous-planet-sized-nutshell/ ). Robin Hanson also noted this, but there’s no shame in independently rediscovering a point made by Robin Hanson. I think the Reactionaries are also correct that it is very worrying that our society can’t amalgamate or discuss this belief.
various scattered historical events which they seem able to parse much better than anyone else. See for example http://foseti.wordpress.com/2013/10/01/review-of-the-last-lion-by-paul-reid/
Moldbug’s theory of why modern poetry is so atrocious, which I will not bore you by asking you to read.
Michael successfully alerted me to the fact that crime has risen by a factor of ten over the past century, which seems REALLY IMPORTANT and nobody else is talking about it and it seems like the sort of thing that more people than just Michael should be paying attention to.
6. A general theory of who is worth paying attention to.
Compare RationalWiki and the neoreactionaries. RationalWiki provides a steady stream of mediocrity. Almost nothing they say is outrageously wrong, but almost nothing they say is especially educational to someone who is smart enough to have already figured out that homeopathy doesn't work. Even things of theirs I didn’t know — let’s say some particular study proving homeopathy doesn't work that I had never read before — doesn’t provide me with real value, since they fit exactly into my existing worldview without teaching me anything new (ie I so strongly assume such studies should exist that learning they actually exist changes nothing for me).
The Neoreactionaries provide a vast stream of garbage with occasional nuggets of absolute gold in them. Despite considering myself pretty smart and clueful, I constantly learn new and important things (like the crime stuff, or the WWII history, or the HBD) from the Reactionaries. Anything that gives you a constant stream of very important new insights is something you grab as tight as you can and never let go of.
The garbage doesn’t matter because I can tune it out.
7. My behavior is the most appropriate response to these facts
I am monitoring Reactionaries to try to take advantage of their insight and learn from them. I am also strongly criticizing Reactionaries for several reasons.
First is a purely selfish reason — my blog gets about 5x more hits and new followers when I write about Reaction or gender than it does when I write about anything else, and writing about gender is horrible. Blog followers are useful to me because they expand my ability to spread important ideas and network with important people.
Second is goodwill to the Reactionary community. I want to improve their thinking so that they become stronger and keep what is correct while throwing out the garbage. A reactionary movement that kept the high intellectual standard (which you seem to admit they have), the correct criticisms of class and of social justice, and a few other things while dropping the monarchy-talk and the cathedral-talk and the traditional gender-talk and the feudalism-talk — would be really useful people to have around. So I criticize the monarchy-talk etc, and this seems to be working — as far as I can tell a lot of Reactionaries have quietly started talking about monarchy and feudalism a lot less (still haven't gotten many results about the Cathedral or traditional gender).
Third is that I want to spread the good parts of Reactionary thought. Becoming a Reactionary would both be stupid and decrease my ability to spread things to non-Reactionary readers. Criticizing the stupid parts of Reaction while also mentioning my appreciation for the good parts of their thought seems like the optimal way to inform people of them. And in fact I think it’s possible (though I can't prove) that my FAQ inspired some of the recent media interest in Reactionaries.
Finally, there’s a social aspect. They tend to be extremely unusual and very smart people who have a lot of stuff to offer me. I am happy to have some of them (not Jim!) as blog commenters who are constantly informing me of cool new things (like nydwracu linking me to the McDonalds article yesterday)
8. SERIOUSLY SERIOUSLY, the absurdity heuristic doesn’t work
You’re into cryonics, so you’ve kind of lost the right to say “These people, even though they’re smart, are saying something obviously stupid, so we don’t have to listen to them”
Drew has even less of a right to say that — he seems to be criticizing the Reactionaries on the grounds of “you wouldn’t pay attention to creationists, would you?” even while he discovered Catholic philosophy and got so into it that he has now either converted to Catholicism or is strongly considering doing so.
If there is a movement consisting of very smart people — not pseudointellectual people, like the type who write really clever-looking defenses of creationism — then in my opinion it's almost always a bad idea to dismiss it completely.
Also, I should have mentioned this on your steelmanning creationism thread, but although I feel no particular urge to steelman young earth creationism, it is actually pretty useful to read some of their stuff. You never realize how LITTLE you know about evolution until you read some Behe and are like “I know that can’t be correct... but why not?” Even if it turned out there was zero value to anything any Reactionary ever said, by challenging beliefs of mine that would otherwise never be challenged they have forced me to up my game and clarify my thinking. That alone is worth a thousand hours of reading things I already agree with on RationalWiki.
Some call this peek into Alexander’s thinking & motives a smoking gun which demonstrates that he is a crypto-reactionary. I want to chew on that …
So what is it with Alexander?
We can conclude that we must shun Alexander and his work for carrying water for dangerous nonsense without needing to understand Alexander’s motives and thought processes.
But I want to dig for an understanding of him.
Sandifer and I both suggest that we might read Alexander as foolish rather than just a crypto-reactionary. Why?
Despite knowing the worst from him, I confess that I still find Alexander’s long 2014 poetic evocation of rigorous liberalism In Favor Of Niceness, Community, And Civilization moving.
Liberalism does not conquer by fire and sword. Liberalism conquers by communities of people who agree to play by the rules, slowly growing until eventually an equilibrium is disturbed. Its battle cry is not “Death to the unbelievers!” but “If you’re nice, you can join our cuddle pile!”
But some people, through lack of imagination, fail to find this battle cry sufficiently fear-inspiring.
In 2013 Alexander was early to take a hard and critical look at NRx on SSC, predating most other critiques I know about. In those posts he steelmans NRx ideas … and finds them badly wanting. His anti-Reaction essay concludes:
Some Reactionaries are saying things about society that need to be said. A few even have good policy proposals. But couching them in a narrative that talks about the wonders of feudalism and the evils of the Cathedral and how we should replace democracy with an absolute monarch just discredits them entirely.
Recall how in the leaked email, Alexander called NRx:
a vast stream of garbage with occasional nuggets of absolute gold
I find it impossible to imagine Alexander concocting these as nothing other than a smokescreen over his true reactionary agenda. Yet Alexander unmistakably supports some of the worst reactionary ideas and actors. How to reconcile that?
Jeff Eaton, one of my favorite commentators on far right ideology in the US, distils the leaked email and finds too much sympathy for NRx:
In the context of what he’s writing (i.e., the whole message rather than an isolated phrase or two) it seems straightforward that:
- He believes the NRx movement / thinkers are tackling critical questions few other people are
- They don’t get everything right, but that is better than not trying
- He takes information from them credulously and considers them a unique pool of insights
- He cites specific ideas NRx folks have offered to him that critics have debunked but Scott accepted and went on to consider important ingredients in his thinking
- He believes smart people should listen to them, because of the positives
- He avoided publicly associating himself with NRx in part because he felt it would affect his credibility with non-NRx people, not because he condemned the movement’s priors or conclusions
- He considers almost everything to be a stream of garbage that intelligent people must sort through to find the valuable elements
- He believes that NRx is on the balance better than other “dismissable” ideas like homeopathy, and should be listened to
- He believes that smart people like himself will not be affected by whatever poor conclusions or bad priors the NRx movement brings to the table
In that context the “stream of garbage” phrase doesn’t carry a lot of weight.
I disagree with Eaton a bit on that last point — I find it important to distinguish between Alexander simply supporting the movement versus finding it wrong-but-instructive — but agree with his conclusion about Alexander’s failure.
The “gold” Alexander finds is not merely worthless. It is poisonous.
So — again — what is going on with him?
Consider how Alexander also wrote a long Non-Libertarian FAQ which, like the Anti-Reactionary FAQ, steelmans libertarian ideas then rejects them.
The main reason I’m writing this is that I encounter many libertarians, and I need a single document I can point to explaining why I don’t agree with them.
I intend the post you are reading now to do something similar: accumulating particulars about Alexander and examining generous readings of him, to criticize him thoroughly rather than just dismissively. I have done similar posts before.
Given the parallel between posts he and I have both done, I recognize Alexander as not so different from me: a species of nerd with a high tolerance for intellectual disgust and a taste for examining evil ideas and picking them apart, living in the sphere of San Francisco Bay Area nerds. It would comfort me to believe that he and I fundamentally differ, to believe that his liberal protestations are just an act, to believe that he only feigns his commitment to the deep egalitarian and democratic values at the core of my own thinking. I might then feel confident that I could not go as wrong as he has.
Instead, I take his liberal side as sincere, which inspires my discomfort. I must dread the possibility that I could make comparable mistakes. After all, I was too slow in recognizing Alexander’s worst. I have an obligation to examine what brought him to where he is, to learn to avoid his failings.
An anonymous commenter offers a reading of Alexander’s driving psychology, in response to another example of his mortifying moral tone-deafness.
(For the uninitiated, MIRI is an artificial intelligence “research” project entangled with the Rationalist community.)
Many of Scott’s house-mates from the rationalist community are extremely weird and awkward (I guess I can’t name them without sharing personal info so you’ll have to take my word for it) and are often sad about their lack of status. They are very wealthy by worldwide standards if not by the absurd local-regional standards, which is still enough to at least feel obligated to feel guilty by community standards. (Think: people who are making donations to MIRI well over the US median household income.)
If you combine this with the frequent inability of people to perceive their own privilege and the high levels of narcissist-like traits exhibited in the rationalist community, you end up with people around you saying “I have all this money and yet no one respects me for the Gift to the world that I am and instead keeps treating me like a weirdo…” and maybe you start thinking money doesn’t matter much.
Some of this likely stems from conflating status and power as a result of overvaluing what other people think of you as a result of living in a group house (similar to how high-schoolers are stereotyped as thinking their life is over at every bump in their social lives).
Let me offer an alternative explanation (in pseudo-mathy terms so the rationalists can pretend that it’s deeply insightful): Power is a normalized product of many factors:
P_you = ( F1_you × F2_you × F3_you × … × Fn_you ) / Σ_everyone ( F1 × F2 × … × Fn )
and many of these factors are highly correlated with wealth: education, connections to other people with high power, things like free time, safety from starvation, good health, affiliation with socially powerful groups, level of control over the time of others (e.g. owning a business), freedom from biological/social persecution…
Some of these factors could rightfully be considered latent forms of wealth in themselves (in that they inevitably result from or lead to wealth). As a result, P changes with wealth raised to some high power but weakness in a non-wealth respect can still handicap you.
So yes, you can have some modicum of wealth and still have low power by being very weak in other respects, such as not having enough EQ to realize when your “just asking” has ventured into extremely offensive and impolitic waters or too much selfishness to cut it out if you do realize. This does not change the fact that wealth is a universal solvent able to radically simplify many concerns and a nearly impassable barrier for many goals.
Over time, you become your friends in many respects. Choosing who you spend time with is one of the biggest things someone can do to influence their future personality. Comparing the Scott of today to the one who wrote the anti-libertarian FAQ feels to me like looking at someone who hasn’t made the best decisions of this kind.
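As an aside, the commenter’s pseudo-math is easy to make concrete. Here is a minimal sketch in Python of that normalized product; the people and factor values are hypothetical illustrations of mine, not anything from the original comment:

```python
from math import prod

def power(factors_by_person: dict[str, list[float]], person: str) -> float:
    """The commenter's formula: P_person = the product of that person's
    factors, normalized by the sum of everyone's products."""
    numerator = prod(factors_by_person[person])
    denominator = sum(prod(fs) for fs in factors_by_person.values())
    return numerator / denominator

# Hypothetical factors (say: wealth, connections, free time, health),
# each scaled so that 1.0 is "typical".
people = {
    "alice": [2.0, 1.5, 1.0, 1.2],  # wealthy and well-connected
    "bob":   [2.0, 0.2, 1.0, 1.2],  # equally wealthy, one weak factor
    "carol": [1.0, 1.0, 1.0, 1.0],  # baseline
}

for name in people:
    print(name, round(power(people, name), 3))  # alice 0.709, bob 0.094, carol 0.197
```

Because the factors multiply rather than add, bob’s single weak factor drags his score far below alice’s despite identical wealth; that is the commenter’s point that weakness in one non-wealth respect can still handicap you.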
Alexandros Marinos (from the note above about misrepresenting scientific studies) examines Alexander in terms of his relationship with his audience.
I think first and foremost you have to understand Scott Alexander as a technocrat. He believes that societies should be ruled by experts. This is where his historic support for eugenics comes from. And what we know from the history of eugenics is that the problem is who decides the “eu”. In other words, whose view we adopt on what is desirable or not. There is a vast number of times it turns out that what the experts thought was right ends up wrong.
Which brings me to my next point, that Scott Alexander has a hard time with uncertainty. After all, if experts are to rule society, they need to be able to come up with the right answers. And it is this hubris that drives his consequentialism. Someone who appreciates the degree of uncertainty we should have in this chaotic system would realize that most things that matter are uncomputable, bayes or no bayes. Which is why most of us adopt principles, one way or another.
All this gets him to cozy up to the scientific establishment, which in turn has fed some of his worst habits. You see, while many people see his scientific malpractice, and will freely say so in private conversations, they won’t come out against him, because he is useful. If he ever takes a seriously anti-regime position, he will be torn to shreds, and he basically knows it. So long as the people that push back on his claims are low-status, he’s safe. Were he to ever take a position that hurts the scientific establishment, he would have high-status people pushing against him, in excruciating detail, and his whole house of cards would collapse.
So we get what we have now. Scott Alexander, dancing between the two attractors of pain avoidance and expert worship.
What is truly egregious, however, is the audience. Many highly competent people are in it. And they don’t go to the trouble of checking his stuff. Much of this is induced by the endearing and lullaby nature of his prose, but still, it is astounding to see the scale and scope of errors that go unnoticed. My sense is that every time he cites a scientific article, it’s a 50-50 chance of him completely misrepresenting what it says. And while he plays at formalism, his mathematical errors are so egregious as to induce laughter before the tears.
But again, people of The Science don’t really care about actual science, so he’s safe.
And this is how we get today’s Scott Alexander, the bard with the best tales for the “I fucking love pop science” crowd.
Is it the people or the philosophy? similarly considers the Rationalist community reflected in Alexander and SSC as exhibiting a certain white-guy-nerd hubris. I too am a white guy nerd. I should beware.
For the uninitiated:
- The Sequences is a web book of eccentric essays about thinking by crank Rationalist star Eliezer Yudkowsky
- Bayes’ Theorem (stated below) is a way of approaching probability problems which Yudkowsky and many other Rationalists tend to emphasize as inspiring useful intuitions
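(For reference, and as my gloss rather than part of the original glossary: the theorem itself is a one-line identity for updating the probability of a hypothesis H given evidence E:

P(H | E) = P(E | H) × P(H) / P(E)

The Rationalist habit is to treat this as a universal recipe: start from a prior P(H), weigh how strongly the evidence was expected under the hypothesis, and update.)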
Every once in a while, someone with real credentials in a trendy domain like genetics or economics will drop in to mention how jarring it is to see so many people talking so enthusiastically about their academic discipline with such esoteric vocabulary, yet the vast majority of it is utter horseshit that wouldn’t pass a 101-level class. One response I got when I did this was that someone, with no apparent irony, accused me of “LARPing” because the scientific establishment is clearly just pretending to epistemological “prestige” that can only be truly earned by studying the Sequences.
PhD < Bayes’s theorem (< IQ).
This is, of course, the perfect description of what the Rational community is up to. Instead of labs they do their research in armchairs; instead of peer-reviewed journals they publish their findings in blogs (whose posts still get actual citations years later). But they’re creating a parallel world of alt-academia in fields that are already well trod by the genuine article, like philosophy and economics and quantum mechanics and oh-so-much genetics. They do happily admit real-world publications into their body of knowledge, but then they also admit pseudoscientists like that Kirkegaard guy or the crackpot virologist whom Peter Thiel paid to give people herpes in order to prove we don’t need the FDA. I think this is where Rationalists are the most cultlike and earn their capital R: not the abundance of unnecessary jargon/shibboleths, nor the alarming tendency to handle everything in their daily lives (even their health) through the community, but the whole ecosystem they have of theories and thought-leaders that are constantly discussed inside the community yet no one outside has ever heard of them.
Maybe this comes back to the evasion of empathy, the reluctance to give any weight to other people’s experience — a doctor’s opinions about health are just as irrelevant as an African American's opinions about racism. In that sense it could just be one more battleground in the eternal conflict between rationalism and empiricism.
I take comfort that I have a view of power much more heavily informed by a social justice analysis and a view of expertise much more skeptical of fringe figures. But most of all, I find myself dwelling on Alexander saying in the leaked email:
The garbage doesn’t matter because I can tune it out
Alexander can be very smart. His essay Meditations on Moloch remains a marvel I recommend to anyone. His two essays on NRx are necessary reading if one wants to understand the movement. And my fascination with his Niceness, Community, and Civilization essay is not dulled by his failings at the values it describes but rather sharpened by the cautionary example of how his belief that his principles protected him from bad actors failed, and may even have made him dangerously overconfident.
But he is also very stupid. No, he cannot tune out the garbage.
This presents a problem, because someone has to debunk false and evil ideas and present the critiques to a candid world. This post is among my attempts to do that work. I have a sober sense of how that is difficult and important.
When I read something like The Turner Diaries, the men in the sub-micron filtration Chemturion suits emerge from the clean room with their vacuum traps. They hoover up the memes and peel back the protein sheath. The virus gets spun down in the centrifuge, cut to pieces with enzymes, run through the sequencer. And the white coated man emerges from the lab, with a clipboard, and announces, “You know what? This looks very similar to something we indexed in 1995…” And they begin to formulate a vaccine — for widespread distribution, to stop the spread.
Alexander, for all his genuine wit at his best, is just far too intellectually reckless to be trusted with the delicacy of sifting through bad ideas. He demonstrates how garbage sources are bad for you even if you go in knowing they are garbage. His wit is a handicap. It makes him overconfident. It presents an illusion that he is a rigorous thinker.
Beware.
This post emerged from a Twitter thread accumulating particulars, inspired by a nightmarishly bushy Twitter discussion with critics of Alexander’s. Twitter is of course both an inconvenient format and a dying platform serving the far right at Elon Musk’s hands, so I have made an attempt to capture the essentials here. I have refined this post since originally posting it and intend to keep refining and expanding it, so I encourage readers to drop me a line pointing to anything important I have missed.