Author Topic: Weird Internet Communities  (Read 14872 times)

dubsartur

Re: Weird Internet Communities
« Reply #15 on: May 11, 2023, 04:53:02 AM »
This essay on AI cult longtermism came up on Mastodon: https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo  One of its points is that this movement simultaneously believes that humans have a duty to turn the universe into simulated humans, and that the greatest threats to humanity's future are radical near-future technologies.  So they have a problem: they push for aggressively developing dangerous technologies (it's hard to imagine humans expanding outside the solar system without radical biological, information, energy, and propulsion technologies), but they also see those technologies as something they must control even at the risk of nuclear war or unchecked global warming.

In my view, long-term predictions (i.e. centuries, never mind trillions of years) of a system like the human species are obvious quackery.

I am sure there are forms of longtermism focused on 10,000-year clocks and seed banks in the Arctic and other practical things, not on conquering the universe and stopping Skynet.
« Last Edit: May 11, 2023, 06:35:10 AM by dubsartur »

dubsartur

Re: Weird Internet Communities
« Reply #16 on: June 02, 2023, 05:17:46 PM »
Another handy essay on the connections between Bostrom's Longtermism, the "speculative threats" kind of Effective Altruism, the rationalist movement, Steven Pinker, and race-and-IQ 'science' (sic): https://www.truthdig.com/dig-series/eugenics/  Again, there are the longtermists who are building millennium clocks in the Texas desert and seed banks in the Arctic, and the Effective Altruists who point out "before you donate to WE Charity, check whether someone has independently verified their accomplishments and whether others are doing the same for less money", but dangerous quacks are trying to appropriate the names for their own uses.

Edit: Timnit Gebru (@timnitGebru@dair-community.social) uses the term TESCREAL for these weird Internet and California / New York / Oxford spaces (although again, not all Effective Altruists!):

#TESCREAL stands for transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism and longtermism. Émile P. Torres @xriskology@mastodon.bida.im coined it in our upcoming paper. Great to see everyone following the bandwagon of a secular religious cult.
« Last Edit: June 02, 2023, 10:27:47 PM by dubsartur »

Jubal

Re: Weird Internet Communities
« Reply #17 on: June 03, 2023, 08:14:34 PM »
I'm not sure I know what extropianism or cosmism are...
The duke, the wanderer, the philosopher, the mariner, the warrior, the strategist, the storyteller, the wizard, the wayfarer...

dubsartur

Re: Weird Internet Communities
« Reply #18 on: June 03, 2023, 09:28:52 PM »
I'm not sure I know what extropianism or cosmism are...
I could not define them either, but I imagine she says something in her paper.

There is a lot of overlap in these spaces, and ideas that don't seem obviously related, like race 'science', keep coming up in them.  I think that is one reason the social-media offensive focuses on people and paints with a broad brush, so the eugenicists and racists and builders of hierarchies can't just rebrand.

I have trouble getting too angry with Scott Alexander, because clever lonely dudes with blogs rarely do much harm (and the NYT did not need to publish his legal name to show his connections with shady people and his advocacy of dubious ideas), but I would not recommend entrusting any of these people with a hot dog stand; they often push terrible ideas or get grifted by terrible people.  The AI Foom people have a thought experiment, "what if you lock the AI in a box and it persuades someone to let it out?", and I think the David Gerards of the world are scared that someone is trying to let these ideas out of weird Internet communities and geeky clubs in San Francisco, New York City, and Oxford and let them control serious money.  And they are not polite Canadians, so they play dirty.

I see that Maciej Ceglowski was suspicious of Nick Bostrom's ideas before that was cool: https://idlewords.com/talks/superintelligence.htm (and he moved in those same software and venture-capital circles in California).

Edit: I think I have finally found the essay which lays out the connections between these people without personal attacks, unverified claims, or misunderstood arguments: https://aiascendant.substack.com/p/extropias-children-chapter-1-the-wunderkind  Professor Nick Bostrom's 1996 email endorsing 'scientific' racism was posted on, you guessed it, the Extropians mailing list with which Extropia's Children begins.  (1996 is a long time ago and I have no idea of Bostrom's current views, but the idea of a racial hierarchy of IQ comes up frequently in these spaces and is one reason to be suspicious of them, given that even 1996-Bostrom said "I have begun to believe that I won’t have much success with most people if I speak like that" and given that he did not wholeheartedly renounce these ideas in his apology at the start of 2023.)

Edit edit: And unsurprisingly, defenders of Longtermism accuse Émile P. Torres (they/them) of misrepresenting their arguments.  Torres and others certainly spend a vast amount of time and energy criticizing these movements, but these movements do seem to control billions of dollars and influence policy.  And what I have seen does not make me think that sitting down and reading key works by Longtermist thinkers would make me wiser or happier; we all have limited time and attention.  https://markfuentes1.substack.com/p/emile-p-torress-history-of-dishonesty {lots of twitter drama on this one}
« Last Edit: June 07, 2023, 06:12:56 AM by dubsartur »

dubsartur

Re: Weird Internet Communities
« Reply #19 on: June 08, 2023, 06:56:03 AM »
Yes. I don't think trying to work out what will happen in the future is a wholly valueless exercise, but in general, I think it's a good rule of thumb that you're likely to get a better future first and foremost by producing a better now: if we had a society more robust at fixing its present problems, that'd be likely to be a society better able to cope with the strain of any new problems.
Activisty types sometimes object to bednet Effective Altruism on the grounds that "it does not address the underlying causes, just the symptoms."  And while that seems true, "cure children in Botswana of parasites which will stunt their growth and health" is much more tractable for busy people in London or Chicago than "solve the global, national, and local inequities which lead to so many children in Botswana getting infected in the first place."  It's also much easier to know whether your actions are improving what you say you want to improve.

Jubal

Re: Weird Internet Communities
« Reply #20 on: June 08, 2023, 03:35:59 PM »
Yes. I don't think trying to work out what will happen in the future is a wholly valueless exercise, but in general, I think it's a good rule of thumb that you're likely to get a better future first and foremost by producing a better now: if we had a society more robust at fixing its present problems, that'd be likely to be a society better able to cope with the strain of any new problems.
Activisty types sometimes object to bednet Effective Altruism on the grounds that "it does not address the underlying causes, just the symptoms."  And while that seems true, "cure children in Botswana of parasites which will stunt their growth and health" is much more tractable for busy people in London or Chicago than "solve the global, national, and local inequities which lead to so many children in Botswana getting infected in the first place."  It's also much easier to know whether your actions are improving what you say you want to improve.
Yeah, I think this is a complex problem that internet discourse too often tries to simplify: one doesn't want to never try to fix underlying problems (that's really bad), but it's also morally bad to simply tell people they have to sit and die while you spend all the money on funding the grand progressive takeover of the world, which may or may not happen. But it wrongly becomes an either/or for too many people.
The duke, the wanderer, the philosopher, the mariner, the warrior, the strategist, the storyteller, the wizard, the wayfarer...

dubsartur

Re: Weird Internet Communities
« Reply #21 on: June 08, 2023, 09:21:01 PM »
It's also probably the case that the people who go into "bednetting" EA are better at solving well-defined, context-independent problems than at squishy things like land reform in Malaysia or getting the right people elected and appointed in Poughkeepsie.  Just as the average person who glues themself to a crosswalk in Berlin probably lacks the money, skills, and personality to build a windbreak forest in the Sahel.  That does not mean that one type of action against climate change or poverty is the wrong one.

Some aspects of rationalism, longtermism, etc. are hostile to this kind of thinking, but other parts of EA seem to be open to dividing donations between a few strategies. But I didn't know anything about Longtermist EA before fall 2022!
« Last Edit: June 08, 2023, 09:27:54 PM by dubsartur »

Jubal

Re: Weird Internet Communities
« Reply #22 on: June 17, 2023, 08:04:21 AM »
For an interesting counterpoint, a friend on Facebook shared this article, which pushes back on the TESCREAL idea by splitting out the component parts and suggests that Gebru etc. are joining more dots than actually exist on some of this:

https://medium.com/institute-for-ethics-and-emerging-technologies/conspiracy-theories-left-futurism-and-the-attack-on-tescreal-456972fe02aa

I think there are some fair points in there, but I'm not sure I'm convinced as a whole - in that I didn't think all these people and groups were in a single evil cabal anyway, and my concern is not nefarious scheming so much as wasting vast amounts of money in ways that tackle imagined problems over and above present ones and fail to recognise the actual legal and social adaptations we need urgently. The authors of this piece do recognise that, but I think they brush past the organisational, reputational, and movement-scale problems that things like EA have right now.

Some of the apologia from EA advocates I know feels like the Lib Dems, who in 2015 were just outraged that the electorate was rejecting them because they had tried their hardest to stop Tory overreach: but a lot of the criticisms of them, and especially of some of their most prominent advocates, were also substantially true and people knew it, so that wasn't going to be enough for people to trust them again for a while.
The duke, the wanderer, the philosopher, the mariner, the warrior, the strategist, the storyteller, the wizard, the wayfarer...

dubsartur

Re: Weird Internet Communities
« Reply #23 on: June 17, 2023, 06:25:55 PM »
My first thought is that in the part I know best (the rationalists and economists with blogs), sneering at people who see connections and mutual influence, as if they were conspiracy theorists, is dead wrong.  It's a fact that leading thinkers hung out on the Extropians mailing list in the 1990s and later became publicly enthusiastic for ideas which former buddies had promoted there in the 1990s!  It's a fact that of the three most prominent rationalist bloggers who are not economists, two have expressed enthusiastic support for scientific racism (and a very young Bostrom did so, and his recantation suggests that he is still interested, just not sure about the 'genetic' part).  It's elementary that people often adopt ideas from their friends, family, and lovers; one basic form of political lobbying is to organize nice meals or parties, invite fellow travellers and the people you want them to influence, and let nature take its course.  No two people in this space have exactly the same terrible ideas or sinister goals, and it's not reasonable to ask a member of the public to keep straight the difference between Eliezer Yudkowsky and Robin Hanson.

Edit: A random look at Caroline Ellison's Tumblr showed me a post which begins "btw a link from SSC sent me down a rabbit hole of reading (scientific racist blogger) hbd chick and related links lately and the whole intellectual edifice is pretty fascinating. I don’t have a great summary, and epistemic status tentative so you should just read the blog and follow the rabbit hole yourself. "

I cannot speak to longtermism since I have not read key works and do not know the key figures, so I can't say how well the Internet criticisms represent it.  There may well be some conspiratorial thinking in Torres and co.'s belief that small passages reveal a hidden agenda.  But I have seen Caroline Ellison's Tumblr blog computing the suffering of fish on a scale with the suffering caused by specific human diseases, and everything I know about singularitarianism screams "run.  Do not engage.  It's a trap for minds like yours, in the way that a confidence game is a trap."

OTOH, I agree that I have not heard of any major sinister Transhumanist groups and I have never heard of Cosmism or Extropianism.  I also agree that some of the critics have a beef against utilitarianism, which can be a useful ethical framework if you don't go too far.

"Perhaps the best example of grounded, careful thinking on these topics is Nick Bostrom’s book 2014 Superintelligence,"  Ceglowski is not an intellectual but his takeaway from that book was that it was designed to catch people with a weakness for clever ideas.

"An attack on rationalism has to be understood in light of the postmodernist critique of rationality."  No, most of us who run screaming from those people (and especially from the LessWrong crowd) are scientists and makers who, as Evans said, deeply distrust their building of castles on the clouds before they set a single stone upon a stone on earth.  I agree that its common for people in these spaces or adjacent ones (eg. Michael Shermer or Richard Carrier) to ignore Hume and argue that the one true morality can be deduced from the study of the world by formal logic.  But Kant and Hume are not postmodernist thinkers!

"we see its connections to reactionary (as opposed to liberal or centrist) political views as exaggerated"  The Rationalism of the Rationally Speaking podcast is full of young sheltered Right Libertarians, polls of the SlateStarCodex readers show that active commentators skew right or right libertarian while readers are more like a sample of the US population.  See also Robin Hanson and the Marginal Revolution guy, or Peter Thiel's funding of MetaMed and Yudkowski's foundation (this essay describes Thiel as a Transhumanist but he has funded Yudkowski's flavour or rationalists).

One of the key points of Dan Davies' Lying for Money is that fraudsters want you to be overwhelmed with a million details, while successful prosecutors want you to focus on the broad outlines of the scheme.  I think that is what critics like Timnit Gebru or Maciej Ceglowski are doing.  It's fair to tell the average person to run screaming whenever a 'rationalist' or longtermist wants them to do something in the real world.  It's not reasonable to spend endless time arguing semantics about individual thinkers' politics or exactly which of these terrible ideas they support at a given time.  I think they are tarring the innocent and the guilty with the same brush, but I think they would say "yes, this is unjust, but it will force the decent people to distance themselves from the rationalists and longtermists if they want to get anything done offline."

Edit: Hughes, the author of the Medium piece, co-founded his organization in Boston (yellow flag for this family of ideas; it's not as infected as SoCal, NYC, or Oxford, but close) with Nick Bostrom (red flag!): https://en.wikipedia.org/wiki/Institute_for_Ethics_and_Emerging_Technologies

Edit: So, TL;DR, I think that Evans' approach to these spaces - as social spaces where people adopt each other's unusual ideas and support each other's hilariously doomed projects - is the best I have found; maybe supplement it with one of the early criticisms of the singularity, or of AI-as-god, by a pop-culture figure such as Doctorow.  Don't let the drama and the "he said, she said" distract you from the key point that many rationalists and longtermists support some disturbing things and have a history of failure whenever they try to do anything other than post on the Internet and hold geeky social events.
« Last Edit: June 17, 2023, 10:55:33 PM by dubsartur »

dubsartur

Re: Weird Internet Communities
« Reply #24 on: June 17, 2023, 11:52:15 PM »
On a nerdy level, the late Daniel Ellsberg apparently wrote on the limits of quantified probability as a model for rational decisions, e.g. in his paper "Risk, Ambiguity, and the Savage Axioms."  The rationalists are prone to doing arithmetic on made-up numbers as if it proved anything, and to waving around the term "Bayesian" when they mean "updating your opinions as you learn new things."  One reason I thought the bed-net EA worked relatively well is that they had actual numbers to calculate with and seemed to put thought into where those numbers came from.
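For concreteness, here is what "Bayesian updating" amounts to in its simplest form: a prior belief plus observed data gives a posterior. This is my own toy sketch with invented numbers, not anything from Ellsberg or the EA literature; the point is that the machinery only earns its keep when the inputs come from somewhere.

```python
# Toy Bayesian update for a binary outcome (e.g. "did the intervention help
# this child?"), using a conjugate Beta prior so the arithmetic stays exact.
# All numbers below are made up for illustration.

def beta_update(prior_a: float, prior_b: float, successes: int, failures: int) -> tuple[float, float]:
    """Return the Beta posterior parameters after observing the data."""
    return prior_a + successes, prior_b + failures

# Beta(1, 1) is a flat prior over the success rate: start roughly agnostic.
a, b = beta_update(1.0, 1.0, successes=45, failures=15)  # hypothetical trial data
print(f"posterior mean success rate: {a / (a + b):.2f}")  # 46/62, about 0.74
```

Ellsberg's point, as I understand it, is about the cases where you cannot honestly write down the prior or the likelihood at all (ambiguity rather than risk), which is exactly where arithmetic on made-up numbers gets misleading.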

If they let a lot of that money be diverted to buying castles and paying friends to sit in a room imagining how to deal with malign superintelligences, that seems like a fair criticism (especially if donors thought they were contributing to bed-netting and actually got a bunch of autodidacts with dreams about things which might happen in the future).  And so does the connection with Sam Bankman-Fried's FTX fraud.
« Last Edit: June 18, 2023, 04:03:34 AM by dubsartur »

dubsartur

Re: Weird Internet Communities
« Reply #25 on: July 31, 2023, 05:31:52 PM »
Back in April 2018, SBF barely survived accusations by the board of Alameda that he was a serial liar, refused to implement basic corporate controls against fraud and embezzlement, and had sexual and romantic relationships with subordinates (the board and half the staff left instead).  People involved at the time say that MacAskill and the rest of the Oxford EA movement were thoroughly informed but still accepted SBF's money and spoke in public about how wonderful his businesses were: https://time.com/6262810/sam-bankman-fried-effective-altruism-alameda-ftx/

Edit: Economist John Quiggin talks about what happens if we try to maximize average utility rather than total utility (because total utility is what leads the Longtermists to dream of conquering the universe and turning the solar system into a giant computer simulating minds, just as minimizing suffering leads to the conclusion that humanity should end itself): https://johnquiggin.com/2023/07/30/against-the-repugnant-conclusion/  There is an Isaac Asimov story where humanity has reduced the biosphere to humans, algae tanks, and a few lab animals, and someone notices that they could reach perfection by euthanizing the animals and authorizing a few more human births.
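To make the total-versus-average distinction concrete, here is a toy comparison with invented numbers (mine, not Quiggin's):

```python
# Two hypothetical populations: a small thriving one, and an enormous one
# whose members' lives are barely worth living. All numbers are made up.
small = {"people": 1_000, "welfare_per_person": 90}
huge = {"people": 10_000_000, "welfare_per_person": 1}

def total_utility(pop: dict) -> int:
    return pop["people"] * pop["welfare_per_person"]

def average_utility(pop: dict) -> int:
    return pop["welfare_per_person"]

# Total utility ranks the enormous, barely-happy population higher
# (Parfit's "repugnant conclusion"); average utility ranks the small one higher.
assert total_utility(huge) > total_utility(small)      # 10,000,000 vs 90,000
assert average_utility(small) > average_utility(huge)  # 90 vs 1
```

The sketch only shows why the choice of aggregation rule does so much work in these arguments; Quiggin's post is about which rule to prefer.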
« Last Edit: August 02, 2023, 10:51:51 PM by dubsartur »

BeerDrinkingBurke

Re: Weird Internet Communities
« Reply #26 on: September 17, 2023, 10:44:04 AM »
Interesting thread. Behind the Bastards did an SBF update recently, which was my onboarding for Effective Altruism.
 
I don't find it very surprising that there is a connection there with Utilitarian thinking, through his parents. While I do think we owe some debt to consequentialist arguments for the improvement of social equality, I'm very suspicious of the attempts this tradition makes at 'solving' morality like a mathematical equation. We cannot help but fall into all kinds of absurd paradoxes once we seek to ground morality in outcomes alone.
Developing a game called Innkeep! Serve Ale. Be jolly. Rob your guests. https://innkeepgame.com/

Othko97

Re: Weird Internet Communities
« Reply #27 on: September 29, 2023, 09:29:55 PM »
This is certainly an interesting topic that I've vaguely encountered floating around.  To me, the main concern is that this vague constellation of beliefs is held by, and seems to spread virally among, people with such an outsized degree of wealth.  These people have the power to waste (not only their own) time, energy, and resources on what seem to me to be ultimately doomed ventures, based in many cases on false premises.  For example, transhumanism is based on technology so nascent at present that it's barely even experimental, and singularitarianism requires that the pace of technological improvement be exponential, despite (in my opinion) the distinct observation that it is stagnating.  By fuelling research into such sci-fi technologies, we lose the opportunity to spend those resources instead on things that would help people more immediately, with less risk and more certainty.  It's also concerning that the desire for such futuristic tech is causing corners to be cut, such as the tragic cruelty shown at Elon Musk's Neuralink.

However, I get the impression that while these beliefs are truly held by many, they also serve a practical purpose: driving hype in technology to the end of lining the pockets of their adherents.  Sam Altman may well believe that the singularity is coming, say, but I rather get the impression that the hand-wringing over the field of "AI" requiring regulation is more about driving up the public perception of OpenAI's chatbots than it is genuine concern over the future of humanity.  I'm not overly familiar with Effective Altruism, but I had the feeling that it was always more about PR and justifying the amassing of vast wealth than it was about actually helping people.
I am Othko, He who fell from the highest of places, Lord of That Bit Between High Places and Low Places Through Which One Falls In Transit Between them!


dubsartur

Re: Weird Internet Communities
« Reply #28 on: September 30, 2023, 07:04:23 PM »
Does anyone else know the story of Matthew White's atrocitology?  He is a librarian who ran a classic 90s-and-oughties website where he collected statistics about historical disasters from books and encyclopedias.  But he was an uncritical compiler, and most of his sources were just books by people who had read earlier books and picked a number which felt right, so: garbage in, garbage out.  Most of the books he used did not have data, did not have a rigorous method for estimating, and did not have a rigorous method for choosing between earlier estimates; they just made up a number or picked between earlier numbers.

Steven Pinker loved the site when he was writing The Better Angels of Our Nature (published 2011) because it was full of numbers and citations, and he wanted numbers and did not care how they were created.  He got White a contract with a big trade publisher to turn the website into a book, and he used White's numbers in his own book.  If you knew that part of the early Internet, you knew to be very skeptical of this big data which was bad data.  But Pinker's books probably reach a thousand times as many people as careful scholarship with reliable methods.

I have briefly talked about how Pinker seems to toy with race 'science' and has occasionally supported people who push race 'science', although again I am not doing the strings-and-pushpins-on-a-wall thing.  But this shows how an obscure person with red flags (White's lack of achievements endorsed by trained historians, the racists' racism) + a famous person with credentials and Old Media connections = a misinformation explosion.

Edit: "According to White, the Atlas (his webpage that includes the atrocity statistics) been used as source by many authors, including in 377 books and 183 scholarly articles" ia cthulhu cthulhu ftaghn

Edit: In his 2011 book, White estimated that Genghis Khan killed 40 million people in China (about 2/3 of his total estimate for WW II, which also involved mass slaughter in China) based on a book by McEvedy and Jones which is such bullarmadillo that there are articles dedicated to explaining its problems.  ia cthulhu cthulhu ftaghn
« Last Edit: October 02, 2023, 06:07:49 AM by dubsartur »

dubsartur

Re: Weird Internet Communities
« Reply #29 on: October 10, 2023, 09:37:03 PM »
Molly White's latest post on the trial of Sam Bankman-Fried is clear, focused, and without too many random acronyms or angry asides: https://newsletter.mollywhite.net/p/the-fraud-was-in-the-code