The Effective Altruism movement came up on Mastodon. It seems to have a number of factions. There are groups like GiveWell whose point is "if you claim to be doing good with donations, prove it! And isn't it generally easier to reduce suffering and death in poor places than in rich places?" There are a lot of well-funded charities whose mission is "raising awareness" or "advocacy," or who can't produce photos of all the schools they say they built.
https://www.givewell.org/charities/top-charities As far as I can tell, they are still around and still have the same basic approach of reducing sickness and death among people alive today (these are easy to measure, whereas it's hard to measure the effect of "raising awareness").
There are the Longtermists, who focus on speculative risks and who have been infiltrated by people worried that chatbots will be like the AI in Terminator, Reign of Steel, or The Forbin Project. "Longtermism" or "existential risk" is about the future of humanity and avoiding human extinction (or keeping humans alive long enough to transition into digital minds). If you believe that the welfare of trillions of potential future humans is an end that can justify any harm to actually existing human beings today, you can talk yourself into doing terrible things. This is
a known danger of Utilitarianism, as well as of Christianity, Communism, and other belief systems which envision an end. For example, someone got on Twitter and started calling for bombing unlicensed AI research facilities, even in the territory of other nuclear powers, and there are a lot of people whispering "why help millions of poor brown people when we should be focused on stopping engineered bioweapons from wiping out humanity oh look I have the blueprints for a lab to do that right here and for just a billion dollars plus operating costs ..."
There are the 80,000 Hours people, who argue that the best way to do good is to get a high-paying job and donate the proceeds (80,000 hours is roughly the time you spend working in a 40-year career). This can clearly be an honourable way to live, but because humans are rationalizing rather than rational, it can become an excuse to live in luxury on other people's work while doing all kinds of damage in the name of the Cause. What is publishing propaganda for a tobacco company, or mining a few mountaintops, if you build some nice libraries?
And there are the grifters like Sam Bankman-Fried, who donated to improve their reputations or wanted to suck up donors' money. They seemed to find longtermism and 80,000 Hours useful smokescreens.
If you read things published before Sam Bankman-Fried's Ponzi scheme collapsed, you can find hints that the latter three factions (the longtermists, the earn-to-give people, and the grifters) were gaining influence because they brought money and charisma.
https://www.newyorker.com/magazine/2022/08/15/the-reluctant-prophet-of-effective-altruism And a lot of people are outraged by the grifters and frightened by the Longtermist / AI-risk movement, so they have launched a propaganda counterattack with the whole Effective Altruism movement as its target. I don't have the contacts in that space to say how much of it the Longtermists and the grifters control; I suspect the answer is "more than I would have guessed."
The New Yorker estimated the EA movement's assets at around USD 30 billion in August 2022. That is also much more than I would have guessed in the early 2010s, when groups like GiveWell seemed to be a small part of the charitable sector.