Eliezer Yudkowsky has an excellent post on "Evaporative Cooling of Group Beliefs".
https://www.lesswrong.com/posts/ZQG9cwKbct2LtmL3p/evaporative-cooling-of-group-beliefs
Essentially, when a group goes through a crisis, those who hold the group's beliefs least strongly leave, and those who hold the group's beliefs most strongly stay.
This might leave the remaining group less able to identify weaknesses within group beliefs or course-correct, or "steer".
The FTX collapse, together with the bad press and bad optics of the Wytham Abbey purchase, probably means this is happening in EA right now.
I'm not really sure what to do about this, but one suggestion might be for community building to move away from the model of trying to produce highly-engaged EAs, and switch to trying to produce moderately-engaged EAs, who might be better placed to offer helpful criticisms and help steer the movement towards doing the most good.
Also, if you read the comments and other recent content, you might notice that some people are downplaying the gravity of the SBF case, or remarking that it was an utterly unpredictable Black Swan. And some people are signaling virtue by praising the community, or expressing unconditional allegiance to it rather than to its ideals and principles. I think we both agree this wastes an opportunity to learn a lesson.
Perhaps you could describe this as a type of evaporative cooling too, but of a different kind.
My suggestion right now is some sort of forecasting competition on what the worst hazard to hit EA in the next couple of years will be.