All of Adam_Scholl's Comments + Replies

Yeah, Dario pretty explicitly describes liking RSPs in part because they minimally constrain continued scaling:

"I mean one way to think about it is like the responsible scaling plan doesn't slow you down except where it's absolutely necessary. It only slows you down where it's like there's a critical danger in this specific place, with this specific type of model, therefore you need to slow down." (Logan Bartlett interview, h/t Joe_Collman).

At one point an EA fund manager told me something like, "the infrastructure fund refuses to support anything involving rationality/rationalists as a policy." Did a policy like this exist? Does it still?

Buck · 3y
Like Max, I don't know about such a policy. I'd be very excited to fund promising projects to support the rationality community, eg funding local LessWrong/Astral Codex Ten groups.
Max_Daniel · 3y
I'm not aware of any such policy, which means that functionally it didn't exist for this round. I don't know what policies may have existed before I joined the EAIF, and generally don't have much information about how previous fund managers made decisions.

FWIW, I find it hard to believe that there was a policy like the one you suggest, at least for broad construals of 'anything involving'. For instance, I would guess that some staff members working for organizations that were funded by the EAIF in previous rounds might identify as rationalists, and so if this counted as "something involving rationalists", previous grants would be inconsistent with that policy. It sounds more plausible to me that perhaps previous EAIF managers agreed not to fund projects that primarily aim to build the rationality community or promote standard rationality content, and that don't have a direct connection to the EA community or EA goals. (But again, I don't know if that was the case.)

Speaking personally, and as is evident from some grants we made this round (e.g. this one), I'm generally fairly open to funding things that don't have an "EA" branding and that contribute to "improving the work of projects that use the principles of effective altruism" (cf. the official fund scope) in a rather indirect way. (See also some related thoughts in a different AMA answer.) Standard rationality/LessWrong content is not among the non-EA-branded things I'm generally most excited to promote, but I would still consider applications to that effect on a case-by-case basis rather than deciding based on a blanket policy. In addition, other fund managers might be more generically positive about promoting rationality content or building the rationality community than I am.

Another potential cause of the narrow focus, I think, is that some people in fact expect the vast majority of impact to come from a small group of orgs they mostly already know about. Curious whether you disagree with that expectation (i.e., you think the impact distribution of orgs is flatter than that), or whether you're just claiming that e.g. the distribution of applicants should be flatter regardless?

It could also be the case that the impact distribution of orgs is not flat, but we've only discovered a subset of the high-impact ones so far (speculatively, some of the highest-impact orgs may not even exist yet). If so, a flatter distribution of applicants would still likely satisfy the needs of the known high-impact orgs, while the remaining applicants might end up finding, or founding, orgs that we later recognise to be high impact.

CFAR is currently on sabbatical, which we had planned to allocate a couple of months to this year anyway. That is, we're reading, learning, and scheming, and in general trying to improve ourselves in ways that are hard to find time for during our normally dense workshop schedule.

We're considering a range of options for what to do next (e.g. online workshops, Zoom mentoring, helping other orgs in some way), but we haven't yet settled on a decision.

For what it's worth, I wouldn't describe the social ties thing as incidental—it's one of the main things CFAR is explicitly optimizing for. For example, I'd estimate (my colleagues might quibble with these numbers some) it's 90% of the reason we run alumni reunions, 60% of the reason we run instructor & mentorship trainings, 30% of the reason we run mainlines, and 15% of the reason we co-run AIRCS.

Buck · 4y
Yeah, makes sense; I didn’t mean “unintentional” by “incidental”.