Linda Linsefors


I want to make an analogy to personality types. Lots of humans believe there is one single personality type: "Everyone thinks and reacts more or less like me." Given this starting point, upgrading to thinking there are 4 or 16 or whatever types of people is a great update. Lists of different conflict resolution styles, different love languages, etc., are helpful in the same way.

However, the same system can become harmful if, after learning it, a person gets stuck, refuses to move on to even more nuanced understandings, and insists that the dimensions covered by the system they learned are the only ones that exist.

Overall, I think Scott Aaronson's post is good.

I expect outsiders who read it to update from thinking there is 1 AIS camp to thinking there are 2 AIS camps, which is an update in the right direction.

I expect insiders who read it to notice "hey, I agree with one side on some points and the other side on some points" and correctly conclude that the picture of two camps is an oversimplification.

In what sense does EA have something like a leadership?

There is no official overarching EA organisation. Strictly speaking, EA is just a collection of people who all individually do whatever they want. Some of these people have chosen to set up various orgs that do various things.

But in a less formal but still very real way, EA is very hierarchical. There is a lot of concentration of power. 

  1. Some of this is based on status and trust. Some people and orgs have built up a reputation which grants them a lot of soft power within the EA network. 
  2. Some of this is because of entrenched infrastructure. CEA runs EA Global and gets to decide who can attend. CEA also owns the trademark for "Effective Altruism", and sometimes uses this to pressure other projects to do what CEA wants. (I don't know how often this happens since I only have sparse anecdotal information.)
  3. But the biggest power factor is control of money. Most EA funding comes from a few mega donors. 

And all of these three points mix. EA Funds is infrastructure (2) that controls the flow of funding (3), which CEA could set up because they have status and trust (1). Because of how these things intermingle, the same few people might end up controlling all three.

So maybe EA doesn't have a leadership, but we do have some sort of power center. What, if anything, do the people in power owe the rest of us?

There isn't an obvious answer. Probably the above question is not even the right framing. 

For myself, I'm mostly over debating what the central powers of EA should do. Given the massive lack of transparency, I just don't know. 

I'd like to see an EA movement that is less centralised, and I don't expect the people currently in power to do anything about that. Maybe they can't or maybe they don't want to. I don't care anymore which one it is. 

I'd love to see someone set up alternative EA infrastructure. I want a competitor to EA Funds. I want an alternative job board that is not controlled by 80k. This is not about these orgs being bad, but about centralisation being bad. 

But I also know that setting up alternative infrastructure is hard work. It takes time for new things to get traction, and it takes time for word to spread that you even exist.

Did you know there is a second EA career advice org?
Probably Good | Impact-focused Career Advice 

If established EA orgs want to decrease centralisation (which again, I don't know if they do) then one of the biggest things they could do is to promote their competitors.

501(c)(3) status can be accessed via fiscal sponsorship. There is already a network of agreements between EA orgs to re-grant to each other for tax reasons, mostly thanks to Rethink:
https://rethink.charity/donate 

In order to tap into this, an individual needs to be paid through an org that is part of this network. For AI Safety projects I think AI Safety Support would be willing to provide this service (I know they have already done this for two projects and one person). I don't know what the options are for other cause areas, but if this becomes a major bottleneck, then a good plan would be to set up orgs to financially host various projects.

Maybe something like the S-process used by SFF?
https://survivalandflourishing.fund/s-process

It would be cool to have a grant system where anyone can list themselves as a fund manager, and donors can pick which fund managers' decisions they want to back with their donations. If I remember correctly, the S-process could facilitate something like that.
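
To make the idea concrete, here is a minimal sketch (in Python, with made-up names and numbers; this is not the actual S-process algorithm, just one simplified model of "donors backing fund managers"): each donor splits their donation across managers they trust, and each manager's pooled backing is then allocated across projects according to that manager's own weights.

```python
# Hypothetical sketch of donor-backed fund managers (not the actual S-process).
# Each donor splits their donation across fund managers they trust; each manager
# then allocates their pooled backing across projects according to their own weights.

from collections import defaultdict

# Donor -> {fund_manager: amount donated to back that manager} (illustrative numbers)
donor_backing = {
    "donor_a": {"manager_1": 5_000, "manager_2": 5_000},
    "donor_b": {"manager_2": 20_000},
}

# Fund manager -> {project: relative weight from that manager's own evaluation}
manager_weights = {
    "manager_1": {"project_x": 0.7, "project_y": 0.3},
    "manager_2": {"project_y": 1.0},
}

def allocate(donor_backing, manager_weights):
    """Return total funding per project under this simplified model."""
    # Pool each manager's backing across all donors.
    pooled = defaultdict(float)
    for backing in donor_backing.values():
        for manager, amount in backing.items():
            pooled[manager] += amount

    # Each manager distributes their pool in proportion to their weights.
    grants = defaultdict(float)
    for manager, pool in pooled.items():
        weights = manager_weights.get(manager, {})
        total_weight = sum(weights.values()) or 1.0
        for project, weight in weights.items():
            grants[project] += pool * weight / total_weight
    return dict(grants)

print(allocate(donor_backing, manager_weights))
# {'project_x': 3500.0, 'project_y': 26500.0}
```

The point of the sketch is just that funding decisions stay with many individual managers and donors, rather than being collected into one central fund.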

I totally agree with you regarding the value of feedback.

Someone may have spent several hours (or days) writing a grant proposal, and the proposal judges/funders may have spent a couple of hours reading it, but they can't spend five minutes writing an explanation of why it's turned down?

I'm also confused by this. I'm guessing it's more about the discomfort around giving negative feedback, than it is about time? 

I'm very much in favour of acknowledging the cost associated with the energy drain of dealing with negative emotions. There are lots of things around the emotional cost of applications that could be improved, if we agreed that this is worth caring about.

Clarification (because based on past experience this seems to be necessary): I don't think the feelings of fellow EAs are the only thing that matters, or even the top priority or anything like that. What I do think is that we are losing both valuable people and productivity (people who could have contributed to the mission) because we ignore that personal emotions exist.

I feel that most people who are not "professional" EAs (for lack of a better word, and definitely including myself) would be pretty bad at playing grantmaker without devoting quite a bit of time to it.

I think you overestimate the difference between you and "professional" EAs. Good grant making is both hard and time-consuming for everyone.

If someone is doing grant evaluation as their full-time job, then they are probably better at it than you, because they can spend more time on it. But as far as I know, most EA grants are evaluated by people doing this as volunteer side work. They usually have an EA job, but that job is often something other than grant making.

I think OpenPhil is the only org that employs full-time grant makers? But you can't even apply to OpenPhil unless you fit into one of their pre-defined programs or know the right people. The only time I asked OpenPhil for money (I went to their office hour at EA Global) they politely told me that they would not even evaluate my project, because it was too small to be worth their time. To be clear, I'm not writing this to complain, and I'm not saying they made the wrong judgment. Having paid professional evaluators look at every small project is expensive.

I just hate that people like yourself think that there are some grant experts out there, looking at all the grants, making much better evaluations than you could have done. Because there aren't. That's not how things are currently run.

In particular, I've always appreciated that the EA community tries to minimize the amount of time/resources organizations need to devote to fundraising as opposed to substantive work. I get the sense that top leadership in many "mainstream" charities spends a lot of its bandwidth on fundraising and donor management.

I agree. I know some academics and have an idea of how much time and effort they spend applying for grants. We don't want to end up in that situation.

I think this can be solved by having an EA-wide standard for grant applications. If I were in charge, it would be a Google Doc template. If I want funding, I can fill it in with my project and then send it to all the relevant mid-sized funders.

[Epistemic status: I'm writing this in the spirit of blurting things out. I think I'm pointing to something real, but I may be wrong about all the details.]

  • lack of social incentive to blurt things out when you're worried you might be wrong;
  • lack of social incentive to build up your own inside-view model (especially one that disagrees with all the popular views among elite EAs);

You are correct that there is an incentive problem here. But the problem is not just lack of incentive, but actual incentive to fall in line. 

Because funding is very centralised in EA, there are strong incentives to agree with the people who control the money. The funders are obviously smarter than selecting only "yes"-sayers, but they are also humans with emotions and limited time. There are types of ideas, projects, and criticism that don't appeal to them. This is not meant as criticism of individuals but as criticism of the structure, because given the structure I don't see how things could be otherwise.

This shapes the community in two major ways. 

  1. People who don't fit the mould of what the funders like, don't get funded.
  2. People are self-censoring in order to fit what they think the mould is.
     

I think the only way out of this is to have less centralised funding. Some steps that may help:

  • Close the EA Funds. Specifically, don't collect decentralised funding into centralised funds. 
  • Encourage more people to earn-to-give and encourage all earning-to-givers to make their own funding decisions. 
  • Maybe set up some infrastructure to help funders find projects? Maybe EA Funds could be replaced by some type of EA GoFundMe platform? I'm not sure what would be the best solution. But if I were to build something like this, I would start by talking to earning-to-givers about what would appeal to them.

 

Ironically, the FTX Future Fund actually got this right. Their regranting program was explicitly designed to decentralise funding decisions.

Thanks for pointing this out. 
I did read the post but obviously missed this part. I apologize. 

From the article:

Through a separate nonprofit called Building a Stronger Future, Mr. Bankman-Fried also gave to groups including the news organizations ProPublica, Vox and the Intercept.

In a note to staff members on Friday, ProPublica’s president, Robin Sparkman, and editor in chief, Stephen Engelberg, wrote that the remaining two-thirds of a $5 million grant for reporting on pandemic preparedness and biothreats were on hold. “Building a Stronger Future is assessing its finances and, concurrently, talking to other funders about taking on some of its grant portfolio,” they wrote.

Why isn't the former FTX Future Fund team doing this too? Instead of the symbolic gesture of resigning, why are they not doing their best to raise other funds in order to keep the funding promises they made? Don't they believe in their own org's grant evaluations? Because if they do, they should be talking to OpenPhil about it right now.

Or are they already doing this? But then why not tell us? This is not the time for secrecy. 

 

[This comment is no longer endorsed by its author]

How much money was committed in grants that will now not be paid out?
Additionally, it would be useful to know the distribution of this money among cause areas.

While some people are focused on figuring out what went wrong with FTX and why, the rest of us need to focus on mitigating the immediate damage from broken funding promises. It would be helpful to know the total scale of this situation.
