
Normally we think of cause areas as problems where each of us picks one or two to pursue with serious individual effort. But on social and political issues, the EA community's collective praise or condemnation may have a substantial social effect. Gathering our views into a clear document can heighten that impact.

I envision something similar to the open letters we commonly see circulated by members of some professions and societies.

Here's a simplified outline to show an example of what I mean:

----------------------------------------------------------

Example Statement

Democracy Assistance Efforts Must Be Continued

The president's FY20XX budget request asks for a decrease in American funding for democracy assistance. This shortfall will reduce our ability to monitor elections for freedom and fairness, assist political parties, ... etc.

Numerous foreign policy experts say that these efforts are effective at promoting democracy (cite) (cite). The value of democracy is in turn supported by research which indicates that it protects human rights (cite) and increases economic growth (cite). Democracy assistance funding seems to do more good than many alternative uses of funding, such as tax cuts or other forms of spending (cite).

The EA community spans many people with varied backgrounds and political perspectives, but we generally agree that cutting democracy assistance funding is a bad idea for America and for the world. We ask Congress to deny the president's request.

Signed,

[EA organization]

[EA organization]

[Prominent EA person]

[Important person not normally in EA who wants to sign on anyway]

[EA layperson]

etc.

----------------------------------------------------------

Compared to how open letters are usually written, I expect we would be careful to include far more serious evidence and argument backing up our claims. These could be attached as annexes to a conventionally formatted open letter. This could greatly strengthen the letter's persuasive effect.

These letters could be used for other issues besides politics; I think they could help with just about anything that requires collective effort. For instance, we could make a public statement that we are abandoning the handshake because it spreads too much disease.

How could this go wrong?

The open letters could be released too rarely, or never. We would then forgo the opportunity to make progress on changing people and institutions, forgo the positive attention we could get from others who agree with us, and forgo the chance for more basic publicity.

In times of great controversy, statements can be less a matter of achieving change than of avoiding looking bad. If organizations don't make statements on pressing current events, people may criticize them for being silent. I think this is a minor consideration (just let them complain; most people won't care), but it is still worth noting.

Meanwhile, if we release statements excessively, there could be a variety of problems.

First, we could promote points of view that are actually harmful, especially if we proceed without proper vetting and debate.

Second, any EA who disagrees with a statement may (quite understandably) feel alienated from the EA community. We could mitigate this by recording credible EA disagreement on the open letter itself. I'm not sure whether that is a good idea: we should generally avoid marginalizing the voices of other EAs, but including their dissent would weaken the letters' impact. I doubt there will ever be truly unanimous agreement among EAs, even when research and expert opinion all point the same way; some vaguely involved people will always complain and demand to be included.

Third, a statement could hurt our reputation among outsiders who disagree with it.

Fourth, it could create a harsher expectation from people outside EA about everything we don't speak up on. E.g., "you released a statement for X, so why are you silent on Y?"

How should this process be handled?

The way open letters are normally done, anyone can simply start one, and it circulates and gathers signatures depending on how popular it is. EAs could have made such open letters already; it seems people have just been reluctant to do so, or have not thought of it.

However, I think this would be a bad model for EA and should not be normalized. The main reason is the unilateralist's curse: it's too easy for any one person to make an unsubstantiated or controversial statement and get signatures from a significant minority, or even a small majority, of EAs. So if we do normalize this behavior, we'll end up making too many statements that don't properly represent the community consensus, along with a lot of internal controversy that will anger people and waste our time.

Instead of letting someone like me (who already cares about democracy assistance) release such a statement, control its wording, and gather signatories, it would be better for a fixed, neutral body to perform this service. Because the body is preselected, there is no bias from an author promoting their own cause. Because only one group holds this responsibility, there is no unilateralist's curse. And because the body has standing and reputation within the EA community, it will be more strongly pressed not to deviate from the rest of us, and its statements will carry more legitimacy.

EA organizations sometimes put out their own statements on general current events. For instance, I noticed one or two EA organizations (does Twitter count?) make official statements following the murder of George Floyd. These came out after a sustained period of public controversy, and I don't think they will accomplish anything notable in terms of political reform. This also seems like a poor way to handle it: individual organizations can fall victim to the unilateralist's curse, and people can improperly take such statements to be representative of EA writ large.

EA orgs should stick to their areas of expertise, be cautious before making statements about other issues, and not be pressured into doing so. It is also simply redundant labor for many organizations to separately investigate an issue outside their typical wheelhouse. It seems healthier and safer all around for them to just sign onto a statement produced more carefully to robustly represent EAs. The individual organization is then less subject to criticism for flaws in the statement, while the statement itself has a more potent impact because it gathers all our voices into an authoritative document.

There could even be a norm that only the fixed body make such public statements, and other organizations could say "we do not make our own public statements; instead, we sign onto this procedure, which is inclusive of the rest of the EA community." That would help them ward off pressure and could prevent them from making ill-advised statements that might reflect improperly on EAs in general.

An exception is when organizations promote issues which are squarely within their wheelhouse. For instance, FLI produced two open letters, one opposing autonomous weapons and another providing research priorities for robust and beneficial AI. Though I disagree with one of these letters, I tentatively feel that the process is OK, since EA organizations should be procedurally free to make progress within their cause areas however they see fit.

Regardless, my primary interest here is in amplifying a credible EA voice by adding a new statement procedure; changing what individual EA organizations do is of dubious and lesser importance.

What would this fixed body look like?

It should represent the EA community fairly well. It could simply be the Center for Effective Altruism, or maybe some other community group.

Fixed doesn't necessarily mean centralized. The process could be run by community voting, and the statement might be written as a collective Google Document, as long as there is a clear and accepted procedure for doing it.
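To make "a clear and accepted procedure" concrete, here is a minimal sketch of what the tallying step of a community vote could look like. To be clear, this is purely illustrative: the Vote structure, the should_release function, and the 0.8 supermajority threshold are hypothetical choices of mine, not a concrete proposal.

```python
# Purely illustrative sketch: community members cast approval votes on a
# draft statement, and the statement is only released if approvals clear
# a supermajority threshold. The 0.8 threshold is a hypothetical number.
from dataclasses import dataclass

@dataclass
class Vote:
    voter_id: str
    approves: bool

def should_release(votes: list[Vote], threshold: float = 0.8) -> bool:
    """Return True if the share of approvals meets the threshold."""
    if not votes:
        return False
    approvals = sum(1 for v in votes if v.approves)
    return approvals / len(votes) >= threshold

# Three approvals out of four is 0.75, short of the 0.8 threshold,
# so this draft would not be released.
ballots = [Vote("a", True), Vote("b", True), Vote("c", True), Vote("d", False)]
print(should_release(ballots))  # False
```

The point is only that whatever mechanism we pick, the release rule should be explicit and agreed upon in advance, so that no one can dispute afterward whether a statement legitimately represented the community.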

We could elect someone who has the job of running this process.

What would we write for an issue where EAs truly disagree with one another?

We could stay silent, or we could write an inoffensive letter that emphasizes our few areas of common ground. The latter may also include explanations of our areas of internal disagreement and a general plea for other people to recognize that there are good points on both sides of the debate.


Comments (7)

I think CEA often plays the role of expressing some sort of aggregate or social choice for the EA movement - like in the case of the guiding principles.

On the other hand, I take reputational risk really seriously, especially if we start criticizing policy decisions or specific institutions; so it would be more prudent to have particular organizations issue statements and open letters (like TLYCS, or FLI, etc.), so that any eventual backlash wouldn't spread to EA as a whole.

Those open letters could also be accompanied by the option to sign them and thereby signal the support of a larger group of people. That said, I think an open letter signed by relevant experts, like the Open Letter on Artificial Intelligence, gets more traction (I would be interested to see data on this). So that would probably not be a particularly useful way to use the voice of a particular community.

It would be much easier to make a single, more generic policy statement. Something like:

When in doubt, assume that most EAs agree with whatever opinions are popular in London, Berkeley, and San Francisco.

Or maybe:

When in doubt, assume that most EAs agree with the views expressed by the most prestigious academics.

Reaffirming this individually for every controversy would redirect attention (of whatever EAs are involved in the decision) away from core EA priorities.

An interesting idea for sure. I see two major points that would have to be clarified and that may speak against it.

1. There is the question of what kind of claim can be made about who such a letter speaks for. Procedures to ensure a letter truly speaks for 'the EA community' would be tricky to devise, I think. First, opinions on many issues are very diverse. Second, the community is not organized strictly enough to plausibly establish 'representative voting' or a similar mechanism.

2. I see a clear danger of politicization of the movement, because the occasions for such open letters are often current events or public policy. These occasions carry the risk of making the EA community appear to be a specific political camp, which could negatively impact its appearance of independence and credibility.

This is interesting.

I often see questions like "What does EA think about ...?"

And then someone answers with: "EA" doesn't have a formal position on this, but ...

Your proposal would help to remedy this, so that EA *would* have a position. (sometimes)

A myriad of objections spring to my mind, and the proposal certainly comes with risks. But perhaps the objections are surmountable, and the benefits may outweigh the risks.

Could you expand on your thoughts on the benefits of this proposal?

This is a really interesting idea. I instinctively have a couple of concerns about it:

1) What is the benefit of such statements? Can we expect the opinion of the EA community to really carry much weight beyond relatively niche areas?

2) Can the EA community be sufficiently well defined to collect opinion? It is quite hard to work out who identifies as an EA, not least because some people are unsure themselves. I would worry that any attempt to define the EA community too strictly (such as when surveying the community's opinion) could come across as exclusionary and discourage some people from getting involved.

Definitely an interesting idea, which also raises questions about the structure.

[This comment is no longer endorsed by its author]