Shared Consequences Do Not Divide Altruism

When scoring a consequence that you cause along with others, score your action as though you were the only person who caused the consequence. Don't try to divide the altruism score of your action among everyone whose actions were necessary causes of the consequence. The number of people who participate in causing the consequence has no bearing on the altruism of your action that causes it.
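
To make the rule concrete, here is a minimal sketch in Python. The function name, the numeric score, and the example counts are my own illustrative assumptions, not anything defined in the post; the only point is that the number of co-causers never enters the calculation.

```python
def altruism_score(consequence_value: float, num_causers: int) -> float:
    """Score your action by the full value of the consequence it causes.

    num_causers is accepted only to show that it is ignored:
    shared consequences do not divide altruism.
    """
    return consequence_value  # not consequence_value / num_causers

# Hypothetical example: a consequence you value at +10, caused jointly by 1,000 people.
print(altruism_score(10.0, num_causers=1))     # 10.0
print(altruism_score(10.0, num_causers=1000))  # still 10.0
```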

However, you probably should measure the distance in altruistic value between the consequences of less efficient and more efficient actions. Through your choice of actions you can maximize positive altruistic value and minimize negative altruistic value, if that is preferable to you.
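
As a rough illustration of comparing options, the sketch below (again with made-up action names and consequence values) computes the altruistic value distance between two actions and picks the one that maximizes positive value.

```python
# Hypothetical consequence values for two ways of pursuing the same goal.
actions = {
    "less_efficient_action": 4.0,
    "more_efficient_action": 9.0,
}

# The altruistic value distance between the two options.
distance = actions["more_efficient_action"] - actions["less_efficient_action"]
print(f"value distance: {distance}")  # 5.0

# Choosing the action that maximizes positive altruistic value, if that is preferable to you.
best = max(actions, key=actions.get)
print(f"preferred action: {best}")  # more_efficient_action
```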

Common Behaviors Define Potential Consequences

In this discussion, a potential consequence is simply an outcome that you might believe you can cause, not one that you weight with a probability.

Modern problems of altruistic value calculation involve decisions about eating, working, consuming media, supporting philanthropy, getting stuff, paying companies money, disposing of your garbage, using water, and many other common actions. These actions have some combination of individual and cumulative consequences. Typically, you and millions of others make similar decisions about which actions to perform. You choose actions whose potential consequences are the same in others' beliefs as in yours. Of course, you might score those consequences differently or hold different beliefs about the consequences of your actions. Still, you take the same actions as others, and the same outcomes occur for you as for those other people, all other things being equal.

In the developed world, the options of action available to us can run against our preferences for altruistic consequences. Many people, perhaps most, want to be altruistic. They are aware that they have few easy alternatives to common actions. They know that removing those actions from their own behavior won't change the consequences produced by the millions (or billions) of others who take the same actions. They might believe that those actions cause harm to others. In the developed world, the desire to be altruistic in everyday life is easily frustrated.

Your community judges altruistic efforts in philanthropy and in other areas. You can score and compare the altruistic value of common options of action as well. Doing so could reshape your criteria for effective philanthropy.

A Short Comment

My posts over the last few weeks summarize my Red Team Critique for the EA community. If I could, I would put the posts together into a single, illustrated document containing useful examples and a TLDR section. I don't have the opportunity right now, but that might change in future.

Thank you for reading my posts. I appreciate your interest.
