A personal reflection on how my experience of EA resembles my experience of religious faith: both provide a sense of purpose and belonging, but I miss the assurance of my own intrinsic value, and that absence can make it difficult to maintain a stable sense of self-worth.

Note: I realize that my experience of religion and faith is probably different from that of a lot of other people. My aim is not to get into a discussion of what religion does right or wrong, especially since I am no longer religious.


I grew up with a close connection to my local church and was rather religious until my mid-late teenage years. I am now in my thirties and have been involved with the EA movement for a couple of years. To me, there are similarities between how I remember relating to faith and church and how I now relate to the EA philosophy and movement.

For me, both provide (provided) a strong sense of purpose and belonging. There is a feeling that I matter as an individual and that I can have an important mission in life, that I can even be some kind of heroine. For both, there is also a supportive community (of course not always for everyone, but my experience has been mainly positive in both cases) that shares my values and understands and supports how this sense of mission affects many of my important life decisions. This is something that I find very valuable.

However, in comparison to what my faith and church used to offer me, there is something lacking in the case of EA. I miss the assurance that I as a person have an intrinsic value, in addition to my instrumental value as a potential world-saviour. With faith, you are constantly reminded that God loves you, that God created you just as you are and that you are therefore, in a sense, flawless. There is a path for everyone, and you are always seen and loved in the most important way. This can be a very comforting message, and I feel it has a function to cushion the tough demands that come with the world-saving mission. The instrumental value you have through your mission to do good is in a way balanced by the assurance that no matter what, you also have infinite intrinsic value.

With EA, I don’t find any corresponding comforting thought or philosophy to rest in. If I am a well-off, capable person in the rich world, the QALYs I could create or save for others are likely to be much more than the QALYs I can live through myself. This seems to say that my value is mostly made up of my instrumental value, and that my individual wellbeing is less important compared to what I could achieve for others.

I believe that if community members perceive that their value is primarily instrumental, this might damage their (our) mental well-being, specifically risking that many people might suffer burnouts. The idea that most of the impact is achieved by a few, very impactful people could also make the people who perceive themselves as having potential for high impact particularly vulnerable, since the gap between their intrinsic value or self-worth and their instrumental value would seem even wider.

If the value of our work (the QALYs we can save) is orders of magnitude greater than the value of ourselves (the QALYs we can live), what does that mean? Can we justify self-care, other than as a means to improve ourselves to perform better? Is it possible then to build a stable sense of self-worth that is not contingent on performance?

I have read several previous posts by EAs struggling with feelings of not achieving enough (In praise of unhistoric heroism, Doing good is as good as it ever was, Burnout and self-care), and to me this seems closely related to what I’m trying to address here.

I’m not sure what can be done about this on a community level. As an individual, I believe it will be important for me to find a way to maintain a stable sense of self-worth, while still staying intellectually honest with myself and committed to the EA ideals. If there are others who have also thought about or struggled with this, I would greatly appreciate your input.


This definitely resonates with me, and is something I've been thinking about a lot lately, as I wrestle with my feelings around recreational activities and free time. I'm not sure if what follows is exactly an answer to your question, but here's where I'm at in thinking about this problem.

I think one thing it's very important to keep in mind is that, in utilitarianism (or any kind of welfarist consequentialism) your subjective wellbeing is of fundamental intrinsic value. Your happiness is deeply good, and your suffering is deeply bad, regardless of whatever other consequences your actions have in the world. That means that however much good you do in the world, it is better if you are happy as you do it.

Now, the problem, as your post makes clear, is that everyone else's subjective wellbeing is also profoundly valuable, in a way that is commensurate with your wellbeing and can be traded off against it. And, since your actions can affect the wellbeing of many other people, that indirect value can outweigh the direct value of your own wellbeing. This is the fundamental demandingness of consequentialist morality that so many people struggle with. Still, I find it helpful to remember that the same reasoning that makes other people so valuable also makes me valuable, in a deep and fundamental and moral way.

Turning to instrumental value, I have two things to say. The first is about instrumental value in general, and the second is about the specific instrumental value of self-kindness.

The first thing I want to say is that almost everything I value I value instrumentally, and that fact does not make the value of those things less real, or less important. I care a great deal about freedom and civil liberties and democracy, and would pay high costs to protect those things, even though I only value them instrumentally, as ways to create more happiness and less suffering. I hate racism and speciesism and sickness and ageing, not because they are intrinsically bad in themselves, but because they are the source of so much suffering and foregone happiness. For some reason, we tend to view other things' instrumental value as deeply important, and our own instrumental value as a kind of half-real consolation prize. I think this is a tragic error.

Secondly, with regard to our own instrumental value, most people tend to significantly underestimate just how instrumentally valuable their mental health is. In my experience, when people think and talk about the instrumental value of their own wellbeing, they seem to have in mind some kind of relaxation reserve that it's important to keep full in order to avoid burnout. I think something like this is probably true, but I also think that there's much deeper and broader instrumental value in being kind to yourself.

My ideas here aren't fully developed, but I think there's something toxic about too much self-abnegation, that whittles away at one's self-esteem and courage and enthusiasm and instinctive kindness toward others. At least for me, self-denial and guilt push me towards a timid and satisficing mindset, where I do what is required to not feel bad about myself and don't envision or reach out for higher achievements. It also makes me less instinctively kind to others, which has a lot of compounding bad effects on my impact, and also makes it harder for me to see and embrace new and different opportunities for doing good.

I'm still thinking through this shift in how I think about the instrumental value of my own wellbeing, but I think it has some pretty important consequences. Compared to the reserve-of-wellbeing model, it seems to militate in favour of being more generous to myself with my free time, less focused on self-optimisation insofar as that feels burdensome, and more focused on self-motivation through rewards rather than threats of self-punishment. How exactly this kind of thinking cashes out into lifestyle choices probably varies a lot from person to person; my main goal here is to illustrate how one's conception of one's instrumental value should be broader and deeper than just "if I don't relax sometimes I'll burn out".

In summary:

  • The same thing that makes it important to work for the wellbeing of others also makes you deeply and intrinsically valuable – to me, to others here, and hopefully also to yourself.
  • The instrumental value of your wellbeing is also deeply important, not merely some kind of second prize. Think about how you think about other things that you value a lot instrumentally, and compare how you think about your own instrumental value: are they the same?
  • The variety and scale of the effects of your wellbeing on your impact are probably greater than you think: your wellbeing isn't just instrumentally valuable, it's very very instrumentally valuable, in all kinds of hard-to-quantify ways.
  • Even if, at some point in the future, your wellbeing no longer has much instrumental value, you will still be just as intrinsically valuable as you are now: which is to say, very. The thing that makes you value the other sentient beings whose wellbeing you strive for will still apply to you: as long as you exist, you are important.

Something I didn't say in my big comment above: I'm really happy the people in this thread are approaching this with the goal of "still staying intellectually honest with" ourselves. I think there's a lot of seductive but misleading thinking in this space, and that there's a strong urge to latch onto the first framing we find that makes us feel better in the face of these issues. I'm happy to see people approach this problem in the same truth-first mindset they apply to doing good in the world.

On this point...there are a few arguments made in other comments here that I don't find very persuasive, but am avoiding arguing against for fear of seeming disagreeable or causing distress to people with fragile self-worth. What are people's thoughts about norms around arguing in these kinds of situations – or even raising the question in the first place?

EDIT: From my side, if there's an argument that I'm making that someone thinks is shaky, I'd rather they told me so – privately or publicly, as they prefer.

I think we can assume that people on this forum seek truth and personal growth. Of course, this is challenging for all of us from time to time.

I think having a norm of speaking truthfully and not withholding information is important for community health. Each one of us has to assume the responsibility of knowing our own boundaries and pushing them within reasonable bounds, as few others can be expected to know us well enough. Combined with the fact that in this case people have consciously decided to *opt in* to the discussion by posting a comment, I would think it overly cautious to refrain from replying.

There surely are edge cases that are more precarious and deserve tailored thought but I think this isn't one.

If you know somebody well enough to think they are pushing their boundaries in unsustainable ways, I would reach out to them and mention exactly that thought in a personal message. Add some advice on how to engage with the community and its norms sustainably, link to posts like this showing that we all struggle with similar problems, and then people can also work through possible problems regarding "not feeling good enough".

Personally, I'd rather be forced to live in reality than be protected because people worry I might not be able to come to grips with it. One important reason for which I like the EA community is that it feels like we all have consented to hearing the truth, even if it might be uncomfortable and imply labour.

I can obviously only speak for myself, but for me just having this kind of conversation is in itself very comforting since it shows that there are more people who think about this (i.e. it's not just "me being stupid"). Disagreement doesn't seem threatening as long as the tone is respectful and kind. In a way, I think it rather becomes easier to treat my own thoughts more lightly when I see that there are many different ways that people think about it.

It happens in philosophy sometimes too: "Saving your wife over 10 strangers is morally required because..." Can't we just say that we aren't moral angels? It's not hypocritical to say the best thing to do is save the 10 strangers, and then not do it (unless you also claim to be morally perfect). Same thing here. You can treat yourself well even if it's not the best moral thing to do. You can value non-moral things.

This feels...not wrong, exactly, but also not what I was driving at with this comment. At least, I think I probably disagree with your conception of morality.

Thanks a lot for this comment. I feel like I need to read it over again and think more about it, so I don't have a detailed or clever response, but I really appreciate it. The comparison to other things that have mainly or only instrumental value, and how much we actually value those things, was also a new and useful perspective for me.

Thank you so much, C Tilli, for putting this into words and blogging it. I have similar thoughts, but I never could articulate them so clearly.

Like you, I had various connections to Christians and the church when I was younger. I am no longer religious, but I miss this comforting feeling of self-worth and being loved, no matter what, that came with the beliefs that were held in my community.

Like you, I have not yet been able to find a similar comfort in the EA movement, and it challenges my perceived self-worth. And thank you, willbradshaw, for your answer.

I do totally understand the worth of instrumental value, but it is still not as reassuring for me as I wish it would be.

Do I just have to accept that feeling as some kind of "price" to pay when you stop believing in stuff that was designed to comfort people (and probably also to establish power over them, but I put that aside for the moment) and instead seek out a fact-based worldview? Or is it more a matter of getting used to it, slowly shifting your views and perspectives, and – after some time – getting the same comfort from the beliefs you expressed above?

I think all the different framings you suggest are at least partly true.

I think this is one of the fundamental challenges of EA, and is going to take a lot of different people thinking hard about it to really come to grips with as a community. I think it will always be a challenge – EA is fundamentally about (altruistic) ambition, and ambition is always going to be in some degree of tension with the need for comfort, even if it simultaneously provides a great deal of meaning.

As you say, I'm not sure EA will ever be as comforting as religion – it's optimising for very different things. But over time I hope we will generate community structures and wisdom literature to help manage this tension, care for each other, and create the emotional (as well as intellectual) conditions we need to survive and flourish.

First, of course, thanks, C Tilli, for the post, and thanks willbradshaw for these comments.
This pierced my mind:

As you say, I'm not sure EA will ever be as comforting as religion – it's optimising for very different things. But over time I hope we will generate community structures and wisdom literature to help manage this tension, care for each other, and create the emotional (as well as intellectual) conditions we need to survive and flourish.

I think my background is the opposite of C Tilli's: I have been an atheist for many years (and still am – well, maybe more of an agnostic, since we might be in a simulation...), but since I found out about EA, I think I have become a little more understanding towards not only the need for comfort, but also the idea, sought by religious people, of valuing something that goes way beyond one's own personal value and social circle (on the other hand, I have also become a little suspicious of some cult-like traits we might be tempted to mimic).

I am sort of surprised we wrote so much, so far, without talking about death and mortality. I know I have intrinsic value, but it's fragile and perishable (cryonics aside); and yet, the set of things I can value extends way beyond my perishable self – actually, my own self-worth depends a little bit on that (as Scheffler argues, it'd be hard not to be nihilistic if we knew humanity was going to end after us), and there's no necessary upper bound for what I can value. I reckon that, as much as I fear humanity falling into the precipice, I feel joy in thinking it may continue for eons, and that I may play a role, contribute and add my own personal experience to this narrative.

I guess that's the 'trick' played by religion that might be missing here: religion 'grants' me some sort of intrinsic value through some metaphysical cosmic privilege (or the love of God) – and this provides us some comfort. But then, without it, all that is left, despite enjoyable and worthy, is perishable – transient love, fading joy, endured pain, limited virtue, pleasure... Like Dworkin (who considered this to be a religious conviction – though non-theistic), we can say that a life well-lived is an achievement in itself, and stands for itself even after we die, like a work of art – but art itself will be meaningless when humanity is gone. Maybe altruism is just another way to trick (the fear of) death: when one realizes that "All those moments will be lost in time, like tears in rain. Time to die," one might see it not as realizing some external value, but as an important part of one's own self-worth. (If Blade Runner is too melodramatic, one can use the bureaucrat in Ikiru as an example of this reasoning.)

For whatever reason people who place substantial intrinsic value on themselves seem to be more successful and have a larger social impact in the long term. It appears to be better for mental health, risk-taking, and confidence among other things.

You're also almost always better placed than anyone else to provide the things you need — e.g. sleep, recreation, fun, friends, healthy behaviours — so it's each person's comparative advantage to put extra effort into looking out for themselves. I don't know why, but doing that is more motivating if it feels like it has intrinsic and not just instrumental value.

Even the most self-effacing among us have a part of their mind that is selfish and cares about their welfare more than the welfare of strangers.

Folks who currently neglect their wellbeing and intrinsic value to a dangerous extent can start by fostering ways of thinking that endorse and build up that selfishness.

For whatever reason people who place substantial intrinsic value on themselves seem to be more successful and have a larger social impact in the long term. It appears to be better for mental health, risk-taking, and confidence among other things.

I think this is still an instrumental reason for someone to place "substantial intrinsic value on themselves." Though I have no problem with that, I thought what C Tilli complained about was precisely that, for EAs, all self-concern is for the sake of the greater good, even when it is rephrased as a psychological need for a small amount of self-indulgence.
Second, I'd say that people who are "more successful and have a larger social impact in the long term" are "people who place substantial intrinsic value on themselves," but that's just selection dynamics: if you have a large impact, then you (likely) place substantial intrinsic value on yourself. Even if it does imply that you're more likely to succeed if you place substantial intrinsic value on yourself (if only people who do that can succeed), it says nothing about failure – confident people fail all the time, and the worst way of failing seems to be reserved for those who place substantial value on themselves and end up being successful with the wrong values.

But I wonder if our sample of “successful people” is not too biased towards those who get the spotlights. Petrov didn’t seem to put a lot of value on himself, and Arkhipov is often described as exceptionally humble; no one strives to be an unsung hero.

Actually my concerns are more practical, along the lines of Robert's comment: this kind of thinking could be bad for mental health and, indeed, long-term productivity and impact. If the perception of self-worth didn't seem important for mental health, I would not care much about it.

But it would be a sad scenario if we look back in 50 years and see that the EA movement has led to a lot of capable, ambitious people burning out because we (inadvertently) encouraged (or failed to counteract) destructive thought patterns.

I don't think there is a simple solution, but I think Will Bradshaw is on to something in his comment about the need to "generate community structures and wisdom literature to help manage this tension, care for each other, and create the emotional (as well as intellectual) conditions we need to survive and flourish."

It's kind of sad to revisit this discussion during SBF's trial

I feel like one of the problems EA has is that it is a community that largely rejects metaphysics in a society that is based on numerous metaphysical claims. EA doesn't say there are "good people" and "bad people" who "deserve" certain outcomes, only delving into metaphysics briefly to claim that certain outcomes are "good." So, as an apostate, I feel like I'm still sort of running on the "I'm inherently a sinner...unless" -> "I deserve to suffer...unless" from my time as a Christian, and since EA doesn't offer a strong alternative metaphysics due to rejecting the category, I'm just sort of stuck there, trying to tone down my earlier thoughts rather than having something else replace them. IDK, though. Maybe if I grew up in a society that didn't tell me I was inherently worthless, I'd still need a source of positive self-worth, but probably not as badly as I do now.

I'm also a bit unsure whether claiming "I'm inherently valuable" even is against truth-seeking norms. After all, this may not be expressing something that even has a truth-value. It may be meaningless language, so you're essentially just hugging yourself when you make that seeming claim (I'm not quite a logical positivist, so I think I disagree, but...).

I wonder to what extent this springs from the fact that most pastors do not expect most of their congregants to achieve great things. Presumably if you are a successful missionary who converts multiple people, your instrumental value significantly exceeds your intrinsic value, so I wonder if they have the same feelings. An extreme case would be someone like Moses, whose intrinsic value presumably paled into insignificance compared to his instrumental value as a saviour of the Israelites and passing on the Word of God.

In any case, I think there is a strong case to be made for spending resources on yourself for non-instrumental reasons. Even if you don't think you matter more than anyone else, you definitely don't matter less than them! And you have a unique advantage in spending resources to generate your own welfare: an intimate understanding of your own circumstances and preferences. When we give to help others, it can be very difficult to figure out what they want and how to best achieve that. In contrast, I know very well which things I have been fixated on!

Interesting thought. I'm not sure if what I had was the mainstream understanding of Christianity, but I didn't experience that there was this kind of conflict in the same way. I'd think that the intrinsic value of being created and loved by God was not really something that could pale in comparison to anything. But I don't know, and maybe it's not very important.

I think there is a difference between justifying spending resources on our own wellbeing and being able to feel valuable independent of performance. Feeling valuable is of course related to feeling like we deserve to be spent resources on, but I don't think it's exactly the same.

As the author of this post, I found it interesting to re-read it more than a year later, because even though I remember the experience and feelings I describe in it, I do feel quite differently now. This is not because I came to some rational conclusion about how to think of self-worth vs instrumental value, but rather the issue has just kind of faded away for me.

It's difficult to say exactly why, but I think it might be related to the fact that I have developed more close friendships with people who are also highly engaged EAs, where I feel that they genuinely care about me and spend time not just supporting me on high-impact work, but also socially checking in and hanging out, joking or talking about private stuff – that they like me and care about me as a person.

This makes me question the assumptions I made in the post about how feelings of self-worth are created in the religious context. Perhaps even in church the important thing is not the abstract idea of being "perfect in God's eyes", but rather the practical experience of feeling loved and accepted by the community and knowing they have your back. If this is right, that's a very good thing, as it can be re-created in a non-religious context.

So, if I'd update this post now, I might be able to develop some ideas for how we could work on this:  perhaps a reason to be careful with over-optimizing our interpersonal meetings?

I don't think this exactly answers your question as it doesn't address self-worth, but for me a great piece of advice I was given was: "put on your own oxygen mask before helping others." Fundamentally, we are only successful when we look after ourselves, and direct our energies to things that we want or can get some enjoyment from, rather than forcing ourselves (in the long-run at least) to do things we don't want to do.

I found this a helpful sound bite, thank you

The idea that most of the impact is achieved by a few, very impactful people could also make the people who perceive themselves as having potential for high impact particularly vulnerable, since the gap between their intrinsic value or self-worth and their instrumental value would seem even wider.


Not sure if relevant to what you're saying, but there's this very interesting paper that shows:

Suppose that all people in the world are allocated only two characteristics over which they have (almost) no control: country of residence and income distribution within that country. Assume further that there is no migration. We show that more than one-half of variability in income of world population classified according to their household per capita in 1% income groups (by country) is accounted for by these two characteristics. The role of effort or luck cannot play a large role in explaining the global distribution of income.

This has obvious implications for how much people can realistically earn to give, but it also suggests that other forms of impact, like social impact, might be mostly outside people's control. This is a good reason not to be too hard on oneself for not achieving more, and not to compare oneself to people like Bill Gates.

This blog post "Why not give 90%?" also seems relevant. 

Thank you for this. It's something I've struggled with a lot, too - I wish I had more to add right now but tonight's a really hard night for me, and delving too deeply into my psychological issues tends to trigger more negativity.

I will say, however, that I've been slowly going through the Replacing Guilt blog post series (from Nate Soares) and it has been helpful in dealing with some of these issues, e.g. "feeling like a bad person for not doing enough [of what makes me instrumentally valuable]."

I also do think that the instrumental value argument for self-care is pretty compelling to me; I don't think I actually need to view myself as inherently worthy. I think this stems from a controversial belief that I'm not sure people inherently 'deserve' to live (although this is not a coherent belief because I waver between saying that nobody deserves to die and that some people use up more resources than they give back, which means they don't 'deserve' their resources), so I'm being internally consistent when applying it to myself. Regardless, the instrumental argument is difficult enough for me to put into practice!

Hi Miranda! I'm glad you liked it, and I hope you feel better now. Since it's been a while since I wrote this I realize my perspective changes a lot over time - it feels less like a conflict or a problem for me right now, and not necessarily because I have rationally figured something out, it's more like I have been focusing on other things and am generally in a better place. I don't know how useful that is to you or anyone else, but to some extent it might mean that things can sometimes get better even if we don't solve the issue that bothered us in the first place.

Regardless, the instrumental argument is difficult enough for me to put into practice!

Thinking of myself as a role model to others has been the most useful to me. Instead of thinking of exactly how much rest/vacation/enjoyment I need to function optimally, I try to think more about what are healthy norms to establish in a workplace or a community. What is good about that is that I get away from the tendency of thinking of myself as an exception who can somehow manage more than others - instead of thinking "Can I push myself a bit further?" the question becomes "Is it healthy/constructive if we all push ourselves in this way?"

But more than having "figured it out", I have mostly just reached some kind of pragmatic stance where I allow myself to be egoistic, in the sense that I prioritize myself and my loved ones much, much higher than we would "deserve" from some kind of detached ethical perspective. I don't have any way to justify it really, I just admit it and accept it, and it helps me to move on to thinking about other things instead.

Feel free to reach out over DM if you want to chat!

Thank you for responding to my comment and sharing your (more recent) experience! I agree that I don't need to 'solve' it intellectually - I've never felt like my philosophy holds me back from feeling fulfilled and I think the issue of low self-confidence is at least partly separate. I'm very glad to hear that you are in a better place now. :)

The role model concept is definitely something I've heard before and while it doesn't really make self-care easy, I agree that it is useful - e.g. when I feel guilty about not working overtime, I remind myself that I would prefer + want to create a society that doesn't incessantly overwork. Why would anyone want to join a community that doesn't encourage individual flourishing? 

Thank you again for your kind words and your offer! I think I'm good for now but will keep in mind. In the mean time, I hope to see you around the forum!

And totally agree about the Replacing Guilt series, it's really good.

Great post! I've also experienced similar things during my time with EA. I think there are several ways to approach the issue of self-worth:

  1. It's important to realize that EA is not the same as utilitarianism and therefore does not suffer from the problem of demandingness (this is also discussed in the latest 80K podcast with Benjamin Todd). EA does not prescribe how much of our resources we should share, only that the ones we do share should be distributed in an effective way.
  2. Unfortunately there is a tendency in EA to undervalue "small" contributions (i.e. those made by care workers, nurses, GPs etc.). I think we need to realize that every contribution people can make to the common good is good, no matter how small. I don't think that someone who saves less than one life in expectation should feel any worse than people who save thousands or millions of lives. In any case, I wouldn't go around telling people that they should feel worthless if they are not working on something super important for humanity (if that were the case, we'd need to reach more than 99% of the humans on earth to tell them that they are worthless). This is clearly an absurd position, so why should we be telling ourselves that?

I think I mostly agree with this, and I'd also like to clarify that I don't think this problem originates from EA or from my contact with EA. It is not that I feel that "EA" demands too much of me, rather that when I focus a lot on impact potential it becomes (even more) difficult to separate self-worth from performance.

Different versions of contingent self-worth (contingent self-esteem, performance-contingent self-esteem - there are many similar concepts and I am not completely sure which terms to use, but basically the idea that how much we like and value ourselves is strongly tied to our ability to perform) seem to be a problem for a lot of people outside of EA as well, and they also relate to the risk of burnout.

My thinking is that there are people with this issue in EA, possibly more than in the general population, and that even though it does not come from EA philosophy, there is some relation between these types of self-worth issues and a focus on instrumental value. I'm not arguing that this is "right" or useful - I think it would be a lot better if we could all have a strong and stable sense of non-contingent self-worth.

I'm what one might call a "mom friend" - I often give emotional support to my friends when they have problems, while not demanding much from my friends in return for my emotional labor. I feel like I've been approaching effective altruism in a similar way - I give a lot, emotionally and intellectually, to the world and strive to do as much good for others in the long term as possible, but I don't feel like I deserve much from the world "in return" because of the privileges my circumstances have afforded me.

Overall, I feel like it's an unhealthy way of approaching relationships and effective altruism. In terms of relationships, I want to feel free to demand more from my friends, coworkers, and employers. In terms of the world... I wouldn't demand anything from the global poor, but I want to expect more from my own society. If I could "ask" society for something, it would be more personal freedom to pursue things that would give me fulfillment. But maybe I just have to take it.

I think you're conflating moral value with value in general. People value their pets, but this has nothing to do with the pet's instrumental moral value.

So a relevant question is "Are you allowed to trade off moral value for non-moral value?" To me, morality ranks (probability distributions of) timelines by moral preference. Morally better is morally better, but nothing is required of you. There's no "demandingness". I don't buy into the notions of "morally permissible" or "morally required": These lines in the sand seem like sociological observations (e.g. whether people are morally repulsed by certain actions in the current time and place) rather than normative truths. 

I do think having more focus on moral value is beneficial, not just because it's moral, but because it endures. If you help a lot of people, that's something you'll value until you die. Whereas if I put a bunch of my time into playing chess, maybe I'll consider that to have been a waste of time at some point in the future. There are other things, like enjoying relationships with your family, that also aren't anywhere close to the most moral thing you could be doing, but that you'll probably continue to value.

You're allowed to value things that aren't about serving the world.

I echo the thoughts here re helping yourself generally being the smart thing to do. I personally love that my current work relies so heavily on my mental wellbeing; it means I can't tempt myself with overly self-sacrificial narratives.

This said, I also LOVE that EA isn't about me/us. It's the tool for doing more good with our careers, and lots of the people involved in it make for great like-minded friends, but it isn't, and shouldn't be, our home, or crutch.

I don't think a community full of people who [make EA too much their everything] is as stable or as robust as one full of people with simply a shared mission.

I like that I have a strong instrumental reason, beyond just common sense, not to feel like maybe I am compelled to make EA/utilitarianism/impact my everything.
