I have received funding from the LTFF and the SFF and am also doing work for an EA-adjacent organization.

My EA journey started in 2007 as I considered switching from a Wall Street career to instead help tackle climate change by making wind energy cheaper – unfortunately, the University of Pennsylvania did not have an EA chapter back then! A few years later, I started having doubts about my decision that climate change was the best use of my time. After reading a few books on philosophy and psychology, I decided that moral circle expansion was neglected but important and donated a few thousand pounds sterling of my modest income to a somewhat evidence-based organisation. Serendipitously, my boss stumbled upon EA in a thread on Stack Exchange around 2014 and sent me a link. After reading up on EA, I then pursued E2G with my modest income, donating ~USD 35k to AMF. I have done some limited volunteering to help build the EA community here in Stockholm, Sweden. Additionally, I set up and was an admin of the ~1k-member EA system change Facebook group (apologies for not having time to make more of it!). Lastly (and I am leaving out a lot of smaller stuff like giving career guidance, etc.), I have coordinated with other people interested in doing EA community building in UWC high schools and have even run a couple of EA events at these schools.

How others can help me

Lately, and in consultation with 80,000 Hours and some “EA veterans”, I have concluded that I should instead consider working directly on EA priority causes. Thus, I am determined to keep seeking opportunities for entrepreneurship within EA, especially considering whether I could contribute to launching new projects. So if you have a project where you think I could contribute, please do not hesitate to reach out (even if I am engaged in a current project - my time might be better spent getting another project up and running and handing over the reins of my current project to a successor)!

How I can help others

I can share my experience working at the intersection of people and technology, deploying a new technology - wind energy - and its infrastructure globally. I can also share my experience of coming from "industry" into EA entrepreneurship/direct work. Or anything else you think I can help with.

I am also concerned about the "Diversity and Inclusion" aspects of EA and would be keen to contribute to making EA a place where even more people from all walks of life feel safe and at home. Please DM me if you think there is any way I can help. Currently, I expect to have ~5 hrs/month to contribute to this (a number that will grow as my kids become older and more independent).


Topic contributions

I am a bit more unsure about this, but I also think this cuts the other way - if someone at an event loudly went around advocating for forcefully taking rich people's money (e.g. by nationalising their wealth in an unprecedented and somewhat aggressive way) to fund egalitarian project X, I think one could also argue that such people make others uncomfortable enough that their attendance is undesirable.

Fair point. Where to draw the line between what is and isn't politics isn't clear cut - or, as Thomas Mann put it: "Everything is politics." Perhaps pimples are less political than comments that relate to e.g. religion or something else "structural". Where I feel there is something to my comment is if one then concludes something like "it is ok to offend someone as long as the offence ties into power structures". This would theoretically mean it is ok to comment on someone with a lower income about e.g. their cheap clothing (or pick your physical proxy for class). That does not seem right, so I still think that people acting offensively regarding race should be encouraged to change their behavior to be less offensive. And if there is a need to discuss something offensive (e.g. the horror that followed the bombing of Hiroshima in a discussion of nuclear weapons), maybe make this clear to participants in advance so they can avoid the event, or the relevant part of it, if that is a challenging topic for them.

I think if an invited person were known at events to get drunk, go up to people, and comment negatively on their least flattering physical feature (e.g. "your pimples are gross"), it would not be a worry if that person was not invited. This is not about politics but about inappropriate behaviour.

I love this post! And I think Ville and Jona might have done this at least partially unpaid, so no criticism here (also, the date preceded the announcement): I just want to put down a marker for future such "EA funding overviews" to strongly consider including the Navigation Fund (and any other significant donors I might have missed). If anyone comes across other overviews of funding here on the EAF, I suggest leaving comments like this one, both for people looking for funding and for future authors of such overviews to include additional sources of funding.

It would be hard to imagine he has no interest - I would say even a simple bonus scheme, whether stock, options, cash, etc., would count as an "interest". If the company makes money, then so does he.

I think what would be more helpful for me is knowing what else was discussed in board meetings. Even if GPT was not expected to be a big deal, if they were discussing, for example (a hyperbolic example), whether to have a coffee machine at the office, then not mentioning ChatGPT would be striking. On the other hand, if they only met once a year and only discussed e.g. whether they were financially viable, then perhaps not mentioning ChatGPT makes more sense. And maybe even this is not enough - it would also be concerning if some board members wanted more information but did not get it. If a board member requested more info on product development and ChatGPT was not mentioned, this would also look bad. I think the context and the particulars of this board are important.

Yeah, that was a bad example/analogy. Not sure if it is helpful, but here is what GPT suggested as a better example/response, building on what I previously wrote:

"I understand your concern about over-analyzing hobbies, which indeed might not involve significant truth claims. To clarify, my point was more about the balance between being truth-seeking and pragmatic, especially in non-critical areas.

To illustrate this, consider an example from within the EA community where balancing truth-seeking and practicality is crucial: the implementation of malaria bed net distribution programs. Suppose an EA working in global health is a devout Christian and often interacts with communities where religious beliefs play a significant role. If the EA were required to frequently challenge their faith publicly within the EA community, it might alienate them and reduce their effectiveness in these communities.

This situation demonstrates that while truth-seeking is vital, it should be context-sensitive. In this case, the EA's religious belief doesn't hinder their professional work or the efficacy of the malaria program. Instead, their faith might help build trust with local communities, enhancing the program's impact.

Thus, the key takeaway is that truth-seeking should be applied where it significantly impacts our goals and effectiveness. In less critical areas, like personal hobbies or certain beliefs, it might be more pragmatic to allow some flexibility. This approach helps maintain inclusivity and harnesses the diverse strengths of our community members without compromising on our core values and objectives."

That is a good point, and an untested assumption behind my hesitation to "fully endorse truth-seeking in EA". That said, I would be surprised if all EAs, or even a majority, did everything they do and believed everything they believe due to a rational process. I myself am not like that, even though truth-seeking is something of a basic passion of mine. For example, I picked up a hobby because I randomly bumped into it; I have not really investigated whether it is the optimal hobby given my life goals, even though I spend considerable time and money on it. I guess I am to some degree echoing the "maximization is perilous" commentary that others have made before me, in more detail and more eloquently.

I think the modest number of upvotes and agree-votes might be an indication that this "not being truth-seeking in all parts of my life" attitude is at least not totally infrequent in EA. Religion is just one example; I can think of many more, perhaps more relevant ones, but it's a bit of a minefield - again, truth-seeking can be pretty painful!

I am not super confident about all this, though. I made my comments more in case what I am outlining is true, so that people know they are welcomed by at least some in the EA community, as long as they are truth-seeking where it matters and the areas they do not investigate deeply do not too negatively affect having the biggest possible impact we could have.

Hi Will, thanks for the comment. I agree 100% that it is very good for people to look at even hot-button topics, but to keep such explorations offline.

Perhaps something I should have clarified above - and at the risk of being perceived as speaking on behalf of others, which is not my intention (instead, I am trying to think of the least harmful example here): I was thinking that if I were someone really passionate about global health and doing it right, coming from a strong Christian background, I might feel alienated from EA if it were required of me to frequently challenge my Christian faith.

So I think I was talking in terms of an attitude or value. Continuing the above example of a Christian EA, and adding the example of an atheist (or at least agnostic) EA who is super truth-seeking across the board, I could see the latter using this post to conclude that the Christian EA is not really an EA, as that person refuses to dive deep into the epistemics of their religious belief. This is what I wanted to highlight. And personally, I think the Christian EA above is super helpful even for EAs who think they are not 100% truth-seeking: they have connections to lots of other Christians who want to do good and could influence them to do even better. They also understand large swaths of the global population, can be effective communicators, and can help ensure various initiatives, from Pause AI to bed nets, go well when delivered to Christian populations. Or they might just be a super good alignment researcher and not care too much about knowing the truth of everything. The diversity of thought they bring also has value.

That said, I think "global truth-seekers" are also really important to EA - I think we would be much worse off if we did not have any people who were willing to go into every single issue trying to make ground contact with truth.

If helpful, and very simplistically, I guess I am wondering which of the two alternatives below we think is ideal.
