EricHerboso

I’m the Executive Director at Effective Giving Quest (effectivegivingquest.org), a fundraising org at the intersection of EA and gaming. I’m also an organizer of WikiProject Effective Altruism (https://forum.effectivealtruism.org/groups/sEzFXsuht3khWyahc), where we coordinate Wikipedia articles related to effective altruism. Most of my history with EA is through my service at Animal Charity Evaluators from 2012–2022; I helped influence ACE’s formation in 2012 and became its second paid employee in 2013 as Director of Communications. From 2019 to 2022, I served as Secretary on ACE’s Board of Directors.

I’ve been involved with the EA movement since 2011, well before the phrase "effective altruism" was coined. I hope to continue being a part of the movement for many years to come.

You can learn more about me through my personal blog at ericherboso.org. I also have an EA profile up at eahub.org/profile/eric-herboso.

Comments

EA logo and title on Reddit's r/Place

Note that the location has changed to (1906,587). Details on why/how are on the subreddit.


EDIT: After relocating to (1906,587), we have successfully been able to put up the EA logo plus the EffectiveAltruism.org URL! I'm impressed by the effort some of our redditors have put in.

While it may not be the best use of any one person's time, and it took somewhat significant effort (by 2–3 redditors) to negotiate with nearby subreddits, a Discord channel of relatively few EAs has been successful at this communications project. Please check out the image live by clicking the coordinates above, or you can see a screenshot here.

The intention from here is to defend the text from trolls so that it makes it to the final canvas when the r/Place event ends.

EDIT 2: The EA logo+text is now located at (957,1772). We are stable in this new location, thanks to help from several surrounding communities. You can see what we look like by clicking the coordinates link.

Initiative to draw the EA logo on Reddit's r/place

While the OP's effort didn't end up working, a much smaller alternate effort has been somewhat successful. The logo is at (1644,570) and coordination is at r/EffectiveAltruism.

Note that I don't actually believe this is a good use of our time; it is unlikely to result in any sort of outreach without text next to the logo, and not only does it seem unlikely that text will persist, but also the small size means it will likely get no play when the final r/Place image gets shared around online. Nevertheless, it seems like a number of EAs are interested in doing this, so I thought it might be relevant to post their success here.


EDIT: I crossed out the above because we had to relocate to (1906,587). However, we have successfully been able to put up the EA logo plus the EffectiveAltruism.org URL! I'm impressed by the effort some of our redditors have put in.

While it may not be the best use of any one person's time, and it took somewhat significant effort (by 2–3 redditors) to negotiate with nearby subreddits, a Discord channel of relatively few EAs has been successful at this communications project. Please check out the image live by clicking the coordinates above, or you can see a screenshot here.

The intention from here is to defend the text from trolls so that it makes it to the final canvas when the r/Place event ends.


EDIT 2: The EA logo+text is now located at (957,1772). We are stable in this new location, thanks to help from several surrounding communities. You can see what we look like by clicking the coordinates link.

Database of orgs relevant to longtermist/x-risk work

I just wanted to leave a note saying that I found this database useful in my work.

What are some artworks relevant to EA?

At Effective Giving Quest, we’re aware of several video and board games that are relevant to EA, either because they deal with the topic of EA directly or because the developers behind the games are EAs themselves.

We have not yet launched, and so are still in the process of standardizing a way for EA-friendly developers to commit to giving a set percentage of their profits toward EA causes. Once we do this, we should have a much more comprehensive list of relevant-to-EA video and board games that we can share with the EA community.

I'll come back to edit this post with a list of EA-relevant games after EGQ launches.

Effective altruism merchandise to advertise the movement

I just wanted to note that the link to Gleb's post from six years ago is an example of how not to do this sort of thing. His organization, Intentional Insights, actively harmed the EA movement rather than helped it.

I still think EA outreach via t-shirts is good — my favorite shirts are from GWWC, GW, ACE, and EA Global! — but communications is hard, and the way Gleb went about it was not good at all.

Announcing my retirement

Thank you so much for all your work managing the EA Forum. You’ve done an excellent job, and I’m sure that you’ll do many and varied good things at OpenPhil.

Also, we’re so excited to have you join Effective Giving Quest as our first partnered streamer! I’m really looking forward to what we can accomplish in the gaming space for effective altruism. (c:

Issues with Giving Multiplier

it is more important to build a robust movement of people collaboratively and honestly maximizing impact than it is to move additional money to even very good charities in the short term.


I generally agree with your sentiment here. However, I think the analysis changes when it comes to introducing people to EA.

Giving Multiplier doesn't target existing EAs with this offer. They're targeting people outside the movement who very likely are not yet familiar with EA ideas. Because donation matching is the norm in the fundraising industry, reaching that audience sometimes takes offering something like this kind of match. Then, once someone has given to an EA cause, they are far easier to convince to give to EA charities later on, and (presumably) they'll be more likely to become EAs themselves.

While I agree that this type of influence matching is ultimately misleading in the ways you describe, I don't think it's fair to call it illusory matching. It is illusory in the colloquial sense, but "illusory matching" as a term originated in GiveWell's 2016 post, where Holden specifically put forward the claim that this kind of influence matching is a type of non-illusory matching. He even suggests the very thing that Giving Multiplier is doing:

you should fight back by structuring your own influence matching – making a conditional commitment to the highest-impact charity you can find, in order to pull other dollars in toward it.

With all that said, I do think it makes sense for people who read this forum not to give through Giving Multiplier, nor to be much influenced by donation matching campaigns generally. But as a technique for reaching people new to EA, I don't think that legitimate non-illusory matching, in the form of coordination matching or influence matching, is necessarily bad.
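To make the illusory/non-illusory distinction concrete, here is a toy sketch in Python. Every number in it (donor counts, gift sizes, the matcher's budget) is hypothetical and made up by me for illustration, not Giving Multiplier's actual terms; the point is only that a genuinely conditional commitment moves money counterfactually, while a match the big donor would have made anyway does not.

```python
# Toy comparison of illusory vs. influence matching.
# All figures are hypothetical; nothing here reflects Giving Multiplier's real terms.

small_donations = 100 * 50   # 100 hypothetical small donors giving $50 each
matcher_budget = 5_000       # a hypothetical large donor's matching budget

# Illusory matching: the large donor was going to give their budget to the
# charity regardless, so the campaign's counterfactual impact is only the
# small donors' money that the campaign drew in.
illusory_impact = (small_donations + matcher_budget) - matcher_budget

# Influence matching: the large donor's gift is genuinely conditional on the
# campaign happening, and the offer pulls small donors toward a higher-impact
# charity they wouldn't otherwise have chosen, so both pools count.
influence_impact = (small_donations + matcher_budget) - 0

print(illusory_impact)   # 5000  -> only the small donors' gifts were moved
print(influence_impact)  # 10000 -> the matcher's dollars were moved as well
```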

It is generally misleading in the sense that only a careful read makes you realize they are doing something different from what their homepage naively suggests. But when it comes to an unsophisticated audience that is already used to the norm of illusory matching, I'm not sure it's fair to call someone like Giving Multiplier especially misleading just because their homepage doesn't go into all the details of how this type of matching works. If someone asks me where the nearest gas station is, I shouldn't be labeled as misleading just because I don't give significant detail on the best route there. I think it's okay to just point and say "five minutes that way". I feel the same about how Giving Multiplier has structured their homepage.

Honoring Petrov Day on the EA Forum: 2021

This is not how I understand the term. What you're describing is what I would call a "commitment". But a "precommitment" is stricter; the idea is that you have to follow through in order to ensure that you can get through a situation like Newcomb's paradox.

You can use precommitments to take advantage of time-travel shenanigans, to successfully one-box in Newcomb's problem, or to ensure that near-copies of you (in the multiverse sense) can work together to achieve things that you otherwise wouldn't.

With that said, it may make sense to say that we humans can't really precommit in these kinds of ways. But to the extent that we might be able to, we may want to try, so that if any of these sci-fi scenarios ever do come up, we'd be able to take advantage of them.
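For what it's worth, here's the standard back-of-the-envelope arithmetic behind "precommitting lets you one-box successfully", written out as a quick Python sketch. The 99% predictor accuracy and the usual $1,000,000 / $1,000 payoffs are just the textbook numbers, not anything from the post above:

```python
# Expected value of one-boxing vs. two-boxing in Newcomb's problem, assuming
# a predictor that correctly anticipates your choice with probability p.
# Standard textbook payoffs; purely illustrative.

def expected_value(one_box: bool, p: float = 0.99) -> float:
    if one_box:
        # The opaque box holds $1,000,000 only if the predictor foresaw one-boxing.
        return p * 1_000_000
    # Two-boxing always yields the $1,000 transparent box, plus the million
    # only in the unlikely case the predictor wrongly expected one-boxing.
    return 1_000 + (1 - p) * 1_000_000

print(expected_value(one_box=True))   # 990000.0
print(expected_value(one_box=False))  # 11000.0
```

The asymmetry only pays off if the predictor can rely on your disposition, which is exactly what a binding precommitment is supposed to supply.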

Honoring Petrov Day on the EA Forum: 2021

Don't forget that this is iterated, though. In order to save the site from going down a year from now, we might want to follow through on a tit-for-tat strategy this year.

I'm not certain that this is the correct play, but it is an important distinction from the usual MAD theorizing.
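To illustrate the iterated point, here is a minimal sketch with toy payoffs I made up (they are not part of the actual Petrov Day setup): a tit-for-tat player only eats one bad year against a defector, while two tit-for-tat players keep both sites up indefinitely.

```python
# Toy iterated game between the two forums. Payoffs are invented for
# illustration and are not anything the Petrov Day organizers specified.

COOPERATE, DEFECT = "cooperate", "defect"   # defect = "press the button"

PAYOFFS = {
    (COOPERATE, COOPERATE): 3,  # both sites stay up
    (COOPERATE, DEFECT):    0,  # our site goes down and we don't respond
    (DEFECT,    COOPERATE): 1,  # we press first; short-term gain, trust destroyed
    (DEFECT,    DEFECT):    1,  # mutual destruction
}

def tit_for_tat(opponent_history):
    """Cooperate in year one, then mirror the other side's previous move."""
    return COOPERATE if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return DEFECT

def play(strategy_a, strategy_b, years=5):
    """Return strategy_a's total payoff over several Petrov Days."""
    history_a, history_b, score_a = [], [], 0
    for _ in range(years):
        move_a = strategy_a(history_b)   # each side reacts to the other's past moves
        move_b = strategy_b(history_a)
        score_a += PAYOFFS[(move_a, move_b)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a

print(play(tit_for_tat, tit_for_tat))    # 15: both sites stay up every year
print(play(tit_for_tat, always_defect))  # 4: one bad year, then mutual retaliation
```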

Honoring Petrov Day on the EA Forum: 2021

On the one hand, in order for MAD to work, decision-makers on both sides must be able to make credible threats of a retaliatory strike. This is also true in this experiment’s case: if we assume that this will be iterated on future Petrov Days, then we must show that any tit-for-tat precommitments that are made will be followed through on.

But at the same time, if LessWrong takes down the EA Forum, it just seems like wanton destruction to take their site down in return. I know that, as a holder of the codes, I should ensure that I’m making a fully credible threat by precommitting to a retaliatory strike, but I want to take precommitments seriously, and I don’t feel confident enough to precommit to such an action.

After giving this much thought, I decided to present the perhaps-too-weak claim that if the EA Forum goes down due to a LessWrong user pressing the button, I may press in retaliation. While this is not an idle threat, and I am serious about potentially performing a retaliatory strike, I am falling short of committing myself to that action in advance. I give more of my reasoning in my blog post on this.

(Ultimately, this is moot, since others are already willing to make such a precommitment so I don’t have to.)
