Welcome to the seventh open thread on the Effective Altruism Forum. This is our place to discuss relevant topics that have not appeared in recent posts.


Here's giving a shout out to the following EAs who are running Christmas fundraisers through Charity Science:

  • Toby Pollock
  • Andrew Snyder-Beattie
  • Sam Dumitriu
  • Roxanne Heston
  • Greg Colbourn
  • Caleb Ontiveros
  • Michael Dickens
  • Frazer Kirkman
  • Joey Savoie
  • Signe and Jorgen
  • Xio Kikauka
  • Lucas Zamprogno
  • Joshua Jacobson
  • Maddy Skinner
  • Rory S
  • Giles Edkins
  • Lynn Savoie
  • Mark Savoie
  • Neelam Bhusal
  • Matthias Wasser
  • Adam Weiss
  • Callum Calvert
  • Jonathan Courtney
  • Cade Weinrauch
  • Andrew McKnight

I salute your fundraising efforts!

(If anyone would like to join in, you can do so with a few clicks via the links here.)

Sorry if this is the wrong place to do this, but I'm trying to encourage more people to give to AMF this year, so I created a $1,000 matching drive. Only the first $50 of any donation is matched, and it's meant to encourage people to give to AMF who otherwise wouldn't.

If you know people who you'd like to encourage to give to AMF, please share this link: $1,000 Matching Challenge: December 2014. Please share widely :)

Have you thought about joining all the people doing Christmas Fundraisers? Charity Science provides donation matching for all the donations made through those fundraisers, and the page is prettier. (Though the fee for processing a donation is a little higher, at 4% inclusive of other fees.)

Hello, I'd like to post an article but I need 10 karma to do so and I haven't been posting comments so I don't have any. If you don't mind up-voting this comment to allow me to do that, please do!

If it helps, the post I've drafted is about some of the altruistic eating arguments I've seen:

I use one neoclassical economic argument and one intuitive appeal to argue that these arguments could all be simpler and more accurate if they left out elasticities.

Happy to share the draft privately before it's posted if that affects your desire to up-vote this comment.

Thanks!

Hey Timothy. I can add admin privileges. Feel free to use Facebook to send me a draft.

I've been trying to work out how to sell EA in the form of a parable; let me illustrate with my current best candidate.

In a post-apocalyptic world, you're helping get the medicine that cures the disease out to the people. You know that there's a truck with the medicine on the way, and it will soon reach a T-junction. The truck doesn't know who is where and its radio is broken; you're powerless to affect what it does, watching with binoculars from far away. If it turns left, it'll be flagged down by a family of four and their lives will be saved. If it turns right, it'll be flagged down by a school where dozens of families with the disease have taken refuge.

Don't you find yourself fervently wishing the truck will turn right? It's not because the family's lives aren't worth saving; they are, and they all deserve to live. But it's clear that the better outcome is that it turn right.

So here are some things I like about this: it's not totally unfair. It's not just a choice between "save A" and "save A and B"; if you make the most effective choice, then some people die whom you could have chosen to save. And, weirdly, I think the reframing where you can't choose who gets saved and can only will the truck to make the right decision might help people see more clearly; you're not worried about guilt at not saving the family or anger at someone making the wrong moral choice, just watching a coin flip and discovering how you want it to land.

What I'd like to improve is to somehow make it more like an everyday situation rather than a super contrived one.

Any improvements? Does this seem like a useful exercise?

The exercise seems useful.

I agree that making it not a choice between A and A+B is fairer. Also, saying that they're a witness and can't actually make any decision might help with switching off guilt relating to a taboo tradeoff.

I agree that the problem is that the current example is too contrived, though I haven't yet thought of a more ordinary example. Scott Siskind's Arctic exploration analogy is the closest I know.

Thanks for the encouragement!

I wonder if you can do something with a different kind of disaster? Maybe make it a coach that can get people out of the danger zone? Or is that cheating because people don't want seats to be "wasted"?

Here's a continuation of this kind of discussion: The EA Pitch guide

[Give and receive free coding help]

Most coding advice forums expect a lot of searching before you ask questions. A few experienced EA coders, including myself, are happy to help EAs if they merely think asking will take less net time than (further) searching. The rationale is that this'll help EtG careers and EA projects. Join this mailing list if you're happy to give advice or interested in receiving it!

How do people see entrepreneurship as fitting into EA? Do you see it as purely an earning to give strategy, or do you see it as having significant other benefits? Are there for-profit entrepreneurial ventures that you regard as particularly EA?

If you're interested in this you may want to check out the EA Entrepreneurship Facebook group.

I think that there are pretty strong arguments that entrepreneurship is effective partly because of the actual value that the business creates. For instance, the paper Schumpeterian Profits in the American Economy estimates that entrepreneurs capture on average 1% of the value they create. If it costs about 100x as much to save a life in the US as abroad, that would suggest that half the benefit of EA entrepreneurship is realized through creating value in the US.

(Obviously there's a huge amount of variance here, depending on things like whether you're trying to capture value or trying to create value, and whether we believe Nordhaus's statistics. This is very back-of-the-envelope.)
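
To make the arithmetic explicit, here is a minimal sketch of that back-of-the-envelope calculation in Python. The 1% capture rate and the 100x cost-effectiveness ratio come from the comment above; every other number is an arbitrary illustration, not a claim about any real business.

```python
# Back-of-the-envelope sketch of the argument above (illustrative numbers only).
# Assumptions: the entrepreneur captures ~1% of total value created (the
# Schumpeterian Profits estimate quoted above), donates all captured profits,
# and donations abroad buy ~100x as much impact per dollar as value in the US.

total_value_created = 1_000_000      # arbitrary unit of US-equivalent value
capture_rate = 0.01                  # share of value captured as profit
effectiveness_multiplier = 100       # impact per dollar abroad vs. in the US

value_to_us_consumers = total_value_created * (1 - capture_rate)   # ~990,000
profits_donated = total_value_created * capture_rate               # ~10,000
impact_of_donations = profits_donated * effectiveness_multiplier   # ~1,000,000

share_from_us_value = value_to_us_consumers / (value_to_us_consumers + impact_of_donations)
print(f"Share of total benefit from value created in the US: {share_from_us_value:.0%}")
# -> roughly 50%, i.e. about half the benefit comes from the business itself
```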

For instance, the paper Schumpeterian Profits in the American Economy estimates that entrepreneurs capture on average 1% of the value they create.

Not that it matters for your argument, but I think the figure is closer to 2%.

(This must be the most nitpicky comment in the history of this site.)

A specific sub-question: does it make sense for EAs to fund (for-profit as opposed to non-profit) EA entrepreneurs?

Some thoughts on this:

  • There's a bonus in making a fellow EA rich, because they'll donate some of their riches.

  • However, I have a general sense that your investments should focus purely on growing your own capital, that you should separately optimise for moving as much money as possible to charity, and that these two things should be kept apart. This is merely a general sense rather than a fully considered position.

  • In an efficient market, non-EA venture capitalists would already invest in all and only those entrepreneurs whose ventures would in expectation yield positive returns (above those available from alternative investments). We'd want EA venture capitalists to invest in all and only those ventures. Of course, the VC market isn't perfectly efficient, but I'm not aware of any reason to think that EA VCs are better.

I haven't thought about this much at all, so these thoughts are all quite tentative.


In an efficient market, non-EA venture capitalists would already invest in all and only those entrepreneurs whose ventures would in expectation yield positive returns (above those available from alternative investments). We'd want EA venture capitalists to invest in all and only those ventures. Of course, the VC market isn't perfectly efficient, but I'm not aware of any reason to think that EA VCs are better.

EA VCs are going to assign different values to outcomes than a traditional VC would. For a traditional VC, a founder with a 1% chance of making a billion dollars is as good as a founder with a 1% chance of making a billion dollars and a 100% chance of donating most of that money to cost-effective charities. EAs value the second entrepreneur far more than the first.

So, in a perfectly efficient market, EA VCs would evaluate the risks of a startup the same as traditional VCs, but would assign different values to the startup's potential success. This might cause EA VCs to make investments that traditional VCs might not make. Two specific examples:

  1. Meal Squares. For a traditional VC, Meal Squares probably does not have the giant exit potential that they need. But for an EA VC, it has a strong possibility of generating lots of donations and philanthropic value. A traditional VC might pass on investing; an EA VC might want to invest.
  2. Wave. Wave creates value both by being a profitable business and by helping people send money from rich economies to poor ones. A traditional VC only values the profit potential; an EA VC values both the profit and the service the app itself provides.
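
To make that comparison concrete, here is a minimal toy expected-value sketch in Python. Only the "1% chance of making a billion dollars" figure comes from the comment above; the VC's expected return, the founder's donation share, and the rest are hypothetical numbers chosen purely for illustration.

```python
# Toy expected-value comparison (all numbers illustrative except the
# "1% chance of a billion dollars" taken from the comment above).
# A traditional VC cares only about its own financial return, which is
# identical for both founders; an EA VC also counts expected donations.

P_SUCCESS = 0.01                 # chance the founder makes a billion dollars
FOUNDER_PAYOUT = 1_000_000_000   # founder's payout on success
VC_EXPECTED_RETURN = 2_000_000   # assumed expected financial return to the VC (same for both)
DONATION_SHARE = 0.50            # assumed share of the payout the EA founder donates

def vc_valuation(donation_share: float, counts_donations: bool) -> float:
    """Expected value the VC assigns to backing this founder."""
    expected_donations = P_SUCCESS * FOUNDER_PAYOUT * donation_share
    return VC_EXPECTED_RETURN + (expected_donations if counts_donations else 0.0)

for label, share in [("non-donating founder", 0.0), ("EA founder", DONATION_SHARE)]:
    trad = vc_valuation(share, counts_donations=False)
    ea = vc_valuation(share, counts_donations=True)
    print(f"{label}: traditional VC sees ${trad:,.0f}, EA VC sees ${ea:,.0f}")

# The two founders look identical to the traditional VC, but the EA founder's
# expected $5,000,000 of donations makes that deal look far better to an EA VC.
```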

EA VCs are going to assign different values to outcomes than a traditional VC would. For a traditional VC, a founder with a 1% chance of making a billion dollars is as good as a founder with a 1% chance of making a billion dollars and a 100% chance of donating most of that money to cost-effective charities. EAs value the second entrepreneur far more than the first.

True, but it depends on how much money an EA founder would donate in expectation, and whether this outweighs better expected returns available from non-EA startups. Meal Squares sounds like it might be a good test case to consider.

As a consideration against, the Halo Effect might cloud EA investors' judgement of EA entrepreneurs' odds of success.

As a consideration for, there may be behaviours in the founder-VC relationship that negatively impact founders (this comes up in http://paulgraham.com/fr.html), such as investors holding off committing for as long as possible. EA VCs could try to bypass these to improve the odds of startup success.

A key advantage EA VCs have is the "your wins are my wins too" attitude that comes from shared goals. It doesn't matter who is driving more donations, as long as more donations are made. Therefore, the EA VC - EA entrepreneur relationship is even more positive-sum than a typical VC relationship.

The forum has now been online for several months. Which aspects have you found useful, and how can it be improved? Feel free to submit feature requests and bug reports as well.

I'm appreciating the threaded comments now more than ever! I think everyone being able to grill EA Outreach in public on their request for donations is a huge step for transparency.

Have you considered starting some Reddit-AMA-style threads? I think this would be pretty interesting for current EAs, and also (depending on the person) could drive substantial forum traffic from external sources the same way more general-interest posts might.

I haven't considered that before but it strikes me as a pretty good idea and probably one worth doing.

Let's do it! I want to ask Ben Kuhn questions! :)

What new companies ought we to create to make the world better? These could directly address world problems or could involve building skills and network in important areas, or could just be potentially lucrative.

Something in developing-world entrepreneurship that puts you in a good position to spot opportunities for, or carry out, other developing-world entrepreneurship.

A related issue is that EA Ventures will soon make funding available for impact-driven startups and projects. Its goals are to show EAs that funding is available and to connect good projects with donors. So which projects currently need funding most?

I've been asked about ways to contribute when travelling to poor countries a few times. Does anyone have any ideas or know of anything written on this? One possibility is using your trip to fundraise somehow, but I'm not sure how you'd do that.

Most charities ask people for fixed regular sums of money, e.g. $40 per month. Indeed, $40 per month sounds a lot less scary to me than 1%, even though it might amount to a similar annual sum (e.g. $40 per month is $480 per year, which is 1% of a $48,000 salary). So why do we ask people to give percentages in order to achieve group membership? Sure, percentages have some advantages because they are fairer, but do these outweigh the costs?

Isn't the message usually 'Just $40 per month will buy xxx and achieve amazing thing yyy, please donate'? I feel like the charities mostly want to talk about what they are doing and subtly suggest an amount at the same time. There's no real analogous statement they could make for percentages.

EA is in a somewhat different position as far as the messaging goes, since it puts the emphasis on the amount rather than on the activity. As it happens, 1% sounds less onerous to me than $40 per month.

I can think of three reasons. First, EA typically involves giving more as well as giving better. A fixed-amount group membership can be scarier if the cost is large, and at worst can become an exclusionary prohibition for some. Second, EA is as much a mindset and process as a goal. "Membership" is not the end point: unlike in many charities where group affiliation and signalling are the primary drivers, EA hopes to attract members who want to look at how they can do more good and what that means, and a percentage encourages active engagement in that respect. Third, a lot of EA outreach currently focuses on attracting members who do not necessarily have high earnings now but may have them in the future (e.g. college students at good universities). Asking for a future percentage seems a lot less intimidating to people with no assets than a large dollar figure, and allows immediate membership and buy-in (a reduced fixed amount would do so too, but it might create a more tiered system).

I agree with 3: percentages are good if people's incomes are unstable. 2, that percentages are good for continuous reflection on one's giving, seems plausible, although not entirely clear; if people subscribe to multiple regular donations, they could plausibly still reflect on which to increase or decrease. I don't really understand why you think 1, that percentages encourage people to feel confident about giving larger amounts. If percentages were so good for getting people to give more, then how come charities don't use them in their donor development?

Because charities target people as they are now, principally high-value individuals and those without existing commitments, for whom a fixed number is not a significant amount in percentage terms. By contrast, our young EA movement tends to target people with potential for growth and those interested in making a significant future sacrifice for non-signalling reasons. But if EA is to grow, a big challenge will be how to appeal to people who respond more to non-rational signalling motives than to EA rationalist ones.

There was an interesting interview with Dean Karlan from Innovations for Poverty Action on PBS. Some excerpts:

Dean Karlan: I divide my thinking into short-run and long-run. For the long-run, I want to invest. I want to think about how we get better information, better evidence [for what we give], because we need to help future children, not just today’s children. In fact, there are many more children in the future than today so I put more money into the long-run. And for that, I support Innovations for Poverty Action which is doing research to figure out what works and what doesn’t.

But, I also do care about the short-run. So I also look at some of the things that we do have strong evidence on — where there have been really good tests to show that an idea works, and here’s a charity that’s doing it, so let’s support it.

This year there are three charities that stood out in the work that I was doing with my family, trying to think through what to support. One is called Trickle Up, which works a lot in West Africa and Latin America. A second is Seva Mandir which does work in India.

And a third is Evidence Action, an organization just started a year ago that is strictly committed to scaling up ideas that have been shown to work using randomized trials. They are doing two things right now. One is a de-worming program. This is a school-based de-worming program. De-worming, as in children who have intestinal worms, which gives them a distended belly and makes them lethargic and sick and not able to go to school. Taking a pill has been shown not only to improve their school attendance, but 10 years later, improve their income.

A second is a chlorine dispenser. . .

There are also some interesting findings on savings and crop insurance instruments that they discuss:

So, the striking insight was that it’s not that people weren’t credit constrained, but they were very risk-constrained, and there was a lot of under-investment going on simply because of risk. And this suggests that a scaled-up micro-insurance product that helps people absorb rainfall risk would be a very successful thing.

The host seemed convinced at the end, and decided to donate 10% to Dean's suggested charities.

Can someone explain the game theory of donation matching in a couple of paragraphs? In particular, if I'm a savvy donor, should I ignore the concept altogether, or should I only donate when my donation will be matched by somebody?

GiveWell has a blog post that I think covers it pretty well.

I generally try to ignore all donation matching, even "true" donation matching. But the lay public gets really excited about donation matching, so it's important if you intend to fundraise.
