Sanjay

Comments

Most research/advocacy charities are not scalable

When I started thinking about these issues last year, my thinking was pretty similar to what you said. 

I thought about it and considered that for the biggest risks, investors may have a selfish incentive to model and manage the impacts that their companies have on the wider world -- if only because the wider world includes the rest of their own portfolio!

It turns out I was not the first to think of this concept; it's called Universal Ownership. (I've described it on the forum here.)

Universal Ownership doesn't go far enough, in my view, but it's a step forward compared to where we are today, and it gives people an incentive to care about social impacts (or social "profits").

What EA projects could grow to become megaprojects, eventually spending $100m per year?

As I alluded to in a comment to KHorton's related post, I believe SoGive could grow to spend something like this much money.

SoGive's core idea is to provide EA-style analysis, but covering a much more comprehensive range of charities than those currently assessed by EA charity evaluators.

As mentioned there, benefits of this include:

  • SoGive could have a broader appeal because it would be useful to so many more people; it could conceivably achieve the level of brand recognition enjoyed by charity evaluators such as Charity Navigator, which has high brand recognition in the US (c.50%, with a bit of rounding).
  • Lots of the impact here is the illegible impact that comes from being well-known and highly influential; this could lead to more major donors being attracted to EA-style donating, or many other things.
  • There's also the impact that could come from donating to higher-impact things within a lower-impact cause area, and the impact of influencing the charity sector to have more impact.

Full disclosure: I founded SoGive.

This short comment is not sufficient to make the case for SoGive, so I should probably write up something more substantial.

Most research/advocacy charities are not scalable

I believe that in time EA research/analysis orgs both could and should spend >$100m p.a.

There are many non-EA orgs whose staff largely sit at a desk and who spend >$100m p.a., and I believe an EA org could too.

Let's consider one example. Standard & Poor's (S&P) spent c.$3.8bn in 2020 (source: 2020 accounts). They produce ratings on companies, governments, etc. These ratings help answer the question: "If I lend the company money, will I get my money back?" Most major companies have a rating with S&P. (S&P also does other things, like indices; however, I'm sure the ratings business alone spends >$100m p.a.)

S&P for charities?

Currently, very few analytical orgs in the EA space aim to have as broad a coverage of charities as S&P does of companies/governments/etc.

However, an org which did this would have significant benefits.

  • It would have a broader appeal because it would be useful to so many more people; it could conceivably achieve the level of brand recognition enjoyed by charity evaluators such as Charity Navigator, which has high brand recognition in the US (c.50%, with a bit of rounding).
  • Lots of the impact here is the illegible impact that comes from being well-known and highly influential; this could lead to more major donors being attracted to EA-style donating, or many other things.
  • There's also the impact that could come from donating to higher-impact things within a lower-impact cause area, and the impact of influencing the charity sector to have more impact.

I find these arguments convincing enough that I founded an organisation (SoGive) to implement them.

At the margin, GiveWell is likely more cost-effective; however, I'd point to Ben's comments about cost-effectiveness x scale in a separate comment.

S&P for companies' impact?

Human activity, as measured by GDP (for all that measure's flaws), is split roughly 60%(ish) by for-profit companies, 30%(ish) by governments, and a little bit by other things (like charities).

  • As I have argued elsewhere, EA has likely neglected the 60% of human activity accounted for by for-profit companies, and should be investing more in helping companies to have more positive impact (or to avoid negative impact).
  • The charity CDP spent £16.5m (c.$23m) in the year to March 2019 (source). They primarily focus on the question of what carbon emissions are associated with each company. The bigger question of how much overall impact is associated with each company would no doubt require a substantially larger organisation, spending at least an order of magnitude more than the c.$23m spent by CDP.

(Note: I haven't thought very carefully about whether "S&P for companies' impact" really is a high-impact project)

Part 3: Comparing agency organisational models

"An EA-specific agency would have to be low-bono, offering major discounts to EA orgs - otherwise it would be indistinguishable from the countless existing for-profit agencies."

I think this needs justification. I'm currently aching for a tech agency I would trust, and I'm happy to pay market rates to get a decent agency to implement some EA projects.

If you told me you had such an agency, and it was peopled with EAs, that would be even better!

If you told me that you needed to pay your developers decent salaries, I could cope with paying a small premium.

How to explain AI risk/EA concepts to family and friends?

Not sure how good the Robert Miles channel is for mums (mine might not be particularly interested in it!), but for communicating about AI risk Robert Miles is (generally) good, and I second this recommendation.

EA needs consultancies

Just a quick comment to say that SoGive would be well positioned to be another consultancy providing services like those Rethink provides.

We have collaborated with Rethink before (see this research) and are in moderately frequent informal contact with them.

We have c.10 analysts, a mixture of volunteers and staff. They are mostly volunteers, as the organisation is funded solely by me, and there is a limit to what I can afford.

I'm open to the idea of us doing more of this sort of work, although it would need a discussion before we commit to anything, as we already have a separate strategic focus in mind.

ESG investing needs thoughtful trade-offs

Thanks for this, good question!

I agree with your point that investors have some blind spots, in particular that some areas of finance are not good at incorporating long term considerations.

So I think you're right, the ESG concept probably could achieve some impact by helping address that sort of blind spot.

I probably should have said something more like: "To judge whether I, as someone working in ESG investing, am having material impact, we need to see if I'm actually having an influence in scenarios where there is a tension/trade-off." This is because ESG-related work is already working to address that sort of blind spot.

ESG investing needs thoughtful trade-offs

Sorry I didn't spot your comment earlier. Yes, more than happy for this to be shared more widely. Feel free to use this link if you wish: https://effectiveesg.com/2021/05/24/esg-investing-needs-thoughtful-trade-offs/ 

SoGive's moral weights -- please take part!

Thanks very much for pointing out that error -- now corrected. I've looked at the answers which have been recorded, and they include one with comments similar to those you made here, so I think yours has been recorded. Thank you very much!

The $100trn opportunity: ESG investing should be a top priority for EA careers

I have now expanded the acronym when it's used in the first sentence.
