
Disclaimer: we have no legal background, so while we suspect both of these activities to be illegal, we would welcome clarification from an EVF employee explaining why one or both are not.

1. Ignoring conflict of interest/giving a trustee direct and indirect benefits

UK charity law states that 'trustees cannot receive a benefit from their charity, whether directly or indirectly, unless they have an adequate legal authority to do so'*. At virtually every EAG and EAGx, Effective Ventures distributes large numbers of copies of What We Owe the Future and/or Doing Good Better, presumably bought with Effective Ventures money. Both books are authored by William MacAskill, a trustee of Effective Ventures, who would have benefited directly from royalties received on each sale**, and whose academic career almost certainly benefited indirectly as a result (less concretely, the tendency of EAG events to regularly invite him as the keynote speaker has probably conferred a substantial indirect benefit).

* We are unclear what 'legal authority' implies (it's not defined elsewhere on that page), but from context it seems likely to relate to a technical benefit, where e.g. they needed to take custody of money to pass it on to someone else. We doubt it applies here.

** As we understand it, MacAskill donates all the proceeds from sales of his book to effective charities. The claim we make is not that MacAskill is an immoral human, but that EVF/he has continually acted illegally in this respect.

2. Retaining data against the requirements of GDPR

In her recent Vox article, Carla Cremer claims that

in 2019, I was leaked a document circulating at the Centre for Effective Altruism, the central coordinating body of the EA movement. Some people in leadership positions were testing a new measure of value to apply to people: a metric called PELTIV, which stood for “Potential Expected Long-Term Instrumental Value.” It was to be used by CEA staff to score attendees of EA conferences, to generate a “database for tracking leads” and identify individuals who were likely to develop high “dedication” to EA — a list that was to be shared across CEA and the career consultancy 80,000 Hours. There were two separate tables, one to assess people who might donate money and one for people who might directly work for EA.

Individuals were to be assessed along dimensions such as “integrity” or “strategic judgment” and “acting on own direction,” but also on “being value-aligned,” “IQ,” and “conscientiousness.” Real names, people I knew, were listed as test cases, and attached to them was a dollar sign (with an exchange rate of 13 PELTIV points = 1,000 “pledge equivalents” = 3 million “aligned dollars”)

... When I confronted the instigator of PELTIV, I was told the measure was ultimately discarded. Upon my request for transparency and a public apology, he agreed the EA community should be informed about the experiment. They never were.

From what we understand of GDPR, if true, this probably would have violated all of its 'data subject rights', such as 'the right to be informed' and the right of subjects to access their data. It's unclear to us whether consent would have been required in this situation, but if so it was evidently not obtained. It would probably also have violated the requirement that data be processed 'for a specific purpose'. It might also have qualified as 'high-risk data' on the grounds that it involved 'Systematic and extensive profiling'. We do not know the implications of this, but presumably it would have required a higher level of justification and/or implied a more serious offence when the other requirements were breached.

Given that the measure was discarded and we don't know how long it was actually used for, perhaps the law was never in practice broken - nonetheless, if Cremer's account is accurate, it seems that there was clear intent to do so.

 


 

Comments (39)



Object level take: IANAL, but I'd be surprised if the first one of these is illegal. It doesn't even seem ethically questionable to me; I expect very many charities carry out similar activities without issue.

Regarding the second, if this level of "data collection" is illegal, then virtually every group or organization that exists is breaking that law. (It might still be illegal, though -- the GDPR is a very badly written law.)

Speaking with a legal background:

You should probably do a bit more research than a single copy+paste from a summary on a gov website. This is very, very bare bones - something I wouldn't accept from a term 1 student paper (for example). Consulting a professional is obviously preferable but even without that, actually looking at any relevant Act or legal decisions would dramatically improve this. I think it is incredibly poor to post this with such little investigation.

Also, whistleblowing (as your name suggests) can be done internally within the org, raised to a superior within the org, and so forth - doing so on a public forum is not necessarily whistleblowing that is protected by law. It all depends on what avenues are available (such as if you've been ignored, or fear reprisals) and what you have tried elsewhere.

Disagree-voted (but not downvoted). I think consulting a professional is a standard so high, and costly, that we should not even imply that it's a necessary condition, or even an expectation, before raising a concern. I think it's pretty important that raising a concern should not cost money.

I don't know whether a term 1 student paper is an appropriate standard, and I'd guess most other readers wouldn't either, so it doesn't seem like helpful guidance to me.

I do think the concerns in the OP fall below some minimum standard of due diligence and consideration to prevent wasting time and attention, but I personally would not want to set that standard too far above this post, or real red flags will go unreported, simply because the reporter can't or won't do the work to make them bulletproof.

Hi, thanks for raising these questions. I wanted to confirm that Effective Ventures has seen this and is looking into it. We take our legal obligations seriously and have started an internal review to make sure we know the relevant facts.

Have you spoken to any lawyers about this? This seems like important due diligence that I would personally want to carry out before posting something like this.

Partial agree on this. I am hesitant to put the burden of paying for legal consultation on someone with a concern. However, at least on the first point, I think the poster would have discovered more specific guidance for transactions that affect trustee/director financial interests had they conducted sufficient research. Also, I think contacting the organization using a burner e-mail and giving them a few weeks for a response would have been appropriate for someone who didn't want to incur legal fees on a concern that is not time-sensitive.

Fair enough -- appreciate the nuanced take as always!

I think a crux here is the extent to which the post is an allegation versus a question. If it's an allegation, then I agree it should be rigorously supported, which probably requires legal input.

Technically, the phrasing in the disclaimer makes it clear this is a question. I don't think the tone throughout the piece makes that clear enough though -- at least, not for my tastes.

Having said all that, overall, I do want EA to be a place where people can pose challenging questions like this. And I wouldn't want us to censure posts like this just because the tone wasn't right.

"The claim we make is... that EVF/he has continually acted illegally in this respect."

This is plainly an allegation.

To me, this reads pretty clearly as an allegation, and I think not checking with a lawyer in that case is irresponsible.

I agree with Richard and Will's comments that the tone of the post is very allegation-y (and not very question-y). In light of this, I've edited my comment so that it ends with "the tone wasn't right" instead of "the tone wasn't quite right".

In an ideal world we would have, but hiring a professional would take more money than we have, and speaking to a volunteer would run the risk of doxxing us.

We think the relative downside of posting this speculatively is low. If it turns out our suspicions are wrong, then it will be easy for someone with a legal background to explain why, and this post will be relegated to a graveyard, where it would then deserve to be.

If it turns out that our suspicions are founded, then we believe this is important to highlight, in the context of all the recent discussion about Effective Ventures's past activities.

As has been noted many times recently, it's much easier to make anonymous allegations that stick than it is to protect oneself from reputational damage from such allegations. Given this, failing to do basic due diligence to check whether your allegations are founded before making accusatory public posts seems frankly irresponsible to me.

I would guess that if this allegation turns out to be totally unfounded, it won't stick, and we'll never hear from it again. If it turns out to stick despite being unfounded, my guess is it'll be because the concern was substantive enough that cursory due diligence wouldn't have found the problems with it.

There's a good chance I'm being naive here, but I think reputational concerns are often overstated.

That said, I agree that making accusatory public posts without basic due diligence is irresponsible; I have a lower impression of the costs than you seem to, but the time and attention costs are certainly still real.

On the first point -- at least in the US, there are rules about organizations engaging in transactions where a trustee/director has a financial interest. Generally, the involved trustee/director needs to recuse, and the remainder of the board needs to make certain findings. Not sure how this works when the decision to make a purchase is delegated to staff. There's nothing in the first point that makes me concerned. 

However, there have been shady book purchases by other (non-EA) non-profits -- so it's not an unreasonable thing to ask questions about book purchasing (although making accusations out of the gate is too strong). For instance, I've heard of US churches buying up a ton of their lead pastor's book in the first week of release to make it hit the US best-seller lists for a week; the specific way in which the books were purchased was inefficient/costly and designed to evade rules intended to keep bulk purchases from unduly affecting the best-seller lists. In that case, the church's action created an improper private benefit in my opinion (with a possible exception if the royalties for the book went to the church rather than the pastor). Again, I don't see any reason to think EVF or anyone else in EA has engaged in this sort of behavior.

I can't speak to the letter of the law, but there are obviously good reasons for CEA  to distribute the best books that have been written on EA (which certainly include Will's books), and to have Will as a keynote speaker.  So the law would clearly be messed up if it prohibited this: it would effectively mean that charities cannot include relevant "thought leaders" amongst their trustees.  In such a case, I would think any sensible person should agree to the need to reform the idiotic law, rather than zealously enforce it.

But hopefully the law is more sensible than the authors of this post suggest.

Let me separately register that I think it reflects poorly on the authors of the post that they didn't pause to acknowledge this commonsense point.  Zealous legalism is pretty unpleasant to be around, and I would hate for this sort of thing (more witch-hunt than whistleblowing, IMO) to become a standard part of EA culture.

Whistleblowers should try to expose wrongdoing, not legalistic "gotchas".  Headline references to "breaking the law" strongly implicate that one is talking about serious, morally justified laws (like against murder, fraud, etc.).  But I recall reading that there are so many obscure and arbitrary laws on the books that probably anyone has unwittingly broken dozens of them without ever realizing it.  So if you're going to go after people for doing nothing wrong but for making themselves vulnerable to legalistic coercion, I think it's important to be clear on this distinction.

At least in the US, these rules are reasonable and exist for a reason -- see my top-level post for an explanation of how non-EA organizations have done shady stuff with buying an insider's books. Although I don't like the tone of the original post, I also don't think it is appropriate to imply that rules against private gain for charitable trustees are "obscure and arbitrary" -- material on that topic is, or at least should be, in Being a Trustee/Director 101.

My read of the article is that it is alleging incompetence and/or lack of regard for laws rather than alleging wrongdoing. I'm a trustee of a number of UK charities myself and the Charity Commission sends all trustees basic information on managing conflicts of interest and data protection. They are by no means "obscure and arbitrary" and I think we as a community need to be extra careful to comply with the letter and spirit of every law given the recent FTX events.

This seems like an incredible attitude. 

There are many laws we would contest, but the job of organizational leaders includes awareness of the laws pertaining to their activities and staying on the right side of them. If those laws are bad, then campaigning to change them is reasonable - ignoring them is not.

Furthermore, it's far from clear there's anything wrong with these laws, let alone that they're 'idiotic'. They exist to prevent all manner of scams, tax frauds and self-serving ways for charity trustees to benefit at taxpayers' expense. One of the lessons from FTXgate (actively promoted by MacAskill) was supposed to be that EA organisations that consider themselves above the law for the greater good are extremely dangerous.

It is literally impossible for a charity to follow the law as you have described it, because doing any good whatsoever under your own name opens you up to claims that you have benefited in some material sense, whether that be financially or professionally or reputationally. Charities are well known to employ people and directly pay them for services rendered, so without being a UK lawyer I don't know what kind of additional context is required for such a case to be prosecutable in practice, but it's certainly a nonzero amount. Granted this sort of completely unpragmatic interpretation of the law, you are probably breaking similar laws yourself, because there is simply too much law to support following a layman's interpretation of all of it.

This is a reality of the world you live in - the one with dozens of countries with hundreds of thousands of pages of law on the books, many designed explicitly to give as much discretion to prosecutors as possible - and so this "attitude" you describe (where people remain willing to do ethical things that prosecutors will not actually prosecute them for) is the only way to live. That is of course unless you're willing to single out particularly high-profile organizations - then you can pretty much accuse anybody of breaking the law in some country or another.

IANAL, so I will just quote another government website, but I would be very surprised if accusation 1 holds any water. This was not difficult to find at all; also, it seems a bit odd to first admit to not understanding what "legal authority" means but to bring the accusation forward anyway.

" A charity can pay a trustee for the supply of any goods or services over and above normal trustee duties. The decision to do this must be made by those trustees who will not benefit. [...].

Examples of goods or services that may be provided by a trustee in return for payment under the power in the Charities Act include:

  • the delivery of a lecture
  • a piece of research work...
  • the occasional use of a trustee’s premises or facilities

"
https://www.gov.uk/government/publications/trustee-expenses-and-payments-cc11/trustee-expenses-and-payments#s5-3

I think it's important to distinguish between morality and legal compliance. I don't think anybody involved here was immoral, but it sounds like there are questions to answer about whether CEA / EVF acted illegally (whether through deliberate decision or lack of competence). Hopefully they will be answered quickly and conclusively, so we can all move on.

On the book deal, a good way to structure this (i.e. achieving the same objective, but in a legally compliant way) would perhaps be for CEA / EVF to own the copyright. In that way, they can receive all the proceeds or structure the deal such that all proceeds get diverted elsewhere. This might well have been what they did.

If it was indeed structured with Will receiving money and then donating it onwards, then at a minimum he should have recused himself from the decision-making on that issue (which he might well have done) but probably should have stood down as a trustee because of its materiality. I don't think it's right to focus just on the books actually purchased by EVF. If a charity spends money promoting a trustee's book, that trustee is receiving a direct benefit from that charity and should stand down if it is material.

I imagine EVF owning the copyright would prevent Will from benefiting from any sales of the book, including ones by unrelated third parties.

Correct, but if he intends to give away 100% of the proceeds (and presumably considers EVF effective), it has the same effect.

My comment was not to say “he should do x” but instead say “if he intends to do x and remain a trustee, this is a good way to structure it”.

"Mom, can we have more EA whistleblowers"
"No, we have EA Whistleblowers at home"

