

A confusion is introduced in the quoted passage by the shift from the personal to the general.  You personally cannot lose more than all your assets, because of bankruptcy.  But bankruptcy just shifts any further losses to your creditors, so once we shift to thinking about global benefits and harms, the loss is no longer capped in that way.  

On the object level, there now seem to be two roughly equal clusters.  Cluster A are long-termist, consider EA epistemically humble,[1] don't think we should blame ourselves for the FTX farrago and are unenthusiastic about democratisation.  Cluster B take the opposite positions.

On the meta level:

  • It seems a bit concerning that the OP stresses the poll shouldn't be used to say X% of EAs think Y, but the first item of the report is a list of statements where 60% of respondents either agreed or disagreed.[2]  It seems that it would be easy to take these as consensus positions, when (a) the views of the 220 respondents may not be representative and (b) 60% isn't much of a consensus.  It will be interesting to see, for example, whether people assert that this poll shows wide support for "EA leaders should be transparent about what they knew about SBF and when".
  • Many of the "majority" statements seem anodyne.  For example, it's not particularly interesting to highlight (as the report does) that most people agree the top team at FTX made poor decisions.
  • The report doesn't do a particularly good job at identifying "statements which make this group unique".  It includes statements which majorities in both groups agree with.  It highlights the statement, "I don't believe in longtermism and I don't feel like I can mention that to other EAs" rather than the more basic, "I think longtermism is true".  It doesn't clearly identify the difference in views on democratisation: e.g. the disagreement on the statements "There should be a community-elected board in EA", "'EA should democratise', without any detail on how this could happen, or would have prevented FTX, is just an applause light"[3] or "Democratic and transparent processes of decision making are better for both quality and oversight."
  • The "areas of uncertainty" mostly identifies statements which in retrospect are unclear or inapplicable to most respondents, rather than any fundamental uncertainty.  Possibly it would help to distuinguish "pass" from "unsure", with a "pass" not being counted as an answer at all.
  • I can't understand the graph at all.  What are the axes?  What are the rings?  The graph doesn't show two clusters.  If anything, there's one cluster in the centre, although that could conceivably be an artifact of the inner circle having a smaller area.  How can participants be positioned "close" to statements they agreed with, when agreement is 0/1?
  • OP says in a comment here that there are now too many statements, but the setup encourages respondents to submit statements.  If anything, it seems surprising that 220 respondents only submitted 185 statements.  Ideally, the poll would have many more respondents, so this seems to highlight a problem with the system.
  1. ^

    The most relevant statement here reads, "I think EA is epistemically humble but external people don't seem to think so."  It seems likely to me, based on other responses, that the disagreement stems from the first part, but of course a person who believed EA to be universally acknowledged as epistemically humble would also disagree.  In retrospect, it would have been better to have "EA is epistemically humble" and "External people don't consider EA to be epistemically humble" as separate statements.

  2. ^

    For some reason "EA should not aim to be a community" is also appearing here although only 55% of respondents disagree.

  3. ^

    Of course cluster B does have at least one concrete proposal, namely a community-elected board.  It's just that cluster A thinks that would be bad.

It seems worth noting that UK employment law has provisions to protect whistleblowers and for this reason (if not others) all UK employers should have whistleblowing policies.  I tend to assume that EA orgs based in the UK are compliant with their obligations as employers and therefore do have such policies.  Some caution would be needed in setting up additional protections, e.g. since nobody should ever be fired for whistleblowing, why would you have a policy to support people who were?

In practice, I notice two problems.  Firstly, management (particularly in small organisations) frequently circumvent policies they experience as bureaucratic restrictions on their ability to manage.  Secondly, disgruntled employees seek ways to express what are really personal grievances as blowing the whistle.

My reading (and of course I could be completely wrong) is that SBF wanted to invest in Twitter (he seems to have subsequently pitched the same deal through Michael Grimes), and Will was helping him out.  I don't imagine Will felt it any of his business to advise SBF as to whether or not this was a good move.  And I imagine SBF expected the deal to make money, and therefore not to have any cost for his intended giving.

Part of the issue here is that people have been accounting the bulk of SBF's net worth as "EA money".  If you phrase the question as "Should EA invest in Twitter?" the answer is no.  EA should probably also not invest in Robinhood or SRM.  If SBF's assets truly were EA assets, we ought to have liquidated them long ago and either spent them or invested them reasonably.  But they weren't.

I don't think so: the "backdoor" refers to the internal accounting system.  My reading is that this refers to SBF being able to alter the software to make it display fake figures (whether or not that's true), and I think that could be accomplished by something like admin access.

I am also confused by the suggested outcomes, e.g. I don't see how Russian victory could be either analogous to Kosovo or accurately described as "breakaway succeeds", since Putin has now abandoned the pretence of seeking to support DPR/LPR as independent states.  I would suggest the following are rough possible outcomes:

  1. Russia, Ukraine and most other states cease to exist following widespread nuclear war.
  2. Russia annexes Ukraine.
  3. Russia annexes Ukraine east of the Dnipro, establishes a puppet state in the remainder.
  4. Russia holds (and Ukraine cedes) the territory it has already annexed.  Likely unstable.
  5. War reaches stalemate, but continues indefinitely.  Strictly speaking not an outcome, as the war will end eventually, but I'm envisaging these as outcomes within the next few years.
  6. Independent Donbas with security guaranteed by Russia?  Previously seemed to be a possible outcome, but not sure how we could get there from here.
  7. Restoration of the status quo ante bellum.  Likely unstable, as we've just seen.
  8. Russia withdraws from eastern Ukraine; Ukraine cedes Crimea.
  9. Russia withdraws from all of Ukraine.

I think it's clear that at the start of the war, Ukraine would have viewed 7 and certainly 8 as victories, but now only views 9 as a victory.  Russia is hard to read, but I would guess that something like 3 was the initial war aim and that they would now be content with 4.

The question is, is there a deal there?  E.g. would the belligerents settle for 7?  At present, I think the answer is a firm no from both sides, so the war continues.  Both sides currently believe they can do better on the battlefield than the best deal they could achieve by a negotiated peace and the war will continue until that is not the case.

The question at hand is, can Putin improve his position by using a nuclear weapon?  This I think is where OP goes wrong.  Outcome 1 is not a good outcome for Putin, but Putin calculates just as we do that the use of a nuclear weapon in Ukraine has a high probability of leading to an escalation spiral and then outcome 1.  Another possibility would be that Russia folds in the face of NATO escalation, in which case outcome 9 occurs, which is also bad for Putin.

The only "good" possibility is that NATO declines to respond, but that would likely only be the case if the nuclear weapon use were relatively inconsequential.  But if it was inconsequential, it would also fail to alter the course of the war.

Also, Putin is probably genuine (though mistaken) in considering Ukraine to be an integral part of Russia, and he likely wouldn't want to nuke sites he considers culturally important to Russia.  Ruling over the smouldering ruins of Kyiv is probably not an outcome he favours.

OP says that Putin personally wouldn't survive outcome 9, but why should this be?  He successfully ruled Russia without Crimea for many years.  Either he rules Russia with a hand of iron or he doesn't.  If he does, he can survive withdrawing to the internationally recognised border.  If he doesn't, he's probably toast already.  Either way, he isn't actually improving his position by using a nuclear weapon.

In conclusion, I don't think there is a rational use case for nuclear weapons here.  The risk is that Putin may behave irrationally, and for that reason I put the risk at around 5%.  I concur in OP's calculations after that point.

Outside view: OP says <10% chance Putin would accept losing without first going nuclear, but also says, "there appears to be a widespread assumption in the West, shared by Ukrainian leaders, that Ukraine is winning and that Putin will grudgingly accept "Vietnam"."  OP gives no sufficient reason to prefer his own analysis to that of Western and Ukrainian leaders.

I don't for a moment think that you are a con artist.  I suspect that (a) the amounts involved are all small, (b) you genuinely believe that all the grants are effective, (c) mostly you're right but (d) occasionally, in ordinary human frailty, your judgment errs.  If that's right, then no real harm is done, but I have no way of verifying any of that, because you don't (as far as I can see) disclose any information at all about the unreported grants.

I have to say also that you sound like you're saying that you refuse to comply with the law, but as far as I can see, Effective Ventures does in fact comply with the law and publishes a list of its grantees (save for individuals and those receiving grants of less than £25k) within its Trustees' Report.  But that seems to create a different problem, because the post above ought to make clear that organisations seeking grants in excess of £25k will not be able to remain anonymous, because the grant will be disclosed in the annual accounts (although I believe there is a "serious prejudice" exception).

Well, I think in the past the managers might have used it to fund things that they thought were good but didn't fit with the main GiveWell recommendations, but now that GiveWell has the "All Funds" option I'm not sure what would differentiate that from GHDF; it may be that it's just a presentational difference.  I'm 85% sure that the GHDF grants are exactly those which are displayed in the GiveWell spreadsheet as having been made via GHDF, but I'm just slightly worried by the notice that payout reports are optional.

Just FYI, I personally don't donate to LTFF, so while I have some general concern that charitable funds should be spent in an accountable way and that GWWC donors should feel comfortable with how their money is being spent, my personal concern is with GHDF.  If the issue of public reports being optional applies only or mostly to LTFF and EAIF, perhaps that could be clarified?

Luke Freeman did recently email me offering to chat, so I guess I could ask him.  I think I can personally solve the problem by redirecting my donations to GiveWell, but I can't be the only person who's troubled by this.

GWWC's effective charity recommendations page states, "For most people, we recommend donating through a reputable fund that's focused on effectiveness."  There follows a list of 8 funds, of which the first 4 are EA Funds.

If your view as EA Funds lead is that EA Funds are only suitable for donors who personally trust the judgment of your fund managers, then something seems to have gone wrong with the messaging, because "most people" won't be in a position to form a view on that.

I also note that none of the funds list under "Why you might choose not to donate to this Fund" that the fund may not account for its donations, which I suspect (as your comment implies) would be a highly material factor to at least some donors.  The EA Infrastructure Fund does indicate that a potential donor might not donate if they have concerns about grantmaker independence, but that's not quite the same point, and there's no similar warning for the other funds.

The difficulty here is that you understand EA Funds as existing for a narrow set of donors (those who are in a position to assess the trustworthiness of individual fund managers).  That may well be a sensible thing to exist, but the funds are being marketed as suitable for a much wider class of donors ("most people").
