All of Agrippa's Comments + Replies

Given your position, I am concerned about the arms-race accelerationism messaging in this post. Substantively, the major claims of this post are "China's AI progress poses a serious threat we must overcome via AI progress (that is, we are in an arms race)" and "society may regulate AI such that projects that don't meet a very high standard of safety will not be deployable". The argument is that pursuing safety follows from these premises, mostly the latter. 

This can be interpreted in a number of ways, charitably or uncharitably. Independent of that, I do... (read more)

Hmm. I think if I had been in an abusive situation such as the ones OP describes, and I (privately) went to the Community Health team about it, and the only outcomes were what you just listed, I would have considered it a waste of my time and emotional energy. 

Edit: waste of my time relative to "going public", that is.

We were familiar with many (but not all) of the concerns raised in Ben’s post based on our own investigation.

What happened as a result of this, before Ben posted? 

3
Catherine Low
7mo
Hey Agrippa, this comment provides a partial answer. 

Thanks for writing, I hope things change. 
PS: I think the name "Ratrick Bayesman" will live in my head for at least 5 years

Yeah. (As a note, I am also a fan of the animal welfare stuff.)
This is a good suggestion. 

I think most of this stuff is too dry to hold my attention by itself. I would like a social environment that was engaging yet systematically directed my attention more often to things I care about. This happens naturally if I am around people who are interesting/fun but also highly engaged and motivated about a topic. As such I have focused on community and community spaces more than, for example, finding a good randomista newsletter or extracting randomista posts f... (read more)

From private convos I am pretty sure that the tweet about Mike Vassar is in reference to this: https://forum.effectivealtruism.org/posts/7b9ZDTAYQY9k6FZHS/abuse-in-lesswrong-and-rationalist-communities-in-bloomberg?commentId=FCcEMhiwtkmr7wS84 (which is about Mike Vassar, not Jacy).

There may or may not be other things informing it, but it's not about Jacy.

3
Ivy Mazzola
1y
Hmmm, this is why I originally had a long paragraph about where I think the misunderstanding occurred. I think that someone serving as a regranter is not the same thing as being given $50K.

"It doesn't exist" is too strong for sure. I consider GiveWell central to the randomista part and it was my entrypoint into EA at large. Founder's Pledge was also pretty randomista back when I was applying for a job there in college. I don't know anything about HLI. 

There may be a thriving community around GiveWell etc. that I am ignorant of. Or maybe if I tried to filter out non-randomista stuff from my mind, then I would naturally focus more on randomista stuff when engaging with EA feeds. 

The reality is that I find stuff like "people just doing AI ca... (read more)

I can certainly empathize with the longtermist EA community being hard to ignore. It's much flashier and more controversial.

For what it's worth I think it would be possible and totally reasonable for you to filter out longtermist (and animal welfare, and community-building, etc.) EA content and just focus on the randomista stuff you find interesting and inspiring. You could continue following GiveWell,  Founders Pledge's global health and development work, and HLI. Plus, many of Charity Entrepreneurship's charities are randomista-influenced.

For exampl... (read more)

17. I get a lot of messages these days about people wanting me to moderate or censor various forms of discussion on LessWrong that I think seem pretty innocuous to me, and the generators of this usually seem to be reputation related. E.g. recently I've had multiple pretty influential people ping me to delete or threaten moderation action against the authors of posts and comments talking about: How OpenAI doesn't seem to take AI Alignment very seriously, why gene drives against Malaria seem like a good idea, why working on intelligence enhancement is a good

... (read more)
  • I think Doing Good Better was already substantially misleading about the methodology that the EA community has actually historically used to find top interventions. Indeed it was very "randomista" flavored in a way that I think really set up a lot of people to be disappointed when they encountered EA and then realized that actual cause prioritization is nowhere close to this formalized and clear-cut.

I feel like I joined EA for this "randomista" flavored version of the movement. I don't really feel like the version of EA I thought I was joining exists even ... (read more)

Just curious - do you not feel like GiveWell, Happier Lives Institute, and some of Founders Pledge's work, for example, count as randomista-flavoured EA?

My critique seems resilient to this consideration. The fact that managers do not publicly criticize employees is not evidence of discomfort or awkwardness. Under the very obvious model of "how would a manager get what they want re: an employee", public criticism is not a sensical lever to want to use.

6
Linch
1y
I dunno, a fairly central example in my mind is if an employee or ex-employee says mean and (from your perspective) wrong things about you online. Seems like if it wasn't for discomfort or awkwardness, replying to said employee would otherwise be a pretty obvious tool in the arsenal. Whereas you can't fire ex-employees, and firing current employees is a) just generally a bad move and b) will make you look worse.

There would still be zero benefit to publicly criticizing in the case you are describing. 

Relatedly, there’s far more public criticism from Google employees about their management than there is from their management about their employees. This plays out on a lot of levels.


The nature of A having power over B is that A doesn't need to coordinate with others in order to get what A wants with respect to B. It would be really bizarre for management to publicly criticize employees whom they can just fire. There is simply no benefit. This explains much more of the variance than anything to do with awkwardness or "punching down".

7
Linch
1y
Standard management advice is that managers fire employees too slowly rather than too quickly. 

I agree that management doesn't get much benefit by giving valuable public negative feedback to people. However, I'd push back on the idea that management can "just fire" people they don't like.

Many managers are middle managers. They likely have a lot of gripes with their teams, but they need to work with someone, and often, it would be incredibly awkward or controversial to fire a lot of people.  

Nice try -- I like your on-the-nose username

As somebody in the industry, I have to say Alameda/FTX pushing MAPS was surreal and cannot be explained as good-faith investing by a competent team. 

As far as I can tell, there is no reason to condemn the fraud, but not the stuff SBF openly endorsed, except that the fraud actually happened and hit the "bad" outcome. 

From https://conversationswithtyler.com/episodes/sam-bankman-fried/

COWEN: Okay, but let’s say there’s a game: 51 percent, you double the Earth out somewhere else; 49 percent, it all disappears. Would you play that game? And would you keep on playing that, double or nothing? 
BANKMAN-FRIED: With one caveat. Let me give the caveat first, just to be a party pooper, which is, I’m assuming these are noni... (read more)

2
Rebecca
1y
That’s so interesting. I listened to this interview but don’t remember this answer; I don’t know if I stopped paying attention or just didn’t find it noteworthy. Definitely something to reflect on if it’s the latter.

I have to say I didn't expect "all remaining assets across the FTX empire 'hacked' and apps updated to have malware" as an outcome. 

2
Sharmake
1y
Or more sinisterly, he hacked it himself and is trying to steal all of his customers' money.

(As an aside, it also seems quite unusual to apply this impartiality to the finances of EAs. If EAs were going to be financially impartial, it seems like we would not really encourage trying to earn money in competitive, financially zero-sum ways such as a quant finance career or crypto trading.)

Seriously, imagine dedicating your life to EA and then finding out you lost your life savings because one group of EAs defrauded you and the other top EAs decided you shouldn't be alerted about it for as long as possible, specifically because it might lead to you reaching safety. Of course, none of the in-the-know people decided to put up their own money to defend against a bank run; they just decided it would be best if you kept doing so. 

In that situation I have to say I would just go and never look back. 

Aspiring to be impartially altruistic doesn't mean we should shank each other. The so-impartial-we-will-harvest-your-organs-and-steal-your-money version of EA has no future as a grassroots movement, or even room to grow, as far as I can tell. 

This community norm strategy works if you determine that retaining socioeconomically normal people doesn't actually matter and you just want to incubate billionaires, but I guess we have to hope the next billionaire is not so (allegedly) impartial towards their users' welfare.

Seriously, imagine dedicating your life to EA and then finding out you lost your life savings because one group of EAs defrauded you and the other top EAs decided you shouldn't be alerted about it for as long as possible, specifically because it might lead to you reaching safety. Of course, none of the in-the-know people decided to put up their own money to defend against a bank run; they just decided it would be best if you kept doing so. 

In that situation I have to say I would just go and never look back. 

I would like to be involved in the version of EA where we look after each other's basic wellness even if it's bad for FTX or other FTX depositors. I think people will find this version of EA more emotionally safe and inspiring.

To me there is just no normative difference between trying to suppress information and actively telling people they should go deposit on FTX when distress occurred (without communicating any risks involved), knowing that there was a good chance they'd get totally boned if they did so. Under your model this would be no net detriment, ... (read more)

Hm, yeah, I guess my intuition is the opposite. To me, one of the central parts of effective altruism is that it's impartial, meaning we shouldn't put some people's welfare over others'. 

I think in this case it's particularly important to be impartial, because EA is a group of people that benefitted a lot from FTX, so it seems wrong for us to try to transfer the harms it is now causing onto other people.  

What I think: I think that FTX was insolvent such that even if FTT price was steady, user funds were not fully backed. That is, they literally bet the money on a speculative investment and lost it, and this caused a multibillion dollar financial hole. It is also possible that some or all of the assets - liabilities deficit was caused by a hack that happened months ago that they did not report. 

As far as I can tell, you don't think this. Well, if you really don't think that, and it turns out you were wrong, then I'd like you to update. I think probabil... (read more)

What I think: I think that FTX was insolvent such that even if FTT price was steady, user funds were not fully backed.

Yes, you are right, I disagree. I think this collapse happened because of the FTT "attack" (or honestly, a huge vulnerability) and Alameda was forced to defend. Without this depletion, SBF or FTX could have covered these funds in a routine sense and we wouldn't have heard about this. 

That is, they literally bet the money on a speculative investment and lost it, and this caused a multibillion dollar financial hole. It is also possible that some or all

... (read more)

You're Agrippa! The guy with very short timelines, who is Berkeley-adjacent and knows that cool DxE person. 

No, I do care about you! I respect you quite a bit. I was wrong and I retract what I said before in at least a few comments, and I apologize for my behavior. Also, I'll be happy to take any negative repercussions.

😳 That's nice of you, thanks.

I'm actually not a guy, though I don't take any offense at the assumption, given my username.

Maybe Nuno would escrow for us. 

 

I'm probably down for $500, would need to talk to my partner about going mu... (read more)

2
Charles He
1y
I'll escrow with Nuno at any time; you can reach out to me by PM or Nuno can reach out to me.

I think this is true, but it's sort of productive to clarify it. I'm >90% sure FTT was collateralized in a way that had "a major role" in FTX's collapse, but I'm not sure exactly what that means in a causal way, or how severe it is in a moral sense. Financial engineering is complicated and I don't know much about it.

Also, thanks for taking a position on both. We are on the same side of 50/50 for the "gambled deposits" question, though. I wish we could come up with something we disagree on that might also resolve sooner; I'll think on it... 

Maybe we disagree on just how big FTX's financial hole is? Could we bet on "as of today, FTX liabilities - FTX assets >= $4bn"? I'd go positive on that one.

Dunno... Really can't tell what you believe. You commented that folks are being too negative yet seem to also think that FTX "gambled" user deposits, which sounds pretty negative to me (though we can disagree about whether it was good to have done this). Oh wellz. 

4
Charles He
1y
It's hard to say; 4B is the ask, but 8B was mentioned as a figure too. They could mean the ability to deploy some latent funds in some complex way. Honestly, this doesn't seem that meaningful.

The crux here is what "collateralization" or "gambling" means. Boiled down, I believe that you can't absolutely prevent all failures; not even real brokerages can. Instead, you have a sliding scale of risk that has probabilities of failure, and none of these are truly zero risk unless you basically don't have a business at all. Given this, it's not clear to me that a failure indicates "gambling".

A major consideration is that the norms of crypto are insane—actually hard to communicate to normal people in a way that sounds normal. FTX's main business is basically clients trading leveraged sh*t coins, which is absolutely crazy for 99.9% of people. Collateralizing with FTT was the issue, but it's unclear if the current exchanges would survive a similar run on their tokens—should they shut down now? Are their CEOs guilty now?

People literally believed Tether might unpeg several times in the last year, which is crazy, like thinking the USD might crash. It's still a mystery how the main fiat instrument in crypto has value. As I type this, I think people think USDD is literally unpegging? So hanging this on one person's neck is pretty unfair if he just pushed the "risk slider" a bit farther than other people, in this surreal space, where "NFTs" were a primary product. There are other issues that undermined him that aren't his fault, but explaining this to EAs would just make everyone sad.

If that sounds like mumbo jumbo and insanity with extra steps, well, it might be, but it is actually sort of how capitalism and finance work. This literally happened with Bear Stearns and caused the 2008 crash. To "resolve" the above, and what is "true", I think what is used here is "social constructionism", like Foucault, as opposed to the "rationalist" "positivist" view.

Relevant to our bet, as

 For 50/50, I'll take negative, will not resolve affirmatively on:

  • "SBF found guilty of literally anything / pays a fine of over 1M for literally any reason, by 2024".

Cool, what size bet? And, after we figure that out, any thoughts on an escrow? 

1
Agrippa
1y
Also, thanks for taking a position on both. We are on the same side of 50/50 for the "gambled deposits" question, though. I wish we could come up with something we disagree on that might also resolve sooner; I'll think on it...

Maybe we disagree on just how big FTX's financial hole is? Could we bet on "as of today, FTX liabilities - FTX assets >= $4bn"? I'd go positive on that one.

Dunno... Really can't tell what you believe. You commented that folks are being too negative yet seem to also think that FTX "gambled" user deposits, which sounds pretty negative to me (though we can disagree about whether it was good to have done this). Oh wellz. 
1
Charles He
1y
Wait... (The experience of actually putting up real money seemed appalling and sort of scary, so I was looking through your profile to see if I would wiggle out of it.)

You're Agrippa! The guy with very short timelines, who is Berkeley-adjacent and knows that cool DxE person.

No, I do care about you! I respect you quite a bit. I was wrong and I retract what I said before in at least a few comments, and I apologize for my behavior. I upvoted every comment in this chain of yours. Also, I'll be happy to take any negative repercussions.

For the bet, I'll do $500? Is that acceptable or do you want more? Can we give money to a trusted person; do you know of someone in real life?

Hey, just to be clear, note that "pays a fine" — this reads to me as SBF personally paying a fine versus FTX or Alameda; that's quite a big difference IMO and that favors me. Also, I assume Jan 1st, 2024 is the date.

:-( 

I will have to insist on trusted escrow for any bets between us...

We seem to have very different ideas of what "operationalization" means...

How about "By April, will evidence come out that FTX gambled deposits rather than keeping it in reserves?" ? There's already a literal prediction market up on that one! 

We could do "SBF found guilty of literally anything / pays a fine of over 1M for literally any reason, by 2024" ? If that's not operationalization I really have to give up here. 

I do have a real name by the way!!

BTW I am assuming you are willing to bet in the thousands. If not, I really don't consider that a bad thing, but lmk please!

3
Charles He
1y
I'll announce here that I'll take positive, will resolve affirmatively, 50/50 on both of those, under the protest that "gambling" actually means "any collateralization strategy that failed".

I'll announce, for 50/50, that I'll take positive, will resolve affirmatively on:
  • "By April 2023, will evidence come out that FTX gambled deposits rather than keeping it in reserves?",
under the protest that "gambling" actually means "any collateralization strategy involving FTT that failed".

For 50/50, I'll take negative, will not resolve affirmatively on:
  • "SBF found guilty of literally anything / pays a fine of over 1M for literally any reason, by Jan 1, 2024".
As explained in my top-level comment, this is sort of not very interesting.

(As mentioned, starting the "prediction market thing".) As a deeper comment, this shows the defect of prediction markets themselves—they truly add value only in limited ways, are abusable and culture-dependent, and often make themselves redundant, especially if competent discourse is available. This directs arguments, theories, and discourse through limited channels that are stilted and performative and, in a general sense, inadequate, sort of like how social media is.

As an aside, it is surprising to me that I seem at all to you like the type of person Sam might have been surrounded with. I don't think anyone remotely insider-y has ever even slightly felt that way about me. 

-21
Charles He
1y
-6
Charles He
1y

I will take a bet like "found guilty of X / paid a fine of X", which are actual events that happen.

OK whatevs, which side of 50/50 do you want? And by what date? (and for that matter what X? Fraud???)

That said, I really dunno why you don't like "FTX used user funds to make risky investments" or "FTX speculated using user funds" etc. Is there nobody we might mutually trust to neutrally resolve such a thing? 

0
Charles He
1y
You still haven't given an operationalization and at this point, we both know why. I'll take a bet when Nuno or someone with a real name comes along.

I'm sorry but I really don't understand why you think it's not adequate. "Fraud" is quite well-defined, and "loss of user funds" is also quite well-defined. 

I would offer odds on like, criminal prosecution results, but that will take such a long time to resolve that I don't think it makes an attractive bet. As you point out there are also jurisdictional questions.

Is "SBF lied about the safety of user funds on FTX.com" better to you? 
"FTX used user funds to make risky investments"? 
"SBF mislead users about the backing of their FTX.com accounts"?

-21
Charles He
1y

The real-world event would be "FTX committed fraud that caused >1bn loss of user funds". But if it's a bet, somebody has to arbitrate the outcome, you know?
I just picked EA forum users as an arbitrator since, like, that's the venue here. But if you have any other picks for arbitrator, that would be fine. You can pick yourself, but I'm not sure I'd agree to that bet. Likewise, I assumed you wouldn't go for it if I picked me. And if it's the 2 of us, well, then we might tie.

> but do you have like an account on a prediction market
Multiple 

> Are you a... (read more)

0
Charles He
1y
"FTX committed fraud that caused >1bn loss of user funds" We, and really, every prediction person here, know that's not an adequate operationalization.  You've been asked 4 times already to give an operationalization and number for me to take, which we both know you are fully able to do.   I suspect you are an FTX grantee. Since you are using EA money (well until you decide it's not convenient to call it this) do you mind sharing who you are or what your project is?

I think that putting up probabilities is and should be expected. I think that actual financial betting shouldn't be expected but is certainly welcome. 

If I were going to dispute it, the first thing I would do is ask for probabilities. It seems weird to try to argue with you about whether your predictions are wrong if I don't even know what they are. For all I know, we have the same predictions and just a different view of other posters' predictions. 

-8
Charles He
1y

In one year we can make a thread that asks EA forum users to vote on whether they believe, with >90% odds, that SBF fraudulently handled funds (that is, in a way that was directly contradictory to public representations of how funds were handled) in a way that cost FTX.com customers >1bn in losses.
If a majority of users (whose accounts have existed since yesterday, to prevent shenanigans) vote yes, then the bet resolves YES. Otherwise NO.

Which side of 50/50 do you want? 

-12
Charles He
1y

If you don't want to make a single quantifiable prediction on this topic, after making claims about other people's predictions being "too negative", yes I consider that both evasive and inadequate. 

If you really believe people are being "too negative" in their speculation, I thought you might be willing to put your money where your mouth is in some way. If you're not, then you're not, but it's got nothing to do with how well defined legality is, the moral meaning of illegality, etcetera.

Edit: I don't actually really think that a social expectation of ... (read more)

1
Charles He
1y
I don't really have much to say except to pretty much copy and paste my reply above and point out how wildly hostile this is. No, people are not expected to put up bets for their statements. You are perfectly able to dispute them using arguments.

It is despicable to try to weaponize this practice, partially with the motivation of financial gain, which we both know you have. Your initial comment is absolutely in bad faith.

Separately and additionally, yes, I am exactly willing to take bets once you put up numbers, exactly answering your challenge, which I do not have to do.

Can you just operationalize a few things yourself and attach numbers to them? That sounds easiest. 

For example, your odds on whether SBF literally goes to prison within the next 4 years... 
(Though I think there are better ways to operationalize)

If you can't operationalize a prediction on this topic in any straightforwardly falsifiable way, then that's okay I guess, though kind of sad.

-19
Charles He
1y

Would you be open to stating some probabilities on this topic -- for example, your probability that Sam gets convicted of fraud, is conclusively found out to have committed fraud, etcetera? 

I ask because I'd potentially be interested in making some financial bets with you!

5
Charles He
1y
Yes, and no: can you operationalize some of the most interesting and relevant questions to you? These might touch on:
  • Alameda Research
  • FTX
  • Legal acceptability of FTX's collateralization strategy

The full and honest true answer is sort of no. I think legality is poorly posed or not well defined. This is hard to explain, and I'm finding the current information environment on the EA forum, and really in EA, exceptionally poor; this is disappointing and relevant if elaborating contributes to the noise. But basically, whether something is actually found illegal will depend on the administration, the style of prosecutor, and the political and social environment, which could vary wildly, along with many other details. For evidence, see what happened with the 2008 financial crisis, where there was almost no criminal prosecution.

More importantly, the moral meaning of the activity versus what is found illegal can be unintuitive or misleading, and this seems more important. I seriously distrust whether people here understand the moral meaning of these actions, and I would find the spectacle of a prediction sort of predictable and distasteful. Also, since I've never come close to this kind of activity in any sense, I don't know much and I don't care much about the actual relevant laws, especially things like jurisdiction, e.g. the significance of the Bahamas, which could be pivotal. (This, by the way, is one example of where the meaning of this question is uninteresting.)

I again reiterate my strong belief in the character and decision making of SBF, which is counter to the would-be "mood" right now.

I really take issue with #2 here. Bank run exacerbation saved my friend's life savings. Expectations of collapse can save your life if, you know, there's a collapse. 

It really seems insanely cruel to say we shouldn't inform people because it might be bad for FTX (namely in the event of insolvency). Where are our priorities? I'm very glad that my friends did not observe your #2 preference here.

Of course the best way to help FTX against a bank run would have been to deposit your own funds at the first sign of distress. As of writing I think it's still not too late! 

9
lukasb
1y
Maybe I'm misunderstanding bank runs, but as I understand it, they happen because:
  • the institution that is holding other people's money doesn't have all that money in liquid form
  • it is unable to give the money back if everybody tries to withdraw it at once
  • when this happens, the institution runs out of money and many people, who didn't withdraw their cash in time, lose all their deposits

I think the reason Richard listed #2 as a preference is that there might still be hope that FTX doesn't run out of money in the first place and no one loses their deposits.

However, it might be that FTX will run out of money either way. In that case, speeding up the bank run will lead some people to get more of their money back, but only because they pick it up before other people do. In the end it's a zero-sum game, because FTX only has a limited amount of liquid currency. If my model is correct, then there is no net benefit in speeding up the bank run.
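A minimal numeric sketch of that zero-sum claim (all figures below are made up for illustration, not actual FTX numbers): if the exchange is already insolvent, the order of withdrawals changes who gets paid before reserves run out, not how much gets paid out in total.

```python
# Toy illustration of the zero-sum point above (made-up numbers).
# If reserves < deposits, total payouts are capped by reserves regardless of
# how quickly people withdraw; only the distribution across depositors changes.
liquid_reserves = 4_000  # assumed liquid funds available to pay withdrawals

def payouts(withdrawals, reserves=liquid_reserves):
    """First-come, first-served payouts until reserves run out."""
    remaining = reserves
    paid = {}
    for name, amount in withdrawals:
        paid[name] = min(amount, remaining)
        remaining -= paid[name]
    return paid

result = payouts([("early", 3_000), ("middle", 3_000), ("late", 4_000)])
print(result, "total paid:", sum(result.values()))
# {'early': 3000, 'middle': 1000, 'late': 0} total paid: 4000
```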

There seem to be two obvious models:
1) intractability model, where AGI = doom and the only safe move is not to make it
2) race / differential progress model, where safety needs to be ahead of capabilities by some amount, before capabilities reaches point X

As far as I can tell, alignment is advancing a lot slower per researcher than capabilities. So even if you contribute 1 year on capabilities and 10 on alignment, your effect under differential progress was just bad, and your effect under intractability was badder. 
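A toy back-of-envelope version of that claim (the rates here are made-up placeholders, not real estimates of research progress):

```python
# Toy differential-progress arithmetic (rates are assumptions for illustration).
# Suppose one researcher-year advances capabilities by cap_rate "units" and
# alignment by align_rate units, and what matters is the gap between alignment
# readiness and capabilities when some threshold is reached.
cap_rate = 1.0     # assumed capabilities progress per researcher-year
align_rate = 0.05  # assumed (much slower) alignment progress per researcher-year

def net_gap_change(years_on_capabilities, years_on_alignment):
    """Net change in the alignment-minus-capabilities gap from one career split."""
    return years_on_alignment * align_rate - years_on_capabilities * cap_rate

print(net_gap_change(1, 10))  # 10 * 0.05 - 1 * 1.0 = -0.5, i.e. net negative
# Under these assumed rates you'd need cap_rate / align_rate = 20 alignment-years
# to offset a single capabilities-year.
```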

I'm curious how much the "having align... (read more)

Thanks for clarifying 

We simply have a specific bar for admissions and everyone above that bar gets admitted 


A) Does this represent a change from previous years? Previous comms have gestured at a desire to get a certain mixture of credentials, including beginners. This is also consistent with private comms and my personal experience. 

B) It's pretty surprising that Austin, a current founder of a startup that received $1M in EA-related funding from FTX regrants, would be below that bar! 

Maybe you are saying that there is a bar above which you will get in, but below w... (read more)

A) Yes, we had different admissions standards a few years ago. I agree that’s confusing and I think we could have communicated better about the admissions standards. I think our FAQ page and admissions page are the most up-to-date resources.

B) I can't comment in too much depth on other people's admissions, but I'll note that Austin was accepted into SF and DC 22 after updating his application.

It’s currently the case that there’s a particular bar above which we’ll admit people, though it’s not an exact science and we make each judgement call on its own ... (read more)

I had a pretty painful experience where I was in a pretty promising position in my career, already pretty involved in EA, and seeking direct work opportunities as a software developer and entrepreneur. I was rejected from EAG twice in a row while my partner, a newbie who just wanted to attend for fun (which I support!!!) was admitted both times. I definitely felt resentful and jealous in ways that I would say I coped with successfully but wow did it feel like the whole thing was lame and unnecessary. 

I felt rejected from EA at large and yeah I do thin... (read more)

I’m really sorry to hear this. It is concerning to hear that being rejected from EAG made you feel like you were “turned away from even hanging out with people.” This is not our intention, and I’d be happy to chat with you about other resources and opportunities for in-person meetings with other EAs. 

We also get things wrong sometimes so I’m sad to hear you feel like our decision impacted your trajectory away from a highly devoted version of your life. The EAG admissions process is not intended to evaluate you as a person, it is for determining whethe... (read more)

Damn, that really sucks. :| Thanks for sharing.

Adding my three related cents:

  • I personally would very likely have felt really sad about being rejected from EAG as well, and knowing this played a role in me not being particularly excited about applying in the past.
  • A good friend of mine who's like a role model highly-engaged EA was told a year or so ago by a very senior EA (they knew each other well-ish) that he shouldn't take for granted being admitted to EAG, which IIRC felt pretty bad for him, as if he's still not doing "enough".
  • Another good friend of mine
... (read more)

Relatedly, on the time point: I wish we knew more about how much money is spent on community building. It might be very surprising! (hint hint)

Sorry, I did not realize that OP doesn't solicit donations from non-megadonors. I agree this recontextualizes how we should interpret transparency.

Given the lack of donor diversity, though, I am confused why their cause areas would be so diverse.  

Well this is still confusing to me
 

in the case of criminal justice reform, there were some key facts of the decision-making process that aren’t public and are unlikely to ever be public

Seems obviously true, and in fact a continued premise of your post is that there are key facts absent that could explain or fail to explain one decision or the other. Is this particularly true in criminal justice reform? Compared to, IDK, orgs like AMF (which are hyper-transparent by design), maybe; compared to stuff around AI risk, I think not.

 

My guess is that a “highl

... (read more)

Yeah I mean, no kidding. But it's called Open Philanthropy. It's easy to imagine there exists a niche for a meta-charity with high transparency and visibility. It also seems clear that Open Philanthropy advertises itself as a fulfillment of this niche as much as possible, and that donors do want this.

I don't understand this point. Can you spell it out? 

From my perspective, Open Phil's main legible contribution is a) identifying great donation opportunities, b) recommending that Cari Tuna and Dustin Moskovitz donate to such opportunities, and c) building up an ... (read more)

We are currently at around 50 ideas and will hit 100 this summer.

 

This seems like a great opportunity to sponsor a contest on the forum.

Also, there is an application out there for running polls where users make pairwise comparisons over items in a pool and a ranking is imputed. It's not necessary for all pairs to be compared; the system scales to a high number of alternatives. I don't remember what it's called; it was a research project presented by a group when I was in college. I do think it could be a good way to extract a ranking from a crowd (a... (read more)
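A minimal sketch of the general idea (aggregating sparse pairwise votes into a ranking). This uses a generic Elo-style update rather than whatever method that particular application used, and the item names and votes below are made up:

```python
# Impute a ranking from sparse pairwise comparisons via Elo-style updates.
# Illustrative only; not the specific research project mentioned above.
from collections import defaultdict

def rank_from_comparisons(comparisons, k=32.0, base=1000.0):
    """comparisons: iterable of (winner, loser) pairs; returns items sorted by rating."""
    ratings = defaultdict(lambda: base)
    for winner, loser in comparisons:
        # Expected win probability for the winner under current ratings.
        expected = 1.0 / (1.0 + 10 ** ((ratings[loser] - ratings[winner]) / 400.0))
        ratings[winner] += k * (1.0 - expected)
        ratings[loser] -= k * (1.0 - expected)
    return sorted(ratings.items(), key=lambda kv: kv[1], reverse=True)

# Not every pair needs to be compared for a usable ranking to emerge.
votes = [("idea A", "idea B"), ("idea B", "idea C"), ("idea A", "idea C")]
print(rank_from_comparisons(votes))
```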

1
Ulrik Horn
2y
Thanks for the encouragement! That does seem like a possibly relevant process for scoring the ideas. As I mentioned, I want to put quite a bit of work into setting up the process for scoring ideas, as it will have a primary, if not the biggest, impact on the future trajectory of the project. But I am noting down already to talk to you down the road if we do not come across that process ourselves when researching existing processes for ranking alternatives.

Good idea on improving the currently available bunkers - one of the ideas I already noted down is a consultancy, possibly subsidized, that advises current bunkers being built on how to design them for extinction/civilizational collapse scenarios!

It cracks me up that this is the first comment you've ever gotten posting here; it really is not the norm. 

The comment is using what I call “EA rhetoric”, which has sort of evolved on the forum over the years, where posts and comments are padded out with words and other devices. To the degree this is intended to be evasive, this is further bad as it harms trust. These devices are perfectly visible to outsiders.

 

I agree that this has evolved on the forum over the years and it is driving me insane. Seems like a total race to the bottom to appear as the most thorough thinker. You're also right to point out that it is completely visible to outsiders. 

It's interesting that you say that, given what is, in my eyes, a low amount of content in this comment. What is a model or model-extracted part that you liked in this comment?

4
NunoSempere
2y
Some of my models feel like they have a mix of reasonable stuff and wanton speculation, and this comment sort of makes it a bit clearer which of the wanton speculation is more reasonable, and which is more on the deep end. For instance: . . .

Decent discussion on Twitter, especially from @MichaelDello
https://twitter.com/brianluidog/status/1534738045483683840

To me the biggest challenge in assessing impact is the empirical question of how much any supply increase in meat or meat-like stuff leads to replacement of other meat. But this would apply as well to the accepted cause areas of meat replacers and cell culture.

Substitution is unclear. In my experience it's very clear that scallop is served as a main-course protein in contexts where the alternative is clearly fish, or most often shrimp. So insofar as substitution occurs, we'd mainly see substitution of shrimp and fish. 

However, it is not clear how much substitution of meat in fact occurs at all as supply increases. People generally seem to like eating meat and meat-like stuff. I don't know the data here, but meat consumption is globally on the rise.

https://www.animal-ethics.org/snails-and-bivalves-a-discussion-of-possible-edge-cases-for-sentience/#:~:text=Many%20argue%20that%20because%20bivalves,bivalves%20do%20in%20fact%20swim

I found this discussion interesting. To me it seems like they feel aversion -- not sure how that is any different from suffering -- so it is just a question of "how much?". 
