Jeroen_W

Content creator / YouTuber @ A Happier World
Working (0-5 years experience)
429 karma · Joined Nov 2016
youtube.com/ahappierworldyt

Comments (28)

Great post, thanks for writing it! I especially agree with the worry that the book will leave readers with a mistaken sense of longtermist priorities. I would also be quicker to recommend The Precipice than WWOTF, unless people specifically ask for a book on longtermism.

I also want to say how much I appreciate your breakdown of your AI x-risk estimate. I've never seen a breakdown that easy to grasp before. I feel like I've finally found a tool to make a better prediction of my own. Thanks!

Yeah, I'd like to understand point 1 better too. Why 'project' rather than 'movement' or 'community'? I assume a lot of thought was put into it, so I'm curious to hear the explanation!

Personally, point 2 makes sense to me. "What does EA do?" is a question most outsiders are interested in, and I like that the explanations come with the EA reasoning behind them, so it doesn't look like EA is specifically about the issues mentioned.

What is the reasoning behind having the probability increase each time? It might be more interesting if the probability stayed at 5% each time, because as it stands, readers might conclude that we should only start worrying once the odds are significantly high, say 15%.

I agree, and like I said, I'm sure those sentences can be massively improved.

I'd rather have my feelings a little hurt than remain in the dark as to why a grant wasn't accepted.

Still, you might be able to turn that into a few easy sentences. E.g. "We don't think you have enough expertise on the topic", "We don't think you have the right skills for this project; there are likely better candidates out there", "Your research project is poorly scoped", ...

These are just quick examples off the top of my head; they could likely be massively improved.

Even just sharing something in the rejection letter like

"Unfortunately, even if the problem is solely due to project-specific competence and fit, there aren’t many tractable and high-EV levers for grantmakers to pull. If somebody applies with a poorly scoped research project, they may well be an amazing operations hire or entrepreneur but unfortunately a) the grantmaking process is not set up to evaluate this well, and b) we are specialized in evaluating grants, not in giving open-ended career advice."

might be very helpful.

I don't have a confident view on this, but I can easily imagine situations where all your (grand)children have a bigger impact on the world than you alone.  It might make sense for people to try and make their own EV calculations here.

Focusmate costs just 5 dollars per month, which is low enough to at least try it out for a while!

Beeminder could be worth trying out as well, though I haven't personally used it. With Beeminder, you pay money if you don't reach your goals.

Increasing your Google Drive/Photos storage is worth the money in my opinion. It's quite cheap, and you don't have to worry or think as much about managing your online storage.

Thanks a lot for sharing this! I appreciate this well-thought-out criticism. You make great points, but I find it hard to say at the moment how it has updated my own views. I'll share it in the comment section under the video.

The reputational risks that Jack Lewars mentioned worry me a lot. How will the media portray effective altruism in the future? We're definitely going to be seen as less sympathetic in leftist circles, being funded by billionaires and all.

Another concern for me is how this will change the type of people we attract to the movement. In the past, the movement attracted people who were willing to live on a small amount of money because they cared so much about others in this world. Now I'm worried there will be more people who are less aligned with the values and who are in it at least partly for the money.

In another way it feels unfair. Why do I get to ask for 10k with a good chance of success, while friends of mine who are struggling with money but aren't EA-aligned can't? I don't feel any more special or deserving than them. I feel unfairly privileged somehow.

There's also this icky feeling I get with accepting money: what if my project isn't better than cash transfers to the extreme poor? Even if the probability of success is high enough, the thought of "wasting" money that could have gone to extremely poor people and massively improved their lives saddens me.

I'm not trying to make logical arguments here for or against something, I'm just sharing the feelings and worries going through my head.

I think it's good that this is being discussed publicly for the world to see. At least then non-EAs who read this won't get the impression that we're all completely comfortable with it. I still think all this money is great and positive news for the world, and I'm very grateful for the FTX Future Fund, but it does come with these worries.

It's very much appreciated that you tried out a name change and that you wrote up the results of the experiment, thank you! I thought it was a great idea at first that made a lot of sense to me, but I've now changed my mind. Looking forward to hearing about other efforts you think may have contributed to PISE's success.
